
3/2/2018: Accelerating AI with GPUs (hosted by NVIDIA)

Posted on Friday, February 23, 2018 in Pizza and Programming.

We have a few more speakers lined up for this semester, but would love to hear from current ACCRE users this summer! Please email Will if you are even remotely interested and we can chat about it.


Data scientists in both industry and academia have been using GPUs for AI and machine learning to make groundbreaking improvements across a variety of applications including image classification, video analytics, speech recognition and natural language processing. In particular, Deep Learning – the use of sophisticated, multi-level “deep” neural networks to create systems that can perform feature detection from massive amounts of unlabeled training data – is an area that has been seeing significant investment and research.

Although AI has been around for decades, two relatively recent trends have sparked widespread use of Deep Learning within AI: the availability of massive amounts of training data, and the powerful, efficient parallel computing provided by GPUs. Early adopters of GPU accelerators for machine learning include many of the largest web and social media companies, along with top-tier research institutions in data science and machine learning. With thousands of computational cores and 10-100x application throughput compared to CPUs alone, GPUs have become the processor of choice for data scientists working with big data.
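For a concrete sense of what GPU-accelerated deep learning looks like in practice, here is a minimal sketch of a single training step using PyTorch (an assumption for illustration; the talk itself may cover different tools). The toy network, batch size, and data are placeholders; the same code falls back to the CPU if no GPU is visible.

```python
# Minimal sketch: one training step of a small neural network on a GPU with
# PyTorch (assumes PyTorch is installed and a CUDA-capable GPU is visible).
import torch
import torch.nn as nn

# Fall back to the CPU if no GPU is present, so the script still runs.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy two-layer network; real deep learning models are far larger.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
).to(device)  # move the model's weights onto the GPU

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Fake batch of 64 "images" and labels; in practice these come from a dataset.
inputs = torch.randn(64, 784, device=device)
labels = torch.randint(0, 10, (64,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), labels)  # forward pass runs on the GPU
loss.backward()                        # so does backpropagation
optimizer.step()
print(f"one training step on {device}, loss = {loss.item():.4f}")
```

The only GPU-specific lines are the device selection and the `.to(device)` / `device=...` placements; the rest is ordinary training code, which is a big part of why GPUs have been adopted so quickly for deep learning.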
