Nvidia Corp. has announced that it collaborated with a research team at Stanford University to create the world's largest artificial neural network built to model how the human brain learns. Computer-based neural networks are capable of "learning" to model the behavior of the brain, including recognizing objects, characters, voices and audio in the same way that humans do.

GPU-Based Neural Network

Creating large-scale neural networks is extremely computationally expensive. For example, Google used approximately 1,000 CPU-based servers, or 16,000 CPU cores, to develop the neural network that taught itself to recognize cats in a series of YouTube videos. That network included 1.7 billion parameters, the virtual representations of the connections between neurons.

In contrast, the Stanford team, led by Andrew Ng, director of the university's artificial intelligence lab, created a network of the same size using only three servers equipped with Nvidia GPUs to accelerate the processing of the vast amounts of data the network generates. With 16 Nvidia GPU-accelerated servers, the team then created an 11.2 billion-parameter neural network, 6.5 times bigger than the network Google announced in 2012.
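To put those parameter counts in perspective, here is a minimal, purely illustrative Python sketch assuming a simple fully connected architecture (the actual Google and Stanford networks were more sophisticated); the layer widths below are hypothetical, chosen only to show how quickly connection counts reach the billions:

```python
# Illustrative only: in a fully connected network, each layer contributes
# in_size * out_size weights (one per connection) plus one bias per output.
def count_parameters(layer_sizes):
    """Count weights and biases for layers of the given widths."""
    total = 0
    for in_size, out_size in zip(layer_sizes, layer_sizes[1:]):
        total += in_size * out_size + out_size  # weights + biases
    return total

# Hypothetical widths: four layers of 20,000 units each already yield
# about 1.2 billion parameters, the same order as Google's 1.7 billion.
print(count_parameters([20000, 20000, 20000, 20000]))  # 1,200,060,000
```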

The bigger and more powerful the neural network, the more accurate it is likely to be in tasks such as object recognition, enabling computers to model more human-like behavior.

"Delivering significantly higher levels of computational performance than CPUs, GPU accelerators bring large-scale neural network modeling to the masses. Any researcher or company can now use machine learning to solve all kinds of real-life problems with just a few GPU-accelerated servers," said Sumit Gupta, general manager of the Tesla accelerated computing business unit at Nvidia.

GPU Accelerators Power Machine Learning

Machine learning, a fast-growing branch of the artificial intelligence (AI) field, is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, effective web search and a vastly improved understanding of the human genome. Many researchers believe that it is the best way to make progress towards human-level AI.

One of the companies using GPUs in this area is Nuance, a leader in the development of speech recognition and natural language technologies. Nuance trains its neural network models to understand users' speech using terabytes of audio data. Once trained, the models can recognize patterns of spoken words by relating them to the patterns learned during training.
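As a rough illustration of that train-then-recognize workflow, here is a minimal Python sketch; the nearest-prototype "model" and the toy feature vectors are hypothetical stand-ins invented for this example, not Nuance's actual deep-learning pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Training": learn one prototype (mean feature vector) per word from
# labeled audio features. Real systems train deep neural networks instead.
def train(features_by_word):
    return {word: feats.mean(axis=0) for word, feats in features_by_word.items()}

# "Recognition": relate a new utterance to the patterns learned earlier
# by picking the closest prototype.
def recognize(model, utterance):
    return min(model, key=lambda word: np.linalg.norm(model[word] - utterance))

# Toy data: 8-dimensional feature vectors standing in for audio frames.
training_data = {
    "yes": rng.normal(0.0, 0.1, size=(50, 8)),
    "no":  rng.normal(1.0, 0.1, size=(50, 8)),
}
model = train(training_data)
print(recognize(model, rng.normal(1.0, 0.1, size=8)))  # prints "no"
```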

"GPUs significantly accelerate the training of our neural networks on very large amounts of data, allowing us to rapidly explore novel algorithms and training techniques. The resulting models improve accuracy across all of Nuance's core technologies in healthcare, enterprise and mobile-consumer markets," said Vlad Sejnoha, chief technology officer at Nuance.

Tags: Nvidia, Tesla, GPGPU
