During today's GPU Technology Conference keynote, Nvidia CEO Jen-Hsun Huang took the wraps off the Titan X graphics card. Be sure to read Scott's in-depth review for more details on Nvidia's uber-Maxwell. The Titan X was only the first of the topics Huang discussed.
It's clear that Nvidia has far greater ambitions for its graphics chips than gaming (although 3D graphics remains an important focus for the company). Through the acceleration of deep learning and neural networks, Nvidia wants its GPUs to tackle a new set of computing problems in the years to come.
If you're wondering what deep learning is, it might help to start with the Wikipedia articles on the subject. In short, deep learning programs have several useful applications, such as handwriting recognition and the automatic characterization and cataloging of objects within images (like naming the species of animal in a given shot). Deep learning applications can also control systems based on visual input, like self-driving cars. More on that in a second.
In order to make these kinds of applications work, the underlying artificial neural network has to be trained on sample data sets. That process can take a lot of time—up to 43 days, according to Huang, on an (admittedly non-specific) 16-core Xeon CPU. Running an implementation of the same neural network in CUDA on the original Titan graphics card cuts the training time to one week. On the Titan X, paired with Nvidia's CUDA library of deep-learning primitives called cuDNN, only three days are required. That's a huge time savings for people researching deep learning.
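To see why GPUs help so much here, it's worth noting what training actually does: it repeats the same forward-pass and backward-pass matrix arithmetic millions of times, and that arithmetic is exactly what libraries like cuDNN offload to the GPU. Below is a minimal, purely illustrative sketch (not Nvidia's code, and far smaller than any real deep learning workload) of one such training loop, written with NumPy on a toy XOR problem:

```python
# Illustrative sketch of a neural-network training loop (toy XOR problem).
# Real deep learning frameworks run the same kind of matrix multiplies,
# just vastly larger and dispatched to GPU kernels (e.g. via cuDNN).
import numpy as np

rng = np.random.default_rng(0)

# Toy data set: the XOR truth table.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two small weight matrices: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(0.0, 1.0, (2, 8))
W2 = rng.normal(0.0, 1.0, (8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(5000):
    # Forward pass: two matrix multiplies plus nonlinearities.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: gradients of the squared error, more matrix multiplies.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)

    # Gradient-descent weight updates (learning rate 0.5).
    W2 -= 0.5 * (h.T @ d_out)
    W1 -= 0.5 * (X.T @ d_h)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Scale those matrix multiplies up to millions of parameters and images, and the difference between a 16-core CPU and a Titan X becomes the difference between 43 days and three.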
Huang showed off a couple of interesting examples of deep learning applications in action. The real jaw-dropper might be Nvidia's computer for self-driving cars, called the Drive PX. This system is based on twin Tegra X1 chips, and it has up to 12 camera inputs. Drive PX is capable of running a complex deep learning program, AlexNet, at 184 frames per second while managing 630 million "connections" (a measure of neural network complexity). Drive PX's improvements over the system behind an earlier deep-learning-controlled vehicle, dubbed DAVE, are supposed to allow the development of vehicles that can operate reliably in complicated, highly variable environments like city streets.
Elon Musk, CEO of Tesla Motors, came on stage to talk self-driving cars during the last stretch of the keynote. Musk said that "we know what to do [for autonomous cars], and we'll be there in a few years." Musk also said that "what NVIDIA is doing with Tegra is really interesting and really important for self-driving in the future." Maybe we'll see a "The Way It's Meant to be Driven" campaign from Nvidia and Tesla.
To aid in further deep learning research, Nvidia announced a development framework called DIGITS to ease the creation of neural-network-based applications. It also introed a development system called the DIGITS DevBox, which contains four Titan X graphics cards and has common deep learning applications preinstalled. Don't expect to pick up one of these at the local Best Buy, though. Nvidia intends these systems to be used mostly by "qualified deep learning researchers." At $15,000 a pop, the price reflects the custom-built, institutional nature of this hardware.