Google boosts machine learning with its Tensor Processing Unit

— 8:31 AM on May 19, 2016

Google has some new hardware out, and no, it's not a Nexus device. The search giant makes extensive use of machine learning to power services like RankBrain and Street View, and it felt it could give those tasks a little more oomph. Enter the Tensor Processing Unit, or TPU for short.

The TPU is a custom-designed ASIC small enough to fit into a hard drive slot in Google's data center racks. Although the TPU has only just been revealed to the world, Google says it has actually been running the hardware in its data centers for over a year as a "stealthy project." Google's engineers say the TPU offers a 10x performance-per-watt improvement over off-the-shelf solutions on machine learning tasks. Unsurprisingly, the TPU is optimized for the company's open-source TensorFlow machine intelligence library.
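
For a sense of the kind of workload involved, here's a minimal sketch (not Google's code) of the sort of TensorFlow computation a TPU is built to accelerate: a single neural-network layer expressed as a large matrix multiply. It uses the graph-style TensorFlow 1.x API, and the layer sizes are arbitrary.

    # Minimal sketch of a TensorFlow workload; the shapes here are arbitrary.
    import numpy as np
    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=[None, 256])  # batch of input vectors
    w = tf.Variable(tf.random_normal([256, 128]))       # layer weights
    b = tf.Variable(tf.zeros([128]))                     # layer bias
    y = tf.nn.relu(tf.matmul(x, w) + b)                 # the matmul dominates the cost

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        out = sess.run(y, feed_dict={x: np.random.rand(32, 256).astype(np.float32)})
        print(out.shape)  # (32, 128)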

Google's TPU does use a dirty trick of sorts: it works with "reduced precision," roughly meaning that the results of an operation are approximations of the "proper" result. Although the notion may sound counterintuitive at first, it's a perfectly acceptable method for some computing tasks. For example, a number of algorithms for calculating a square root work by repeatedly refining an approximation of the answer until the deviation is small enough not to matter. Tailoring the TPU to reduced-precision work apparently netted Google big gains in hardware design, letting the company kill off a substantial number of transistors that would otherwise have been necessary for common operations.
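
To illustrate that iterative square-root idea, here's a short Python sketch using Newton's method; the tolerance value is arbitrary and has nothing to do with the TPU itself. A looser tolerance gives a rougher but cheaper answer, which is the same trade-off the TPU makes in hardware.

    # Refine an approximation of sqrt(n) until the deviation is small enough to ignore.
    def approx_sqrt(n, tolerance=1e-6):
        guess = n / 2.0 if n > 1 else 1.0       # crude starting point
        while abs(guess * guess - n) > tolerance:
            guess = (guess + n / guess) / 2.0   # Newton-Raphson update
        return guess

    print(approx_sqrt(2.0))          # ~1.414214, accurate to the default tolerance
    print(approx_sqrt(2.0, 1e-2))    # looser tolerance: rougher but cheaper answer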
