Are you salivating over the Nvidia Tesla V100? If you are, then you best get a bib ready. Right after unveiling the V100, Nvidia showed the Volta-powered DGX-1 rack unit, the DGX Station, the HGX-1 GPU cloud computing unit, and a Volta-powered server add-in card.
If one Tesla V100 was already impressive on its own, how about eight of them in a rack unit? That's what the DGX-1 is. Nvidia claims that this box o' fun can replace 400 standard servers for machine learning tasks and should deliver 960 tensor TFLOPS. The DGX-1 uses an NVLink Hybrid Cube interconnect. The price is set at $149K, and Nvidia expects to deliver units in Q3 of 2017. Customers who order the Pascal-powered DGX rack now will get a free upgrade to the Volta version when it's out.
Not everyone can just waltz into their own personal datacenter to do some number-crunching, though. Nvidia has apparently been badgered over the years with requests to make a Tesla-powered workstation, and the company is now delivering one. The DGX Station is a water-cooled tower "PC" packing four Tesla V100 cards and three DisplayPort outputs, capable of punching through 480 tensor TFLOPS. You'll need a lot of juice to make this machine go at full tilt: all of 1500W. Nvidia will sell you a DGX Station for $69K.
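If you're wondering where those aggregate figures come from, they line up with the 120 tensor TFLOPS Nvidia quotes for a single Tesla V100. A quick back-of-the-envelope check (the per-card figure is Nvidia's, the variable names are ours):

```python
# Sanity-check Nvidia's aggregate tensor-throughput claims,
# assuming the company's quoted 120 tensor TFLOPS per Tesla V100.
V100_TENSOR_TFLOPS = 120

dgx1_tflops = 8 * V100_TENSOR_TFLOPS         # eight V100s in the DGX-1
dgx_station_tflops = 4 * V100_TENSOR_TFLOPS  # four V100s in the DGX Station

print(dgx1_tflops)         # 960, matching the DGX-1 claim
print(dgx_station_tflops)  # 480, matching the DGX Station claim
```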
Nvidia also made a Volta-powered "GPU cloud computing" unit called HGX-1, though it didn't offer many details on it. We surmise that this is a version of the previously-announced HGX-1, except based on Tesla V100 cards. The box uses an NVLink Hybrid Cube interconnect and apparently has eight Tesla V100s inside it.
Last but by no means least, Nvidia made an apparently-unnamed add-in card with a V100 chip aboard, tuned for inference tasks. The full-height, half-length card appears to connect using a PCIe x16 interface and draws 150W of board power. Nvidia says that 33 "nodes" of V100 (we assume thirty-three of these cards) can deliver inference performance equivalent to 500 servers with 1000 CPUs in them, equating to a 15x reduction in cost for the same performance.