Scientists make eight-GPU ‘desktop supercomputer’

At around $500 a pop, Nvidia’s dual-GPU GeForce 9800 GX2 graphics card may be a little too pricey for most gamers to afford. However, scientists at the University of Antwerp in Belgium think it’s a pretty good building block for desktop supercomputing. The ASTRA research group at the University’s Vision Lab has built a desktop system with four GeForce 9800 GX2 graphics cards, which it uses for tomography computations.

In ASTRA’s words, tomography “is a technique used in medical scanners to create three-dimensional images of the internal organs of patients, based on a large number of X-ray photos that are acquired over a range of angles.” With the computing power of four 9800 GX2s, ASTRA says it can perform tomography calculations at the same rate as 350 modern microprocessor cores all working together. That can cut computing times from weeks on a regular PC to just a few hours. Building the machine cost a total of less than €4,000 ($6,200), which is at least a couple orders of magnitude cheaper than a conventional server cluster.
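The reconstruction step ASTRA describes can be sketched in miniature. The following is a hypothetical, unfiltered backprojection in NumPy — not the group's actual code, and real scanners add a filtering step — but it shows the core operation: each 1-D projection is smeared back across the image along its acquisition angle and summed. Since every pixel is computed independently, the loop body is exactly the kind of work that maps onto thousands of GPU threads.

```python
import numpy as np

def backproject(sinogram, angles, size):
    """Unfiltered backprojection: smear each 1-D projection back
    across a size x size image along its angle and average."""
    recon = np.zeros((size, size))
    # Pixel coordinates centred on the image midpoint.
    xs = np.arange(size) - size / 2.0
    X, Y = np.meshgrid(xs, xs)
    for proj, theta in zip(sinogram, angles):
        # Detector coordinate hit by every pixel at this angle.
        t = X * np.cos(theta) + Y * np.sin(theta)
        idx = np.clip(np.round(t + size / 2.0).astype(int), 0, size - 1)
        recon += proj[idx]  # independent per pixel: ideal GPU work
    return recon / len(angles)

# Toy run: 10 projections of 32 detector bins each.
angles = np.linspace(0.0, np.pi, 10, endpoint=False)
sinogram = np.ones((10, 32))
image = backproject(sinogram, angles, 32)
```

With real scan data the sinogram holds measured X-ray attenuation rather than ones, but the per-pixel independence — the property the researchers exploit — is the same.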

You can check out a video of the eight-GPU machine in action below:

To learn more about the ASTRA team and the machine itself, check out the FASTRA website. (Thanks to TR reader Marco for mailing this in.)

Comments closed
    • bfellow
    • 13 years ago

    There is actually a fan and heatsink on a 9800 GX2! It isn’t passively cooled.

    • Meadows
    • 13 years ago

    You know, the fans spin and sort of move air. It tends to cool the heatsink which then takes the heat away from the GPU and VRAM.

    • pani_alex
    • 13 years ago

    I have a question: how does it get cooled if all the cards are packed together?

    • HighTech4US
    • 13 years ago

    They have Overclocked the GPUs. That is slightly naughty.

    • Meadows
    • 13 years ago

    Yeah, I think that’s going a bit far. I mean, there’s nothing naughty about what they’ve done here.

    • lucas1985
    • 13 years ago

    #50,
    I used the term “hacked” because they didn’t buy the “Pro” versions of CUDA (Quadro VGAs or Tesla boxes). They bought consumer/mainstream VGAs and used them to build a custom “cluster” of GPUs. I put “hacked” in quotes because CUDA is officially supported on certain consumer GPUs.
    Does it make sense now? Did I stretch the term “hack” too far?

    • continuum
    • 13 years ago

    They make the nForce 780a chipset for AMD, but at least from Asus I don’t see any 4x PCI-e x16 (physical) slot boards available. As already pointed out since they’re not actually using SLI or Crossfire the researchers didn’t care, they just needed the PCI-e x16 slots. =)

    • Meadows
    • 13 years ago

    Tell me how it’s “hacked”, even in quotes. I’m dying to hear.

    • lucas1985
    • 13 years ago

    #14,
    This is the cheap, self-built, “hacked” version of Tesla, much like modding a Geforce to Quadro in the past.
    A Tesla box is many times more expensive than 4 9800 GX2 cards. You don’t need to buy a Tesla box to get CUDA support.

    • Meadows
    • 13 years ago

    You’re confused. AMD’s high-end chipsets are made with 4 PCIe sockets (in this example it’s a particularly good board, having room for 4 dual-slot cards) for CrossfireX. But since neither SLI nor Crossfire interested these researchers, I’m betting an AMD chipset was a more solid option than taking a buggy poop from nVidia.

    Mistake me not, their graphics cards are brilliant, but that’s where it ends.

    • BobbinThreadbare
    • 13 years ago

    It’s probably only a matter of time before someone gets a Linux distro running on CUDA.

    • d0g_p00p
    • 13 years ago

    I guess I am confused. Does nVidia not make a Phenom chipset or is it only the 7XX series by AMD?

    • provoko
    • 13 years ago

    If that guy only had a pocket protector.

    • shank15217
    • 13 years ago

    But many computational tasks are heavily FP-intensive, or can be ported to an algorithm that is. All DSP applications are FP-intensive, and scientific modeling generally is too; in other words, most interesting problems and models fall in the FP domain of computation. That’s where computers have their primary role, and it’s what drives microprocessor manufacturers to produce faster and faster CPUs. GPGPUs are a real threat to CPUs, and that’s a good reason for AMD to merge technologies with ATI. It may have been ill-timed, but it’s the only way forward. In 10 years, CPUs will have specialized GPU-type hardware, if not sooner. SIMD instructions are not good enough and don’t provide the order-of-magnitude jumps GPGPUs do.
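    The workload the comment above describes — the same floating-point operation applied independently to every element of a large array — is what GPUs excel at. A minimal illustration (hypothetical, using NumPy’s vectorised CPU math as a stand-in for GPU threads) is the classic saxpy kernel:

```python
import numpy as np

def saxpy(a, x, y):
    """Return a*x + y elementwise -- the canonical data-parallel
    FP kernel. Each output element depends only on its matching
    inputs, so on a GPU the whole call could run as one thread
    per element; here NumPy just vectorises it on the CPU."""
    return a * x + y

# A million independent multiply-adds, no branching, no
# cross-element dependencies: ideal GPGPU territory.
x = np.linspace(0.0, 1.0, 1_000_000)
y = np.ones_like(x)
result = saxpy(2.0, x, y)
```

    SIMD units do the same thing a handful of lanes at a time; a GPU does it across thousands of threads at once, which is where the order-of-magnitude jumps come from.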

    • Meadows
    • 13 years ago

    Which means it’s a Crossfire motherboard.

    • d0g_p00p
    • 13 years ago

    That’s not a crossfire motherboard since it is using a Phenom CPU.

    • Fighterpilot
    • 13 years ago

    Actually I think it’s more likely the 206BW, judging by its size relative to the keyboard.
    @ #22 Hardly surprising that a 3D graphics card produces a 3D tomographic image much more efficiently than a CPU…

    • Anomymous Gerbil
    • 13 years ago

    Aah, but here’s another thing – many apps simply don’t need or use FP calcs.

    • Meadows
    • 13 years ago

    I realized that, but then one would think two G92 chips should perform better. That means my initial statements stand, it seems.

    • Silus
    • 13 years ago

    You seem to have gotten wrong what NVIDIA said about CPUs. They said that CPUs as we know them are dead, but they are not out to replace CPUs; they are set to promote the GPU as the most capable hardware component for many tasks that in the past were done only by CPUs. That is the logic behind NVIDIA’s “having a more powerful GPU is much more important than having a more powerful CPU”.

    Through CUDA and these sorts of applications, that much is becoming even more evident. We already know that for gaming, a much more powerful GPU paired with a mid-to-low-end CPU is a MUCH better choice than a powerful CPU paired with a mid-to-low-end graphics card. Now we are seeing examples of GPUs absolutely killing CPUs in many types of processing other than games. Games are also set to use the computational power of GPUs to do physics (Nurien is a good example). And besides tomography, you also have this:

    http://www.youtube.com/watch?v=8C_Pj1Ep4nw

    As more and more developers start using this, it’s only a matter of time until most of the current apps that mostly use CPUs start leaning on GPUs instead. As for raytracing, you don’t seem to remember everything. Intel was focused on raytracing, but they recently confirmed that Larrabee will focus on rasterization first, with ray tracing a distant second. NVIDIA, on the other hand, never said raytracing was not worth looking into, but they did say that focusing only on raytracing (as was Intel’s initial idea) was wrong and stupid. And it is, since real-time raytracing is still beyond current tech’s capabilities.

    • A_Pickle
    • 13 years ago

    Can I fire up Mozilla Firefox with JUST a GPU? Can I use Microsoft Office with JUST a GPU? No. Nvidia is on point when they say that CPUs have gotten fast enough these days that you can settle for a slightly lower-end CPU, get a better GPU, and have a superior computing experience…

    …but GPUs are still in no place to take on the CPU. Unless all you do is tomography. Now, in the past, computers in general started out as “useless” to the ordinary layman, valuable only to those interested in things like tomography. Maybe in 10 or 20 years, computers powered by processors that resemble today’s GPUs more than today’s CPUs will start cropping up; I wouldn’t doubt it.

    But if Nvidia thinks it’s got the upper hand because they make graphics cards, and proclaims the doom of Intel… I think… Jen-Hsun Huang has drunk WAY too much Kool-Aid. Like, WAY too much. I don’t think he has, though, because after a long-winded rant of “Raytracing is just ridiculous, and Intel is just silly!” Nvidia went and bought a raytracing firm. Hmm.

    • willyolio
    • 13 years ago

    Ah, but there’s the thing: how many applications can you make that utilize floating-point calculations almost exclusively?

    Currently, not that many, but as people learn to program for GPGPUs, the number (or more importantly, the percentage) of programs that run better on GPUs than on CPUs will rise.

    • SpikeMeister
    • 13 years ago

    The 226BW? I have that one too.

    • Krogoth
    • 13 years ago

    Except that board defeats the entire purpose of their project. 😉

    • Spotpuff
    • 13 years ago

    Wonder if it’s 80+ certified…

    • Steel
    • 13 years ago

    Only 4 cards? They need to think bigger.

    http://www.chassis-plans.com/single-board-computer/S6806-backplane.htm

    • Krogoth
    • 13 years ago

    Apples-and-oranges comparison at best.

    GPUs are certainly fast at floating-point operations that are heavily parallelized. They stumble at anything beyond that range.

    CPUs, on the other hand, may not be the fastest at any one set of operations, but can handle an assortment of operations without a problem.

    • Silus
    • 13 years ago

    A single GX2 works as it should on any motherboard. The point is that the 4 cards are not interacting through SLI, because the motherboard doesn’t support SLI. A single GX2 is essentially SLI on a card, but for two or more GX2s to work together you still need SLI support on the motherboard, which is not the case here. So the performance in Crysis on FASTRA would be similar to a setup with just one GX2.
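    Since each card crunches its share of the data independently, no SLI is involved: CUDA simply addresses the boards as separate devices and the host hands each one its own chunk. A rough stand-in sketch of that pattern (hypothetical — Python workers in place of CUDA device contexts, and `process_chunk` in place of a kernel launch):

```python
from concurrent.futures import ThreadPoolExecutor

NUM_DEVICES = 4  # one worker per GPU in a FASTRA-style box

def process_chunk(chunk):
    # Stand-in for a kernel launch on one device; here it just
    # sums the squares of its slice of the workload.
    return sum(v * v for v in chunk)

def run_on_all_devices(data):
    # Deal the workload out round-robin, one independent chunk per
    # device -- no inter-card communication, hence no need for SLI.
    chunks = [data[i::NUM_DEVICES] for i in range(NUM_DEVICES)]
    with ThreadPoolExecutor(max_workers=NUM_DEVICES) as pool:
        return sum(pool.map(process_chunk, chunks))

total = run_on_all_devices(list(range(10)))
```

    The final sum is the only point where the partial results meet, which is why a Crossfire board with four x16 slots works just as well as an SLI one for this job.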

    • Meadows
    • 13 years ago

    I thought single-card stuff could do it without the “secret sauce”; my bad. In light of this, it’s even worse. I bet they set the settings high for some reason.

    • Silus
    • 13 years ago

    Indeed. It certainly makes NVIDIA’s remarks on how a GPU is much more important than a CPU even more relevant.
    I’m sure that it’s quite embarrassing for Intel, to see 8 NVIDIA GPUs bitch-slapping 300+ of their CPUs…

    • Silus
    • 13 years ago

    Well, when someone in any country does something like this, they certainly deserve attention 🙂

    • ludi
    • 13 years ago

    And the REAL news is…somebody found a use for a 1500W power supply!

    • bfellow
    • 13 years ago

    Now who would have thought four 9800 GX2s on an ATI CrossfireX board would work and outperform over 300 CPU cores!

    • aleckermit
    • 13 years ago

    lol exactly my thoughts

    • h22chen
    • 13 years ago

    Fascinating, adds fuel to the flames of war between Nvidia and Intel!

    • Usacomp2k3
    • 13 years ago

    Umm. It’s a crossfire board. They can’t enable SLI.

    • Fragnificent
    • 13 years ago

    Dude I went to your country. It f*cking RULED HARD. Best beer I have EVER had ever, best chocolate ever, and your women are absolutely drop dead gorgeous. I will be returning to that part of Europe as soon as I have some money. Belgium PWNED me.

    • firestorm02
    • 13 years ago

    Don’t forget it’s also the home of Spa-Francorchamps and the fabled Eau Rouge.

    • crazybus
    • 13 years ago

    And beer… how could you forget that? Not to mention waffles, and of course Hergé.

    • dmitriylm
    • 13 years ago

    So you guys are good for something other than chocolate.

    • Meadows
    • 13 years ago

    Indeed, they don’t know how to set up crap. They could use a single card out of the 4, which would be OK, but even then they’d need to set up SLI properly to use 2 GPUs. I doubt those people know the first thing about gaming or gaming hardware.

    • zimpdagreene
    • 13 years ago

    Nice! I sure could use an app for video encoding.


    • Spotpuff
    • 13 years ago

    Weird I didn’t even know they made those; looks like it’s the AM2 7092A or whatever; it’s the only one with the slots in a configuration that would support 4 dual slot video cards.

    • shank15217
    • 13 years ago

    Windows XP 64, an AMD processor and motherboard… who would have thought, right??

    • ecalmosthuman
    • 13 years ago

    Hahahahaha look how shitty the framerate was in Crysis on that thing!

    • Spotpuff
    • 13 years ago

    Where the heck did they get a motherboard with 4 16x PCI-E slots?

    Also, that cabling job is great. Neat cabling always excites me in ways it probably shouldn’t.

    http://fastra.ua.ac.be/images/pic_internal.jpg

    Seriously, that’s erotic.

    • bdwilcox
    • 13 years ago

    That guy is a walking Dilbert character.

    • Meadows
    • 13 years ago

    T’was in the shortbread.

    • Voldenuit
    • 13 years ago

    Awesomely cool stuff. Props to Cyril for finding the link.

    • titan
    • 13 years ago

    That video was actually kind of funny. My girlfriend said that the host needs to go on Beauty and the Geek.

    It’s very awesome that they got such inexpensive hardware to outperform a larger and more expensive setup. Doctors will be able to see what’s going on in our bodies Firefly-style.

    • Usacomp2k3
    • 13 years ago

    What do you think CT scan stands for? Computed Tomography.

    • Silus
    • 13 years ago

    Not really. It would perform like a single 9800 GX2. They are using CUDA to harness the power of all 4 graphics cards. It’s not SLI, which is even more clear since that’s a Crossfire motherboard.

    • Fighterpilot
    • 13 years ago

    Good to see them using my monitor 🙂
    and yes…it probably will run Crysis at very high settings.
    oops almost forgot…Flip Mode will probably be on point when he posts too.

    • Dposcorp
    • 13 years ago

    Very cool. Thanks.
