Personal computing discussed


Is GPU-based distributed computing a good idea?

YES
12 (75%)
NO
4 (25%)
 
Total votes: 16
 
mac_h8r1
Minister of Gerbil Affairs
Topic Author
Posts: 2974
Joined: Tue Sep 24, 2002 6:57 pm
Location: Somewhere in the Cloud
Contact:

GPU-based distributed computing....

Fri Dec 20, 2002 1:38 am

The chips on our video cards have as many as three times as many transistors as our CPUs, and most of the time they aren't being used. Unless we're running a 3D game or some rendering program, those 80+ million transistors sit idle, yet the FAN STILL GOES!!!

What if nVidia and ATI (and whoever else feels like it) added a few hundred transistors to let the computer pump data for analysis through the GPU? It could obviously be software-based, so users could disable it whenever they wanted nothing processed, and, like Prime95, SETI@home, FAH, United Devices, and all the others, it would run at a low [or idle, depending on the OS] priority. Intel, AMD, SiS, VIA, ATI, and nVidia could easily come up with a standard to enable this.
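The low-priority part is already easy to do in software today. Here's a minimal sketch of an idle-priority crunching loop (Python, Unix-style priorities; `crunch` is a made-up stand-in for a real work unit):

```python
import os

def crunch(n):
    """Stand-in for a distributed-computing work unit:
    sum the first n squares."""
    return sum(i * i for i in range(n))

def run_idle(n):
    # Drop to the lowest scheduling priority so the client only
    # consumes CPU cycles that nothing else wants (Unix niceness).
    try:
        os.nice(19)
    except (AttributeError, OSError):
        pass  # e.g. on Windows, where os.nice doesn't exist
    return crunch(n)
```

A real client like SETI@home does the same thing at the OS level: the scheduler hands it cycles only when every normal-priority task is idle.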

I'm sure it would not be too hard to implement. Think about games today. Take Counter-Strike, for example. It supports ATI's TRUFORM technology, but if the card doesn't have the capability, the game works just as well without it. The Radeon 8500 introduced SMOOTHVISION, but Counter-Strike doesn't support it, so it simply isn't used. No repercussions there either. GPU-based distributed computing should be simple to add to future chips, and the software shouldn't be too hard to include in operating systems or in the driver set provided with the chip.

What do you people think about this idea? I'd like to know the popular opinion. Feedback from chip-making people would be great.
 
DiMaestro
Gerbil Elite
Posts: 890
Joined: Wed Dec 26, 2001 7:00 pm
Location: North Dakota NoMoah!

Fri Dec 20, 2002 1:58 am

A 'GPU' is not a CPU. CPUs are able to perform a much wider, more varied range of tasks than a 'GPU'. Sure, a video card has more transistors, but they are geared to one thing, and one thing only.

No, it's neither a viable nor a realistically doable idea.
 
HowardDrake
Grand Gerbil Poohbah
Posts: 3523
Joined: Thu Dec 27, 2001 7:00 pm
Location: Action Jim's Rumpus Room
Contact:

Fri Dec 20, 2002 4:00 am

I don't know, if you can use a PS2 for scientific research, why not this?
No wonder television's a medium. It's so seldom rare or well done. -Mighty Mouse
 
b3n113
Minister of Gerbil Affairs
Posts: 2519
Joined: Sun May 12, 2002 3:00 am

Fri Dec 20, 2002 4:09 am

(PS2) The main processor is an off-the-shelf MIPS CPU, but everything else is custom, including two esoteric vector processors previously available only in high-end mainframe computers. Cray supercomputers were using gigahertz-speed vector CPUs as early as 1980, but until the PS2 they were never released in any consumer-level device, and there are very few people with the know-how to program one.
/rip

So it's not like the PS2 only has a GPU, if you want to call it that. It's a straight-up PC... of sorts. Add the Linux kit and it is.
 
mac_h8r1
Minister of Gerbil Affairs
Topic Author
Posts: 2974
Joined: Tue Sep 24, 2002 6:57 pm
Location: Somewhere in the Cloud
Contact:

Fri Dec 20, 2002 12:13 pm

While graphics processors are currently specialized for graphics, the fact is that they PROCESS DATA. They take a formula and apply it to a specific set of positions, coordinates, and other dimensions. It is still raw data, albeit enhanced raw data, until another chip [the RAMDAC] converts it to a standard output format [DVI-x, xVGA, composite, component, S-Video].

I believe that with a few [true, it could be closer to a few thousand] additional transistors on the GPU, and an additional chip nearby, the GPU would be able to process general data and send it to that extra chip, which would send it all back through the AGP bus to the processor. Don't forget that every quality chip since the GeForce3 is "programmable", meaning that it can be made, through software, to handle different instructions, or other forms of data.
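That "programmable" part is really the whole idea: encode your data as if it were pixels, then let the chip apply one small program to every element in parallel. A conceptual sketch of the data flow (plain Python, run serially here; the function names are just illustrative):

```python
def shade(pixel):
    """A 'pixel shader' in the loosest sense: one small formula
    applied uniformly to a single data element."""
    x, y = pixel
    return x * x + y * y  # e.g. squared distance from the origin

def gpu_style_pass(data):
    # A GPU would run shade() on every element simultaneously;
    # mapping it serially shows the same data flow.
    return [shade(p) for p in data]
```

The key property is that each element is computed independently of every other one, which is exactly what lets the hardware throw hundreds of parallel units at it.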

With the number of transistors on chips now, even adding a few hundred thousand new gates would only bump the price of a chip up by about 10 cents. Plus, this would enable the CPU to run more advanced diagnostics on the video card.
 
pattouk2001
Gerbil Jedi
Posts: 1903
Joined: Thu May 30, 2002 10:44 am
Location: Birmingham, UK.
Contact:

PS2=PC

Fri Dec 20, 2002 1:53 pm

HowardDrake wrote: I don't know, if you can use a PS2 for scientific research, why not this?


A PS2 has a 300MHz main CPU, known as the Emotion Engine, and a 143MHz GPU, so the same principle applies to the games console as well as the PC. The PS2 doesn't process data using purely its GPU; it uses its 300MHz CPU for that, while the 143MHz GPU handles the graphics processing, much like a PC. For example, in an Athlon XP 2000+ / GF4 Ti4200 combo, the Athlon XP provides the general processing power that pushes the FPS, while the graphics processing falls to the GPU.
 
Dissonance
Gerbil Elite
Posts: 535
Joined: Wed Dec 26, 2001 7:00 pm
Location: Vancouver, BC
Contact:

Fri Dec 20, 2002 2:00 pm

mac_h8r1: There's no real motivation for a graphics chip designer like ATI, NVIDIA, or others to add even just a few transistors to a GPU to make it more capable of doing processing for a distributed computing client. I'm not so sure it would be that simple, either.

Yes, a GPU performs calculations and whatnot using the transistors it already has. If you wanted to harness a GPU's power you'd likely need software rather than hardware. You'd also need a distributed computing project whose processing requirements could reasonably be tackled by the extremely specialized nature of current GPUs.
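That last point is the crux: the workload has to break into independent, identical pieces before a GPU (or any distributed project) can touch it. A sketch of that "embarrassingly parallel" shape (Python, hypothetical helper names):

```python
def split_work(data, n_chunks):
    """Split a dataset into independent work units -- the shape a
    distributed (or GPU-friendly) project needs: no unit depends
    on another unit's result."""
    k, m = divmod(len(data), n_chunks)
    chunks, start = [], 0
    for i in range(n_chunks):
        size = k + (1 if i < m else 0)
        chunks.append(data[start:start + size])
        start += size
    return chunks

def merge_results(partials):
    # Combine per-unit partial results; order doesn't matter
    # precisely because the units were independent.
    return sum(partials)
```

A branchy, sequential algorithm doesn't decompose this way, which is why only some projects' math would fit a GPU at all.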

It's a nice idea, in theory, but I wouldn't expect to see any distributed computing projects really using GPU power in the near future. That said, I wouldn't mind seeing a DC project leverage a GPU for some neat screensaver effects *if* they could do it without slowing down the number crunching.
