Thoughts/rumors on GF4

From the pixels, bits, and shaders to the graphic cards that power them. Discuss the latest from AMD and NVIDIA here.

Moderators: morphine, SecretSquirrel

Posted on Sat Dec 29, 2001 1:17 am

I've been hearing the usual rumors without factual backing: fifty different clock speeds, 50GB/s memory bandwidth, etc., etc.
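For what it's worth, here's a quick sanity check on that 50GB/s figure. It assumes the NV25 keeps a 128-bit DDR memory bus like the GF3, which is purely my guess:

```python
# Sanity check on the rumored 50 GB/s memory bandwidth figure.
# Assumption (unconfirmed): NV25 keeps a 128-bit DDR bus like the GF3.

def bandwidth_gbps(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s for a given bus width and effective clock."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# GF3 Ti500 for comparison: 128-bit bus, 500 MHz effective DDR
print(bandwidth_gbps(128, 500))   # 8.0 GB/s

# Even a generous 650 MHz effective DDR only gets:
print(bandwidth_gbps(128, 650))   # 10.4 GB/s -- nowhere near 50 GB/s
```

Hitting 50 GB/s on a 128-bit bus would take an effective memory clock over 3 GHz, which is why I file that rumor under "without factual backing."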

The only solid info I've seen was from a readme in Comanche 4, which accidentally let slip that there will be fps-hit-free (or nearly so) FSAA on NV25. I've also heard fairly reliable reports that hardware-accelerated lines and hardware line AA will be back in NV25GL, whereas they're missing from the Quadro DCC.

Anybody else have any good tidbits?
Forge
Lord High Gerbil
Silver subscriber
 
 
Posts: 8061
Joined: Wed Dec 26, 2001 7:00 pm
Location: SouthEast PA

Posted on Sat Dec 29, 2001 11:30 am

I'd have to say these two are more than likely going to appear:

second vertex shader pipeline
DVD decoding on the hardware level
Coldfirex
Graphmaster Gerbil
 
Posts: 1109
Joined: Wed Dec 26, 2001 7:00 pm
Location: College Station, TX

Posted on Sat Dec 29, 2001 5:09 pm

According to DigiTimes, we should be seeing the NV25 fairly soon. They say it should go into mass production in February and be widely available by March.

Mmmm... Doom3 on the NV25 and R300. Pretty colors. =0)


DxGuilio
Guilio
Gerbil In Training
 
Posts: 3
Joined: Fri Dec 28, 2001 7:00 pm
Location: bliss

Posted on Sun Dec 30, 2001 1:36 am

Doom3 on the GF4/R300 = same colors as GF3/R200, but with a faster frame rate. The current version of the Doom3 engine was designed for the GF3, and I am betting they won't implement too many features used in the GF4/R300. I'm keeping my GF3, which I believe I wasted my money on, because only Aquanox does anything with it, and that's not even a good game.
_____________________________________________
Dmitriy Markelov
1Ghz Tbird
256Mb
ECS K7VZA
GF3 Ti200
Aureal Vortex2
Win98/2K
40Gb Maxtor HD
16/10/40A Plextor
40X Samsung (cdrom)
dmitriylm
Graphmaster Gerbil
 
Posts: 1054
Joined: Thu Dec 27, 2001 7:00 pm
Location: Bay Area/Cali

Posted on Sun Dec 30, 2001 3:41 am

Hit free FSAA relative to what? Antialiasing is always going to require much more calculation than standard rendering. Is this some kind of magic chip that just happens to squeak out more IPC simply because its AA switch is flipped on? Color me skeptical.
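To put some rough numbers behind that, here's the back-of-the-envelope fill cost of 4x supersampling. The resolution, overdraw, and fillrate figures below are purely illustrative, not NV25 specs:

```python
# Rough fill-cost math behind "AA always costs more".
# Assumptions (illustrative only): 1024x768, ~3x overdraw, and a
# hypothetical 800 Mpixel/s effective fillrate.

width, height = 1024, 768
overdraw = 3
fillrate = 800e6  # pixels per second, hypothetical

pixels_per_frame = width * height * overdraw        # ~2.36M pixels drawn per frame
fps_no_aa = fillrate / pixels_per_frame             # fill-limited fps, no AA
fps_4x_ssaa = fillrate / (pixels_per_frame * 4)     # 4x supersampling = 4x the samples

print(round(fps_no_aa), round(fps_4x_ssaa))         # 339 85
```

Supersampling draws four samples for every final pixel, so the fill-limited frame rate divides by four. The only way around that math is to not shade every sample, which is a different technique, not a free lunch.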

The real specs are more or less the same as the current ones -- the memory may be faster, but fillrate will be more or less the same, and the pixels it does draw will look more purty...
champs
Gerbil First Class
 
Posts: 194
Joined: Wed Dec 26, 2001 7:00 pm
Location: PDX

Posted on Sun Dec 30, 2001 4:34 am

"Doom3 on the GF4/R300=same colors as GF3/R200 but with a faster frame rate."

Yeah, I know. It was an expression of my anticipation for the new cards that are coming out.

Duh, the new cards will have faster framerates, but I am sure there will be quite a few features in Doom3 that the NV25 will be able to use that the GF3 will not. I believe Carmack said the GF3 will run Doom3 at 30-40 fps with the detail settings at mid range. Obviously newer cards with better optimizations and more advanced features will perform and look better.

"Im keeping my GF3 which I believe I wasted my money on, because only Aquanox does anything with it, and thats not even a good game.."

That’s why I didn't buy one.


Let's hope that Nvidia can learn from the Kyro (with its tile rendering) and make use of that GigaPixel technology they have sitting around from acquiring 3dfx.


DxGuilio
Guilio
Gerbil In Training
 
Posts: 3
Joined: Fri Dec 28, 2001 7:00 pm
Location: bliss

Posted on Sun Dec 30, 2001 6:28 am

Dronez is much better than Aquanox, IMHO. Neither is very good though.
Siglessness is boring.
Forge
Lord High Gerbil
Silver subscriber
 
 
Posts: 8061
Joined: Wed Dec 26, 2001 7:00 pm
Location: SouthEast PA

Posted on Sun Jan 13, 2002 7:00 pm

Okay, I'm now going to cry because my system has a puny 2D graphics card that can't even do higher than 1024x768, and has all the pixel pushing power of a dead snail :smile:

Also, my monitor will refuse to do 1024x768 at anything higher than 60Hz... just so you know, if you are more than 6 inches away from my screen at school you won't be able to see it; I have to have it that dark or it gives me headaches. They run at 60Hz too :smile:

Saving desperately, and hopelessly :smile:,
IntelMole
IntelMole
Grand Gerbil Poohbah
 
Posts: 3529
Joined: Sat Dec 29, 2001 7:00 pm
Location: The nearest pub

Posted on Mon Jan 14, 2002 12:45 am

I think I should be able to whip up a GF4 preview after the volt mod Tuesday. 275/600 looks like the shipping speed, and the second Vertex unit shouldn't affect benches much, I think. Stay tuned.
Forge
Lord High Gerbil
Silver subscriber
 
 
Posts: 8061
Joined: Wed Dec 26, 2001 7:00 pm
Location: SouthEast PA

Posted on Mon Jan 14, 2002 8:34 am

Are you saying that you know for certain that the only difference between a GF3 and a GF4 is an extra vertex unit and a clock speed goose to the core and the memory?
Despite
Gerbil XP
 
Posts: 496
Joined: Thu Dec 27, 2001 7:00 pm
Location: Oklahoma

Posted on Mon Jan 14, 2002 11:11 am

The Xbox's GPU is essentially the NV25. I think they are too busy working on the NV30 to make another adjustment to the architecture.

So yes, it's just the extra vertex shader and a higher clock due to a better manufacturing process.
K-Wulf
Gerbil First Class
 
Posts: 120
Joined: Thu Dec 27, 2001 7:00 pm
Location: The Netherlands

Posted on Mon Jan 14, 2002 11:34 pm

Just some thoughts on Nvidia's business strategy. They released the GF3 not too terribly long ago. The early adopters paid $350 for a blazingly fast video card, so why would they want to pay that same amount again to make Quake run at 300 fps instead of "just" 170? As for those who are just now buying a GF3/Ti200/Ti500, why would they suddenly want to buy an expensive new GF4?

All I ever heard about was how the GF3 was entirely revolutionary. Can the GF4 be so much more? There isn't even a new version of DirectX to be fully compliant with (save for minor upgrades to the vertex and pixel shaders). It just seems to me that Nvidia is flooding the market with video cards that, while certainly (or simply marginally) better than the previous generation, are unnecessary and will only serve to limit their sales. Now, I know all about their dedication to a 6-month product cycle. But, the home console market thrives on a 5-year cycle. Just look at the revenue for home consoles. They are just as big an industry as movies. Why, then, would Nvidia find it necessary to marginally improve a product every six months?

Anyone out there, please enlighten me.
Narf007
Gerbil In Training
 
Posts: 3
Joined: Sun Jan 13, 2002 7:00 pm
Location: GA Tech

Posted on Sun Jan 20, 2002 9:35 pm

Heh, take a look-see over at X-Bit.
All will become clear :smile:
K-Wulf
Gerbil First Class
 
Posts: 120
Joined: Thu Dec 27, 2001 7:00 pm
Location: The Netherlands

Posted on Mon Jan 21, 2002 7:08 pm

What was posted at X-Bit? It's already been removed.
Coldfirex
Graphmaster Gerbil
 
Posts: 1109
Joined: Wed Dec 26, 2001 7:00 pm
Location: College Station, TX

Posted on Mon Jan 21, 2002 7:50 pm

Damn!

Let's see if I can find it anywhere.
I don't think I can, though :sad:

I'll start a new thread with the info I can remember.
K-Wulf
Gerbil First Class
 
Posts: 120
Joined: Thu Dec 27, 2001 7:00 pm
Location: The Netherlands

