Since the DDR2 and GDDR3 versions of the GeForce FX 5700 Ultra share the same 475MHz core clock and differ in memory clock by only 44MHz, the cards should have roughly the same pixel-pushing power. However, just to be sure, I've run a couple of benchmarks to confirm that the two cards produce roughly equivalent frame rates. Since we've covered the GeForce FX 5700 Ultra's performance extensively in other articles, I've limited our test suite to focus on performance differences between the two 5700 Ultras.
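To put that 44MHz gap in perspective, here's a quick back-of-the-envelope bandwidth calculation. The 44MHz figure comes from the cards themselves; the absolute effective memory clocks (906MHz for the DDR2 card, 950MHz for the GDDR3 card) and the 128-bit memory bus width are assumptions for illustration:

```python
# Peak theoretical memory bandwidth for the two 5700 Ultras.
# Assumed figures: 906MHz/950MHz effective clocks, 128-bit bus.
BUS_WIDTH_BITS = 128

def bandwidth_gbs(effective_clock_mhz, bus_width_bits=BUS_WIDTH_BITS):
    """Peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

ddr2 = bandwidth_gbs(906)   # ~14.5 GB/s
gddr3 = bandwidth_gbs(950)  # ~15.2 GB/s
print(f"DDR2: {ddr2:.1f} GB/s, GDDR3: {gddr3:.1f} GB/s "
      f"({(gddr3 / ddr2 - 1) * 100:.1f}% difference)")
```

A difference of under five percent in peak bandwidth rarely translates into a visible frame-rate gap, which is why we don't expect the 3D tests to separate the two cards by much.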
While it's unlikely that our 3D performance tests will show much difference between DDR2 and GDDR3 versions of the 5700 Ultra, I've also thrown in a couple of power consumption tests that should highlight one of GDDR3's big selling points.
Our testing methods
All tests were run three times, and their results were averaged, using the following test system.
|Processor||Athlon 64 3200+ 2.0GHz|
|Front-side bus||HT 16-bit/800MHz downstream, HT 16-bit/800MHz upstream|
|North bridge||VIA K8T800|
|South bridge||VIA VT8237|
|Chipset driver||Hyperion 4.51|
|Memory size||512MB (1 DIMM)|
|Memory type||Corsair XMS3500 PC3500 DDR SDRAM|
|Graphics||NVIDIA GeForce FX 5700 Ultra 128MB, NVIDIA GeForce FX 5700 Ultra 128MB with GDDR3|
|Graphics driver||ForceWare 56.64|
|Hard drive||Maxtor 740X-6L 40GB 7200RPM ATA/133|
|Operating system||Windows XP Professional with Service Pack 1 and DirectX 9.0b|
We used the following versions of our test applications:
The test system's Windows desktop was set at 1024x768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.