In other words, I'm really cheap.
I had other reasons to resist, too, including the fact that LCDs have traditionally been slower and had less color contrast than the best CRTs. Even more importantly, multiscan CRTs are capable of displaying multiple resolutions more or less optimally, while LCDs have a single, fixed native resolution.
However, LCD prices continue to plummet, and I recently found myself staring at a price of just under $1200 for a 3007WFP. That isn't exactly cheap, but then we're talking about reviewing a graphics subsystem that costs roughly the same amount. To mate it with a lesser display would be, well, unbalanced.
And the progress LCDs have made over CRTs has been tremendous lately. Not only have they made huge strides in color clarity and reached more-or-less acceptable response times, but they have also simply eclipsed CRTs in terms of resolution. In fact, every single common PC display resolution is a subset of the 3007WFP's native 2560x1600 grid, so the 3007WFP can show lower resolutions in pixel-perfect 1:1 clarity as a box in the middle of the screen, if scaling is disabled.
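That subset claim is easy to verify with a little arithmetic. Here's a quick sketch (my own check, not from the article) confirming that the common PC resolutions fit inside the 3007WFP's 2560x1600 native grid, which is what makes unscaled 1:1 display possible:

```python
from math import gcd

# Common 4:3 and 16:10 PC display resolutions of the era
COMMON_RESOLUTIONS = [
    (800, 600), (1024, 768), (1280, 1024), (1600, 1200),
    (1680, 1050), (1920, 1200), (2048, 1536),
]
NATIVE = (2560, 1600)  # the 3007WFP's native grid

def fits_unscaled(res, native=NATIVE):
    """True if the resolution maps 1:1 onto the native grid
    (i.e., it fits entirely within the panel's pixel array)."""
    return res[0] <= native[0] and res[1] <= native[1]

for w, h in COMMON_RESOLUTIONS:
    g = gcd(w, h)
    status = "fits 1:1" if fits_unscaled((w, h)) else "needs scaling"
    print(f"{w}x{h} ({w//g}:{h//g}): {status}")
```

Every resolution in the list passes the check, so with scaling disabled each one can be drawn pixel-for-pixel in a centered box.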
That realization pretty much quelled any graphics-geek or display-purist objections I had to using an LCD for graphics testing, and I was at last dragged into the 21st century with the rest of the world: $1200 poorer, but with one really sweet monitor on the GPU test bench.
Make no mistake: the 3007WFP is worth every penny. It's that good. Once you see it in person, sitting on your desk, it will force you to reevaluate your opinions on a host of issues, including the need for multiple monitors, the aspect ratio debate, the prospects for desktop computers versus laptops ('tis bright!), and whether your eight-megapixel digital photos are really sharp enough.
Moreover, the 3007WFP (or something like it) really is necessary to take advantage of two GeForce 8800 cards in SLI in the vast majority of current games. Prior to the 3007WFP, our max resolution for testing was 2048x1536 on a 21" CRT. In our initial GeForce 8800 review, we found that the GeForce 8800 GTX could run the gorgeous and graphically intensive The Elder Scrolls IV: Oblivion at this resolution with the game's Ultra High Quality presets, 4X AA with transparency supersampling, and high-quality 16X anisotropic filtering at very acceptable frame rates. The GTX hit an average of 54 FPS and a low of 41 FPS, while the GTS averaged 37 FPS and bottomed out at 27 FPS. The GTX even achieved a very playable 45 FPS with 16X CSAA added to the mix. Who needs a second GPU when a single graphics card can crank out visuals like that at playable frame rates? Going for 8800 SLI doesn't make sense in the context of a more conventional display, at least not with today's game titles.
Given that fact, I had hoped to conduct testing for this article in a trio of 16:10 resolutions: 1680x1050, 1920x1200, and 2560x1600. That would have given us resolutions of 1.8, 2.3, and 4.1 megapixels, all with the same aspect ratio. I have my doubts about whether the shape of the viewport will impact performance in any noticeable way, but I wanted to give it a shot. However, I ran into problems with both games and video drivers supporting wide-aspect resolutions consistently, so I had to revert to another option, testing at 1600x1200, 2048x1536, and 2560x1600. That gives us a tidy look at performance scaling at two, three, and four megapixels.
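The megapixel figures above are straightforward pixel-count arithmetic. A short sketch (again my own, just restating the article's numbers) shows both resolution sets side by side, with pixel counts and aspect ratios:

```python
from math import gcd

def megapixels(w, h):
    """Total pixel count in megapixels (millions of pixels)."""
    return w * h / 1_000_000

# The 16:10 set I had hoped to use, and the set actually tested
wide_set = [(1680, 1050), (1920, 1200), (2560, 1600)]
tested_set = [(1600, 1200), (2048, 1536), (2560, 1600)]

for label, resolutions in [("planned", wide_set), ("tested", tested_set)]:
    for w, h in resolutions:
        g = gcd(w, h)
        print(f"{label}: {w}x{h} = {megapixels(w, h):.2f} MP, {w//g}:{h//g}")
```

The tested set works out to roughly 1.9, 3.1, and 4.1 megapixels, so "two, three, and four megapixels" is a fair round-number summary, though the steps aren't perfectly even.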
Insects crawl out from under the multi-GPU woodwork
While we're on the subject of expensive things that are nice to have, we should take a moment to note some of the problems you're buying into if you go this route. I can't tell you how many times Nvidia and ATI have extolled the virtues of running "extreme HD gaming" involving high resolutions, wide aspect ratios, and multiple GPUs in the past couple of years. Too many to count. Yet when we took the plunge and went head-on into this world, we ran into unexpectedly frequent problems.
Of course, you're probably already aware that multi-GPU support in games is spotty, because it typically requires a profile in the video card driver or special allowances in the game code itself. On top of that, not all games scale well with multiple GPUs, because performance scaling depends on whether the game is compatible with one of the various possible load-balancing methods. Nvidia and ATI have worked to encourage game developers to make their applications compatible with the best methods, such as alternate-frame rendering, but top new games still may not work well with SLI or CrossFire. This drawback is largely accepted as a given for multi-GPU configs today.
However, we ran into a number of other problems with wide-aspect multi-GPU gaming during our testing with SLI and CrossFire, including the following:
The point remains that support for widescreen gaming at very high resolutions with multiple GPUs doesn't appear to be a high priority for either Nvidia or ATI. Perhaps that's understandable given the impending debut of Windows Vista and DirectX 10, which will require new drivers for all of their GPUs. Still, those who venture into this territory can expect to encounter problems. These issues are probably more plentiful with the GeForce 8800 because it's still quite green. If you're planning on laying out over $1100 on a pair of graphics cards, you might be expecting a rock-solid user experience. Based on what I've seen, you should expect some growing pains instead.
Now that I've wrecked any desire you had to build an SLI gaming rig, let's repair that a little by looking at some performance numbers.