One display to rule them all


1600x1200 is a subset of the 3007WFP's max resolution

Ok, so I resisted jumping on the mega-display bandwagon when ATI and Nvidia first started talking up "Extreme HD gaming" and the like. I first saw the Dell UltraSharp 3007WFP at CES last year, and I thought, "Hey, that's gorgeous, but I'd rather have a used car for that kind of money," and went on my merry way. I didn't really consider the likes of this monitor or Apple's big Cinema Displays as a live option for most folks, so grabbing one for video card testing seemed like a stretch.

In other words, I'm really cheap.

I had other reasons to resist, too, including the fact that LCDs have traditionally been slower to respond and had poorer color and contrast than the best CRTs. Even more importantly, multiscan CRTs are capable of displaying multiple resolutions more or less optimally, while LCDs have a single, fixed native resolution.

However, LCD prices continue to plummet, and I recently found myself staring at a price of just under $1200 for a 3007WFP. That isn't exactly cheap, but then we're talking about reviewing a graphics subsystem that costs roughly the same amount. To mate it with a lesser display would be, well, unbalanced.

And the progress LCDs have made over CRTs has been tremendous lately. Not only have they made huge strides in color clarity and reached more-or-less acceptable response times, but they have also simply eclipsed CRTs in terms of resolution. In fact, every single common PC display resolution is a subset of the 3007WFP's native 2560x1600 grid, so the 3007WFP can show lower resolutions in pixel-perfect 1:1 clarity as a box in the middle of the screen, if scaling is disabled.
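The subset claim is easy to verify with a little arithmetic. Here's a quick sketch of my own (the mode list is abbreviated and illustrative, not exhaustive): every common resolution fits inside the 2560x1600 grid, so with scaling disabled each can be shown 1:1 as a centered box.

```python
# Hypothetical sanity check: confirm common modes fit inside the
# 3007WFP's native grid and compute where a centered 1:1 box would sit.
NATIVE = (2560, 1600)

COMMON_MODES = [
    (640, 480), (1024, 768), (1280, 1024), (1600, 1200),
    (1680, 1050), (1920, 1080), (1920, 1200), (2048, 1536),
]

def centered_box(mode, native=NATIVE):
    """Return the top-left offset of a 1:1 centered box, or None if it doesn't fit."""
    w, h = mode
    nw, nh = native
    if w > nw or h > nh:
        return None
    return ((nw - w) // 2, (nh - h) // 2)

for mode in COMMON_MODES:
    offset = centered_box(mode)
    assert offset is not None, f"{mode} exceeds the native grid"
    print(f"{mode[0]}x{mode[1]} -> 1:1 box at offset {offset}")
```

For example, a 1600x1200 game lands as a centered box offset (480, 200) from the panel's top-left corner, with black borders filling the rest.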

That realization pretty much quelled any graphics-geek or display-purist objections I had to using an LCD for graphics testing, and I was at last dragged into the 21st century with the rest of the world—$1200 poorer, but with one really sweet monitor on the GPU test bench.

Make no mistake: the 3007WFP is worth every penny. It's that good. Once you see it in person, sitting on your desk, it will force you to reevaluate your opinions on a host of issues, including the need for multiple monitors, the aspect ratio debate, the prospects for desktop computers versus laptops ('tis bright!), and whether your eight-megapixel digital photos are really sharp enough.

Moreover, the 3007WFP (or something like it) really is necessary to take advantage of two GeForce 8800 cards in SLI in the vast majority of current games. Prior to the 3007WFP, our max resolution for testing was 2048x1536 on a 21" CRT. In our initial GeForce 8800 review, we found that the GeForce 8800 GTX could run the gorgeous and graphically intensive The Elder Scrolls IV: Oblivion at this resolution with the game's Ultra High Quality presets, 4X AA with transparency supersampling, and high quality 16X anisotropic filtering at very acceptable frame rates. The GTX hit an average of 54 FPS and a low of 41 FPS, while the GTS averaged 37 FPS and bottomed out at 27 FPS. The GTX even achieved a very playable 45 FPS with 16X CSAA added to the mix. Who needs a second GPU when a single graphics card can crank out visuals like that at playable frame rates? Going for 8800 SLI doesn't make sense in the context of a more conventional display, at least not with today's game titles.

Width   Height   Mpixels
 640     480      0.3
 720     480      0.3
1024     768      0.8
1280     720      0.9
1280     960      1.2
1280    1024      1.3
1400    1050      1.5
1680    1050      1.8
1600    1200      1.9
1920    1080      2.1
1920    1200      2.3
1920    1440      2.8
2048    1536      3.1
2560    1600      4.1
The 3007WFP's 2560x1600 resolution, though, raises the ante substantially. The table above shows a number of common PC and TV display resolutions along with their pixel counts. As you can see, four megapixels is a class unto itself—well above the three megapixels of our previous max, 2048x1536, or the more common two-megapixel modes like 1600x1200 or 1920x1080. The screen's 16:10 (or 8:5, if you're picky) aspect ratio also mirrors the shape of things to come, with the growing popularity of wide-aspect displays in everything from laptops to desktops to HDTVs. In fact, our recent poll suggested 45% of our readers already use a wide-aspect 16:9 or 16:10 primary display with their PCs.
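For the curious, the aspect-ratio reductions above are just a matter of dividing out the greatest common divisor. A small illustration of my own (not from the review's data):

```python
from math import gcd

# Reduce a WxH mode to its simplest aspect ratio, e.g. 2560x1600 -> 8:5,
# the same shape usually marketed as 16:10.
def aspect(w, h):
    """Return (a, b) such that w:h reduces to a:b."""
    g = gcd(w, h)
    return (w // g, h // g)

for w, h in [(2560, 1600), (1920, 1200), (1920, 1080), (1600, 1200)]:
    a, b = aspect(w, h)
    print(f"{w}x{h} -> {a}:{b}, {w * h / 1e6:.1f} Mpixels")
```

Note that 2560x1600 and 1920x1200 reduce to the same 8:5 shape, while 1920x1080 comes out to 16:9 and 1600x1200 to the old 4:3.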

Given that fact, I had hoped to conduct testing for this article in a trio of 16:10 resolutions: 1680x1050, 1920x1200, and 2560x1600. That would have given us resolutions of 1.8, 2.3, and 4.1 megapixels, all with the same aspect ratio. I have my doubts about whether the shape of the viewport will impact performance in any noticeable way, but I wanted to give it a shot. However, I ran into problems with both games and video drivers supporting wide-aspect resolutions consistently, so I had to revert to another option, testing at 1600x1200, 2048x1536, and 2560x1600. That gives us a tidy look at performance scaling at two, three, and four megapixels.
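The pixel-count ratios behind that "two, three, and four megapixels" framing are worth spelling out, since pixel count is a rough proxy for fill-rate demand. This is back-of-the-envelope arithmetic of my own, not benchmark data; real frame rates also hinge on shader load, memory bandwidth, and CPU limits:

```python
# Rough sketch: how much extra pixel-pushing each test resolution
# demands relative to the 1600x1200 baseline.
def pixel_ratio(mode, base=(1600, 1200)):
    """Pixel count of `mode` relative to the base resolution."""
    return (mode[0] * mode[1]) / (base[0] * base[1])

for mode in [(1600, 1200), (2048, 1536), (2560, 1600)]:
    print(f"{mode[0]}x{mode[1]}: {pixel_ratio(mode):.2f}x the pixels of 1600x1200")
```

In other words, 2560x1600 asks the GPU to fill a bit more than twice the pixels of 1600x1200, which is why a resolution bump alone can roughly halve frame rates in fill-limited scenarios.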

Insects crawl out from under the multi-GPU woodwork
While we're on the subject of expensive things that are nice to have, we should take a moment to note some of the problems you're buying into if you go this route. I can't tell you how many times Nvidia and ATI have extolled the virtues of running "extreme HD gaming" involving high resolutions, wide aspect ratios, and multiple GPUs in the past couple of years. Too many to count. Yet when we took the plunge and went head-on into this world, we ran into unexpectedly frequent problems.

Of course, you're probably already aware that multi-GPU support in games is spotty, because it typically requires a profile in the video card driver or special allowances in the game code itself. On top of that, not all games scale well with multiple GPUs, because performance scaling depends on whether the game is compatible with one of the various possible load-balancing methods. Nvidia and ATI have worked to encourage game developers to make their applications compatible with the best methods, such as alternate-frame rendering, but top new games still may not work well with SLI or CrossFire. This drawback is largely accepted as a given for multi-GPU configs today.

However, we ran into a number of other problems with wide-aspect multi-GPU gaming during our testing with SLI and CrossFire, including the following:

  • Nvidia's control panel has settings to disable the scaling up of lower resolutions to fit the 3007WFP's full res, but this option doesn't "stick" on the latest GeForce 8800 drivers. Getting a native 1:1 display at lower resolutions on this monitor currently isn't possible on the GeForce 8800 as a result. Nvidia confirmed for us that this is an unresolved bug. They are planning a fix, but it's not available yet. This one isn't an issue on the GeForce 7 or with the corresponding display settings in ATI's Catalyst Control Center for the Radeon X1950.

  • Nvidia's G80 drivers also offer no option for 1680x1050 display resolutions, either for the Windows desktop or for 3D games—another confirmed bug. I'm unsure whether this problem affects monitors with native 1680x1050 resolutions like the Dell UltraSharp 2007WFP, but it's a jarring omission, nonetheless.

  • When picking various antialiasing modes via the Nvidia control panel on the GeForce 8800 with SLI, we found that Quake 4 crashed repeatedly. We had to reboot between mode changes in order to avoid crashes.

  • With Nvidia's latest official GeForce 7-series drivers, 93.71, we found that quad SLI's best feature, the SLI 8X AA mode, did not work. The system appeared to be doing 2X antialiasing instead. We had to revert to the ForceWare 91.47 drivers to test quad SLI 8X AA.

  • At higher resolutions with 4X AA and 16X aniso filtering enabled, Radeon X1950 CrossFire doesn't work properly in Oblivion. The screen appears washed out, and antialiasing is not applied. ATI confirmed this is a bug in the Catalyst 7.1 drivers. We tried dropping back to Catalyst 6.12 and 6.11, with similar results, and ATI then confirmed that this bug has been around for a while. We had to go back to Cat 6.10 in order to test in Oblivion.

  • This isn't entirely the fault of the graphics chip makers, but in-game support for wide-aspect resolutions is still spotty. For instance, the PC version of Rainbow Six: Vegas lacks built-in wide-aspect resolution options, despite the fact that it's a port from the HDTV-enabled Xbox 360. That may be one symptom of a larger problem, that R6: Vegas is a half-assed PC port, but the game comes with an ATI logo on the box. Similarly, Battlefield 2142 has no widescreen support, and ships with an Nvidia "The way it's meant to be played" logo on its box. Apparently, the extreme HD gaming hype hasn't yet seeped into these firms' developer relations programs.

  • It's still not possible to drive multiple monitors with SLI or CrossFire enabled. As I understand it, this is a software limitation.

  • Finally, Nvidia's first official Windows Vista driver release will not include SLI support.

I also think Nvidia has taken a pronounced step backward with its new control panel application for a number of reasons, including the fact that they've buried the option to turn on SLI's load-balancing indicators, but I will step down off of my soapbox now.

The point remains that support for widescreen gaming at very high resolutions with multiple GPUs doesn't appear to be a high priority for either Nvidia or ATI. Perhaps that's understandable given the impending debut of Windows Vista and DirectX 10, which will require new drivers for all of their GPUs. Still, those who venture into this territory can expect to encounter problems. These issues are probably more plentiful with the GeForce 8800 because it's still quite green. If you're planning on laying out over $1100 on a pair of graphics cards, you might be expecting a rock-solid user experience. Based on what I've seen, you should expect some growing pains instead.

Now that I've wrecked any desire you had to build an SLI gaming rig, let's repair that a little by looking at some performance numbers.