Caught in the CrossFire?
Of course, this scheme does impose some limitations on CrossFire configurations, not least of which is the need for a master card in order for the scheme to work. The master card has a high-density DMS-59 port onboard. An external, three-headed Y cable plugs into this high-density port on the master card and into the DVI output on the CrossFire slave card. The cable's third port is a DVI output port, for connection to a monitor (or to a DVI-to-VGA converter).
All of this works rather transparently once everything is connected properly, but it is a bit of a hassle to plug together. Also, when CrossFire is enabled, the slave card's secondary video output cannot be used to drive a display. Fortunately, CrossFire can be enabled and disabled via the Catalyst Control Center without rebooting, unlike SLI.
Our pre-production CrossFire master card had another annoying limitation. When connected to our massive Mitsubishi Diamond Plus 200 monitor, it would not display POST messages, BIOS menus, or the WinXP boot screen. ATI says this is an incompatibility between pre-production master cards and monitors with missing or incomplete EDID data, and they claim it will be resolved in production master cards. I hope that's true, because it was a mighty annoying problem that rendered almost useless a slightly older, but still very nice, monitor. (Ok, it's a hunk o' junk, but I still wish it worked.)
More onerous is a problem ATI can't easily resolve: CrossFire is limited to a peak resolution of 1600x1200 at a 60Hz refresh rate. CrossFire relies on the single-link DVI output of existing Radeon X800-family graphics cards, and that connection tops out at 1600x1200 at 60Hz. Now, most folks don't tend to play games at resolutions above 1600x1200, but for an uber-high-end dual-graphics platform, this limitation isn't easily ignored. We've already demonstrated in our past efforts at SLI benchmarking that current games often don't benefit from a dual-graphics performance boost at mere mortal resolutions. More importantly, owners of nice, big CRT monitors probably won't appreciate being confined to the flickery domain of 60Hz refresh rates at that peak 1600x1200 resolution, especially since NVIDIA's SLI doesn't share this limitation.
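The arithmetic behind that ceiling is straightforward. A rough sketch, assuming the standard VESA timing for 1600x1200 at 60Hz (a 2160x1250 total including blanking intervals) and single-link DVI's 165MHz TMDS clock limit:

```python
# Back-of-the-envelope pixel-clock math for 1600x1200 @ 60Hz
# over single-link DVI (max TMDS clock: 165 MHz).
H_TOTAL = 2160   # 1600 active pixels + horizontal blanking (VESA timing)
V_TOTAL = 1250   # 1200 active lines + vertical blanking (VESA timing)
REFRESH = 60     # Hz

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1e6
print(f"Required pixel clock: {pixel_clock_mhz:.0f} MHz")   # 162 MHz
print(f"Headroom under 165 MHz: {165 - pixel_clock_mhz:.0f} MHz")
```

At 162MHz for 60Hz, there's only about 3MHz of headroom left on the link, so pushing either the resolution or the refresh rate any higher simply doesn't fit.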
Some folks have speculated about the possibility that ATI might circumvent the refresh rate limitations of the existing Radeon X800 cards' DVI ports through a clever implementation that would interleave, say, 60Hz output from the slave card with 60Hz output from the master card, resulting in 120Hz effective output. This scheme could conceivably work with certain 3D graphics load-balancing schemes, like alternate frame rendering. However, such an implementation would require the FPGA compositing engine to have a large amount of embedded memory onboard (or some external memory) in order to hold a full frame of image data at 1600x1200, and it still wouldn't work in most rendering modes. ATI decided that an exotic scheme like this wasn't wholly workable or cost effective. It would have to live with the 1600x1200 at 60Hz limit, and its choice of components for the master cards, including the FPGA and other chips, was informed by this decision.
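The memory requirement ATI cites is easy to estimate. A quick sketch, assuming 32-bit color (4 bytes per pixel), of how much storage the compositing engine would need just to hold one complete frame at CrossFire's peak resolution:

```python
# Estimated buffer size for one full frame at 1600x1200,
# assuming 32-bit color (4 bytes per pixel).
WIDTH, HEIGHT = 1600, 1200
BYTES_PER_PIXEL = 4

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(f"One frame: {frame_bytes / 2**20:.1f} MiB")   # ~7.3 MiB
```

Roughly 7.3MB of embedded or external memory for a single buffered frame is far beyond what a modestly sized FPGA carries on-die, which helps explain why ATI judged the interleaving scheme uneconomical.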
When pressed about this limitation, ATI argues that higher resolutions simply won't matter to most gamers, but also says forthrightly that "future generations" of CrossFire will be capable of higher resolutions and refresh rates. With ATI's next-gen R520 GPU looming close on the horizon, one could infer that we may not have to wait long for these future versions of CrossFire.