MSI’s FX5700-VT2DR256 graphics card

Manufacturer MSI
Model FX5700-VT2DR256
Price (street) $192
Availability Now

FOR WHATEVER reason, graphics card manufacturers have been reluctant to offer consumer-level products with dual DVI outputs. Back in the day, Gainward had a pretty sweet GeForce4 Ti 4600 with dual DVI ports, but the company hasn’t made a dual DVI card since. Last August, Tyan announced a dual DVI Radeon 9600 Pro, but bailed on the product shortly thereafter due to “market forces.” More recently, XFX, Asus, and HIS have made some noise about dual DVI, but their cards don’t seem to be readily available in North America.

So what’s a dual DVI enthusiast to do? One could opt for something from Matrox, but they haven’t had a competitive 3D performer in quite some time. Workstation-class FireGL and Quadro products from ATI and NVIDIA generally sport dual DVI ports, but the price premium associated with high-end workstation gear makes FireGLs and Quadros hardly affordable. If only there were a better way.

There is. The MSI FX5700-VT2DR256 graphics card brings dual DVI outputs to NVIDIA’s mid-range GeForce FX 5700. And there’s more to the VT2DR256 than its digital output ports; the card also sports VIVO capabilities, a remote control unit, and an 11-CD software bundle. Best of all, the VT2DR256 is currently available from Newegg and at least one other North American retailer.

I’ve spent a lot of quality time with the VT2DR256 over the past couple of weeks not only testing the card’s 3D performance and dual DVI output ports, but also playing with the latest ForceWare graphics drivers from NVIDIA. Read on to see why the VT2DR256, combined with NVIDIA’s latest nView desktop management software, could be the ultimate consumer-level multimonitor graphics solution.

The specs
As always, let’s kick things off with a look at the FX5700-VT2DR256’s spec sheet.

Core clock 425MHz
Pixel pipelines 4
Peak pixel fill rate 1700 Mpixels/s
Texture units/pixel pipeline 1
Textures per clock 4
Peak texel fill rate 1700 Mtexels/s
Memory size 256MB
Memory clock 550MHz
Memory type DDR SDRAM
Memory bus width 128-bit
Peak memory bandwidth 8.8GB/s
Ports DVI (2), S-Video output, composite video input
Auxiliary power connector None
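For the curious, the table’s peak fill rate and bandwidth figures follow directly from the base clocks. Here’s a quick sketch of the arithmetic; the helper functions and their names are mine, not anything from MSI or NVIDIA:

```python
# Deriving the VT2DR256's peak throughput figures from its base specs.
# Illustrative helpers only -- not part of any vendor tool or API.

def peak_pixel_fill_mpix(core_mhz: int, pipelines: int) -> int:
    """Peak pixel fill rate in Mpixels/s: one pixel per pipeline per clock."""
    return core_mhz * pipelines

def peak_bandwidth_gbs(effective_mem_mhz: int, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective clock times bus width in bytes."""
    return effective_mem_mhz * (bus_width_bits // 8) / 1000

print(peak_pixel_fill_mpix(425, 4))   # 1700 (Mpixels/s)
print(peak_bandwidth_gbs(550, 128))   # 8.8 (GB/s)
```

With one texture unit per pipeline, the peak texel rate works out to the same 1700 Mtexels/s.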

The VT2DR256 is based on NVIDIA’s NV36 GPU, which you can read more about here. Graphics cards based on NV36 have been around for over six months now, so the GPU is nothing new. However, MSI brings a few extra goodies to the VT2DR256 in the form of dual DVI outputs and a composite video input port.

The VT2DR256 is built on a bright red board that’s sure to stand out through a case window, but the card is otherwise inconspicuous. Well, until you notice the dual DVI ports.

For those who don’t have DVI-equipped flat panel displays, MSI ships the VT2DR256 with a DVI-to-VGA adapter. I suppose one could criticize MSI for not including two DVI-to-VGA adapters (one for each DVI output), but that seems a little petty considering that MSI offers a number of other GeForce FX 5700-based products with analog outputs.

In addition to dual DVI outputs, the VT2DR256 also has VIVO capabilities courtesy of Philips’ SAA7115HL video decoder. The decoder is capable of detecting and decoding NTSC, PAL, and SECAM streams, but without an integrated TV tuner, the VT2DR256’s multimedia capabilities fall a little short of All-in-Wonder and Personal Cinema products.

On the memory front, the VT2DR256 uses K4D551638D-TC36 DDR chips from Samsung. These TSOP chips are rated for operation at speeds up to 275MHz (an effective 550MHz given DDR’s double data rate), which nicely matches the GeForce FX 5700’s memory spec.

Unlike most mid-range graphics cards, the VT2DR256 features 256MB of memory rather than 128MB. Given that even high-end graphics cards have a hard time taking advantage of 256MB of memory, it’s unlikely an extra 128MB will have much of an impact on performance with a lowly FX 5700. Today’s applications don’t really target cards with 256MB of graphics memory, and the FX 5700 doesn’t have the horsepower to handle antialiasing and anisotropic filtering at high enough resolutions to make use of the extra memory.

The card also comes with an IR remote for use with MSI’s own Media Center Deluxe II software.

Unfortunately, my VT2DR256 sample lacked the IR dongle necessary for the remote control to work, so I can’t comment on its usefulness. However, the remote is only meant to work with MSI’s Media Center software, so it’s a far cry from ATI’s more versatile Remote Wonder. Were the VT2DR256 equipped with a TV tuner or more clearly targeted at home theater PCs, the Media Center-specific remote might make more sense.

MSI’s Media Center software comes on one of a whopping 11 CDs bundled with the VT2DR256. With a claimed value of $1,200US, the software bundle is undeniably plentiful, but I can’t find a killer app among the 11 CDs. The software bundle includes InterVideo’s WinDVD and WinCinema, FarStone VirtualDrive+ and RestoreIt!, MSI’s 3D desktop utility, The Elder Scrolls III: Morrowind, Ghost Recon, Duke Nukem: Manhattan Project, and a 7-in-1 game CD filled with demos for ancient titles like Serious Sam SE and Oni. There’s a lot there, but most of it’s filler.

With NVIDIA pushing Call of Duty and ATI bundling Half-Life 2 coupons, the VT2DR256’s game bundle looks dated at best. The fact that MSI was bundling Battlefield 1942, Command & Conquer Generals, and Unreal II with select GeForce FX graphics cards last summer makes Ghost Recon and Morrowind look even more antiquated.


ForceWare’s new features
Although the VT2DR256’s bundle isn’t full of must-have titles and utilities, the card gets a big software boost from NVIDIA’s latest ForceWare driver package. The ForceWare 56.64s don’t promise a performance revolution, but they do add a number of useful features that give the VT2DR256 a little extra polish.

For gamers, the ForceWare drivers offer a number of different image quality presets to make managing antialiasing and anisotropic filtering levels a breeze. Users can take advantage of pre-programmed image quality levels for a number of different games, or roll their own. The drivers can handle multiple profiles for each game to allow users to crank up the eye candy for a more visually stunning single-player experience, but also ensure buttah-smooth frame rates for competitive multiplayer gaming. Currently, the user must activate a desired profile before launching a game. However, NVIDIA plans to have future driver revisions launch image quality profiles automatically when games are loaded.

Image quality profiles are really only useful for 3D applications, but the ForceWare drivers also offer plenty of neat little tools for desktop applications with the nView desktop manager.

nView starts out with a number of helpful wizards that make configuring multiple monitors and desktop preferences a snap. Those looking to dig deeper can browse through one of nView’s tabbed screens to control properties for multiple desktops, window transparency effects, hot keys, zooming, and even mouse kinematics.

The mouse kinematics are a personal favorite of mine because they allow you to “throw” an application window from one screen to another in a multimonitor environment by simply flicking the mouse. Users can also associate actions with a number of different mouse gestures and even control the sensitivity of those gestures to match their reflexes.

In the eye candy department, nView lets users control the transparency of the Windows task bar. Users can also enable transparency for window dragging, though there’s no way to dictate transparency levels for stationary windows.

Transparency effects also make an appearance in nView’s pop-up blocker, which can briefly preview and then fade out pop-up windows. Users can also choose to completely block pop-ups or only allow certain sites to launch new windows.

At first glance, it might seem a little odd that NVIDIA has integrated a pop-up blocker into its graphics driver. However, since nView is all about desktop and window management, it makes sense to block pop-ups that could otherwise pollute a finely tuned desktop environment.

Speaking of which, nView gridlines let users segment desktop space into distinct regions to help organize multiple windows.

Drawing out gridlines to define screen regions is a simple click-and-drag affair, and once the grid is in place, users can maximize application windows within distinct screen regions to avoid window overlap.

Tabbed browsing may be a better bet for navigating multiple web sites, but gridlines are perfect for reserving screen real estate for instant messaging apps, IRC windows, and Winamp playlists. Gridlines also work with multiple monitors, though individual gridlines may not span multiple screens. Overall, that’s not a crippling limitation, especially since neither ATI nor Matrox offers anything even remotely like gridlines.

As much as I love all the features and functionality that NVIDIA’s ForceWare drivers and nView desktop management software bring to the table, I have one nit to pick: nView doesn’t offer a “smart” taskbar. Realtime Soft’s UltraMon software has a nifty little feature that allows each screen in a multimonitor environment to have its own “smart” Windows taskbar. Each screen’s taskbar shows tabs for only those windows that are currently open on that screen, which makes organizing multiple applications on multiple screens much easier. For me, the “smart” taskbar is a must-have feature for any multimonitor Windows environment, and I’d like to see similar functionality integrated into nView.


Our testing methods
Because the VT2DR256 is a dual DVI card, I would be remiss not to test it against something from the Matrox camp. Price-wise, the Millennium P750 is a more appropriate competitor. However, the P750’s anemic 3D horsepower is really no match for even low-end graphics cards, so I’m invoking a mercy rule and using a Parhelia instead. Not that it will make much difference.

Since ATI partner HIS is apparently making dual DVI Radeon 9600 Pro cards, I’ve included a 9600 Pro as well. It wouldn’t be much of a party without at least some representation from the red corner.

All tests were run three times, and their results were averaged, using the following test systems.

Processor Athlon 64 3200+ 2.0GHz
Front-side bus HT 16-bit/800MHz downstream
HT 16-bit/800MHz upstream
Motherboard Abit KV8-MAX3
North bridge VIA K8T800
South bridge VIA VT8237
Chipset driver Hyperion 4.51
Memory size 512MB (1 DIMM)
Memory type Corsair XMS3500 PC3000 DDR SDRAM
Graphics Radeon 9600 Pro 128MB | FX5700-VT2DR256 256MB | Parhelia 128MB
Graphics driver CATALYST 4.3 | ForceWare 56.64 | Matrox 105.01.008

Maxtor 740X-6L 40GB 7200RPM ATA/133 hard drive

Operating System Windows XP Professional
Service Pack 1 and DirectX 9.0b

I tested all cards with antialiasing and anisotropic filtering disabled, and the ATI and NVIDIA cards with 4X antialiasing and 8X aniso. Since Parhelia isn’t capable of 8X aniso, I’ve left it out of our AA/AF tests.

We used the following versions of our test applications:

The test systems’ Windows desktop was set at 1024×768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests. All of the 3D gaming tests used the high detail image quality settings.

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.


The VT2DR256 does well in Quake III Arena, just trailing the Radeon 9600 Pro. Both cards are well ahead of the Parhelia.

Parhelia doesn’t fare quite as poorly in Unreal Tournament 2003, where the VT2DR256 loses a little ground to the 9600 Pro. However, the VT2DR256 does manage to sneak ahead of the 9600 Pro at 1600×1200 with 4X antialiasing and 8X aniso.

The VT2DR256 jumps out to the lead in Wolfenstein: Enemy Territory. Here, the Radeon 9600 Pro can’t seem to handle 4X antialiasing and 8X anisotropic filtering at 1600×1200.


With antialiasing and anisotropic filtering disabled, the VT2DR256 just trails the 9600 Pro in AquaMark3. However, at 4X/8X, the 9600 Pro widens the performance gap considerably.

Parhelia is surprisingly competitive in Halo, but likely only because it can’t run the game’s pixel shader 2.0-powered effects. Though technically the slowest of the lot, the VT2DR256 is still right in the thick of things.

The VT2DR256 is slower than the 9600 Pro in Splinter Cell, just missing 30 frames per second at 1280×1024. Here’s a look at how the cards perform across the length of the Splinter Cell demo:


The VT2DR256 comes out ahead in five of viewperf’s six tests, but gets blown away by the Radeon 9600 Pro in ugs-03. As we’ve seen throughout testing, Parhelia is far from competitive.


In testing, I was able to get the VT2DR256 stable with core and memory clock speeds of 483 and 624MHz, respectively. Overall, that’s nearly a 14% boost.
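The “nearly 14%” figure is easy to verify from the stock and overclocked speeds; a quick sketch (the helper is hypothetical, just the percentage arithmetic):

```python
# Verifying the overclock headroom quoted above. Illustrative helper only.

def oc_gain_pct(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage gain of an overclocked speed over the stock speed."""
    return (oc_mhz / stock_mhz - 1) * 100

print(round(oc_gain_pct(425, 483), 1))  # core:   13.6 (%)
print(round(oc_gain_pct(550, 624), 1))  # memory: 13.5 (%)
```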

Of course, just because I was able to get my VT2DR256 sample stable and artifact-free at 483/624 doesn’t mean that all cards will reach those speeds; some may overclock higher, and others not at all.

The clock speed boosts yield better performance in AquaMark3, but not even the overclocked VT2DR256 can catch the Radeon 9600 Pro with antialiasing and aniso enabled.


If the VT2DR256 were a run-of-the-mill GeForce FX 5700 with VIVO capabilities, I’d call its $192 price tag outrageous. However, the card’s dual DVI outputs put it in a different class altogether. For starters, ATI’s cheapest offering with dual DVI is the FireGL Z1, which sells for almost $400. In the NVIDIA camp, the dual DVI Quadro NVS 280 looks like a steal at $134, but it’s based on the GeForce4 MX core and has only 64MB of memory. To get dual DVI with DirectX 9 support, we’d need to upgrade to the Quadro FX 1100, which runs over $600. And let’s not forget Matrox, whose Parhelia sells for $321. Matrox also has a dual DVI Millennium P750 that can be purchased for as little as $212, but with half the pixel-pushing horsepower of the already pokey Parhelia, don’t expect much in the way of 3D performance.

In the end, the VT2DR256 actually looks pretty affordable when compared with its dual-DVI competition, especially when you consider the card’s competent performance in our 3D performance tests. However, the VT2DR256’s dual DVI outputs wouldn’t be nearly as impressive without NVIDIA’s excellent nView software, which is by far the most thoughtful and innovative set of multimonitor and desktop management tools I’ve ever seen. Features like gridlines and mouse kinematics are a joy to use, and you won’t find them in ATI’s HydraVision or Matrox’s PowerDesk software.

I’m not sure why it took so long for MSI to jump onto the dual DVI train, and I can only hope that others are able to deliver competitive dual-DVI solutions to market. Until then, the VT2DR256 is easily the most compelling consumer-level graphics card with a pair of DVI ports. 

Comments closed
    • LauFu
    • 16 years ago

    So is the April Fools joke here that there is no real news?

    (Having tried and subsequently failed to run a similarly themed site, I normally wouldn’t comment, but you’d think it was a Sunday Today. That and this is the first day I’ve had to spend any time reading the site in a week.)

    • AmishRakeFight
    • 16 years ago

    Can I really get free smileys?? oh wait, that’s not part of the review.

      • Pete
      • 16 years ago

      Wow, that dual DVI XFX 5700U is a pretty decent choice for the money.

    • derFunkenstein
    • 16 years ago

    What about this HIS Radeon 9600?

    Sure, it's not as fast as a 9600 pro or anything, but it's still there...and it's cheaper.

    • liquidsquid
    • 16 years ago

    Hmm, this is good news for me getting dual flatscreens at work! We’ll see…

    • indeego
    • 16 years ago

    Finally, dual DVI. They heard the TR ranters. -fpking

      • Pete
      • 16 years ago

      I’ve read that newer LCDs look almost as good with analog input as with digital. Is this true even for 12×10 17″-19″ LCDs? That would make a ~$190 5900XT a far better purchase.

        • Dissonance
        • 16 years ago

        In my experience, that’s not true at all. Analog doesn’t look horrible, but DVI produces a much crisper image.

          • MagerValp
          • 16 years ago

          Analog tends to have lots of ghosting on vertical lines, especially in 1280 and up. DVI is the way to go with LCD monitors.

          • JustAnEngineer
          • 16 years ago

          I agree with Geoff. DVI looks better on both the Dell 2000FP and 2001FP here.

            • indeego
            • 16 years ago

            Which is odd because a 22″ flat CRT looks better than *most* LCD’s and it’s analog.

            • highlandr
            • 16 years ago

            Part of that is the design. CRTs use an old analog design, progressively shooting the picture onto the phosphor. LCDs use a digital design, turning on or off individual colored pixels. It is extremely rare (if not nonexistent) to see digital in on a normal CRT (HDTV doesn’t count), because it would need the extra circuitry to convert the signal to analog. Similarly, LCDs are poor at analog because the video card goes DAC (digital to analog) then the LCD goes ADC (analog back to digital). The extra conversions introduce garbage into the signal, which reduces the quality.

            Anybody feel lerned from that? I don know half that stuff wer in m’ hed!

            • Hattig
            • 16 years ago

            Depends on the user to be honest.

            I hate CRTs because they suck at straight lines – there’s always a curve somewhere however much you adjust the screen (and I had a reasonably good Iiyama monitor) that just pissed me off. TFTs do perfectly straight lines.

            It probably won’t be long until TFTs match or exceed CRTs in all aspects – brightness, response time, etc. Give it 3 or 4 years … they win on picture stability, size, power consumption, coolness factor, subpixel antialiasing and more at the moment. I don’t get eyeache either when looking at the screen for too long either, which is a big bonus (yes, that was at 85Hz).

            I’ve given up on OLED for home computer displays within the next 8 years now. PDAs, Phones, etc – yeah, within the next 3 years they will be very common.

            Anyway, as long as this 19″ TFT holds out, it will be what I am using until 24″ 1920×1080 widescreen displays reach a nice price point and picture quality.

            • indeego
            • 16 years ago

            I can’t stand LCD resolution limitations (which I find severe and very limiting as I run different things at about 3 different resolutions), shadowing in games/movies, dead pixels. I find it one step forward and a few steps back. I never get headaches via CRT’s, they cost about 33% less for even larger displays, and desk real estate isn’t a concern.

            • JustAnEngineer
            • 16 years ago

            I am not going back to analog.

            The Dell UltraSharp 2000FP (1600×1200 @60Hz 20.1″ LCD) side-by-side with my Viewsonic PF-815 (1600×1200 @85Hz, 22″ perfect flat Diamondtron) absolutely put the CRT to shame. The LCD was significantly brighter and sharper than the CRT. This was with a Gigabyte Radeon 9700 Pro, a Sapphire Radeon 9800 and a 3dfx Voodoo5-5500. With the crappy output filters on my old RIVA TNT2 Ultra, the difference was even more noticeable.

          • PerfectCr
          • 16 years ago

          Analog on LCD is noticeably worse than DVI for sure.

          • Pete
          • 16 years ago

          K, thanks. I’m testing a Sony SDM-X53, and it manages to make even the blurry mess coming out of my Asus A7N266-VM look good (I suppose the 60Hz refresh helps). I did notice less distinguished lines via analog, though. Still, I’m pleasantly surprised by the distinct lack of ghost trailing I can detect, although my low, low framerates from my single-channel IGP is probably helping the LCD in this case.

      • lemonhead
      • 16 years ago

      After seeing people still remembering me, I have come out of the shadows to post once more. Hail to the king! It is my quest to bring back the gerbils!
