BACK WHEN THE GeForce4 was first announced, we nicked NVIDIA pretty hard for introducing the GeForce4 MX 460 aimed at a price range between $149 and $199. The GF4 MX is just a pumped-up GeForce2, and it simply doesn't belong in the same price range as GeForce3 and Radeon 8500 cards. Our previous favorite card, the GeForce3 Ti 200, was apparently being replaced by inferior technology.
ATI saw the GF4 MX 460 coming and took the opportunity to introduce the Radeon 8500LE 128MB in the same price range. With real pixel and vertex shaders, the Radeon 8500LE is a steal. Third-party card makers have introduced cards, like Hercules' 8500LE, that sell for as little as 140 bucks online. For that price, we have been inclined to recommend the 8500LE as the best graphics value for the money.
Thank goodness for competition, because now NVIDIA appears to be changing course. The GeForce4 MX 460 is AWOL, and the GeForce4 Ti 4200 is stepping in to take its place. The GeForce4 Ti 4200 is, as the name suggests, a GeForce4 Titanium card with lower clock speeds for the GPU and memory. For those not looking to drop over 300 bucks on a video card, the Ti 4200 may be just the ticket.
We've got the 64MB variety of the Ti 4200 here in Damage Labs, and we've run a full set of benchmarks to compare it against nine of its closest competitors, including ATI's Radeon 8500LE. Read on to see how it matches up against the competition.
GF4 Ti 4200 cards will come with either 64MB or 128MB of memory, and they'll be priced at about $179 and $199, respectively. Our review unit is the 64MB version, and the most surprising thing about it is how normal it looks. Have a look:
The 64MB Ti 4200 looks pretty much like a GeForce2 or GeForce3 card: much, much smaller than a GF4 Ti 4600. For comparison's sake, check this out:
These cards ought to fit easily into motherboards like the Epox EP-8KHA+ and the ECS K7S5A, which haven't followed the rules of the AGP spec and have trouble accommodating longer cards like the GF4 Ti 4600 reference card. And I'd wager these smaller cards are cheaper to make, too.
One area where NVIDIA hasn't skimped is nView. Dual monitors worked fine with a DVI-to-VGA converter plugged into the reference card's DVI output port. I was even able to run multiple, independent desktop resolutions in Windows XP.