Historical graphics review compares 64 cards

Benchmarking graphics cards is a time-consuming affair. A lot of games require manual testing with Fraps, and since driver updates make historical data obsolete, cards usually have to be re-tested for each review. That’s why you usually only see relatively recent offerings compared.

Of course, if you have an inordinate amount of time on your hands—not to mention dedication—you can do what Hardware.info did: gather 64 different cards dating back to the Radeon HD 3000 and GeForce 8000 series, benchmark every last one of them, and gather the numbers in a single data set.

If you’ve ever wanted to know how your old GeForce 9800 GT compares to the new 7770, or how much faster Nvidia’s GeForce GTX 660 is than the old Radeon HD 3870, now’s your chance. Games tested include Battlefield 3, Dirt: Showdown, and Spec Ops: The Line, and you can view the data either in one big graph or in smaller charts that compare different GPU generations from the same vendor.

It’s too bad Hardware.info didn’t go inside the second like we do on TR. Still, the work is impressive—and pretty handy for folks who don’t upgrade very often. Seeing all of these successive leaps in performance also makes me a little sad that seven-year-old consoles are still holding game developers back. If only today’s games were designed with just the PC in mind…

Comments closed
    • shaurz
    • 7 years ago

    Nice to see how my last 3 cards compare (4850, 6850, 7870). Each new card is roughly twice as fast as the previous card I had.

    • Bensam123
    • 7 years ago

    Interesting results, especially the end where they compare high and low ends of both companies, you can see the performance jumps between generations.

    Nvidia had largely linear strides of about 30% performance increase, where AMD makes some huge jumps and then none at all (like 5xxx to 6xxx).

    Some of these results you can’t even compare anymore, though. Using really old cards on modern games is kinda ridiculous (although some people do it). They were designed for a completely different set of performance standards, so a really old card that barely supports DX10 won’t necessarily play DX10 or modern games well, given how the game industry has evolved (or not evolved, because of consolization). This definitely provides insight for people clinging to old cards and playing today’s games, but it doesn’t give you a baseline for how games from each card’s own era performed (like how good a 3870 was at the time it was released).

    The only way to accomplish that would be by benchmarking games from those time periods. They could also do some really interesting things with these results, like aggregate all of them and show the % increase of performance between generations.
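The aggregation Bensam123 suggests (average each generation's results, then report the generation-over-generation gain) could be sketched like this. The FPS figures below are invented for illustration, not Hardware.info's data:

```python
# Sketch of the suggested aggregation: average each generation's scores,
# then report the percent increase from one generation to the next.
# FPS numbers are made up; the 5000 -> 6000 plateau is just to mirror
# the pattern Bensam123 describes.

gen_scores = {
    "HD 3000": [18, 22, 25],
    "HD 4000": [30, 36, 41],
    "HD 5000": [48, 55, 62],
    "HD 6000": [50, 57, 63],
    "HD 7000": [70, 82, 95],
}

def percent_increases(scores):
    """Return (gen, next_gen, pct_gain) for each consecutive pair."""
    names = list(scores)
    avgs = [sum(v) / len(v) for v in scores.values()]
    return [
        (names[i], names[i + 1], 100 * (avgs[i + 1] - avgs[i]) / avgs[i])
        for i in range(len(names) - 1)
    ]

for a, b, pct in percent_increases(gen_scores):
    print(f"{a} -> {b}: {pct:+.1f}%")
```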

    • RickyTick
    • 7 years ago

    On one hand:

    [quote<]"Seeing all of these successive leaps in performance also makes me a little sad that seven-year-old consoles are still holding game developers back. If only today's games were designed with just the PC in mind..."[/quote<]

    On the other hand, in [i<]Here's what kind of PC you'll need for Crysis 3[/i<]:

    [quote<]Yikes. We're talking about something like the Editor's Choice build in our System Guide, and even that has a slower graphics card than what Crytek is suggesting.[/quote<]

    [quote<]I hope lower-end configs can enjoy all that eye candy, even if they have to live without uber-high texture detail or insane resolutions[/quote<]

    Are we complaining about consoles holding things back...or are we complaining about Crytek demanding too much out of our PC?

      • A_Pickle
      • 7 years ago

      Yeah. Actually, one thing I like about consoles is how developers manage to eke better graphics out of them by skillfully using system resources. On the PC, developers seem to just demand faster graphics hardware and give us shitty console ports with mediocre graphics and crushing system requirements.

    • Wirko
    • 7 years ago

    An extensive review indeed, but they completely forgot IGPs. They could at least have included the latest generation.

    • swaaye
    • 7 years ago

    Looks like the 8800 GTX is still capable enough. It’s, what, almost 6 years old now? I don’t think too many people were gaming on their GeForce 2 in 2006.

      • rrr
      • 7 years ago

      Except it cost about as much as a 670 or 680 does now, so it had better have lasted some time.

        • swaaye
        • 7 years ago

        It wasn’t really an exception on pricing. It certainly was a vastly better value than the various ~$500-550 special edition X800/X1800/X1900 and GF 6/7 cards.

    • jdaven
    • 7 years ago

    Looking at BF3, the 8800 GTX is about one-sixth the speed of the 7970 GHz Edition. The 8800 GTX was released almost 6 years ago. I think that’s pretty good improvement.
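jdaven's figure can be turned into an annualized rate of improvement with a quick back-of-the-envelope calculation (assuming a clean 6x speedup over exactly 6 years):

```python
# Convert "6x faster after ~6 years" into a compound annual rate:
# solve (1 + r)^years = speedup for r.
speedup = 6.0
years = 6.0
annual_rate = speedup ** (1 / years) - 1
print(f"{annual_rate:.1%} per year")  # roughly 35% per year
```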

      • sweatshopking
      • 7 years ago

      i have a 4890 and was wondering how it would compare to a new card, and whether it was worth upgrading. a 7970 (not in my budget by any stretch) is 4x faster. that shows real progress, and maybe i’ll pick up an 8k or 7xx series card (if i somehow drug my wife and sleep with my bill collectors)

        • Beelzebubba9
        • 7 years ago

        I went from a 4890 to a GeForce 660 Ti and the biggest difference was….

        Noise. Sure, games run better, but the 4890 was pretty capable at the 1920×1200 resolutions I play at, so mostly I bought into the wonderful silence of my MSI TwinFrozr 660 Ti.

          • sweatshopking
          • 7 years ago

          good to know. i’ll sit it out for a while!

    • Game_boy
    • 7 years ago

    The Wii U is a one month old console and still holding developers back (its CPU is weaker than a 360) yet it costs more than the Wii did at launch. Maybe there’s something fundamental that’s stopping consoles from advancing, since 7 years didn’t improve it?

      • superjawes
      • 7 years ago

      Consoles are clearly moving away from having strong or relatively strong hardware, and are instead opting for other technologies for the gaming experience. Nintendo was wildly successful with this approach on the Wii, and the Wii U’s tablet is a similar step in that direction. Microsoft and Sony opted for Kinect and Move to add functionality.

      So yeah…consoles are probably going to hold developers back on the CPU/GPU side of things, but there will still be opportunities for innovative development.

        • MadManOriginal
        • 7 years ago

        Ooor it could just be that Nintendo hasn’t been concerned with being ‘high performance’ for a long time now.

          • superjawes
          • 7 years ago

          While I agree with you, I don’t think it has much to do with the CPU/GPU side of things. I think the reduced need for “high performance” hardware is a direct result of changing [i<]how[/i<] games are played.

          Now, [speculation] I think that seeing Microsoft and Sony follow in Nintendo's footsteps will mean that the XB360 and PS3 successors will have minimal hardware increases, instead favoring Kinect and Move updates and successors.[/speculation]

          I think this also has something to do with a set target resolution (1080p) and console gamers caring less about continued graphical improvement in favor of reaching "good enough" detail to enjoy a game. Sure, there are things that CAN be done to improve console output, but the added cost would probably not be worth it to most console gamers.

        • swaaye
        • 7 years ago

        The vibe I get from little developer hints is Xbox 3 isn’t going to be this way. PS4 is a mystery though. Nintendo is doing what they’ve always done, with Gamecube maybe being a slight exception for the time.

      • albundy
      • 7 years ago

      milking reinvented by nintendo. they keep regurgitating the same games that i’ve played since i was a kid. i have no idea why talent just stopped after the early 90’s.

      • Geistbar
      • 7 years ago

      I would think that memory is what holds developers back the most, particularly for anything ported to PC. On that front the Wii U is a solid improvement with 2 GB available, though it looks like only 1 GB is accessible to games.

      I wouldn’t worry too much about what the Wii U says for hardware anyway. I doubt Sony / MS will move into the “better than anything you could get in a PC at release” that the old days had, but Nintendo seems happy to strike a midpoint of capability between rivals’ hardware generations.

    • willmore
    • 7 years ago

    Bit of a gap in the current AMD lineup: there’s no Pitcairn in the ‘generations’ section. Being the owner of one, this is an important omission to me. The charts do make it clear that going from my GF9800GTX+ to an HD7850 was good for a 3x increase in performance. Not surprising that I’m CPU-bound now. *sigh*

    • derFunkenstein
    • 7 years ago

    ZOMG that website barely renders in both IE and Chrome. Text is cut off, the white backgrounds don’t go down the screen, etc. etc. The only way I’ve been able to read it is to bring up a page in IE9 on Windows 7, let it load incorrectly, and then hit refresh to re-load the page. Compatibility view didn’t help.

    Seriously, guys, make a site that works before you do cool stuff.

      • MadManOriginal
      • 7 years ago

      It looks fine to me. IE9 on Win 7, no reloading.

        • derFunkenstein
        • 7 years ago

        Must have been a temporary site burp or maybe an issue on my side; it seems OK now.

          • RickyTick
          • 7 years ago

          Nope, it just happened to me in Chrome.
          Looked ok after a page refresh.

            • derFunkenstein
            • 7 years ago

            well at least I’m not totally crazy. Thanks for saying so. 😀

      • colinstu12
      • 7 years ago

      Looks fine in Chrome.

      • no51
      • 7 years ago

      Works fine in Opera.

        • sweatshopking
        • 7 years ago

        opera?! wp?! we should hug!

          • no51
          • 7 years ago

          Other people might get jealous…

            • sweatshopking
            • 7 years ago

            let them!

            • Prion
            • 7 years ago

            :hug:

      • willmore
      • 7 years ago

      Perfect on FF, too. It loads slowly; maybe you’re seeing a timeout loading one of the important CSS files? A reload would fix that.

      • BoBzeBuilder
      • 7 years ago

      Looks fine in Safari too.

      • Deanjo
      • 7 years ago

      Looks fine in Lynx. 😛

      • A_Pickle
      • 7 years ago

      Looks fine in Firefox…

      • Grigory
      • 7 years ago

      Looks fine in IE9 here.

      • rrr
      • 7 years ago

      Lesson: use Firefox and not some half-assed botnets.

    • Hattig
    • 7 years ago

    Brilliant stuff, have been looking for something like this for a while, to truly put old cards in perspective.

    Shame they only go back in time to middle-aged cards though, I guess pensionable cards can’t run most of the benchmarks, or the testing experience was tear-inducingly painful!

      • mako
      • 7 years ago

      It would make my day if they could throw a Voodoo3 or a TNT2 in the mix.

        • albundy
        • 7 years ago

        they wouldn’t be able to. the games they tested required a minimum of directx 8 or 9 to play.

    • Geistbar
    • 7 years ago

    A historical comparison that doesn’t include the 9700 Pro makes me sad. That was my first great graphics card. As far as I know it still works, but the hard drive in that computer died a while ago and I’m too lazy to see if I can cobble something together. Of course that’s a decade-old card on a now thoroughly abandoned format, so I can’t blame them.

      • derFunkenstein
      • 7 years ago

      I see why they went back to the DX10 era: a lot of older cards either can’t play some of today’s most popular games even on low settings, or flat-out won’t run them at all (often not because of anything other than the GPU maker’s reluctance to support them in current drivers). At some point you have to cut your losses, and the best place to stop is where cards are no longer supported by current drivers.

      Oh, the whole AGP-only thing hurts the 9000-series’ chances of being re-tested in a SB-E system. 😉

        • MadManOriginal
        • 7 years ago

        It does make sense; it looks like they went back to the first worthwhile unified shader architectures (HD 2000 was a poor competitor to Nvidia’s 8000 series). It’s all pretty new games, though, which I don’t think does much for people with older cards that aren’t going to run them well anyway. And it still doesn’t answer the question I always have: how do modern IGPs stack up in older games against either older cards or new low-end cards? We can get an idea from newer games, but those usually have to be run at very low settings. If they had included an HD 4000 chip like the i3-3225 and an A10-5800K or A10-5700, that would have been awesome.

        • Geistbar
        • 7 years ago

        Yeah, as I said I don’t blame them for not including it, but I would have liked to see how far things have progressed since my first proper graphics card. I did note the AGP problem though — “on a now thoroughly outdated format”.

        • just brew it!
        • 7 years ago

        [quote<]Oh, the whole AGP-only thing hurts the 9000-series' chances of being re-tested in a SB-E system. ;)[/quote<] An X300 or X600 would've probably been a reasonable stand-in though. Same basic architecture as the 9000 series, but on a PCIe bus...

          • derFunkenstein
          • 7 years ago

          True, an X600 was a Radeon 9600 with PCI-e.

      • Sargent Duck
      • 7 years ago

      The 9700 Pro was iconic, but a line has to be drawn somewhere. I guess DX10 is as good a place as any.

      What I’d be interested in is how far coolers have come since the GeForce FX 5800 Ultra.

        • Airmantharp
        • 7 years ago

        I remember being able to hear that thing over the noise in a Quakecon conference room (one where Kyle Bennett was speaking…).
