My GPU history:
Model     Paid   Date
X800 Pro  $250   7/2005
8800 GTS  $310   8/2007
HD 4870   $270   8/2008
HD 7950   $400   6/2012
I think performance per dollar is important. Maybe not for judging whether an old card is viable in a modern build, but for upgrade cost-effectiveness. My 4870 lasted me 4 years and is still on par with the lesser modern cards (HD 7750, GTX 650). Spending $270 for 4+ years of PC gaming is pretty good (my nephew is running the 4870 now). As far as I can tell, the 7950 should carry me quite a ways too; it's on the same tier as the current R9 270/280 and GTX 670/760 cards. I'd like to see where the 7950 and the latest cards match up on the scatter plots. I blame TR for making them so damn useful.
Maybe performance per dollar per game would be a decent way of getting all of TR's data into "master plots": if a card was tested in Far Cry 3, put it on a Far Cry 3 scatter plot. There's got to be some game overlap in the collected data.
Just so my earlier point is clear: performance per dollar (P/$) is important, but the metric has a shelf life. Once a card is off the market, comparing its performance per dollar does no good because the prices are stale or nonexistent. If you compared the 4870's P/$ (from 2008) to the 7950's (from 2012), the 4870 would look awful. To maintain an accurate P/$ scatter plot with these older cards, you would need to estimate not only the performance but the price as well. What's a fair price for a 2008 card in new condition?
It would be worth estimating and comparing the performance of both, since there are plenty of people who could be upgrading, but few people are going to buy a NEW 4870 in 2013, and virtually no one will buy one at its 2008 price.
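For anyone who wants to poke at the math, here's a quick sketch of the P/$ calculation. The prices are the launch prices from my table above, but the performance index numbers are made-up placeholders (not real benchmark results), just to show the shape of the comparison:

```python
def perf_per_dollar(perf_index, price):
    """Performance-per-dollar: higher is better."""
    return perf_index / price

# Launch prices from the table above; performance indices are
# HYPOTHETICAL placeholders, not actual benchmark data.
cards = {
    "HD 4870": {"price": 270, "perf": 100},   # 2008 launch price
    "HD 7950": {"price": 400, "perf": 250},   # 2012 launch price
}

for name, c in cards.items():
    ratio = perf_per_dollar(c["perf"], c["price"])
    print(f"{name}: {ratio:.3f} perf/$")
```

Even in the sketch you can see the problem: the old card's P/$ swings entirely on whatever price you plug in for it, which is why a retired card needs an estimated "fair" current price before the comparison means anything.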
On second thought, let's not go to TechReport. Tis a silly place.