We tested Oblivion by manually playing through a specific point in the game five times while recording frame rates using the FRAPS utility. Each gameplay sequence lasted 60 seconds. This method has the advantage of simulating real gameplay quite closely, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we've included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we've reported the median of the five low frame rates we encountered.
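The aggregation described above can be sketched in a few lines of Python. The numbers below are invented placeholders, not our actual results; each tuple stands in for one 60-second FRAPS run:

```python
import statistics

# Hypothetical results from five 60-second FRAPS sessions.
# Each tuple is (average FPS, lowest FPS) for one run;
# the values are made up for illustration only.
runs = [(52.3, 31), (49.8, 28), (51.1, 30), (53.0, 33), (50.4, 29)]

# Overall average across the five runs.
avg_fps = statistics.mean(avg for avg, _ in runs)

# Median of the five per-run lows, which damps the effect of
# a single outlier run with an unusually bad minimum.
median_low = statistics.median(low for _, low in runs)

print(f"average FPS: {avg_fps:.1f}")   # 51.3
print(f"median low FPS: {median_low}") # 30
```

Reporting the median low rather than the single worst frame rate means one anomalous stutter in one run can't dominate the result.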
We set Oblivion's graphical quality settings to "Ultra High." The screen resolution was set to 1600x1200, with HDR lighting enabled. 16X anisotropic filtering was forced on via the cards' driver control panels.
Nvidia's default texture filtering routine also produces quite a bit more moiré and pixel crawling in this game than ATI's does. Neither is perfect, though.
Personally, I thought that both the Radeon X1900 XT and the GeForce 7950 GT ran Oblivion reasonably well at these settings. They might be stretching a little, but not much.
Ghost Recon Advanced Warfighter
We tested GRAW with FRAPS, as well. We cranked up all of the quality settings for this game, with the exception of antialiasing. However, GRAW doesn't allow cards with 256MB of memory to run with its highest texture quality setting, so those cards were all running at the game's "Medium" texture quality.
Running around other areas of the game, I got the impression that the 7950 GT was a little faster than the X1900 XT 256MB. Both played the game acceptably, for the most part. Our FRAPS session produces some pretty low numbers because we're blowing stuff up; just sneaking around the city, you'll see higher frame rates than these. And if you run SLI, you'll be treated to playable frame rates at 2048x1536, as well.