Fear no evil—further tinkering
On the previous page, we saw that the GeForce GTX 460 and Radeon HD 6850 struggled to some extent with the "high" detail preset, while the GeForce GTX 560 Ti and Radeon HD 6950 1GB seemed like they might be able to do more work.
Let's see how the slower pair of graphics cards performs at the "medium" detail preset. Since we've established that raw FPS numbers can be misleading, we're going to lead with our frame latency graph.
Both cards clearly offer a smoother experience at this detail preset, with frame times hovering closer to, and often below, the 20-ms mark. Our FPS graph corroborates that—52 FPS works out to an average frame time of 19 ms. The FPS results don't tell the whole story, though. Again, the Radeon exhibits more frame latency spikes than the GeForce. When we filter out the 1% of highest frame latencies with our 99th-percentile calculation, the Radeon HD 6850 proves to be a higher-latency solution than the GeForce GTX 460, by a handful of milliseconds. Since we're talking about frame times in the 25-31 ms range, we expect these differences could be perceptible, though not huge.
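For readers who want to run the same numbers on their own frame-time logs, here's a minimal sketch of the two calculations above: converting an FPS average to a mean frame time, and filtering out the worst 1% of frames with a 99th-percentile cutoff. The function names and sample data are our own illustration, not part of any benchmarking tool.

```python
import math

def fps_to_frame_time_ms(fps):
    """Average frame time in milliseconds for a given FPS figure.
    e.g., 52 FPS works out to roughly 19 ms per frame."""
    return 1000.0 / fps

def percentile_99_ms(frame_times_ms):
    """Frame time below which 99% of frames fall, using the simple
    nearest-rank method. This discards the 1% of slowest frames,
    so a few isolated latency spikes don't dominate the result."""
    ordered = sorted(frame_times_ms)
    k = math.ceil(0.99 * len(ordered)) - 1  # 0-based nearest-rank index
    return ordered[k]

# Hypothetical log: 99 smooth frames near 19 ms plus one 45-ms spike.
sample = [19.0] * 99 + [45.0]
print(round(fps_to_frame_time_ms(52), 1))  # mean frame time at 52 FPS
print(percentile_99_ms(sample))            # spike is excluded from the cutoff
```

The point of the percentile cutoff is exactly what the text describes: a card can post a healthy FPS average while still hitching noticeably, and the 99th-percentile frame time exposes that.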
What about "ultra" detail on the higher-end cards?
So much for that. The GTX 560 Ti and 6950 1GB perform worse at the "ultra" setting than the GTX 460 and 6850 do at the "high" setting, with all too many frames taking 30 ms or longer to render. That might pass muster in other titles, but it severely degrades the experience in a twitch shooter like Battlefield 3.