
Far Cry 4


Click the buttons above to cycle through the plots. Each card's frame times are from one of the three test runs we conducted for that card.

Right away, the Titan X's plot looks pristine, with very few frame time spikes and a relatively steady cadence of new frames.

Click over to the plot for the Radeon R9 295 X2, though. That's the Titan X's closest competition from the Radeon camp, a Crossfire-on-a-stick monster with dual Hawaii GPUs and water cooling. In the first part of the run, the frame time plots are much more variable than on the Titan X. That's true even though we're measuring with Fraps, early in the frame production pipeline, not at the display using FCAT. (I'd generally prefer to test multi-GPU solutions with both Fraps and FCAT, given that they sometimes have problems with smooth frame dispatch and delivery. I just haven't been able to get FCAT working at 4K resolutions.) AMD's frame-pacing for CrossFire could possibly smooth the delivery of frames to the display beyond what we see in Fraps, but big timing disruptions in the frame creation process like we're seeing above are difficult to mask (especially since we're using a three-frame moving average to filter the Fraps data).
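The three-frame moving average mentioned above is a simple smoothing filter. As a minimal sketch (the frame times and window size here are illustrative assumptions, not our actual data), it might look like this:

```python
def moving_average(frame_times, window=3):
    """Smooth a list of frame times (ms) with a simple moving average.

    Each output value is the mean of `window` consecutive frame times,
    which damps single-frame spikes without hiding sustained disruptions.
    """
    smoothed = []
    for i in range(len(frame_times) - window + 1):
        smoothed.append(sum(frame_times[i:i + window]) / window)
    return smoothed

# A made-up run with one 45-ms spike at the third frame:
raw = [16.7, 16.9, 45.0, 16.8, 17.1]
print(moving_average(raw))
```

Note that even after averaging, the spike still pulls three consecutive output values well above the ~17-ms baseline, which is why the big timing disruptions in the 295 X2's plot survive the filter.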

Then look what happens later in the test session: frame times become even more variable. This is no fluke. It happens in each test run in pretty much the same way.

Note that we used a new Catalyst beta driver supplied directly by AMD for the R9 295 X2 in Far Cry 4. The current Catalyst Omega driver doesn't support multi-GPU in this game. That said, my notes from this test session pretty much back up what the Fraps results tell us. "New driver is ~50 FPS. Better than before, but seriously doesn't feel like 50 FPS on a single GPU."

Sorry to take the spotlight off of the Titan X, but it's worth noting what the new GeForce has to contend with. The Radeon R9 295 X2 is capable of producing some very high FPS averages, but the gaming experience it delivers doesn't always track with the traditional benchmark scores.

The gap between the FPS average and the 99th percentile frame time tells the story of the Titan X's smoothness and the R9 295 X2's stutter.
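To see why those two numbers can diverge, consider a hypothetical run (the data below are made up, not our measurements) where a handful of long frames barely dent the average but dominate the 99th percentile:

```python
def fps_average(frame_times_ms):
    """Average FPS over a run: frames rendered divided by total seconds."""
    total_s = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_s

def percentile_frame_time(frame_times_ms, pct=99):
    """Frame time at the given percentile (simple nearest-rank method)."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

# 95 smooth frames plus 5 long stutters:
run = [15.0] * 95 + [70.0] * 5
print(round(fps_average(run), 1))        # 56.3 -- looks healthy
print(percentile_frame_time(run))        # 70.0 -- exposes the stutter
```

A 56-FPS average suggests smooth animation, but the 99th-percentile frame time of 70 ms says one frame in a hundred takes as long as four smooth ones. That is roughly the shape of the 295 X2's problem.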

We can understand in-game animation fluidity even better by looking at the "tail" of the frame time distribution for each card, which illustrates what happens in the most difficult frames.

The 295 X2 produces the quickest 50-60% of the frames in our test sequence faster than anything else. That advantage fades as the proportion of frames rendered rises, though, and once we hit 85%, the 295 X2's frame times actually cross over and exceed those of the single Hawaii GPU aboard the R9 290X. By contrast, the Titan X's curve is low and flat.
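A tail curve of this sort is just the frame time at each percentile from the median up. Here is a sketch with synthetic data (the two "cards" below are invented to illustrate the crossover, not our measured results):

```python
def tail_curve(frame_times_ms, start=50):
    """Frame time at each percentile from `start` through 100."""
    ordered = sorted(frame_times_ms)
    n = len(ordered)
    return [(pct, ordered[min(n - 1, int(n * pct / 100))])
            for pct in range(start, 101)]

# Invented data: a dual-GPU card that's quick most of the time but has a
# long tail, versus a slower card that's perfectly consistent.
dual_gpu = [10.0] * 80 + [60.0] * 20
single_gpu = [20.0] * 100

# At the 85th percentile, the dual-GPU card's curve has crossed over:
print(dict(tail_curve(dual_gpu))[85], dict(tail_curve(single_gpu))[85])
```

The fast card wins every percentile up to its tail, then loses badly past the crossover point, which is exactly the pattern the 295 X2 shows against the 290X.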


These "time spent beyond X" graphs are meant to show "badness," those instances where animation may be less than fluid—or at least less than perfect. The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you're not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33 ms correlates to 30 FPS or a 30Hz refresh rate. Go beyond that with vsync on, and you're into the bad voodoo of quantization slowdowns. And 16.7 ms correlates to 60 FPS, that golden mark that we'd like to achieve (or surpass) for each and every frame.
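One plausible way to compute such a metric is to accumulate only the portion of each frame's time that falls past the cutoff (a sketch under that assumption; the frame times below are placeholders):

```python
def time_beyond(frame_times_ms, threshold_ms):
    """Total milliseconds accumulated past a threshold.

    Only the excess beyond the cutoff counts, so a 40-ms frame
    contributes 6.7 ms against a 33.3-ms threshold, and nothing
    against a 50-ms threshold.
    """
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Placeholder run with two long frames:
run = [16.0, 16.0, 40.0, 55.0, 16.0]
for cutoff in (50.0, 33.3, 16.7):
    print(cutoff, round(time_beyond(run, cutoff), 1))
```

Tightening the threshold from 50 ms to 16.7 ms pulls in progressively more "badness," which is why the stricter graphs separate the cards more clearly.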

One interesting quirk of this test is demonstrated in the 33-ms results. The GeForce GTX 980 produces almost every single frame in less than 33.3 milliseconds, nearly matching the Titan X. While playing, the difference in the smoothness of animation between the two cards isn't terribly dramatic.

Meanwhile, the GeForce GTX 780 Ti suffers by comparison to the GTX 980 and the R9 290X. I suspect that's because its 3GB of video memory isn't quite sufficient for this test scenario.