Is FCAT more accurate than Fraps for frame time measurements?


— 11:36 AM on July 22, 2015

Here's a geeky question, prompted by one of our discussions in the latest episode of the podcast, that deserves a solid answer. It has to do with our Inside the Second methods for measuring video game performance using frame times, as demonstrated in our Radeon R9 Fury review. Specifically, it concerns the software tool Fraps versus the FCAT tools that analyze video output.

TR reader TheRealSintel asks:

On the FRAPS/frametime discussion, I remember during the whole FCAT introduction that FRAPS was not ideal; I also heard that some vendors' performance can take a dive when FRAPS is enabled, etc.

I actually assumed the frametimes in each review were captured using FCAT instead of FRAPS.

When you guys introduce a new game to test, do you ever measure the difference between in-game reporting, FCAT and FRAPS?

I answered him in the comments, but I figure this answer is worth promoting to a blog entry. Here's my response:

There's a pretty widespread assumption at other sites that FCAT data is "better" since it comes from later in the frame production process, and some folks like to say Fraps is less "accurate" as a result. I dispute those notions. Fraps and FCAT are both accurate for what they measure; they just measure different points in the frame production process.

It's quite possible that Fraps data is a better indication of animation smoothness than FCAT data. For instance, a smooth line in an FCAT frame time distribution wouldn't lead to smooth animation if the game engine's internal simulation timing doesn't match well with how frames are being delivered to the display. The simulation's timing determines the *content* of the frames being produced, and you must match the sim timing to the display timing to produce optimally fluid animation. Even "perfect" delivery of the frames to the display will look awful if the visual information in those frames is out of sync.
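
To make that concrete, here's a minimal Python sketch with invented numbers: the frames reach the display at a perfectly even cadence (what an FCAT trace would show), but the amount of simulation time baked into each frame varies, so the on-screen motion still stutters.

    # Hypothetical numbers: even delivery to the display, uneven sim time per frame.
    display_intervals = [16.7, 16.7, 16.7, 16.7]   # ms between flips (what FCAT measures)
    sim_timesteps     = [16.7, 25.0,  8.4, 16.7]   # ms of game time advanced in each frame

    for disp, sim in zip(display_intervals, sim_timesteps):
        # When sim != disp, on-screen motion speeds up or slows down even though
        # the frames arrive at a perfectly regular cadence.
        verdict = "smooth" if abs(sim - disp) < 1.0 else "judder"
        print(f"delivered after {disp} ms, but shows {sim} ms of motion -> {verdict}")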

What we do now for single-GPU reviews is use Fraps data (or in-engine data for a few games) and filter the Fraps results with a three-frame moving average. This filter accounts for the effects of the three-frame submission queue in Direct3D, which can allow games to tolerate some amount of "slop" in frame submission timing. With this filter applied, any big spikes you see in the frame time distribution are likely to carry through to the display and show up in FCAT data. In fact, this filtered Fraps data generally looks almost identical to FCAT results for single-GPU configs. I'm confident it's as good as FCAT data for single-GPU testing.
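
For the curious, here's a rough Python sketch of that kind of three-frame moving average. This isn't our actual tooling, and the frame times are made up, but it shows how the filter absorbs a single mistimed submission, much as the three-frame queue can, while a sustained spike would still show through.

    # A rough sketch, not TR's actual tooling: smooth Fraps-style frame times (ms)
    # with a trailing three-frame moving average.
    def three_frame_moving_average(frame_times):
        smoothed = []
        for i in range(len(frame_times)):
            window = frame_times[max(0, i - 2): i + 1]   # up to the three most recent frames
            smoothed.append(sum(window) / len(window))
        return smoothed

    raw = [16.7, 16.7, 33.4, 8.0, 16.7, 16.7]            # hypothetical capture with one late frame
    print(three_frame_moving_average(raw))
    # The lone 33 ms frame gets averaged down to ~22 ms; a sustained run of 33 ms frames would not.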

For multi-GPU configs, things become more complicated because frame metering/pacing comes into the picture. In that case, Fraps and FCAT may look rather different. That said, a smooth FCAT line alone is no guarantee of smooth animation with multi-GPU. Frame metering only works well when the game advances its simulation time using a moving average or a fixed cadence. If the game just uses the wall clock for the current frame, then metering can be a detriment. And from what I gather, game engines vary on this point.
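
Here's another invented-numbers Python sketch of that contrast. Metering delivers frames to the display evenly either way; whether the content of those frames is smooth depends on whether the engine advances its clock by the raw wall-clock gap or by an average of recent gaps.

    # Hypothetical AFR scenario: metering evens out delivery, but CPU-side frame
    # submission alternates between short and long gaps.
    metered_display_ms = [16.7] * 6                        # flip-to-flip intervals on screen
    raw_submit_gaps_ms = [10.0, 23.4, 10.0, 23.4, 10.0, 23.4]

    # Policy A: advance sim time by the raw wall-clock gap -> motion lurches every other frame.
    wall_clock_steps = raw_submit_gaps_ms

    # Policy B: advance sim time by the average of recent gaps -> motion matches delivery.
    avg = sum(raw_submit_gaps_ms) / len(raw_submit_gaps_ms)
    averaged_steps = [round(avg, 1)] * 6

    print("display intervals:   ", metered_display_ms)
    print("wall-clock sim steps:", wall_clock_steps)
    print("averaged sim steps:  ", averaged_steps)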

(Heck, the best behavior for game engine timing for SLI and CrossFire—advancing the timing using a moving average or fixed cadence—is probably the opposite of what you'd want to do for a variable-refresh display with G-Sync or FreeSync.)

That's why we've been generally wary of AFR-based multi-GPU and why we've provided video captures for some mGPU reviews. See here.

At the end of the day, a strong correlation between Fraps and FCAT data would be a better indication of smooth in-game animation than either indicator alone, but capturing that data and quantitatively correlating it is a pain in the rear and a lot of work. No one seems to be doing that (yet?!).
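
If someone did take it on, the arithmetic itself is trivial; the work is in capturing the two data sets and aligning them frame by frame. Assuming you had aligned, per-frame numbers from both tools, a basic check could look like this hypothetical Python snippet.

    # Hypothetical, frame-aligned data from both tools (ms per frame).
    from statistics import correlation   # Python 3.10+

    fraps_ms = [16.7, 18.0, 33.5, 16.9, 17.1]
    fcat_ms  = [16.8, 17.5, 34.0, 17.0, 16.9]

    r = correlation(fraps_ms, fcat_ms)     # Pearson's r
    print(f"Pearson correlation: {r:.3f}") # close to 1.0 means the two views largely agree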

Even further at the end of the day, all of the slop in the pipeline between the game's simulation and the final display is less of a big deal than you might think so long as the frame times are generally low. That's why we concentrate on frame times above all, and I'm happy to sample at the point in the process that Fraps does in order to measure frame-to-frame intervals.

I should also mention: I don't believe the presence of the Fraps overlay presents any more of a performance problem than the presence of the FCAT overlay when running a game. The two things work pretty much the same way, and years of experience with Fraps tells me its performance impact is minimal.

Here's hoping that answer helps. This is tricky stuff. There are also the very practical challenges involved in FCAT use, like the inability to handle single-tile 4K properly and the huge amount of data generated, that make it more trouble than it's worth for single-GPU testing. I think both tools have their place, as does the in-engine frame time info we get from games like BF4.

In fact, the ideal combination of game testing tools would be: 1) in-engine frame time recordings that reflect the game's simulation time combined with 2) a software API from the GPU makers that reflects the flip time for frames at the display. (The API would eliminate the need for fussy video capture hardware.) I might add: 3) a per-frame identification key that would let us track when the frames produced in the game engine are actually hitting the display, so we can correlate directly.
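
None of that exists today, but to illustrate item 3, here's a hypothetical Python sketch of what the per-frame key would enable: join the engine's record of each frame's simulation time against the display side's record of its flip time, and you can compare sim-time advance directly against flip-to-flip intervals.

    # Invented data for illustration: per-frame records keyed by a shared frame ID.
    engine_log  = {1: 0.0, 2: 16.7, 3: 41.7, 4: 50.1}    # frame id -> sim time (ms)
    display_log = {1: 30.0, 2: 46.7, 3: 70.1, 4: 80.1}   # frame id -> flip time (ms)

    frame_ids = sorted(engine_log)
    for prev, cur in zip(frame_ids, frame_ids[1:]):
        sim_step  = engine_log[cur]  - engine_log[prev]    # motion encoded in this frame
        flip_step = display_log[cur] - display_log[prev]   # how long the previous frame stayed on screen
        print(f"frame {cur}: sim advanced {sim_step:.1f} ms, flip interval {flip_step:.1f} ms")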

For what it's worth, I have asked the GPU makers for the API mentioned in item 2, but they'd have to agree on something in common in order for that idea to work. So far, nobody has made it a priority.
