Personal computing discussed

Moderators: renee, Dposcorp

 
MrJP
Gerbil Elite
Posts: 836
Joined: Wed Nov 20, 2002 5:04 pm
Location: Surrey, UK

Re: techreport, meet anandtech

Wed Mar 27, 2013 2:27 pm

Does anyone else feel even slightly apprehensive that sites are going to use a tool supplied by nvidia as their source of benchmark results?

I also firmly agree that timing at both ends of the pipe needs to be checked. A mismatch in timing between simulation and display is going to be felt at least as much as uneven spacing of displayed frames.

Overall I do feel that Anandtech are lagging behind the other sites here. Their GPU reviews have for a while seemed very much like a conveyor belt: lots of benchmarks, but not much considered thinking about what they are testing and why. It's not a place where you expect to see anything surprising or novel that hasn't already been picked up elsewhere. At least not with GPUs. Anand's own articles about SSDs are usually excellent.
i7-6700K ~ Asus Z170I Pro Gaming ~ 16GB Corsair DDR4-2400 ~ Radeon R9 Nano
Noctua NHU9S ~ Corsair Obsidian 250D ~ Corsair RM650X ~ Crucial MX100 512GB
SMSL M3 ~ Sennheiser HD555 ~ Logitech G500S/G910/G29 ~ LG 34UC79G
 
cobalt
Gerbil First Class
Posts: 171
Joined: Mon Oct 30, 2006 11:28 am

Re: techreport, meet anandtech

Wed Mar 27, 2013 3:59 pm

I quite liked what they did for a little while there with cross-cutting comparisons. Like in pages 4-6 of this article:
http://www.anandtech.com/show/2745/4

They show a little higher-level analysis pretty well. For example, instead of simply showing one-game-detail-per-page, they have one page devoted to 4890 vs GTX 275 across games and resolutions, and then on another page they show what performance hit you take if you drop down $70 to the 4870, and then on another page show the $180 shootout between the 4870 and the 260 core 216. But I haven't seen that kind of higher-level summary from them in a few years.
 
Damage
Gerbil Jedi
Posts: 1787
Joined: Wed Dec 26, 2001 7:00 pm
Location: Lee's Summit, Missouri, USA
Contact:

Re: techreport, meet anandtech

Wed Mar 27, 2013 4:49 pm

Scott Wasson - "Damage"
 
Firestarter
Gerbil Elite
Posts: 773
Joined: Sun Apr 25, 2004 11:12 am

Re: techreport, meet anandtech

Wed Mar 27, 2013 5:30 pm

MrJP wrote:
Does anyone else feel even slightly apprehensive that sites are going to use a tool supplied by nvidia as their source of benchmark results?

Slightly. I mean, it does highlight a problem with AMD's cards, but then again if AMD hadn't already had their own toolset to identify the problems, they sure do have one now.
 
CampinCarl
Graphmaster Gerbil
Posts: 1363
Joined: Mon Jul 04, 2005 9:53 pm

Re: techreport, meet anandtech

Wed Mar 27, 2013 6:26 pm

Firestarter wrote:
MrJP wrote:
Does anyone else feel even slightly apprehensive that sites are going to use a tool supplied by nvidia as their source of benchmark results?

Slightly. I mean, it does highlight a problem with AMD's cards, but then again if AMD hadn't already had their own toolset to identify the problems, they sure do have one now.


Also, I'd like to think that AMD's engineers could easily reverse engineer everything in the tools suite (hardware, software) to make sure that there's no funny business. If they do find any, then they get to use it as PR against nVidia. Like I said on the front page comment thread, if this type of thing is the way forward (until we get to the point where, as Scott mentioned, both the GPU devs and game devs give benchmarkers and the public at large access into the 'chain' to take measurements), then the whole suite should be FOSS.
Gigabyte AB350M Gaming-3 | R7 1700X | 2x8 GB Corsair Vengeance DDR4-3200 (@DDR4-2933)| Samsung 960 Evo 1TB SSD | Gigabyte GTX1080 | Win 10 Pro x86-64
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: techreport, meet anandtech

Wed Mar 27, 2013 6:34 pm

CampinCarl wrote:
Firestarter wrote:
MrJP wrote:
Does anyone else feel even slightly apprehensive that sites are going to use a tool supplied by nvidia as their source of benchmark results?

Slightly. I mean, it does highlight a problem with AMD's cards, but then again if AMD hadn't already had their own toolset to identify the problems, they sure do have one now.


Also, I'd like to think that AMD's engineers could easily reverse engineer everything in the tools suite (hardware, software) to make sure that there's no funny business. If they do find any, then they get to use it as PR against nVidia. Like I said on the front page comment thread, if this type of thing is the way forward (until we get to the point where, as Scott mentioned, both the GPU devs and game devs give benchmarkers and the public at large access into the 'chain' to take measurements), then the whole suite should be FOSS.


I'll bet it gets FOSS fast, and even multi-platform (could what FRAPS does be done in Java?). Then again, I'm still aghast that GPU vendors and engine developers haven't been doing this kind of testing all along. How on earth would you otherwise measure the 'experience' of your game, especially when it comes to consoles and recommended minimum hardware requirements?

And to that, I've always looked at 'Recommended' hardware requirements and laughed heartily. If anything, those were the real 'minimums,' and what the publisher puts in those columns stinks of being written entirely by marketing departments.
 
PainIs4ThaWeak1
Gerbil
Posts: 77
Joined: Fri Jun 26, 2009 11:13 am

Re: techreport, meet anandtech

Wed Mar 27, 2013 6:54 pm

Scott - Posted a question for you -> http://techreport.com/discussion/24553/ ... ost=719800
 
MrJP
Gerbil Elite
Posts: 836
Joined: Wed Nov 20, 2002 5:04 pm
Location: Surrey, UK

Re: techreport, meet anandtech

Wed Mar 27, 2013 7:31 pm

cobalt wrote:
I quite liked what they did for a little while there with cross-cutting comparisons. Like in pages 4-6 of this article:
http://www.anandtech.com/show/2745/4

They show a little higher-level analysis pretty well. For example, instead of simply showing one-game-detail-per-page, they have one page devoted to 4890 vs GTX 275 across games and resolutions, and then on another page they show what performance hit you take if you drop down $70 to the 4870, and then on another page show the $180 shootout between the 4870 and the 260 core 216. But I haven't seen that kind of higher-level summary from them in a few years.


Yes, but that article is knocking on four years old, and co-authored by Anand. I've not been nearly so impressed with anything more recent, to the point where they are probably only about fourth or fifth in the list of sites I'd look to for a GPU review. Don't get me wrong, they're still competent and unbiased (sadly not as common as you'd hope in tech sites), I just think that as they've grown they've let this area slide relative to their previous high standards.

This new FCAT study is a case in point. It was interesting to read about this over at Anandtech earlier today, but the article (and the earlier one about AMD and FRAPS) just never really seemed to get into the problem as I was expecting. I accept there's a Part 2 yet to come, but as it stands it's almost like they are happy to just parrot back what AMD and Nvidia have been telling them.

Now I check back here this evening and find Scott's new article. What a contrast. Far more incisive, and really getting into the crux of the problem without taking anything at face value. Impressive stuff.
i7-6700K ~ Asus Z170I Pro Gaming ~ 16GB Corsair DDR4-2400 ~ Radeon R9 Nano
Noctua NHU9S ~ Corsair Obsidian 250D ~ Corsair RM650X ~ Crucial MX100 512GB
SMSL M3 ~ Sennheiser HD555 ~ Logitech G500S/G910/G29 ~ LG 34UC79G
 
MadManOriginal
Gerbil Jedi
Posts: 1533
Joined: Wed Jan 30, 2002 7:00 pm
Location: In my head...

Re: techreport, meet anandtech

Thu Mar 28, 2013 5:32 am

MrJP wrote:
cobalt wrote:
I quite liked what they did for a little while there with cross-cutting comparisons. Like in pages 4-6 of this article:
http://www.anandtech.com/show/2745/4

They show a little higher-level analysis pretty well. For example, instead of simply showing one-game-detail-per-page, they have one page devoted to 4890 vs GTX 275 across games and resolutions, and then on another page they show what performance hit you take if you drop down $70 to the 4870, and then on another page show the $180 shootout between the 4870 and the 260 core 216. But I haven't seen that kind of higher-level summary from them in a few years.


Yes, but that article is knocking on four years old, and co-authored by Anand. I've not been nearly so impressed with anything more recent, to the point where they are probably only about fourth or fifth in the list of sites I'd look to for a GPU review. Don't get me wrong, they're still competent and unbiased (sadly not as common as you'd hope in tech sites), I just think that as they've grown they've let this area slide relative to their previous high standards.

This new FCAT study is a case in point. It was interesting to read about this over at Anandtech earlier today, but the article (and the earlier one about AMD and FRAPS) just never really seemed to get into the problem as I was expecting. I accept there's a Part 2 yet to come, but as it stands it's almost like they are happy to just parrot back what AMD and Nvidia have been telling them.

Now I check back here this evening and find Scott's new article. What a contrast. Far more incisive, and really getting into the crux of the problem without taking anything at face value. Impressive stuff.


I'd say it's the difference between being a reporter and an investigative journalist. The AT article is the former, TR the latter. Scott obviously has more invested into the topic as well. I was glad when I saw his article posted because it seemed like AT, while giving credit, was also trying to take some kind of high ground to justify being late to the game.
 
south side sammy
Gerbil
Topic Author
Posts: 67
Joined: Tue Nov 10, 2009 6:48 pm

Re: techreport, meet anandtech

Thu Mar 28, 2013 2:32 pm

When it rains, it pours. Now Anandtech is going to have to carefully research this and come up with its own conclusions, and AMD is going to have to do something other than pick apart other software/hardware to blame. Guru3D is now using frame latency benchmarking.

"Lately there have been some new measurement introduced, latency measurements or what I like to call frame experience measurements. When you record a certain number of seconds of a recording whilst tracking the number of frames within a set time, then output that in a graph and then zoom in, then you can see the turnaround time for the GPU. Basically the time it takes to render one frame can be monitored and tagged with a number, this is latency. One frame can take say 17ms. A colleague website discovered a while ago that there were some latency discrepancies in-between NVIDIA and AMD graphics cards."

http://www.guru3d.com/articles_pages/bi ... ark,1.html
 
MrJP
Gerbil Elite
Posts: 836
Joined: Wed Nov 20, 2002 5:04 pm
Location: Surrey, UK

Re: techreport, meet anandtech

Thu Mar 28, 2013 4:00 pm

Guru3D wrote:
Mind you that Average FPS matters more then frametime measurements. It's just an additional page or two of information that from now on we'll be serving you.

Well they just don't get it at all.
i7-6700K ~ Asus Z170I Pro Gaming ~ 16GB Corsair DDR4-2400 ~ Radeon R9 Nano
Noctua NHU9S ~ Corsair Obsidian 250D ~ Corsair RM650X ~ Crucial MX100 512GB
SMSL M3 ~ Sennheiser HD555 ~ Logitech G500S/G910/G29 ~ LG 34UC79G
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: techreport, meet anandtech

Thu Mar 28, 2013 4:21 pm

MrJP wrote:
Guru3D wrote:
Mind you that Average FPS matters more then frametime measurements. It's just an additional page or two of information that from now on we'll be serving you.

Well they just don't get it at all.


Hmm. I wanted to argue for their statement, as average FPS gives you the 'best' indication of performance potential, but potential performance is not the same as actual performance; otherwise the HD 7970 would be as fast as a Titan, and Crossfire would be amazing.
 
Firestarter
Gerbil Elite
Posts: 773
Joined: Sun Apr 25, 2004 11:12 am

Re: techreport, meet anandtech

Thu Mar 28, 2013 4:42 pm

MrJP wrote:
Guru3D wrote:
Mind you that Average FPS matters more then frametime measurements. It's just an additional page or two of information that from now on we'll be serving you.

Well they just don't get it at all.

Well, to be fair, if the frame times and stuttering and all that are OK (that is, normal and not all kinds of screwed up), then the average FPS metric really is a great way of comparing performance. Not saying that it's better than comparing 99th percentile framerate or something like that, but it's still a very valid way of looking at performance. In a sense, you could forgo FRAPS and FCAT completely if it's clear that the game is completely smooth. However, that case of a completely smooth game is a rare one indeed, and you could further argue that if a game is smooth enough not to bother with detailed frametime analysis, it's smooth enough not to bother with any analysis at all.
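Firestarter's point is easy to see in numbers. Here is a minimal sketch (Python and NumPy are my choice here, not anything used in the thread; the two runs are made up) of two hypothetical cards with the same average FPS but very different smoothness:

```python
import numpy as np

# Two hypothetical runs of 100 frames each (times in milliseconds).
smooth = np.full(100, 20.0)            # a steady 20 ms every frame
stutter = np.array([5.0, 35.0] * 50)   # alternating 5 ms / 35 ms frames

for name, times in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000.0 * len(times) / times.sum()   # both come out to 50 FPS
    p99 = np.percentile(times, 99)                # 20 ms vs 35 ms
    print(f"{name}: {avg_fps:.0f} FPS average, 99th-percentile {p99:.0f} ms")
```

Average FPS alone calls the two runs identical; only the percentile number separates them.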
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: techreport, meet anandtech

Thu Mar 28, 2013 5:24 pm

I don't really understand the need to emphasize one over the other (FPS vs frame time or latency). They both get at the same thing from different angles. FPS is a good quick summary of average performance, and frame time and latency provide extra information to let you know how reflective the average performance (FPS) is of smooth, even 3D visuals (at least in theory, and so far this has more or less been shown to be the case to the extent that it can be validated). Frame time is really getting at the same thing as FPS, just magnified down to individual frames, instead of some aggregate measure of the number of frames that occur in one second, on average, across the test run.

The only reason why FPS is considered by some to be more "important" is because A) It's a simpler concept to conceptualize and therefore more accessible and B) It's the status quo.

Declaring one supreme over the other is silly. It's kind of like trying to decide how warmly to dress before going outside by looking only at the temperature (FPS). Sure, that's a great start, but knowing wind chill, relative humidity, and whether (and how much) precipitation is expected, etc. - all factors that affect how warm you feel at a given temperature depending on what you're wearing (analogous to frame time data) - will better equip you to make your decision. E.g., these additional meteorological factors will tell you whether you will actually be warm enough in shorts at 65 degrees... Just like frame latency data will give you an idea of whether an FPS of 40 or 50 is reflective of smooth performance. The analogy isn't perfect because FPS and frame time metrics are even more closely related...
 
Captain Ned
Global Moderator
Posts: 28704
Joined: Wed Jan 16, 2002 7:00 pm
Location: Vermont, USA

Re: techreport, meet anandtech

Thu Mar 28, 2013 6:35 pm

cynan wrote:
The only reason why FPS is considered by some to be more "important" is because A) It's a simpler concept to conceptualize and therefore more accessible and B) It's the status quo.

And it's better when bigger instead of 99th percentile time where better is smaller. Far too many purchasers have invested far too much internetz in having the bigger number.
What we have today is way too much pluribus and not enough unum.
 
deathBOB
Gerbil
Posts: 16
Joined: Sat Jul 23, 2005 10:54 pm

Re: techreport, meet anandtech

Fri Mar 29, 2013 9:08 am

Captain Ned wrote:
cynan wrote:
The only reason why FPS is considered by some to be more "important" is because A) It's a simpler concept to conceptualize and therefore more accessible and B) It's the status quo.

And it's better when bigger instead of 99th percentile time where better is smaller. Far too many purchasers have invested far too much internetz in having the bigger number.


No one wants an "inny" E-peen instead of an "outy"!

I also found the Anandtech article a bit annoying, particularly because it declares FRAPS useless before even testing that claim.
 
MrJP
Gerbil Elite
Posts: 836
Joined: Wed Nov 20, 2002 5:04 pm
Location: Surrey, UK

Re: techreport, meet anandtech

Fri Mar 29, 2013 9:25 am

But if you like big numbers you can easily convert the 99th percentile frametime into an equivalent instantaneous FPS. TR already do this in the conclusion page value plots because it makes them easier to understand, and it was noticeable that the Nvidia-provided FCAT percentile plots did the same thing, for the same reason.

It's been a long-running argument in the article comments, but I still think this nicely bridges the gap between the more informative frametime analysis and people's existing comfort level with understanding FPS. Using the compulsory automotive analogy, my car displays instantaneous MPH and I find that much easier to grasp than if it displayed seconds per wheel revolution.

At the end of the day this is really a minor style issue, and Damage can plot whatever he thinks best as long as the interesting test results keep coming. 8)

My earlier comment about the Guru3D article was perhaps a bit unclear. I didn't mean to say that overall average FPS is irrelevant (it's not), but their words suggest that they are massively underestimating how important the measurement of smoothness is when coupled with the average FPS. They seem to be regarding it as a bit of bolt-on analysis to be done just because others are doing it, but not actually to be believed or discussed. It just comes over as lazy and ill-informed.

Other than TR and PC Perspective, the "inside the second" way of looking at things seems to be uncovering a lack of real critical thinking in a lot of other sites.
i7-6700K ~ Asus Z170I Pro Gaming ~ 16GB Corsair DDR4-2400 ~ Radeon R9 Nano
Noctua NHU9S ~ Corsair Obsidian 250D ~ Corsair RM650X ~ Crucial MX100 512GB
SMSL M3 ~ Sennheiser HD555 ~ Logitech G500S/G910/G29 ~ LG 34UC79G
 
SuperSpy
Minister of Gerbil Affairs
Posts: 2403
Joined: Thu Sep 12, 2002 9:34 pm
Location: TR Forums

Re: techreport, meet anandtech

Fri Mar 29, 2013 9:42 am

I like seeing frame latency over FPS, it reminds me of 'service time'-style server benchmarks.
Desktop: i7-4790K @4.8 GHz | 32 GB | EVGA Gefore 1060 | Windows 10 x64
Laptop: MacBook Pro 2017 2.9GHz | 16 GB | Radeon Pro 560
 
Captain Ned
Global Moderator
Posts: 28704
Joined: Wed Jan 16, 2002 7:00 pm
Location: Vermont, USA

Re: techreport, meet anandtech

Fri Mar 29, 2013 9:44 am

MrJP wrote:
But if you like big numbers you can easily convert the 99th percentile frametime into an equivalent instantaneous FPS.

Ugh. Yes, you can, but when you do you still exclude the huge outliers that kill the gaming experience and that Damage has so easily dealt with in the first page of any "Inside the Second" article. Averages hide extremes and the new paradigm in GFX reviews is to find the extremes.

Is it really the case that so many people are so caught up in "my card has 5 FPS over your card"? Really? You'll take the 200ms frame that kills the experience just because your FPS is bigger?

I see a sad analogy to the watts per channel wars of late 1970's magazine ads for stereo receivers.

Bloody hell. As I typed that last line (and I was around for those ads) a flash came to me. The '70s WPC wars were controlled by nicely asking manufacturers to cite the Total Harmonic Distortion at said power rating. When it became clear that the power claims came with 10% THD, the power claims were scaled back to WPC at THD under 1%. I can see the same regime for GFX cards, citing overall FPS with 99th percentile times of X, with smaller values of X (i.e. THD) being subjectively better. The remaining issue is the identification of a standard benchmark for determining 99th percentile frame latency. You get both the overall speed of the card combined with its ability to pass frames without clogging up. I think this is doable.

I feel like that young girl in a café in Rickmansworth who sussed it all out 2 seconds before the Vogons blew it all up.
What we have today is way too much pluribus and not enough unum.
 
DPete27
Grand Gerbil Poohbah
Posts: 3776
Joined: Wed Jan 26, 2011 12:50 pm
Location: Wisconsin, USA

Re: techreport, meet anandtech

Fri Mar 29, 2013 11:19 am

Damage wrote:

Scott, I'd like to see you revisit the multi-GPU "micro-stutter" problem once AMD releases their July driver update. Apparently AMD is currently favoring input lag at the expense of frame interval smoothness (the opposite of what Nvidia is currently doing). Their July driver update is supposed to allow the user to choose which they would like to prioritize.
Main: i5-3570K, ASRock Z77 Pro4-M, MSI RX480 8G, 500GB Crucial BX100, 2 TB Samsung EcoGreen F4, 16GB 1600MHz G.Skill @1.25V, EVGA 550-G2, Silverstone PS07B
HTPC: A8-5600K, MSI FM2-A75IA-E53, 4TB Seagate SSHD, 8GB 1866MHz G.Skill, Crosley D-25 Case Mod
 
MrJP
Gerbil Elite
Posts: 836
Joined: Wed Nov 20, 2002 5:04 pm
Location: Surrey, UK

Re: techreport, meet anandtech

Fri Mar 29, 2013 11:48 am

Captain Ned wrote:
Ugh. Yes, you can, but when you do you still exclude the huge outliers that kill the gaming experience and that Damage has so easily dealt with in the first page of any "Inside the Second" article. Averages hide extremes and the new paradigm in GFX reviews is to find the extremes.

I think you misunderstand. I'm not talking about going back to the average FPS numbers which would throw away all the useful data. I'm only talking about changing the axis in the plot to frames/second rather than seconds/frame. You still plot the datapoint for every individual frame, you just present it as equivalent instantaneous frames/second simply by taking the reciprocal: instantaneous frames/second = 1/frametime. As in the table that you often see at the start of the reviews, 10ms=100fps, 16.7ms=60fps, 20ms=50fps, 33ms=30fps and so on. This is just a question of units, not changing the data that is presented.
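The conversion MrJP describes is just the reciprocal; a tiny illustrative sketch (Python, values from the table he mentions):

```python
# Frame times (ms) mapped to the equivalent instantaneous FPS
# by taking the reciprocal: fps = 1000 / frame_time_ms.
frame_times_ms = [10.0, 16.7, 20.0, 33.3, 100.0]
instantaneous_fps = [1000.0 / t for t in frame_times_ms]

for ms, fps in zip(frame_times_ms, instantaneous_fps):
    print(f"{ms:5.1f} ms/frame  ->  {fps:5.1f} FPS")
# 10 ms -> 100 FPS, 16.7 ms -> ~60 FPS, 20 ms -> 50 FPS,
# 33.3 ms -> ~30 FPS, 100 ms -> 10 FPS
```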

The plots still show the same frame-to-frame variance, spikes, etc.; it's just done in a way where the higher-performing parts are towards the top of the graph, and people can look at it and make straightforward judgements like "this one hovers around 60fps but often spikes down to 10fps" rather than "this one hovers around 16ms frametime but often spikes up to 100ms". See how that's easier to relate to for most gamers?

Even if you don't think this makes sense for the frametime vs frame number plot itself, it definitely makes sense for the percentile plots. Indeed that's exactly what is done in the example Nvidia FCAT analysis plot as seen in Scott's article here. The faster solutions are higher up the graph, and the ones with smoothness problems drop away earlier going from left-to-right. It shows exactly the same data as the TR percentile plots, I just think it's presented in a more intuitive fashion.

Sorry for going further off-topic, and I appreciate this has been discussed in previous threads, I just feel that a lot of people who are against this suggestion are misinterpreting it as a desire to go back to the old average FPS analysis. Nothing could be further from the truth.
i7-6700K ~ Asus Z170I Pro Gaming ~ 16GB Corsair DDR4-2400 ~ Radeon R9 Nano
Noctua NHU9S ~ Corsair Obsidian 250D ~ Corsair RM650X ~ Crucial MX100 512GB
SMSL M3 ~ Sennheiser HD555 ~ Logitech G500S/G910/G29 ~ LG 34UC79G
 
Captain Ned
Global Moderator
Posts: 28704
Joined: Wed Jan 16, 2002 7:00 pm
Location: Vermont, USA

Re: techreport, meet anandtech

Fri Mar 29, 2013 12:02 pm

MrJP wrote:
Sorry for going further off-topic, and I appreciate this has been discussed in previous threads, I just feel that a lot of people who are against this suggestion are misinterpreting it as a desire to go back to the old average FPS analysis. Nothing could be further from the truth.

When the core measurement is a unit of time, why convert it into a derived measurement? Converting a time-based latency plot into an FPS-based plot hides the damage (heh). Face it. The defining numbers in future GFX reviews will not be large, they will be very small.
What we have today is way too much pluribus and not enough unum.
 
flip-mode
Grand Admiral Gerbil
Posts: 10218
Joined: Thu May 08, 2003 12:42 pm

Re: techreport, meet anandtech

Fri Mar 29, 2013 12:37 pm

MrJP wrote:
I think you misunderstand. I'm not talking about going back to the average FPS numbers which would throw away all the useful data. I'm only talking about changing the axis in the plot to frames/second rather than seconds/frame. You still plot the datapoint for every individual frame, you just present it as equivalent instantaneous frames/second simply by taking the reciprocal: instantaneous frames/second = 1/frametime. As in the table that you often see at the start of the reviews, 10ms=100fps, 16.7ms=60fps, 20ms=50fps, 33ms=30fps and so on. This is just a question of units, not changing the data that is presented.

Sorry for going further off-topic, and I appreciate this has been discussed in previous threads, I just feel that a lot of people who are against this suggestion are misinterpreting it as a desire to go back to the old average FPS analysis. Nothing could be further from the truth.


Hmm.... I've made this suggestion before (and I've specifically mentioned the term "instantaneous FPS")... and I've stepped away from the suggestion because I've come to believe it is entirely without purpose. What is the purpose of stating things in terms of FPS (in which higher is better) when in fact the real metrics of interest are frame latency and frame latency variation?

What is the point of even referring to FPS? Why use the term instantaneous FPS? It's just an attempt to cling to an old way of thinking. But it's the old way of thinking that is broken and so we really should look away from it and look beyond it. Use the terminology that is most accurate and that really describes exactly what we're talking about. And we're not talking about instantaneous FPS even if we can force that term to work for us. What we're really talking about is frame latency and frame latency variation. Why force new terms when the truly accurate terms are already available? We have the right terminology already, so there's no need to invent less precise terms to take the place of the words that we really mean to say.
 
southrncomfortjm
Gerbil Elite
Posts: 574
Joined: Mon Nov 12, 2012 7:57 pm

Re: techreport, meet anandtech

Fri Mar 29, 2013 1:11 pm

Damage wrote:


My main takeaway from that TR article is that FRAPS is pretty decent at doing what we all wish it did. It's a close enough approximation of what we want that we can use it without being too far off base. Good interpretation?

I hope that's right. I read the article late at night and haven't had a chance to get back to it.
Gaming: i5-3570k/Z77/212 Evo/Corsair 500R/16GB 1600 CL8/RX 480 8GB/840 250gb, EVO 500gb, SG 3tb/Tachyon 650w/Win10
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: techreport, meet anandtech

Fri Mar 29, 2013 2:03 pm

southrncomfortjm wrote:
Damage wrote:


My main takeaway from that TR article is that FRAPS is pretty decent at doing what we all wish it did. It's a close enough approximation of what we want that we can use it without being too far off base. Good interpretation?

I hope that's right. I read the article late at night and haven't had a chance to get back to it.


Thinking about it, I don't think that there's anything that can be improved without better tools. Even the newest one still doesn't measure 'total system response time,' which is the point we're all trying to get at (and have been since the dawn of interactive computer games).

Average FPS still allows you to rank cards by their performance potential, and provides the best 'first look' into how they might perform. It's a good thing, and will continue to be relevant, especially as the various parties involved in computer gaming focus on reducing and smoothing all of the various latencies involved.

Instantaneous FPS isn't something weird or different or evil either; it's just like a speedometer, as Flip said above, and it explains frame latency irregularities in terms of FPS. It's how I've thought of it for the last decade or so, when trying to explain to myself or someone else how something like Counter-Strike running at 150 FPS could still feel choppy. It's saying that while you're still getting some obscene framerate, for some period of time less than a full second your framerate dropped below, say, 10 FPS. The only real problem is that it's seemingly more difficult to illustrate in visual data than the methods Scott has come up with for frametimes.
 
Waco
Maximum Gerbil
Posts: 4850
Joined: Tue Jan 20, 2009 4:14 pm
Location: Los Alamos, NM

Re: techreport, meet anandtech

Fri Mar 29, 2013 3:09 pm

MrJP wrote:
The plots still show the same frame-to-frame variance, spikes, etc.; it's just done in a way where the higher-performing parts are towards the top of the graph, and people can look at it and make straightforward judgements like "this one hovers around 60fps but often spikes down to 10fps" rather than "this one hovers around 16ms frametime but often spikes up to 100ms". See how that's easier to relate to for most gamers?

I'm behind showing both - because explaining FPS to people is difficult enough. Jumping from FPS to frame times just makes it even more difficult to explain things.

People want a simple metric and they have grasped the notion of FPS pretty well by now. Presenting the same data in a different way (arguably in a non-sensical format, but oh well) that makes it more understandable to the masses is probably not a bad thing.

I know "instantaneous" FPS is a contradiction in terms, but it's a lot easier to explain... and bigger numbers are still better. :P
Last edited by Waco on Fri Mar 29, 2013 3:16 pm, edited 1 time in total.
Victory requires no explanation. Defeat allows none.
 
Captain Ned
Global Moderator
Posts: 28704
Joined: Wed Jan 16, 2002 7:00 pm
Location: Vermont, USA

Re: techreport, meet anandtech

Fri Mar 29, 2013 3:15 pm

Waco wrote:
People want a simple metric and they have grasped the notion of FPS pretty well by now. Presenting the same data in a different way (arguably in a non-sensical format, but oh well) that makes it more understandable to the masses is probably not a bad thing.

I've traded a few PMs with Damage where I support something along the lines of the way audio amplifiers are marketed, i.e. 200 watts per channel at .01% Total Harmonic Distortion. For GFX cards, it would be 90 FPS at a 99th% latency of X with the lower X being better. The bitch of my idea, though, is deciding upon the standardized test, as unless there's a standard test there's no common frame of reference.
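Captain Ned's amplifier-style rating could be generated mechanically from a frame-time trace. A hypothetical sketch (Python with made-up numbers; `rate_card` is my own name, and this assumes nothing about the still-unsolved standardized test):

```python
import numpy as np

def rate_card(frame_times_ms):
    """Summarize a run amplifier-style: average FPS qualified by
    a 99th-percentile frame time (the 'distortion' figure)."""
    frame_times_ms = np.asarray(frame_times_ms)
    avg_fps = 1000.0 * frame_times_ms.size / frame_times_ms.sum()
    p99_ms = np.percentile(frame_times_ms, 99)
    return f"{avg_fps:.0f} FPS at a 99th-percentile frame time of {p99_ms:.1f} ms"

# Hypothetical trace: mostly 11 ms frames with a few 40 ms hitches.
print(rate_card([11.0] * 95 + [40.0] * 5))
```

The one-line result carries both the headline speed and the smoothness qualifier, just like "200 watts per channel at .01% THD".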
What we have today is way too much pluribus and not enough unum.
 
Waco
Maximum Gerbil
Posts: 4850
Joined: Tue Jan 20, 2009 4:14 pm
Location: Los Alamos, NM

Re: techreport, meet anandtech

Fri Mar 29, 2013 3:21 pm

Captain Ned wrote:
I've traded a few PMs with Damage where I support something along the lines of the way audio amplifiers are marketed, i.e. 200 watts per channel at .01% Total Harmonic Distortion. For GFX cards, it would be 90 FPS at a 99th% latency of X with the lower X being better. The bitch of my idea, though, is deciding upon the standardized test, as unless there's a standard test there's no common frame of reference.

I think the only problem with that is that the 99th-percentile metric can still hide some pretty crazy latency in the degenerate case.

1000 frames at 1 millisecond with a single spike of 1 second of latency would hide behind that, yes? The problem is somewhat similar to the way some amplifiers can be rated at "1000 watts peak" when they can really only sustain an RMS power output nearly an order of magnitude below that, but since they technically can produce that power for a very short period of time without going over the THD limits... you get the point.
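That degenerate case is easy to verify numerically; a quick illustrative sketch (Python):

```python
import numpy as np

# Waco's degenerate case: 1000 frames at 1 ms plus one 1-second spike.
times_ms = np.array([1.0] * 1000 + [1000.0])

p99 = np.percentile(times_ms, 99)   # the spike sits inside the top 1%
worst = times_ms.max()

print(f"99th percentile: {p99} ms, worst frame: {worst} ms")
```

The 99th percentile comes out at 1 ms while the worst frame is a full second, so a single percentile figure would need company (a max, or a 99.9th percentile) to catch it.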


I think all I've said here is that it's difficult to quantify this...but I think we all already knew that. :lol:
Victory requires no explanation. Defeat allows none.
 
MrJP
Gerbil Elite
Posts: 836
Joined: Wed Nov 20, 2002 5:04 pm
Location: Surrey, UK

Re: techreport, meet anandtech

Fri Mar 29, 2013 3:43 pm

You guys make some good points. We might just have to agree to differ. :)

I'll just make a last stab at it with two final arguments:

1. What are we ultimately trying to check here? Smoothness of animation. How is that usually measured in film and TV (as well as in computer monitor refresh rate)? Frames per second (or Hertz to be more technically correct).

The key thing in all of this is not how long a single individual frame takes to be processed, it's whether the overall rate of frames is high enough and the variance low enough for the illusion of movement to be convincing. Traditional "FPS" measured over a whole benchmark gives a (less than perfect) measure of the former, and the frametime analysis allows us to check the latter. Indeed, both bits of information drop out of the percentile plot, which is why I think it's the most informative bit of the new-style reviews. However, ultimately, since it's the rate of image update that we're interested in, it makes sense that the results should be presented as a rate, even when derived from individual frames. So frametimes should be converted to frames/second (or better yet Hertz, but then we're back into the unfamiliar).

2. OK so you don't buy that one. You still think that we should be talking in terms of frametimes as being the final measure. So why do other plots in the review still use frames/second? Why the inconsistency?

At the moment, the traditional FRAPS average FPS results are still plotted in the review as frames/second. The final value scatter plots are plotted with frames/second (converted instantaneous FPS for those that think that's impossible!). This straight away makes it hard to compare the results from the two types of analysis. It's trivial to convert the FRAPS average FPS into an equivalent overall average frametime (frametime=1/FPS). If frametime is the final measure we're interested in, then shouldn't everything be presented in this way?
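The conversion being argued over here really is trivial in both directions. A minimal sketch (function names are mine, purely for illustration):

```python
def fps_to_frametime_ms(fps):
    """Average frame time in milliseconds implied by a frame rate."""
    return 1000.0 / fps

def frametime_ms_to_fps(ms):
    """Instantaneous FPS implied by a single frame's render time."""
    return 1000.0 / ms

print(fps_to_frametime_ms(60))   # ~16.67 ms per frame
print(frametime_ms_to_fps(50))   # 20.0 "instantaneous FPS" for a 50 ms frame
```

So presenting everything in one unit or the other is a formatting choice, not a data limitation, which is MrJP's point.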

I suspect the truth is that things have been presented this way because that's how FRAPS writes them out. Average frame rate is presented in frames/second, and the individual frame times are reported in ms. Please don't take this as a criticism, because I think the reviews are fantastic, and ultimately the content is the draw rather than the way it's presented.

And getting back on topic, TR > Anandtech. :)
i7-6700K ~ Asus Z170I Pro Gaming ~ 16GB Corsair DDR4-2400 ~ Radeon R9 Nano
Noctua NHU9S ~ Corsair Obsidian 250D ~ Corsair RM650X ~ Crucial MX100 512GB
SMSL M3 ~ Sennheiser HD555 ~ Logitech G500S/G910/G29 ~ LG 34UC79G
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: techreport, meet anandtech

Fri Mar 29, 2013 5:03 pm

MrJP wrote:
You guys make some good points. We might just have to agree to differ. :)

I'll just make a last stab at it with two final arguments:

1. What are we ultimately trying to check here? Smoothness of animation. How is that usually measured in film and TV (as well as in computer monitor refresh rate)? Frames per second (or Hertz to be more technically correct).

The key thing in all of this is not how long a single individual frame takes to be processed, it's whether the overall rate of frames is high enough and the variance low enough for the illusion of movement to be convincing. Traditional "FPS" measured over a whole benchmark gives a (less than perfect) measure of the former, and the frametime analysis allows us to check the latter. Indeed, both bits of information drop out of the percentile plot, which is why I think it's the most informative bit of the new-style reviews. However, ultimately, since it's the rate of image update that we're interested in, it makes sense that the results should be presented as a rate, even when derived from individual frames. So frametimes should be converted to frames/second (or better yet Hertz, but then we're back into the unfamiliar).

2. OK so you don't buy that one. You still think that we should be talking in terms of frametimes as being the final measure. So why do other plots in the review still use frames/second? Why the inconsistency?

At the moment, the traditional FRAPS average FPS results are still plotted in the review as frames/second. The final value scatter plots are plotted with frames/second (converted instantaneous FPS for those that think that's impossible!). This straight away makes it hard to compare the results from the two types of analysis. It's trivial to convert the FRAPS average FPS into an equivalent overall average frametime (frametime=1/FPS). If frametime is the final measure we're interested in, then shouldn't everything be presented in this way?

I suspect the truth is that things have been presented this way because that's how FRAPS writes them out. Average frame rate is presented in frames/second, and the individual frame times are reported in ms. Please don't take this as a criticism, because I think the reviews are fantastic, and ultimately the content is the draw rather than the way it's presented.

And getting back on topic, TR > Anandtech. :)


I think Hz is a bit inappropriate, because a measurement in Hz implies constant periodicity: the rate doesn't change. That's fine for monitor refresh rates or recorded video, where the frame rate is fixed, but the whole point of frame-time data is that frame rates do change.

But I don't have a problem with the rest of what you said. Whether frame times are presented as ms between frames or as instantaneous FPS makes no difference to me. The key is to pick one and have the industry adopt it, as this avoids confusion. The one thing instantaneous FPS has against it is that, because it looks similar to average FPS, people might more readily confuse the two. With ms between frames, there is little chance of that.
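The risk of confusing the two is real, because they are genuinely different numbers: the arithmetic mean of per-frame instantaneous FPS is not the benchmark's average FPS (total frames divided by total time). A small illustration with made-up frame times:

```python
frame_times_ms = [10.0, 10.0, 10.0, 70.0]  # three quick frames, one stutter

# True average FPS over the run: total frames / total elapsed time.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# Arithmetic mean of the per-frame instantaneous rates (1000 / frame time).
mean_inst_fps = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

print(avg_fps)        # 40.0
print(mean_inst_fps)  # ~78.57 -- the stutter is heavily diluted
```

Averaging instantaneous FPS overweights the fast frames (it's an arithmetic mean of rates where a harmonic mean would be needed), which is one more argument for labelling the two presentations clearly.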
