Personal computing discussed

Moderators: renee, mac_h8r1, Nemesis

 
ultima_trev
Gerbil XP
Topic Author
Posts: 363
Joined: Sat Mar 27, 2010 11:14 am
Contact:

Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 1:02 am

What are the actual benefits of high framerates outside of competitive multiplayer twitch shooters? Surely it's not necessary for a slow-paced single-player campaign? I mean, I can definitely tell the difference between 30 and 80 frames per second in terms of actual responsiveness, especially the mouse look in shooters. But as long as FPS stays at 40 (or 25 ms latency) and doesn't budge any lower, I couldn't care less about average or peak. I'd rather have a constant 40 than 60 or 120 with dips to below 40. Maybe this wouldn't fly for professional, competitive gamers though?
Ryzen 7 1800X - Corsair H60i - GA AB350 Gaming - 32GB DDR4 2933 at 16,16,16,36 - GTX 1080 at 1924 / 5264 (undervolted) - 250GB WD Blue SSD - 2TB Toshiba 7200rpm HDD
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 1:09 am

Consistent framerate hands down.

It is the frame-dips and spikes that are annoying above all else. Anyway, online gaming is already coded with hard caps on rendering (60-85FPS) for timing, syncing and server-to-client prediction reasons.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 1:59 am

It depends what games you're playing IMO...I'd rather have 144 in racing games and space sims as well.

Also, in some action RPGs I've seen fast animations which could've been smoothed out with more frames.
Meow.
 
Aphasia
Grand Gerbil Poohbah
Posts: 3710
Joined: Tue Jan 01, 2002 7:00 pm
Location: Solna/Sweden
Contact:

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 9:20 am

As long as the framerate is above the refresh rate, consistency all the way. And yes, I hate tearing and thus also pay a price for using vsync. So as soon as decent FreeSync monitors are out in force, I'll get one, which might help quite a ways considering I play a whole lot of FPS games, not to mention that having more than 60 fps would be nice too.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 9:43 am

Aphasia wrote:
As long as the framerate is above the refresh rate, consistency all the way. And yes, I hate tearing and thus also pay a price for using vsync. So as soon as decent FreeSync monitors are out in force, I'll get one, which might help quite a ways considering I play a whole lot of FPS games, not to mention that having more than 60 fps would be nice too.

Yup. Absolutely ridiculous that we've been tied to the frame rates of the 75-year-old NTSC broadcast television standard this long. We should've standardized on higher (e.g. doubled) frame rates years ago.
Nostalgia isn't what it used to be.
 
Topinio
Gerbil Jedi
Posts: 1839
Joined: Mon Jan 12, 2015 9:28 am
Location: London

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 10:10 am

Average, in a post-fixed-frame-rate world. I feel liberated from the jarring analogue hold-over and now don't really notice or care about consistency, and am not unhappy when a game's FPS is all over the place between 100 and 144.
Desktop: 750W Snow Silent, X11SAT-F, E3-1270 v5, 32GB ECC, RX 5700 XT, 500GB P1 + 250GB BX100 + 250GB BX100 + 4TB 7E8, XL2730Z + L22e-20
HTPC: X-650, DH67GD, i5-2500K, 4GB, GT 1030, 250GB MX500 + 1.5TB ST1500DL003, KD-43XH9196 + KA220HQ
Laptop: MBP15,2
 
Damage
Gerbil Jedi
Posts: 1787
Joined: Wed Dec 26, 2001 7:00 pm
Location: Lee's Summit, Missouri, USA
Contact:

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 10:16 am

Obligatory link to my first article on frame times, just in case:

http://techreport.com/review/21516/insi ... nchmarking

I wanted to weigh in and say that I think we should be finished with asking questions framed in such a simple fashion, rate vs. consistency, when what we really want to understand is the shape of the frame-time distribution. PC games and hardware combine to give us all sorts of variety in per-frame performance over time, and the best means I've found of describing how well a card plays a game involves several metrics, none of which boils down entirely to a single, easy number.

See some examples here, for instance:

http://techreport.com/review/28356/nvid ... reviewed/5

The thing I'd say is that if you want one "best" number for comparing the quality of the experience on different hardware, the 99th-percentile frame time has been very good to us over a pretty long span of time now. But if you want to debate numbers, the one I'd put against it is the time spent processing frames beyond the 50-ms threshold. That number quantifies how much time you spend below 20 FPS, and it's a good indicator of whether gaming is smooth.

Notice that we're talking here about time in milliseconds, not rates, with both of those numbers. I'm persuaded that a real-time animation system like a PC game wants to avoid high frame rendering times more than it cares about consistency in the sense of statistical variance. I think that goes back to the display subsystem and probably into human biological systems.

I think we're wired to notice slowdowns more than variance in frame rates, in other words. That's what I've gathered by comparing frame time numbers to my perceptions over a lotta testing with fast, variable-refresh displays and the like.
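Both of those numbers are easy to compute from a log of per-frame render times. A quick sketch in Python (illustrative only, not TR's actual tooling; `frame_time_metrics` is a made-up name):

```python
# Illustrative sketch: the two frame-time metrics described above,
# computed from a list of per-frame render times in milliseconds.
def frame_time_metrics(frame_times_ms, threshold_ms=50.0):
    times = sorted(frame_times_ms)
    # 99th-percentile frame time: 99% of frames come in under this.
    idx = min(len(times) - 1, int(len(times) * 0.99))
    p99 = times[idx]
    # "Time spent beyond 50 ms": accumulate only the portion of each
    # slow frame that exceeds the 50 ms (i.e. 20 FPS) budget.
    beyond = sum(t - threshold_ms for t in times if t > threshold_ms)
    return p99, beyond

# 99 quick frames and one 60 ms hitch:
p99, beyond = frame_time_metrics([16.7] * 99 + [60.0])  # -> (60.0, 10.0)
```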
Scott Wasson - "Damage"
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 10:30 am

Agreed. Once you get above a certain minimum threshold, variance should be mostly invisible to the user. Dips BELOW that threshold, OTOH, will still be quite noticeable and may be even MORE jarring since they will stick out like a sore thumb.
Nostalgia isn't what it used to be.
 
superjawes
Minister of Gerbil Affairs
Posts: 2475
Joined: Thu May 28, 2009 9:49 am

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 19, 2015 10:54 am

Damage wrote:
I think we're wired to notice slowdowns more than variance in frame rates, in other words. That's what I've gathered by comparing frame time numbers to my perceptions over a lotta testing with fast, variable-refresh displays and the like.

Pretty much. The imperfections stick out, and the greater the imperfection, the more it sticks out.

So when we translate to GPU power, the main goal should be to minimize those times below the threshold, and if that means a lower overall frame rate, so be it.
On second thought, let's not go to TechReport. It's infested by crypto bull****.
 
Aphasia
Grand Gerbil Poohbah
Posts: 3710
Joined: Tue Jan 01, 2002 7:00 pm
Location: Solna/Sweden
Contact:

Re: Framerate (or latency): Average vs Consistency?

Sat Jun 20, 2015 10:19 pm

Damage - Pretty much ever since you started crunching the 99th-percentile frame-time numbers, that has been the go-to graph I look at for most cards, that and the frame-time distribution graphs. It is also what I used when I compared my Mantle exports from BF4 against DX, since Fraps is pretty useless with Mantle.

At least a few 27" 2560x1440 monitors with 144 Hz and FreeSync are starting to come out. Now, if they could only get IPS/xVA panels and AdobeRGB into that as well, I might exchange my 30"'er. Although considering my time is mostly spent gaming nowadays anyway, I'm thinking of splitting off my Lightroom and Photoshop work onto a workstation that will double as the screen for my work laptop, and keeping the gaming setup geared for just that.

But yeah, consistency for me basically means that, if possible, I tune the game engine to never produce frames above my monitor's refresh rate, and set all settings so that all, or at least 99.x%, of frames are rendered below the frametime required to hit my refresh rate.
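Numerically, that tuning goal is just a frame-time budget check; a trivial sketch (function names are made up for illustration):

```python
# Frame-time budget implied by a refresh rate: a locked rate means
# every frame finishes within this many milliseconds.
def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

# Fraction of frames that met the budget ("all, or 99.x%" above).
def fraction_within_budget(frame_times_ms, refresh_hz):
    budget = frame_budget_ms(refresh_hz)
    return sum(1 for t in frame_times_ms if t <= budget) / len(frame_times_ms)

# 144 Hz leaves ~6.94 ms per frame; 60 Hz leaves ~16.67 ms.
hit_rate = fraction_within_budget([6.0] * 99 + [20.0], 144)  # -> 0.99
```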
 
rahulahl
Gerbil Team Leader
Posts: 256
Joined: Thu Aug 16, 2012 8:57 am
Location: Australia

Re: Framerate (or latency): Average vs Consistency?

Sat Jun 20, 2015 11:03 pm

ultima_trev wrote:
What are the actual benefits of high framerates outside of competitive multiplayer twitch shooters? Surely it's not necessary for a slow-paced single-player campaign? I mean, I can definitely tell the difference between 30 and 80 frames per second in terms of actual responsiveness, especially the mouse look in shooters. But as long as FPS stays at 40 (or 25 ms latency) and doesn't budge any lower, I couldn't care less about average or peak. I'd rather have a constant 40 than 60 or 120 with dips to below 40. Maybe this wouldn't fly for professional, competitive gamers though?

Personally, I can tell the difference between 100 and 144 quite easily. That said, if the fps only occasionally dips down to 100, it's not really much of an annoyance. I still factor in the time-spent-beyond-X-milliseconds chart in TR reviews before I buy a GPU though. This is one of the reasons I chose to go Nvidia, before I even decided to get G-Sync. The dips hurt more when you already have a low FPS. But if you can manage to keep your FPS at 100+ and even touching 144, these tiny dips won't be as distracting. Of course, having G-Sync helps as well, since with only V-Sync those tiny dips would halve my FPS for that duration.
Krogoth wrote:
Consistent framerate hands down.

It is the frame-dips and spikes that are annoying above all else. Anyway, online gaming is already coded with hard caps on rendering (60-85FPS) for timing, syncing and server-to-client prediction reasons.

Not all of them.
Most of the online games I play, like Chivalry, Counter-Strike, Sanctum 2, Torchlight, etc., have no such cap; I have yet to play a single game online that limits FPS. Of course, it is quite possible that I simply don't play the ones that do, but there certainly are a number of online games that do not have this limit.
Intel i7 4790K
MAXIMUS VII RANGER motherboard
EVGA GTX 1080
16GB Ram
550Watt Seasonic PSU
Asus Rog Swift PG278Q
Windows 10
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Framerate (or latency): Average vs Consistency?

Sun Jun 21, 2015 11:27 pm

just brew it! wrote:
Aphasia wrote:
As long as the framerate is above the refresh rate, consistency all the way. And yes, I hate tearing and thus also pay a price for using vsync. So as soon as decent FreeSync monitors are out in force, I'll get one, which might help quite a ways considering I play a whole lot of FPS games, not to mention that having more than 60 fps would be nice too.

Yup. Absolutely ridiculous that we've been tied to the frame rates of the 75-year-old NTSC broadcast television standard this long. We should've standardized on higher (e.g. doubled) frame rates years ago.



Actually, it is older than that. It is a limitation from the days of motion picture (celluloid), where increasing the framerate cost $$$$ and filmmakers found that 24 FPS was the sweet spot for film length while retaining a relatively smooth framerate, as long as you don't pan the camera around like a caffeine junkie.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Framerate (or latency): Average vs Consistency?

Sun Jun 21, 2015 11:33 pm

rahulahl wrote:
ultima_trev wrote:
What are the actual benefits of high framerates outside of competitive multiplayer twitch shooters? Surely it's not necessary for a slow-paced single-player campaign? I mean, I can definitely tell the difference between 30 and 80 frames per second in terms of actual responsiveness, especially the mouse look in shooters. But as long as FPS stays at 40 (or 25 ms latency) and doesn't budge any lower, I couldn't care less about average or peak. I'd rather have a constant 40 than 60 or 120 with dips to below 40. Maybe this wouldn't fly for professional, competitive gamers though?

Personally, I can tell the difference between 100 and 144 quite easily. That said, if the fps only occasionally dips down to 100, it's not really much of an annoyance. I still factor in the time-spent-beyond-X-milliseconds chart in TR reviews before I buy a GPU though. This is one of the reasons I chose to go Nvidia, before I even decided to get G-Sync. The dips hurt more when you already have a low FPS. But if you can manage to keep your FPS at 100+ and even touching 144, these tiny dips won't be as distracting. Of course, having G-Sync helps as well, since with only V-Sync those tiny dips would halve my FPS for that duration.
Krogoth wrote:
Consistent framerate hands down.

It is the frame-dips and spikes that are annoying above all else. Anyway, online gaming is already coded with hard caps on rendering (60-85FPS) for timing, syncing and server-to-client prediction reasons.

Not all of them.
Most of the online games I play, like Chivalry, Counter-Strike, Sanctum 2, Torchlight, etc., have no such cap; I have yet to play a single game online that limits FPS. Of course, it is quite possible that I simply don't play the ones that do, but there certainly are a number of online games that do not have this limit.


The examples you listed have hard-coded limits, or you would end up running into de-syncing issues. Any decent multiplayer game has built-in limits. FPS counters don't tell the whole story. Counter-Strike is the most famous example, and the CS junkies go as far as tweaking the server-client predictions.

Single-player games also have their own limits to prevent timing and rendering issues, or you end up with "turbo mode". This wasn't the case with really old games that were coded back when the 80286/80386 were the hotness.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Framerate (or latency): Average vs Consistency?

Mon Jun 22, 2015 5:41 am

Krogoth wrote:
just brew it! wrote:
Yup. Absolutely ridiculous that we've been tied to the frame rates of the 75-year-old NTSC broadcast television standard this long. We should've standardized on higher (e.g. doubled) frame rates years ago.

Actually, it is older than that. It is a limitation from the days of motion picture (celluloid), where increasing the framerate cost $$$$ and filmmakers found that 24 FPS was the sweet spot for film length while retaining a relatively smooth framerate, as long as you don't pan the camera around like a caffeine junkie.

Umm... how does 24 FPS from film relate? NTSC is 60 fields / 30 frames per second, and I don't recall computer monitors ever running below 60 Hz, other than for specialized applications.

If they were really trying to tie the refresh rate of electronic displays to film, NTSC would've been 48 fields / 24 frames per second. NTSC was actually problematic when it came to broadcasting content which was originally shot on film, since it required every 4th frame to be shown twice, or the use of "3:2 pulldown", to match the frame rates; this resulted in visual artifacts. (Modern digital processing techniques can mitigate this.)

The 60 Hz NTSC rate was actually chosen to match the AC power line frequency, to reduce visual artifacts caused by leakage of AC hum into the (analog) circuits of early televisions.
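The 3:2 pulldown cadence mentioned above is easy to sketch (a toy illustration of the frame-to-field mapping, not broadcast-accurate):

```python
# Toy 3:2 pulldown: map 24 fps film frames onto NTSC's 60 fields/s.
# Film frames alternately occupy 2 and 3 fields, so 4 film frames
# fill exactly 10 fields, and 24 frames fill 60 fields each second.
def pulldown_fields(num_film_frames):
    fields = []
    for i in range(num_film_frames):
        repeat = 2 if i % 2 == 0 else 3  # the 2:3 cadence
        fields.extend([i] * repeat)
    return fields

fields = pulldown_fields(4)  # -> [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]
```

The uneven repetition in that field list is exactly what produced the judder artifacts described above.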
Nostalgia isn't what it used to be.
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Framerate (or latency): Average vs Consistency?

Mon Jun 22, 2015 6:46 am

Krogoth wrote:
The examples you listed have hard-coded limits, or you would end up running into de-syncing issues. Any decent multiplayer game has built-in limits. FPS counters don't tell the whole story. Counter-Strike is the most famous example, and the CS junkies go as far as tweaking the server-client predictions.


Tick rate is different than frame rate. Obviously interrelated, but still distinct.

Fundamentally, if the tick rate is 100 and I'm rendering at 300 FPS, where is the problem? That everything will magically de-sync because I have three rendered frames for every one game state update? :o How? Why?
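A decoupled fixed-timestep loop, sketched as toy Python (not any particular engine's code; all names are illustrative), shows why the two rates are independent:

```python
# Toy fixed-timestep loop: the simulation ticks at a fixed 100 Hz
# while rendering runs as fast as it likes. Integer microseconds
# keep the arithmetic exact.
TICK_US = 10_000  # one 100 Hz simulation tick, in microseconds

def run(total_us, frame_us):
    clock = 0        # wall time consumed so far
    acc = 0          # real time not yet fed to the simulation
    ticks = frames = 0
    while clock < total_us:
        clock += frame_us      # one rendered frame's worth of real time
        acc += frame_us
        while acc >= TICK_US:  # catch the simulation up
            acc -= TICK_US
            ticks += 1
        frames += 1            # draw the latest (or interpolated) state
    return ticks, frames

# One second at ~300 FPS (3333 us frames): ~300 frames, ~100 ticks.
ticks, frames = run(1_000_000, 3333)
```

Three rendered frames per game-state update just means the renderer redraws the latest state; nothing about the server's tick cadence changes.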

I don't think you know what you are talking about, again.
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Framerate (or latency): Average vs Consistency?

Mon Jun 22, 2015 6:50 am

JBI wrote:
The 60 Hz NTSC rate was actually chosen to match the AC power line frequency, to reduce visual artifacts caused by leakage of AC hum into the (analog) circuits of early televisions.


Yup, and that's equally why PAL is 50Hz, much to the chagrin of British/European/Australian old school console speed runners. :wink:
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Framerate (or latency): Average vs Consistency?

Mon Jun 22, 2015 10:53 pm

Glorious wrote:
Krogoth wrote:
The examples you listed have hard-coded limits, or you would end up running into de-syncing issues. Any decent multiplayer game has built-in limits. FPS counters don't tell the whole story. Counter-Strike is the most famous example, and the CS junkies go as far as tweaking the server-client predictions.


Tick rate is different than frame rate. Obviously interrelated, but still distinct.

Fundamentally, if the tick rate is 100 and I'm rendering at 300 FPS, where is the problem? That everything will magically de-sync because I have three rendered frames for every one game state update? :o How? Why?

I don't think you know what you are talking about, again.


FPS junkies keep forgetting that games are programs at heart. Any decent programmer puts limits in games so they do not go into "turbo mode" and become unplayable when there is enough hardware prowess.

You can see this with really old games which lack such coded limits on modern systems. Id's FPS games have a "timedemo" argument that disables the limits when they do a demo playback. That's why such demos go super fast on a modern system unlike normal gameplay under the same system.

That 300 FPS on the FPS counter is somewhat misleading. The game is actually rendering the graphics at 85-100 FPS; without the limits it would render at the full 300 FPS, but it would be completely unplayable. It is also why the visual benefits of 120FPS+ rendering seem to "disappear".

I understand the tick rate, but the same framerate limits also help prevent bizarre syncing and prediction issues in multiplayer games.

The real reason the "more FPS = better" meme took off is because of interesting issues with the physics portion of the Quake 3 engine. Astute players discovered that they could jump slightly further and strafe-run faster if the framerate was high enough (100 FPS+). This gave competitive players an edge over competitors running at lower framerates.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Framerate (or latency): Average vs Consistency?

Mon Jun 22, 2015 11:00 pm

just brew it! wrote:
Krogoth wrote:
just brew it! wrote:
Yup. Absolutely ridiculous that we've been tied to the frame rates of the 75-year-old NTSC broadcast television standard this long. We should've standardized on higher (e.g. doubled) frame rates years ago.

Actually, it is older than that. It is a limitation from the days of motion picture (celluloid), where increasing the framerate cost $$$$ and filmmakers found that 24 FPS was the sweet spot for film length while retaining a relatively smooth framerate, as long as you don't pan the camera around like a caffeine junkie.

Umm... how does 24 FPS from film relate? NTSC is 60 fields / 30 frames per second, and I don't recall computer monitors ever running below 60 Hz, other than for specialized applications.

If they were really trying to tie the refresh rate of electronic displays to film, NTSC would've been 48 fields / 24 frames per second. NTSC was actually problematic when it came to broadcasting content which was originally shot on film, since it required every 4th frame to be shown twice, or the use of "3:2 pulldown", to match the frame rates; this resulted in visual artifacts. (Modern digital processing techniques can mitigate this.)

The 60 Hz NTSC rate was actually chosen to match the AC power line frequency, to reduce visual artifacts caused by leakage of AC hum into the (analog) circuits of early televisions.


Because the limits from celluloid set the standard for A/V source material for over a century. 60 Hz was chosen because it not only matched AC in residential grids in the USA/Canada, it is also divisible by 12, like the 24 FPS set by celluloid.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Framerate (or latency): Average vs Consistency?

Mon Jun 22, 2015 11:18 pm

Krogoth wrote:
Because the limits from celluloid set the standard for A/V source material for over a century. 60 Hz was chosen because it not only matched AC in residential grids in the USA/Canada, it is also divisible by 12, like the 24 FPS set by celluloid.

Being divisible by 12 was likely more of a happy accident; matching the line frequency was the overriding concern (otherwise they would've gone with an integer multiple of 24, i.e. 48 or 72 Hz). European standard (PAL) is 50 Hz to match their line frequency, and is not divisible by 12; too bad for them! Going with double the line frequency (120 Hz) would have probably made television receivers too expensive for the average consumer, given the limitations of contemporary electronics back in the 1940s. Heck, the interlaced format was also a compromise dictated by technological limitations, which we have (finally!) ditched with the shift to digital.
Nostalgia isn't what it used to be.
 
Captain Ned
Global Moderator
Posts: 28704
Joined: Wed Jan 16, 2002 7:00 pm
Location: Vermont, USA

Re: Framerate (or latency): Average vs Consistency?

Mon Jun 22, 2015 11:24 pm

just brew it! wrote:
Krogoth wrote:
Because the limits from celluloid set the standard for A/V source material for over a century. 60 Hz was chosen because it not only matched AC in residential grids in the USA/Canada, it is also divisible by 12, like the 24 FPS set by celluloid.
Being divisible by 12 was likely more of a happy accident; matching the line frequency was the overriding concern (otherwise they would've gone with an integer multiple of 24, i.e. 48 or 72 Hz). European standard (PAL) is 50 Hz to match their line frequency, and is not divisible by 12; too bad for them! Going with double the line frequency (120 Hz) would have probably made television receivers too expensive for the average consumer, given the limitations of contemporary electronics back in the 1940s. Heck, the interlaced format was also a compromise dictated by technological limitations, which we have (finally!) ditched with the shift to digital.

And when you wait until dark and tell your Oppo Blu-ray player to send the Blu-ray of "2001" to the TV at 1080p24 and the TV says "oh goody", one comes to understand Kubrick. I've yet to regret going plasma.
What we have today is way too much pluribus and not enough unum.
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Framerate (or latency): Average vs Consistency?

Wed Jun 24, 2015 2:02 pm

Krogoth wrote:
FPS junkies keep forgetting that games are programs at heart. Any decent programmer puts limits in games so they do not go into "turbo mode" and become unplayable when there is enough hardware prowess.


Like I said, they may be interrelated (especially depending on the implementation), but they are philosophically distinct. There is no inherent dependency between the number of times you render a frame and the game state depicted therein.

The absurdity of what you are saying is well illustrated by pre-netcode-patch BF4: the tick rate was literally ten. Yes, as in one update every 100 milliseconds. When we were screaming "MORE TRADES THAN WALL STREET" and "THIS ISN'T A GAME: IT'S A HIGH-FREQUENCY TRADING SIMULATOR" about the incessant trade-killing, it wasn't because some of us had 30 FPS, or 60 FPS, or 120 FPS, or even 200 FPS and everything was "de-syncing."

No, it was simply because the tick rate was so frigging low that false simultaneity was an inevitable fact of life.

krogoth wrote:
You can see this with really old games which lack such coded limits on modern systems. Id's FPS games have a "timedemo" argument that disables the limits when they do a demo playback. That's why such demos go super fast on a modern system unlike normal gameplay under the same system.


You're missing the point. The better example is the one you ended with: Quake 3 had infamously frame-rate-dependent physics. You could rocket-jump further and longer at certain frame rates with certain gravity levels (the key one I remember is 125 FPS at 800 gravity). But again, this was not inherent at all! Only implementation. There are re-implementations of the Quake 3 engine that remove the dependency entirely. I think Quake Live did it too.
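As a hedged illustration of how that kind of dependency arises (toy Euler integration, not id's actual code; the 270 units/s jump velocity and 800 gravity merely echo Quake-style constants), peak jump height can vary with the timestep alone:

```python
# Toy per-frame Euler integration of a jump. Because velocity and
# position advance in discrete dt-sized steps, the peak height
# reached depends on the frame rate, not just the physics constants.
def peak_height(fps, v0=270.0, gravity=800.0):
    dt = 1.0 / fps
    y, v, peak = 0.0, v0, 0.0
    while v > 0 or y > 0:
        v -= gravity * dt
        y = max(0.0, y + v * dt)
        peak = max(peak, y)
    return peak

# The same jump integrates to different heights at different rates.
h125, h60 = peak_height(125.0), peak_height(60.0)  # h125 > h60
```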

Krogoth wrote:
That 300 FPS on the FPS counter is somewhat misleading. The game is actually rendering the graphics at 85-100 FPS; without the limits it would render at the full 300 FPS, but it would be completely unplayable. It is also why the visual benefits of 120FPS+ rendering seem to "disappear".


Does what you just said make any sense to you? It doesn't to me!

What? It's all a conspiracy in which they lie about FPS (even though I can actually confirm the extra work the card is doing above 100 FPS by viewing the utilization & heat output of my card :roll: )?

The game would catch on fire if rendered any faster than 85-100FPS (as proven by the timedemo argument to ~20 year old idtech games. :o )?

It's all fake because I don't see the difference above 120 FPS (uh.... I personally can't reliably tell the difference above like 90 or so. Every additional frame gets you asymptotically closer to perfect verisimilitude, right? Hence after some point I couldn't tell the difference regardless of the real output, and for me I'm pretty sure that's well before 120 FPS)?

Krogoth wrote:
I understand the tick rate, but the same framerate limits also help prevent bizarre syncing and prediction issues in multiplayer games.


I'm not sure what you understand...

Krogoth wrote:
The real reason the "more FPS = better" meme took off is because of interesting issues with the physics portion of the Quake 3 engine. Astute players discovered that they could jump slightly further and strafe-run faster if the framerate was high enough (100 FPS+). This gave competitive players an edge over competitors running at lower framerates.


I'm not talking about more FPS = better. I'm talking about your belief that games CAN'T render higher than 100 FPS without unplayable game state, and that when they claim to render higher than that they are LYING. :roll:

And again, you've got it completely wrong. It wasn't automagically more frames = jump higher. Certain framerates at certain gravities maximized the value; it was a curve. Too much (say, 150 FPS versus 125 FPS at 800 gravity) and you'd actually jump about 10% lower.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Framerate (or latency): Average vs Consistency?

Thu Jun 25, 2015 1:45 am

You really need to play some really old games that lack limiters to see the "turbo mode" problem at work.

Try any of the old games that use Unreal Engine 1.0 and turn off Vsync on a modern system. It is much worse with older titles coded back in the early 1990s and throughout the 1980s. It is actually the reason the "Turbo" button existed back in the day, before programmers addressed the timing and speed issues.
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
rahulahl
Gerbil Team Leader
Posts: 256
Joined: Thu Aug 16, 2012 8:57 am
Location: Australia

Re: Framerate (or latency): Average vs Consistency?

Thu Jun 25, 2015 5:27 am

Krogoth wrote:
You really need to play some really old games that lack limiters to see the "turbo mode" problem at work.

Try any of the old games that use Unreal Engine 1.0 and turn off Vsync on a modern system. It is much worse with older titles coded back in the early 1990s and throughout the 1980s. It is actually the reason the "Turbo" button existed back in the day, before programmers addressed the timing and speed issues.

No one is disputing that older games had a turbo issue. It's just that in today's games, FPS and the tick rate are not the same.
Intel i7 4790K
MAXIMUS VII RANGER motherboard
EVGA GTX 1080
16GB Ram
550Watt Seasonic PSU
Asus Rog Swift PG278Q
Windows 10
 
Westbrook348
Gerbil
Posts: 91
Joined: Tue Nov 18, 2014 4:15 pm

Re: Framerate (or latency): Average vs Consistency?

Thu Jun 25, 2015 7:55 am

Just reading the Fury X review on Tom's Hardware this morning and comparing it to Scott's. I really dislike Tom's emphasis on frame-time variance. I think the way Tech Report displays frame times (99th-percentile frame time, time spent beyond X ms, and a percentile chart) covers the data we care about. I'm still not sure why some sites have chosen to go with variance.

Although, Scott, one minor suggestion: I personally would rather see the percentile charts zoomed in to the last 3-10%. I don't mind the way you're doing it, per se; it's just harder to analyze the most important section of the curve with the current x-axis setup. Going all the way to the 50th percentile may give us a little more info, but at a loss of detail, and it's a little redundant since the 50th-percentile frame time is calculable from the FPS numbers that you continue to provide for those readers still stuck in the past.
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Framerate (or latency): Average vs Consistency?

Thu Jun 25, 2015 9:01 am

krogoth wrote:
You really need to play some really old games that lack limiters to see the "turbo mode" problem at work.


Look, you originally claimed this:

Krogoth wrote:
Anyway, online gaming is already coded with hard caps on rendering (60-85FPS) for timing, syncing and server-to-client prediction reasons.


Which is utterly and completely false. It is simply bad information, and I happen to dislike bad information.

As I keep repeatedly saying, while certain engines and games (even today!) may have an interrelationship between their tick rate and their frame rate, it is not a philosophical requirement. It is not an unavoidable necessity.
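A minimal sketch of that decoupling, assuming the standard fixed-timestep accumulator loop (the tick rate and frame times are made-up numbers, not any particular engine's):

```python
TICK_MS = 20  # fixed 50 Hz simulation tick, in integer milliseconds for clarity

def run(frame_durations_ms):
    """Advance a fixed-rate simulation across a stream of variable frame times."""
    acc = 0
    ticks = frames = 0
    for dt in frame_durations_ms:
        acc += dt
        # Run however many fixed ticks the elapsed time calls for...
        while acc >= TICK_MS:
            ticks += 1   # game state / physics advances here, always by TICK_MS
            acc -= TICK_MS
        frames += 1      # ...then render once, however fast the GPU happens to be
    return ticks, frames

# One simulated second rendered at 250 FPS (4 ms frames): 250 frames drawn.
ticks, frames = run([4] * 250)
# The same second at a choppy 25 FPS (40 ms frames): 25 frames drawn.
ticks2, frames2 = run([40] * 25)
# Either way the simulation advances exactly 50 ticks; rendering speed is free.
```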

I've given examples of actual modern games that disprove your claim. I've offered a thought experiment to demonstrate why what you are saying is nonsense. I've even further explained your own "evidence" (which was already ~20 years old) in the case of Quake 3's physics.

All you've done is reach back OVER THIRTY YEARS to the LITERAL DAWN of (IBM) PC computing to try to make your claims about today have some shred of either truth or relevance (and what even was online gaming then? MUDs?).

Krogoth wrote:
Try any of the old games that use Unreal Engine 1.0 and turn off Vsync on a modern system. It is much worse with older titles coded back in the early 1990s and throughout the 1980s. It is actually the reason why the "Turbo" button existed back in the day, before programmers addressed the timing and speed issues.


Dude, please, just stop.

The turbo button existed because PCs were originally *THE* PC by IBM, which had a 4.77 MHz 8088 or whatever. Programmers relied on that clock speed for timing for a variety of reasons, so when newer PCs came with faster chips, those programs didn't work right. Hence computer manufacturers put in a "turbo button": if it was on, the PC ran at native speed; if it was off, back to speeds approximating the original ~4.77 MHz 8088. Or vice versa, who knows.

And what were those reasons? Well, it's before my time, but timing is always a very tricky subject. It's easy to say that they were lazy and just relied on crude dummy loops because they assumed it'd be 4.77 MHz 4EVAR, or that they just didn't care about the future (or simply believed that it would be dominated by something like the Amiga, not IBM). But, in actuality, they were a lot more limited in their options. RDTSC didn't exist until the Pentium. HPET wasn't common until a decade or so ago. There was pretty much only the PIT or the RTC, and I imagine using either one was probably somewhat perilous, involving interrupts (which were likely slow and liable to be patched unpredictably, because that's what always seemed to happen to DOS interrupt routines). I don't exactly know; this is ancient history and before my time.
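To illustrate the difference (purely hypothetical, not taken from any actual DOS-era program): a delay calibrated in loop iterations shrinks as CPUs get faster, while a delay measured against a real clock does not.

```python
import time

def delay_by_loop(iterations):
    # The fragile old approach: "timing" tied to how fast the CPU executes.
    # Double the clock speed and this delay roughly halves.
    for _ in range(iterations):
        pass

def delay_by_clock(seconds):
    # The robust approach: timing tied to a wall clock, immune to upgrades.
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        pass

start = time.perf_counter()
delay_by_clock(0.01)
elapsed = time.perf_counter() - start  # always at least 0.01 s, on any CPU
```

The turbo button was essentially a hardware workaround for code shaped like `delay_by_loop`.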

At any rate, programmers didn't "address" the timing and speed issues in those programs. They may have written NEW programs that relied on different timing sources, but you can probably find those same programs on an abandonware site somewhere and they'll have the exact same problem they always did. These practices may have persisted into later games (like into the 90s and maybe even later...), but it just goes back to inertia and the fact that the same basic problem existed in a slightly different form: no high-quality timer available, and dire performance implications for non-cycle-dependent timing.

But that's just history. The REAL POINT here is that for the last ten years or so (AT LEAST), the kind of problem you are vaguely alluding to is EXTREMELY UNCOMMON in multiplayer games. It is perhaps less rare in singleplayer games, as I can dimly remember a few that had scripts and a few other things break at extremely high framerates, but it is still very rare.

Thus what you are saying about hard caps of 60-85 FPS in multiplayer games is just flagrantly incorrect.

This whole thing reeks of you having an ill-conceived preconception about how things must work that you are refusing to just move past.

Let it go.
 
Krogoth
Emperor Gerbilius I
Posts: 6049
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: Framerate (or latency): Average vs Consistency?

Thu Jun 25, 2015 11:36 pm

Glorious wrote:
At any rate, programmers didn't "address" the timing and speed issues in those programs. They may have written NEW programs that relied on different timing sources, but you can probably find those same programs on an abandonware site somewhere and they'll have the exact same problem they always did. These practices may have persisted into later games (like into the 90s and maybe even later...), but it just goes back to inertia and the fact that the same basic problem existed in a slightly different form: no high-quality timer available, and dire performance implications for non-cycle-dependent timing.

But that's just history. The REAL POINT here is that for the last ten years or so (AT LEAST), the kind of problem you are vaguely alluding to is EXTREMELY UNCOMMON in multiplayer games. It is perhaps less rare in singleplayer games, as I can dimly remember a few that had scripts and a few other things break at extremely high framerates, but it is still very rare.


You are just admitting that modern programmers use limiters in their code to prevent the earlier issues that arose when computer hardware was able to run applications at "turbo speeds". That's why such issues are such a rarity these days. :roll:
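For what it's worth, a limiter of the sort being argued about is just a sleep against a per-frame deadline. A minimal sketch (the 100 FPS target and function names are arbitrary, not any engine's actual code):

```python
import time

def limited_loop(target_fps, n_frames, render=lambda: None):
    """Render n_frames, sleeping off whatever is left of each frame's budget."""
    budget = 1.0 / target_fps
    deadline = time.perf_counter()
    for _ in range(n_frames):
        render()                 # however fast or slow this is...
        deadline += budget
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # ...the loop never runs faster than the cap

start = time.perf_counter()
limited_loop(target_fps=100, n_frames=10)
elapsed = time.perf_counter() - start  # roughly 0.1 s, regardless of render speed
```

Note that a cap like this limits rendering only; it says nothing about the simulation tick rate, which is the point under dispute.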
Gigabyte X670 AORUS-ELITE AX, Raphael 7950X, 2x16GiB of G.Skill TRIDENT DDR5-5600, Sapphire RX 6900XT, Seasonic GX-850 and Fractal Define 7 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
Pancake
Gerbil First Class
Posts: 161
Joined: Mon Sep 19, 2011 2:04 am

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 26, 2015 12:33 am

Glorious wrote:
JBI wrote:
The 60 Hz NTSC rate was actually chosen to match the AC power line frequency, to reduce visual artifacts caused by leakage of AC hum into the (analog) circuits of early televisions.


Yup, and that's equally why PAL is 50Hz, much to the chagrin of British/European/Australian old school console speed runners. :wink:


We enjoyed our higher definition 625 line (per frame period) vs your crude low res 525 line displays. Mind you, it took a bit of work to get sprites into the border on my C64 to use all that extra resolution goodness.
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Framerate (or latency): Average vs Consistency?

Fri Jun 26, 2015 5:50 am

Krogoth wrote:
You are just admitting that modern programmers use limiters in their code to prevent the earlier issues that arose when computer hardware was able to run applications at "turbo speeds". That's why such issues are such a rarity these days.


No, I'm not.

Since you don't refute anything I wrote, but rather just restate your previous erroneous statement with ZERO further elaboration...

...Meh. I'm not impressed. :wink:

Pancake wrote:
We enjoyed our higher definition 625 line (per frame period) vs your crude low res 525 line displays.


:lol:

Pancake wrote:
Mind you, it took a bit of work to get sprites into the border on my C64 to use all that extra resolution goodness.


I can believe it! :P
 
Mr Bill
Gerbil Jedi
Posts: 1819
Joined: Mon Jan 21, 2002 7:00 pm
Location: Colorado Western Slope
Contact:

Re: Framerate (or latency): Average vs Consistency?

Wed Jul 01, 2015 9:11 pm

I was reading the news today about the Fury X whining and read through to the Fury X review over at pcper.com.

Damage - it's interesting that pcper.com uses a second PC to capture the video card output and post-process the data, as they mentioned imitating TR in their Fury X review.
http://www.pcper.com/reviews/Graphics-C ... -and-Testi

How does Tech Report actually get its frame data? I did a quick skim through the article, and maybe I missed it, but I could not find where you explained how TR goes about capturing the signal.

Edit: Does TR use this same kind of hardware to get the signals?
http://www.pcper.com/reviews/Graphics-C ... nce-Metric

Edit2:
Westbrook348 wrote:
Explanation of Damage Labs FCAT setup: http://techreport.com/review/24553/insi ... ture-tools

Thank You!
Last edited by Mr Bill on Thu Jul 02, 2015 9:23 am, edited 1 time in total.
X6 1100T BE | Gigabyte GA-990FXA-UD3 AM3+ | XFX HD 7870 | 16 GB DDR3 | Samsung 830/850 Pro SSD's | Logitech cherry MX-brown G710+ | Logitech G303 Daedalus Apex mouse | SeaSonic SS-660XP 80+ Pt | BenQ 24' 1900x1200 IPS | APC Back-UPS NS-1350 | Win7 Pro
 
Westbrook348
Gerbil
Posts: 91
Joined: Tue Nov 18, 2014 4:15 pm

Re: Framerate (or latency): Average vs Consistency?

Wed Jul 01, 2015 10:51 pm

Explanation of Damage Labs FCAT setup: http://techreport.com/review/24553/insi ... ture-tools
