GeForce versus Radeon captured on high-speed video

Our recent article comparing the Radeon HD 7950 to the GeForce GTX 660 Ti in many of the season’s top new games has attracted some new attention to our latency-focused game testing methods. Some folks are skeptical about whether there’s added value to testing with any more granularity than an FPS average would provide. Others have wondered about whether the tool we’re using to grab frame time data, Fraps, really captures an accurate reflection of how frames are arriving at the display. There are some interesting questions to explore along those lines, but our intention has always been to illustrate the impact of high-latency frames on animation smoothness visually.

And we can do that quite easily, thanks to the high-speed camera I bought a while back for just that purpose. I’ve waited much too long to put it to use.

One of the test scenarios with the starkest difference between the GeForce and Radeon in our recent tests is our new Skyrim sequence, where we take a walk through the countryside. You can see the data and graphs we’ve reported from it here. Neither card performs poorly in this test—the 7950 averages 69 FPS, while the GTX 660 Ti averages 74 FPS—but frame delivery is generally uneven on the Radeon, punctuated by occasional hitches where frames take 60 milliseconds or more to arrive. Such spikes are relatively rare on the GeForce. Here’s a look at the frame time plot, which tells the story:

The difference in smoothness between the two cards was obvious as we conducted our play-testing. Since folks were wondering, we figured we might as well capture some high-speed video to show you the difference between the two.

We have a couple videos to share. The first one was filmed at 120 FPS, twice the speed of our 60Hz IPS display. I recommend hitting the “view full screen” button to get a better sense of motion.

I think the occasional hitches on the Radeon are pretty easy to see. The big, obvious slowdowns only happen every so often, but the GeForce avoids them—just as the test results told us.

Remember, the Radeon HD 7950 turns in an average of 69 FPS in this very test run, a rate that has been considered “good” in FPS terms for years. This is why measuring frame latencies, not just average rates, is so crucial. FPS averages don’t capture what’s happening from moment to moment.
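To make that concrete, here is a minimal Python sketch, using made-up frame times rather than our measured data, of how two runs with nearly identical FPS averages can have wildly different worst-case frame times:

```python
# Minimal sketch (hypothetical numbers): same average FPS, very
# different smoothness once you look at individual frame times.

def fps_average(frame_times_ms):
    """Average FPS over a run: total frames / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def percentile(frame_times_ms, pct):
    """Simple nearest-rank percentile of frame times."""
    ordered = sorted(frame_times_ms)
    rank = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[rank]

# Steady delivery: every frame takes ~14.5 ms.
steady = [14.5] * 200
# Uneven delivery: mostly fast frames punctuated by 60 ms hitches.
uneven = [12.0] * 190 + [60.0] * 10

print(fps_average(steady))     # ~69 FPS
print(fps_average(uneven))     # also ~69 FPS
print(percentile(steady, 99))  # 14.5 ms
print(percentile(uneven, 99))  # 60.0 ms -- the hitches the average hides
```

Both traces report "69 FPS," but only one of them animates smoothly, which is exactly what a 99th-percentile frame time (and the eye) picks up.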

The next video was shot at 240 frames per second, four times the speed of the display.

This one is a little more tedious to watch, I’ll admit. However, it should provide many hours of entertainment to those who want to do fine-grained visual comparisons between the two cards. The big hitches are still apparent on the Radeon, but here it may be possible to see the superior moment-by-moment smoothness on the GeForce. I dunno. In some ways, I think it’s easier to get a sense of the smoothness at full speed than it is in slow motion.

For what it’s worth, Cyril recommends staring at the border between the two videos and unfocusing your eyes a bit in order to best monitor motion on both sides at once. Sounds like a recipe for a headache to me.

Anyhow, you now have a little bit of visual evidence to go with the mountains of data we’ve provided. Make of it what you will.

Comments closed
    • Badben
    • 7 years ago

    The Tech Report is the only place I’ve seen this frame latency testing, and it’s fantastic. I don’t trust plain average-FPS tests anymore; they don’t tell the whole story at all. I would much rather have a few fewer FPS and have it smooth. I really hope this kind of testing becomes standard for all graphics card tests. It has actually convinced me to buy a GTX 660 Ti.

    • beck2448
    • 7 years ago

    Great testing!!! Finally a real explanation of why frame rate numbers can be misleading or even manipulated.

    • kristi_johnny
    • 7 years ago

    Scott, you should post a video with both cards running @60 fps.
    It would make the comparison clearer; we’d see how the cards behave normally, without the footage being slowed down from a 120/240 fps capture.

    • OneAboveAll
    • 7 years ago

    Why not use more than one game? I like this kind of benchmark, but using different games would make it more valid. Skyrim is a console port at best, and I don’t think it’s the best-optimized game for PC out there.
    You could have used demanding games such as BF3, Metro 2033, or maybe even Arma 2.

    • jessterman21
    • 7 years ago

    Anyone find frame tearing more annoying than random high-latencies in third-person games? I play Arkham City on Extreme, but the small hitches annoy me far less than the frame tearing in Prince of Persia (no motion-blur to hide it).

    —Also, MSI Afterburner has introduced a frame-time metric in their graphs! And it perfectly portrays the limitation of fps-over-time metrics (appears right above frame-times in the graphs).

    • Kaleid
    • 7 years ago

    Middle pricing band – This pricing band was far less competitive as the Radeon HD 7950 with Boost simply demolished the GTX 660 Ti across the board with regards to raw frame rates and overall game play experience across our suite of testing.

    Low pricing band – Not to be left out, the Radeon HD 7870 held up well on its own, matching the game play experience and raw frame rates to the GTX 660 Ti from the middle pricing band (except for Sleeping Dogs).

    [url<]http://hardocp.com/article/2012/11/12/fall_2012_gpu_driver_comparison_roundup/8[/url<]

    • egon
    • 7 years ago

    Really appreciate the way TR has been exploring new ways to benchmark.

    I’m not much of a gamer in general but am a bit of a train sim enthusiast, and have long observed severe stuttering in RailWorks/Train Simulator – what was frustrating is it wasn’t reflected in conventional FPS benchmark results, but was finally made evident in an ‘objective’ way thanks to frametime-based benchmarking.

    A recent key observation while benchmarking the game on my Radeon 6850 is how inconsistently the frames are delivered – one frame might take 17ms, the next 24ms, then 17ms, and so on. It revealed why, regardless of a good average FPS and all but the occasional frame taking more than 40ms, there was a constant lack of fluidity to the motion in the game. I posted some of the results here:

    [url<]http://railworksamerica.com/forum/viewtopic.php?f=6&t=4066&start=60#p71216[/url<]

    Not sure if it's a Radeon-specific issue (it's hard to get train simmers interested in running FRAPS and comparing results), but the consistency of frame delivery is something that does appear to have a significant impact on perceived smoothness, yet is largely overlooked in benchmarking.

      • aspect
      • 7 years ago

      [url=https://www.youtube.com/watch?feature=player_detailpage&v=86PUB4u2s2A#t=93s<]Agreed Train Simulator is serious business.[/url<]

      • lalelilolu
      • 7 years ago

      Read the above. I have AMD too and use it for race simulators.
      I was able to eliminate all the stuttering with RadeonPro: triple buffering + dynamic frame limiter.

      The cost was like 5% in FPS.

    • ultimo
    • 7 years ago

    This cries out for back-to-back testing with different drivers, 12.3, 12.7, 12.11 etc… Can you please do that, TR?

    For my fellow AMD users: I also tested this weekend on my 7950, Skyrim, 12.11 beta 11 drivers. Using Lucidlogix MVP seems to be a remedy (as it was in D3).

      • lalelilolu
      • 7 years ago

      The author should know that microstutter (on single or multi GPU) hasn’t been a problem for quite some time now.
      All you need to do is use triple buffering + vsync + a frame cap.
      That’s what consoles have used for six years, with rare exceptions like Uncharted.
      Even Afterburner, Dxtory, Fraps… would do the trick. And the average FPS will be the same.

      Far Cry 3 is broken on both nVidia/AMD.

      You can make the configs permanent on the AMD Radeon hardware alone using RadeonPro: [url<]http://www.radeonpro.info[/url<]

      A review here: [url<]http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-11.html[/url<]

      Even with multi-GPU you can get results equal to those from single cards. I used MVP for the evaluation period, and while it didn't improve performance, it did smooth the framerate. But it is tied to a specific hardware platform, and it is paid.
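For illustration, the frame cap in the "triple buffering + vsync + cap" recipe above boils down to sleeping out the remainder of each frame's time budget. A minimal Python sketch of the idea (hypothetical, not RadeonPro's actual implementation):

```python
import time

def render_frame():
    """Stand-in for real rendering work; here just a short sleep."""
    time.sleep(0.004)  # pretend rendering took ~4 ms

def run_capped(frames=30, target_ms=16.7):
    """Render `frames` frames, sleeping out the rest of each frame's
    budget so delivery is even rather than as-fast-as-possible."""
    delivered = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        spent = (time.perf_counter() - start) * 1000.0
        if spent < target_ms:
            time.sleep((target_ms - spent) / 1000.0)
        delivered.append((time.perf_counter() - start) * 1000.0)
    return delivered

times = run_capped()
# Every frame now takes at least the 16.7 ms budget; none arrive "early",
# which is what flattens the frame-time plot (at the cost of peak FPS).
print(min(times), max(times))
```

The trade-off the thread describes falls out directly: the cap throws away the fastest frames, so the average FPS drops a little, but the frame-to-frame variance collapses.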

        • ultimo
        • 7 years ago

        [quote=”lalelilolu”<] triple-buffering + vsync + cap. [/quote<] Yeah, but that will introduce quite a bit of lag, something MVP doesn't do; it's designed to even reduce it.

          • lalelilolu
          • 7 years ago

          vsync has other options like “vsync” and “double vsync”.
          If you apply the second (or the first without triple buffering), you’re gonna have huge lag, with steps from 60 straight down to 30 fps.
          I used the “lag vsync” for a long time before I discovered that vsync has other options ¬¬…
          I lost a LOT of game experience until I started to play racing simulators (rFactor, iRacing) and found a lot of vsyncs.
          You can use vsync + tb, double vsync, swap tear vsync/dynamic vsync, dynamic frame cap, and use different parameters with the vsync: sync by timer, sync by display refresh rate, sync by gpu.
          And inside those you can fine-tune it to trade latency against fps:
          Render Once Per VSync=”0″
          Steady Framerate Thresh=”0.00000″
          Flush Previous Frame=”0″
          Synchronize Frame=”0.00000″
          Delay Video Swap=”0″

          You are right about it: MVP here eliminates tearing WITHOUT limiting the framerate (cap).
          But in any motion picture, like GPU-rendered scenes, the ideal is to have no variation.
          This can be tested in a race sim like F1 2012 (there’s a demo on Steam).
          The MVP results will have less lag, but you cannot control a car with a variable framerate.

      • lalelilolu
      • 7 years ago

      I ran some tests with Fraps (for MVP the results are from months ago; Fraps keeps the benches in CSV files) and made the calculations with Excel.

      Game: Need for Speed Most Wanted – Criterion
      Hardware: i5 Sandy + HD 6970

      *Lucid MVP:
      Frames, Time (ms), Min, Max, Avg
      3033, 76082, 31, 49, 39.865

      *Triple buffering + Vsync + dynamic frame limiter:
      Frames, Time (ms), Min, Max, Avg
      2871, 71792, 39, 41, 39.991

      And the average latency variation (looking at the variation in the Excel graph, excluding 3-4 little spikes):

      MVP: 3ms

      With the second settings:
      1ms – a straight line

      no tearing on both

        • ultimo
        • 7 years ago

        how do you measure latencies? probably we are talking about two different things: I’m interested in the lag between an input (mouse movement, key press) and the feedback on the screen. That includes computation time as well, but IIRC, the graphics pipeline accounts for the biggest proportion (there was a nice article on Anand)

          • ultimo
          • 7 years ago

          there we go:
          [url<]http://www.anandtech.com/show/2803[/url<]

          the graphics pipeline is indeed very important when it comes to responsiveness of a game.

            • lalelilolu
            • 7 years ago

            News: decreasing memory frequencies (I could not control memory timings/latency) helped even more:
            I could get a straight line with dynamic frame cap.

            Edit: and now I see the confusion with terminology. My mistake: I was using “latency” where I should have used “latency variance”. The bigger variance in latency is what gives the stuttering (not always coupled with input lag).
            The spikes (the deviation from a given standard) increase avg FPS but introduce stuttering.
            The stuttering (big gap) between sequential frames is eliminated by what I mentioned.
            But you can still have lag, for example: 0 variation and an FPS of 10.

            And that’s something not unusual with GPU overclocking, like on the 7950: you can add some 400MHz to the base memory clock with no errors. But have we checked the latencies/memory timings?

            The problematic title: Far Cry 3. Impossible to fix on DX11.

            About the input lag: I still have both a 60Hz LCD and a CRT. The LCDs are much worse. You can overclock them, but with a huge increase in ghosting.
            And the portion beyond the GPU, going through the display, is bad for everyone but 240Hz: you can send a packet from the US to the UK through the internet before a frame on a 60Hz LCD is displayed.

            Fraps and RivaTuner OSD can log at what time each frame was sent. You open the log with Excel and ask it to calculate the delta between each frame. Then you plot it in a graph and can also apply statistical functions like variance to see how much the latencies fluctuate.
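The delta calculation described above is easy to reproduce in a few lines of Python instead of Excel; this sketch uses a hypothetical log excerpt, not real Fraps output:

```python
import statistics

def frame_deltas(cumulative_ms):
    """Fraps' frametimes log records the cumulative time at which each
    frame was presented; per-frame latency is the consecutive delta."""
    return [b - a for a, b in zip(cumulative_ms, cumulative_ms[1:])]

# Hypothetical log excerpt: mostly even delivery with one 60 ms hitch.
log = [0.0, 17.0, 41.0, 58.0, 118.0, 135.0]

deltas = frame_deltas(log)
print(deltas)                      # [17.0, 24.0, 17.0, 60.0, 17.0]
print(statistics.mean(deltas))     # 27.0 ms on average
print(statistics.pstdev(deltas))   # the spread is what the average hides
```

The standard deviation (or variance) of the deltas is the "latency fluctuation" the thread keeps coming back to: the mean alone looks fine even with the 60 ms hitch in the series.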

            The pattern I found with the old-gen 6900 was that the card sends one frame very fast (the spike) and then falls to a huge latency on the next frame. This increased avg FPS by more than 5%.

            MVP Virtual Vsync (without HyperP) decreased avg FPS, but the graphics are very smooth: 4ms avg fluctuation in latencies (NFS Most Wanted), and it did as advertised: more even frame delivery without a fixed frame cap.

            With dynamic frame cap + vsync + triple buffering: almost a straight line. Never more than 2ms of latency variance.
            But a fixed FPS (you plot the data from Fraps and fix the FPS in order to obtain more even latencies).

            In both cases, MVP and dynamic frame cap, the FPS loss from eliminating the spikes was more than 5%.

            And about the subjective feeling of how smoothly the games were displayed:
            There was not a single case where lower latency variance in the Excel/OSD log failed to show a real improvement in the actual perception of the game.
            On race games the improvements in experience were tremendous.

      • lalelilolu
      • 7 years ago

      Running more and more tests, I see the pattern of spikes again and again in all cases.

      And the FPS gain from the spikes is like 5%. That’s too much.

    • DarkUltra
    • 7 years ago

    Scott, you [i<]really[/i<] should test this on a 120hz monitor with those 3d cards, and two high end cards. Then we would see if it scales, and it is much easier to see on a 120hz display.

    [url<]http://www.youtube.com/watch?v=ScFAvPN7aJM[/url<]

      • Airmantharp
      • 7 years ago

      It’d make a good test, but 120Hz is still far from mainstream. Even if we’d like it to be.

        • Prestige Worldwide
        • 7 years ago

        While it might not be mainstream, 120Hz is the best gaming upgrade I’ve ever made. Day and night difference in the fluidity of gameplay jumps out at you the second you hop into any game. Strongly recommend it if you have the dosh to splurge on a new monitor.

        I game on a BenQ XL 2420T.

    • Kaleid
    • 7 years ago

    January:

    [url<]https://techreport.com/r.x/radeon-hd-7950/skyrim-nv.gif[/url<] [url<]https://techreport.com/r.x/radeon-hd-7950/skyrim-beyond.gif[/url<] Now: [url<]https://techreport.com/r.x/7950-vs-660ti/skyrim.gif[/url<] [url<]https://techreport.com/r.x/7950-vs-660ti/skyrim-beyond-50.gif[/url<] [url<]https://techreport.com/r.x/radeon-win8/skyrim-beyond-50.gif[/url<] So what happened?

      • green
      • 7 years ago

      any number of things

      but possibly started with this:
      techreport.com/news/22710/catalyst-12-3-drivers-fix-bugs-fully-support-new-radeons
      [quote<]Skyrim: No longer displays flickering and texture corruption.[/quote<]
      where fixing it reduced performance by forcing a different / longer code path

      but that probably would have been counteracted by this:
      techreport.com/news/23190/new-catalyst-drivers-improve-security-performance
      [quote<]Up to 25% Skyrim[/quote<]

      as well as this:
      techreport.com/news/23770/update-new-catalyst-betas-boost-7000-series-performance
      (image indicating an 8% boost to skyrim @1080)

      but then counter-counteracted by this:
      techreport.com/news/23918/latest-catalysts-improve-performance-in-far-cry-3-linux
      [quote<]Resolves the Skyrim lighting issue (missing a lighting pass) for the AMD Radeon HD 7900 Series[/quote<]
      where if it was "missing" a lighting pass, adding it back in is gonna eat up some performance gain

      this is also related given the motherboard used:
      techreport.com/news/24005/new-catalysts-fix-bugs-improve-far-cry-3-performance
      [quote<]Resolves a sporadic system hang encountered with a single AMD Radeon HD 7000 Series GPU seen on X58 and X79 chipsets.[/quote<]
      where the fix may have had an impact on performance, although it's beta 11 and not beta 8 as used in the article; though since it's all part of the 12.11 package it could just be "included" as a general thing for 12.11

      i think overall this is a good thing: someone noticed something odd going on with product X, the manufacturer has been alerted to the problem and is looking into it. we'll hopefully get a response from them next year (can't expect them to work through christmas / new years) and a fix not too long after. this can only result in making product X and (hopefully) future products better

        • Kaleid
        • 7 years ago

        “Skyrim: No longer displays flickering and texture corruption.”
        Well, this isn’t true; I’ve tried 12.3 and many drivers since, and it still flickers. Haven’t tried the latest betas though.

      • faptastic
      • 7 years ago

      It was all fine and dandy in September with Catalyst 12.7 beta:

      [url<]https://techreport.com/r.x/geforce-gtx-660/skyrim-99th.gif[/url<] [url<]https://techreport.com/r.x/geforce-gtx-660/skyrim-beyond-50.gif[/url<] Anyway I can't replicate the smoothness issue that Techreport is advocating with a 7950 on my 7850 in Skyrim at least. Here's how I tested: [url<]http://www.youtube.com/watch?v=Yj79q5oLm7o[/url<] i5 760@4GHz | P55A-UD3 | Corsair XMS3 2x4GB 1600C9 | HD7850 2GB@1100/5400 | Skyrim is installed on a 1TB WD Caviar Blue | Win 7 64bit Pro, Cat 12.11 b11 | Vanilla Skyrim with high res textures, Max Settings, 2xMSAA, SMAA@Ultra, 16xAF Used RadeonPro to Force settings. vsync off, triple buffering off, flip queue size = default [url<]http://img717.imageshack.us/img717/9799/novsync.jpg[/url<] vsync off, no triple buffering off, flip queue size = 1 [url<]http://img145.imageshack.us/img145/5136/novsyncflipqueuesize1.jpg[/url<] vsync on, triple buffering on, flip queue size = 1 [url<]http://img405.imageshack.us/img405/296/vsynctriplebufferingfli.jpg[/url<] Getting the huge spikes at similar intervals which I assuming is caused by other factors(game engine streaming in new data/area/textures etc...), not the card, so I was fairly accurate with my runs. Here's the save file if anyone wishes to try and replicate my runs, I hit bench key and run right after the guy finishes speaking: [url<]http://www.mediafire.com/?kod300m7v86saul[/url<]

    • madgun
    • 7 years ago

    Just saw this and thought I’d share:

    (Tweet from anandtech’s Guru, Anand himself):

    “I’ve known @scottwasson for a while and I’ve never known him to be biased in his GPU coverage”

    • kristi_johnny
    • 7 years ago

    Well, that escalated quickly

    • ULYXX
    • 7 years ago

    I just wanted to say I like the intro animation and the theme jingle for their simplicity. There are a lot of reviewers out there with annoying background theme music that plays throughout their whole videos. Less is definitely more here. I also agree with ItemSquare’s comment on mirroring, given the suggestion to look at the center line for an easier comparison when the cards hiccup. It would definitely help reduce eyes darting around from left to right. I look forward to more videos like these if it’s not too much trouble for you guys. Keep up the good work. 🙂

    • vargis14
    • 7 years ago

    I hate to ask this, but I am pretty sure both brands have been tested on Windows 7 with an i7 CPU and core parking enabled. I hear a good bit here and there that disabling core parking on i7 CPUs helps with microstutters.

    Cyril, have you ever tried it with core parking disabled? I have it disabled on my 2600K and my 2120s, since they all have hyperthreading and it gives the CPU a better load balance: the cores don’t have to take the time to wake up, since they never go dark and park. Seems reasonable that it could help even if the CPU is not fully loaded.

    I know disabling core parking has fixed stutters in BF3 for people, and the only article I have seen that compared core parking on vs. off used WinRAR as a test bed, and some CPUs had a 40% increase in performance... but I cannot find the article now!!!!!
    BTW it’s Win 7 and i7 CPUs from Ivy/Sandy Bridge-E all the way back to Lynnfield (that would be an 875K, I think :)) and even earlier i7 CPUs. Edit: even 8-core HT Sandy-E server chips could benefit from better load balancing etc.

    Plus I’ve been plugging for a CPU article since I found out about disabling core parking on i7 CPUs, and I wonder if it would help i5s and i3s also.

    Anyways, keep up the good work TR; I know you guys are not biased in any way. At least I have not noticed any in ten years 🙂

      • indeego
      • 7 years ago

      Do you have any benches with core parking off versus on, or is this just hearsay? It’s the same CPU, and even if AMD can’t handle core parking, that still points to AMD as the issue.

        • Bensam123
        • 7 years ago

        You know, some things are about more than just pointing fingers…

        Just like I’ve heard hyperthreading increases microstutters, and in my experience it does. So I had HT off on my i7-860 even though it’s one of the features I paid for. I’ve written to Scott about this but never got a reply back on it.

        Sometimes things start out as subjective, and people have to do research to make them quantifiable and objective. You can’t always pull up benchmarks from someone else’s work, because they don’t exist.

          • derFunkenstein
          • 7 years ago

          Some things might be about more than just “pointing fingers” but until data is produced, “pointing fingers” is all it is.

            • lilbuddhaman
            • 7 years ago

            aka “Suggest a Hypothesis” in that scientific method thing….

            • derFunkenstein
            • 7 years ago

            If suggesting is as far as you ever get, though, you’re not being helpful. Great, it’s an idea, but there’s no evidence.

            • Bensam123
            • 7 years ago

            Pointing fingers is about the blame game. It makes the people who turn out to be right feel better than the people on the other end.

            That way it’s possible to completely polarize an unbiased topic that is designed around gathering information. Science in and of itself isn’t biased; people make it that way when they start trying to force their own agenda or views onto it.

            A hypothesis is unbiased and is purely formed around finding facts. If someone finds data that doesn’t support their hypothesis, you don’t go ‘lolol u rong, u wasted all that money, it was so pointless’, because failure and finding data are about figuring things out. It doesn’t matter if a hypothesis is true or not; it matters that you learn something from it.

            Just like in the case of benchmarking: it’s not about proving a hypothesis, it’s about figuring things out and learning from it. In this case Scott emailed AMD and notified them about what he found, because they can then learn from his results or perhaps offer insight as to why it’s happening... possibly even offer solutions.

        • Ryu Connor
        • 7 years ago

        [url=http://www.xtremehardware.com/eng-reviews/eng-reviews/core-parking-on-windows-seven-winrar-performance-with-sandy-bridge–201111226092/<]Link[/url<]

        This was the article he was referencing that showed a benefit from core parking being disabled. Unfortunately the article limits the test to just one application.

        It is my understanding that Microsoft tweaked their power management code for core parking in Windows 8.

          • Bensam123
          • 7 years ago

          I don’t think microstuttering would show a big impact in benchmarks, as microstuttering has a lot to do with fluidity rather than simply returning numbers that may have latency associated with them.

          Like the hyperthreading issue I describe: it shows very little downside in benchmarks, but you can tell when a core gets overloaded, as it takes longer to deliver time-sensitive data (such as in games), even if it raises the overall throughput.

      • Bensam123
      • 7 years ago

      What is the technology called, so I can find it in my BIOS? Personally I disabled HT because I found it caused microstuttering (although more like smaller stutters than micro-stuttering, if that makes sense… mini-stutters?). I’ve heard of other people doing the same.

        • MadManOriginal
        • 7 years ago

        It’s not a BIOS option, it’s Windows. I followed this guide (or one just like it):
        [url<]http://forum.notebookreview.com/asus/494232-how-adjust-core-parking-inside-windows-7-a.html[/url<]

        I like this one better than the ones that just straight delete registry keys, because it allows adjustment within Control Panel -> Power Options. Confirmed to work in Resource Monitor (none of the cores say ‘parked’ any more).
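For reference, the adjustment such guides walk through can also be made from an elevated command prompt with powercfg. A sketch, assuming the stock Windows 7 power scheme (the long GUID is the "processor performance core parking min cores" setting):

```shell
:: Un-hide the "core parking min cores" setting in Power Options
powercfg -attributes SUB_PROCESSOR 0cc5b647-c1df-4637-891a-dec35c318583 -ATTRIB_HIDE

:: Require 100% of cores to stay unparked on AC power, then re-apply the scheme
powercfg -setacvalueindex SCHEME_CURRENT SUB_PROCESSOR 0cc5b647-c1df-4637-891a-dec35c318583 100
powercfg -setactive SCHEME_CURRENT
```

This only changes the active power plan, so it is easy to reverse by setting the value back below 100 or switching plans.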

          • Bensam123
          • 7 years ago

          You sure it doesn’t have a BIOS option? C-states can also be influenced by the OS, but they can likewise be found in the BIOS and disabled there.

          Where do you get a report about parked cores in resource monitor? I’ve never seen that.

          Edit: I think my normal activity is so high that they never park, so I’ve never seen the parked status. I’ll have to try this out though.

          Doing some googling I actually found a few utilities that do it without going into the registry.

          [url<]http://www.thewindowsclub.com/enable-disable-core-parking-windows[/url<]
          [url<]http://www.bitsum.com/about_cpu_core_parking.php[/url<]

          It appears as though this substantially increases the performance of AMD Bulldozer/Vishera CPUs in time-sensitive workloads... i.e. gaming, encoding, or audio work. O_O What do you think the chances are Scott actually disabled core parking for Vishera?

          There is even a Microsoft KB for it.

          [url<]http://support.microsoft.com/kb/2646060[/url<]

          Edit2: Holy fuck, this may be a placebo, but my mouse movement feels ridiculously improved simply by doing this... Overall system responsiveness on my i5-3570K, and even my laptop, which is a Core 2 T8300, is massively increased. Looking at the voltage for my processor, it still goes into C-states and reduces clock frequency. Looking at power consumption, it hasn't changed at all in Argus Monitor. Dude, wtf.

            • MadManOriginal
            • 7 years ago

            lol. yw 🙂

            I don’t like the utilities because I am a ‘go as deep as possible solution’ kind of guy. But they would make a lot of sense to use on a laptop to switch between wall and battery power.

            • Bensam123
            • 7 years ago

            I like having an .exe because you can leave it on your desktop and it reminds you to do it on other computers, as well as being able to reverse the changes, all with an easy-to-use UI.

            Yeah, I thought this was something like C-states, but it appears to be an MS thing.

      • Bensam123
      • 7 years ago

      I would extremely, extraordinarily, highly, beyond a doubt recommend that anyone who does any sort of encoding, gaming, or audio work DISABLE CORE PARKING!

      I just did it and I’m instantly a believer. This improves system responsiveness in ways I can’t even begin to comprehend (I’m highly sensitive to this sort of thing…). Even if your cores are never listed as parked it most definitely STILL improves fluidity immensely.

      If you’re using Vishera and Bulldozer it improves performance as well.

      Here is an easy-to-use utility for doing it, including a profile for battery (in case you’re on a laptop).

      [url<]http://www.bitsum.com/about_cpu_core_parking.php[/url<]

      It says it may increase power consumption, but C-states still work for me. Frequency still goes down, voltage still goes down when it's not under load, and looking at power consumption in Argus Monitor, it hasn't increased. However, system fluidity and responsiveness are off the charts now. I would estimate they increased by maybe 400%. The easiest thing to notice is mouse responsiveness.

        • vargis14
        • 7 years ago

        Thanks Bensam123... sorry I did not notice your post about disabling it in the BIOS. It’s strictly confined to Windows 7 to my knowledge.
        I did it the old-fashioned way, since there was no utility available to disable core parking.

        I had to log in as an admin and change a value from the command line, then save and reboot.
        Needless to say, I did it on all my machines.

        As for the finger pointing and evidence... all I have is first-hand experience and many accounts from people on different forums. Plus the one article I found but lost, which Ryu Connor found... thanks Ryu, I am very happy you found that.

        All I can say, if you do not believe it or won’t try it, is this: it’s your PC; if you want it to perform more smoothly, try it out; if not, live with what you have... but it makes your PC feel good :)

        The reason I posted this info here and in the forums is because I wanted a full-blown TR CPU review article weighing the benefits of having core parking disabled vs. enabled, to get some facts about the core parking issue.
        Who better to get the data and do an article on it than TR? Nobody.

        Also, I would like to point out that I did not by any means discover this... I just came across it, gave it a try, and liked the results... but I had no baseline to post any of my findings.

        • indeego
        • 7 years ago

        [quote<]The easiest thing to notice is mouse responsiveness.[/quote<] Well that solves it then. 😐

          • Bensam123
          • 7 years ago

          I don’t understand? Try it out, then ridicule me.

    • SSJGoku
    • 7 years ago

    No one needs fanboys. If AMD goes bankrupt, y’all gonna pay at least 1000 USD for GK110 next year 😀

    • Krogoth
    • 7 years ago

    200+ responses

    Most of them are delicious fanboy tears. It is hilarious that two pieces of fancy silicon can generate so much rage. It is even sadder that the difference between the two pieces is trivial at best; finding said difference requires the use of specialized tools under deep scrutiny.

      • DeadOfKnight
      • 7 years ago

      I predict that in 2014 we’ll all be looking at AMD. Then in 2016 or 2017 we’ll be looking back again.

      These two leapfrog each other on a regular basis. I wish I could say the same about AMD and Intel.

    • madgun
    • 7 years ago

    In their quest for higher FPS, AMD introduced higher latencies. I wouldn’t want micro-stuttering ruining my gaming experience. For the people crying foul, I would suggest they prioritize gaming quality over bragging rights. If they can live with the micro-stutter or don’t notice it, then by all means this article is not for them.

    • WaltC
    • 7 years ago

    The methodology in tests like these is subjective, and so it is subjective results that we get at the end of the day. The attempt to chart the subjective characteristic known as “smoothness” by way of millisecond measurements recorded as “spikes” in a chart is just one more attempt to take what is at best a subjective experience and turn it into a scientific measurement leaving no room for doubt. Problem is: it doesn’t work that way. Look at this thread. One man’s smoothness is another man’s briar patch…;)

    Comparison 120fps: there appear to be slight but noticeable differences in the position of the camera relative to the subject matter. The camera sometimes seems physically closer to one scene than the other scene, sporadically, making it appear as if the terrain is traveling faster under one camera than under the other, from time to time–hence the unevenness that appears. Human error in the methodology creates human error in the results.

    Comparison 240fps: This example is really skewed. The most obvious thing is that the Radeon camera is positioned dead ahead of the oncoming terrain while the Ti camera is positioned off-center, so that the terrain seems to flow under the camera from right to left on an angle. This makes the terrain passing in front of the Ti camera often appear to flow faster than the terrain traveling under the Radeon camera, and adds one other difference as well. The terrain under the Ti camera is far more “jumpy” and sporadic as it passes by the camera–but not because of any characteristic native to the Ti graphics card…;)

    Basically the differences we see in these short films can be attributed to:

    1) The difference between shooting 120 fps versus 240 fps
    2) The difference of the two cameras in terms of their positions relative to the subject matter
    3) The difference in distance and angles of the two cameras relative to the subject matter

    If the 120fps test was being played back at 60fps, was every other frame dropped? Likewise, were 3 out of every 4 frames dropped or “averaged” to compile the test results for the 240 fps test?

    Not to put too fine a point on it, but I think these kind of tests do kind of show what you think they show, it’s just that they don’t mean what you seem to think they mean…;)


      • DarkUltra
      • 7 years ago

      You are missing one. Even if this is very subjective, hell, even if it is not noticeable at all, we should still look for and expose these things. If not, graphics card vendors may introduce more such issues, and they will stack and become truly noticeable.

      • badpool
      • 7 years ago

      Your list misses the simplest and most obvious explanation for the differences in the video output recordings, i.e. the one given by the article. GeForces and Radeons are completely different hardware + software combinations; even if they both attempt to do the same thing, they are not the same. A modicum of *constant* difference between camera angles would not explain what is clearly an *inconsistent* frame time on the Radeon.

    • ClickClick5
    • 7 years ago

    This blew up fast!

    The Radeon seems smoother, with the occasional 5 frame jump.
    The Geforce is consistent with small frame jumps.

    • DPete27
    • 7 years ago

    Looks like [url=http://www.fudzilla.com/home/item/29813-radeon-hd-7950-has-no-problems-with-windows-8<]the folks at fudzilla[/url<] are fans of TR's coverage of the 7950 vs 660Ti battle.

      • MadManOriginal
      • 7 years ago

      Weird, first they call this site Tech Report, then they call it Extreme Tech.

      I think Scott has a surprise coming for us….

      EXTREME TECH REPORT!

        • Damage
        • 7 years ago

        We have always been extreme. /me sips Mountain Dew.

          • Darkmage
          • 7 years ago

          Extreme people [i<]chug[/i<] Mountain Dew.

            • Damage
            • 7 years ago

            Dude, it’s hard enough to type a comment and take a sip of Dew during a base jump. Cut me some slack.

            • Arclight
            • 7 years ago

            Ah, so you filmed it too
            [url<]http://www.youtube.com/watch?v=oaMTSOI1Zk4[/url<]

            • Meadows
            • 7 years ago

            Be wary of tropical islands and slave trader pirates.

          • MadManOriginal
          • 7 years ago

          Not enough allcaps, man…hardly extreme at all.

            • superjawes
            • 7 years ago

            That implies that SSK is “extreme.”

            • MadManOriginal
            • 7 years ago

            Oh, there’s no implication, IT’S A COLD, HARD FACT.

    • phez
    • 7 years ago

    AMD are you even trying anymore?

      • HisDivineOrder
      • 7 years ago

      AMD shrugs. “Meh.”

    • flip-mode
    • 7 years ago

    I’ll stand by this:
    [url<]https://techreport.com/discussion/24051/geforce-versus-radeon-captured-on-high-speed-video?post=694487[/url<]

      • flip-mode
      • 7 years ago

      Anand replied to my tweet with the following:
      [quote<] I've known @scottwasson for a while and I've never known him to be biased in his GPU coverage[/quote<]
      That's very unsurprising, as I know Scott and Anand hold each other in high regard. I'd paste a link to Anand's tweet if I knew how to do that. I don't do much twitter. (That was my second tweet ever, LOL.)

        • MadManOriginal
        • 7 years ago

        You’re +2 tweets over me then :p

          • flip-mode
          • 7 years ago

          Bro, you should totally tweet me something.

            • MadManOriginal
            • 7 years ago

            TWEET THIS #$%^@#%@ !! 😀

            • flip-mode
            • 7 years ago

            We’re getting downvoted!

            • MadManOriginal
            • 7 years ago

            TWEET THAT!

            I am going to start using TWEET like the Smurfs use ‘smurf’ It will be loads of fun!

            • superjawes
            • 7 years ago

            I uptweeted you guys just to be nice.

            • flip-mode
            • 7 years ago

            LOL, uptweet.

        • derFunkenstein
        • 7 years ago

        Click the timestamp on his reply and voila:

        [url<]https://twitter.com/anandshimpi/status/279440323208417282[/url<]

      • maxxcool
      • 7 years ago

      Lazy, ignorant fanboi haters gotta hate… I upvoted. And FWIW, I come to Techreport and Anand for information I can trust and have yet to be disappointed.

      GJ TR, ignore the h8 and tech info libr8…PEACE!

    • Ryhadar
    • 7 years ago

    In the end, if this means that the first, “true” winner of the 7970 giveaway forfeits his prize due to the latency worries, and I win because of it, then I’m all for these Radeon latency spikes.

    Just fix it after the giveaway, mmmmmk AMD?

      • derFunkenstein
      • 7 years ago

      That is pretty amusing, since it’s like an hour away.

    • jimbo75
    • 7 years ago
      • Arclight
      • 7 years ago

      I see an analogy to global warming, where people feel like they are being deceived because scientists keep telling them again and again that it is happening…

        • l33t-g4m3r
        • 7 years ago

        I don’t. There is no correlation, plus there are other scientists claiming global warming doesn’t exist, and there’s always the question of funding and who benefits from the related legislation. Al Gore summarizes a large part of what’s wrong with the global warming movement. There is no solid ground, nor is this subject even remotely relevant.

          • sweatshopking
          • 7 years ago

          ? I’ve seen a large number of studies that suggest a LARGE majority of scientists are in favor of man-made impacts.

          I agree it’s not relevant.

        • yogibbear
        • 7 years ago

        Yeah but there’s no evidence of global warming. The world hasn’t statistically increased in global mean surface temperature in the last 16 years, yet we’ve increased the CO2 in the atmosphere EVERY SINGLE YEAR over that time span. Don’t tell me that isn’t a trend that should call into question the validity of global warming.

        The 100+ different “climate models” they use to make those stupid predictions you read about the sea level rises 2m by 2100 and the world temp increasing by 8-10 deg. celsius in the IPCC AR4 and the pre-release journal articles for the AR5 review ALL say in their basis of design that if a flat trend is seen for 15 years in global mean surface temperature that they are WORTHLESS. So why are they not scratching their heads and throwing their hands up in the air in disgust at Doha? Maybe because they’re all chumps… wanting to tax the world and control it.

        Thumb me down, but if you accept the so-called “science” you are the chump.

        • Mumrik
        • 7 years ago

        Video card performance is all about what we see with our own eyes. Global warming is an absurd comparison.

        • cynan
        • 7 years ago

        Nah. The whole issue with global warming is not whether it is happening – it is (all you need to do is look at the disappearing glaciers and rising sea levels). The issue is whether or not our recent (relative to human existence) output of greenhouse gases from consuming fossil fuels and mass livestock farming is accelerating warming that would naturally already be occurring. THIS is what is hard to prove and where the debate lies.

        /back on topic

          • Arclight
          • 7 years ago

          [quote=”cynan”<]The whole issue with global warming is not whether it is happening - it is (all you need to do is look at the disappearing glaciers and rising sea levels).[/quote<]
          The whole issue with these HD 7950 vs GTX 660 Ti articles is not whether the frame times are worse for the AMD card - they are (all you need to do is look at the graphs where they measured said frame times).
          [quote="cynan"<]The issue is whether or not our recent (relative to human existence) output of greenhouse gases from consuming fossil fuels and mass livestock farming is accelerating warming that would naturally already be occurring. THIS is what is hard to prove and where the debate lies.[/quote<]
          The issue is whether or not the recent Catalyst drivers are the cause of this occurrence. But THIS is not where the debate lies (we only talk about what's best to buy now at that price point; those who generalize the results across entire past, current, and future video card generations from one brand or another are fanboys).
          ________________________________________
          Yeah, you're right, there is a difference.

      • superjawes
      • 7 years ago

      1) TR avoids giving too much weight to average FPS measurements for a reason
      2) [citation needed]
      3) Actually not sure which one you’re talking about
      4) Which shows what they’re trying to convey in the results
      5) Missing the point of these benchmarks entirely.

      Perhaps if you tried understanding the purpose you wouldn’t waste your time with such posts.

      • derFunkenstein
      • 7 years ago

      When both cards run faster than the monitor’s refresh rate, they should both look extremely smooth. And the point here is that they don’t.

      Hey, if you’re happy with your Radeon card and you need to rationalize your purchase to everyone else, have a ball 😀

        • bittermann
        • 7 years ago

        If you’re happy with whatever card, AMD or NV, then why do you have to rationalize anything?

          • superjawes
          • 7 years ago

          BECASUE NVIDIA FANBOIS AND LIBERAL BIAS AND RWRAWERAFAFH

          /giantsquidofanger

          • derFunkenstein
          • 7 years ago

          Asking the wrong dude, dude.

          If I had to guess, though, probably the rationalization is required to validate the purchase. The only way to be satisfied with a sub-standard experience is to rationalize it.

            • MadManOriginal
            • 7 years ago

            [quote<]The only way to be satisfied with a sub-standard experience is to rationalize it.[/quote<] Hey, if it works for about half of marriages, why not video cards?!

            • derFunkenstein
            • 7 years ago

            And the other half end in divorce! /rimshot

        • jimbo75
        • 7 years ago
          • derFunkenstein
          • 7 years ago

          it’s a camera pointed at a monitor. It’s not going to look like it does when you play it yourself. The color quality will drop. /facederp

            • superjawes
            • 7 years ago

            It’s also being shot at 120 or 240 fps, then slowed down so that a human eye can see the individual frames and the transitions between them. Slow ANYTHING down to a few frames per second and it will look crappy.

            • derFunkenstein
            • 7 years ago

            True, but I was referring to the actual image quality, which is what I assumed he meant.

            • superjawes
            • 7 years ago

            And what happens when we assume?

            j/k

            But I personally thought the image quality was a little better on the 7950 side. The images looked just a little crisper than the 660 Ti.

            • derFunkenstein
            • 7 years ago

            1.) Touche
            2.) I don’t think I can make IQ judgments without actual screenshots. About all you can really tell is motion, and motion on the right is far better.

            • MrJP
            • 7 years ago

            I’m glad I’m not the only one who thought this. While I’m reluctant to make too much of resampled Youtube videos, I perceived the 7950 half of each video to be just a touch sharper. It would be interesting to see proper image quality screenshots at the tested settings as we sometimes get in new architecture reviews (with apologies to the reviewers for asking for yet more work…).

      • HisDivineOrder
      • 7 years ago

      Seems like you’re wrong from the get-go because a lot of people are already convinced…

      Even AMD’s convinced. Perhaps you might want to realize when the company involved says “alarm bells are raised,” then that might mean… there IS a problem.

      If you want to protect AMD at all costs, you best realize when AMD is saying they acknowledge the problem because they can’t promise to fix it AND deny there’s a problem at the same time. You might want to get your rhetoric in line with AMD’s at least.

        • l33t-g4m3r
        • 7 years ago

        It’s a real problem, we just want to know how prevalent it really is.

        None of you nvidia trolls are helping things with your bias. More games clearly need to be tested if you really want to prove this is across the board, and anyone who would disagree is doing so because they think it might hurt their view.

        It won’t hurt to test more games if you’re right.

          • superjawes
          • 7 years ago

          1. Prevalent enough to get AMD’s attention.
          2. Having or not having the issue in other games is irrelevant. For the games tested, it’s a real problem (you said it yourself). Problems should get fixed.

          /end

            • l33t-g4m3r
            • 7 years ago

            I don’t disagree with either point. The catch is if you don’t care about those games, and it plays the games you care about just fine. That’s the real issue.

            • bittermann
            • 7 years ago

            Wow, the nvidia fanbois are really down voting today to get validation…the interwebs makez them feelz powerfulz…

            • MadManOriginal
            • 7 years ago

            It’s fun playing whack-a-mole with l33t-g4m3r’s comments!

            • l33t-g4m3r
            • 7 years ago

            What? It’s fun playing YOU’RE A TROLL? Good for you. BTW, it’s pretty obvious that’s what you’re doing.

            • MFergus
            • 7 years ago

            Some people take this stuff way too seriously. It’s not a huge deal; both cards have drawbacks. It’s not even like AMD’s card stutters drastically more. We’re talking about $300 graphics cards. They aren’t some huge investment worth the stress of all this angry debating.

            • MadManOriginal
            • 7 years ago

            I don’t know, do trolls play whack-a-mole?

            But in any case, yes I HAD FUN COMMENTING ON THIS. Not because I was ‘trolling’ but because you take it so damn seriously. The thing about trolling is that it’s intended to elicit a negative emotional response…but with you, anything other than I AGREE 100% WITH l33T-G4M3R would elicit a negative response. Because of that, you think that any posts which don’t just agree with you are trolling when they aren’t – they are discussing the topic at hand.

          • MrJP
          • 7 years ago

          There you go again. You make a perfectly valid point (is it just these games?) then undermine it by calling anyone who doesn’t see things your way a troll. You’re forcing people to reject your opinions out of hand by taking an unjustifiably extreme position.

    • Sabresiberian
    • 7 years ago

    I lean towards Nvidia for 2 reasons: PhysX, and a smoother experience.

    No one has to tell me which is better in both these areas, I can see it with my own eyes. (That being said, Nvidia isn’t always smother.)

      • HisDivineOrder
      • 7 years ago

      Pretty much.

      • l33t-g4m3r
      • 7 years ago

      I know my 470 certainly isn’t, but there are certainly fewer show-stopping bugs, like the one with Rage. At least you’re being sensible about your preference and not trolling all over people who have a different opinion.

        • MadManOriginal
        • 7 years ago

        ‘Opinions’ that are based on irrefutable and demonstrable facts are not opinions, they are statements of reality.

        ‘Opinions’ that are predetermined by the outcome one wants to be true but are counter to factual data are also not opinions, they are delusions.

          • HisDivineOrder
          • 7 years ago

          And these definitions are facts.

            • MadManOriginal
            • 7 years ago

            You are correct. Imo. 😀

          • l33t-g4m3r
          • 7 years ago

          Facts are based on demonstrable data. TR only has data on a few select games, and that is a FACT. Benchmark more games if you want to claim stuttering exists across the board. All I’m asking for is more data. I’m not claiming the stuttering doesn’t exist, YOU ARE. It is an admitted FACT that you are putting words in my mouth and doing nothing but trolling.

          Results are easy enough to game when you control the tests.

            • flip-mode
            • 7 years ago

            /waits for leet-gamer’s unbiased website that tests with unlimited number of games

            • l33t-g4m3r
            • 7 years ago

            I’m just wondering why we’re just hearing about this NOW, and older dx11 benchmarking titles weren’t used for comparison. Test Metro2033 and Batman, or AvP and BF3. It’s likely TR is blowing this out of proportion if the results are skewed.

            • MadManOriginal
            • 7 years ago

            I’d hardly say you’re wondering, it seems pretty clear you already decided it’s a conspiracy of some sort.

            • l33t-g4m3r
            • 7 years ago

            Right, just continue making things up. That really helps your credibility.

            • MadManOriginal
            • 7 years ago

            If you’re allowed to then so am I! 🙂

            • l33t-g4m3r
            • 7 years ago

            Made what up? I haven’t said anything other than we need more tests before drawing a conclusion.

            • MadManOriginal
            • 7 years ago

            That TR is biased against AMD in some kind of wacky conspiracy theory when all the reasons for doing so are simple, verifiable facts.

            • l33t-g4m3r
            • 7 years ago

            Wrong. TR disagreed with AMD’s Trinity preview restrictions, and didn’t do one. FACT. TR’s 460 review tests Metro 2033 with tessellation on, but DX11 off. FACT. TR disabled DX11 in Batman for the 660 Ti review. FACT. You can’t seriously tell me that TR writes articles without a certain amount of spin favoring Nvidia. It happens all the time, and is even noticeable when they cover features, because AMD usually has less written about them, done in a last-minute rush. Also, I recall TR using a lot of factory-overclocked Nvidia cards vs. stock AMD ones and pretending that was a fair comparison. No, there is a great deal of bias here.

            • superjawes
            • 7 years ago

            TR recommended AMD cards over Nvidia ones for every single system in the latest system guide. FACT.

            If TR has an Nvidia bias, they’re doing it wrong.

            EDIT: and 3/5 systems in the guide before that had AMD recommendations. FACT.

            And TR just did three giveaways sponsored by AMD. FACT.

            • l33t-g4m3r
            • 7 years ago

            It’s not overt, it’s subtle. But it’s clearly enough to put nvidia on top when things are close.

            Like I said, It won’t hurt to test more games if you’re right.

            However, judging by how the benchmarks were done, and the huge push of shills in the discussion to ignore perspective, I’d say there’s a very good chance things aren’t what they seem. The shills alone make things seem suspicious.

            • superjawes
            • 7 years ago

            The only people I see who are suspicious of the results fall into one of three categories:

            Those who don’t understand the latency metrics
            AMD fanboys
            You

            And your conspiracy theories make you sound a lot like category two. We can run around for years and never accomplish what you consider to be “fair,” so unless you are personally willing to conduct your own tests to confirm your hypotheses, then we might as well end it here.

            • l33t-g4m3r
            • 7 years ago

            No, TR has data on what 3-4 VERY QUESTIONABLE games? It’s quite obvious that you people are extrapolating an OPINION, because you DON’T have enough data to say otherwise.

            Seriously. If a user or another review site decides to do the testing instead, and proves this is the case, then there goes any credibility you could have hoped to ever have. It’s already bad enough. If you think you have nothing to hide, then you have the responsibility to do the extra tests. You don’t make general accusations like this without comprehensive data.

            You have to have a SERIOUS problem with your mentality to think more testing is bad. All I’m asking for here is proof this is what it’s claimed to be, because there isn’t enough evidence as is.

            TR has a responsibility to do those tests, because THAT’S A HUGE LAWSUIT WAITING TO HAPPEN if they’re cherry-picking games. TR is responsible for any lost holiday sales due to a potentially skewed article. If a lawsuit is what’s needed to get some damn balance and perspective around here, so be it.

            • derFunkenstein
            • 7 years ago

            Games that AMD bundles with their cards. The fact they don’t run as well as on Nvidia cards is pretty astounding.

            • superjawes
            • 7 years ago

            Shouldn’t you be benchmarking? How are you going to have time to test DOOM if you spend all your time here?

            • MFergus
            • 7 years ago

            You really think TR did anything that makes them liable for a lawsuit? This is just crazy talk.

            They have zero responsibility to test every game imaginable. If they test a few more, I don’t care, but no site does in-depth benches of 20 games. They have done enough games that they can move on to another project if they choose to do so.

            • MadManOriginal
            • 7 years ago

            I think some meds would bring balance to these discussions better than a lawsuit would.

            And there are 7 games in the ‘revisited’ article which started all this. (Obviously TR has reviews stretching back much further than that.) Here’s the neat thing – if anything there are games biased toward AMD in that article because they are ‘AMD Gaming Evolved’ games. You already know this though because many other people have pointed it out to you so I’m not sure why you keep going on about it. About those AMD biased Gaming Evolved games in particular:

            Sleeping Dogs: NV -4 avg FPS, 99th percentile frame time is a tie, and while the 7950 has over 60 frames that take over 60 ms, the 660 Ti has…6. [url=https://techreport.com/review/23981/radeon-hd-7950-vs-geforce-gtx-660-ti-revisited/5<]'Revisited' review page[/url<]

            Hitman: Absolution: NV -22 FPS, 99th percentile frame time is a tie, while the 7950 has 175 frames that take over 33.3 ms, the 660 Ti has…54. [url=https://techreport.com/review/23981/radeon-hd-7950-vs-geforce-gtx-660-ti-revisited/7<]'Revisited' review page[/url<]

            MoH: Warfighter: pretty much a tie all around, although the summary for that game does say "The result is an almost identical finish in every metric, with the slightest of advantages to the 7950 in the latency-focused numbers." Yup, anti-AMD bias right there.

            In the other games you can literally see how AMD is sacrificing frame times for FPS if you take a look at the results (AC3 aside, where the 7950 doesn't do well at all). When the 7950 is close in FPS but waaay behind in frame times, that tells us that the gameplay experience on the 7950 is less smooth, hence Scott's conclusion stating just that.
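
            For anyone wondering how those latency-focused numbers are built from raw frame times, here’s a minimal sketch (the sample data and function names are made up for illustration; this isn’t TR’s actual tooling or data):

            ```python
            # Sketch: latency-focused metrics from per-frame render times in
            # milliseconds. Hypothetical sample data, not TR's numbers.

            def percentile_99(frame_times_ms):
                """99th-percentile frame time: 99% of frames finished this fast or faster."""
                ordered = sorted(frame_times_ms)
                idx = min(len(ordered) - 1, int(len(ordered) * 0.99))
                return ordered[idx]

            def time_beyond(frame_times_ms, threshold_ms):
                """Total milliseconds accumulated past a threshold (33.3 ms ~ 30 FPS)."""
                return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

            frames = [14.0, 15.2, 13.8, 61.0, 14.5, 16.1, 34.0, 14.9]
            print(percentile_99(frames))      # dominated by the worst spikes
            print(time_beyond(frames, 33.3))  # "badness" past the 30 FPS line
            ```

            The point of metrics like these is that a couple of 60 ms spikes barely move an FPS average, but they dominate the percentile and time-beyond-threshold numbers, which is exactly the disagreement in the results above.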

      • derFunkenstein
      • 7 years ago

      nVidia always tries to smother me, too.

        • Meadows
        • 7 years ago

        A job well done, derFunk.

    • flip-mode
    • 7 years ago

    What other fun can be had with this ridiculously cool camera? Can you start a high-speed video blog? Capture a spinning CPU fan? Is there any prospect of making high-speed video capture a more regular part of video card reviews?

      • MrJP
      • 7 years ago

      Maybe you could capture the fanboys throwing their toys out of the pram while reading each review.

      • MadManOriginal
      • 7 years ago

      I take it you saw the slow motion videos of stuff being blown up?

    • Chrispy_
    • 7 years ago

    Easy to see results, especially in Fullscreen HD.

    I find the high-latency frames are [i<]even more[/i<] of a problem with vsync on, especially if the regular occurrence of them crosses a 16.6 ms or 33.3 ms threshold; it's really, really jarring then…
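
    To make the threshold point concrete, here’s a minimal sketch of an idealized vsync model (an assumed simplification: one frame in flight, no triple buffering), where a finished frame has to wait for the next refresh of a 60Hz display:

    ```python
    # Sketch: idealized vsync on a 60 Hz display. A finished frame can only
    # be shown on a refresh boundary, so its on-screen time rounds up to a
    # whole number of refresh periods. Simplified model, not a real swap chain.
    import math

    REFRESH_MS = 1000.0 / 60.0  # ~16.67 ms per refresh at 60 Hz

    def displayed_interval(render_ms):
        """On-screen interval for a frame, rounded up to whole refresh periods."""
        return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

    for t in (15.0, 17.0, 34.0):
        print(f"{t:.1f} ms render -> {displayed_interval(t):.1f} ms on screen")
    ```

    So in this model, a render time that slips just past 16.7 ms doubles the on-screen interval to 33.3 ms, which is why crossing those thresholds feels so jarring even when the average FPS barely moves.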

    • Haqqelbaqqer
    • 7 years ago

    This is why TR has been on my favorite tab for several years.

    Being a half-wit, it took me quite some time to grasp your “inside the second” concept.
    Once I did, however, it seemed like an ingenious approach to measuring the real world performance of a graphics card.

    These videos prove just that. Frame latencies really ARE more important than average fps.

    Only when I saw it captured like this did I realise I’ve been annoyed by it quite a bit over the years, through many different graphics cards, but never managed to pinpoint why and how…
    Thank you TR for your great efforts towards educating the community, while at the same time delivering the most informative graphics card tests available today!

      • RenatoPassos
      • 7 years ago

      ^ This. Some hundred times.

    • ItemSquare
    • 7 years ago

    If you mirror one video or show half of each, it’ll be easier to compare.

      • Meadows
      • 7 years ago

      I will bump this, because it’s so simple, yet I didn’t think of it either.

    • JrezIN
    • 7 years ago

    It is a nice way to compare the differences, but IMHO it would be easier to see the differences with the videos split down the middle of the screen.
    IMHO of course…

    • Jigar
    • 7 years ago

    [i<][b<]Fudzilla wrote[/b<] - AMD spokesman Antal Tungler is on the ball and said that the review had "raised some alarms" internally at the company. AMD is investigating and hoped to have some answers for us "before the holiday." It appears that AMD also expects the 7950 to perform well in FPS-based benchmarks and give the GeForce GTX 660 Ti a good kicking too. [/i<]

      • chuckula
      • 7 years ago

      So AMD itself agrees that this isn’t some vast pro-Nvidia conspiracy by TR. If only the AMD fanboys agreed with what AMD itself had to say on the issue… but then again, some of them still think that Steamroller is going to be on sale before Christmas, so there’s no sense reasoning with them.

        • l33t-g4m3r
        • 7 years ago

        It’s real, but TR has literally spun this into an endorsement of the 660Ti. That’s the problem. AMD’s looking into the stuttering, and this hit-piece will be obsolete once it’s fixed. The 660Ti is only a better card until the second AMD releases fixed drivers.

        Are you really going to choose a 660Ti over a fixed 7950? Not if you have a brain in your head. Perspective.

          • flip-mode
          • 7 years ago

          If TR starts endorsing products just because they have better performance characteristics, that would definitely cross the line.

            • l33t-g4m3r
            • 7 years ago

            But it doesn’t have better performance characteristics. It’s a cheap shot derived from a short term driver bug.

            • flip-mode
            • 7 years ago

            Yeah, good point, it’s bad form to mention the fact that Radeons constantly have short-term driver problems. TechReport really needs to quit trying to make their video card reviews coincide with Radeon driver bugs. It’s very biased.

            • MadManOriginal
            • 7 years ago

            It’s like pretty much all of AMD’s problems then – short-term, but the fix is always one quarter away!

            • HisDivineOrder
            • 7 years ago

            Or they’ll get it with the next card. Honest.

            • l33t-g4m3r
            • 7 years ago

            It is biased. Because this is a problem that’s existed since the 8500 and Quake 3. If TR is going to base their reviews off new releases and driver bugs, then I’m not going to consider it a neutral review site. You might as well plaster TWIMTBP ads all over, and start shilling for PhysX too. I’ve always made my Radeon purchases knowing full well ATI/AMD had poor support of new titles, which is offset by better long-term usability. Patience is a virtue.

            • MadManOriginal
            • 7 years ago

            Since you didn’t answer it elsewhere…how do you figure the 1+ year old Skyrim is a ‘new title’? I’ll keep asking until you answer 🙂

            • l33t-g4m3r
            • 7 years ago

            ” I’ll keep asking until you answer :)” Nice. You want to start mimicking me too? Maybe you could just devolve into outright trolling while you’re at it. (oh wait, you already have.)

            I never said Skyrim was a new title, but it is a game that TR selected instead of many other viable choices to highlight this issue. Do we see Dirt: Showdown here? No, we see AC3 and Borderlands 2, a TWIMTBP title with PhysX. Meaning this is less prevalent in other games, and more prevalent in the games TR selected. Cherry-picking a few games to test and generalizing the results is in no way professional, nor does it prove this is a widespread problem. The only thing this shows is that there is a problem with Skyrim, and nothing more.

            • MadManOriginal
            • 7 years ago

            You have implicitly said that Skyrim is a new title numerous times now by saying things like ‘driver issues (stuttering) are normal in new games’ when discussing these results for Skyrim. Now that you’ve stated that Skyrim isn’t a new game, and the basis for your entire justification of why this stuttering is acceptable is that it only happens in ‘new games,’ I will consider your justification for stuttering refuted. Thanks for playing!

            • l33t-g4m3r
            • 7 years ago

            Is BF3 a new game? How long did it take to fix that, but it WAS fixed. I never said Skyrim is a new title, nor have I implied it. AMD is just slow with fixing bugs, and being that Skyrim performs fast enough I bet it just wasn’t on the priority list for fixing. Guess what? It is now. Thanks for playing!

            You know, my main point isn’t even about the stuttering, it’s about how TR and people like you are handling it. The Nvidia bias here is overwhelming.

            • MadManOriginal
            • 7 years ago

            Oh, I missed this gem:

            [quote<]The Nvidia bias here is overwhelming.[/quote<] Don't worry, you're singlehandedly more than making up for it with your AMD bias! Bazinga!

            • l33t-g4m3r
            • 7 years ago

            Right. My AMD bias. While using a 470, which FYI stutters WAY more than the 7950 does, yet is handling games at a level I’m satisfied with.

            I’m biased towards value and fairness. Something you seem to know nothing about. Troll.

            • HisDivineOrder
            • 7 years ago

            You keep changing the subject. This isn’t about the 7950 vs cards from two generations ago. This isn’t about the 7950 vs some driver you think existed pre-chop. This isn’t about the ideal version of the 7950 driver that magically fixes all the bugs that have been present for months.

            This is about the present day driver, current gen cards, current gen games, and a test that tells you which one is better at doing a specific thing.

            The 7950 is choppy. I’m sorry. I know it destroys your hope for humanity, but it’s true. You want to talk about choppy, I can tell you my 8800 Ultra is choppier than your 470, which is choppier than the 7950 and the 660 Ti, too. That’s lack of performance. The framerates may seem close, but the 470 is a different gen and the comparison becomes… murky at best.

            Meanwhile, the 7950 and the 660 Ti are cards of the same gen, at similar pricing, and are basically the choice for that price point. To act like that comparison isn’t apt and on point is to be very disingenuous.

            • l33t-g4m3r
            • 7 years ago

            Yes the 470 is last gen, but we didn’t see anything comparing its stuttering vs the 6XXX series, now did we? I’m not changing the subject, I’m showing that there is clear bias in how TR reviews brands. Nvidia always gets an easier pass, just look through the past reviews. TR gamed Metro 2033 with tessellation on and dx11 off, and Batman with DX11 off once AMD pulled ahead. The nvidia bias exists, and that’s why I’m suspicious of how TR is arriving at their conclusions. Who’s to say the stuttering exists across the board when we don’t have enough games tested to prove that?

            • MadManOriginal
            • 7 years ago

            YOU DON’T NEED A VIDEO TO SHOW FRAME TIMES IN EVERY SINGLE GAME TR TESTS! YOU JUST NEED THE FRAME TIME DATA!

            WHAT PART OF THAT DON’T YOU UNDERSTAND?

            /ssk mode

            • MadManOriginal
            • 7 years ago

            Troll or comedy gold? WHY NOT BOTH! 😀

            • Auril4
            • 7 years ago

            There are some articles on Tech Report that are informative, which keeps me coming back; however, Tech Report also has a reputation for being a little biased. It’s hard to envision this site being at the same professional level as other sites such as AnandTech.

            • l33t-g4m3r
            • 7 years ago

            True. Anand is completely neutral about reviewing cards, and does so with a wider game and hardware selection. If I’m going to read voodoo reviews, I might as well start reading [H], since they’re less gamed and less Nvidia-biased.

            • flip-mode
            • 7 years ago

            Why don’t you ask Anand his opinion of TechReport’s bias. I bet my Radeon HD 5870 that if you get a response from Anand he’ll come back and tell you that he holds Scott Wasson and the rest of TR in the absolute and very highest regard in terms of professionalism and impartiality. Seriously. If you get a response from Anand that states that he thinks his site is /more/ professional and impartial than TechReport, then you get my Radeon HD 5870 (you pay shipping).

            The fact is that the only people that call TR’s impartiality into question are the people that don’t know how to distinguish impartiality from thoroughness. How in the hell can you call a high-speed video capture biased? It’s an astounding accusation. TR did an article and then DID TWO FOLLOW-UP ARTICLES in order to double check their own analysis, and they’re called out for bias. It’s pathetic.

            Anyway, the 5870 is yours if you can deliver on the above. I don’t think you can, though, or I wouldn’t have bet the card (which is just a tad slower than the HD 7850, so still pretty damn fast).

            • l33t-g4m3r
            • 7 years ago

            STRAW MAN ARGUMENT. The high-speed video is of ONE GAME. Start testing more games and maybe it would seem less biased.

            • MadManOriginal
            • 7 years ago

            You don’t understand the basics of the scientific method very well, do you? The purpose of this high-speed video camera test was to show that the underlying methodology is valid, and therefore the same data for other games is just as accurate and meaningful. It’s not just about this specific game, or these specific cards, or these specific results.

            You can continue to disbelieve the frame time data with whatever caveats about games and drivers you like, but this high-speed camera test proves that the frame time data reflects the real gaming experience, and that data should be trusted as much as any other results in the future, for all games and video cards.
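            [For what it’s worth, the gap between an FPS average and the frame-by-frame experience is easy to demonstrate numerically. Here is a minimal sketch, with made-up frame times of my own choosing rather than TR’s actual data, comparing two runs with similar average FPS but very different 99th-percentile frame times:]

```python
# Two hypothetical frame-time traces in milliseconds (made-up numbers,
# not TR's data). Run A delivers steady 14 ms frames; run B is mostly
# faster (12 ms) but has occasional 60 ms spikes, like the Radeon plot.
run_a = [14.0] * 100
run_b = [12.0] * 95 + [60.0] * 5

def avg_fps(frame_times_ms):
    """Average FPS: frames rendered divided by total elapsed seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def percentile_99(frame_times_ms):
    """99th-percentile frame time: all but the worst 1% of frames beat this."""
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

for name, run in (("A", run_a), ("B", run_b)):
    print(f"run {name}: {avg_fps(run):.1f} FPS avg, "
          f"{percentile_99(run):.1f} ms 99th percentile")
```

            [Run A comes out around 71 FPS with a 14 ms 99th percentile; run B averages a nearly identical 69 FPS, yet its 99th percentile is 60 ms, exactly the kind of spike the high-speed video makes visible and the average hides.]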

            • flip-mode
            • 7 years ago

            I just tweeted Anand to see if he cares to give a response. I have no idea if he will. But I’m willing to put my money where my mouth is if he does end up responding.

            • tfp
            • 7 years ago

            Can I have your card if he does agree with you? This way no matter what someone wins! 🙂

            • flip-mode
            • 7 years ago

            Hmm…. thinking….

            • derFunkenstein
            • 7 years ago

            For the love of crap, tfp is running a Geforce 9600GT. He NEEDS this!

            • tfp
            • 7 years ago

            And really who doesn’t love crap?

            Also TR, as all of the ATI/AMD cards are defective I’ll take one of them off your hands to help out. We should all do our part to clean up after this mess.

            • flip-mode
            • 7 years ago

            Hey, $115 + ship and it’s yours. That seems like a pretty good deal. PM me.

            • Ryu Connor
            • 7 years ago

            There’s always the remote chance this issue isn’t drivers.

            It could be that the underlying hardware – despite the specifications we interpret to be beefy – just isn’t as efficient as we believe.

            • l33t-g4m3r
            • 7 years ago

            Rage and BF3. It’s the drivers.

            • Growler
            • 7 years ago

            You can’t always predict the future based on what happened in the past. True, it most likely is a driver issue, but, as Ryu pointed out, there is a remote chance that it’s a hardware issue.

            It may be also possible that AMD isn’t able to nail down the exact cause of the stuttering until the next generation is available. Maybe they’ll never be able to figure it out and it goes unfixed.

            Right now, there is an issue that some consumers might find distracting. Suggesting that someone buy the card and hope it gets fixed sooner or later is irresponsible. It’s better to tell someone that there might be an issue that could be fixed in the future, but until then, here is an alternative.

            • jonjonjon
            • 7 years ago

            Yeah, unfortunately my friend just asked me today if he should order a 7950 or 660 Ti. I wanted to tell him to get a 7950, but he isn’t very technical, so I advised him to get a 660 Ti because of these articles. He ordered the 660 Ti and I’m still a little conflicted over telling him to get it. In the end Nvidia just seems smoother and easier to use.

            • Silus
            • 7 years ago

            So basically reviews should not be done or the results should be ignored because drivers have bugs ? I doubt you would be that pragmatic if this was a NVIDIA driver bug…in fact, if NVIDIA had a driver bug, you would probably be here saying that it’s not actually a bug, but rather some optimization or cheat by NVIDIA to get more FPS!

            And “short term” driver bug ? Are we really talking about AMD here ?

            • HisDivineOrder
            • 7 years ago

            Short term driver bug? It’s been present at least six months…

            • Arclight
            • 7 years ago

            I guess that’s too mainstream…….

            • Silus
            • 7 years ago

            I would in fact stop reading TR if they did that…it’s just…ludicrous!

            • derFunkenstein
            • 7 years ago

            edit: I am captain obvious!

    • rrr
    • 7 years ago

    Neither looks very smooth to me…

    Radeon is generally smoother, but with significant occasional “bursts” of choppiness…

    On the GeForce, choppiness is not as significant, but it’s pretty much constant.

    I wouldn’t want to play on either if gameplay was like that.

      • ColeLT1
      • 7 years ago

      Because vsync is off.

      • HisDivineOrder
      • 7 years ago

      Then you’re seeing what they’re trying to say. The inconsistency of the AMD card makes the game feel choppier than the GeForce’s consistency does.

        • rrr
        • 7 years ago

        The thing is Geforce one isn’t smooth either.

          • superjawes
          • 7 years ago

          Slow down anything on a computer and you will find that it is not smooth. What you actually see is an illusion of movement created by streaming several still images in a row at a high enough rate.

          What happens when you speed both back up is that the GeForce will appear smoother because it is [b<]consistent.[/b<] That is, every time you see the GeForce freeze, it is only frozen for a short time, so even though this happens more frequently, it maintains the illusion better.

          On the other hand, the Radeon is much faster, but is plagued by those “bursts” you mentioned. Even though the freezing is less frequent, it is more significant, resulting in bigger jumps when the image unfreezes. When this is sped up, those jumps are more noticeable to the human eye than those of the GeForce.

          And that is what the video (and the reviews) are trying to convey. You won’t be able to see every choppy frame, but you will notice the big ones regardless of what your overall FPS measure is.
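          [The size of those visible jumps can be put in numbers. A small sketch, with a hypothetical panning speed and frame times chosen purely for illustration, showing how far an object moving at constant speed leaps on screen when a long frame finally lands:]

```python
# Hypothetical example: an object panning at 600 px/s advances by
# (frame_time_ms / 1000) * speed pixels each time a frame is shown,
# so one long frame produces one big positional jump.
SPEED_PX_PER_S = 600.0

def jump_sizes(frame_times_ms):
    """Per-frame animation step, in pixels, for a frame-time trace."""
    return [ft / 1000.0 * SPEED_PX_PER_S for ft in frame_times_ms]

steady = jump_sizes([16.0] * 6)           # consistent delivery
spiky = jump_sizes([10.0] * 5 + [60.0])   # fast frames plus one hitch

print(round(max(steady), 1))  # biggest step with steady 16 ms frames
print(round(max(spiky), 1))   # biggest step when the 60 ms frame lands
```

          [With steady 16 ms frames the object never moves more than about 9.6 px per frame; with the hitch it leaps 36 px at once, and that discontinuity is what the eye picks out even though most of the spiky run’s frames are faster.]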

    • Arclight
    • 7 years ago

    I find that there is a big problem in the testing method. The video kept interrupting and there was a white circle made of dots in the middle of the screen that kept spinning……

    Edit
    *Sigh* I feel a disturbing lack of trolling power in my post.

      • MrJP
      • 7 years ago

      Not even close to “the left hand side is slower” I’m afraid.

        • Arclight
        • 7 years ago

        To the left, to the left. All the high latency frames in the box to the left.

      • superjawes
      • 7 years ago

      It was a fair attempt. +1

    • Tristan
    • 7 years ago

    AMD use some tricks to cheat Radeon users and maximize scores in benchmarks.
    Buy radeons, buy…

      • anubis44
      • 7 years ago

      nVidia simply gives you less video memory and a slower memory bus, gets away with charging the same amount of money, and people still buy the GTX 660 Ti. Shame on the buyers, not nVidia.

        • Airmantharp
        • 7 years ago

        Neither the amount of memory nor the width of the memory bus are directly pertinent to the discussion when neither is a limiting factor. This is especially true when you consider that Nvidia improved the efficiency of memory usage from Fermi to Kepler.

          • axeman
          • 7 years ago

          Exactly. What matters is real world performance. Both Nvidia and ATi have at times delivered cards with memory bandwidth that exceeded their competitor’s, but still had less performance. Do your research, and don’t just look at specs; otherwise you might be suckered into buying a GeForce FX, or more currently, a Radeon 7950, apparently.

            • l33t-g4m3r
            • 7 years ago

            “Real world performance”, which actually exists for the 7950, even more so now that drivers are being optimized for the card’s memory performance.

        • Badben
        • 7 years ago

        I don’t think you’ve read the article you just commented on.

      • Meadows
      • 7 years ago

      Do not attribute to cheating what can be explained by incompetence.

    • Squeazle
    • 7 years ago

    Personally I prefer the Radeon based off those videos. The GeForce was de-syncing parts of the screen as they went along, and that just irks the hell out of me. Of course, stutters bad enough can make a game unplayable, but I could tolerate what I saw better than the picture falling apart.

    EDIT: I use a 550 Ti, and it does just fine by me. Just to say I really don’t care about brands in general; I’ll pick whatever suits my budget, performs decently, and is convenient at the time.

    EDIT2: Word choice above.

      • spuppy
      • 7 years ago

      That is screen tearing due to the GeForce running at a higher frame rate (74 FPS) than the Radeon (69 FPS) while the monitor is capped at 60 Hz.

      • MFergus
      • 7 years ago

        The only solution to tearing is vsync.

        • Firestarter
        • 7 years ago

        A higher refresh rate also works wonders

    • l33t-g4m3r
    • 7 years ago

    So… Where’s the exposé for 470/570 owners? It’s pretty well known these cards stutter, especially because of RAM limitations. The original Darksiders stuttered pretty badly on my 470 when it first came out, especially the areas where the floating tiles rose up under your feet, but newer drivers fixed it. Hell, even TR’s given it a small mention in older reviews with BF3. Where’s my video comparison? Nvidia stutters too! OMG, it’s such a travesty! Not.

    Don’t forget the [url=https://www.youtube.com/watch?v=rXKj6Kw-VFk<]TDR[/url<] issues, which AFAIK were never investigated here either. Just conveniently ignore Nvidia issues, but do a multiple-article exposé on AMD. Yeah.

    No, seriously, what’s the point of this? Why is TR going hardcore on AMD for something that no other sites seem to think is important, and likely won’t even exist after the next few driver updates? Is there any investigation to see if specific settings are causing it? No. This is just finger pointing, with no investigation of what’s actually causing the problem. W/E. Think for yourself people.

      • MFergus
      • 7 years ago

      Why do you think they are going hard on AMD? They reviewed an Nvidia card too. I have no idea why you think they are just going after AMD. This is a frame stutter exposé on both companies. Nowhere does it say that Nvidia has no stutter, only that AMD stutters more on the games they tested. There will always be some amount of stutter.

      Maybe they don’t really know how to investigate random card errors like TDR; benchmarks are more fun anyway.

        • l33t-g4m3r
        • 7 years ago

        Because stuttering has existed for years, was much worse with Fermi cards and SLI/CrossFire, and TR’s cherry picking games without investigating the cause. If TR wants to seem impartial, they should have thrown a 470 into the mix for comparison; otherwise the article is too one-sided, which they even admit it is when they recommend a 660 Ti over a 7950. That’s clearly picking sides, especially when it’s obvious the stuttering is a temporary driver issue, not a hardware one, and that should have been emphasized. Other people have mentioned the double standard between Dirt and AC3, and that even PowerTune may be a cause. It shouldn’t be completely left to the readers to do research into why the stuttering is there, but that’s exactly what’s been done here. This type of reporting will only alienate readers, since anybody smart enough to see through the spin will stop taking the reviews seriously.

          • HisDivineOrder
          • 7 years ago

          “That’s clearly picking sides, especially when it’s obvious the stuttering is a temporary driver issue, not a hardware one, and that should have been emphasized.”

          Hmmmm. They said the issue’s been present in every driver they tested, 12.8 to 12.11 at the very least. Not sure why you’d expect this issue to then magically disappear. I suspect AMD’s been ignoring the issue because the only benchmarks that really matter to MOST people are frames per second.

          So what did they sacrifice everything to improve? Frames per second.

          See, you conclude, “It’s clearly a driver issue and temporary,” and I conclude that AMD cards have been doing this for years and they’ll have to give up fps’s (whoops, there go those higher performance drivers) to fix the issue.

          If the 7950’s fps get chopped to make them smoother, then suddenly the 660 Ti has a chance again?

          So your scenario or my scenario seem equally possible as true–which is to say neither is yet known to be the truth–and yet you think TechReport should do more than say exactly what they said?

          They’ve told the reader that the best card for smoothness now is the 660 Ti. They’ve told the reader that the issue is prevalent across a wide range of AMD drivers from the last six months. To me, they’ve done their due diligence. It’s like you’re blaming TechReport for telling their readers that because of this issue, they’ll enjoy the nVidia card more if they’re sensitive to the chop…?

          It’s AMD’s fault for having bad drivers in your scenario and it’s AMD’s fault for futz’ing smoothness in order to boost frames for arbitrary benchmarks in the scenario I described. Either way, the pattern is the same:

          It’s all AMD’s fault. Don’t shoot the messenger.

      • madgun
      • 7 years ago

      Why do you come to this site if you are so satisfied with other reviews? I primarily read TR because of its frame time comparisons, since I absolutely hate micro-stutter. Heck, I even gave up on a dual GTX 680 setup because of micro-stuttering, although I was getting around 150 FPS at 2560x1600 with 8x AA in most games. Some people can brag about their 3DMark scores, but to me the important thing is the quality of gaming. ATI/AMD have produced deplorable drivers over the years. The X1900 XTX was the last ATI/AMD card I owned (and I truly loved it, by the way). Since then AMD has had a history of epic driver failures. They have to get their act together and put together a formidable software suite to go along with their compelling GPUs. Simply bundling in games does not help!

      • Cyril
      • 7 years ago

      So…

      1) One of our articles shows a noticeable problem with a shipping product.
      2) A number of folks accuse us of being biased and fudging the numbers.
      3) We respond by posting two follow-up articles to address those folks’ concerns.
      4) Suddenly, the fact that we posted follow-ups is construed as “going hardcore on AMD” and becomes fresh evidence of bias.
      5) …yet we’re also criticized for not investigating the issue enough.

      Interesting. 😉

        • MFergus
        • 7 years ago

        You just can’t win sometimes can you?

        • l33t-g4m3r
        • 7 years ago

        1) So what? It’s happened with Nvidia and we’ve never seen such articles here.
        2) No, it’s a driver problem and you’re making it out to be worse than it is.
        3) Repeat it long enough and people will believe it.
        4) Yes, because you offer no further insight into the cause, and recommended the bandwidth-crippled 660 Ti.
        5) Exactly. You’re only investigating what you want to show the readers, which is that AMD has a stuttering problem with certain games, without offering any perspective or explanation why.

        Interesting indeed.

        Also, you guys have clearly had a BIG problem with AMD, ever since Trinity. No wonder people are reading between the lines. I haven’t been interested in a Radeon since the 4870, and I’m not particularly a fan of either side anymore. I’m just looking at the big picture, and I think something’s rotten here. This is basically Tom’s Hardware-level reviewing, if you’re not going to offer perspective.

        The 660Ti is perfect for impulse buyers who want a card to work with the latest games now, but how will it perform a year down the road in comparison? The 7950 will obviously perform better in the long run, if that’s what you’re buying a card for. I certainly agree AMD needs to work on supporting recent titles, and if that bothers you, then go with nvidia. It just happens that doesn’t concern me very much, and that’s why I’m reacting as such. I said the same thing about Rage, and now it’s fine, same with BF3.

        It’s not that I don’t see what you guys are saying; there’s a real problem which you covered well, but there isn’t enough perspective on it, and it comes across as overly one-sided. In other words, it’s not what you said, but how you said it.

          • MFergus
          • 7 years ago

          You can only really claim bias if next generation they don’t do something similar when AMD has the much better performing cards. You’re claiming, with no evidence, that they purposefully never did these benchmarks before because Nvidia used to have the bigger issues. Maybe they didn’t do it then because they hadn’t yet come up with the idea of 99th-percentile frame time benches.

          They don’t owe you an explanation, AMD does.

          • jaydip
          • 7 years ago

          So let’s see:

          1. We need to throw a GTX 470 into the mix even though we are testing the 660 Ti vs the 7950?

          2. If it is a driver problem, why can’t it be talked about?

          3. If AMD cards are stuttering, that’s for AMD to find out; it is not the duty of TR. Last time I checked, TR doesn’t write AMD’s drivers.

          4. This test is showing us results from a different perspective. The older FPS data is presented as well. Pick what you like.

            • l33t-g4m3r
            • 7 years ago

            1.) You need something for balance or reference.
            2.) Never said you couldn’t. Just put it in a neutral perspective.
            3.) Right, unless TR has a faulty card. Have they tried other cards? What if it’s PowerTune, like someone suggested? If you don’t do any investigation whatsoever, then this is just a witch hunt.
            4.) What older FPS? Skyrim? How about using some of the older DX11 games that TR previously used for benchmarks.

            There have been several users with 7950s stating they don’t notice stuttering like TR does, so it could very well be something TR’s doing with their benchmarks and cherry-picked games. I can’t take this stuff seriously without more detailed testing.

            • jaydip
            • 7 years ago

            1&2. That’s something I don’t get at all. Every review is neutral as long as it offers a different perspective. If the review had concluded that the 660 Ti had pretty uneven frame times, what would your opinion be?

            3. I remember Scott saying he already tried another card with the same results.

            4. More data would never hurt. But to say TR is biased towards NV would be pretty far-fetched.

            • MrJP
            • 7 years ago

            You know, I have some sympathy with your core argument that some people seem to be taking the results of this particular selection of benchmarks as an indication of underlying problems with all Radeon cards, under all conditions, since the beginning of time. Just a quick browse through any of the recent reviews shows that this is not universally the case, and all the posts along the lines of “…I always knew there was something wrong with my 3870/4870/6970/etc…” are therefore fairly easy to ignore.

            However you lose my sympathy with the way you jump from this reasonable position to some far-fetched conspiracy theory where TR have deliberately selected these benchmarks to present AMD in as unflattering a light as possible. Don’t you think it would have been more unusual if they hadn’t used the latest games for a new review?

            You’ve even gone as far as branding the follow-up articles as a witch hunt, when they’ve been produced purely to address the doubts and questions raised in the comments by the readers. I certainly don’t see how you can possibly level the accusation that they haven’t done enough detailed testing given the additional tests conducted. The only thing they haven’t done that you mentioned is Powertune testing. This could be interesting, but certainly lower priority than the Windows 7, old Skyrim benchmark, and high-speed camera tests. Who knows, at the rate these articles have been coming, Scott’s probably posted a further update while I’ve been typing this.

            In short, pull your horns in and make your points in a more polite and rational fashion, and you might get your concerns addressed.

            (Edit – typos)

            • l33t-g4m3r
            • 7 years ago

            They used those games because those games made it easier to point out flaws. If TR wanted to show this was a wide-ranging problem, which they haven’t done, they should have used their standard benchmarking titles. IMO, they didn’t because the 7950 likely didn’t have any problems with those games. I’m not saying the problem doesn’t exist, but it certainly appears that the issues are being exaggerated and used to promote Nvidia for holiday shoppers.

            I don’t care to be told the 660Ti is a better card here when the issue is driver related and will soon be fixed. The 7950 has more ram/bandwidth, and is obviously a better card at the same price point, driver bug aside. Now go impulse buy a 660Ti because of a short term driver bug. See the problem here?

            • MadManOriginal
            • 7 years ago

            Yeah, the problem is you.

            • l33t-g4m3r
            • 7 years ago

            Only if you’re a mindless zombie incapable of independent thought.

            • MadManOriginal
            • 7 years ago

            Better alert Alex Jones about this one – it’s obviously a HUGE conspiracy!

            • l33t-g4m3r
            • 7 years ago

            So your bias here admittedly originates from your political viewpoints. You’re automatically disqualified from this conversation, as all you’re doing is trolling.

            PS. AJ is as close as it gets to real-time reporting of corruption that takes months to break through mainstream media, like Fast and Furious. His credibility and ability to do so are making him more popular than liberals like you are even remotely comfortable with. After all, AJ is somewhat responsible for Ron Paul’s massive popularity, and vice versa. The next step is putting his show on cable TV, then getting a real conservative elected as president. I can only imagine how you’d feel then. I’d take a week off and party non-stop. Only a matter of time. Oh, and Jesse Ventura said he’s running in 2016.

            • MadManOriginal
            • 7 years ago

            No…I have no bias. I just like making fun of conspiracy nutjobs, and by extension those who believe their most outrageous claims and also see conspiracies everywhere else like you see here.

            Too bad you can’t take a joke, but I guess being incredibly wrong is making you extra irritable today.

            • Lazier_Said
            • 7 years ago

            “I don’t care to be told the 660Ti is a better card”

            You could have saved yourself the typing and just said this in the first place.

            • l33t-g4m3r
            • 7 years ago

            I did. I clearly said think for yourself.

            • derFunkenstein
            • 7 years ago

            What’s amusing is that you THINK you said “think for yourself” and then you ACTUALLY said “AMDAMDAMDAMDAMDAMDAMD”

            • MrJP
            • 7 years ago

            I think you need to go back and read the original article and its conclusions a bit more carefully (and calmly).

            On reasons for the game choice:
            [quote=”TR”<]Of course, we have a new crop of games for the holiday season, headlined by titles like Borderlands 2, Hitman: Absolution, Sleeping Dogs, and Assassin's Creed III. AMD's newfound aggressiveness means many of these games are part of its Gaming Evolved program, so they should run very well on Radeon graphics cards—and maybe, you know, not so well on those pesky GeForces. In fact, accentuating its stronger ties to game developers, AMD has taken to bundling a trio of these games with its Radeon HD 7950 cards. Cramming that sort of gaming goodness into the box with a graphics card certainly changes the value equation.[/quote<]

            Results were not as expected:

            [quote="TR"<]This certainly isn't the outcome we expected going into this little exercise. Given AMD's expanded involvement with game developers and a claimed across-the-board increase in driver performance, we expected the Radeon HD 7950 to assert itself as the best choice in its class. Instead, the Radeon's performance was hampered by delays in frame delivery across a number of games.[/quote<]

            Recognition that it's the choice of games that has highlighted the issue:

            [quote="TR"<]In the end, we're left to confront the fact that the biggest change from our prior graphics reviews was the influx of new games and new test scenarios that stress the GPUs differently than before. (The transition to Windows 8 could play some role here, but we doubt it.) For whatever reason, AMD's combination of GPU hardware and driver software doesn't perform as well as Nvidia's does in this latest round of games, at least as we tested them. That's particularly true when you focus on gameplay smoothness, as our latency-focused metrics tend to do.[/quote<]

            The final 660 Ti recommendation is specifically in the context of playing these latest games:

            [quote="TR"<]Instead, we have a crystal clear recommendation of the GeForce GTX 660 Ti over the Radeon HD 7950 for this winter's crop of blockbuster games. Perhaps AMD will smooth out some of the rough patches in later driver releases, but the games we've tested are already on the market—and Nvidia undeniably delivers the better experience in them, overall.[/quote<]

            And BTW, I'm far from an Nvidia troll. The last desktop GeForce I bought was a GF4 Ti4200 and I'm now on my fifth ATI/AMD card since then. Owning a laptop that's in the process of dying due to a defective Nvidia GPU means it'll probably be a very long time until I contemplate buying anything else from them. My current desktop card is a 7950, so I have somewhat of a vested interest in seeing this issue properly investigated and put to bed.

            FWIW I whole-heartedly agree with you that the 7950 should clearly be the better choice than the 660 Ti (and even arguably the 670). The higher memory bandwidth should make it significantly the better longer-term choice, so it's a great shame that the software appears to be failing to do full justice to the hardware in the short term.

          • MadManOriginal
          • 7 years ago

          [quote<]The 7950 will obviously perform better in the long run, if that's what you're buying a card for.[/quote<] So do you let other people use your time machine? Also, Skyrim is slightly over one year old at this point, and the basic game engine is much older than that. So here we are, quite literally ‘a year down the road’, and the 7950 still has more stuttering. Do you even have a valid point, other than that you disagree with the results? (p.s.- SCIENCE!)

          • anubis44
          • 7 years ago

          This.

          It’s very important to keep things in perspective. The 7950 has 3GB of memory and a faster, wider bus. This is clearly a driver/software issue, which can be, and likely will be, remedied. The hardware shortcomings of the GTX 660 Ti, however, cannot be fixed with a driver update. Especially if you have three monitors, the 7950 is the obvious best bang for the buck. I refuse to pay $300-$400 for a 2GB video card.

        • chµck
        • 7 years ago

        You forgot to bring us cookies >:(

      • superjawes
      • 7 years ago

      I think you miss the point of these follow ups…

      On paper, the 7950 should perform much better than the 660 Ti, and traditional FPS analysis shows the 7950 ahead of the 660 Ti (at least on other sites). The original follow-up was at least partially prompted by new drivers for the Radeons that were supposed to increase performance in this season’s games (a few of which are also bundled with the 7950). Despite the drivers, TR discovered some pretty significant hiccups on the 7950 (shown in these high-speed videos), which made the 660 Ti the better choice for smoother performance.

      Simply put, conventional wisdom says that shouldn’t happen, which is what prompted several vitriolic comments against TR.

      On top of that, TR, doing the right thing, checked their testing methods to rule out any error, prompting further analysis (and the “Does the Radeon HD 7950 stumble in Windows 8?” review) and ultimately resulting in this article showing what Scott is talking about.

      The follow ups are a verification of the findings. TR published something that some considered controversial, and there was some question as to why the 7950 seemed to falter, so they checked and rechecked their results.

        • l33t-g4m3r
        • 7 years ago

        They rechecked results derived from a few select games. I’m not saying the problem isn’t real, but TR really needs to check more games, and perhaps even the hardware. You don’t make generalized accusations without comprehensive testing.

          • superjawes
          • 7 years ago

          These “few select games” are mainly taken from key holiday titles, one of these cards [b]comes with[/b] several of the tested games, and these new drivers were supposedly going to increase performance significantly. From the first “revisited” article:

          [quote]As if that weren't enough, AMD has also released Catalyst 12.11 beta drivers that promise a roughly 15% across-the-board performance increase for its 7000-series Radeons.[/quote]

          The remaining titles should also be popular this season, including Skyrim, which continues to be popular on PC (and can stress your hardware), and Assassin’s Creed III, which is another bundled title. Sure, you [i]can[/i] test more games, but at some point you have to draw a line, or else you’re just going to be adding a lot of data points without adding any real meat to the results. So what if the 7950 performs very well in older games? Shouldn’t this new card be able to handle the new games? Should gamers have to wait six months or more to play these games because drivers have to be optimized? (Answers: maybe if you only play older games, yes, and no.)

          As for hardware, I have a hard time understanding how you can say that this analysis isn’t comprehensive. From the conclusions page of the first “revisited” article:

          [quote]Our first instinct upon seeing these results was to wonder if we hadn't somehow misconfigured our test systems or had some sort of failing hardware.[/quote]

          That bit goes on to describe checking the hardware on different systems, trying different AMD drivers, and a whole new review to check whether the change to Windows 8 was giving the card issues. If that isn’t “comprehensive” enough for you, nothing is.

            • l33t-g4m3r
            • 7 years ago

            “If that isn’t ‘comprehensive’ enough”

            If they did do all that, perhaps there should have been more written about it. It’s good if they did, while it’s incredibly bad for the people arguing against doing so, because it shows their bias even when the result supports their view. All I want here is a better understanding of what’s going on, not to be reading some bash-fest from Nvidia trolls, which is exactly what we have going on in this discussion.

            • superjawes
            • 7 years ago

            [quote]...not to be reading some bash-fest from nvidia trolls, which is exactly what we have going on in this discussion.[/quote]

            I'm sorry, but I'm just not seeing that. Most of the comments I'm reading express interest in and appreciation of the TR methodology, and even those critical of AMD and the 7950 simultaneously express concern or hope that this will get fixed. Basically, even those critical comments [i]want[/i] AMD to perform well, myself included, because healthy competition is great for everyone.

            And TR has written extensively on this matter. The Win8 update provided a refresher on why TR's testing is different and what the metrics are trying to express. This footage shows what they were seeing and what the metrics were reporting: choppy, inconsistent frame delivery. The video is actually excellent in showing that the 7950 is delivering, on average, a higher frame rate than the 660 Ti, but the choppy stalls on the 7950 are much more noticeable (which is what you would probably see if running the test yourself).

    • Firestarter
    • 7 years ago

    Thank you for doing this!

    What this tells me is that the big lag spikes are most definitely an issue. It seems, though, that the big oscillations that were so different from the previous test are not visible, which is a good thing. The question I’m left with is: why do those oscillations even occur? Is it a measurement artifact? Are they really there but just not visible on a 60 Hz display?

    • anotherengineer
    • 7 years ago

    Now you need a true 120 Hz monitor 😉

      • Firestarter
      • 7 years ago

      True, but it would have to be tested with a different game, as Skyrim doesn’t really do 120 Hz.

    • kc77
    • 7 years ago

    The question should not be whether the testing itself is sound. It is. The question centers on AMD’s drivers. Is there a performance regression? Well, the answer is found within the Skyrim results from the previous 660 Ti review.

    The 660 Ti is more or less performing the same with today’s drivers as before. The performance is slightly better than before in average frame rate and in the 99th percentile. AMD is worse… by a noticeable margin. In this particular benchmark, the Radeon goes from 17.7 ms at the 99th percentile to 18.3 ms. The time spent beyond 16.7 ms grows from somewhere between 152 and 155 ms to 259 ms.

    All of that aside, the 660 Ti has always performed at least as well as a 7950, and some versions of it (like the Zotac 660 Ti) perform closer to the 670. [b]The strange thing is this was the case in the old review.[/b] I guess no one noticed that (too many data points, I guess). You can clearly see the Zotac 660 Ti in the older review is 2 or 3 FPS slower (which I don't count as being noticeable) with better 99th-percentile outcomes, both then and now. You take Nvidia's better drivers plus AMD's bad drivers and you get the outcomes found within the 660 revisited review. It shouldn't be a shock to anyone. The 660 Ti is Nvidia's sweet spot: it performs marginally slower than its two higher-priced SKUs yet costs significantly less. That SKU in particular is most likely what I'm going to replace my 460 with.
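For readers who want to reproduce the kind of figures kc77 cites, the three TR-style metrics (average FPS, 99th-percentile frame time, and time spent beyond 16.7 ms) can be computed from a list of per-frame render times in a few lines. This is a minimal sketch; the sample numbers are made up for illustration and are not TR's actual capture data.

```python
# Sketch: compute avg FPS, 99th-percentile frame time, and time spent
# beyond the 60 Hz budget (16.7 ms) from per-frame times in milliseconds.
# The sample data below is illustrative, not TR's measurements.

def frame_metrics(frame_times_ms):
    n = len(frame_times_ms)
    total_ms = sum(frame_times_ms)
    avg_fps = 1000.0 * n / total_ms
    # 99th percentile: the frame time that 99% of frames come in under.
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(n - 1, int(n * 0.99))]
    # "Time beyond 16.7 ms": for each slow frame, count only the excess
    # over the 60 Hz refresh budget, then sum the excesses.
    beyond_ms = sum(t - 16.7 for t in frame_times_ms if t > 16.7)
    return avg_fps, p99, beyond_ms

# A mostly smooth stream with two 60 ms hitches mixed in: the FPS
# average still looks healthy, but the other two metrics flag the hitches.
times = [14.0] * 98 + [60.0, 60.0]
fps, p99, beyond = frame_metrics(times)
print(f"avg {fps:.1f} FPS, 99th pct {p99:.1f} ms, {beyond:.1f} ms beyond 16.7")
```

This is exactly why the FPS average alone understates the problem: the two hitches barely dent the average but dominate the percentile and time-beyond figures.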

      • BestJinjo
      • 7 years ago

      In their previous review, the GTX 660 Ti produced lower FPS and similar latency. Not sure which TR review you are looking at where the 99th-percentile outcomes are better. It was tied.

      [url]https://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/11[/url]

      Comparing the HD 7970 GHz vs. the GTX 680, the 680 actually lost, while the 7970 tied an after-market 670 in smoothness:

      [url]https://techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/11[/url]

      Saying "you take NV's better drivers + AMD's bad drivers" kinda ignores both of those reviews, where NV tied in one and lost in the other. It could very well be that the current drivers are still unoptimized because they are betas and these games are also newer. I wouldn't start generalizing about NV's awesome drivers vs. AMD's terrible drivers when NV themselves have had issues with games like Guild Wars 2, and image-quality issues with HDAO in FC3 right now.

    • Jigar
    • 7 years ago

    OK, I confess: I thought TR had become biased against AMD. But the video clearly shows why I have always been a fan of TR reviews, and boy, was I on the wrong track recently thinking otherwise.

    Thanks once again for opening my eyes.

    • UltimateImperative
    • 7 years ago

    A possibility for testing: using something like an AJA KONA LHi or a Blackmagic card, in another PC, at 1080p. On the gaming PC, encode the frame number in the last pixel on the right of each line, using an overlay like FRAPS does. On the capture PC, record the original frame number(s) [assuming Vsync is off] and the timing info for each frame received over the HDMI link. Compile the results.

    The cards aren’t prohibitively expensive, but it seems like they won’t do 2560×1600, unfortunately.
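UltimateImperative's capture-card scheme needs a way to stamp each rendered frame with its number and recover it on the capture side. A minimal sketch of the marker encoding, assuming a 24-bit RGB pixel and a lossless capture path (compression would mangle a single marker pixel); the function names here are hypothetical, not part of any capture card's API:

```python
# Sketch: pack a frame counter into one pixel's 24-bit RGB on the render
# side, then recover it from captured video on the analysis side.
# Assumes lossless capture; function names are hypothetical.

def encode_frame_id(frame_id):
    """Return (r, g, b) carrying the low 24 bits of a frame counter."""
    frame_id &= 0xFFFFFF
    return (frame_id >> 16) & 0xFF, (frame_id >> 8) & 0xFF, frame_id & 0xFF

def decode_frame_id(rgb):
    """Rebuild the frame counter from a captured (r, g, b) pixel."""
    r, g, b = rgb
    return (r << 16) | (g << 8) | b

# With vsync off, one captured frame may contain slices of several source
# frames (tearing), which is why the original suggestion puts the marker
# at the end of each line: you can decode per scanline and log every
# frame ID present, plus the capture timestamp.
assert decode_frame_id(encode_frame_id(123456)) == 123456
```

Compiling those (frame ID, capture timestamp) pairs would give ground-truth frame delivery times at the display link, independent of what Fraps reports at Present() time.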

    • Ratchet
    • 7 years ago

    Has anyone heard anything from ATI (still not willing to call them AMD) about this? Is this something that can be fixed in drivers?

    • flip-mode
    • 7 years ago

    Interesting. The graphs told the story plenty well: they made me want a 680 Ti and I don’t even play games anymore. I do see a little more “twitching” on the Radeon, but not nearly as much as the graphs had me expecting.

    • albundy
    • 7 years ago

    It was because of my comment on that other post, wasn’t it? I didn’t mean to be an @ss about it, but I do question everything. I did notice the GeForce being a bit smoother in some instances in the 240 FPS video, and the AMD card in others. I’m just wondering why latency would be an issue running a game at four times less FPS; I would imagine it would be a curve, not a constant.

      • flip-mode
      • 7 years ago

      It’s not an issue if you don’t notice it. Lots of people notice it, though.

        • derFunkenstein
        • 7 years ago

        You’d have to be blind not to notice it. Or one of those “you can’t see more than 30fps anyway” morons. Everybody with even passable vision can tell the difference; it’s just a matter of whether or not it bothers you.

    • Laykun
    • 7 years ago

    This issue forced me to move away from my beloved Radeons to a pair of GTX 670s. I haven’t had such a fluid and awesome experience in quite some time (migrated from 3x 7970s). AMD has some serious work to do if they want to really get back into the game. I had the same problem with my 5970+5850: massive frame hitching and uneven frame delivery. 60 FPS didn’t actually feel like 60 FPS; you needed 100+ for a smooth experience.

    • Bensam123
    • 7 years ago

    As per my post in your last update, please do at least a subjective run-through with PowerTune at +20%.

    Also consider collecting power consumption data and graphing it along with each processor in the same graph, so you can accurately compare frame-rate peaks and valleys with power consumption. GPU-Z can log values to a file, and it reports power consumption as well as a bunch of other helpful statistics that could be overlaid on top of an FPS graph, such as memory usage, core clocks, and temperature. As long as the card is reporting the numbers correctly and GPU-Z is reading them correctly, I could see this being a very interesting addition.
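The overlay Bensam123 describes comes down to aligning two time series logged at different rates: a sensor log (sampled every few hundred milliseconds) and per-frame times. A minimal nearest-neighbor alignment sketch; the sample values are invented, and no claim is made about GPU-Z's actual log schema:

```python
# Sketch: align a coarse sensor log (time in ms, watts) with per-frame
# end times so power draw can be plotted against frame latency.
# All numbers below are made up; this is not GPU-Z's real format.
import bisect

def nearest_sample(sample_times, sample_vals, t):
    """Return the sensor value logged closest in time to t (ms)."""
    i = bisect.bisect_left(sample_times, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(sample_times)]
    best = min(candidates, key=lambda j: abs(sample_times[j] - t))
    return sample_vals[best]

# Sensor samples every 100 ms; frames end at arbitrary times.
sensor_t = [0, 100, 200, 300]
sensor_w = [120.0, 155.0, 150.0, 130.0]
frame_end_ms = [16.7, 33.4, 210.0, 295.0]

# Pair each frame with the nearest power reading for a combined plot.
overlay = [(t, nearest_sample(sensor_t, sensor_w, t)) for t in frame_end_ms]
```

One caveat worth noting: sensor polling at 100 ms intervals is far coarser than individual frames, so spikes shorter than the polling period would be invisible in such an overlay.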

      • Bensam123
      • 7 years ago

      -6 for suggesting a look at power draw and powertune… Booyah!

        • Krogoth
        • 7 years ago

        Fanboys going to hate!

          • Bensam123
          • 7 years ago

          I’m unsure if you’re referring to Nvidia fanbois, AMD fanbois, or haters of fanbois.

          I have some false positives going on here.

            • Bensam123
            • 7 years ago

            A lot of haterade going around these parts. Here’s something to look at if you don’t understand how PowerTune works; I’m sure TR’s initial report on it is floating around somewhere, too.

            [url]http://www.legitreviews.com/article/1488/4/[/url]

      • Bensam123
      • 7 years ago

      You guys could at least offer counterpoints as to why testing PowerTune at +20% would be a bad idea.

    • Phishy714
    • 7 years ago

    TR: 1

    TRHATESAMD: 0

    • raffriff42
    • 7 years ago

    Scott,
    Thanks for all your work. It’s inspired me to make a viewer application for Fraps benchmark files, so everyone can create their own charts more easily.
    [url]http://sourceforge.net/projects/frafsbenchview/[/url]
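For anyone curious what a viewer like this has to do first: Fraps "frametimes" logs record a cumulative timestamp per frame, so per-frame times are the differences between consecutive rows. A minimal parsing sketch; the exact CSV header shown is an assumption about Fraps' output, and the sample data is invented:

```python
# Sketch: turn a Fraps-style frametimes CSV (cumulative timestamps)
# into per-frame intervals in milliseconds. The "Frame,Time (ms)"
# layout is assumed, and the sample log below is made up.
import csv, io

def frame_times_from_fraps(csv_text):
    rows = list(csv.reader(io.StringIO(csv_text)))
    stamps = [float(t) for _, t in rows[1:]]        # skip the header row
    # Per-frame time = difference between consecutive cumulative stamps.
    return [b - a for a, b in zip(stamps, stamps[1:])]

sample = "Frame,Time (ms)\n1,0.000\n2,14.1\n3,28.0\n4,88.5\n"
print(frame_times_from_fraps(sample))   # three intervals, one ~60 ms hitch
```

From there a viewer just plots the intervals against frame number, which is exactly the kind of chart TR publishes.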

      • wizpig64
      • 7 years ago

      Neato!

      • nanoflower
      • 7 years ago

      Good job. Did a quick run-through with Fraps and used your utility to display the results. Worked quite well.

      • Chrispy_
      • 7 years ago

      <3

      • Pantsu
      • 7 years ago

      There are some good tools out there already for frame time analysis. I prefer to use Fraps Calc since it gives a lot of information and some nice graphs.
      [url]http://www.rage3d.com/board/showthread.php?t=33989203[/url]

      I see MSI Afterburner has added some sort of frame time measurement too, but I haven’t had the opportunity to test it yet.

        • Bensam123
        • 7 years ago

        Pretty cool, but according to the post they’re reporting different results. I definitely think the raw data and basic statistics are a good thing, although they try to interpret the data and give you a weighted conclusion based on it (which isn’t good).

          • Pantsu
          • 7 years ago

          Of course they’re reporting different results. One measures averages over seconds while the other measures per frame (with a filtering option to boot), so the more stutter there is, the more skewed the averaged-over-seconds measurement will be.
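Pantsu's point is easy to show numerically: a one-second window can average out to a healthy FPS figure even when it contains a hitch that per-frame measurement would immediately flag. The frame stream below is invented purely for illustration.

```python
# Made-up illustration: a per-second FPS counter vs per-frame data.
# 58 quick frames plus one 420 ms hitch still sum to exactly one second.

frames_ms = [10.0] * 58 + [420.0]   # per-frame render times, ms
window_ms = sum(frames_ms)          # 1000 ms -> exactly one FPS sample
fps_sample = len(frames_ms) * 1000.0 / window_ms
worst_frame = max(frames_ms)

print(f"{fps_sample:.0f} FPS for the second, worst frame {worst_frame:.0f} ms")
# The per-second counter reports a pleasant-looking 59 FPS, while the
# per-frame data shows one frame that sat on screen for roughly 25
# refresh intervals of a 60 Hz display.
```

The averaging window smooths the hitch into invisibility, which is why per-frame tools and averaged tools will always disagree on stuttery runs.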

        • raffriff42
        • 7 years ago

        Agree w/ Bensam123; I don’t see the point of statistical analysis. I like the spectrum analyzer though. Thank you for mentioning the program.

        Afterburner gives an average frametime over whatever hardware polling period you choose – not per individual frame like Fraps does.

        Of course it’s valid to ask if Fraps is accurate in the first place; I don’t know, but I think Scott has gone a long way to confirming with this camera test that it is. You’d have to overlay frame numbers somehow to be sure, I think.

        BTW, did you know:
        – you can resize the Bench Viewer chart window, up to full screen.
        – the mouse wheel zooms the X axis; mouse drag pans.
        – if zoomed in far enough, a tool tip appears with individual frame stats.
        – hover the mouse over the Y axis labels; the cursor changes to scissors; clicking with the scissors zooms the Y axis.

        Everyone, thx for your interest.

        • krutou
        • 7 years ago

        TechReport needs to show the ‘spectrum analyzer’ graphs in their frame latency analyses.

        Fraps Calc seems like it would automate most of the work TechReport does in their frame latency reviews. That way, they could use more than 90 seconds for their tests.

          • Pantsu
          • 7 years ago

          The problem with spectrum graphs is that they don’t work well if you want to have several of them to compare between. They’re great for pointing out a problem in one result, or differences between two results side by side, but they become unwieldy for comparing results from several cards.

      • derFunkenstein
      • 7 years ago

      Scott was talking about that on Twitter, cool stuff.

    • superjawes
    • 7 years ago

    You should definitely make more use of video like this when you can. Looking at the video for each card, I felt that the 7950 was faster (and maybe even had better resolution) than the 660 Ti, but it clearly had more significant pauses in flow, and it was much more jarring when the frame was “let loose.” But that’s really what your tests and metrics are all about, isn’t it? Quantifying the user experience.

    Oh, and feel free to give a little voice over these videos. I would love videos like these to be included in reviews when you can.

    • krutou
    • 7 years ago

    Should probably make it more obvious that the videos are slowed down, e.g. “0.5x speed,” “0.25x speed,” like they do on MythBusters.

    Especially have it written in the title of the YouTube vid; I’m pretty sure a lot of people skim the article and would miss “The next video was shot at 240 frames per second, four times the speed of the display.” That sentence doesn’t make it obvious that the video was slowed down, either.

    • BestJinjo
    • 7 years ago

    Hoping the HD 7970 GHz can be tested against the GTX 680 as well. Would be interesting to see at least how those compare in the latest games with the latest drivers.

      • Deo Domuique
      • 7 years ago

      Same drivers, same system, “slightly” more powerful cards… I’d totally expect similar results. I honestly don’t think there is something more to see in a comparison between 680/7970.

        • BestJinjo
        • 7 years ago

        The HD 7970 GHz outperformed an overclocked GTX 680 in smoothness in their previous review, with the Catalyst 12.7 beta drivers:
        [url]https://techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/11[/url]

        I want to know if the new drivers from AMD compromised smoothness. One way to find out is to have more data for other cards and in other games too. Most sites now show that the HD 7970 GHz is 10-15% faster than the GTX 680 at 2560x1600, which means it should be smoother than the 680 in theory. If TR shows that the HD 7970 GHz is less smooth than the 680, then we'll have even more evidence that AMD's driver issues are at play, because these problems weren't there if you check their older reviews. I can't see why this would hurt; if anything, it would give us more data to understand why AMD cards are suddenly stuttering more when just months ago these issues weren't there:

        [url]https://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/11[/url]

        Is it related to the drivers in general, or is it related to these games? How would games like BF3, Max Payne 3, Far Cry 3, and PlanetSide 2 fare? We know that the HD 7970 GHz had no stutter issues in BF3 or Max Payne 3 in the early TR review. Revisiting some of those games could help to see whether the latest drivers introduced a problem that wasn't there with the Catalyst 12.7s.

    • tbone8ty
    • 7 years ago

    So what do you guys think is the problem with the 7950? Is it hardware related? (I thought Nvidia cards had some sort of chip or software thing that smoothed out the frames before they were sent to the display?)

    Is the 7950 lacking some software/hardware bit that causes this?

    Here’s the article you guys did with the slide I was referring to, about the frame production timeline in Nvidia GPUs:

    [url]https://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/11[/url]

    The “Never Settle” driver is coming!

      • Dposcorp
      • 7 years ago

      He said earlier in the article that he suspects a software issue.

      By the way, I am waiting for the comment of ” Unless I see the video card marks in his hands and put my finger where the nails were in the PCI express slot, and put my hand into his side where the video card fan spins, I will not believe it.”

      Maybe you could overclock both video cards until they died, and then resurrect them 🙂
      Nah, some people still wont believe.

      Thanks again for all the hard work Damage.

      • anubis44
      • 7 years ago

        No, it is definitely not a hardware issue. It can be fixed in the driver/software. The 7950 is clearly superior to the GTX 660 Ti in terms of hardware specs, and when this issue is fixed, 7950 owners will be better off than GTX 660 Ti owners. You can’t add more video memory or a wider memory bus with a driver update.

        • swaaye
        • 7 years ago

        The pretty spec sheet doesn’t tell you about nasty hardware errata or other things like poorly-sized buffers/caches. So don’t count anything out of the equation.

    • OhYeah
    • 7 years ago

    Is there any evidence yet of what is producing this behaviour? Is it simply faulty hardware, firmware, or drivers?

    Great job on the video, by the way. I had tried to guess just how bad the latency issue on some of the AMD cards was from your reviews, and now we know.

    • Mat3
    • 7 years ago

    How about running the video at the normal speed at which you move in Skyrim, instead of this slow-mo/walking stuff?

    My 6870 runs this game great. I’m not on the lookout for frame stuttering, but there’s nothing noticeable.

    • nanoflower
    • 7 years ago

    Thanks for doing this extra work, Scott. It’s obvious that the 7950 is having more issues delivering a smooth frame rate but the Geforce is also having a few issues. I wonder what options would need to be turned off/down in order to get a smooth frame rate while still maintaining a high quality image?

    • Krogoth
    • 7 years ago

    (copy and paste from the other thread)

    I suspect the problem lies with DWM 2.0 and some kind of buffering issue. Windows 7/8 share the same kernel; the only difference is DWM. Windows 7 is on version 1.1, while Windows 8 rides on version 2.0. DWM 2.0 is completely reworked. AMD’s driver team either didn’t have the time or the resources to tune their drivers for it, unlike Nvidia. I suspect this “current issue” will go away within a few months anyway.

      • Draphius
      • 7 years ago

      I doubt it will go away. It’s been here for years; people are just finally catching on to it. I want to know if anyone has tested the same cards with different cooling solutions. Once I watercooled my cards, the microstuttering I used to get went away completely in 90% of my games.

    • brucethemoose
    • 7 years ago

    Will Vsync do anything to help?

      • Damage
      • 7 years ago

      Vsync will eliminate tearing, but it would also likely make the slowdowns seem worse, since you’d be waiting a whole refresh interval before getting new frame info displayed.
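Damage's point can be sketched in a few lines: with vsync on, a frame can only appear at a refresh boundary, so its effective on-screen latency is rounded up to the next whole refresh interval. A minimal illustration of that quantization at 60 Hz (the timings are the idealized math, not measurements):

```python
# Sketch: why vsync can make hitches feel worse. With vsync on, a frame
# is shown only at the next refresh boundary, so effective latency is
# the render time rounded up to a whole multiple of the refresh interval.
import math

REFRESH_MS = 1000.0 / 60.0   # ~16.7 ms per refresh on a 60 Hz display

def vsynced_latency(render_ms):
    """Render time rounded up to the next whole refresh interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# A frame that just misses the budget (18 ms) costs two full refreshes,
# and a 60 ms hitch stretches to four.
for t in (10.0, 18.0, 60.0):
    print(f"{t:5.1f} ms rendered -> {vsynced_latency(t):5.1f} ms displayed")
```

So a frame that misses the 16.7 ms budget by a hair doubles its displayed latency, which is why a hitchy card can look even worse with vsync enabled.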

    • PopcornMachine
    • 7 years ago

    I guess you could say that a video paints a thousand graphs.

    Seems pretty conclusive. AMD needs to issue another beta driver.

    Thanks for the info.

      • cynan
      • 7 years ago

      They have. In the recent comparison between the HD 7950 and GTX 660 Ti, the 7950 was running Catalyst 12.11 beta 8. They’ve since released 12.11 beta 11.

      • lilbuddhaman
      • 7 years ago

      But this issue has existed for a long time…at least in my experience.

      And I double-dog dare you to try a similar test with CrossFire vs. SLI of the same cards; I bet the frame times on the ATI setup will be abysmal (either that, or there is something very wrong with my 6870x2 setup).

        • Draphius
        • 7 years ago

        SLI used to give me terrible microstuttering until I watercooled my cards. Since then, games run much smoother. I swear GPUs are running on the ragged edge of their heat tolerances on most cards.

          • lilbuddhaman
          • 7 years ago

          I have one watercooled with a Frankenstein H60-plus-zip-tie rig, and the other on the standard cooler. Dropped the fan-cooled one by 8°C (no more hot air blown on it) and the other by 15°C. Max temp is 59°C on them.

    • bjm
    • 7 years ago

    Scott, I know you’re trying, but the methodology here just is not sound. In each of those screen recordings, the Radeon HD 7950 was consistently on the left side of the screen. Taking the YouTube compression into account, the left side of the screen is always more choppy. Nice try, but please remake those videos with the GeForce GTX 660 Ti on the left side next time, and I guarantee the results will be different.

    SMH.

      • MFergus
      • 7 years ago

      If the YouTube video appeared drastically different from the source, I’m sure he’d say something. The video lines up with the frame latency issues AMD has in certain games.

      edit: I think bjm is just being sarcastic and I’m an idiot for not being able to tell

        • Damage
        • 7 years ago

        The video and the sources do indeed align.

          • MFergus
          • 7 years ago

          How ‘bout you put up another video on YouTube of the YouTube video and the source video side by side? :p

          • bjm
          • 7 years ago

          tch, I ‘spose my sarcasm was a bit too subtle!

          I didn’t think anybody would find the argument about the videos being on the left side the least bit convincing. But I should’ve known better; the “TR hates AMD” crowd has come up with some really silly ones, and they seemingly were serious.

          I’ll try better next time. 🙁

            • Damage
            • 7 years ago

            And here I was frustrated about people not getting my Max Payne joke. Well played, sir.

            • MFergus
            • 7 years ago

            If I wasn’t an idiot, I probably wouldn’t have taken the post seriously.

      • chuckula
      • 7 years ago

      Either bjm is crazy, or he has a particularly dry sense of sarcasm… either way I think it’s great!

        • mnemonick
        • 7 years ago

        He almost got me, I admit. I was all set to post some pedantry about how video codecs work; thank glob I scanned the replies first. 😀

      • spuppy
      • 7 years ago

      You know what’s funny is that people are actually using Youtube as an excuse in other forums.

      • HisDivineOrder
      • 7 years ago

      Really, I imagine some users wouldn’t believe them even if they showed the whole thing in person. I don’t think a YouTube video is going to prove it to everyone, but the majority will probably be reasonable.

      • RenatoPassos
      • 7 years ago

      Best. Trolling. Ever.

    • StuG
    • 7 years ago

    Any chance of doing this with a 7970 and a 680? I’m curious now if this behavior follows the Radeons up the line.

    • Zarf
    • 7 years ago

    Fantastic article! Thank you. This whole ordeal has been fascinating to read through.

    I have a 7870 in my Windows 8 rig. Is that card affected by this same issue? I don’t get 60 FPS in Skyrim (Due to a combination of mods and my aging i7 920, though that is overclocked from 2.6 to 3.6 GHz), so I don’t know if I’ll be able to tell the difference between this deal and a normally sub-60 frame rate.

    Is this an issue that can be fixed with driver updates? Is it caused by the latest drivers? Will the fix need to come from Microsoft? Is Scott Wasson actually his own twin brother? STAY TUNED TO FIND OUT!

      • MFergus
      • 7 years ago

      The newer AMD drivers slightly increased frame latency, going by the other article; this video is just an example. Frame latency varies from game to game, and not every game will show huge differences between Nvidia and AMD.

        • nanoflower
        • 7 years ago

        Also keep in mind that Scott uses settings designed to stress the card, especially the high-resolution settings. So it’s possible you won’t run into these issues if you don’t run at high resolution with all the eye candy turned on.

      • zimpdagreene
      • 7 years ago

      I agree, excellent article. I have also used AMD, from a 3850 in CrossFire to a 4870x2 to 6870 tri-fire, and have had the same problem. But this puts numbers on it and shows that there is something not right. Like everyone, I will guess that it’s the drivers and the way the software manages performance. My way-out-there guess is that there may be some money angle to it, to keep new card sales up, something the AMD Catalyst programmers know about but keep silent on. Until we can get inside the drivers and software, we won’t know. So there’s my guess; it’s just guessing from my own experience. The big mystery…

    • cynan
    • 7 years ago

    Cue outcry over [i]only[/i] having video from a [i]single[/i] title (Skyrim).

    But seriously, this really does make it easy to see the hiccups on the Radeon. The fact remains that it's still a bit odd that back in the August [url=https://techreport.com/review/23419/nvidia-geforce-gtx-660-ti-graphics-card-reviewed/7]660 Ti review[/url], both avg FPS and frame time latency between the HD 7950 and GTX 660 Ti were similar, while in the [url=https://techreport.com/review/23981/radeon-hd-7950-vs-geforce-gtx-660-ti-revisited/9]most recent comparison[/url], the GTX 660 Ti hands the 7950 its rear. The 660 Ti's current performance is more or less consistent with the August review's results, while the 7950 is substantially worse latency-wise. I think this is what has people scratching their heads.

      • nanoflower
      • 7 years ago

      You may be right about people complaining about not having video for other games but I would hope not. I had no doubt about what Scott was seeing or what it looked like as I’ve seen it before when really stressing a card. My only question would be how much of the eye candy needs to be turned off in order to get a smooth frame rate instead of those hitches in display.

      As for that older review, keep in mind that Scott switched to a different area in Skyrim, which appears to put more stress on the video cards. That may explain the issue. Or maybe it’s some enhancement AMD made to the Radeon driver; I don’t recall whether Scott tested with older drivers in this round of tests. (I can understand why he wouldn’t, as what he did certainly took enough time.)

        • cynan
        • 7 years ago

        I know the more recent Skyrim tests were done in a more open environment/map. Still, it’s a bit hard to swallow that moving to a more open environment causes the Radeon to fall apart but doesn’t affect the 660 Ti in the slightest. I suppose it’s possible, but my money is on a driver issue. I hope it’s a random compatibility/conflict issue that’s cropped up, and not something like AMD tweaking drivers to deliver maximum FPS numbers at the expense of latency…

        To appease readers even further (not that there is any expectation to do so after all of the excellent work done so far) the obvious thing would be to retest Skyrim using the same map as in the August 660 Ti review.

          • MrJP
          • 7 years ago

          Scott did just that in the second article, [url=https://techreport.com/review/24022/does-the-radeon-hd-7950-stumble-in-windows-8/9]here[/url]:

          [quote="Damage"]The Radeon HD 7950 appears to have regressed a bit with the move to newer drivers, with a slight drop in FPS averages and a corresponding increase in 99th percentile frame times. That's true even though we've switched to a Sapphire 7950 card with a 25MHz higher Boost clock in our recent tests. Meanwhile, the Zotac GTX 660 Ti has improved somewhat with newer software and the move to Windows 8. The biggest change may be in its frame time plot, which looks tighter, with less variance than in our prior review.[/quote]

            • cynan
            • 7 years ago

            I think the more relevant passage on that page you linked is this:

            [quote]The larger takeaway is that the results from both test scenarios are very likely valid. They're just different. The Radeon HD 7950 handles the graphics workload in our Whiterun loop quite competitively, essentially matching the GeForce GTX 660 Ti, with nice, low frame latencies and relatively minor variance from frame to frame. However, the 7950 doesn't process the more difficult workload in our cross-country test nearly as gracefully as its GeForce rival does.[/quote]

            Damage says that the difference in performance between maps in Skyrim is feasible. But that doesn't mean he actually retested using the Whiterun map that was used in the 660 Ti review in August. Is this just conjecture, or did he actually retest the Whiterun map?

            At the end of the day, it's perhaps not worth going through the bother. But it would be interesting, seeing as Damage has gone through all this trouble so far.

            • MrJP
            • 7 years ago

            Read the whole page again. He clearly did re-run the Whiterun benchmark specifically for this article with both the 7950 and 660 Ti, because otherwise he wouldn't have been able to make the comments he did about the effect of the latest drivers (which weren't available back in August). That's what the table at the bottom of the page shows.

            • cynan
            • 7 years ago

            Whoops. I somehow missed the teeny tiny paragraph at the top (where it does indicate that Whiterun was retested). I wish I could minus myself down for that last post.

            So it really does seem to be something funny with the recent drivers…

      • MadManOriginal
      • 7 years ago

      The purpose of this is to validate the testing method in general rather than to rerun the same tests that they ran already.

        • cynan
        • 7 years ago

        What testing method? The high-speed camera? What is there to validate? The only reason the differences in smoothness are as pronounced as they are (lending validity to this method) is that the 7950 performed as poorly as it did. Determining why this happened (whether it is limited to specific driver releases) and whether it shows up in a few games or most of them is what interests me (particularly if I were in the market for a new graphics card).

          • MadManOriginal
          • 7 years ago

          No, the frame-time testing and data, and whether they're really representative of what you see on the screen.
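For readers curious what frame-time testing buys over a plain FPS average, here's a minimal, hypothetical sketch (not TR's actual tooling; the frame times are made up to echo the article's numbers): a run with a handful of 60 ms hitches can still post a healthy-looking FPS average, while the 99th-percentile frame time exposes the spikes.

```python
# Illustrative sketch only: why an FPS average can hide hitches that a
# 99th-percentile frame time reveals. All numbers below are invented.

def fps_average(frame_times_ms):
    """Average FPS over the run: total frames divided by total seconds."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

def percentile_99(frame_times_ms):
    """99th-percentile frame time: 99% of frames complete this fast or faster."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(round(0.99 * len(ordered))) - 1)
    return ordered[idx]

# 985 smooth frames at ~14 ms, plus 15 hitches at 60 ms
times = [14.0] * 985 + [60.0] * 15
print(f"average FPS: {fps_average(times):.0f}")        # ~68 FPS, looks fine
print(f"99th pct frame time: {percentile_99(times):.0f} ms")  # 60 ms: the hitches dominate
```

The average barely registers the hitches, which is exactly why a 69 FPS average and visible stutter can coexist in the same run.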

      • Kaleid
      • 7 years ago

      Go back even further… January:
      [url]https://techreport.com/r.x/radeon-hd-7970/skyrim-beyond.gif[/url] [url]https://techreport.com/review/22192/amd-radeon-hd-7970-graphics-processor/9[/url]

    • ShadowTiger
    • 7 years ago

    This is a really cool article. It makes it really easy to see the difference in smoothness, though I'm not sure it would really bother me at normal speed, since I usually don't get anywhere near 60 FPS in recent releases.

    • chuckula
    • 7 years ago

    NVIDIA IS IN CAHOOTS WITH THE CAMERA MAKER! IT’S ALL A LIE!

      • Deanjo
      • 7 years ago

      It’s their new “The Way It’s Meant To Be Recorded” campaign.

      • HisDivineOrder
      • 7 years ago

      Wrongo.

      In fact, it's a conspiracy run by nVidia, aided by the manufacturers of displays and cameras, by YouTube, and by every web browser. Secretly, nVidia added tech to every browser (with the cooperation of all the browser makers) that detects a signal sent by secret code the display manufacturers were paid to install into every display ever made. Displays only transmit the signal when an AMD card is driving them. The signal is detected by the camera makers and/or YouTube, which add choppiness to any part of the video determined to show an AMD-driven display.

      So you see, nVidia is everywhere. Their code is in everything. For this, they paid all the secret conspirators in free 660 Tis. One for each. TechReport was hired by nVidia to spring the trap and catch AMD flatfooted, but this is all part of an elaborate distraction: the real purpose was to enable an nVidia strike team to infiltrate the (now rented) AMD HQ and kidnap (aka "rescue") anyone left in the building who can actually program. They know that much of the building is now empty, so they needed a distraction to keep the few remaining AMD employees busy.

      Even now, sleep gas is being spread through the ventilation system…

        • derFunkenstein
        • 7 years ago

        Breathe deep and relax.

        • MadManOriginal
        • 7 years ago

        You’re making it much too complicated. The simple fact is that The Matrix we live in runs on nVidia GPUs, ipso facto nVidia GPUs run everything better!

      • Deo Domuique
      • 7 years ago

      It's funny and all, but don't push it too far, guys. This way you make Nvidia seem like a saint. It's O.K., there certainly is a problem on AMD cards. I have a 7950 and I'm extremely sensitive to micro-stuttering, tearing, and such, and I agreed with these articles from the first moment, without even reading them. But let's hold back a bit, because we have only two options and we can't afford to lose either of them. We're PC users, and mocking each other isn't the best we can do.

      All I'm trying to say is, it would be great if all PC users could stick together against those few corporations.

      Also, AMD currently has other serious problems, like black artifacts in DX9 games, while Mozilla is blocking Catalyst drivers on Windows 8. Something we should start looking into.

        • MFergus
        • 7 years ago

        I don't see anybody saying Nvidia is a saint. People are just having fun at fanboys' expense. Nobody is ripping on people just for having an AMD card.

          • superjawes
          • 7 years ago

          And a lot of us are taking this and saying that we WANT AMD to come back and kick ass.

      • Arclight
      • 7 years ago

      What if I told you…
      there is no camera.

    • Ryu Connor
    • 7 years ago

    Clearly your high speed camera was made by NVIDIA.

    /sarcasm
