Poll: What’s the resolution and refresh rate of your gaming monitor?

One of the hottest topics in the wake of AMD's Ryzen CPU launch has been the appropriate monitor resolution at which to test CPU performance. AMD claims that the trend among high-end systems is toward higher-resolution displays like 2560×1440 and 4K, making CPU performance less relevant for gaming. The Steam hardware survey says otherwise: that more people play on 1920×1080 displays than all higher-resolution gamers combined. That's before we get into the plethora of refresh rates that are available on high-end gaming monitors today.

With all that in mind, we figure the best course of action is to ask the TR audience. What's the resolution and refresh rate of your primary gaming display?

Comments closed
    • pyron83
    • 3 years ago

    1680×1050 @120 hz (2233RZ 😎 )

    but had to vote “Less than 1920×1080 and 60 Hz” to fit somewhere 😀

    • mcarson09
    • 3 years ago

    One more vote for cheese cause YOU suck.

    My resolution is 2560×1600 and it’s better than 2560×1440!

    16:10 >16:9.

    • mkk
    • 3 years ago

    While my 1440p 144Hz with Freesync is being repaired, I found out that first person gaming on a large 60 Hz screen with VSync on now gives me mild feelings of motion sickness. Thanks, PCMasterRace.

    • jensend
    • 3 years ago

    What hardware or resolution is most popular is not the point. Heck, over 20% of Steam users are on integrated graphics and the Intel HD 4000 is one of the most popular GPUs, but I doubt you’ll be including it for comparison in your 1080 Ti review.

    The point is to test combinations that make sense and overall setups that will be popular. In today’s market, for most games, it may not make sense to test a $500 CPU and a $600 GPU at the resolution of a $100 monitor. 1080p is the most popular resolution, but it’s commonly paired with sub-$200 GPUs and CPUs, which are plenty capable of providing good performance at that resolution.

    In days of yore, when reporting average FPS etc, using low resolutions to force a CPU bottleneck was largely a proxy for seeing how CPUs would handle the game’s most processor-intensive moments. That purpose is much better served by using sensible resolutions and frametime analysis.
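
    A toy illustration of that point, with made-up frame-time numbers rather than anything from an actual review: two CPUs can post identical average FPS while one of them hitches badly, and only the tail of the frame-time distribution shows it.

        # Hypothetical frame-time logs (milliseconds); the numbers are invented for illustration.
        def summarize(frame_times_ms):
            """Average FPS, 99th-percentile frame time, and time spent past 16.7 ms."""
            total_s = sum(frame_times_ms) / 1000.0
            avg_fps = len(frame_times_ms) / total_s
            p99 = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms)) - 1]
            beyond = sum(t - 16.7 for t in frame_times_ms if t > 16.7)
            return avg_fps, p99, beyond

        cpu_a = [10.0] * 1000                    # steady 10 ms frames
        cpu_b = ([8.0] * 24 + [58.0]) * 40       # mostly 8 ms frames with periodic 58 ms hitches

        for name, times in (("CPU A", cpu_a), ("CPU B", cpu_b)):
            fps, p99, beyond = summarize(times)
            print(f"{name}: {fps:.0f} avg FPS, 99th pct {p99:.0f} ms, {beyond:.0f} ms past 16.7 ms")

    Both come out to 100 FPS on average, but the second one spends over a second and a half of a ten-second run past the 16.7 ms mark.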

    If CPUs would not be the bottleneck in realistic setups, this is important for consumers to know when making their purchasing decisions. Setting up an artificial scenario results in a synthetic test masquerading as a real-world test.

    If you expect more of a CPU bottleneck in the future as GPUs improve, that provides some marginal justification for tilting tests slightly towards a CPU bottleneck now. But future games will have different performance profiles too.

      • Laykun
      • 3 years ago

      I like how you think that consumers are all capable of making logical pairings. People buy $5,000 TVs then put a crappy sound bar on it. We get a lot of support requests for our game where users build machines with 16gb of ram and install 32-bit Windows on it. It’s not a far stretch to say that there’s quite a few people out there on 1080p monitors with super high end video cards, and in fact monitors are one of the things I find people want to upgrade the least, next being PSUs. Basically your assumption that someone with a 1080p monitor is not going to buy a 1080 Ti is false, and I’d almost wager that they’ll be the most common buyer (particularly with 144hz+ 1080p monitors out there).

        • jensend
        • 3 years ago

        One shouldn’t waste time testing unbalanced setups just because someone somewhere might be stupid enough to use them. And finding a few anecdotes of people who blew their money on absurd setups doesn’t justify your claim that most high-end buyers will do so.

        TR already focuses on realistic setups by refusing to test even the 1060 at 1920×1080. It’s a 1440p-worthy card, so they compared it to its competitors at 1440p. The 1080 review was mostly at 4K, with a few 1440p thrown in. I imagine the 1080Ti review will be entirely 4K.

        They should be testing CPUs that cost ~$300 and up at 1440p as well. If you care about gaming and are capable of thought, you aren’t spending $300+ on a CPU and pairing it with a GPU TR would bother to test at 1080p (i.e. one in the $100 range).

          • Redocbew
          • 3 years ago

          You’re a bit late to the party, no? At least the other guys were on the ball about it. You’re both slow and wrong. There are plenty of other threads here and even more with the Ryzen review if you care to find out why.

            • jensend
            • 3 years ago

            Right, because how frequently you post is what shows superior reasoning 😛

            Not a single thing you’ve said makes one iota of sense. You have no evidence that performance at an artificially low resolution now is indicative of future performance at a more realistic resolution with a faster graphics card. Without such evidence, a low-res cpu-bottlenecked test is just another synthetic test.

            They already run synthetic tests. The gaming tests are supposed to be among the real-world tests, indicative of real-world performance in normal workloads. Artificially low resolutions are not normal workloads.

            Again, trying indirect ways to show CPU bottlenecks may have made sense when reviewers weren’t actually managing to capture all the relevant data about gaming performance. But any impact the CPU has on game experience will show up in the frame times. Frame time analysis exposes ways games are sensitive to CPU performance that didn’t show up in average FPS. We should be inspecting the differences it shows at 1440p etc.

            But if it doesn’t show a difference in gaming performance at realistic settings, [b<][i<]then there is no difference in gaming performance at realistic settings[/i<][/b<]. It's [i<]disingenuous[/i<] to act like a performance difference relevant to gamers exists based on a synthetic test if a realistic test shows none. Saying "well, we tested with realistic setups and didn't see a difference" isn't "hiding differences," it's reflecting the only relevant truths.

            • Redocbew
            • 3 years ago

            I didn’t say much of anything besides “go away”, and I thought that part was pretty clear. Have fun arguing with yourself. I’m all tapped out.

            • jensend
            • 3 years ago

            You’re the one who inserted yourself into my conversation, so it’s pretty rich that you’re acting like I had intruded on your space and you were just telling me to go away. All you were saying was “you’re wrong, I don’t have any actual evidence for this but because I ran my mouth in several threads that should be enough to keep anyone else from daring to speak.”

            If you’ve really finally “tapped out,” then good riddance.

          • Laykun
          • 3 years ago

          Alright, let’s look at some numbers then. Let’s cherry-pick some Steam hardware survey results and make some wild assumptions.

          Assumption 1 : People ONLY pair high-end video cards with high-end displays
          Assumption 2 : The majority of people with high end video cards have 1440p+ displays
          Assumption 3 : High-end is R9 Fury / 980Ti +

          Alright, let’s add up the percentage of people with a 1440p+ display

          2560 x 1440 1.81%
          3440 x 1440 0.22%
          3840 x 2160 0.69%

          We get 2.72% of users with a high-end display.

          Let’s add up the percentage of video cards

          NVIDIA GeForce GTX 980 Ti 1.33%
          NVIDIA GeForce GTX 1070 1.78%
          NVIDIA GeForce GTX 1080 1.03%
          Rx Fury Series 0.14%

          We get 4.28% of users.

          ONLY if our assumptions hold true do we find that 63% of users with high-end video cards have high-end displays, but this is a cherry-picked scenario and you cannot simply correlate high-end with high-end. Even so, in this perfect world you have 37% of people on sub-1440p displays who would be much more interested in 1080p benchmarks. It’s so flaky that I’m willing to bet that people on high-end cards will by majority be on 1080p displays. I know this will be the case as I sit here with a 970 on a 1440p display, but I didn’t include the 970 as the data for that is guaranteed to be so much more wildly varied. You wanted data, here’s data: the numbers with some common sense suggest that people DO build wildly varying setups and DON’T always pair hardware in a sane way.

          Also, your attitude towards customer service is pretty poor. You have to accept the reality that people DO build shitty setups and you have to deal with that in your software; it’s a reality of life.

            • jensend
            • 3 years ago

            You forgot that the card-specific stats are their market share relative to all cards of that DirectX class, not their overall market share. (Otherwise, for instance, the DX11 Intel HD 4000 would have to have 150% of market share among DX11 GPUs.) So to get a number you can compare to the resolution statistics, you need to multiply your 4.28% by the ~7/10 of users with DirectX 12-class cards. And now you have 3%, which matches the number of users with 1440p and up fairly closely.
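
            For anyone who wants to check the arithmetic, here’s the back-of-the-envelope version (the ~7/10 DirectX 12-class share is the rough figure quoted above, not an exact Steam number):

                # Figures as quoted in this thread (Steam survey percentages)
                display_share = 1.81 + 0.22 + 0.69               # 2560x1440 + 3440x1440 + 4K = 2.72% of all users

                # Card shares are reported within their DirectX class, not overall
                cards_within_class = 1.33 + 1.78 + 1.03 + 0.14   # 980 Ti + 1070 + 1080 + Fury = 4.28%
                dx12_class_share = 0.7                           # ~7/10 of users, approximate

                naive_ratio = display_share / cards_within_class        # ~0.64; the ~63% figure above, give or take rounding
                cards_overall = cards_within_class * dx12_class_share   # ~3.0% of all users

                print(f"1440p+ displays: {display_share:.2f}% of users")
                print(f"Naive ratio: {naive_ratio:.0%}")
                print(f"High-end cards overall: {cards_overall:.1f}% of users")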

            But all this is a little beside the point.

            Sure, if you’re building software you need to catch corner cases where users have absurd setups (though even there you don’t treat corner cases as the default).

            But dealing with crappy setups in software is not in the slightest what any of us have been talking about here. This isn’t “customer service.” This is [b<][i<]journalism[/i<][/b<]. TR isn't supposed to decide its coverage based on the hardware setup of some deluded crank out there. They're supposed to report what's [b<]most relevant and informative[/b<]. And the most relevant and informative material when you're claiming to do real-world testing is to test the most sensible real world scenarios.

    • Firestarter
    • 3 years ago

    1920×1080 120hz here. I would’ve upgraded to a 2560×1440 144+hz display already if not for that proprietary gsync stick that Nvidia still has up their collective digestive tracts

      • freebird
      • 3 years ago

      There are plenty of 2560×1440 @ 144Hz monitors out there now with adaptive-sync. I bought a BenQ 2730Z many moons ago… and they range from approx. $400 & up, usually for 27″. I’ve seen some as cheap as $350 with a Newegg sale/discount.

    • kleinwl
    • 3 years ago

    Still rocking a Dell Ultrasharp 1901FP (1280 x 1024) with a GeForce 560 Ti / Q6600. The Crew really stutters, but Borderlands, Project Cars, COD, and all the new games I’ve installed are fine…

      • jessterman21
      • 3 years ago

      The Crew is really CPU-intensive – maxes my i5-3570K @ 60fps and mostly-Ultra

    • Tirk
    • 3 years ago

    The current testing premise is that by testing at a low resolution, the bottleneck can be isolated to the CPU. How often has this been checked to see if it’s an accurate premise? This guy did:

    [url<]https://www.youtube.com/watch?v=ylvdSnEbL50[/url<] (~20min video; you can skip to the ~5min mark to get to the juicier stuff if you want)

    Now I don't completely agree with everything in the video, but it does bring up some very interesting questions when you look at whether the low-resolution premise pans out over time, which the video contends it does not. I do not think what is shown in the video is definitive on its own, BUT it does seem to warrant others in the industry of CPU testing looking into how this premise actually holds up. More data points testing the premise can only help to strengthen low-resolution testing or to find other, more accurate methods.

    Techreport has had a history of creating industry-changing testing methods; it seems fitting for them to lead the charge again if it's warranted. Thank you for your time, hopefully this discussion can remain constructive.

    • deruberhanyok
    • 3 years ago

    The results here show why 1080p and 1440p performance still matters for benchmarks. For a lot of people, with the performance of the > $200 video cards right now, it almost doesn’t matter what you buy above a certain price. So the question of who has the fastest video card is just academic.

    This also shows why AMD didn’t push more expensive hardware for 4k performance with the Polaris cards.

    A lot of sites like to trumpet 4k performance, but people don’t replace monitors like they do video cards. I was looking at a 4k screen a while back (Dell’s P2715Q) but ultimately decided that when I replace my existing monitor I want it to last a long time, which means I’m waiting for a 4k, 120hz, HDR capable display. I can’t imagine I’m the only one.

    • Krogoth
    • 3 years ago

    320×240 Monochrome master race here!

      • Cannonaire
      • 3 years ago

      Basic geek.

      • Khali
      • 3 years ago

      Ha! I still have one of those in the other room. It’s part of my Father’s ancient 80386 system. The one with 512K of RAM and a whopping 80 Meg hard drive in it. Oh, let’s not forget the 5 1/4 floppy drive and, to top it all off, the height of late 1980’s tech: a 3 1/2 floppy drive.

      • Pancake
      • 3 years ago

      Hmm, still have a Commodore 1084 colour monitor lying around but nothing to plug it into. To await a retro rebuild when I’m retired in the distant future…

    • Cannonaire
    • 3 years ago

    After about a decade with the ‘Sad Little Square’ (as my wife called it), I finally upgraded from 19″ 1280×1024 about two months ago.

    My new monitor is 1920×1080 144hz, and it’s a pretty big change. About 4x the viewable area for 16:9 content, and the refresh rate is a serious boon for FPS games.

    Also, this may sound silly, but I’m overjoyed to be able to see my entire Steam profile background.

    • ET3D
    • 3 years ago

    The PC that gets the most gaming is the HTPC, which is hooked to a 4K Samsung TV (don’t remember the model offhand; 60Hz with quite a bit of lag at 2160p; can do 120Hz at 1080p with little lag). It has a Radeon 5550, which can only output 1080p and most games (family games like LEGO and platformers) are run at 1080p or 720p. Planning to upgrade to a Pentium G4560 with low profile Radeon RX 460, which I’m hoping might make some 4K gaming possible.

    Edit: I may be mistaken about the 120Hz at 1080p. Looked up SUHD TVs at Samsung and didn’t find that figure. But I remember it advertised when buying the TV.

    Edit: Yeah, confirmed (not experimentally, just net information), it has a 120Hz panel but only supports 60Hz input. Annoying.

      • _ppi
      • 3 years ago

      Recent PS4 Pro testing revealed that some “smart” HDR modes on 4K TVs can introduce lag, because the TV is trying to “improve” the image somehow and obviously fails. From what I recall reading (it was a consumer-testing publication in my country), some of the “game” modes were causing it.

      Therefore, check your TV modes and settings, perhaps get it as close to vanilla as possible, and your 60Hz lag might go away.

      60Hz is 60Hz, 4K is just pixel density. Obviously, your GPU has to handle it.

    • ptsant
    • 3 years ago

    So, 1440p and 4k together are about 40% of users. That’s not bad.

    Concerning the question of 1080p benchmarking, I believe it boils down to the eternal divide between “low-level” or “pure” benchmarks, which seek to highlight the theoretical prowess of the CPU vs “high-level” or “real-life” benchmarks, which seek to reproduce the experience of the consumer.

    In my opinion, we need both. Most users are GPU-limited, not CPU-limited. I can imagine the exception of people playing competitive CS:GO at 1080p with a Titan to reduce the frame latency at <5ms. It is also relatively easy to extrapolate that if the 7700K wins at 1080p, it also wins at 1440p. But it is completely different for a prospective buyer that wants to spec a SYSTEM to know that at 1440p a change from CPU A to B will give him 3 fps and, with the same cost, a change from GPU A to B will give him 20 more fps.

    In the end, as an owner of a mid-tier GPU (RX480) and a 1440p 144Hz monitor I don’t think I can exactly quantify the differences between buying a 7700K or a Ryzen for my upgrade. I know that the 7700K is quicker, but if that difference is reduced to 3% other factors (total cost, performance in other tasks) become more relevant.

    What I’m saying is, when time allows, try to do both.

    • anotherengineer
    • 3 years ago

    Dell P2214h.

    The CDN$ dropping has kiboshed any new monitor plans, along with all the changes coming out.

    Going to wait until DP1.3/1.4 is more common, along with true 10-bit HDR, FreeSync/adaptive sync, and higher refresh rates at better pricing than now, which seems not to have gone down since it all came out almost 8 years ago!!!

    • Jigar
    • 3 years ago

    Rocking a 42″ LED TV @ 1080P @ 120Hz, feels plenty for my needs.

    • Rakhmaninov3
    • 3 years ago

    Since I depend on Karma Go for home internet (God bless them), I don’t really game anymore, so my desktop background is delicious Brie smothered in apple cinnamon sauce.

    Actually it’s various digitalblasphemy.com images, but they’re almost as sweet.

    • rechicero
    • 3 years ago

    Good idea, but why not more than one resolution? I want to know the isolation info at 1080p and the real-world info at higher resolutions :-/

    • chuckula
    • 3 years ago

    Could one of you 13 people with the 4K displays that are at > 60 Hz tell us what you are running?

      • Ninjitsu
      • 3 years ago

      Crysis

      • gfeldt
      • 3 years ago

      * ASUS Z170 PRO GAMING LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 USB 3.0 ATX Intel Motherboard
      * Intel Core i7-6700 8M Skylake Quad-Core 3.4 GHz LGA 1151 65W BX80662I76700
      * GIGABYTE GeForce GTX 1080 XTREME Video Card
      * Dell 27 Ultra HD 4k Monitor – P2715Q

        • f0d
        • 3 years ago

        but that monitor isnt over 60hz is it?

          • gfeldt
          • 3 years ago

          No, but it doesn’t need to be. I’m more interested in the pixels rather than the refresh rate.

          60hz at 4k, 120hz at 2k

      • f0d
      • 3 years ago

      i didnt know there were any 4K monitors over 60hz yet?

      • EndlessWaves
      • 3 years ago

      All high end 4K TVs have a 120hz refresh rate.

      …Admittedly they also only have HDMI 2.0 so there’s no way to feed 120fps into them but they’re still genuine 120hz 3840×2160 panels.

        • Waco
        • 3 years ago

        That’s 4K at 60 Hz, then.

          • EndlessWaves
          • 3 years ago

          No, it’s 60fps at 120hz.

            • Waco
            • 3 years ago

            …which is effectively 60 Hz.

        • Bauxite
        • 3 years ago

        The TVs are doing dumb tricks with frames and cannot accept source >60hz, period. Yay marketing!

        And even at 60hz, you might want to make sure they even work at 4:4:4, many can only do that stupid 4:2:0 thing and are no good for PC use. (4k contrast, 1080p color)

      • bill94el
      • 3 years ago

      [url<]https://www.bhphotovideo.com/c/product/1210865-REG/vizio_d_series_d50u_d1_50_class_4k.html[/url<]

      120hz w/ 980ti, i7-3770, 8GB RAM, Win7 64bit

        • Waco
        • 3 years ago

        Still 60 Hz input…

      • Bauxite
      • 3 years ago

      They are running their mouths.

    • floodo1
    • 3 years ago

    So far more than 1/5 of respondents are at greater than 60hz! While less than half of that are at 4K!

    Glad people realize how important a fast refresh rate is!

    • geekl33tgamer
    • 3 years ago

    Feels like I’ve been rocking 2560 x 1440 @ 60 Hz for an eternity (I think I picked up the screen in 2010).

    It’s 28″ IPS and I don’t feel a need to upgrade at all really. Perhaps I’ll upgrade to a wide 21:9 display with G-Sync and HDR once this screen kicks the bucket. I may be waiting a while tho…

      • Laykun
      • 3 years ago

      G-Sync + High refresh rate on a 1440p monitor is surprisingly good ( I also have 60hz 1440p monitors), I think you’d appreciate the upgrade.

    • mmp121
    • 3 years ago

    Where’s the 5760×1200 option?! (1920×1200 x 3)

    I got a triple head setup going on here…

      • geekl33tgamer
      • 3 years ago

      I’m not sure what’s rarer these days: the 3 x 1 screen layout, or the fact you’re still on the much-loved 16:10 aspect ratio.

    • SlappedSilly
    • 3 years ago

    Hmm, I need to check a few options, though one is missing.

    3440×1440 @60
    1920×1200 @60
    1080x1200x2 @90 😉

    (oh, and 1920×1080 @?? for the PS4)

      • DancinJack
      • 3 years ago

      It’s asking about your primary display for your computer, not what your PS4 outputs.

        • SlappedSilly
        • 3 years ago

        Indeed, those are the native resolutions of all the displays I use for gaming, listed in order of time spent gaming on each over the last few months.

        3440×1440 when I’m spending quality gaming time (which I picked here and would also be reflected in the Steam hardware survey). 1920×1200 when I’m just killing time or too lazy to swap monitors. 1080x1200x2 when I dive into cyberspace. 1920×1080 when I’m being a couch tuber.

      • Laykun
      • 3 years ago

      [quote<];-)[/quote<] I think you mean [;]-)

    • Dezeer
    • 3 years ago

    2560×1600 @ 60Hz, voted as 1440.

    • ColeLT1
    • 3 years ago

    2560×1440 and more than 60 Hz (144 Hz gsync S2716DGR)
    For CPU reviews, I would rather see 720p/1080p tests or we will be just staring at a GPU test.

    • DeadOfKnight
    • 3 years ago

    You all need to get in on this ultra wide 3440×1440@100Hz. Seriously, you can thank me later.

      • Anovoca
      • 3 years ago

      $$$

      • slowriot
      • 3 years ago

      I tried a 3440×1440 monitor and returned it and went with a 2560×1440 instead. Variety of issues for me. Game compatibility being a huge factor. Likewise I ultimately rather have a center monitor and a second one off to the right side vs one larger ultra-wide monitor. It was cool for a couple games though.

        • DeadOfKnight
        • 3 years ago

        While I admit 2 displays are preferable for productivity, since that is of secondary concern I much prefer ultra wide in games that support it. I do wish I had full time access to my desktop though. Unfortunately there aren’t any new high refresh/VRR monitors for the ideal setup I would have. That being a 2560×1600 main display and a 1200×1600 secondary display.

        Which has got me thinking…Maybe there is a way to trick Windows into thinking I have a 2560×1440 display and an 880×1440 display for when I am playing a game that does not support ultra wide and I want to take advantage of the extra real estate I have. Does anyone know how to do this?

          • VincentHanna
          • 3 years ago

          And then output both signals discretely on a single HDMI/Svideo cable? No. I don’t think so.

          You could always run your games in windowed mode, however.

          • Whispre
          • 3 years ago

          I use DisplayFusion to split up my 3440×1440… but it doesn’t make it appear as two physical monitors so full screen gaming still uses the entire monitor. If I do want to game and do something else on the ultra wide, I game in windowed mode.

          Love the form factor, won’t go back.

            • Redocbew
            • 3 years ago

            DisplayFusion is a nifty app. I used it for a while back when I spent more time in Windows. It’s good for the kind of things which I had always wanted Windows to do for me.

        • f0d
        • 3 years ago

        i know there are some games out there that have problems with ultrawides but in my experience they are few and far between or really old games (a lot of old games you can actually mod to work with ultrawides just fine)

        i actually dont have a single game that doesnt work with an ultrawide and the only one i knew of was overwatch but i think they patched it in

        as far as i know new games work just fine with an ultrawide

    • Cyco-Dude
    • 3 years ago

    1440×900 @ 60hz

    • davidbowser
    • 3 years ago

    So I think I may be getting caught up in the wording on this.

    My monitor is 3840 x 2160, but I don’t typically run my games at that res. I scale back to 2560×1440 or 1920×1080 to make everything buttery smooth.

      • cobalt
      • 3 years ago

      I’m in your shoes, and I went with my monitor’s actual resolution, 3840×2160, because that’s what I’d ideally like to drive it at, so that’s the kind of performance I’ll be looking for in video card reviews, for example.

    • DPete27
    • 3 years ago

    Time for a new TR Hardware Survey!!!

    • tay
    • 3 years ago

    2048×1152 @ 60 Hz Dell 2309SPW.
    About to get a 2560×1440 @ 144Hz Acer XF270HUA.

    • ultima_trev
    • 3 years ago

    4K60. Kinda wish I’d gone for a 1440P screen though; even with a GTX 1080 at 2075/5500 I have to drop down to very high settings (down from ultra) in a few titles.

      • tay
      • 3 years ago

      [url<]https://68.media.tumblr.com/8f7ff79bfb512382db7b37cdd995f971/tumblr_mjxtf3yXel1rzbvsto1_500.gif[/url<]

    • PrincipalSkinner
    • 3 years ago

    2560 x 1440 @ 144 Hz on Asus MG279Q here.
    Lovez it.

    • dragosmp
    • 3 years ago

    I’m on a 1080p / 60Hz display – have been for quite a few years now.

    …but my next upgrade will be a high-refresh-rate monitor. Whether it’ll be more than 1080p depends on what I can find in IPS / 90+Hz / budget. By then, the single-CCX Ryzen might launch and hopefully will work well with HD resolution and games that support high refresh rates.

      • DancinJack
      • 3 years ago

      I thought it was pretty clear…if your main concern is gaming, go Intel.

    • The Egg
    • 3 years ago

    I bought a 144hz 1080p monitor as a “temporary” display…..only intended for a few months until a better model came out. That’s been over a year now.

    • slowriot
    • 3 years ago

    These results don’t surprise me. I bet if you stipulated a requirement that a person has purchased a new video card in the last 3 years, the results would lean even more toward 1440P and higher.

    I think people need to back off a bit how much they use the Steam hardware stats to inform decision making. There’s a lot of problems with the data. Not the least of which is our inability to validate it or cross check it. Can anyone even link to Valve talking about how the data is collected? Does old data ever fall off the database? Or are tons of these sub 1080P resolution answers from 5 years ago?

      • Ninjitsu
      • 3 years ago

      It’s a random sample every month afaik.

    • Kretschmer
    • 3 years ago

    In a month, 3440x1440x100Hz. Currently 2560x1440x144Hz. Refresh rates (and FPS) over 90Hz are the killer app for gaming right now.

    • colinstu12
    • 3 years ago

    Really? No 2560×1600@60?

    I picked 1440p since that’s my closest choice.

      • Kougar
      • 3 years ago

      Yeah, also using 2560×1600. Shame 16:10 is going by the wayside

      • Generic
      • 3 years ago

      +1 for 2560×1600@60

      • Pancake
      • 3 years ago

      Dell U3011 Master Race represent!

        • colinstu12
        • 3 years ago

        U3014 here but close enough!

        • JustAnEngineer
        • 3 years ago

        Mine’s an UltraSharp 3007WFP. I’ve got a 2001FP at 1200×1600 beside it.

      • morphine
      • 3 years ago

      Can’t cover every single resolution, and the 16:10 weirdos need to get with the times.

      (I was one 😉 )

        • colinstu12
        • 3 years ago

        The new times suck!

        (edit: if we always ‘went with the new times’ I’d have an ugly blinged out RGBLED computer… no thanks). Moving to new things for the sake of them being new … doesn’t mean they’re any better.

        • Kougar
        • 3 years ago

        2560×1600 30″ displays are still being produced and sold by a half-dozen brands, and new models are still regularly introduced, like the Dell UP3017.

        Given this resolution has been sold on 27-32″ panels for the last decade, more than one or two people have them!

      • Ochadd
      • 3 years ago

      2560×1600 Dell U3011 @ 60hz

    • not@home
    • 3 years ago

    1600×1200 @ 60Hz. I have two old school Samsung Syncmaster 214t monitors with IPS panels. I had to replace the caps in them a few years back, but since then they have been great. I will be keeping them until they die because I hate widescreen monitors.

    • emvath79
    • 3 years ago

    1680 x 1050 @ 60 hz.

    I know. 🙁

      • Dazrin
      • 3 years ago

      At least you get to save money on graphics cards, since an RX 470/GTX 1050 Ti, or even a GTX 1050/RX 460, will run anything you want. 🙂

      I am in the same boat. Upgrade in process though.

    • ALiLPinkMonster
    • 3 years ago

    1080p w/ FreeSync, max 75 Hz, being fed by an RX 480 4GB. It’s a dream to not have to worry about mild to moderate FPS dips.

    • gecko575
    • 3 years ago

    1080p@144hz here. Though I would love to be able to push at least 2k at 120hz+.

      • TwoEars
      • 3 years ago

      I hope that’s a joke.

    • xeridea
    • 3 years ago

    I would say test at 1080P, and include a good variety of DX11 and DX12/Vulkan titles to show current and future, since the single thread limitation is temporary, and quickly dwindling. Also note that if the minimum frame rate is above 60, the differences don’t matter that much to user experience.

    • WulfTheSaxon
    • 3 years ago

    I went with 1920×1200, since I generally think of it as being closest to my 1920×1440 and 75 Hz. But in hindsight, I guess I should’ve gone with “2560×1080 and more than 60 Hz” since it has the same megapixelage and that’s what this poll is really about.

    • cynan
    • 3 years ago

    [i<]AMD claims that the trend among [b<]high-end[/b<] systems is toward higher-resolution displays like 2560x1440 and 4K, making CPU performance less relevant for gaming. The Steam hardware survey says otherwise: that more people play on 1920x1080 displays[/i<]

    To be fair, a lot of those Steam users' systems @ 1920x1080 probably don't qualify as "high-end".

    Also, I think the poll would be more inclusive if the resolution options were in bins/ranges. For example, there's no option for 2560x1600 - which is not that uncommon for 1080p+ setups. Shouldn't the resolutions be in order of least to most pixels? And since the number of pixels is really the factor of interest, how about presenting the options in pixel ranges? Eg:

    <2MP
    2-3MP
    3-5MP
    5MP-8.3MP
    >8.3MP
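
    A rough sketch of where a few common resolutions would land under those cutoffs (the bin boundaries are just the ones suggested above, nothing official):

        # Bin a resolution by megapixels, using the cutoffs suggested above
        BINS = [(2.0, "<2MP"), (3.0, "2-3MP"), (5.0, "3-5MP"), (8.3, "5-8.3MP"), (float("inf"), ">8.3MP")]

        def bin_for(width, height):
            mp = width * height / 1_000_000
            return mp, next(label for cutoff, label in BINS if mp <= cutoff)

        for w, h in [(1920, 1080), (2560, 1440), (2560, 1600), (3440, 1440), (3840, 2160)]:
            mp, label = bin_for(w, h)
            print(f"{w}x{h}: {mp:.1f}MP -> {label}")

    That puts 2560x1600 and 3440x1440 in the same 3-5MP bucket as 2560x1440, which is rather the point.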

      • JAMF
      • 3 years ago

      *does some math* Ooh, 11MP. Nice! 🙂

        • DPete27
        • 3 years ago

        I don’t suppose you’re from Wisconsin? Your username looks “familiar”

          • Anovoca
          • 3 years ago

          Can’t be, we prefer to not bother with doing math around these parts.

      • DPete27
      • 3 years ago

      1) Your comments about the rest of the Steam hardware survey are exactly on point. According to that, the “average” gamer is using a 2.3-2.9GHz quad core CPU, 8GB RAM, a 1-2GB video card, and a 1080p monitor. You know what that looks like to me? [b<]A LAPTOP.[/b<] Also look at the massive number of people that play F2P games like LoL/Dota2/CS:GO/Warframe/etc. I think that massive user base skews the results a bit toward the lower-end systems.

      2) Don’t get OCD about specifics. If you’re running 2560x1600, vote 2560x1440.

        • Ninjitsu
        • 3 years ago

        I’d disagree, laptops with quad core CPUs are usually expensive.

        Your actual typical laptop config would be:

        1.5-2.5 GHz dual core, possibly with hyperthreading, 4-8 GB of RAM, Intel HD graphics to 2GB discrete card, 768p.

        What you’ve described is basically a lower end desktop (non-K i5, Core 2 or Lynnfield) – and yeah, perhaps even some upper mid range and high end laptops.

        EDIT: and hardware outside the US is expensive 🙂

      • MrJP
      • 3 years ago

      Also need to allow for refresh rate, so perhaps bins in MP/s would make more sense?

      Getting a bit complicated…

    • JosiahBradley
    • 3 years ago

    1440p144 here. That’s 10p per Hz

      • End User
      • 3 years ago

      Is that 3440X1440 or 2560×1440? Defining one’s resolution solely by the vertical is next to useless.

        • Kretschmer
        • 3 years ago

        3440x1440x144Hz doesn’t exist yet, unfortunately.

          • End User
          • 3 years ago

          What the heck does the Hz have to do with what I was talking about?

          Edit: My bad. I get your point now.

        • JosiahBradley
        • 3 years ago

        I live in 16:9 so sorry forgot to include that. 2560x1440x144Hz IPS 27Inch w/FreeSync

        • ImSpartacus
        • 3 years ago

        I think you assume that anything ending in “p” is 16:9. Or at least I always assume that. It’s an HDTV standard for 16:9 viewing areas, so I expect it to be used in that 16:9 context.

          • End User
          • 3 years ago

          The “p” refers to progressive-scan. The “p” was important a decade+ ago from a marketing point of view when lesser 1080i HDTVs were available. The “p” has no relevance nowadays.

          • EndlessWaves
          • 3 years ago

          It’s nonsensical notation even in the HDTV world. Different aspect ratios use a fixed width and adjust the height. If you stream movies to your TV’s Netflix app then they’ll be 1920 or 3840 wide and have variable height that depends on their aspect ratio.

          It’s just a hangover from the days of analogue broadcasting where scan lines were the only discrete units.

          It’s not even any shorter than writing QHD or 25×14.

      • cynan
      • 3 years ago

      You should probably see your doctor then. (Even if it only Hz every 10th p.)

        • f0d
        • 3 years ago

        how is this not getting more upvotes
        you made me have a chuckle 😛

      • Dysthymia
      • 3 years ago

      8 Hz WAN IP.

      • freebird
      • 3 years ago

      Yeah, but is there a problem if it Hurtz every 10 times I p?

    • wiak
    • 3 years ago

    no Eyefinity3@1920×1200 60hz? boo

    • The Wanderer
    • 3 years ago

    2560×1600 59.97Hz, here. No poll option for me!

    (Though I don’t game much, either, and this system is old enough that none of its components are still shown in the graphs for TR’s current reviews.)

      • CheetoPet
      • 3 years ago

      And yet it’s still a fabulous resolution. Pushing 8 years on my 3008WFP now.

        • The Wanderer
        • 3 years ago

        Mine died (I think it’s an internal capacitor, very cheap to replace, I just haven’t gone to the trouble of ordering one and trying to do the labor), and I picked up the equivalent modern model last year sometime. I actually forget offhand what the exact model name is (it’s not visible on the front of the unit); it’s not quite as good on black levels, but seems excellent in pretty much every other way I can measure, except wake-up times.

        The original unit is still across the room, actually; I should probably dig back up what the part I needed was, and see about getting that repaired.

    • Chrispy_
    • 3 years ago

    Almost pushed 1080p60 because I suspect I do nearly half my gaming on my HDTV these days….

    • derFunkenstein
    • 3 years ago

    2560×1440 @60Hz here. I got in on the “Korean seconds” fad later in its lifetime (October 2014) with one of those Aurias you could get at Microcenter. It supports some ridiculous higher-than-native resolution that looks like garbage, but thankfully Windows and macOS see it for what it truly is and default to the native resolution.

    Aside from some very noticeable backlight bleed on totally black screens, I’m very happy with it. I’d like an IPS G-Sync monitor but they currently cost $Texas.

      • brucethemoose
      • 3 years ago

      Looking at the results, I bet quite a few of us are in that boat.

      Does yours overclock?

        • derFunkenstein
        • 3 years ago

        Not even a little bit. 🙁

    • thesmileman
    • 3 years ago

    My Aorus laptop with a 1070 has a resolution of 2880 x 1620

    • TwistedKestrel
    • 3 years ago

    Still rocking a Dell U2410, but once FreeSync 2 monitors hit the market I’ll look for a 2K display that I like

      • TwistedKestrel
      • 3 years ago

      Although it did take me more than five years to learn that you can manipulate the colour profile (SRGB vs wide-gamut) used by the Game mode by a sneaky/undocumented method. All things considered, the monitor controls on the U2410 are pretty lousy

    • superjawes
    • 3 years ago

    [quote<]What's the resolution and refresh [u<]raate[/u<] of your primary gaming display?[/quote<] You have a typo in there.

    • willmore
    • 3 years ago

    If the goal of this is to say “hey, look, people game at 1080p, so CPU performance is critical”, then you’re missing the point. Ask “Who games at 1080p and runs a 1080?” Most of us are GPU limited at 1080p rather than CPU limited.

      • derFunkenstein
      • 3 years ago

      It’s more like “what’s the best this CPU can give if nothing else is holding it back”

        • Tirk
        • 3 years ago

        But that isn’t what Jeff premised in the note above the poll, or the poll itself. He brought up that because Steam showed a lot of 1080p users, Tech Report readers should be the same, which would validate testing at 1080p. In fact, what the poll is currently displaying is that only 39% of Tech Report readers use 1080p or below, and a majority of them (61%) use a higher resolution for gaming, which contradicts the Steam survey, though bearing in mind neither statistic is gathered using scientifically rigorous means.

        I would say more testing needs to be done to show whether testing at a low resolution does indeed limit the bottleneck to the CPU only, and whether other factors could be in play that need to be accounted for. Like this: [url<]https://www.youtube.com/watch?v=ylvdSnEbL50[/url<]

        That, plus additional testing, would create a more verifiable result. I think low resolution might still have merit, but it needs to be combined with other tests or altered for more accuracy to show a more complete picture. “What’s the best this CPU can give if nothing else is holding it back” might not mean only using lower resolutions.

          • derFunkenstein
          • 3 years ago

          The point of the poll and the goal when testing don’t have to be the same.

          We wouldn’t be talking about this if AMD won across the board, because AMD is the one that made the big stink.

            • Tirk
            • 3 years ago

            I would agree, except that Jeff linked them with what he wrote above the poll. If you don’t want them linked, then your beef is with Jeff, not me.

            Throwing accusations that less testing should be done because someone somewhere has a bias toward AMD does not strengthen or weaken the case for verifying through testing whether low resolution does indeed limit the bottleneck to the CPU only. If a new testing metric were pushed by another company like Nvidia (FCAT) that puts their products in a good light, do we all of a sudden no longer care about frame times and think it’s all some big conspiracy? Let’s not sling mud till I start posting:

            “Down with all other companies! Only our AMD overlords can make our computers great again!”

            • derFunkenstein
            • 3 years ago

            He’s linking AMD’s statement that 1440p and 4K is “a trend” with the Steam hardware survey and using it as evidence that it’s not. We don’t get to filter one set of data by the other; it’s not possible to get an exact number.

            Still, the point of 1080p testing was to remove a bottleneck (the graphics card), and nothing in Jeff’s post says otherwise.

            • Tirk
            • 3 years ago

            Jeff’s post said the best move forward was to ask the TR audience, which, as of this writing, shows more gaming systems using a resolution higher than 1080p. Maybe there’s a veiled joke in what he wrote, but with all that in mind, the fact that he prefaced the poll by countering AMD’s call to test at higher resolutions with Steam’s survey and a poll of his own seems to indicate he is linking them.

            I do not wish to link them. I wish techreport and other testing sites to do some additional testing to see if low resolution testing is indeed showing the cpu isolation that they are expecting it to or if other ways of testing are warranted.

      • tipoo
      • 3 years ago

      Wut. On a CPU test you want a CPU bottleneck. On a GPU test you want a GPU bottleneck. Doing a CPU review at high resolutions where the GPU becomes the bottleneck makes less sense and shows less differentiation.

        • willmore
        • 3 years ago

        My point is that people at 1080p won’t be CPU bottlenecked, so the benchmark is meaningless. It’s a made-up benchmark masquerading as a real gaming benchmark.

          • Ninjitsu
          • 3 years ago

          When GPUs become fast enough to not be a bottleneck at 1440p, you’ll see the same results as with 1080p.

          GPUs are a year or so from becoming that fast. That’s why it’s important and relevant to today’s CPUs.

            • Redocbew
            • 3 years ago

            This seems to be an amazingly difficult point to make recently.

        • ImSpartacus
        • 3 years ago

        No, the point is to show a realistic, relevant benchmark regardless of what you’re benchmarking.

        The only exception is if you go through the painstaking process of demonstrating that a “synthetic” artificial benchmark shows results that are relevant to real-world benchmarks. Then, after doing so, it’s appropriate to utilize that non-real-world benchmark. Generally you’d only do this if the synthetic benchmark is much easier to test.

        And to date, I’ve never seen anyone ever show that <1080p low-res gaming on a high-powered desktop can be somehow extrapolated into some kind of conclusion that’s relevant to real world desktop gaming. You get a metric shit ton of baseless hypotheses, but nothing actually backing them up except speculation.

          • MrJP
          • 3 years ago

          It carries across directly. If a CPU can’t maintain the framerate you want when the GPU limitation is taken out of the equation (i.e. CPU review with powerful GPU at 1080p), then that limitation will always be there regardless of the GPU and resolution you end up trying to run.

          Here’s how you should use the reviews when choosing the components for a gaming PC:

          1. What minimum frame rate do you want to aim for? Use the CPU reviews at 1080p to find a CPU that exceeds your minimum frame rate requirement in all games. Reviews that look at individual frametimes are very useful here to highlight glitches. Leave a bit of margin because CPUs usually last for quite some time between upgrades.

          2. What resolution do you want to play at? Use the GPU reviews to find a GPU that exceeds your chosen minimum frame rate at your chosen resolution in all the games you’re interested in. Again, frame time analysis is very useful, and leave some margin to allow for future games (but perhaps not as much margin as on the CPU front).

          3. Build your PC, start eyeing up flashy new monitors with higher resolution and higher refresh rates, and then start all over again…

            • Ninjitsu
            • 3 years ago

            Absolutely this.

        • freebird
        • 3 years ago

        Where was the CPU bottleneck? In the videos I saw showing CPU utilization in tests running at 1080p, especially the comparison between the 1800X and the 7700K, NEITHER CPU was running at 100%; in fact, the 1800X was less utilized than the 7700K. So is it really a CPU bottleneck, or a software one (BIOS/OS/driver), or some limiting factor with CPU <=> GPU communications? I would agree that system-wise the AMD is still slower, and significantly so in a few games at 1080p, but is it truly a CPU issue or something else that may be addressed via a software update?

        Once again, I and most people willing to buy a 1700/1800/7700K, an X370/270 mobo, AND a 1080 or equivalent will more than likely also be playing above 1080p.

        Maybe I’m wrong and people spending $1000-1500 on a system will be satisfied playing on a $100-$150 1080p monitor, but something doesn’t sound right with that… to me.

      • thedosbox
      • 3 years ago

      There’s a difference between discovering the limits/limitations of a CPU vs measuring user experience in “realistic” scenarios.

      Ensuring the GPU is not limiting the CPU addresses the former question, while increasing the resolution addresses the latter.

      It is perfectly fine to do both, but the former has more value in the long term.

        • ImSpartacus
        • 3 years ago

        How do you demonstrate that those limits will ever show up in anything but artificially low resolution gaming?

        Like, you have to actually demonstrate that the limits WILL be in some realistic scenario. It’s not self evident that a cpu that bottlenecks at 720p with a 1080 will “automatically” bottleneck at 4K with a “1280”. You actually have to demonstrate that it will happen, THEN you get to say that such benchmarking is relevant for anything more than dick measuring.

          • Redocbew
          • 3 years ago

          By definition limits happen at the extremes. If your realistic scenario encounters them frequently, then you are realistically doing it wrong. Be that as it may, the extremes are really what we’re after here, no?

            • Tirk
            • 3 years ago

            Most testing removes outliers and does not use them to predict the curve. It’s a mistake to take the extreme measurement and use it to predict the rest of the plot. Hence why max FPS is no longer considered sufficient to show the smoothness of frame delivery.

            • Redocbew
            • 3 years ago

            True, I should have differentiated between extremes in measurement and extremes in performance there.

            • Tirk
            • 3 years ago

            If I understand ImSpartacus’s argument, he seems to be promoting testing the extremes in performance by expanding the testing conditions, rather than testing the extremes in measurement and claiming they are representative of overall performance. If that understanding is true, then it seems you two are agreeing with each other.

            • Redocbew
            • 3 years ago

            That’s how I understand it also, but where is this claim of overall performance? Maybe I’m weird, but I’m not looking for overall performance in a CPU review.

            • Tirk
            • 3 years ago

            Well, of course, if every review were written just for you there would be no issue, because the review would be only for your usage. But reviews tend to be read by more than just you, which makes evaluating them based on the correct assumptions all the more important to the readership looking to find the results in the review that are most pertinent to their CPU usage scenario. This requires a review to cover more than one usage profile or performance level.

            If testing at 1080p shows performance trends that hold across all resolutions and over time with ever-faster hardware components like GPUs, then it would indeed be a reliable isolation point. The problem is whether that assumption is correct. As others have mentioned, like in this video:

            [url<]https://www.youtube.com/watch?v=ylvdSnEbL50[/url<]

            That assumption may indeed be flawed. It doesn't mean that no tests should be run at 1080p, but it does suggest that there might be other isolation factors that need to be addressed to better describe the performance of the CPU being tested, whether it's your usage case or someone else's. I think it odd people would be so against testing whether the assumption that a lower resolution is the best way to isolate CPU performance is a valid one. It's a logical hypothesis, but one I think needs to be tested more before one can clearly state it as fact.

            • Redocbew
            • 3 years ago

            [quote<]Well of course if every review were written just for you there would be no issue because the review was only for your usage. But reviews tend to be read by more than just you[/quote<]

            Funny you should say that. I was thinking the same thing when replying to someone in the other thread. The point was not that my use case is most important, but that you often can't extrapolate total system performance from a single component, and that I haven't seen that done except by those who are all up in arms over the supposed evils of testing at 1080p.

            I get it, really. Increase the resolution and those gaps we saw between Ryzen and the other CPUs will probably go away. Clearly though there are differences in performance, and if they're not noticeable at higher resolutions, so what? Why run benchmarks at all if we're going to use methodology which doesn't expose the underlying behavior of the hardware?

            • Tirk
            • 3 years ago

            I’m not simply saying not to use 1080p, as I’ve already stated in my previous posts, but that the idea that lowering the settings and resolution will correctly isolate the CPU’s performance seems to be an industry assumption. A closer look at the data over time seems to question that assumption. It doesn’t mean don’t test at 1080p, but other methods might be required to better represent the CPU’s performance in isolation.

            • Redocbew
            • 3 years ago

            Have you ever fallen upwards? Yeah, of course dropping resolution will help to isolate the CPU. It’s not by any means the only correct way to test these things, but if that were not the case, then you’d be testing some kind of twilight zone PC which is just totally weird. Performance of Ryzen is a bit of a moving target at the moment anyway with the BIOS updates and SMT weirdness also in play, so let me ask you again: clearly there are differences in performance, so how do you know that’s not what you’re looking at?

            • Tirk
            • 3 years ago

            I’ve already linked the video that succinctly answers that very question, but here it is again if you’re interested:

            [url<]https://www.youtube.com/watch?v=ylvdSnEbL50[/url<]

            It’s at least a question that should be further tested, as I would not draw absolute conclusions from one video; hence I’ve asked for more testing on the matter.

            • Redocbew
            • 3 years ago

            The video doesn’t add anything which we haven’t already covered, but I do like the dude’s accent. 🙂

          • thedosbox
          • 3 years ago

          With 1470 votes, 39% of those who voted are running a 1920×1080 screen at various refresh rates. Given those results, I’d say testing at 1920×1080 is pretty realistic.

          And again, I said it’s OK to do both. You only seem to want to bury the lower resolution tests. I wonder why that is?

            • ImSpartacus
            • 3 years ago

            Don’t change the subject. I have no problem with 1080p gaming – I made that crystal clear.

            I asked you how to demonstrate that those “limits” will ever show up in anything but artificially low resolution gaming.

            This is absolutely critical to the entire premise of using an unrealistic benchmark. You must prove that it has some measurable link to reality. It’s not good enough to go, “oh well we just lower the resolution until the CPUs stop tying each other so the graph looks better and we can confidently announce a winner.”

            • thedosbox
            • 3 years ago

            39%

            • Redocbew
            • 3 years ago

            Conspiracy among reviewers CONFIRMED!

          • freebird
          • 3 years ago

          Don’t try talking common sense, ImSpartacus… it just gets down-rated here… 😀 I tried making that same point with the same results.

          Just like the same straw-man arguments that X% of Steam users play at 1080p, but they fail to note how many have 7700Ks/GTX 1080s and 1080p displays.

      • Prestige Worldwide
      • 3 years ago

      I game at 1080p / 120hz and have a GTX 1080.

      I need to upgrade from my i7 3820 CPU badly.

      • f0d
      • 3 years ago

      i run at 1080 with my r9-290 but i reduce settings enough so im not gpu limited at 1080
      high fps>eye candy

      • Aquilino
      • 3 years ago

      I game at 1080p with a 980ti but always downsample/AA/tweak the hell out of NV Inspector.
      Monitor resolution isn’t as indicative of required power as you might think.

    • bthylafh
    • 3 years ago

    Still getting my money’s worth out of 1680×1050@60.

      • tipoo
      • 3 years ago

      I have a monitor at that res sitting around doing nothing. I pondered getting a hiDPI screen for a gaming build, but I’m also half wondering how low you could go on a GPU to max out games at that res.

      There was a 720p gaming video I saw once that showed you could hit 60fps on an extremely budget build, lol

    • NTMBK
    • 3 years ago

    1080p60, reporting in.

    To be frank my monitor size is limited by the space I have in my desk. I can’t fit any bigger widescreen monitors in. How I wish they still made 4:3 displays. 🙁

      • boskone
      • 3 years ago

      Dell sells 1920×1200, I’m using one as my secondary display.

      • paulWTAMU
      • 3 years ago

      I want a small 4:3–think like 15″–for a secondary so bad but man, you just can’t find ’em anymore

      • Mr Bill
      • 3 years ago

      Asus PA248Q is 1920×1200; I have three in my lab and an older gaming BenQ 1920×1200 at home.

    • chuckula
    • 3 years ago

    I see a bunch of numbers in that poll but there’s only one correct answer.

      • Anovoca
      • 3 years ago

      800×600, the max resolution possible in Diablo 2 LOD?

        • derFunkenstein
        • 3 years ago

        There are reshacks that let you see more of the map. Not sure if they work on the new update, though.

    • Bomber
    • 3 years ago

    4k60 for me. Dell and the pretty awesome P2715Q made it a no-brainer to step up.

      • bhtooefr
      • 3 years ago

      That’s what I run as well, although mine has pretty bad image retention, even after trying to clear it up with the display conditioning function.

      Granted, the reason I have a 4k monitor isn’t for gaming, it’s for everything else I do with the machine. (I’ve also got an IBM T221, but that one’s not on the gaming machine – I demand all the pixels.)

    • Anovoca
    • 3 years ago

    My PC is 1440 and 60Hz but I spend more time gaming on my TV which is 3840×2160 and 60 Hz or 1920×1080 and 120 Hz . Sigh, yay for HDMI 2.0

    And in case you are wondering, yes, I actually do switch the res and refresh configuration based on what game I am playing.

    • DPete27
    • 3 years ago

    Had to vote where I’m at now 1080p @ 60Hz. Although I’m really gunning for a 1440p ~100Hz VRR monitor.

    • Anovoca
    • 3 years ago

    [quote<] My desktop background is cheese [/quote<] & potato Hz

      • Ninjitsu
      • 3 years ago

      GLaDOS?

    • Airmantharp
    • 3 years ago

    2560×1440@165Hz, reporting in.

    Also waiting for 3096×2160@144Hz+ IPS/VA HDR panels to hit the market, maybe with a Vega or Volta GPU in tow…

      • TwoEars
      • 3 years ago

      Good copy. Another 2560×1440@165Hz user here as well.

      A steady 60fps is “ok” but once you’ve tasted 100fps or so in a game like DOOM or Shadow Warrior 2 there’s no going back.

        • Airmantharp
        • 3 years ago

        I’d give up IPS and G-Sync before giving up 100Hz+, for gaming.

      • jmc2
      • 3 years ago

      4K@60 display port (use @1440p)

      Yep, waiting for same…
      4K,*HDR*,FreeSync, High refresh,
