Asus XG32VQR breaks cover with FreeSync 2 and HDR in tow

Every single time we write up a gaming monitor, there's always someone who says "oh, this monitor would be great if it had X." Well, buckle up, then. Asus just came out with the ROG Strix XG32VQR, a display with a spec list as long as a CVS receipt.

First off, there's the 2560×1440 grid of pixels across a curved 32" VA panel. The screen's specced for 450 cd/m² maximum brightness and a sweet 3000:1 contrast ratio. The real kickers are these, though: there's FreeSync 2 support on tap and a 144-Hz refresh rate. The combo is intoxicating enough already, but the display's color gamut should cover 94% of the rather wide DCI-P3 space, helping the monitor earn its DisplayHDR 400 certification.

The XG32VQR includes other niceties. There's RGB LED lighting support courtesy of Aura Sync, and the cybernetic-looking stand can project a ROG logo onto the desk. The included stand is height-, tilt-, and swivel-adjustable, and there's a two-port USB 3.0 hub along with a mini-jack output.

There's no word on pricing so far, but the existing XG32VQ currently goes for $629 at Amazon and $630 at Newegg. We'd wager that the new model's FreeSync 2 support, extra brightness, and improved color reproduction will make it ring in at substantially more.

Comments closed
    • gerryg
    • 10 months ago

    Can’t wait for the $300 LED-less version! Or the $200 LED-less ROG-less version!

    • Ninjitsu
    • 11 months ago

    oh this monitor would be great if it didn't cost over $500

    • Chrispy_
    • 11 months ago

    I'm sorry, but nobody should be buying a VA gaming panel from Samsung at the moment. This panel is known to struggle with smearing at 60Hz; many of the G2G transitions take longer than 20ms, which means it's not even fit for 50Hz (at 50Hz, each refresh lasts 20ms, so any slower transition smears across frames).

    I have a FreeSync, 48-75Hz, HDR600, AUO, 31.5″ VA panel in my current monitor. It is the same AUO panel as the one in my previous Samsung S32D850T, and in both cases the image is smear-free, even on dark transitions. The AUO panels even acquit themselves pretty respectably on the Blur Busters UFO test. Until Samsung catch up to AUO on response times, I will specifically AVOID Samsung VA, since I've yet to see in person, or read about in an independent review (Prad.de, TFTCentral), any Samsung 'gaming' panel that actually manages response times fast enough to meet its claimed refresh rate.
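
    (To make the 20ms/50Hz arithmetic above concrete, here's a minimal Python sketch; the G2G figures in it are illustrative assumptions, not measurements of any particular panel.)

        # A panel can only keep up with a refresh rate if its pixel (G2G)
        # transitions finish within one refresh interval.
        def max_clean_refresh_hz(g2g_ms: float) -> float:
            """Highest refresh rate (Hz) a given G2G response time can service."""
            return 1000.0 / g2g_ms

        # Illustrative response times, not measurements:
        for g2g_ms in (5.0, 10.0, 20.0, 25.0):
            print(f"{g2g_ms:4.1f} ms G2G -> at most {max_clean_refresh_hz(g2g_ms):5.1f} Hz")
        # A 20 ms transition tops out at exactly 50 Hz; anything slower can't
        # manage even 50 Hz cleanly, let alone a 144 Hz spec.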

      • DPete27
      • 11 months ago

      I don't disagree, but the AUO panel is a pretty narrow 48-75Hz. Samsung may not be perfect, but it's kinda the only choice right now.

      On that topic, what panels are GSync monitors using? What about HDR GSync monitors? Does NVidia have those all under lock and key through GFE contracts?

        • Ifalna
        • 11 months ago

        IIRC the 4K $2,000 beasts from Asus and Acer are currently the only monitors that have GSync HDR.

        • Chrispy_
        • 11 months ago

        My complaint with those Samsung monitors is that they can't even display 75Hz without smearing dark trails everywhere. They are 144Hz in name only.

        At least the AUO 75Hz panels deliver 75Hz very well (and some can be pushed to 90Hz+ with third-party tools).

        The new G-Sync HDR monitors use the latest AUO VA panels too. I hate to say it, but Samsung VA panels aren't really good enough for gaming. I've been bitten twice now by ignoring reviews, thinking I wouldn't care because I wasn't going to hit 144fps, but it turns out you can't ignore them; the panels are just awful even at 60fps, and they give VA technology a bad rep, because AUO and Innolux are doing great work in my opinion. I'm a fan of great contrast ratios, and when you get a decent, non-Samsung VA panel, it's much nicer to me than wishy-washy, low-contrast, bleed-prone, corner-glow IPS.

      • rudimentary_lathe
      • 11 months ago

      Unfortunately Samsung seems to be the only company that makes non-blingy gaming monitors these days.

      • Kretschmer
      • 11 months ago

      Hear, hear!

    • Srsly_Bro
    • 11 months ago

    They use a 4-5 year old game to prove their point and reinforce delusions. The AMD shilling is getting tiresome.

    The 1080Ti FE barely does 60 fps, and my FTW3 would do marginally better. This monitor is great if you play really old games where Vega can get past 60fps.

    https://www.overclock3d.net/reviews/software/hitman_2_pc_performance_review/9

    In the old and beloved Witcher 3, the Vega 64 barely breaks 70 fps:
    https://tpucdn.com/reviews/EVGA/GeForce_RTX_2080_Ti_FTW3_Ultra/images/the-witcher-3_2560-1440.png

    The V64 again can't even get 100fps:
    https://tpucdn.com/reviews/EVGA/GeForce_RTX_2080_Ti_FTW3_Ultra/images/far-cry-5_2560-1440.png

    If you have games from 2010 where the V64 breaks 100 fps and approaches 144, lmk. Your 75Hz FreeSync is plenty for now. Save your money, nerds.

      • christos_thski
      • 11 months ago

      What's wrong with variable refresh rate between 45 and 60 fps? Isn't it supposed to alleviate a lot of the choppiness at lower-than-60 framerates as well? Honestly wondering here.

        • Srsly_Bro
        • 11 months ago

        And I'm honestly wondering why you didn't bother to read the post you replied to. Answer me that, bro, because I already gave my answer. Honestly wondering here….

          • christos_thski
          • 11 months ago

          Your post is some breathless angry rant about how the monitor is useless with modern games, supposedly because Radeon cards won't break 60fps. I asked why a variable frame rate under 60fps is bad. Take a chill pill bro. Seriously.

            • Srsly_Bro
            • 11 months ago

            Not true bro. I breathe just fine. It takes a lot of work to check these goal post movers.

      • Krogoth
      • 11 months ago

      They are using "Ultra" in-game settings, which the crowd who actually cares about high framerates almost never uses.

      Ultra settings are meme-tier nonsense that exists only for die-hard videophiles and bragging rights. For the majority of modern games, Ultra is really just High without the optimization tricks: no texture compression, LOD bias disabled.

        • Srsly_Bro
        • 11 months ago

        And you move the goal post closer so you can make the goal.

        Are you going to justify low settings next, just to use a high-refresh monitor instead of just getting 1080p 144Hz in the first place?

        No matter what I say, the position of the posts will change so the narrative fits. People shouldn’t be ashamed of their graphics card and make up stories for their personal agenda.

        I bought a 1080Ti FTW3 because I wanted to use a 144Hz 2K monitor. Don't shame your graphics, bro.

          • Krogoth
          • 11 months ago

          Not moving goalposts, buddy. Just saying what the high-framerate crowd actually uses in the real world. FYI, your factory-overclocked 1080Ti cannot break the 144Hz barrier at 2560×1440 under Ultra settings with tons of AA on top in the current crop of titles. You need a 2080Ti at minimum for that.

    • Shobai
    • 11 months ago

    oh this monitor would be great if it had higher resolution. And no curve. And HDR600, minimum. And wasn’t so expensive. And was 28″ tops.

    • not@home
    • 11 months ago

    You lost me at “curved.” I have absolutely no idea why anyone would want a curved monitor.

      • Usacomp2k3
      • 11 months ago

      For anything over 27” you start getting to awkward angles when looking at the corners, so I agree with the curve on something of this size. Of note, we bought my FIL a 32” curved 1080p screen and it looked great when seated in front of it. Would recommend.

        • Chrispy_
        • 11 months ago

        Yes, you are right, but the problem only affects IPS. VA doesn’t need curvature to reduce the edge angles because it doesn’t suffer from off-angle light bleed like IPS. Even TN is better in that regard (!)

        • Kretschmer
        • 11 months ago

        I’ve owned a flat 34″ and curved 34″ IPS. The flat had better BLB and less geometry distortion.

      • CScottG
      • 11 months ago

      I like it in a 65″ “monitor” – at about 3 feet away.

      Fills peripheral vision and is more "immersive".

      • Waco
      • 11 months ago

      Agreed. Especially since aspect correction is not a thing for curved monitors, and games are designed to project onto a flat screen.

      I love my 40″ 4K monitor. It’s flat. 😛

        • Amiga500+
        • 11 months ago

        I’ve a 40″ flat and would prefer it curved a bit!

        The corners are too “off” for me.

        [all work and no games on it though.]

      • floodo1
      • 11 months ago

      Because it helps

      • Kretschmer
      • 11 months ago

      I agree. I’ve used both a 34″ flat and curved IPS as my daily driver and prefer flat.

    • christos_thski
    • 11 months ago

    This monitor would be great if it had a couple of 7-watt RMS speakers (ducks for cover, fails to avoid storm of rotten tomatoes, mud, and crap)

    what? most people have worse speakers than that and 15 watts are just fine for /stabbed to death

      • Ifalna
      • 11 months ago

      *cleans bloody knife*

      Damn heretics!

      • Usacomp2k3
      • 11 months ago

      I don’t disagree. Can’t hurt.

        • BurntMyBacon
        • 11 months ago

        I dunno. I think getting stabbed to death would hurt plenty. (O_o)

          • Ifalna
          • 11 months ago

          Only if the one doing the stabbing has no skill.
          Sneak up from behind and slide that blade through his neck separating the spine… the victim won’t know what hit it.

            • christos_thski
            • 11 months ago

            THE DESK IS SO MUCH CLEANER WITH FEWER CABLES ON MONITOR SPEAKERS gurgl gurg

            • Ifalna
            • 11 months ago

            Meh, I just hooked my computer up to my stereo. So much more fun to route the signal to the big bad boys than having two tiny squealers on the desktop. Added bonus: a cable-free desk. 😛

    • Usacomp2k3
    • 11 months ago

    Pretty. HDR400 should DIAF though.

      • Voldenuit
      • 11 months ago

      Pretty much.

      “Does this support HDR?”
      “It supports HDR400.”
      “So, you’re saying… nope.”

    • Srsly_Bro
    • 11 months ago

    Now AMD just needs to make a video card that can break the 60 frame barrier at that resolution.

    Pre-purchasing a monitor for a GPU two years out makes no sense to me, imho.

      • Krogoth
      • 11 months ago

      Ahem, you do know that Vega 56/64 can easily handle 100FPS+ at 2560×1440 and carry themselves above 60FPS in 4K gaming?

      RX 480/580 8GiB easily break the 60FPS barrier at 2560×1440.

      I think you meant breaking the 120-144FPS barrier. Vega 56/64 barely do that at 2560×1440 without AA.

        • Srsly_Bro
        • 11 months ago

        Not everyone plays csgo tho.

          • Krogoth
          • 11 months ago

          Sorry to burst your bubble. I've got a Vega 64 here and it easily handles 130-150FPS in the overwhelming majority of current titles at 2560×1440 with a 3570K at the helm, and it is mildly CPU-bound.

          Breaking the 60FPS barrier at 2560×1440 isn't really much of a milestone for performance GPUs from either camp. It is easily obtainable with mid-range SKUs, for goodness' sake.

            • sweatshopking
            • 11 months ago

            Which games? Nothing I play, with the exception of Heroes of Newerth, would get close.

            • Goty
            • 11 months ago

            I mean, just looking at TR’s own testing, the Vega56 achieves over 60 FPS average at 2560×1440 in Forza 7, Wolfenstein II, GoW 4, Deus Ex, Hitman, Rise of the Tomb Raider, GTA V, DOOM, and the Witcher 3 (or, in other words, everything but Watch Dogs 2, where it averaged 54 FPS.) The same card also achieves sub-16.7 ms 99th percentile frame times in a majority of those titles as well (I think the worst I saw was around 20 ms in Rise of the Tomb Raider), so it’s not like it’s just barely making that mark or anything, either.

            • Krogoth
            • 11 months ago

            They are also using Ultra detail with healthy amounts of AA on top (Ultra settings typically enable it by default).

            Ultra settings are basically High settings minus the optimization tricks, with LOD bias disabled, because "mah visual fidelity!" That's why there's almost no IQ difference between High and Ultra. You really can't tell the difference if LOD bias and the optimization tricks are doing their job.

            Besides, high-framerate FPS junkies almost always run their stuff at medium/high settings or some kind of mix.

            • lycium
            • 11 months ago

            > over 60 FPS average

            Yeahhh I thought the discussion was about 100hz+, ideally 144hz, though.

            So much downvoting for the poor guy, but he’s right. AMD don’t have high end GPUs atmo.

            • Krogoth
            • 11 months ago

            Yes, they do and they are currently Vega 56 and 64 SKUs.

            Unless you want high FPS and 4K, they are quite sufficient, if you can get past the higher power consumption at load. They are price-competitive now that the crypto-currency GPU craze has finally crumbled. Nvidia has faster SKUs, but they command higher price points.

            • Goty
            • 11 months ago

            The first post in this thread literally says, “Now AMD just needs to make a video card that can break the 60 frame barrier at that resolution.”

            So no, the discussion wasn’t about higher refresh rates.

            • Srsly_Bro
            • 11 months ago

            Not the first, but the first most informative.

            • rudimentary_lathe
            • 11 months ago

            I agree, 1440p is not an issue in many titles for even mid-range cards. My plan is to purchase a Navi card along with a shiny new 3440×1440, 100+Hz VA monitor for my next upgrade. That is of course contingent on Navi being a significant improvement on Polaris, and on AMD not charging an arm and a leg for it.

            • BurntMyBacon
            • 11 months ago

            Just make sure you check your monitor of choice at a third-party site that objectively tests for pixel response time, smudging, and blurring, like TFTCentral. Many VA panels (Samsung?) don't respond quickly enough to keep up with their own specified refresh rate.

          • Ninjitsu
          • 11 months ago

          lol

      • enixenigma
      • 11 months ago

      Even taking your (false) statement at face value, isn't that the exact scenario where you want FreeSync?

      • CScottG
      • 11 months ago

      In addition, there is also Looking Glass with something like an RX550 (FreeSync out) + 2070/2080/2080Ti (processing).

      • ptsant
      • 11 months ago

      With my RX480 I can play most games at 1440p with a mix of high/ultra within the FreeSync range of my monitor (48-90 Hz), meaning average 70-80 fps. That includes BF1, BF4, Witcher 3, Wolfenstein etc.

      If you are not obsessed with putting everything at ultra (which often doesn't make much of a difference), an RX580/590 can work very well for 1440p.

        • Srsly_Bro
        • 11 months ago

        And the shilling and delusions of AMD owners who can't accept reality are increasing.

        I already posted this, and again, your RX 580 is not faster than a Vega 64. And you do not get higher average frames than a V64.

        https://tpucdn.com/reviews/EVGA/GeForce_RTX_2080_Ti_FTW3_Ultra/images/the-witcher-3_2560-1440.png

        From the Vega 64 review, your card gets 44 fps and you said 70-80, so more proof of your unwillingness to live in this reality. It's not even a simple margin of error in frames from a test; you're just making things up.

        https://tpucdn.com/reviews/AMD/Radeon_RX_Vega_64/images/witcher3_2560_1440.png

        The V64 barely gets over 70 fps, so you can stop making up facts to feel better about your purchase. There is nothing wrong with getting a 1080p monitor for low-end graphics with your RX 580. Please do research and stop justifying your purchase to yourself with fake news.

          • Krogoth
          • 11 months ago

          Omitting in-game settings makes those charts practically worthless. They are just as important as knowing what resolution you are running at. Tech Report and other respectable hardware sites almost always put the in-game settings right alongside the resolution.

          The beauty of PC gaming is that you can tweak and adjust settings to your own preferences unlike consoles where you are stuck with whatever developers/artist decide to use.

          Not defending purchases or hardware here. It seems you are too hung up on ultra-high-end hardware, not double-checking minor details, and dismissive of anything that does not operate at Ultra settings. The point of those settings and tests is to tax GPUs to their breaking point. You can see how much breathing space they've got and how much tweaking you might have to do to hit that "sweet spot" of framerate/image fidelity.

          Saying that card x cannot do resolution y without providing in-game settings and AA level is at best facetious and at worst disingenuous.

            • Srsly_Bro
            • 11 months ago

            Yeah, that part is fine. The issue I have is the ego crew saying their RX580 is a 1440p card. It's like the dork at the gym trying to bench or squat too heavy while the spotters lift 80% of the weight. The ego crew needs to accept that they can max out 1080p and run mixed settings at 1440p, and disclose that. My 1080Ti FTW3 and an RX 580 are not the same at 1440p. I would like them to understand that.

            Which leads to my original post: why get a monitor whose 144Hz limit only a select few games will hit, even with the cooler used on Intel's 5.0GHz 28-core demo? They'd be better served by a 75Hz monitor, if only their egos allowed.

          • ptsant
          • 11 months ago

          Did you read the part where I said I'm running a mix of settings, typically around high, at worst mid/high?

          I'm not going to download Witcher 3 again to prove my point, but I'm currently playing Wolfenstein II, and I ran a series of OCAT captures.

          At 1440p I get the following 99th-percentile frame times in ms (averages are a bit higher):
          - high -> 13.97ms, 71.6 fps
          - uber -> 14.95ms, 66.8 fps
          - mein leben (highest) -> 16.37ms, 61.2 fps

          Almost any game can be tuned to achieve this. With the exception of BF1, which does seem to improve from high to ultra, I have not noticed a major degradation in visual quality.
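
          (A minimal Python sketch of that percentile math, for anyone curious how a 99th-percentile frametime maps to fps. It assumes an OCAT/PresentMon-style capture CSV with an MsBetweenPresents column, which is how those tools label the frame-to-frame interval; treat the exact file layout and file name as assumptions.)

              import csv

              def p99_frametime(csv_path: str, percentile: float = 99.0):
                  """Return (frametime_ms, fps) at the given percentile from an
                  OCAT/PresentMon-style capture (assumed column: MsBetweenPresents)."""
                  with open(csv_path, newline="") as f:
                      times = sorted(float(row["MsBetweenPresents"])
                                     for row in csv.DictReader(f))
                  # Nearest-rank percentile: the frametime that 99% of frames beat.
                  idx = min(len(times) - 1, int(len(times) * percentile / 100))
                  ms = times[idx]
                  return ms, 1000.0 / ms  # e.g. 13.97 ms -> ~71.6 fps, as quoted above

              # Hypothetical usage (file name is made up):
              # ms, fps = p99_frametime("wolfenstein2_high.csv")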

            • Srsly_Bro
            • 11 months ago

            I didn't, because I don't have a trash GPU that can't max out games. A mix of med/high isn't for me. Download Witcher 3 or not, the benches don't lie; only you do. More shilling and fake news.

            Accept that your card gets 30-40fps in Witcher 3 at 1440p ultra. The RX 580 is bargain-priced rn, so idk why you are sticking up for something that's around $200. The delusional V56 owners aren't that bad.

            • ptsant
            • 11 months ago

            What I’m saying is that high-ish settings get me decent frame rates and a visual quality that I find satisfying (with the exception of BF1 where ultra does look better).

            What you are saying is that you absolutely need to play at "ultra" settings. Whatever, tastes differ. Both of us are happy with our purchases and we can leave it at that.

            The rest of your post, especially the part where you say that I lie and accuse me of shilling and fake news, is a bit silly and makes you look like a 12-year-old. Grow up.

            • Srsly_Bro
            • 11 months ago

            What I’m saying is you’re moving goal posts.

            • Goty
            • 11 months ago

            You should probably start confirming your figures before you post. If you read TR's 1070 Ti review, you'll see that the 580 achieves an average of 49 FPS at 2560×1440 w/Ultra settings in The Witcher 3, with a 99th-percentile framerate of 44 FPS, so you're off by a good margin with your numbers.

            • Srsly_Bro
            • 11 months ago

            And at TPU, where I linked, it's 44fps. You're really grasping at small straws…

            • Goty
            • 11 months ago

            Which straw is smaller? A single average FPS number or full frametime data?

            I think you lose that one.

            • Srsly_Bro
            • 11 months ago

            Red herring alert. I made my initial argument off average fps. You’re introducing your own argument. I’ll let you argue with yourself over that.

            • Goty
            • 11 months ago

            And your initial arguments are also wrong. Both the one about AMD not having cards that can average 60 FPS at 2560×1440 AND the one about the 580 only managing 30-40 FPS in The Witcher 3 (because *SPOILER ALERT* 44 does not fall between 30 and 40.)

            It’s been fun.

    • oldog
    • 11 months ago

    Does this use the same Sammy panel as the 32″ CHG70? The specs look suspiciously familiar.

    https://www.samsung.com/us/computing/monitors/gaming/32--chg70-gaming-monitor-with-quantum-dot-lc32hg70qqnxza/#specs

      • Krogoth
      • 11 months ago

      Most likely. ASUS is just an ODM in the monitor world.

      • JustAnEngineer
      • 11 months ago

      The Samsung got an HDR600 rating from VESA:
      https://techreport.com/news/32974/samsung-chg-displays-are-the-first-to-net-displayhdr-600-certification

        • DPete27
        • 11 months ago

        Same panel though.
        Samsung got that extra mileage through the use of QDot.

    • homerdog
    • 11 months ago

    My GPU would be great if it supported Freesync.

      • DoomGuy64
      • 11 months ago

      There are workarounds for that, and there are sales on Vega 56, which is underrated.

        • albundy
        • 11 months ago

        Sadly, I wouldn't trade NVENC for VCE, ever.

        • Kretschmer
        • 11 months ago

        Vega56 is decent-ish, but it pales in comparison to my 1080Ti. In addition, who wants to be locked into a vendor that goes years between GPU performance increases? If your purchasing cycle doesn’t line up with their release cycle you could be waiting a long time.

          • DoomGuy64
          • 11 months ago

          Lol. Vendor lock-in? Life cycles less than a year? Yup, Nvidia’s your brand. Perfect for your PhysX and RTX gaming needs. Not to mention RTX is already unplayable. I got burned on Kepler, and never again with Nvidia’s more expensive cards.

          The question should have been thought out better, because there are so many logic holes. First off, only the most bleeding-edge enthusiasts will ever need Ti-level performance. Second, Pascal is no longer being produced, and the new hotness is overpriced 1080p RTX gaming. Third, this is a 1440p monitor. Vega56 is perfectly fine here.

          I don't need 4K gaming performance. It wasn't viable early on, was too expensive, and didn't support proper gaming features until recently. My eyesight is also much worse than when I was younger, and I couldn't care less about pixels that tiny. 1440p gaming is as far as I care to go, and the ol' 390 was actually still getting the job done. The only real issues were a few edge cases for performance and EOL driver bugs, like 144Hz disabling memory power-efficiency states. I probably wouldn't have even upgraded if AMD had put more support into the 390, as most of the performance-sucking options are worthless and easily disabled, like GameWorks and motion blur.

          So, Vega56. What's the problem? Does it do RTX, the only feature that could make it slow? No. Do I care about RTX or 4K? No. Can it handle 1440p gaming for the next few years? Definitely. It also OCs like a beast, since it is essentially a lightly cut-down Vega 64, or it can be tuned for efficiency when you don't need the speed. Pretty versatile either way, and perfect for monitors like this one. If this is the monitor you want to buy, you don't need a 1080Ti, because it clearly is 1440p, and Vega56 can easily handle that for years. There are no issues unless you are restricting yourself to Nvidia's edge cases. Which, good for you if so, but don't pretend that Vega56 can't handle 1440p, because that's bull.

            • Kretschmer
            • 11 months ago

            I explicitly said "Vega56 is decent-ish." But barring NVidia growing a heart, buying into FreeSync means that future GPU upgrades are at the mercy of AMD's upgrade cycle.

            • DoomGuy64
            • 11 months ago

            Meaning what? Unless you want raytracing, there is nothing that is going to invalidate Vega56’s performance @ 1440p.

            You keep mentioning this mythical upgrade cycle, but you don’t mention what specifically would cause a NEED for upgrading. There is none. Consoles aren’t going to do raytracing anytime soon. Developers outside of Nvidia bribery aren’t going to do raytracing, and it isn’t playable on the 2080Ti anyway.

            WHAT is the reason or need for upgrading past Vega for 1440p? Seriously. You have nothing, because there is nothing. If anything, Vega’s performance is driver and API limited, which means it will improve over time due to both driver updates and games using newer APIs. Hell, Nvidia is getting on that bandwagon due to the improvements in RTX. So essentially, Nvidia is going to start supporting the new stuff with RTX, which will ruin performance on Pascal and improve performance on Vega. A year down the road, my Vega56 will play games faster than your 1080Ti if you still even have it. Just like the 290 vs Kepler.

            If being forced to upgrade your 1080Ti to a 2080Ti is your idea of an “upgrade path” while my Vega56 still plays games fine, you’re on crack. That’s planned obsolescence, and exactly why I don’t buy Nvidia anymore, along with the vendor lock in and pricing. That said, if you actually have some insight other than single sentence talking point generalizations, then I’d like to hear it. Of course, you won’t, because you don’t have any actual insight, and are just spouting off nonsense.

            • Kretschmer
            • 11 months ago

            Have you ever used strobing? Do you think that a Vega56 can sustain 100FPS or 120FPS in most released titles over the next 3-4 years?

            No? Ok.

            • Krogoth
            • 11 months ago

            Unless you are shooting for 4K, or 2560×1440 with healthy doses of AA, Vega 56/64 are able to handle it, barring CPU limitations.

            I think some users are so hung up on ultra-high-end hardware and/or pushing the envelope that they lose perspective.

            • Srsly_Bro
            • 11 months ago

            The V56 plays 1440p according to your standards, which means lowered graphics settings compared to a 1080Ti/2080.

            Your V56 is not going to get twice as fast in a year. Several of you AMD owners are really insecure, and the evidence is clear in your and others' comments. Land of delusions. SMH

            If your V56 is faster on average than my 1080Ti in a year, you can have it. But we know, or at least those in reality know, that will not be the case. Why would you say such a thing? Look at the image: do you honestly think your card is going from 64 to over 100fps through driver improvements in a year?

            https://tpucdn.com/reviews/AMD/Radeon_RX_Vega_64/images/witcher3_2560_1440.png

            • Krogoth
            • 11 months ago

            Vega 64 will likely catch up to the 1080Ti and distance itself from the 1080 under the DX12 render path, because the Pascal architecture doesn't have the hardware for it. It has to push through with brute force (software), which inflicts a noticeable hit. Vega doesn't have this issue.

            It isn't really that much of a secret, though. It is one of the main reasons why DX12 adoption never took off. Nvidia held onto the DX11 bandwagon for their Fermi derivatives and used their clout to make game developers follow suit. That story is changing soon, though.

            Turing is Nvidia's first DX12-tier gaming GPU architecture from the ground up. They are going to use DX12 features as a ticket to force "obsolescence" on their older Maxwell/Pascal-era chips, like how they used Maxwell's color/texture compression as a means to make Fermi/Kepler obsolete for DX11 titles back in the day.

            • JustAnEngineer
            • 11 months ago

            Krogoth said, "Vega 64 will likely catch-up to 1080Ti under DX12 render path."

            That may be a bit too optimistic.
            https://www.3dmark.com/spy/5266366

            • Krogoth
            • 11 months ago

            Actually, Vega 64 and the 1080Ti are much closer to each other in that synthetic test at stock and maximum boost speeds. The 1080Ti still pulls ahead, but you need to aggressively overclock it for it to climb. Again, Pascal needs brute force to keep up in the race, while Turing completely outclasses both Vega and Pascal in that test.

    • enixenigma
    • 11 months ago

    Almost exactly what I’ve been looking for. It’s just a little large and why, oh why, did it have to be curved…

      • DPete27
      • 11 months ago

      Check out the Samsung C27HG70 for ~$400, then. I also decided 32″ was too big for my setup.

    • Phr3dly
    • 11 months ago

    This monitor would be great if it had a TB3 dock built in

      • morphine
      • 11 months ago

      Would you like a set of razors with that?
