Acer XV273K brings 4K, 144 Hz, and FreeSync together

Acer isn't saving all of its gaming-monitor fun for Nvidia graphics-card owners this morning. The company is also introducing a brand-new, high-end FreeSync display at IFA called the Nitro XV273K. This 27″ panel offers a 4K resolution, a 144-Hz refresh rate, and 90% DCI-P3 coverage, just like its G-Sync sibling. The XV273K has one other trick up its sleeve for FreeSync users, though. A feature Acer calls Visual Response Boost reduces “moving picture response time” to a claimed 1 ms.

Like the XB273K, the XV273K comes with DisplayHDR 400 certification. Gamers and content professionals alike can tune this display's color reproduction using six-axis adjustment. Unlike the XB273K, the XV273K rings in at under $1000. It'll go for $899 in the USA and €1049 in the Eurozone. Acer says to expect this display on store shelves in the fourth quarter of this year.

Comments closed
    • DoomGuy64
    • 1 year ago

    Breaking news!
    [url<]https://www.techspot.com/article/1687-freesync-on-nvidia-gpu-workaround/[/url<] You can now work around the Gsync lock by using an AMD-powered APU as a pass-through display output. They don't mention the Intel Vega APU, but I bet that would work too.

    • DeadOfKnight
    • 1 year ago

    Can someone explain why G-Sync is so much better that it’s the only VRR Nvidia will support? Besides the fact that they make money by selling chips to the monitor manufacturers. I mean, you’d think they want to sell GPUs more, and people with a Freesync display probably just aren’t going to buy a GPU from them.

      • DoomGuy64
      • 1 year ago

      The history of G-Sync is that Nvidia was originally working with VESA, then pulled out early to beat AMD. There were no video cards on the market that supported VESA VRR, so Nvidia made an FPGA buffer chip to end-run this limitation and be first to market. They then stuck with this model to create a monitor/video-card walled garden that keeps users locked into the ecosystem.

      Is Gsync better? No. They just have complete control over the monitor list, while AMD doesn’t, which gives a false impression. The whole lower-power argument, already a joke given how minuscule the difference is, is probably not even true once you add in the 24/7 extra juice pulled by the FPGA chip. The feature set is also fully baked into the FPGA, leaving little or no room for feature adaptation, unlike Freesync.

      Freesync had early adoption issues, like literally a single monitor not supporting pixel overdrive, which Nvidia lied about and claimed was across the board. Then, since Gsync had LFC built in (limited to sub-30 fps), they lied about AMD not supporting it. Technically LFC is not handled by the monitor but by the graphics driver, and the monitor needs a proper VRR range for it to operate. If you cherry-pick non-gaming monitors, you can claim those monitors don’t support LFC, but even then that’s not true, because LFC is still *partially* supported and drops to vsync on/off when running out of spec.

      The big picture issue of “why Gsync is better” has nothing to do with Gsync, because Freesync is just as good, if not better today. It has more to do with Nvidia having faster hardware, “premium gaming” aesthetics, and locking users into repeated purchases through Gsync.

      You can get a good Freesync monitor at a lower price, but VRR won’t work on Nvidia, and you will have to *slightly* compromise performance with Vega (a very overblown concern). It’s still possible to use the basic capabilities of both monitors with both manufacturers.

      Overall, whales shelling out thousands for Titan cards will buy Gsync, and that is the intended market. Just look at RTX pricing. Nvidia has completely left the mid-range value market to AMD and now caters only to “premium gaming” and compute.

      Also, the [s<]turtle neck[/s<] leather jacket guy says it's better, so it must be true. I mean, it does cost more. Costing more means better, right? Nvidia has its fingers in too many pies with OEM deals, press deals, and game-dev deals for the truth to get out. GPP, GameWorks, and whatnot. Just look at who accurately reported on the 1060 not having fully enabled ROPs. The future doesn't bode well for critique of Nvidia's practices, and it's mostly up to the consumer to wade through the muck to find the truth.

    • pogsnet1
    • 1 year ago

    Let us support FREESYNC then.

    • jts888
    • 1 year ago

    What are they doing to squeeze UHD@144Hz into DP 1.3/1.4 bandwidth? Is it DSC, 18-bit color, 4:2:2 chroma subsampling, or some out-of-spec transfer rate? AFAIK, UHD@120Hz/24bpp is the limit of DP 1.3/1.4, and I am totally OK with forgoing that last 20% refresh-rate boost for a less janky overall viewing experience.
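
    A rough sanity check of that bandwidth claim (a sketch in Python; it uses the nominal DP 1.3/1.4 HBR3 link rate and ignores blanking overhead, so real requirements run a few percent higher):

    # DP 1.3/1.4 HBR3: 32.4 Gbit/s raw across four lanes, 8b/10b coding -> ~25.92 Gbit/s usable
    HBR3_EFFECTIVE_GBPS = 32.4 * 8 / 10

    def payload_gbps(width, height, hz, bits_per_pixel):
        """Raw pixel payload, ignoring blanking and protocol overhead."""
        return width * height * hz * bits_per_pixel / 1e9

    for hz, bpp, label in [(120, 24, "UHD@120Hz, 8-bit RGB"),
                           (144, 24, "UHD@144Hz, 8-bit RGB"),
                           (144, 16, "UHD@144Hz, 8-bit 4:2:2")]:
        need = payload_gbps(3840, 2160, hz, bpp)
        verdict = "fits" if need <= HBR3_EFFECTIVE_GBPS else "does NOT fit"
        print(f"{label}: ~{need:.1f} Gbit/s -> {verdict}")

    UHD@144Hz at full 8-bit RGB overshoots the usable link rate, while dropping to 4:2:2 (or using DSC) brings it back under, which lines up with the 120Hz/24bpp ceiling mentioned above.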

    • Wirko
    • 1 year ago

    Wow. The stand is from 2015. The rain cover is from 1985. Is the monitor from 2018?

    • Chrispy_
    • 1 year ago

    Ah AMD, you made a great thing with a rich ecosystem of affordable VRR monitors.

    Where oh where are the graphics cards to drive them, exactly? Vega 56/64 straddle the 1070 Ti in performance, except that the 1070 Ti is a cool, quiet, affordable card with plenty of available stock, whilst the Vega 56 is [i<]at least[/i<] 20% more power-hungry, 20% noisier, 20% more expensive, and has iffy stock levels in a lot of regions. Even if I do want to buy Vegas and Freesync monitors, you have no product that can push 4K at common Freesync ranges, let alone 144Hz. [i<]Le sigh[/i<].

      • synthtel2
      • 1 year ago

      V56 is more like 12% overpriced right now. Either way, it’s a pittance compared to the $400 gap between these two monitors.

      If I had $1400 to spend on a GPU and monitor, I’d take something like this and V56 over the more common choice of a cheaper monitor and more graphics power, but I know my preferences on that are weird.

        • Chrispy_
        • 1 year ago

        Not weird. The graphics power will become obsolete far quicker than the monitor, so put the money into the monitor!

        • Kretschmer
        • 1 year ago

        V56 is a dog at 4K, though. Pairing one with this monitor would be blech.

          • synthtel2
          • 1 year ago

          As I said, I know my preferences on that are weird. It has a lot to do with the average recent AAA game not being very compelling; V56 would run the stuff I actually care about at 4K without trouble, settings compromises can take care of the rest, and a nice monitor is nice no matter what you’re doing on it. I’m running basically the same tradeoff one step down right now (RX 480 and 1440p144 monitor) and it’s great.

            • Spunjji
            • 1 year ago

            I’m with you here. There’s also always the option of running 1440p on the 4K display to hit its higher refresh rates and treating the resulting blurring as a sort of freebie FXAA.

            • Kretschmer
            • 1 year ago

            *Thrashes uncontrollably*

            • DoomGuy64
            • 1 year ago

            This is actually feasible if you use something like mCable, which upsamples so well that Linus recommended it. Ridiculous that GPU companies don’t bring this tech to the desktop, especially when it already exists on consoles.

        • derFunkenstein
        • 1 year ago

        Yeah, when you consider the price of a 1070 Ti + G-Sync monitor vs. Vega 56 + FreeSync, the scales tip in AMD’s favor. Problem is that you’re not going to see graphics cards that can do FreeSync and present playable framerates at 4K anytime soon.

      • ronch
      • 1 year ago

      In terms of availability, either AMD isn’t making enough of them, or distributors and retailers aren’t buying many because they don’t expect to sell many, given that Nvidia has stronger products. And those that do carry them price them higher because of the risk of putting them on the shelves and hoping they’ll sell. Heck, they know that people who really want Vega will pay extra.

      • DoomGuy64
      • 1 year ago

      They do have a Vega 56 that runs faster and on less power. It’s just a workstation card that costs $999. So you either ignore the power consumption and overclock, buy the WX model, or wait for a desktop refresh that uses the workstation-grade chips.

      Yeah, AMD clearly isn’t trying hard enough on the desktop.

      • Kretschmer
      • 1 year ago

      Thinking about this one further, does AMD even have much of a “performance” GPU roadmap? Navi is a Polaris follow-up, so you might be waiting a very long time for good frame rates at 4K.

        • DoomGuy64
        • 1 year ago

        No. There might be a Vega refresh, but that’s it afaik. If Navi has 64 ROPs, the best perf/$ would probably be buying two for CrossFire. Vega CrossFire isn’t practical. As it stands right now, Vega is the fastest “performance” GPU they have, and it performs more like an upper-midrange card. That said, Vega still has potential from its unused performance features, and hi-res gaming is possible if you don’t max out the graphics settings. 4K is mostly an efficiency problem, so the better devs optimize for the hardware, the better performance will be. 4K is the new Crysis, and a lot of games aren’t taking full advantage of the hardware. Once they do, performance will stabilize.

    • SoundFX09
    • 1 year ago

    Glad to see Acer step up to the plate with a 4K 144Hz Freesync monitor. However, after seeing the various issues the G-Sync options have had with 4K 144Hz, I’m worried that Freesync may run into similar issues or worse.

    If possible, can someone clarify the issues around 4K 144Hz adaptive sync lately? All of the articles online have been all over the place on this, and I’m not getting it at all.

      • Chrispy_
      • 1 year ago

      As far as I’m aware, G-Sync (SDR) and Freesync (1) are comparable in every way. Technically Freesync is a bit more flexible than G-Sync in terms of how you treat sync behaviour outside the VRR range, and G-Sync doesn’t support HDMI, unlike Freesync.

      Assuming the monitors are equal, Freesync and G-Sync work identically, but it’s possible to get Freesync monitors that don’t have a wide enough range to support LFC, something all G-Sync monitors must do to qualify for the name. As a rule of thumb, the G-Sync-equivalent freesync monitor needs a VRR window where the upper frequency is more than double the lower frequency, so 48-100Hz is good, 40-75Hz is not.

      Freesync 2 hasn’t been spotted in the wild yet, and there are currently only two G-Sync HDR monitors available, both of which cost more than a high-end PC and monitor should cost in total. The only ‘issues’ I’m aware of are that it’s expensive, proprietary, and requires the latest display controller to support 4K 144Hz natively. The kludge used until recently was to run 4K 120Hz at full 8-bpc RGB, or to drop chroma sampling from 4:4:4 to 4:2:2 as a cheat to get around the older DisplayPort bandwidth limitations.
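
      The LFC rule of thumb above is easy to express as a quick check (a minimal sketch of the heuristic as stated, not AMD’s actual LFC logic):

      def supports_lfc(vrr_min_hz, vrr_max_hz):
          # Frame doubling only has room to work if the top of the VRR
          # window is more than twice the bottom.
          return vrr_max_hz > 2 * vrr_min_hz

      print(supports_lfc(48, 100))  # True  -> LFC possible
      print(supports_lfc(40, 75))   # False -> no LFC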

    • tay
    • 1 year ago

    Are these monitors with a hood glossy? I want a glossy screen, but haven’t found a good one.

      • DancinJack
      • 1 year ago

      Very much doubt it. Any reason you want a glossy screen? There are some really decent (non-refracting) matte coatings these days.

        • Parallax
        • 1 year ago

        Because it’s possible to see sharp details on a glossy screen that are obscured by most (if not all) matte coatings. I’ve been looking for a good matte display but have so far always been disappointed.
        What we really need are anti-reflective coatings for the best of both worlds, but unfortunately the price seems to have scared away all the manufacturers. Then again, the $400+ spent on G-Sync could cover a lot of other things.

      • Pettytheft
      • 1 year ago

      Glossy screens are dead except on touchscreen notebooks. I have an old 1440p Crossover with no calibration controls that still looks better than my current 1440p 144Hz monitor with all the bells and whistles. I hate that they have gone out of fashion, as I always felt the colors were more vibrant on them.

    • Acidicheartburn
    • 1 year ago

    So is this also IPS? I don’t see a panel type mentioned.

      • DancinJack
      • 1 year ago

      90% DCI-P3 = IPS-type panel. HDR400 also requires 8-bit processing. So there is almost zero chance it’s TN.

        • moose17145
        • 1 year ago

        IF it is a TN panel… then it is at least one VERY high-quality TN panel…

          • DancinJack
          • 1 year ago

          It’d be the very first with that kind of color coverage. Never seen something like that with TN.

    • shank15217
    • 1 year ago

    27 inch 4K sucks, please do 4K 32 inch

      • DancinJack
      • 1 year ago

      It’s pretty tiny, and with Windows being pretty meh at scaling (still), it’s just not an option for my eyes at this point.

      • jihadjoe
      • 1 year ago

      How about 23 inch 4k?

        • derFunkenstein
        • 1 year ago

        200% scaling SHOULD be pretty decent, but with Windows it’s always a dice roll. 😆

          • Spunjji
          • 1 year ago

          I get by really well with 200% scaling on my 24″ 4K display, but it’s admittedly a bizarre experience moving a window across from the 24″ 1080p display at 100% next to it.

    • Eggrenade
    • 1 year ago

    $1299 – $899 = $400 for a G-Sync module?

    That seems a bit excessive.

      • Firestarter
      • 1 year ago

      That’s the price for a sense of pride and accomplishment

        • DoomGuy64
        • 1 year ago

        Loot boxes are a value-added, consumer-friendly feature!
        (OP BF2 reference I think)

        *sigh* Modern gaming, what have we become…

      • Airmantharp
      • 1 year ago

      Probably more than half of that is for the privilege of being able to hook it up to a GPU that might actually be able to push 4k144 😀

        • derFunkenstein
        • 1 year ago

        …someday

      • DoomGuy64
      • 1 year ago

      4K Gsync modules should cost proportionally more than the 2K ones.

      Why? Because the processing is all being done on the FPGA, and you have to install a faster chip plus a larger RAM buffer to handle it.

      Therefore, I don’t believe Nvidia is overly price-gouging on the 4K Gsync panels in comparison to the 2K panels.

      ***
      The only problem with this price increase, however, is that the majority of Nvidia fanboys don’t believe Gsync is being done on the FPGA module, even though the technical specs prove it, so they might think Nvidia is just price gouging. Which is fair enough, considering Nvidia does price-gouge everything else on a regular basis. Regardless, the more capabilities get added to monitors, the more Gsync modules are going to cost. The next-gen modules may even cost $600. All because Nvidia refuses to support Freesync, and consumers let them get away with it.

      • jensend
      • 1 year ago

      [url=https://www.youtube.com/watch?v=E_207UxnkGQ<]nV official response to consumer complaints about the cost of the G-Sync module[/url<]

      • mvp324
      • 1 year ago

      I swear I read a statement by Nvidia that the cost of G-Sync would not be so high. I think this was when they were first deploying G-Sync, because they also said you could buy the module and add it to the monitor yourself.

      • Krogoth
      • 1 year ago

      Nvidia is trying to recoup the costs of making G-Sync 1.0 exclusive to desktop/workstation GPUs. These are low-volume items, so they have to add higher margins to make up the difference.

      This could all be avoided if they would simply discard the obsolete G-Sync 1.0 and use VESA’s VRR spec over Displayport 1.2a.

      They would make even larger profits too, since that move would single-handedly kill off AMD RTG in the discrete gaming market, and nearly every AMD RTG gaming user would jump ship.

      • Suleks
      • 1 year ago

      More or less yeah. [url<]https://www.pcper.com/reviews/Graphics-Cards/ASUS-ROG-Swift-PG27UQ-27-4K-144Hz-G-SYNC-Monitor-True-HDR-Arrives-Desktop/Tea[/url<]

        • Eggrenade
        • 1 year ago

        Well, maybe $400 [i<]is[/i<] a completely reasonable price for a completely unreasonable way to get VRR.

      • Chrispy_
      • 1 year ago

      Apparently G-Sync HDR is such a kludge that the module is effectively a complete PC with its own active cooling, RAM, and FPGA processor.

      It’s a really, really dumb way to do it, but that’s the only method Nvidia is offering, so either pay up or shut up 🙁

        • Krogoth
        • 1 year ago

        None of it is really necessary. It is there to give compatibility for older GPUs (like they have the power to handle native resolution on this monitor).

          • DoomGuy64
          • 1 year ago

          Their newer GPUs still don’t support DP 1.2a (on purpose), so it is indeed necessary. You cannot retroactively enable DP 1.2a with a driver toggle switch; it must be built into the hardware. AFAIK, both DP and HDMI have made VRR optional, which means Nvidia can continue to opt out indefinitely.

          I am in no way saying Nvidia is incapable of making their cards support 1.2a, because the mobile chips clearly do, but they are deliberately not supporting it on the desktop because they are selling two products per user and it enforces vendor lock-in.

          Gsync’s FPGA also lets Nvidia keep tight control of their gaming panels, which then enables uneducated fanboys to argue Freesync is inferior because some $100 business-class monitor with a VRR range of 50-60 exists. Which is a ridiculous argument, but that’s how propaganda and shillery works. *The reality, however, is that most Gsync panels have exact Freesync equivalents made by the same company, and only differ by the cost of the FPGA module and one letter in the model name.*

          Backwards compatibility is also a joke, because that is easily solved by using a chip-powered display cable like mCable that could convert the display output from Gsync to Freesync. It wouldn’t even be as expensive as current FPGA modules, costing $100-150 tops, with knockoffs closer to $40-50. (Just look at existing 4K mCable prices.)

          What people need to do is petition mCable and all their knockoff competitors to implement Freesync conversion; once that happens, Nvidia will not be able to keep selling Gsync. Perhaps the only reason it doesn’t already exist is the threat of lawsuits and collusion, but if not, it should be possible. AFAIK, some converters are open source, so at bare minimum they should be capable of doing it once they release DP 1.2a+/HDMI 2.1 hardware.

            • DancinJack
            • 1 year ago

            FWIW, the RTX family supports 1.4a. It’s not friggin Adaptive Sync over HDMI, but it’s something!

            • DoomGuy64
            • 1 year ago

            I think they only support 1.4, though, and not 1.4a. The “a” part was never made mandatory. There are people who think Nvidia secretly included adaptive sync, but there isn’t any proof so far. Even if so, I don’t think Nvidia will enable it anytime soon; they are more likely to sell new hardware months to a year before enabling hidden features, if they ever do at all.

            edit: I have seen something saying “1.4a ready”, but the actual tech specs say, “DisplayPort 1.4, HDMI 2.0b”.

            Breaking news: Workaround now officially exists:
            [url<]https://youtu.be/kbYeN1-OYeI[/url<] [url<]https://www.techspot.com/article/1687-freesync-on-nvidia-gpu-workaround/[/url<] *Requires APU passthrough.

            • DancinJack
            • 1 year ago

            Haha what a mess.

            • Krogoth
            • 1 year ago

            Nvidia is locking its implementation of VESA’s VRR behind drivers and firmware. The silicon for mobile and desktop GPUs is identical (save for power consumption).

            The workaround proves it by tricking the drivers on the Nvidia desktop GPU into using VESA’s VRR spec. Nvidia just has to release a driver update, and possibly a firmware update, for any GPU that has DisplayPort 1.2 or newer.

            The walled garden strategy that Nvidia is trying to pull is actually counterproductive for expanding G-Sync marketshare. They make the bulk of their revenue through the sale of GPUs, not bloody monitors (the vendors take home most of the margins). Their current setup ensures that G-Sync 1.0 remains an exclusive feature of the high-end desktop market, and monitor vendors have little or no incentive to expand their line-ups.

            The walled garden strategy would only work if Nvidia actually made and sold monitors while commanding a decent marketshare of the display market.

            • DoomGuy64
            • 1 year ago

            The workaround doesn’t prove anything. It passes the output from the Nvidia card through to an AMD DisplayPort output using built-in Windows 10 features. Nvidia’s output ports are completely bypassed, and it won’t work without an AMD APU set as the output.

            *What this workaround actually proves is my mCable theory: that you can insert a 3rd-party chip to convert regular unsynced frames into a VRR-compatible format. This method would also be preferable to the APU method, as it would completely bypass drivers and allow any CPU.

            That said, RTX’s official specs are DP 1.4, but hidden below in black text on a black background, in fine print, it does say “2 – DisplayPort 1.4a Ready”

            As far as making RTX fully 1.4a capable, it would indeed require both a firmware and driver patch, and considering Nvidia recently released a firmware patch for HDR, it may be possible.

        • Freon
        • 1 year ago

          Rumors about an FPGA were around for the older modules as well. I imagine they could make a custom SoC to handle this for much cheaper, but I guess there is some calculation somewhere based on product volume. I’m not sure how many of these Gsync monitors they expect to sell before the spec changes again and they have to roll another one.

          • jts888
          • 1 year ago

          It is (was?) not only the FPGA that was arguably overbuilt, but the memory subsystem as well. They are using 6 * 512 MiB DDR4 (up from the first boards with apparently 6 * 128 MiB), which is something around 120 framebuffers of UHD@24bpp.

          If they were doing time-compressed panel scan-out, they would obviously need pretty high sustained peak bandwidth, but this is not a particularly cheap way to accomplish it.
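
          The framebuffer count above roughly checks out (a back-of-the-envelope sketch assuming tightly packed 24bpp frames with no padding or alignment overhead):

          module_mib = 6 * 512                       # 6 * 512 MiB DDR4 on the G-Sync HDR board
          frame_mib = 3840 * 2160 * 3 / (1024 ** 2)  # UHD @ 24bpp ~= 23.7 MiB per frame
          print(module_mib / frame_mib)              # ~129 full framebuffers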

      • Kretschmer
      • 1 year ago

      I’m guessing that the GSync SKUs get better panels than the FreeSync SKUs.

        • Spunjji
        • 1 year ago

        All evidence of panels that are otherwise identical besides the module seems to contradict that.

        The reason you see bad panels in FreeSync displays is because it’s effectively become a cost-free addition, so the barrier to entry is lower than G-Sync. Pay less, get the same; pay even less, get less.

    • homerdog
    • 1 year ago

    I’d like to see a head-to-head with the G-Sync model in terms of VRR effectiveness.

      • DancinJack
      • 1 year ago

      Apples and oranges, my friend.

      • moose17145
      • 1 year ago

      I was thinking I would like to see the VRR Range of each of these. I assume the G-Sync one is capable of going down to 30Hz… It would be nice if the Freesync one is capable of the same. 30-144Hz on FreeSync with an IPS panel IS possible… Nixeus managed to do it.

      That being said… idk that I would even want a 27″ 4K monitor… That Nixeus I have is a 27″ 2K… and the text on that thing is already so dang tiny that it is hard to read. I kinda wish it was a 32″ screen with the other specs all being the same.

        • Airmantharp
        • 1 year ago

        For 4k, I’d like to see 35″+; I have a 31.5″ 4k, and it’s just a tad tiny.

        Also have a 31.5″ 1440p (the LG with 165Hz and G-Sync), which is just about right, but comical next to the 4k 😀

        Also no mention of Freesync2, which would alleviate some concerns…

          • Chrispy_
          • 1 year ago

          Hah, you and I must be getting old.

          31.5″ 1440p seems about right to me for general use. 27″ 1440p was a little on the small side but okay. I’d much rather see 4K panels starting at the common 37″ TV size (equivalent to a bezel-free quad 18.5″ monitor array).

          It’s not that Windows DPI scaling is awful, but it’ll be [i<]at least[/i<] half a decade before common applications and webpages stop being designed for a fixed 96 dpi.

            • Airmantharp
            • 1 year ago

            Well, I have 27″ 1440p, 31.5″ 1440p, and 31.5″ 4k to compare 😀

            Mostly, the 4k monitor is fine at 31.5″; I just sit a little further back, so I can see that it would be an issue for many.

        • DoomGuy64
        • 1 year ago

        lol, I have a 1440p IPS panel that does that, and any IPS VRR panel that doesn’t simultaneously use ULMB will suffer noticeable blur @ 30Hz. (Simultaneous VRR+ULMB is, afaik, non-existent or requires hacks.)

        You are better off raising the VRR floor to force LFC on, because it doubles your physical refresh and lessens blur without ULMB. E.g., your 30 FPS will then refresh @ 60Hz.

        Otherwise, you have to tolerate smeary 30Hz that gives you eye strain in most third- or first-person games. Supporting a low VRR floor isn’t a positive feature, aside from TN or the mythical VRR+ULMB panels.
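
        A minimal sketch of the frame-repetition idea being described (illustrative only, not AMD’s actual LFC implementation):

        import math

        def lfc_refresh_hz(fps, vrr_min_hz, vrr_max_hz):
            if fps >= vrr_min_hz:
                return fps  # inside the VRR window: refresh simply tracks the frame rate
            # Below the floor: repeat each frame enough times to land back in the window.
            multiple = math.ceil(vrr_min_hz / fps)
            return min(fps * multiple, vrr_max_hz)

        print(lfc_refresh_hz(30, 48, 144))  # 60 -> each 30-fps frame shown twice
        print(lfc_refresh_hz(35, 48, 144))  # 70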

          • moose17145
          • 1 year ago

          Hmmm good point, I totally forgot about LFC. In that case, hopefully the range is high enough for LFC.

          • Chrispy_
          • 1 year ago

          Agreed. 48-144Hz is absolutely fine because you really don’t want the panel refreshing any slower than that.

          35fps is still absolutely fine on a 48-144Hz monitor; it’s just that the panel will be refreshing natively at 70Hz.

        • Demetri
        • 1 year ago

        Have you tried the Windows scaling options? I have the same Nixeus (EDG I presume) and run it at 125% desktop scaling with no issues. 100% was unacceptably small.

        • Spunjji
        • 1 year ago

        150% scaling works pretty nicely these days. Font rendering just looks better and you’ll have the same real estate as a 2K 27″. What’s not to like?
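
          For what it’s worth, the “same real estate” arithmetic (a trivial sketch; per-app scaling behaviour in Windows varies, but the logical desktop area works out like this):

          scale = 1.5
          print(3840 / scale, 2160 / scale)  # 2560.0 1440.0 -> same layout area as a 27-inch 1440p panel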

          • K-L-Waster
          • 1 year ago

          How about “same effective image as at 1440p but with 50% more pixels for your GPU to push” ?

          Afraid I’m not seeing the advantage of 4K here…

      • JustAnEngineer
      • 1 year ago

      … also in pocketbook effectiveness.

      • Krogoth
      • 1 year ago

      Assuming the panels are the same, you will get identical performance. G-Sync 1.0 is a hold-over from before VESA’s VRR spec was finalized. The extra hardware was meant to make it work with pre-DisplayPort 1.2a GPUs and monitors. It is no longer necessary on anything that properly supports DisplayPort 1.2a or newer.

    • DancinJack
    • 1 year ago

    Oh come on, Acer. At least differentiate the product names by more than ONE LETTER. Good lord.

      • Spunjji
      • 1 year ago

      Why, though?
