Acer’s Predator Z35P is on the hunt for a high-end gaming rig

Our peripheral guides make it clear: we like variable-refresh-rate displays, and we think that any gamer who sits down in front of one will fall in love with the silky-smooth motion on the screen. Gamers looking for Nvidia's G-Sync technology in a premium package might want to take a look at Acer's Predator Z35P display, which puts the VRR tech into a curved, ultrawide package.

At 35", the Z35P is a big slab of pixels. It employs a VA panel with 1800R curvature and a native resolution of 3440×1440. While that's not quite as many pixels as one would find on a 4K screen, the display has enough real estate to display four documents side-by-side, and its ultra-wide 21:9 aspect ratio should make games and video even more immersive. Gamers will appreciate its standard 100 Hz refresh rate, 4 ms response time, and 2500:1 static contrast ratio. Unfortunately, Acer didn't publicize the display's G-Sync range, though that's not usually a concern with the technology.

The Predator Z35P connects to graphics cards through DisplayPort or HDMI connections, and comes with a four-port USB hub. As with Acer's other Predator products, the monitor is aggressively styled. It's adorned with bold red highlights and a large Predator logo, and the back panel has a brushed metal finish. The Z35P is available now for $1100.

Comments closed
    • cynan
    • 2 years ago

    Why not just go for the [url=https://www.amazon.com/AOC-AG352UCG-Curved-Gaming-Monitor/dp/B06X9CBRTP]no-name version[/url] and save some money?

    • Kretschmer
    • 2 years ago

    Edit: mistaken. Disregard.

      • Kretschmer
      • 2 years ago

      I’m curious to see how this panel performs. The large VA panels haven’t yet been able to match their speedy peers.

      I’ve blown my monitor budget through 2020, but it’s fun to follow the tech.

        • cynan
        • 2 years ago

        Wouldn’t this be using the same AU Optronics panel as the HP Omen X? That panel also tops out at 100Hz and has the same 1800R curvature.

    • DPete27
    • 2 years ago

    Acer – Keeping G-Sync alive.

      • K-L-Waster
      • 2 years ago

      They wouldn’t keep selling them if they weren’t, y’know, selling….

      • Airmantharp
      • 2 years ago

      Consumers are keeping G-Sync alive.

      And if we’re to be really honest, AMD’s lack of high-end GPU competition is what’s really keeping G-Sync alive: for gamers that want to drive displays like the above at acceptable image quality settings and framerates, AMD and thus FreeSync simply aren’t an option.

      [And I say the above with a little bit of sorrow, as competition from AMD would bring pricing down and thus value to the consumer up. I also say it with a bit of respect for Nvidia, who have done a decent job of competing with themselves on the high end while AMD has been absent, much as Intel did. I have no doubt they’ll be prodded forward at the sight of competition, just as Intel was by the release of Ryzen; see the rumors of 115x 6C/12T i7s.]

        • slowriot
        • 2 years ago

        I’d say consumers are overwhelmingly choosing FreeSync panels and hoping Nvidia will support the standard at some point.

          • K-L-Waster
          • 2 years ago

          What’s that based on? Do you have sales figures?

            • slowriot
            • 2 years ago

            What’s it based on? The most predictive indicators I’m aware of: price and availability. As far as I’m aware, there’s no sales data detailed enough to know, though.

            G-Sync is solely a high-end item. I bet it’s great from a manufacturer perspective, and it doesn’t need volume to make sense for them to keep shipping products with it. As long as Nvidia doesn’t support alternatives, buyers of their high-end GPUs will look to G-Sync. But in terms of market share, I’d say it’s certainly smaller than FreeSync, simply because FreeSync is just a step above a throw-in.

            • K-L-Waster
            • 2 years ago

            The reason I question it is the fact that vendors continue to roll out new G-Sync monitors. If they weren’t making a profit on them, they wouldn’t keep introducing more.

            • slowriot
            • 2 years ago

            G-Sync monitors don’t have to outsell FreeSync ones, or ship in the same quantities. They’re a decent bit more expensive, and a large portion of that premium is pure profit. I’d say there’s significantly more margin on a G-Sync monitor than on a FreeSync monitor of otherwise very similar specs.

            FreeSync for the masses on thin margins and G-Sync for the high-end using largely the same panels/chassis but with much larger margins. Makes sense why companies continue, and will continue, to offer both. And why no one in the industry seems in any rush to change the situation.

            • DPete27
            • 2 years ago

            Remember that there are royalties to be paid to Nvidia for putting a G-Sync module into a monitor. I don’t know what the modules cost exactly, but I’d bet that’s the majority of the price premium you’re seeing compared to an otherwise comparably-specced FreeSync monitor.

            • slowriot
            • 2 years ago

            Yeah that’s a factor. Ultimately I feel this is all down to whatever Nvidia wants to do is what will happen.

            • K-L-Waster
            • 2 years ago

            I think you’re both right on the specifics, but I’m still not seeing any hard evidence that consumers have overwhelmingly chosen FreeSync (which was the question I asked at the top of this thread…).

            (Edit: it’s the “overwhelmingly” I’m concerned about here; it makes it seem like G-Sync should be a dead technology with a very small user base.)

            • slowriot
            • 2 years ago

            Are you waiting on it or something? I already said I don’t have it beyond my own observations and the general indicators (lower pricing, wider model variety) of what moves product in volume. We could try making guesses from what popular websites post as “best sellers,” but I don’t think that would be accurate at all. Otherwise there’s no data that I’m aware of that could answer the question.

            I wouldn’t get too hung up on “overwhelmingly.” Just that FreeSync moves more volume. If I were to make a guess, I’d say the market share might be 70:30.

            • Laykun
            • 2 years ago

            The majority of the market uses nvidia gpus, so even if each camp bought VRR monitors in equal proportion you’d still find g-sync monitors outselling freesync monitors.

            While I appreciate the open standard of freesync, until nvidia implements it on their GPUs I can’t see any serious gamer ever buying a freesync display.

            Cheap monitors with freesync go to cheap places, like offices, where the benefits of freesync will not be seen. By simple market share, if someone is looking for VRR for gaming, it’s more than likely going to be a gsync display.

            • Airmantharp
            • 2 years ago

            Seconded.

            And I’ll even admit that without data, I’d be inclined to believe slowriot’s point, as it’s supported by logic: AMD typically provides (or at least appears to provide) more performance per dollar at the lower end of the price spectrum, and FreeSync displays, while often lower in quality than G-Sync displays, are also more accessible price-wise.

            Together, these observations support the assertion that consumers are choosing AMD/FreeSync over Nvidia/G-Sync, but they don’t prove the “overwhelmingly” qualifier, in my opinion.

            Further, as can be gleaned from my explanation above, FreeSync and G-Sync seem to target two almost entirely different audiences. For the higher-aimed G-Sync, a lower sales volume may not be a problem for Nvidia at all.

            • slowriot
            • 2 years ago

            Yeah I’d generally agree with you here.

            To me G-Sync is premium product and Nvidia would like to keep it that way. I feel this is all down to whatever Nvidia wants to do. Consumers are more at their mercy than the other way around.

            • Airmantharp
            • 2 years ago

            I’ll agree with “more,” but I don’t really see it as a problem. More recent informal looks at the G-Sync “tax” in forums suggest that, at monitor feature parity, the premium has dropped from roughly US$300 at release to less than US$100, and Nvidia’s pricing deviates only slightly from AMD’s in terms of price versus performance.

            So I agree that G-Sync and the accompanying Nvidia GPUs carry a premium, but on, say, a US$2000 gaming system, it’s perhaps only a US$100-150 premium (comparing against hypothetical AMD parts), and it’s clear that consumers are willing to bear that burden.

            I’ll also reiterate that one can find an excellent, even near-premium experience with systems built around AMD cards and FreeSync monitors, depending on budget and on what features and performance one considers necessary.

          • Kretschmer
          • 2 years ago

          We were saying the same thing 3 years ago.

        • DPete27
        • 2 years ago

        Sad but true. Wasn’t Vega supposed to launch nearly a year ago?

        • DoomGuy64
        • 2 years ago

        What games actually need a 1080 to run 1080p or even 1440p? Sorry, but most gamers choose Nvidia solely because of marketing, and not need. AMD may not have cards that match the 1080, but I haven’t played any games that need anything higher than a 480 either. Because of this I “downgraded” from my firesale Fury back to a 390, so that I can sell the Fury and buy Vega when it comes out.

        Nvidia’s high end cards are only necessary for resolutions higher than 1440p, gsync, and VR, so those cards are not catering to the average gamer, but the niche high end market. Everyone else is better off with AMD and freesync.

        I don’t really see Vega changing things much either, outside of letting AMD users run higher resolutions. Nvidia’s market lock is mostly due to marketing, and AMD offering a competitive high end card isn’t going to change things all that much.

          • Airmantharp
          • 2 years ago

          Many.

          What you’re discounting is that “need” is different from “want,” and that individuals’ performance and image-quality goals can differ significantly; they very apparently differ from your own.

          To provide a contrasting argument, I’ll use my system: a 6700K, a pair of GTX 970s, and a 1440p 165Hz IPS G-Sync monitor. Here I don’t “need” more than RX 480 performance (roughly similar to a single GTX 970), but with decent quality settings in, say, BF1 or Mass Effect: Andromeda, running with SLI disabled very clearly hurts performance.

          Push the resolution up to 4K, or even to the ultrawide 3440×1440@100Hz panels like the one featured in the article above, and one can easily see the need for more graphics power, more than even a 1080 Ti can deliver. And we haven’t even touched VR, where minimum framerates must be significantly higher to maintain immersion.

          • Kretschmer
          • 2 years ago

          If you want to run a 144+Hz display with great fluidity, you’ll want all the GPU you can get. Similarly, using ULMB mode smoothly requires a capped 80/100/120 FPS, which requires extra headroom over an average of the same number (since an average includes lower data points).
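          The averages-vs-caps point is easy to illustrate with a toy frame-time trace (the numbers below are hypothetical, just to show the arithmetic):

```python
# Toy example: an *average* of 100 FPS doesn't mean every frame hits 100 FPS.
# ULMB at a fixed 100 Hz wants every frame delivered within a 10 ms budget.
frame_times_ms = [8, 9, 8, 14, 9, 8, 16, 8]   # hypothetical trace with two dips

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
missed = sum(1 for t in frame_times_ms if t > 10)

print(f"Average FPS: {avg_fps:.0f}")              # exactly 100 FPS on average...
print(f"Frames over the 10 ms budget: {missed}")  # ...yet 2 frames still miss it
```

Hence the headroom: you need an average comfortably above the cap so the slow frames still fit under it.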

          Running 3440×1440 with 390X/480 performance is not acceptable at high quality settings. I know; I’ve tried. Even a 1070 can struggle at that resolution.

          Once you start checking the cool monitor boxes (high refresh rate, size, resolution, or ULMB), you’re going to want more than Polaris can provide.

            • DoomGuy64
            • 2 years ago

            Yeah, I clearly stated “higher than 16:9 1440p,” and there’s no such thing as “greater fluidity” when you’re already getting max FPS with 480-class hardware @ 1440p. The scenarios where you actually need a 1080 are really slim, and pretending those niche scenarios are that important is cheap.

            Obviously high end displays need a high end card. That’s a given, and if you’re already spending money on a display, a high end card would be appropriate to go with it. All I’m saying is that those cases are NICHE, and most people don’t need that level of horsepower because they’re not running displays higher than 1440p or ULMB.

            There may be a small handful of games that “need” a high-end card, but for the most part those games are using pointless “ultra” settings and GameWorks effects that don’t run right on AMD. Not to mention they’re not super-popular titles like CS:GO. The 480 can handle all the popular games fine @ 1440p, so the main reason people don’t use AMD for those titles is marketing. I have no issues running games at 1440p with a 390, so any time I hear otherwise I know people are either lying or brainwashed by marketing.

            • Srsly_Bro
            • 2 years ago

            Says the guy who is ok with gaming at 30 fps. Jeff and I need to have a chat. Anyone can create an account and post.

            I spent a few minutes looking at 1440p benches with the rx480 for the possibility I was wrong. My conclusion is the same. He has no idea what’s going on or what year it is.

            I conclude that ronch and doomguy are roommates or the same person. I suggest a ban for multiple accounts.

            Thanks, bro.

            • DoomGuy64
            • 2 years ago

            Seriously, bro. “Gaming at 30 fps” is a point-blank lie, and you’re in the same boat as ronch with “anyone can create an account and post,” as I’ve seen numerous posts of yours that are outright absurd. Ronch posts in a completely different style as well, so you’re obviously trolling for a response with that comment.

            I’m not saying the 480 is a powerhouse or can run this monitor. I was replying to airman’s point about consumers. People choose Nvidia 99.9% of the time because of marketing. They aren’t running high-end monitors or high-end Nvidia cards. They’re running 1080p with 960s and 1060s, and I’m saying the reason for that isn’t horsepower, but marketing, and it always has been.

            1440p is playable on a 480. You’re pulling 30 fps completely out of your backside, so why don’t you go take a hike.

            • Kretschmer
            • 2 years ago

            [url]http://www.tomshardware.com/reviews/amd-radeon-rx-480-graphics-card-roundup,4962-3.html[/url]

            Tom’s shows the 480 as a 40-60 FPS card at 1440p. The Tech Report review had to dial some newer games back to 1080p to keep the card from tanking. Seeing as the 290X performs roughly like a 480, and I ran that card on a 1440p FreeSync display, I can say it was not a great level of performance even in 2016. The card is much better suited to 1080p.

            • DoomGuy64
            • 2 years ago

            Depends on what games you run. I get over 100 FPS in Doom, and the same in the other games I play. I don’t play any of the games listed in that benchmark. I did play the new Deus Ex, which got 50-60 FPS using DX12 on the Fury, but I got tired of it and quit playing. Most of the games I play are popular multiplayer titles, or games that run well on my card. Even if I did play some of those games, it wouldn’t be an issue, since games like The Witcher don’t need 144 FPS and I can just use FreeSync.

            That’s one of the biggest reasons I “downgraded” back to the 390. I don’t need Fury/1070-level performance to get over 100 FPS at 1440p in what I play, and settings can always be adjusted in newer games to compensate if I felt compelled to play any of them. So yeah, I guess there are games that need better cards, but since I don’t play them, I don’t need a better card. Maybe someday popular multiplayer titles will start needing more than a 390, but that isn’t now, and I plan on upgrading to Vega by then anyway.

            • Kretschmer
            • 2 years ago

            Based on the Steam survey results, the GTX 1080 has outsold the RX 480, so maybe people actually do enjoy fluidity. 🙂

          • Srsly_Bro
          • 2 years ago

          You doom humanity with your ignorance. Your name checks out.

          • K-L-Waster
          • 2 years ago

          Uhhmm… you do realize this article is about a 3440×1440 display, right? 1080 is out of scope.

    • Neutronbeam
    • 2 years ago

    Love it! Funding is set up at [url]http://www.buyneutronabigassmonitor.com[/url] and the site takes PayPal, Bitcoin, credit/debit cards, and gold bars of dubious provenance. Morphine, your card is pre-approved for the full amount. 🙂

    • Kretschmer
    • 2 years ago

    Correction: The Z35P is 120Hz, not 100Hz.

    Note that these giant VA panels suffer on response times compared to IPS panels of similar refresh rates, though they typically exhibit less backlight bleed and no “IPS glow.”

    I personally opted for the Acer X34 and am happy with it (the only flaw on my sample is slight backlight bleed on the corners, which is very difficult to see under normal operating conditions).

      • Airmantharp
      • 2 years ago

      I was wondering what was different/new with this display.

      Another note might be the possibility (it needs to be tested) of not just less “glow” but also better contrast, something that traditional LG IPS panels, as well as many/most TN panels, are limited in.

      It’s also possible that this panel has lower persistence than expected for a VA panel, as it does seem quite unique for the purpose.

      • morphine
      • 2 years ago

      Actually, the tech specs specifically say 100 Hz.

        • Kretschmer
        • 2 years ago

        [url]http://www.tftcentral.co.uk/news_archive/38.htm#acer_predator_z35p[/url]

          • K-L-Waster
          • 2 years ago

          Acer’s own spec page sez 100.

          [url]https://www.acer.com/ac/en/US/content/predator-model/UM.CZ1AA.P01[/url]

            • Kretschmer
            • 2 years ago

            These firms often botch specs on their websites. It appears to be 100Hz stock with up to 120Hz in OSD:

            [url]http://www.overclock.net/t/1629226/acer-z35p-in-hand-and-small-review[/url]

            • Airmantharp
            • 2 years ago

            My 27″ Predator has a similar feature, boosting from 144Hz max to 165Hz max through the OSD, a feature I turned on to check and have since left on (not that I could tell the difference either way).

            However, moving to a potential 120Hz is perhaps more significant, as 120Hz makes a great fixed desktop rate for video, for the same reasons there are/were so many “120Hz” TVs. To wit: while I run games at 165Hz with G-Sync, I keep my desktop on that monitor at 120Hz, and I’d likely not notice if the display were limited to 120Hz, period.
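            The “120Hz is great for video” point comes down to divisibility: common video frame rates divide evenly into 120Hz, but not into 100Hz or 165Hz, so each source frame can be shown a whole number of times. A quick sketch of that arithmetic (nominal integer rates only; 23.976/29.97 variants complicate things slightly):

```python
# Which common video frame rates divide evenly into a given refresh rate?
# An even divisor means each source frame displays for a whole number of
# refreshes, avoiding 3:2-pulldown-style judder.
video_rates = [24, 30, 60]

for refresh in (100, 120, 165):
    even = [fps for fps in video_rates if refresh % fps == 0]
    print(f"{refresh} Hz: evenly displays {even or 'none'}")
```

At 120Hz, 24 fps film maps to exactly 5 refreshes per frame, while at 100Hz it would need an uneven 4/4/4/5-ish cadence.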

      • Chrispy_
      • 2 years ago

      As a Z35 owner (briefly, I sent it back for smearing and poor response times) I would have agreed with you, but since then I’ve had two VA panels that are more than adequate for high-refresh gaming:

      1) The 1800R curved Samsung 144Hz VA 4ms panel in the Predator Z271
      2) The 32″ 2560×1440 panel in the Samsung SD850.

      It would seem that the only response that is “slow” is the 0-255 transition, at [url=http://www.tftcentral.co.uk/images/acer_predator_z271/response_7.png]15ms or so[/url], but 1/65th of a second isn’t too terrible, and you’ll notice that only affects transitions from pure black. If the transition is from a dark colour to a light colour (far more realistic in real use), you’ll see the transitions are ~6ms: plenty quick for a 144Hz monitor, and overkill for a 100 or 120Hz refresh rate.

      I came from a Korean IPS and a couple of LG IPS screens, and I’m so pleased to have VA these days. Backlight bleed seems to be a serious problem with LG IPS panels, whilst IPS glow varies from monitor to monitor. Neither of these matters compared to the gobsmackingly rich contrast that VA offers. It’s visibly better even in a well-lit room. Where IPS blacks are merely okay, VA is closer to OLED than anything else.
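      The arithmetic here is just transition time divided by refresh interval (a quick check; the 6ms and 15ms figures are the ones quoted above):

```python
# How many refresh intervals does a pixel transition span?
def frames_spanned(transition_ms: float, refresh_hz: float) -> float:
    frame_ms = 1000 / refresh_hz        # one refresh interval in milliseconds
    return transition_ms / frame_ms

print(f"{frames_spanned(6, 144):.2f}")   # ~0.86 frames: finishes within one refresh
print(f"{frames_spanned(15, 144):.2f}")  # ~2.16 frames: pure-black transitions smear
print(f"{frames_spanned(15, 100):.2f}")  # 1.50 frames at 100 Hz
```

So anything under ~6.9ms is effectively invisible at 144Hz, while the 15ms black transition straddles about two refreshes.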

        • Kretschmer
        • 2 years ago

        Edit: wrong review, nevermind.

        VA panels have suffered in the response time department recently (e.g. Omen X and AG352UCG). We’ll have to wait for the TFT Central guys to get their hands on this one.

          • Chrispy_
          • 2 years ago

          Yeah, that site and Prad.de are the only ones who do proper testing of screens these days :\

        • Kretschmer
        • 2 years ago

        My LG 34UM88C was free of any BLB, but it was slllooowwwww.

        Don’t worry, if I bought an X34 in spring 2017 it means that the tech will be completely revolutionized by spring 2018. 🙂

          • Chrispy_
          • 2 years ago

          Haha, thanks for taking one for the team 😉
