Intel, Samsung aim to deliver 4K displays for $399

And here I was, thinking that $649 4K display Scott reviewed was pretty dang affordable. Apparently, Intel doesn't share that view—and it's teamed up with Samsung to drive prices even lower. Here's the scoop from CNet News:

Intel is specifically partnering with Samsung to increase delivery on high-quality, 23.6-inch PLS 4K panels, with a stated aim of such monitors hitting a US$399 (£239, AU$430) price point. For Intel based All-in-Ones, Intel believes we will also see 4K All-in-One prices starting from US$999 (£599, AU$1,080). . . . These PLS, or Plane to Line Switching, monitors are very high-quality, with 100 percent sRGB coverage and Technicolor certification.

It looks like Intel didn't share much else in the way of details. There's nothing definite in the CNet News story about when we can expect these uber-affordable 4K screens—or exactly what Intel's partnership with Samsung will entail. The site suspects Intel might be spurring Samsung on with "financial incentives," however.

Update 10:41AM: Looks like Intel covered this during its big Computex keynote. The keynote included this slide, which fills in some of the missing pieces:

$399 4K displays will appear in time for the holidays, apparently, and both TPV and ViewSonic will offer them. I guess Samsung itself won't be joining in the fun, though I expect the firm will be providing the panels for these monitors. (Thanks to TR reader SH SOTN for the heads up.)

Comments closed
    • Chrispy_
    • 6 years ago

    We have some PLS screens here at work (only 8 of them, and all the same model). Colours and viewing angles are great, but response time isn't; they're a bit smeary in motion.

    They're about 4-5 years old now, so perhaps PLS has come a long way since then, but I'm wondering if it's no coincidence that Samsung themselves still use S-PVA panels for their high-end HDTVs, and even when they outsource to AUO or CMO for the mass-market models, they go for A-MVA.

    If PLS is so hot, how come it's only on desktop monitors, and then only on the models that are cheaper than IPS or S-PVA?

      • Airmantharp
      • 6 years ago

      Five years old is older than I knew PLS existed, and the first 'efforts' weren't terribly commendable, or even an improvement over Samsung's own S-PVA.

      And I'm still not sold on it. PLS, like S-PVA, has that same basic response-time issue, and compensating for it only adds even more input lag; i.e., it will likely never be as good for gaming as TN and IPS can be.

    • ptsant
    • 6 years ago

    I think running at 1080p resolution is OK (scales naturally) for people without the GPU horsepower. I definitely would want some sort of sync technology, 60Hz and single-surface support before even thinking about it.

    • itachi
    • 6 years ago

    Holy, are you kidding? My Asus VW266H (1920×1200) cost me $500 when I bought it XD

    When do I win the lottery?

    • ripple123
    • 6 years ago

    So they're planning to hit the price point that the Seiki 39-inch 4K VA panel had last year. How nice of them to catch up.

      • Airmantharp
      • 6 years ago

      Well, they're going to do it with a much nicer panel (at least far superior panel technology; no accounting for yields yet), in a monitor that can handle its full resolution at 60Hz; both are significant steps up from Seiki's 'bottom-barrel' solution :).

        • ripple123
        • 6 years ago

        The Seiki panel is VA, so a hell of a lot better than the TN stuff Samsung and their ilk are pushing with all the 28-inch panels at the moment, and really the 30Hz thing was more an HDMI standards limitation at that point. If Seiki updates their 39-inch panel with 60Hz circuitry and DisplayPort, they will totally smoke any other panel at that price point, as everyone snaps up 39-inch 4K TVs at $500 to use as monitors.

          • Airmantharp
          • 6 years ago

          But it’s really crappy VA, and PLS is an upgrade to VA (in theory).

          I agree that if Seiki were to continue their aggressive pricing with panels that can handle the full refresh rates exposed by DP, i.e. 60Hz at 4K and 120Hz (at least) at 1080p, and if they were to keep a slightly sharper eye on their QA, they'd still be able to capture plenty of market share. I'd definitely consider them for extra 'displays', as they'd be a very real upgrade from the array of TNs that I'm using to back up my 30″ IPS :D.

        • sschaem
        • 6 years ago

        Are you sure the panel in those 23″ monitors will be better than the VA in the Seiki?
        (I'm saying this because I have a Dell U3014, and it's great on paper but not that great in real life: light leakage and viewing-angle non-uniformity, things you don't see from a calibration tool.)

        And calibrated, this monitor (the Seiki) is actually amazingly good.
        Also, it's using HDMI 1.4 simply because at the time HDMI 2.0 was not even finalized…

        Adding HDMI 2.0 logic to this model will not add much to the BOM. So still <$400 for a 39″ VA 4K at 60Hz (on a 120Hz-refresh panel).

        BTW, Intel has been talking about 4K monitors for the PC since 2011… whatever they say NEVER seems to materialize.

        And really, who is excited about a $400 23″ monitor? Especially if it's 16:9.

        16:9 starts to make sense at 27″+.

          • Airmantharp
          • 6 years ago

          Well, I can't predict the future, so no, I'm not 'sure'. I can only use what's happened so far to make an observation and prudently apply it to various companies' claims.

          As for your crappy U3014, you seem to have failed yourself; Dell's return policy is above reproach, and you are obviously not satisfied. That monitor can and should be perfect.

          • rpwooste
          • 6 years ago

          These are way better than a 39″ Seiki. I have used both; there is no comparison. The 39″ Seiki is quite simply too large for sensible usage. Frankly, it was horrible; I used it for two days to set up a machine for a demo, but was glad to be rid of it. I've been using a 23.x″ 4K monitor (Dell UP2414Q) since December 2013, and there's utterly no comparison, and the TPV monitors and selection of 4K AIOs all using the Samsung panel at Computex were far superior to the Seiki. Plus the 23.6″ panel runs at 60Hz.

    • bhtooefr
    • 6 years ago

    With brands like those, not sure if want.

    Mind you, I don’t actually care about resolution scaling at this level (I’d run ’em native, just like I run my MBPR native), but Microsoft really needs to get their act together in this regard. I’d almost say, deprecate all the various resolution independence APIs that they have now, and then copy Apple’s approach, which is hackish and needs lots of GPU, but it WORKS.

    • Billstevens
    • 6 years ago

    It sounds like the items to wait for are:

    27″-32″ screen
    IPS panel
    4k res
    Gsync and Display Port Adaptive sync
    Right around $500.00

    That to me sounds like the kind of monitor you won't have to, or want to, replace for the next 5 years.

      • rpwooste
      • 6 years ago

      Unfortunately, I predict you'll be waiting 10 years, and then waiting some more, for that spec and price. If you drop the need for G-Sync and accept DP 1.2a or DP 1.3 with Adaptive-Sync, you'd get closer, but a 27″ 4K IPS in the next five years for $500 seems tough to imagine.

    • deathBOB
    • 6 years ago

    I really don’t care about these for gaming, but I am looking forward to improved text. Reading from my 1080p monitor after using my Moto X (which isn’t anything special in the mobile space) sucks.

    • slaimus
    • 6 years ago

    The Seiki 4k is $389 right now at newegg: [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16889522025[/url<]

      • UberGerbil
      • 6 years ago

      Was going to mention that; saw it on a deals site last night. I have a less-than-favorable impression of Seiki, but I don't have any real data to back that up.

      • Airmantharp
      • 6 years ago

      30Hz and poor uniformity make it a no-go for all but the most basic of tasks, but the price is certainly reasonable :).

    • drfish
    • 6 years ago

    My whole life I've been chasing higher and higher resolution screens, and my whole life my wallet has been complaining that it only means I need to spend more and more on the rest of the computer… 4K is tempting, but I [i<]think[/i<] I could settle for 27″, 120Hz, IPS, WQHD, and adaptive sync. Native 4K gaming is pricey, dropping to 1080p would be too painful, and I don't want to run anything in between… Of course, if the screen only costs $400 to begin with, you have a lot left for the rest of the system vs. the old days of $1,200 30″ screens…

      • rbattle
      • 6 years ago

      This sentiment is all over this comment section. Many folks seem to agree on what we want. Not sure why you were downvoted, but I fixed that.

      • mako
      • 6 years ago

      I was about to spring for a 27″ IPS but this news does give me pause.

        • Airmantharp
        • 6 years ago

        I still am; Acer has one with a 144Hz panel and G-Sync coming, if I’m not mistaken. I could lose some vertical pixels from my 1600p monitor for that.

    • Star Brood
    • 6 years ago

    Who cares about gaming when you can view your Linux ISOs in 4K?

    • l33t-g4m3r
    • 6 years ago

    I doubt these screens will be good for gaming unless they support G-Sync/FreeSync and Lightboost. Not to mention hardware requirements will be huge, as cards that powerful don't exist on the market yet.

    You know what feature I’ve discovered that works really nice on existing monitors? [url=http://www.neogaf.com/forum/showthread.php?t=509076<]Downsampling.[/url<] Pixels aren't really the problem, it's AA. 4K is more for spreadsheets.

      • albundy
      • 6 years ago

      so in retrospect, what you’re saying is that gaming before g/free-sync monitors was not good at all?

        • l33t-g4m3r
        • 6 years ago

        Yes. LCD monitors are terrible for gaming compared to CRTs. I haven't owned any where motion blur is non-existent, and overdrive [url=http://www.blurbusters.com/faq/lcd-overdrive-artifacts/<]artifacting[/url<] can be just as annoying as motion blur. Some games are worse than others, depending on art style and viewpoint. Darksiders was a terrible game for LCDs; it burned my retinas, but I loved playing it. I "overclocked" my monitor to run @ 100Hz to play it without getting eye-strain headaches. Games like Q3 are fine, because the less noisy textures and first-person view don't cause blur issues, same with strategy games, but OMG some games are just unbearable. Any new monitor I buy will need to have good blur reduction, because I can't stand it. It's kind of like why poor AA causes people to buy higher-resolution monitors, as the flickering bothers them. Well, AA doesn't bother me using downsampling, and blur bothers me much more than any pixel flickering.

          • ibnarabi
          • 6 years ago

          For most games people actually play, CRTs suck compared to LCD screens. I am overjoyed I no longer see flickering tubes out of the corner of my eye 🙂

            • Airmantharp
            • 6 years ago

            I paid extra to get tubes that could do at least 85Hz, back in the day, but I preferred 120Hz if I could get it.

          • Waco
          • 6 years ago

          So because you haven’t owned a nice LCD monitor all of them are crap? Nice wide brush you have there.

      • Airmantharp
      • 6 years ago

      Actually, they’ll be great for gaming!

      We’ve been living without adaptive vertical syncing since the dawn of gaming without forced V-Sync (a long, long time), and while it’s been a noticeable problem the whole time, it has not stopped us.

      Lightboost is a whole nuther ball of wax, and a cool one, but it still hasn't matured beyond TN panels, most of which just aren't good enough for anything beyond basic computing and twitch gaming.

      Some of us like to game and have our colors too.

      • truerock
      • 6 years ago

      Well… If I compare a 32″ 1080p display to a 32″ 2160p display isn’t that better than downsampling 2160p down to 1080p on a 32″ 1080p display? Doesn’t 2160p have double the AA of 1080p?

      I’ll agree that technology like NVidia G-Sync, lightboost and higher refresh rates like 120Hz and 240Hz are more important to gaming.

    • HisDivineOrder
    • 6 years ago

    I’d be much more interested in a 1600p IPS (of course) 120hz monitor with G-sync with a promise to update its firmware to support Adaptive sync for $500 or a non-Facebook VR unit for $300 than 4K for $400.

    4K is going to be a pain to drive and the gain is going to be… negligible for the time being.

    ESPECIALLY with DPI scaling in Windows versus applications being in the state it's currently in.

    I love my Dell 3007WFP-HC, but I wish it had higher refresh rates, retained the IPS (no TN thank you very much), possibly included some kind of Lightsync (strobing effect to improve perceived motion for all content), and Gsync.

    Definitely wish I didn’t have to lose the top and bottom of the far superior 16:10 ratio to get a better monitor.

    I’d love to HOPE for an Oculus Rift that would become a great way to game in many cases since it could solve a lot of problems, but Facebook ripped that option out from under us all.

      • rbattle
      • 6 years ago

      So much, this. Just give me 1600p with adaptive frame rates in a fast but high quality IPS or *VA panel and intelligent backlight strobing. I don’t even care how much it costs, within reason.

    • internetsandman
    • 6 years ago

    4K PLS for $400? Give it 60Hz and single tile operation and I’m sold

      • JustAnEngineer
      • 6 years ago

      …also [url=https://techreport.com/news/26451/adaptive-sync-added-to-displayport-spec<]DisplayPort 1.2a Adaptive-Sync[/url<].

      • anotherengineer
      • 6 years ago

      Don’t forget a fully adjustable stand, VESA mounting, flicker free back-lighting, and light AG coating also. Good panel uniformity, low to negligible back-light bleed, with minimal response time, input lag, and RTC overshoot would be a nice + also.

      We can dream can’t we?

        • Dygear
        • 6 years ago

        I’d settle for 4K PLS, with 60 Hz and Single Tile. My only true requirement is 100×100 VESA mounting or it’s useless to me. If they can wrap all of that into $400, they can take my money now and I’ll wait until before the holidays.

          • Airmantharp
          • 6 years ago

          I can settle for PLS if it's a better effort than Samsung's first run of the technology; they at least need to get panel brightness and color uniformity fixed, even if they don't get the viewing angles up to full IPS spec.

          But if they do, these will be killer even for photography-related interests.

      • rpwooste
      • 6 years ago

      It is PLS, 60Hz, single tile operation, no MST hassle with this panel.

      Also, it's >100% sRGB color gamut, and Technicolor certification has been performed on the Samsung panel when directly attached to the Intel GPU (as is the case with AIO designs). Certification with a scaler (as in monitor designs) and with discrete video cards is just a matter of time.

    • kamikaziechameleon
    • 6 years ago

    Did the math: the dream PLP setup is a 52″ 4K center monitor with flanking 30″ 1600p monitors in portrait. Would be great for RTSes.

    • anotherengineer
    • 6 years ago

    Wasn't this mentioned in Wednesday's shortbread, number III?
    [url<]https://techreport.com/news/26571/wednesday-shortbread[/url<]

    Either way, good news!! The really important question: will these be available in 16:10??

    Edit - an even more important question: will these push 2560×1440 and 2560×1600 monitors to the same price range??

      • Usacomp2k3
      • 6 years ago

      No, that's not an important question. When you've got 2160 vertical pixels, who cares what the ratio is? 16:9 will always be cheaper due to that being the standard ratio for televisions.

        • HisDivineOrder
        • 6 years ago

        DPI scaling is going to make the resolution unimportant. In which case, ratio will return to relevance.

          • Airmantharp
          • 6 years ago

          How do you figure?

          If you have DPI scaling available, what would keep you from adjusting it to taste?

        • xeridea
        • 6 years ago

        I think 16:9 is cheaper than 16:10 because it's cheaper to have a lot of pixels in only one direction, but I am not an LCD engineer, so this is just my theory.

          • JustAnEngineer
          • 6 years ago

          I’ve read that the panel manufacturers are able to get more good panels in a given manufacturing run (since they cut around defects and since the 16:9 panels have fewer pixels).

        • willmore
        • 6 years ago

        Why would movies affect the aspect ratio of a computer monitor? What, because these panels are going to be made for 24″ 4K TVs? No, they're not. They're purpose-built for computers, so there's no reason to limit them to anything to do with TVs or movies.

      • internetsandman
      • 6 years ago

      There might be a couple manufacturers who stray from the norm, but don’t get your hopes up at this point

      • Anomymous Gerbil
      • 6 years ago

      16×10 was “important” when there weren’t many vertical pixels.

      Now that plenty of displays have more than enough vertical pixels to display web pages and documents etc, the screen ratio (for productivity work, for example) is pretty much irrelevant.

      Yet curiously and amusingly, people forget why it was once important, so they still think they need it even when it’s no longer relevant!

        • Voldenuit
        • 6 years ago

        Yeah, I don’t miss my old 1920×1200 monitor on my 2560×1440 display.

    • gmskking
    • 6 years ago

    Better keep on dropping that price.

    • esc_in_ks
    • 6 years ago

    As someone with a high-DPI laptop screen (Yoga 2 Pro), I'd like to see better support from applications. There are quite a few applications I use on both Windows and Linux that don't handle a high-DPI display well. On Linux, it's even worse; the latest releases of Firefox and Chrome are pretty horrendous.

    I’ve even debated running it in 1600×900 (instead of its native 3200×1800) just to make everything work as expected.

      • kamikaziechameleon
      • 6 years ago

      The app issue will resolve itself after a time. Devs just need a reason to change their coding preferences.

        • HisDivineOrder
        • 6 years ago

        Too bad MS doesn’t EVER plan ahead.

          • kamikaziechameleon
          • 6 years ago

          MS and scaling has been an issue for high-density monitors going back to 2000. They've literally put off this need forever, and it's crippled peripheral PC expansion for over a decade. It's why a living-room PC never took off, for one.

            • rpwooste
            • 6 years ago

            Windows 8.1 supports 500% – how much do you want?

            It’s the other developers who have been lacking anything beyond 100%, although most of the important ones are there with 200% already, and the rest following as quick as they can.

            Roland.

        • JustAnEngineer
        • 6 years ago

        Devs tend to be [b<]really[/b<] lazy when it comes to fixing problems like that.

          • kamikaziechameleon
          • 6 years ago

          Not lazy when people stop using their software!

      • DrCR
      • 6 years ago

      For Linux, just manually set the DPI in xorg.conf.

      I would give more details, but I haven't done it since I was on a CRT, so my memory is failing me. (I'm less picky now, and I don't bother with my present displays.) This was the Slackware 9 era on an Nvidia card, so maybe it was an Nvidia binary-blob driver option, but even on a CRT, I loved being able to tweak my DPI setting to make the text size independent of the resolution. It may be something you want to look into.

      Edit: xdpyinfo is something that comes to mind, e.g.
      $ xdpyinfo | grep 'dots per inch'

        • chuckula
        • 6 years ago

        Within certain limits you can manually set the DPI in X-windows. Usually X autodetects the DPI of your monitor, although this is not always perfect.

        In KDE it’s actually part of the advanced graphics options menu.
        A more general setting for DPI can go into the X.org configuration files.
        Here’s an article from Arch’s wiki, but the same setting would likely work on most distros:
        [url<]https://wiki.archlinux.org/index.php/xorg#Setting_DPI_manually[/url<]
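A minimal sketch of the xorg.conf route described above (the section identifier and dimensions here are illustrative, not from the comment): X derives DPI from the declared physical size, so a Monitor section can force it.

```
Section "Monitor"
    Identifier "Monitor0"
    # Physical size in millimetres; X computes DPI from this.
    # With a 3840x2160 mode, 508 x 286 mm works out to roughly 192 DPI.
    DisplaySize 508 286
EndSection
```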

    • kamikaziechameleon
    • 6 years ago

    32″ 4K is the sweet spot. We need to get to 600 dollars for something like that.

      • MadManOriginal
      • 6 years ago

      32″ is a bit too big for normal desktop viewing distance IMO. A range of sizes like what is common now is fine.

        • Airmantharp
        • 6 years ago

        I use one 30″ at home, and two 30″ monitors alongside two 24″ monitors at work. I may not be 'ergonomically ideal', but we human beings are pretty flexible!

          • kamikaziechameleon
          • 6 years ago

          I have a 30″ flanked by 2 24″ monitors. I personally would like a 32″ display. 🙂

      • WaltC
      • 6 years ago

      Heck, 27″-28″ would be more of my sweet spot for ~4K, but 23″ is way, way too small, imo. (My wife often has trouble reading from a 24″ 1920×1080 display with Win7 text set at the normal 100% DPI. She has to set it to 150%-200% before she's comfortable. Imagine 100% ~96 DPI Windows text at 4K resolution on a 23.5″ diagonal! Needle in a haystack?)

        • Airmantharp
        • 6 years ago

        With the advent of Retina displays on Apple OS X products and the pressure Apple puts on developers to 'get it right' (I'm not claiming that they actually do), vendors can't claim ignorance of the effective scaling options built into Windows anymore.

        Currently, I’m finding scaling to be fairly decent with up-to-date applications, and I expect in the near-term that it will be a non-issue, as high-DPI output options will literally be everywhere.

        • kamikaziechameleon
        • 6 years ago

        I don't think that 4K on a 28″ is as needed as on a 30-32″. I have a 28″ 1440p for work, and the density difference is REALLY noticeable. I want a higher-density 30″ or 32″ monitor first.

          • cynan
          • 6 years ago

          Also, the current crop of 30-32″ monitors are all 16:10, whereas the 27-28″ ones seem to be 16:9. At these sizes, a 16:10 aspect ratio has a bit more (2-3%) viewing area per diagonal dimension than 16:9.

        • rpwooste
        • 6 years ago

        I believe your wife is probably in the minority of users, using more than 100% on 23-24″ displays at 1080p, but either way, Win 8.1 supports up to 500% scaling, so she'd be fine with 4K on 23.6″, and the text would be sharper/crisper too.

        I actually find 4K on a 23.x″ screen fine at less than a 200% multiplier versus 1080p viewing, due to the extra pixel resolution. I found somewhere between 150% and 175% ideal.

        Roland.
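As a sketch of the tradeoff described above: the logical desktop area at a given scaling multiplier is just the panel resolution divided by the scale factor (the multipliers below are the ones mentioned in the comment; the rounding is illustrative).

```python
# Logical (scaled) desktop resolution of a 3840x2160 panel at common DPI multipliers.
def logical_res(width_px: int, height_px: int, scale: float) -> tuple[int, int]:
    return round(width_px / scale), round(height_px / scale)

for scale in (1.5, 1.75, 2.0):
    w, h = logical_res(3840, 2160, scale)
    print(f"{int(scale * 100)}%: {w}x{h} logical")
# 150%: 2560x1440 logical
# 175%: 2194x1234 logical
# 200%: 1920x1080 logical
```

So 150% on a 4K panel gives the same working area as a 1440p screen, only sharper, which matches the "between 150% and 175%" preference.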

    • the
    • 6 years ago

    For that price, picking up three would be tempting. The problem is that I’d also have to pick up >$3000 in video cards in hopes of being able to drive all three at full resolution during gaming sessions.

    • Ryhadar
    • 6 years ago

    Wow, a PLS display at 4K for $399. I'm not all that interested in the higher resolution, but that's a pretty damn good deal. That said, if it's released as a Samsung consumer monitor, you know it's going to be missing VESA mounts, so that'll be a bummer.

    • blastdoor
    • 6 years ago

    I guess Intel is imagining this will spur PC sales. I think that's probably wrong. I don't think most people can tell the difference between HD and 4K on a 24-inch screen.

      • adisor19
      • 6 years ago

      They can if the OS supports it properly. See OS X.

      Adi

      • derFunkenstein
      • 6 years ago

      Why do you think high-DPI phones and tablets are all the rage? Why do MacBooks sell so well? People can tell the difference, and Windows needs to be optimized for it. Virtual resolutions are the way to go, I think: make something way bigger than the display (say, 5120×2880) and then scale it down to fit the panel for a crisp and clear virtual 2560×1440.

        • HisDivineOrder
        • 6 years ago

        They’re all the rage because people told them they’re all the rage. In China, octacore is all the rage not because it’s required, but because that’s what people told them was important.

        Here, quadcore is fine and so it’s used. Same goes for “retina” here. It became a big deal mostly because the quality of the Apple panels is better, but people assumed that was because of “Retina” and here we are, a few years later and all the industry is me-too’ing Apple once again in the only spec they can: resolution.

        If resolution was so all-important, we wouldn’t have home consoles releasing with sub-1080p games, would we? Resolution is only important where people told other people it was important.

          • Visigoth
          • 6 years ago

          ROFL…if you can’t tell the difference between a 1080p and a 4k panel, then you’ve got serious issues, my friend!

          • GTVic
          • 6 years ago

          For more than a decade serious computer users have been asking for a significant improvement in display technology to get beyond 4 dots/mm. This may not be the holy grail that they are looking for but to dump all over it because of disdain over TV marketing techniques is asinine.

        • briskly
        • 6 years ago

        Most flagship and former top phones have very high pixel densities. There isn't much to pick from if you want something made within the last few years. The MacBooks do have a very nice fit and finish, FWIW, and the Apple tax is less outrageous when other high-spec laptops also cost an arm and a leg without some design sensibility.

        This 4K for everyone campaign doesn’t seem so great for the monitor vendors, especially since a big motivation behind the 4K push is mending the low, low margins of the consumer displays market.

          • derFunkenstein
          • 6 years ago

          High densities in Android land go back to at least 2011 with the Galaxy Nexus, and then in early 2012 with HTC One X and Galaxy S3. Even the Moto G has a high-DPI 1280×720 display, and so do other midrange phones like the new HTC One Mini, Moto Droid Mini, and with any luck, the next Galaxy S mini.

          People don’t deal with this stuff on their phones anymore, so why when given a relatively low-cost option would they deal with it on their PC?

            • Airmantharp
            • 6 years ago

            Pretty much this. I don’t have to see pixels anymore, so I don’t want to- anywhere.

            • briskly
            • 6 years ago

            In smartphone land, last year's flagship is today's midrange model, in terms of similar spec or outright sliding the old model lower. Anyways…

            Unlike smartphones, I probably won't place my face within one foot of my monitor, so the need for ultra-high pixel density is less pressing. The higher processing power and more robust data signaling needed for these higher resolutions has, I'm sure, stopped things like the IBM T221 from taking hold of the market earlier. Given a low-cost solution, I don't see an issue as a consumer. I don't see it as a top priority, but it is nice.

            • derFunkenstein
            • 6 years ago

            It's not like there are "high DPI" and "low DPI"; there's a sliding scale. A 24″ 4K display is around 190 DPI. You don't need to be 12″ away to see a difference; 18-24″ away is plenty. Right now, as I type, I'm about 24 inches from the monitor (21.5″ 1080p), and I can definitely see how a 4K display would make a big difference at that size, even if the virtual resolution is still just 1920×1080.
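For what it's worth, the back-of-envelope density figures can be checked in a couple of lines; the diagonal pixel count divided by the diagonal size gives pixels per inch (a 24″ 4K panel actually lands a little under 190, but the point stands):

```python
import math

# Pixels per inch along the diagonal, for the panel sizes discussed in the thread.
def diagonal_ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(round(diagonal_ppi(3840, 2160, 23.6), 1))  # the $399 panel: ~186.7 ppi
print(round(diagonal_ppi(3840, 2160, 24.0), 1))  # a 24" 4K: ~183.6 ppi
print(round(diagonal_ppi(1920, 1080, 21.5), 1))  # a 21.5" 1080p: ~102.5 ppi
```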

        • Airmantharp
        • 6 years ago

        That's really the best solution, I think. Well, the BEST solution would be for every UI element possible to be vector-based, so that scaling becomes a non-issue.

      • rpwooste
      • 6 years ago

      If you sit closer than 37″ from the screen, you can see the difference, unless you have poor and uncorrected vision. At 16-20″ the difference is very obvious. You can see it in text applications, as text looks vastly sharper and you no longer see pixels; images have more detail (they're now "in focus" rather than blurry); even video has incredible detail; and games, if you can afford the graphics compute, are amazing.

    • south side sammy
    • 6 years ago

    Great, 4K displays for the masses… now all we have to wait on is the software engineers to catch up.

      • sjl
      • 6 years ago

      And the GPUs to push all those pixels for the gamers (and, to an extent, the non gamers as well; 30 Hz is a little bit … well. Laggardly is one way of putting it.)

        • Prestige Worldwide
        • 6 years ago

        Laggardly is putting it nicely

        I would prefer to say completely unplayable

          • xeridea
          • 6 years ago

          30Hz is totally playable so long as it is consistently maintained. If it were an average of 30 FPS, there would be very noticeable hitches in motion, but if 100% of the frames were delivered exactly 33 ms apart, it would be good. Not as smooth as a consistent 60Hz, but good luck getting 16.7 ms 99th-percentile frame times at 4K in modern games.

          At 23″ though, I would rather have 2560×1440, so I could potentially actually use it for the desktop without magnifying glasses. My ideal monitor would probably be 2560×1600 25″.
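The 33 ms and 16.7 ms figures above are just the refresh period; a quick sketch of the arithmetic:

```python
# Frame interval needed for perfectly even pacing at a given refresh rate.
def frame_interval_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (30, 60, 120):
    print(f"{hz} Hz -> {frame_interval_ms(hz):.1f} ms between frames")
# 30 Hz -> 33.3 ms between frames
# 60 Hz -> 16.7 ms between frames
# 120 Hz -> 8.3 ms between frames
```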

            • Airmantharp
            • 6 years ago

            No. Even setting the desktop to 30Hz is an extremely noticeable and ultimately hindering problem, even if it doesn’t affect the basic functionality of the software. People really don’t want it.

            I'd agree that some games would be 'okay', but anything that requires constant user input (that means not only fast-paced FPS but any FPS that requires aiming, any real-time RPG, and any real-time RTS) would be noticeably hindered by being limited to 30Hz versus the experience provided by displays at 60Hz and above.

            Might as well just not make the compromise in the first place.

        • south side sammy
        • 6 years ago

        I believe they have 60Hz models now.

        • Airmantharp
        • 6 years ago

        Depending on the game in question, of course, pushing pixels really isn't that big of a deal if you dial the settings back appropriately for the hardware available, assuming of course that the game in question is designed to have its settings scaled back enough.

        The larger problem tends to be VRAM, where the average 2GB to 3GB just winds up not being enough with all of the pixel-level effects that modern games employ. But again, if the game can fit into available VRAM, 4K is just another rung up the ladder. We once thought the same of 1024×768, 1600×1200, 1080p, 1600p, etc., and the problem will go away just as quickly (or even quicker, really) as the rest.

      • flip-mode
      • 6 years ago

      Shame on those software engineers! They should have had this done years ago.

      • Mad_Dane
      • 6 years ago

      Why the software guys? You can play Quake 1 in 4K; the question is, does your computer start crying? Unless some magical dust lands on next-gen GPUs, it will be years before a single chip can drive 4K in a AAA title with all the goods on at 60 FPS.

        • Airmantharp
        • 6 years ago

        Revise ‘years’ to ‘about one year’, and you’d be spot on, unless you’re talking about the absolute max insane settings that many AAA games include just for the sake of showing off how well they can not run on modern top-end systems.

        Remember that as resolution increases, the need for extreme levels of AA decreases, etc., and the fill rate of modern GPUs has actually been fairly high; it's all of the shader processors that need a bump, and they don't all rely on the speed of processing pixel-level effects.

      • Vaughn
      • 6 years ago

      Totally agreed with you here.

      This is the first step in the right direction, however… until Nvidia and ATi release single GPUs that can push 60 FPS at 4K, I still won't buy.

      I'm already on a 24-inch IPS 1200p screen, so next up is 30″ or 32″.

      When I buy a new display, it usually has to be substantially bigger than the last.

      The reason I haven't looked at the 27″ monitors out there is that the 3-inch jump in screen size isn't big enough for me.

      It's the reason I went from a 46″ LCD to a 64″ plasma: it actually feels like an upgrade!!
