More monitor makers using IPS tech in desktop LCDs

IPS panels have become increasingly common in tablets, and it looks like their popularity might rub off on desktop displays. DigiTimes is reporting that Mitsubishi, AOC, and Viewsonic are all dipping into IPS technology for desktop monitors. They’re not just doing so with big screens, either. All three have come out with 23-inchers based on IPS technology.

According to the story, IPS panels in the 19-22" range cost about $35 more than their TN equivalents. That gap grows as the screen size increases, although other factors can come into play. IPS panels are typically reserved for higher-end screens, so they’re often paired with extra goodies like additional video inputs, integrated USB hubs, and more adjustment options than budget LCDs.

The DigiTimes piece focuses on wider viewing angles as the key selling point for IPS displays. Improved color reproduction is usually cited as another key benefit, largely because IPS displays have typically offered eight bits per color channel, while TN technology uses six bits. IPS isn’t synonymous with 8-bit color anymore, though. Some cheaper panels use six bits per color channel in conjunction with Advanced Frame Rate Control (A-FRC), much like TN displays. Reviews of these so-called "6-bit + A-FRC" monitors look pretty positive, but I’ve not yet gazed upon one myself. If you see a good deal on an IPS monitor (particularly if it’s described as e-IPS), you might want to do some digging into the datasheet to figure out its true nature.
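For the curious, here's a rough sketch of how "6-bit + A-FRC" temporal dithering works in principle: the panel alternates between the two nearest 6-bit levels on successive frames, so the time-averaged output the eye integrates approximates the requested 8-bit shade. The four-frame cycle and the simplified level mapping below are illustrative assumptions, not any manufacturer's actual algorithm.

```python
# Illustrative sketch of temporal dithering (frame rate control):
# approximate an 8-bit shade on a 6-bit panel by alternating between
# the two nearest 6-bit levels over a short cycle of frames.

def frc_frames(target_8bit, num_frames=4):
    """Return the 6-bit level (0-63) shown on each frame of the cycle."""
    lo = target_8bit // 4          # nearest 6-bit level at or below target
    hi = min(lo + 1, 63)           # next 6-bit level up
    # Fraction of frames that should show the brighter level.
    frac = (target_8bit - lo * 4) / 4
    high_frames = round(frac * num_frames)
    return [hi] * high_frames + [lo] * (num_frames - high_frames)

def perceived_8bit(frames):
    """Time-averaged level the eye integrates, back on the 8-bit scale.
    Uses a simplified mapping (level * 4); real panels scale 0-63 onto
    the panel's full output range."""
    return sum(f * 4 for f in frames) / len(frames)

frames = frc_frames(129)   # 129 sits between 6-bit levels 32 and 33
print(frames, perceived_8bit(frames))
```

Flickering between adjacent levels at the panel's refresh rate is fast enough that most viewers see only the average, which is why reviews of these panels tend not to report visible banding.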

Comments closed
    • albundy
    • 8 years ago

    wow, it only took a decade, and right on time for OLED deliveries!

    • Wirko
    • 8 years ago

    I wonder how much the manufacturers save by shaving two bits from the D/A, lookup tables, and (part of) the signal-processing chain. Very little, probably, as 8-bit D/As were standard on VGA cards even 15 years ago.
    There must be another reason to steal bits, and it may have something to do with the cost of the TFT matrix itself, but I have no idea what that could be.

      • Chrispy_
      • 8 years ago

      TN panels use 6-bit instead of 8-bit in many cases to improve response time.

      I think a TN pixel reacts faster if it only has to “snap” to one of 64 different positions instead of 256 positions.

      This is why all the <2ms or 120Hz panels are 6-bit, while the 8-bit panels still have 5-8ms response times.

      I really don’t mind the dithering of a 6-bit panel, but I do hate poor black levels, ghosting through high response times, input lag and backlight bleed.

    • atryus28
    • 8 years ago

    I’m really pissed to find out, well after the fact, that my Dell U2311 is not 8-bit. Bastards. I bought this monitor over a year ago now, and more than six months later the truth comes out that it is not in fact an 8-bit panel, which was a major reason to get it. AArrggg, I HATE the liars department… I mean marketing.

    It is still a very nice monitor, but when I was specifically trying to AVOID 6-bit panels, that sucks. Much happier I only paid $240 for it, then.

      • bthylafh
      • 8 years ago

      If you were happy with it until you discovered the missing bits, I don’t see what’s wrong with it.

      I’d have been happier if my 2209WA was 8-bit, perhaps, but it’s still a freakin’ good monitor for the price.

        • atryus28
        • 8 years ago

        If you paid more money for a product because of its features, and then you found out it did NOT have those features, would you not be upset? What if I sold you a Lincoln (or any other high-end car of your choice) and you found out it was a re-badged Taurus? Even if you were happy with the car, would you not be upset you paid an extra $15-20K for something you did not get? Jokes aside about whatever you don’t like, it’s the principle.

          • Goty
          • 8 years ago

          You mean like the MKS? Or the Continental?

          • bthylafh
          • 8 years ago

          Eh, if it was a car I’d be more annoyed with myself for not doing sufficient research.

          I knew when I bought my 2209WA that $250 was crazy cheap for a 22″ IPS monitor, and that they must have done something to get the price down where that was profitable; it was advertised as “e-IPS” anyway, though nobody said what the exact difference was. So while I’m vaguely disappointed to learn that it’s a 6-bit panel, it’s not a huge deal, and IMO it’s still a good price for what I got.

      • ludi
      • 8 years ago

      If you were trying to “avoid 6-bits”, then how did you not notice problems when running your color calibration routines?

      On the other hand if you were trying to avoid the classic TN viewing angle and color-shift problems, then what’s wrong with the panel you’ve got?

        • atryus28
        • 8 years ago

        Still a problem to pay for a higher-end product and feature set you did not receive. I even used the site linked in the article to do my research on it. I have not yet gotten to the point of knowing whether the colors sent to the printer were accurate (the budget for some things got cut last/this year). Regardless, I may not be able to see if something is exactly off, but someone else I might do work for could, or the printer I send it out to could return a bad proof, etc. We’re not all rich and able to jump into full pro equipment.

      • morphine
      • 8 years ago

      A friend of mine bought that exact panel. I didn’t find any banding or dark shade problems with it, so as far as I’m concerned, this “6-bit + AFRC” thing does indeed work.

      And hey, if you hadn’t noticed until now… that means that it worked out OK for you too, no?

      • d0g_p00p
      • 8 years ago

      Well, you did buy a 23″ display at a cheap price; that right there should have told you that it was a cheap panel. However, Dell does list the U2311 as having an IPS panel and 16.7 million color depth. Again, though, the price and the resolution should have clued you in.

      If you were happy with it before you knew about the specs, why would that change?

    • Deanjo
    • 8 years ago

    Great, now give me increased resolution as well (16:10 ratio please, and keep that 16:9 crap for TVs that don’t have menu bars).

      • sydbot
      • 8 years ago

      If only the 16:10 tax were smaller; granted, it is easier to find tilt/height/swivel adjustments on 16:10 models.

    • Parallax
    • 8 years ago

    This is not new news. IPS displays have been available for about 15 years now.

    The point that consumers and manufacturers alike still completely miss is that the backlight is just as important as the LCD itself.
    Need a wide gamut display? That is determined by the backlight.
    Brightness? Backlight.
    Flickering? Backlight.
    Significant portion of cost? Backlight.
    Edge light bleeding? Backlight.
    Typical display lifetime? Backlight (or capacitors).

    I personally cannot comfortably use most new LCD displays (CCFL or LED backlights) for reading because the backlights flicker too much. This is because turning the brightness down relies on PWM (cycling the backlight on-and-off) quickly. The LCD itself may be absolutely beautiful and fast, but without a good backlight I won’t buy one.

      • anotherengineer
      • 8 years ago

      The same could be said of a CRT with a low (below 75Hz) refresh rate. (They flicker.)

      I did not know the brightness on lcds with ccfl used pwm and cycling to reduce intensity..interesting.

      What about LED backlit panels, couldn’t they use voltage control to adjust brightness??

      That being said, I use a Samsung 2233RZ LCD for gaming since it has a 120Hz refresh rate, and I do find it a lot smoother and easier on the eyes, with less fatigue and fewer headaches after spending a lot of time in front of it, compared to my old ViewSonic VP903b beside it. However, staring at any screen for a long time gives me eyestrain.

      I do find the viewing angles on the viewsonic (MVA) panel better than the samsung (TN) panel.

      Monitors like most things are personal preference.

        • Parallax
        • 8 years ago

        Both CCFL and LED backlit desktop displays seem to all use PWM at 175-220Hz to dim the backlight (note that the backlight and LCD update frequencies are not synchronized). While this speed is likely high enough to prevent flickering when looking at one spot on the display, it is NOT high enough to prevent flickering when your eyes are moving (like when reading text). The worst case seems to be when reading white text on a dark background, since when glancing from one point to another I can see multiple afterimages coinciding with the backlight cycles that are very distracting. For the case of CCFL I can also see color shifting in the afterimages, and have confirmed they are present due to the backlight by using a high-speed camera.

        AFAIK the high-brightness LEDs used in monitors like to be driven at a constant voltage and current because of heat issues, and so the only method left to dim them is PWM. I don’t know why nobody drives them at 1KHz+, since there should be no reason not to. Some smaller devices (e.g. iPad) may use some form of voltage or current control, but I have no proof of this as yet.

        I would love a 120Hz (or more) display, but for now viewing angles and bit depth are more important to me. Have you tried turning the brightness of your display all the way up (to prevent flickering) and reducing the brightness in the display drivers (so it’s not too bright)? This will probably give you crappy contrast, but I know it helps me a lot when reading.
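Parallax's observation has a simple back-of-the-envelope explanation: during an eye movement (a saccade), each PWM cycle of the backlight leaves a separate flash on the retina, so the lower the PWM frequency, the farther apart the afterimages land. The saccade speed below is an assumed round figure, not a measured one.

```python
# Sketch: angular spacing between backlight "flashes" swept across the
# retina during a saccade, one flash per PWM cycle. Illustrates why
# 175-220 Hz dimming can show multiple afterimages while 1 kHz+ would
# pack them too closely to distinguish.

def afterimage_spacing_deg(pwm_hz, saccade_deg_per_s=300.0):
    """Degrees of visual angle between successive backlight flashes.
    saccade_deg_per_s is an assumed typical saccade speed."""
    return saccade_deg_per_s / pwm_hz

for hz in (175, 220, 1000):
    print(f"{hz:>5} Hz PWM -> ~{afterimage_spacing_deg(hz):.2f} deg between flashes")
```

At 175 Hz the flashes land roughly 1.7 degrees apart, which is easily resolvable; at 1 kHz they would be a few tenths of a degree apart and blur together.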

      • Kaleid
      • 8 years ago

      From what I have read, most LCD backlights run at a much higher frequency than CRT refresh rates.

      “On smaller CRT monitors (up to about 15″), few people notice any discomfort below 60–72 Hz. On larger CRT monitors (17″ or larger), most people experience mild discomfort unless the refresh is set to 72 Hz or higher. A rate of 100 Hz is comfortable at almost any size. However, this does not apply to LCD monitors. The closest equivalent to a refresh rate on an LCD monitor is its frame rate, which is often locked at 60 frame/s. But this is rarely a problem, because the only part of an LCD monitor that could produce CRT-like flicker—its backlight—typically operates at around 200 Hz.”
      http://en.wikipedia.org/wiki/Refresh_rate

      This also seems to contribute to making LCDs much more suitable for reading text.

        • Parallax
        • 8 years ago

        I’ve personally measured several CCFL LCD monitors, and they all had backlights running at 175Hz.

        The flicker is not visible to me unless my eyes are moving, but this happens constantly while reading and gives me eyestrain. I do very much agree that LCDs are better than CRTs for text (and I’ve been using one since I could afford it), but they could easily be much better.

      • SPOOFE
      • 8 years ago

      “IPS displays have been available for about 15 years now.”

      And the market has been predominantly TN for quite some time, too. Greater penetration of IPS into the consumer space certainly is news.

    • anotherengineer
    • 8 years ago

    It would be really nice if laptop/notebook manufacturers gave the option of a TN, MVA or IPS panel when ordering them.

    I would gladly pay $50, maybe even $100, to have good viewing angles. Side-to-side angles are not too bad on newer laptop TN panels, but the up-and-down viewing angles leave a lot to be desired.

    • I.S.T.
    • 8 years ago

    Wider viewing angles are far easier to demonstrate to the unknowing consumer than better colors.

    Not only that, but you said it yourself: you need to become accustomed.

    It’s not odd at all why they focused on that. If I was writing that article, I’d do the same thing.

    • Frith
    • 8 years ago

    I don’t understand the obsession with IPS panels. I have a couple of 24-inch 16:10 IPS monitors and they’re just as bad as TN panels. LCD truly is the worst display technology ever created and only comes in varying grades of crap.

    Despite what you say, viewing angles are exceptionally important and remain one of LCD’s greatest weaknesses. If you look at an LCD at any angle other than 0 degrees, the colours become lighter, the contrast drops, and black starts to look more like white. You might think “that’s not a problem because I always sit straight in front of the monitor”, but the issue is that it’s impossible to look at the whole of a monitor straight on. When I look towards the corners of my monitor, the angle I’m viewing at is probably about 30 degrees. This means the colour uniformity across the screen is dire, and the colour recreation and contrast towards the edges of the screen are abysmal.

    Even without these issues LCD is awful at the best of times. Despite the bullshit marketing most makers put out about 1trillion:1 contrast and 0ms response time, LCD still has poor contrast, poor black levels, poor colour recreation, poor response time, issues with motion judder and most displays are still 60Hz which looks terrible if you’re used to 120hz.

    I’m still clinging to my Sony GDM-FW900 CRT as my main monitor and I’m praying that it holds out until OLED monitors become available. When you compare the FW900 side by side to an IPS LCD the difference is like night and day and the FW900 is vastly superior in every respect.

    LCD has been a total catastrophe for image quality and has led to a decade of woeful displays, and the agony is continuing as OLED delays continue. The Sony BVM-E250 is the only ray of hope at the moment, and I can’t wait until that filters down into consumer-level products.

      • bitcat70
      • 8 years ago

      Maybe a bit off-topic, but this is about contrast: at my local Best Buy I was looking at flat-screen TVs. There is a whole wall of them. Of course most are LCDs, but some are plasmas. Here’s the thing: all the plasmas seem to be washed out, like the blacks aren’t as deep as on the LCDs; the other colors seem comparable. How is that possible? I thought plasma was the better tech.

        • bthylafh
        • 8 years ago

        Plasmas are supposed to be better at displaying blacks. My guess is they were misconfigured; I think it’s a common practice in showrooms to have TVs run full-bright, because it looks better in a bright room.

          • CaptTomato
          • 8 years ago

          The TVs are placed in dynamic mode (high brightness and contrast) to counter the strong in-store lighting, and as I just said, LCDs are much brighter than plasmas, so they typically wallop them in-store. But at home the tables will be turned, especially if the LCD you buy has motion-handling issues, viewing-angle problems, or screen-uniformity problems (my LCD suffers from all three of these issues; my plasma suffers from none).

          LCD beat plasma to market with 1080p, and I think used less electricity doing it, so it may have given some people the impression that LCD was the real king of HD. Not to mention that many manufacturers dropped out of plasma production, including the mighty Pioneer, so it’s not unreasonable for the average person to assume LCD is no. 1, especially when it also looks better in-store.

          If, however, your LCD suffers from any of its inherent problems, you might be inclined to have a rethink.

          As long as you don’t overwhelm a plasma with excess light (especially sunlight), most people will get a kick out of even Panasonic’s base-model 1080p unit.
          I also see the “it looks like it was shot on video” effect on large LCDs; as such, no large LCD is of value to me.

            • bthylafh
            • 8 years ago

            Och aye, 1080p plasmas use a ton of energy. I think Panasonic’s were using ~550W at 42″ versus ~120W for an LCD of the same size, and that was without LED backlighting.

            I went with a 42″ Panny 720p unit; we don’t sit close enough for extra res to matter, don’t have a Blu-Ray player, and the only console hooked up is a Wii.

            • CaptTomato
            • 8 years ago

            This is not true anymore though….the extra cost is $1-$2 a week more….not so much to pay for superior tech.

            • bthylafh
            • 8 years ago

            …ignoring the aircon bill during the summer. 550W is a fair old bit of heat.

            • CaptTomato
            • 8 years ago

            180W for a 50″ 1080p Panasonic plasma:

            http://www.hdtvtest.co.uk/news/panasonic-tx-p50vt30b-p50vt30-201106061171.htm

            Sounds like you're quoting figures from 4-6 years ago.

            • bthylafh
            • 8 years ago

            It was… two and a half years ago that I got my plasma after quite a bit of research. Good to know that they’ve improved quite a bit since then.

            • nerdrage
            • 8 years ago

            I have a 2010 model (fairly high-end) Panasonic plasma 55″ and I believe the manual said it consumed 600 watts. But I think that may be the “peak” power draw. Don’t have a meter to confirm this.

            • CaptTomato
            • 8 years ago

            2010 = 54″… and yes, 600 is max draw, but average consumption will be much less. There’s no doubt that high-end plasmas are very energy efficient these days, and people need to be aware of how cheap these things are to run; it actually doesn’t cost that much to run a 50-55″ HDTV in 2011.
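The running-cost figures tossed around in this sub-thread are easy to sanity-check. Here's a quick sketch using the wattages mentioned above; the daily viewing hours and electricity price are assumptions, so plug in your own.

```python
# Rough weekly running cost for the TV wattages discussed in this
# thread. Hours per day and $/kWh are assumed values, not measurements.

def weekly_cost_usd(watts, hours_per_day=5, usd_per_kwh=0.12):
    """Energy cost per week: kW * hours/week * price per kWh."""
    return watts / 1000 * hours_per_day * 7 * usd_per_kwh

for label, w in (("older plasma (550 W)", 550),
                 ("newer plasma (180 W)", 180),
                 ("LCD (120 W)", 120)):
    print(f"{label}: ${weekly_cost_usd(w):.2f}/week")
```

Under these assumptions the older 550 W plasma costs roughly $1.80/week more than a 120 W LCD, while a newer 180 W plasma costs only about $0.25/week more, which is broadly consistent with both claims in the thread.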

        • Theolendras
        • 8 years ago

        You may want to calibrate the display; usually the default settings push brightness through the roof, since the manufacturer knows you’re likely to see it in a well-lit store with demos full of saturated colors. Brightness makes a better splash than contrast, deep blacks, and color reproduction to Joe Sixpack, I guess…

        • Frith
        • 8 years ago

        Ambient light conditions impact the contrast of a display, and in high ambient light an LCD will outperform a plasma. In a pitch black room the situation will reverse and plasma is far better.

        Anyone who is concerned about image quality will always watch films in a dark room so as to get the most from their display. Comparing displays in Best Buy is therefore not a very effective way to find which television offers the best picture since the light conditions probably won’t match that of your room.

        • CaptTomato
        • 8 years ago

        “I thought plasma was the better tech”

        Overall it is; however, plasmas aren’t as bright as LCDs, but the LCD brightness is useless at home unless you’re placing it in a well-lit room.
        What you’re seeing in-store is just the extra brightness, but at home a good plasma will kick ass: no meaningful motion blur, near-perfect if not perfect screen uniformity, strong full-screen black levels, wide viewing angles, etc.

        Plasmas are also better at 3D, but 3D is of no interest to me.

      • Kurkotain
      • 8 years ago

      “and only comes in varying grades of crap.”

      I laughed so hard.

        • Frith
        • 8 years ago

        That’s because you don’t have a clue what constitutes a good image, and you’ve never had the opportunity to compare a high-quality display to an LCD. In short, you’re completely ignorant.

        The BVM-E250 is a product aimed at broadcast professionals who know what a high quality image looks like. At the presentation announcing it Sony talked about how they were constantly getting complaints about their LCD broadcast monitors with people saying “the colours are terrible” and “the blacks aren’t deep enough”.

        They compared OLED, CRT and LCD in terms of their primary advantages and the advantages listed for LCD are that it’s light weight and has low power consumption. It shows how bad LCD is when these are the only benefits it has.

          • SPOOFE
          • 8 years ago

          “the advantages listed for LCD are that it's light weight and has low power consumption. It shows how bad LCD is when these are the only benefits it has.”

          I think that shows how most people prefer a display that doesn't require a forklift to move around. CRTs had better have a good viewing angle, because you're sure as shootin' not moving the thing as soon as you place it.

      • morphine
      • 8 years ago

      I have some trouble believing that your CRT after ~3 years of frequent use still has a black level that can beat a modern LCD. The reason why I dropped my own Sony GDM-series monitor was that the phosphor had worn out and it was getting bright and blurry.

      Response time and blurring are very much a non-issue these days. Input lag is a far worse problem, but recent models have been getting much, much better in that regard. These days, in practical terms, the only “real” complaints can be directed at the contrast ratio and black depth. And of course, there’s the whole “incredibly sharp picture” thing that LCDs have 🙂

      Also, there’s the matter of cost: did you spend as much on those 24″ IPS monitors as you originally did on your CRT? You’ll find that Really Damn Good monitors do exist, but they’re expensive. Such is the way with everything.

        • Frith
        • 8 years ago

        As phosphors age the level of light output declines, and over 10 years the light output will decline by about 50%. For you to say that your CRT was “getting bright” because it had “worn out” does rather show that you’re talking out of your rectum.

          • morphine
          • 8 years ago

          How polite.

          Phosphor wearing out = picture going dark = brightness/contrast going up to compensate = black level being lost, blurriness introduced.

          But hey, that was only my rectum talking. *brraapp*. That, and the three repairmen I talked to.

            • Frith
            • 8 years ago

            I’m not sure where the “brightness/contrast going up to compensate” phase comes in. If we’re talking about Panasonic plasmas, then yes they did add a “feature” which automatically increases the white level as the display ages, but we’re not talking about Panasonic plasmas.

            The light output on a 10 year old CRT is more than satisfactory in a dark room, so there is no significant need to increase the white level to compensate for the lower light output.

            • SPOOFE
            • 8 years ago

            “I'm not sure where the ‘brightness/contrast going up to compensate’ phase comes in.”

            Well, when the display people look at is dim, they adjust the brightness/contrast. Isn't that what you do if the picture you're looking at isn't preferable? Adjust it 'til it is?

            • Anonymous Coward
            • 8 years ago

            “The light output on a 10 year old CRT is more than satisfactory in a dark room”

            I primarily occupy lit rooms.

      • Vasilyfav
      • 8 years ago

      Are you also clinging to single core cpus, wired ball mice and floppy drives?

        • Frith
        • 8 years ago

        I prefer “better” while your priority seems to be “newer”. If we compare the performance of a modern CPU to an older one the modern CPU will come out faster, so I prefer a modern CPU. If we do a side by side comparison between a high quality CRT and an IPS LCD the CRT will win in every respect so I prefer the CRT.

          • indeego
          • 8 years ago

          “If we do a side by side comparison between a high quality CRT and an IPS LCD the CRT will win in every respect so I prefer the CRT.”

          You win! The market has swayed from your superior opinion, and you need to lead us into the light.

            • kroker
            • 8 years ago

            Just because a market moves in a certain direction doesn’t mean it’s better. Look at megapixels, 16:9 for MONITORS etc. But yeah, CRT needed replacing. And so did VGA!

            • SPOOFE
            • 8 years ago

            The direction that a market moves indicates what the market WANTS. The market is tired of humongous and heavy displays, even if it means you’re only seeing 16 million colors and can’t see the screen from 179 degrees.

          • CaptTomato
          • 8 years ago

          Problem for me is that SIZE matters to image quality, so even if older high-end CRTs haven’t degraded, they’re still too small to do justice to the world of HD, including how much more lifelike hi-res photos look.

        • kroker
        • 8 years ago

        Ball mice weren’t that bad. I wasn’t that annoyed that I had to clean them once in a while. I’ve never had the same accuracy in games with a cheap optical mouse that I had with a cheap ball mouse. You need a more expensive optical mouse with adjustable weight and a high end mouse pad nowadays.

        And let’s just forget floppy drives ever happened 🙂

      • Krogoth
      • 8 years ago

      Wow, you are so out of touch with reality that it isn’t funny.

      There isn’t a technological solution that can perfectly replicate what the human eye (technically, the visual cortex) can perceive in terms of color acuity. However, our visual cortex is very good at filling in the missing pieces when push comes to shove. That’s why the majority find LCDs to be more than adequate for their needs, despite some of their known issues.

      CRTs aren’t perfect either; they have imaging issues of their own that you seem to quickly forget. They mostly deal with screen geometry. Discoloration can happen on them as well; it’s mostly down to how well the electromagnets can drive the beams onto the phosphor surface. Sufficient EMI can easily distort the beams (e.g., powerful magnets from a subwoofer/amp). Their image quality also depends on the analog wiring and the RAMDAC quality of the video card in question. There’s a reason why Matrox used to be hailed for their 2D image quality back in the day: they consistently over-engineered the RAMDACs on their video cards. It wasn’t the case with other video card vendors.

      There’s a reason why CRTs got phased out. They are more expensive to manufacture due to the need for heavy metals in the shielding and the glass. They require far more power and volume, and they are massive, especially the 19″+ behemoths, when compared to an LCD of similar screen size. For the same reasons it is difficult to scale CRTs up to 30″. There has also been a general push for manufacturers to move away from heavy metals due to environmental concerns. IIRC, there’s already been a huge fuss over moving electronics away from lead-based solders; can’t imagine CRTs getting a free pass. There’s already a big problem with disposing of dead units.

      CRTs’ benefit of having “superior” colors only matters to certain niches (content creators, artists) and requires proper source material (not games), proper conditions (room lighting matters a lot), and proper screen calibration (most people don’t do this) to be noticeable. Viewing angles are more noticeable, but they typically affect spectators, not the primary viewer. It is mostly an issue with budget LCDs and practically a moot point for a quality IPS unit.

      Don’t hold your breath for OLEDs and SEDs to be the holy grail. They each have their own issues as well: OLEDs = lifespan and scalability; SEDs = trapped in legal/patent hell, along with unknowns on scalability/longevity.

      You overplay LCDs as making image quality “horrible”. You don’t know or remember the old days. Most CRTs used to have shoddy image quality (by today’s standards), low resolutions (320×200 or lower), and only one phosphor color (amber, green, white). A full-color, quality CRT used to cost a pretty penny (several thousand USD in the 1970s-1980s). Full-color CRTs didn’t become decent until the mid-to-late ’90s. They still weren’t cheap, but you get what you pay for.

      • cynan
      • 8 years ago

      I had a couple of 21″ CRTs with Sony Trinitron tubes (one was a Sony, the other was a rebadged Sun) that I used throughout the mid 2000s. Then I finally got a Dell 24″ Ultrasharp in 2008 (Samsung PVA panel).

      The LCD was sharper and had comparable contrast. The color reproduction may have been a tad better on the CRTs, but, meh, not so much that it bothered me. When I got the 30″ Dell Ultrasharp the following year (IPS, of course), the difference in color reproduction capability was pretty much moot, and out went my beloved CRTs for good. The one thing that annoys me from time to time with my Dell IPS monitor is that darn anti-glare coating. You get used to it eventually, but I’ll never know why they had to make the grain quite so pronounced.

      And as for CRTs, I remember wasting many minutes, every so often, obsessively trying to get an image that perfectly filled the screen, incessantly playing around with all those trapezoidal and parallelogram geometry settings, etc, that CRTs used to come with to allow users to self-flagellate trying to get a perfectly fitted rectangular image on a screen that wasn’t quite flat. Good riddance to all of that with LCDs.

      I was really happy with my Trinitron CRTs while I had them, but I’d never go back, especially once you factor in the size, weight and esthetic improvements of modern LCDs.

        • JustAnEngineer
        • 8 years ago

        My brother still has Viewsonic PF815+ (22″ Trinitron) and PF95+ (19″ Trinitron) monitors that I gave him. My UltraSharp LCD monitors are significantly better.

        I recently gave my Dad my old KD-34XBR960 (34″ 16:9 HD Trinitron) television. It’s still got a good picture, but geometric distortion is annoying when compared to a modern LCD or plasma television. Plus, the darned thing weighs 200 lbs. Moving it was fun.

          • SPOOFE
          • 8 years ago

          Years back I helped an old theater director move out of his house; he had two 32″+ Trinitrons, plus some really large projection TV (~50″, I’d guess). Yeah, I remember moving those TVs, and this was when I was in crappier shape.

          That weight is one heckuva cost for “quality”.

      • kroker
      • 8 years ago

      I had a Dell P1130 (21″ Trinitron) before I bought my first LCD (24″ P-MVA). Now I have a HP ZR24W 8-bit IPS 24″ screen. The only thing I miss from my old Trinitron monitor is the incredibly sweet fluid motion in v-synced games running at 80-120Hz/fps (the monitor could do 120Hz at 1600×1200 but the Nvidia 7900 GS I had back then couldn’t always keep up).

      However, I could never get that darn monitor to perfectly focus text on the horizontal axis, and the focus changed slightly as the monitor warmed up. Luckily Cleartype helped make text more readable. But still annoying!

      • odizzido
      • 8 years ago

      I agree with almost every point you make. However, there is one huge benefit for me in using LCD monitors: they don’t burn my eyes or have an unstable image like plasma/CRT displays do. At about 100Hz, CRTs aren’t bad to look at, but I still prefer the stability of an LCD.

      • internetsandman
      • 8 years ago

      And this, ladies and gentlemen, is why we don’t feed trolls =)

      • albundy
      • 8 years ago

      “I’m still clinging to my Sony GDM-FW900 CRT as my main monitor and I’m praying that it holds out until OLED monitors become available.”

      Why bother? I’m sure you’ll complain about that too. Stick with your radiation tube! It’s great for your eyes!

      • Dashak
      • 8 years ago

      I have a Sony GDM-FW900 too. It’s awesome for gaming, but nothing I’ve seen beats H-IPS for anything else.

    • bthylafh
    • 8 years ago

    Are these new IPS panels 8-bit, or are they 6-bit? ISTR that the cheaper IPS monitors are 6-bit.

      • Dissonance
      • 8 years ago

      Good catch. Updated the article.

      • Voldenuit
      • 8 years ago

      As Geoff mentioned in his news post, it’s the E-IPS panels that are 6-bit. AFAIK, LG is the only company producing E-IPS panels at the moment, so it might be worth researching who makes the panels even if the panel type is not mentioned in the product literature.

      Something worth considering is that even 8-bit panels can show banding unless they are using a 10-bit color table (or have a very well tuned color gamut with an 8-bit table).

        • bthylafh
        • 8 years ago

        Look a little further down the thread – he didn’t mention that until I brought it up.

    • btb
    • 8 years ago

    Good! About time the general public found out about the wonders of IPS 🙂

      • boing
      • 8 years ago

      I felt it was time to replace my five-year-old 19″ TN monitor and jumped on the IPS hype, purchasing a 23″ IPS monitor with an LED backlight. Other than the new monitor being (quite obviously) bigger and having much better contrast, I really cannot tell much difference. Sure, the colour reproduction is better, but not by much. And as for viewing angles, I must be a very odd one, because I like to sit in front of my display, so that’s never been a concern of mine.
