Chrome being optimized for high-density displays

The new MacBook Pro’s Retina display is pretty nifty. Few applications are designed to really take advantage of all the extra pixels, but Google is getting an early start. Chrome Canary, a sort of beta testing ground for new Chrome features, now takes advantage of the high-res screen. Behold:

Looks like the user interface and the font rendering have both been tweaked. Google indicates it has more work to do over the next few weeks, and I’m curious to see how long other browsers and applications take to follow suit.

The MacBook Pro isn’t the only system with a high-density display, of course. Asus’ new 11.6" and 13.3" Zenbooks feature 1080p displays. Those screens may have fewer pixels per inch than the MacBook, but they’re huge improvements over the 1366×768 resolution all too common in modern notebooks.

Comments closed
    • jbraslins
    • 7 years ago

    The picture in the article illustrates how fonts benefit from the Retina display in OS X. Chrome has to ship an update to let OS X render its fonts, or do something extra in its own code.

    What I find more interesting is that for eons, web developers would laugh at anyone who kept just one high-resolution image on their site and simply set the width and height of IMG tags in HTML to create smaller versions and thumbnails.

    Supposedly, that is exactly what you need to do to make your website Retina-friendly. If your images are actually much larger (let's say 300×300) but your IMG tags specify 100×100, it will look about the same on non-Retina displays, because the browser scales the image down, but it will be displayed at a higher dpi in browsers that are Retina-aware (currently Safari only).
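
    The trick above can be sketched in plain HTML; the file name and sizes here are illustrative assumptions:

```html
<!-- A 300×300 source bitmap displayed in a 100×100 layout box.
     A non-Retina browser scales it down; a Retina-aware browser can
     use the extra pixels to draw a sharper image in the same box. -->
<img src="logo-300.png" width="100" height="100" alt="Site logo">
```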

    I believe the apple.com website serves a different set of images when it detects a Retina-capable user agent. I doubt most developers will go that far; most will likely just bloat their site for everyone.
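
    One common way to do that kind of swap without bloating every download is a device-pixel-ratio media query. A sketch only — the selector, file names, and 2x breakpoint are illustrative, not necessarily what apple.com actually does:

```html
<style>
  /* Standard-density asset by default. */
  .hero { background-image: url("hero.png"); }

  /* Double-resolution asset on 2x displays. The vendor-prefixed
     -webkit-min-device-pixel-ratio form is what Safari understands. */
  @media (-webkit-min-device-pixel-ratio: 2) {
    .hero { background-image: url("hero@2x.png"); }
  }
</style>
```

    This only swaps CSS background images; swapping IMG tags still takes script or server-side detection.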

      • bhtooefr
      • 7 years ago

      I still say there needs to be an extension to HTTP that lets a client report its display's ppi, so the server can use that information to serve the correct image.

      Then, use width and height attributes so the browser knows how to fit that image into the page.

      Either that, or do it via URLs, but you could run into filename collisions there.
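
      A hypothetical exchange along those lines might look like this; the X-Device-PPI header is invented here for illustration and is not part of any HTTP specification:

```http
GET /images/hero.png HTTP/1.1
Host: example.com
X-Device-PPI: 220

HTTP/1.1 200 OK
Content-Type: image/png
Vary: X-Device-PPI
```

      The Vary header would tell intermediate caches that the image served depends on the reported density.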

    • Bensam123
    • 7 years ago

    Sorta looks like a version of anti-aliasing.

    • brucethemoose
    • 7 years ago

    So, do those of us with 1600p or 1440p displays get any high dpi love? It is “quad” 1280×720 and 1280×800, after all.

    Chrome Canary doesn’t seem to have changed much for me.

      • cynan
      • 7 years ago

      But who would want to run their 30″ monitor at 1280×800 to take advantage of it? Everything might look crisp, but you lose all that extra real estate you paid for.

    • Stargazer
    • 7 years ago

    Why would the font rendering require any special tweaking? Doesn’t it already depend on the resolution? (i.e. for a given letter size it’s already using more pixels at a higher resolution)

    Also, is it just me, or is the “Chrome ball” just as blocky on the right side as on the left side?

    It sort of seems like the only difference you can really be sure of is the thinner lines in the chrome (not Chrome) section on the right side.

      • DancinJack
      • 7 years ago

      Per the article below, Chrome does some off-screen rendering before displaying its contents on the page, and that process isn’t aware of the high resolution used in the Retina display. I don’t know quite enough to elaborate, so I’ll just take Anand’s word for it.

      http://www.anandtech.com/show/6002/chrome-canary-fixes-rendering-issue-with-retina-macbook-pro

      I don't think it's nearly as blocky. I also think the properly rendered version looks better in every aspect, not just the lines you're talking about.

        • Stargazer
        • 7 years ago

        Thanks for the link. I haven’t had time to look at it yet, but I’ll do so later.

        About the blocks, when zooming in, it seems to me that the red blocks are the same size on both sides of the line.

        I’m not sure how you can tell that it looks better “in every aspect,” since aside from the lines and the font rendering, there aren’t many things you can actually compare between the two sides. The back/forward/reload buttons only exist on the left side, and the same goes for the green/red/yellow Mac OS blobs. The padlock and URL text exist only on the right side. The slopes and rounded corners in the UI are only on the left side.

        The only things you can find on both sides are the font rendering, the Chrome ball (which to me seems to have an equal block size at least in the red region), and some straight lines. So, in the chrome section, the only real difference we can be sure of are the straight lines. Or am I missing something?

          • DancinJack
          • 7 years ago

          You’re not missing anything. You’re right, we don’t have much to compare but the text. I suppose when I said “every aspect,” I was referring to the text and the lines since that was all I was comparing. I don’t know if the buttons and such will require higher resolution sources, or if the newer versions of Chrome will just render them “retina aware” to make them look better.

    • GasBandit
    • 7 years ago

    Can we get Chrome optimized not to take 30 minutes between keystrokes when typing URLs in directly?

      • DancinJack
      • 7 years ago

      Mine doesn’t do that. I’ve also never heard anyone else complain about that. Weird.

        • OneArmedScissor
        • 7 years ago

        Neither have I.

        I think I’d uninstall it, delete all of the leftover files with CCleaner, and then reinstall from the Google site on the stable channel.

      • humannn
      • 7 years ago

      Mine doesn’t do that.

    • syndicatedragon
    • 7 years ago

    Of course, if Apple hadn’t obfuscated the resolution in the first place and implemented proper UI scaling, apps wouldn’t need to have “special” support for Retina.

      • BobbinThreadbare
      • 7 years ago

      Did you even read this article?

      • crsh1976
      • 7 years ago

        Well, technically they still haven’t implemented actual scaling; they just quadrupled everything (2× in each dimension), and it’s still cast in stone.

        • bhtooefr
        • 7 years ago

        They tried actual scaling, it broke so much stuff that they had to back out of it.

        You’re not getting actual scaling that works unless it’s on a new OS that explicitly breaks backwards compatibility with everything else out there.

    • jdaven
    • 7 years ago

    Now we have a new hardware spec for software companies to catch up with and they still haven’t fully caught up with the other specs out there:

    Full 64-bit environment
    Full multi-threaded environment
    General computing using the GPU

    and now

    Support for high resolutions. When will the madness end!?

      • yogibbear
      • 7 years ago

      Lackadaisical support for multi-monitor desktop use
      Touch capabilities
      Hyper Brain-link archiving
      Motion sensing input devices

      • OneArmedScissor
      • 7 years ago

      “When will the madness end!?”

      When Macs switch to user-changeable parts and OS X runs all games.

      • Frith
      • 7 years ago

      Let me ask you this: why don’t you demolish your house and rebuild it? Houses are built differently now than they were twenty years ago, so why not knock your house down and rebuild it using more modern construction methods?

      Doing such a thing is obviously idiotic. It would take considerable time and cost a lot of money to demolish and rebuild your house, while delivering little or nothing of benefit. What you’re suggesting is equally idiotic: it would be foolish for software companies to spend a lot of time and money rewriting their software if it won’t deliver any tangible benefits.

      Take multi-threading. Some applications, such as video encoding/decoding, benefit greatly from multi-threading, which is why most video encoders/decoders are multi-threaded. However, the vast majority of software doesn’t lend itself well to multi-threading, and you’d just be adding extra threads for the sake of it. Furthermore, most software runs perfectly well in one thread, and the user would not notice any difference in performance if it were multi-threaded. Only a fool would suggest that software companies invest time and money rewriting software that will see absolutely no benefit.

      GPU acceleration is similar. GPUs are very good at large batch calculations but very poor at branching, so most programs would benefit very little from GPU acceleration. OpenCL is troublesome and CUDA is Nvidia-only, so rewriting your software to take advantage of the GPU would be expensive and time-consuming. Again, it would be idiotic for a software company to spend time and money on GPU acceleration if it delivers no tangible benefits.

      Recompiling for 64-bit is similar. While it requires vastly less effort than the other two, there are still risks involved, and in most cases little to be gained.

      You might think, “OMG! Multi-threading is the r0xors!!!11!!11 We need it in every application!!!! And GPU acceleration az well!!!”, but software companies take a rather more practical view of things. No software company will spend time and money, and risk introducing bugs, to do something that in no way improves their software. So no, software companies don’t need to “catch up” any more than you need to demolish and rebuild your house.

      High-density displays are the one matter I agree with you on, since they do deliver very tangible benefits in all software. The problem is that until high-dpi monitors are actually on the market, there’s no real point in developers reworking their software to take advantage of them.

        • moose17145
        • 7 years ago

        “No software company will spend time and money, and risk introducing bugs, to do something that in no way improves their software.”

        Well... except Microsoft...

          • Parallax
          • 7 years ago

          …and Adobe.

      • burntham77
      • 7 years ago

      I am still floored by the fact that Windows 8 will be offered up in 32-bit versions. Seriously, Microsoft, just let it go. Even my parents are using the 64-bit versions of Windows 7. Is Microsoft saying they’re more out of date than my technologically retarded father?

      (Note, my father has referred to himself as such, so no disrespect intended.)

        • indeego
        • 7 years ago

        16-bit legacy code exists, is in use for people in critical line-of-business situations, and it’s a market Microsoft can’t quite ignore. Microsoft is likely well versed in determining the cost/benefit ratio of pulling features versus moving people forward into the future.

        32-bit won’t be offered as a default install for almost any OEM, period, so it’s really not anything you need to worry about.

          • bhtooefr
          • 7 years ago

          Also, there’s plenty of Atom devices out there with their 64-bit support disabled.
