Eyefinity pushes over 24 million pixels with one next-gen Radeon

Say you’re AMD, and you make graphics chips that nearly double in performance with every generation. Yet games haven’t been getting all that much more demanding over time. What would you do with all of that excess power, especially if you wanted to stir up interest in your latest product?

AMD’s answer at the moment is a new feature it calls Eyefinity. Here’s the basic concept: through the magic of its next-generation GPU and an array of compact DisplayPort connections, a single GPU can drive up to six high-megapixel displays for gaming at resolutions that boggle the mind. The example I saw in action today looked like this:

Dead Space at over 24 million pixels

That’s six Dell 30″ monitors, each at a resolution of 2560×1600, showing Dead Space at over 24 megapixels. The game ran fluidly, and as you can see, your character on screen is pretty much life-sized, if not a little larger.
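The arithmetic behind that headline number checks out; here’s a quick back-of-the-envelope sketch (the grid layout is from the demo, the code is purely illustrative):

```python
# Six 30-inch panels at 2560x1600, arranged three wide by two high.
panel_w, panel_h = 2560, 1600
cols, rows = 3, 2

total_w = panel_w * cols          # 7680
total_h = panel_h * rows          # 3200
total_pixels = total_w * total_h  # the "over 24 million pixels" figure

print(f"{total_w}x{total_h} = {total_pixels / 1e6:.3f} megapixels")
# 7680x3200 = 24.576 megapixels
```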

7680×3200 resolution, anyone?

I didn’t snap a picture of it, but I checked the back of the PC accomplishing this feat, and all six of the DisplayPort connections were plugged into a single expansion slot. Not only did games like Left 4 Dead and World of Warcraft play smoothly, but DiRT 2, a DirectX 11 game, ran at more or less acceptable frame rates and looked stunning doing it, as well.

Supporting six monitors with a single graphics card will require a specialized board, since most don’t ship with six DisplayPort connectors across the back. But most cards based on the forthcoming Radeons will be able to drive three ultra-high-res displays of various types. Here’s AMD’s Dave Baumann showing off DiRT 2 on a triple-monitor setup.

If you’re an old-timer like me, you’re probably having flashbacks to the days of the Matrox Parhelia and TripleHead, a similar feature that didn’t fit well with the Parhelia’s inability to run games at sufficient speeds on even a single display.

By contrast, AMD says quite a few of today’s games run just fine at such mega-resolutions. Part of the trick to making this work is that Windows sees the array of monitors as a single display device, which helps with game compatibility. AMD’s drivers then handle the task of setting up the displays and coordinating their relative positions. I got a quick look at an early version of the control panel wizard designed for this task, and it’s already reasonably straightforward to use.
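AMD hasn’t detailed how its drivers pull this off internally, but the general idea of exposing one large logical surface and scanning out per-monitor slices of it can be sketched roughly like this (all function and variable names here are hypothetical, not AMD’s):

```python
def monitor_viewports(panel_w, panel_h, cols, rows):
    """Split one large logical framebuffer into per-monitor source rectangles.

    Returns a dict mapping (col, row) grid positions to (x, y, w, h) slices
    of the single surface the game renders into.
    """
    return {
        (col, row): (col * panel_w, row * panel_h, panel_w, panel_h)
        for row in range(rows)
        for col in range(cols)
    }

# The game sees one 7680x3200 display; each monitor scans out its own slice.
regions = monitor_viewports(2560, 1600, cols=3, rows=2)
print(regions[(0, 0)])  # (0, 0, 2560, 1600) -- top-left monitor
print(regions[(2, 1)])  # (5120, 1600, 2560, 1600) -- bottom-right monitor
```

Because the game only ever sees one display device at one big resolution, no per-title support is needed, which is presumably what makes the compatibility story work.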

AMD seems to think Eyefinity could be a pretty compelling feature for some folks, and no doubt playing DiRT 2 on a 24+ megapixel display is an interesting experience, at least. Part of AMD’s pitch for Eyefinity is based on the realities of monitor pricing: three relatively nice displays could be had for the price of a single 34″ high-density monster. So why not build a gaming setup with three displays instead?

Although the demo we saw today was based on conventional Dell 30″ monitors, AMD has been working with Samsung on Eyefinity support and has plans involving monitors with very narrow bezels, so that many displays can act together as one with a minimum of visual interruption. The company may also incorporate a feature in its graphics drivers to compensate for the visual offsets caused by bezels. Still, if this takes off, three-display setups will almost certainly be the most popular variation, because four- and six-display configs will have the display edges interrupting the dead-center focal point—where the crosshair goes in first-person shooters, among other things.

One can’t help but think of Eyefinity as a rival of sorts to Nvidia’s GeForce 3D Vision scheme, which involves per-game compatibility profiles, steep GPU requirements, specialized 3D glasses, and 120Hz displays—or Jaws 3D-style red-and-blue glasses in its ghetto guise. The question is whether Eyefinity will gain any more traction than Nvidia’s 3D tech has with consumers. I have my doubts. The peripheral vision afforded by a wrap-around three-way display might be nice for locating zombies in Left 4 Dead, but one wonders whether having so very many pixels—and the GPU requirements that come along with them—really makes sense when another option is planting a giant-screen HDTV in front of your gaming rig and calling it good.

Then again, ultra-megapixel monitor arrays aren’t just good for gaming. Google Earth looks mighty impressive at 7680×3200, as well, and workstation users don’t have the budgetary constraints of your typical gamer, either.

Comments closed
    • 0g1
    • 10 years ago

    Impressive to pump 24MP. 2MP is fine for me, and this card could theoretically do that 12 times faster :). 24MP is about the only useful way to spend that much power on these old games.

    • moritzgedig
    • 10 years ago

    “Then again, ultra-megapixel monitor arrays aren’t just good for gaming. Google Earth looks mighty impressive at 7680×3200, as well, and workstation users don’t have the budgetary constraints of your typical gamer, either.”
    But that would often be 2D, something entirely different.
    I think different display modes would help better for 3D CAD, like one screen at X% zoom and another at 3X% to edit details but not get lost in the close-up.

    • Kaleid
    • 10 years ago

    Cry-Engine 3 for PC (First Time: Ati Eyefinity)
    http://www.youtube.com/watch?v=04dMyeMKHp4

    AMD Shows off 24(!) Panel Eyefinity on DirectX 11 Graphics Cards
    http://www.legitreviews.com/article/1069/1/

    It seems pretty amazing:
    http://img.techpowerup.org/090911/2322.jpg
    http://img141.imageshack.us/img141/7429/5800c.jpg

    The card stripped:
    http://www.techpowerup.com/103599/AMD_Cypress__Radeon_HD_5870__Stripped.html

    • liquidsquid
    • 10 years ago

    This just screams to have at least three projectors in a large room with a thundering audio system!

    • sigher
    • 10 years ago

    Perhaps they can make displays with one or two borderless sides and keep the other two borders; that way you can still route cables and whatnot and keep it sturdy, and the consumer only has to pick a left-side or right-side model. If you sell enough, it shouldn’t be too big a hurdle to offer two or more models, surely.

    • StashTheVampede
    • 10 years ago
    • Meadows
    • 10 years ago

    Terrible. The way those are arranged is far too wide for almost any sort of game except perhaps a racing one.

    Sure, Dead Space is as boring a “go forward forever” rail-shooter as you can possibly find, so you’re not really going to ever look up or down, but still.

      • asdsa
      • 10 years ago

      Suck it up. ATI is going to destroy nvidia this round but keep posting those “terrible” comments, maybe some of them catch on fire.

        • Peffse
        • 10 years ago

        ATI needs a good round this time, after Nvidia wiped the floor with the 8800. Now Nvidia’s just lazy, rehashing it 4x over and renaming it again.

        • srg86
        • 10 years ago

        Now if only their Linux drivers were as solid as nVidia’s. I’ve got the first nVidia card that my main machine has ever had and, driver wise, I’ve not looked back.

        • Meadows
        • 10 years ago

        I don’t remember saying anything about AMD or nVidia.

      • Kaleid
      • 10 years ago

      While I won’t be using multiple monitors the real story here is the great performance at really high resolutions.

      It certainly is an achievement.

    • fpsduck
    • 10 years ago

    Well, what kind of the GPU for this testing?
    Radeon 5870?

      • UberGerbil
      • 10 years ago

      That would be telling </Number2>

      • Krogoth
      • 10 years ago

      Probably a 2x or 4x CF 5870 setup. I honestly doubt a single 5870 can drive that many pixels and still obtain a playable framerate.

    • Spotpuff
    • 10 years ago

    That’s pretty awesome. I usually don’t like stuff that doesn’t add much to the gaming experience but the nerd in me loves how impressive this is technically. Plus, you know, big numbers.

    • Jambe
    • 10 years ago

    Wow, neat. I use two displays myself but I’m not fond of stretching images or programs across the border.

    I’m not familiar with display panel production; are there specific technological or natural barriers standing in the way of mega-high-res panels? Obviously there’s purely economic reasons (consumer demand, tech investment, etc) but I’m interested in the actual manufacturing. I mean, surely as you increase resolution you also increase the loss incurred by unacceptable dead pixel count, right? But are there inherent electrical issues as panels get bigger and/or denser?

    I imagine the production techniques these companies use are pretty heavily guarded. Especially Samsung with its current corner on the OLED market…

      • DancingWind
      • 10 years ago

      There are. LCDs are made much like CPUs and other chips, from silicon wafers, and for a big LCD the wafer has to be big. It also has to be relatively high quality: while the individual pixels aren’t as small as CPU elements, and are therefore more resistant to impurities, the wafer still has to be relatively pure or there will be stuck/dead pixels.

        • MadManOriginal
        • 10 years ago

        mmmm….waffles.

      • bhtooefr
      • 10 years ago

      Yields do go down as panels scale, drastically increasing prices – even near the end, new IBM T221s (3840×2400, 22.2″) were in the $4000-7000 ballpark, IIRC.

      That hurts consumer demand on one end, and then… I work with people that freak out about “the letters being too small” on a 1920×1200 15.4″ panel. That’s not dense at all compared to the two panels I work with every day at home. Myself, I LIKE my letters small, but that’s just me.

      But, the point is, resolution independence will be necessary for high-res displays to take off.

        • Corrado
        • 10 years ago

        I wonder if they could use multiple panels in a single display tho. Keep the border to a complete minimum.

    • SGT Lindy
    • 10 years ago

    I can see the huge lines now. Pre-Orders taking down web sites.

    • eitje
    • 10 years ago

    i’d like to see it take bezels into account, and just trim that content out. it’d be easy to configure bezel width, much like HDTV wizards conform a display to a given (non-standard) physical resolution. then, just cut those parts out before sending it to the monitors (where it would be spanned).
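A rough sketch of the bezel-trimming scheme described above: widen the render target by a configured bezel width per seam, then have each panel scan out only its own slice, so the hidden strips fall “behind” the bezels. (The numbers and names are illustrative, not from AMD’s drivers.)

```python
def compensated_width(panel_w, cols, bezel_px):
    """Logical render width when each inter-panel seam hides bezel_px pixels.

    The surface is widened so the pixels 'behind' each bezel exist in the
    scene but are never sent to any monitor, keeping geometry continuous.
    """
    return panel_w * cols + (cols - 1) * bezel_px

def panel_source_x(i, panel_w, bezel_px):
    """Horizontal offset of panel i's slice within the widened surface."""
    return i * (panel_w + bezel_px)

# Three 2560-wide panels with a hypothetical 80 hidden pixels per seam:
print(compensated_width(2560, cols=3, bezel_px=80))  # 7840
print(panel_source_x(2, 2560, 80))                   # 5280
```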

    • tejas84
    • 10 years ago

    What a crappy fad typical of AMD. This is just a distraction from the impending pwnage coming from Nvidia and GT300…. Oh and Snakeoil just go to hell….

    There well someone had to say it!

      • 5150
      • 10 years ago

      CRIPPLE FIGHT!

        • SomeOtherGeek
        • 10 years ago

        LOL +1

    • bdwilcox
    • 10 years ago

    b[

      • WillBach
      • 10 years ago

      You could also make a seven-panel configuration, with one panel in the center and the rest forming a hexagon around it.

        • UberGerbil
        • 10 years ago

        Five in a cruciform is probably more feasible, but still overkill for most.

        • Farting Bob
        • 10 years ago

        How do you make a hexagon with 6 rectangle monitors surrounding a central rectangle monitor exactly? Unless you like your setup resembling abstract art.
        Really the only way to go is 25 30″ monitors in a 5×5 square. Anything less is just for t3h n00bs.

      • indeego
      • 10 years ago

      FSM has eight appendages <.<

        • ImSpartacus
        • 10 years ago

        Genius /discussion

      • Meadows
      • 10 years ago

      I doubt “God” had a say in it.

        • UberGerbil
        • 10 years ago

        Didn’t you get the memo? He works at Samsung.

    • Xenolith
    • 10 years ago

    Wonder what the electricity usage is for a system like that?

      • dpaus
      • 10 years ago

      LOL!! Yeah, my 45W TDP Phenom II is driving a 3 kWatt gaming display 🙂

    • Dually
    • 10 years ago

    Anyone? Anyone with info on these “single 34″ high-density monster” displays?

      • Scrotos
      • 10 years ago

      Maybe you’d like something from http://www.seamlessdisplay.com/

      Little blurb here too: http://gizmodo.com/196974/radius-320-seamless-display-look-ma-no-borders

        • maxxcool
        • 10 years ago

        there we go! That’s what I was referring to in the other post. That’s just chock-full of kick-ass :).

        now…. just stack them 3×3 and we’re set….

        oh wait…. 25,000 dollars… holy ….. o.o

          • Scrotos
          • 10 years ago

          Yeah, and I don’t know that they are actually selling anything yet. PanoramTech, before they went under in 2007, had a kind of seamless display model, but stopped selling it a while before they went under, I think.

          Seamless Display seems to have some patents for their seamless system which might mean that as long as they sit on this, there won’t be any other vendors who would actually bring this to market or bring the prices down. Sucks. I’d love to get one, too, but not at the cost they are pushing.

          But hey, 40″ for only 10k!

    • swaaye
    • 10 years ago

    Why not just buy a huge TV. 🙂

      • colinstu
      • 10 years ago

      It wouldn’t be that high resolution.

      • Inkling
      • 10 years ago

      Did you read the article?
      “…one wonders whether having so very many pixels—and the GPU requirements that come along with them—really makes sense when another option is planting a giant-screen HDTV in front of your gaming rig and calling it good.”

        • JustAnEngineer
        • 10 years ago

          My single 2560×1600 monitor has twice as many pixels as a 1920×1080 HDTV has. That’s a big advantage right there.

          • SGT Lindy
          • 10 years ago

          Why? How? More desktop space for Windows? So you get more cells in Excel?

          What video sources are you getting at that resolution? For a TV right now, anything higher than 1920x1080p is worthless. Even at 1920x1080p only BD will use that.

            • Kurotetsu
            • 10 years ago

            I would imagine somebody deciding to spend the money on a 2560×1600 monitor would have more legitimate uses for it than watching TV and dicking around with office suites. For gaming (most PC games can scale to whatever resolution you give them, so I guess that’d be a legitimate use), at a resolution like that anti-aliasing becomes less important, and the overall experience becomes more immersive and realistic. At freaking 7680×3200, anti-aliasing shouldn’t be needed at all. That’s a big burden off the GPU.

            A low resolution (like 1080p by comparison) blown up to giant size would make anti-aliasing more important, I imagine.

            • JustAnEngineer
            • 10 years ago
            • SGT Lindy
            • 10 years ago

            From that link…

            “They’d look even better if these wimpy HD movie formats weren’t a fraction of its native resolution”

            Which is my point.

            • JustAnEngineer
            • 10 years ago

            Games are better at high resolution. The Windows desktop and all applications on it are better at high resolution. Photos are much better at high resolution.

            • bhtooefr
            • 10 years ago

            I can speak as the owner of a 3840×2400 monitor. (Running at 2880×2400 now, but still…)

            I have it not to get more cells in Excel, but to get a large web browser, multiple PDFs, a couple IM sessions, Twhirl, a couple video streams, a large SSH session, some folders, and jBidwatcher on screen.

            At once. (Of course, I’m using my laptop’s 2048×1536 display at the same time, which I got to get the same couple video streams, a smaller SSH session, a smaller web browser, Pidgin, and Twhirl on screen at once, wherever I’ve got my laptop.)

            • indeego
            • 10 years ago

            I have found dual displays > one large display in most cases for my work and home patterns.

            One advantage of dual displays versus one large display is it naturally segments your workspace and working habits.

            At work I offset my left display to just have status/monitoring apps, and my main display has primary working apps. That way I can always refer to the left for reference, and the right for getting * done.

            I’ve found that when I was blessed with the use of a single large monitor, even though I could fit a lot on screen, the task of organizing everything into nonoverlapping windows was cumbersome and never flowed well. Some app would always work better full screen versus windowed. An example is playing a game. I can still monitor items on the secondary screen while gaming with two displays. Not generally possible with one large display without severe compromise.

            Also, 2 24″ displays are almost always cheaper than 1 30″ display, and you get far more screen real estate in the process. 2010 is the year I go 30"+24".

            • SGT Lindy
            • 10 years ago

            Running a LCD at non-native resolution is like smearing Vaseline on it.

            • bhtooefr
            • 10 years ago

            The T221 only scales resolutions evenly, and it will never distort an aspect ratio.

            So, no, it’s not like smearing vaseline on it. It just means that I’ve got a 480 pixel black border on either side.

            And, I would run it at native if my old Mobility FireGL V5200 supported it, but it doesn’t – instead, I have to swap motherboards to one with a (failure-prone) nVidia Quadro FX 570.

            • UberGerbil
            • 10 years ago

            AutoCAD, Photoshop…

            • moritzgedig
            • 10 years ago

            photoshop is 2D and thus does not need 3D.
            Why the high resolution, so you can walk instead of zoom?

            • Meadows
            • 10 years ago

            Photoshop uses GPU acceleration and has the ability to compute 3D surfaces.

            • UberGerbil
            • 10 years ago

            Full frame pro digital cameras produce images on the order of 6K x 4K pixels; drum scans of medium format negatives can be significantly larger than that. Working with them at a per-pixel level you’d like to be able to see most of the image without resorting to zooming out to a huge degree.

            The question I was responding to was (to paraphrase) “Why do you need a high resolution display?” That’s why.
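UberGerbil’s point is easy to quantify: at 1:1 zoom, the fraction of a pro-camera image a display can show is just the ratio of the two pixel areas. A quick illustrative sketch:

```python
def visible_fraction(img_w, img_h, disp_w, disp_h):
    """Fraction of an image visible at 1:1 (100%) zoom on a given display."""
    return (min(img_w, disp_w) * min(img_h, disp_h)) / (img_w * img_h)

# A 6000x4000 (24 MP) full-frame image viewed per-pixel:
print(round(visible_fraction(6000, 4000, 1920, 1080), 3))  # an HDTV shows under 9%
print(round(visible_fraction(6000, 4000, 7680, 3200), 3))  # a 7680x3200 wall: 80%
```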

            • fpsduck
            • 10 years ago

            Pr0n! Pr0n! Pr0n!

            😉

          • swaaye
          • 10 years ago

          With games other than RTS, I can hardly tell the difference between 1360×768 and 1920×1080 when gaming on a big TV. This is partly because I’m sitting ~6 ft away (50″). I just can’t see the pixels anymore at that distance. And my GPU just loves the paltry pixel area.

          Actually because I’m ~6 ft away, the higher resolution actually makes it more difficult to see/read icons, buttons, etc. They become tiny. I was amazed to realize that I preferred the lower resolution. I’m sure there must be applications for ultra-high resolution output and a large screen, but I haven’t found it on my end.

          I have a 1920×1200 24″ + 1680×1020 20″ setup in the “home office” for regular PC use, but for games that PC on the huge TV with the lay-z-boy in front is pretty nice, eh 🙂

            • UberGerbil
            • 10 years ago

            I’d much rather use an HTPC (Windows Media Center or anything else) on 1080 than on 720, regardless of the size of the screen.

            • SGT Lindy
            • 10 years ago

            Lots of games may allow the resolution to go higher, but you’re not getting any more detail out of them.

            Especially games that are 720p console ports.

            • MadManOriginal
            • 10 years ago

            Doesn’t your FOV get smaller with a lower res? Or is that partly game-dependent?

            • Meadows
            • 10 years ago

            It doesn’t, not at all. The definition gets less granular, though.

            Think of it this way: the game creates a theoretical “photograph” and then scales it down (always down) to whatever your resolution is.
            320×240 will then show the same captured area as 1600×1200 would, but you will inevitably lose lots of definition to the lower resolution.

            Differences only arise when you compare normal ratio versus widescreen ratio, where nothing is universally good, but “widescreen should increase the FOV” is the generally accepted solution (as opposed to “cut off the top and bottom of a normal ratio rendering”).
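The “widescreen should increase the FOV” convention Meadows describes is commonly called Hor+: the vertical FOV stays fixed and the horizontal FOV widens with the aspect ratio. A minimal sketch of the standard formula (function name mine):

```python
import math

def horizontal_fov(vertical_fov_deg, width, height):
    """Hor+ scaling: hold vertical FOV fixed, widen horizontal FOV with aspect."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * (width / height)))

# The same ~60-degree vertical FOV at different aspect ratios:
print(round(horizontal_fov(60, 4, 3), 1))    # 75.2 -- standard 4:3
print(round(horizontal_fov(60, 16, 9), 1))   # 91.5 -- single widescreen
print(round(horizontal_fov(60, 48, 10), 1))  # 140.3 -- triple 16:10 span
```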

            • Meadows
            • 10 years ago

            In a good recent game, resolution doesn’t change the perceived size of the user interface.

      • spiritwalker2222
      • 10 years ago

      Cause you can only buy one at 1/6th the resolution.

        • OneArmedScissor
        • 10 years ago

        And you still have to sit far enough away to where it probably looks the same, only it doesn’t have a bunch of lines in it. :p

        Resolution only needs to be so high.

    • TheShadowself
    • 10 years ago

    When someone comes out with a video card that supports 8K (4320 x 8192) at 60 fps *and* a display (even consumer cost projector) that supports 8K I’ll be truly impressed.

    E&S had an 8000×4000 system over a decade ago, their “video wall” — though at a truly ridiculous price. As far back as 1988 consumer desktops have supported, in the OS, up through six monitors with the desktop and windows spanning across all the monitors as a single system. Almost a decade ago IBM had the T220 then the T221 at 3840×2400 but only 22.1 inches. When will we have 40″ to 60″ at 8K? (If OSes ever implement TRUE screen resolution independence [not the pseudo crap in Windows and Mac OS X] then the 154 to 231 dpi would not be an issue.)

    I’m waiting for a true breakthrough, not slow evolution.

      • OneArmedScissor
      • 10 years ago

      That’s higher resolution than theater projectors…wtf would you need that for your house.

        • SomeOtherGeek
        • 10 years ago

        We are human, we NEED it all!

        • ludi
        • 10 years ago

        What is your question supposed to mean? Most people don’t even /[

      • UberGerbil
      • 10 years ago

      You’re going to be waiting a long time, then.

      Various companies like GE are working on large-scale OLEDs that you can use to “paper” a wall, but those are purely for illumination; the individual pixels aren’t addressable. And those are years away from introduction.

      • SomeOtherGeek
      • 10 years ago

      I remember the “war rooms” of a natural gas company that had huge screens with crazy numbers like you’re talking about, but set up in a different way: four video cards per projector-type screen. Each card would do one color at different shades, and combining the four cards made really sharp real-time video images of the gas pipelines. I crapped my pants! They had like four of these monsters with associated smaller screens around them and a console like a starship running it. It only cost something like 26 million bucks in the early 90’s. These were not mainframes, but regular custom-built PCs. We played Doom I on one of the screens and it was mind-blowing!

        • yogibbear
        • 10 years ago

        Yep we still have those. But one of ours is basically just a couple of projectors hooked up to slightly larger than normal boxes.

        Live video feed from drilling platforms with 3d models + excel + Crysis for when the drilling dudes are fishing instead of actually drilling anything interesting. 🙂

        Or did you mean something boring like: http://www.cockpit-group.com/en/services/monitor/set-a-cockpit-room

      • WaltC
      • 10 years ago

      /[

        • UberGerbil
        • 10 years ago

        Yeah, if you do a google image search on “trading station” you’ll see lots of examples of systems that need lots of monitor real estate but don’t need 3D. Those have been around for a long time (and are one of the few reasons Matrox stayed in business).

          • bhtooefr
          • 10 years ago

          Also, a lot of medical imaging is 2D (and IIRC, Matrox cards support 10 bits per color channel, which is necessary for some monochrome medical imaging displays.)

    • kvndoom
    • 10 years ago

    Good gosh, don’t bring up Jaws 3D… I try to forget that ever happened.

    • snakeoil
    • 10 years ago
      • indeego
      • 10 years ago

      “But if you thought borderless meant the range can display images to the very edge of the screen – forget it.”

      But then they neither show us a picture of what they mean nor describe what they mean. WTF is with journalism these days?

        • UberGerbil
        • 10 years ago

        WTF is with /[

      • MadManOriginal
      • 10 years ago

      Do you even read the content of the links you post?

      q[

    • snakeoil
    • 10 years ago

    wow this is spectacular.

    intel is in big trouble because intel graphics are pretty much garbage
    while amd’s graphics are real gems.

    these are two cards in crossfire with six outputs.

      • 0g1
      • 10 years ago

      I agree, but I don’t think Intel is in trouble yet.

      • 5150
      • 10 years ago

      Wow, didn’t know you were still around.

      • UberGerbil
      • 10 years ago

      Yeah, it’s a bummer that you can’t put an AMD graphics card in an Intel system. That dooms them for sure.

      Also
      g[

        • flip-mode
        • 10 years ago

        Even UberGerbil has been snared by the troll.

      • maxxcool
      • 10 years ago

      I hate monitor seams…. 🙁 Now if there were a really flat LCD that would align to other panels’ edges… that would look pretty…

      especially when run by an Intel 4500HD integrated chipset. 😉

      • SGT Lindy
      • 10 years ago

      Intel is in big trouble? What planet do you live on? What percentage of market share do you think this will take, 0.03%?

    • FubbHead
    • 10 years ago

    I’ve always wanted a huge aquarium with fish I don’t need to feed. This might be it.

      • LockeOak
      • 10 years ago

      You know they sell automatic fish feeders, right? They’re about $30.

      (my other hobby is aquarium geekery)

        • mesyn191
        • 10 years ago

        Well yeah, but you still have to maintain the tank. Depending on your fish this can be either a minor annoyance or a major PITA. I found many really cool fish (ie. seahorses) to be fairly sensitive to changes in temp and water quality; it really sucks to have them die on you and find out later the heater wasn’t set right or something.

        • BiffStroganoffsky
        • 10 years ago

        But saltwater tanks cost more per square inch than LCDs.

          • ssidbroadcast
          • 10 years ago

          Would be per cubic inch if we’re talking about aquariums?

        • FubbHead
        • 10 years ago

        Actually no, I didn’t. 🙂

        But then there’s other maintenance that’s required. I just want the soothing look and sound without the maintenance. And perhaps most importantly, you can have a Great White swoop by from time to time. 😀

          • Usacomp2k3
          • 10 years ago

          Honestly, I’m pretty lazy when it comes to my aquarium. I feed them every couple days and clean the tanks every couple months. I add 5 gallons of water to replace that which has evaporated about every 2 weeks, but that’s the only maintenance I do. I have a heater in there that automatically regulates the water temperature too. The light is on a timer that is on from 9 am to 10pm (roughly). The fish seem to be fine in this configuration. Probably doesn’t hurt that I have ~12 fish in a 50 gallon with water filters that combined are rated for about a 120 gallon tank. *shrug*

    • Dually
    • 10 years ago

    Where is the super hi-density 34″ referred to? Link please! I’d much rather have one of those bad boys than stare at bezels.

      • TheEmrys
      • 10 years ago

      Apparently the recession hasn’t touched you. Kudos.

        • Dually
        • 10 years ago

        Thankfully, no. But I also make my living in graphic design, as a creative director, so a good monitor is an investment. Besides, all I asked for was a link for more info.

          • Scrotos
          • 10 years ago

          The only high res monitors that I know of offhand were:

          http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors

          Someone in the forums was looking for a way to get one of these working. 15fps. Maybe there’s something newer and better, though; that’d be nice to see a link to, too!

            • bhtooefr
            • 10 years ago

            I’ve got one, and I’m who posted the thread.

            Depends on your graphics card.

            This could drive the version I have at the full 48 Hz.

            The older versions could be driven at their full 41 Hz.

            I’m getting 22 Hz at partial resolution (2880×2400) on my current card, due to a combination of things: I don’t have the dual-link converter box, so I’m just running it off of a single link, and there’s a hard limit in ATI’s older cards of 2880 pixels wide per texture. The rest of the chip can handle 3840×2400 easily, and the scaler will scale to that in VESA modes, but the most it can natively generate is 2880×2400. (Actually, I suspect 2880×2880.)
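That 22 Hz figure is consistent with single-link DVI’s 165 MHz pixel clock ceiling; a rough sketch of the arithmetic (the blanking overhead here is an assumed approximation):

```python
def max_refresh_hz(pixel_clock_hz, width, height, blanking_overhead=1.05):
    """Highest refresh rate a fixed pixel clock can drive at a resolution.

    blanking_overhead is an assumed ~5% allowance for horizontal/vertical
    blanking (CVT reduced blanking costs only a few percent).
    """
    return pixel_clock_hz / (width * height * blanking_overhead)

# Single-link DVI tops out at a 165 MHz pixel clock:
print(round(max_refresh_hz(165e6, 2880, 2400), 1))  # close to the ~22 Hz reported
print(round(max_refresh_hz(165e6, 3840, 2400), 1))  # why full res needs more links
```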

    • rythex
    • 10 years ago

    This would be nice with a projector type system to avoid having bars hehe

    • UberGerbil
    • 10 years ago

    If AMD has been talking to Samsung, have they talked about more DP ports on monitors so you could daisy-chain them instead of having multiple cables coming out of the card? In theory at least they should be able to drive three 1680×1050 displays off one DP 1.1 connector (and higher res or more monitors with DP 1.2)

      • ssidbroadcast
      • 10 years ago

      Wouldn’t there be uh, signal degradation after a while? From one monitor to the next?

        • UberGerbil
        • 10 years ago

        No, not any more than you get in other packet-switched networks (like the daisychaining from your modem to your router to your PC). This is digital data after all, and each monitor is responsible for amplifying the signal as it passes the packets on to the next.

        DP 1.1 mandates a 3m cable at full bandwidth, and 15m at lower bw (but still good for 1080p). DP 1.2 offers higher bandwidth and also supports optical cables for longer runs.
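The back-of-the-envelope math behind the three-displays-per-connector claim, assuming DP 1.1’s roughly 8.64 Gbit/s of usable video bandwidth and 24-bit color (blanking ignored for simplicity):

```python
def display_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate for one display, ignoring blanking, in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# DP 1.1 carries roughly 8.64 Gbit/s of video data across its four lanes.
DP11_GBPS = 8.64

per_display = display_gbps(1680, 1050, 60)
print(round(per_display, 2))      # 2.54 Gbit/s each
print(round(3 * per_display, 2))  # 7.62 Gbit/s -- three fit under the cap
```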

          • bhtooefr
          • 10 years ago

          However, latency could be an issue. I say could, as it may not be noticeable, but who knows – there is, after all, input lag even on some single-monitor setups.

      • sigher
      • 10 years ago

      5870 connections: http://www.computerbase.de/bildstrecke/25375/4/

      Huge-ass beast of a card, btw, and covered all over, but with only a tiny grill to vent the air. Doesn’t sound trouble-free, but only time can tell, I guess.

    • reactorfuel
    • 10 years ago

    I like this a lot. Plenty of people already have dual or even triplehead setups for work, and offering a seamless way to extend that to gaming is a great feature. While it’s still mostly a gimmick, I see it as much more useful than Nvidia’s 3D stuff (which requires a substantial investment in new hardware pretty much no matter what).

    As for the comparison to a massive HDTV, I don’t think that’s completely accurate. I’ve got a Sharp 40-incher I use for games and movies, and while it’s a lot of fun, it doesn’t offer an expanded field of view – it’s the same resolution and FOV, just bigger. With proper support in games, though, this can offer a genuine wraparound FOV that more closely approximates the human field of vision and hopefully provides a bit more immersion. Plus, there’s always the option of 3 massive HDTVs if you’re the sort for whom overkill is never enough. 🙂

      • UberGerbil
      • 10 years ago

      Research in the Virtual Cockpit program at Wright-Patterson AFB, and later furthered at the HIT lab at the U.Wash, found that when the display FOV exceeded ~45 degrees the viewer began to perceive himself to be “in” the scene rather than watching it. So yeah, properly set up, a three-screen display like this certainly can contribute to immersiveness.

        • reactorfuel
        • 10 years ago

        There are still issues – to have a display take up that much of your FOV, it needs to be either massive (and out of reach of mere mortals’ wallets) or very close to the viewer. Unfortunately, when it’s very close to the viewer, you start to perceive individual pixels and lose immersion. Big flight sims also have a major advantage in that they can work with other senses – in particular, the entire “cockpit” moves to simulate the motion of the airplane. I don’t think we’re going to get quite to that level of immersion on consumer systems anytime soon.

        However, this is certainly a nice step in the right direction. Multiple monitors are already popular for work systems; simply making them viable for gaming is a great bit of added value on ATI’s part.

    • d0g_p00p
    • 10 years ago

    Wow, that’s mighty impressive. I just might splurge and pick up a 30″ monitor now, knowing that I can play games at native resolution without having to spend a ton of money on a GPU just to enable all the features and have smooth frame rates.

    I moved up from a 20″ widescreen to a 24″ widescreen earlier this year and was impressed with that. However, having six more inches would just be killer.

      • UberGerbil
      • 10 years ago

      …that’s what she said. (bah-dum-dum!)

        • kravo
        • 10 years ago

        rofl
        I wanted to say that 🙂


    • bthylafh
    • 10 years ago

    <jayne>I’ll be in my bunk.</jayne>

      • Fastfreak39
      • 10 years ago

      Hahahahaha, quote of the day right there.

      • Spotpuff
      • 10 years ago

      Get Vera! Jayne ftw.

    • SecretMaster
    • 10 years ago

    My only gripe is the actual borders of the monitors interfering with the “overall picture”. If only you could spray paint LCD pixels over it…

      • indeego
      • 10 years ago
        • lethal
        • 10 years ago

        Keep in mind that the monitor in that article is NOT what they are talking about – that thing has a huge bezel. The monitor in question is the one posted here:

        http://news.cnet.com/8301-17938_105-10300859-1.html

          • indeego
          • 10 years ago

          oh I’m aware. I even cringed linking to cnet <.<

          • mesyn191
          • 10 years ago

          That thing is beautiful, but probably god awful expensive, particularly if you want to get 2 or 3, or even 6 of em….

          Personally I prefer a decent projector in a dim (not dark) room for meeting my stoopid huge display needs. It certainly isn’t cheap, but it’s about the only way you can get a large, good-looking display for something that approaches an OK price.

            • UberGerbil
            • 10 years ago

            Projectors tend to be severely resolution-limited, especially at the saner price points.

            • indeego
            • 10 years ago

            Not to mention loud <.<

            • mesyn191
            • 10 years ago

            ???

            There are certainly some loud units out there, but many are fairly quiet these days. Of course, that can be somewhat subjective. If you’re one of those guys who freaks out over the sound of his hard drive seeking, then yes, most if not all of the projectors out now will drive you nuts.

            Personally it’s not an issue for me; YMMV.

            • indeego
            • 10 years ago


            • mesyn191
            • 10 years ago

            Phhht, everyone says that.

            Personally, I think I’m just better than you.

            jk 😀

            • BiffStroganoffsky
            • 10 years ago

            zero is a constant.

            • indeego
            • 10 years ago

            Bingo. You know my mileage now <.<

            • spanky1off
            • 10 years ago

            that looks massively rubbish

            • mesyn191
            • 10 years ago

            You can get a decent 1080p/i projector for about $1200-1500 right now. Depending on how far you sit from the screen you can easily blow the picture up to 50-90″ before things start looking blocky with one of them. There are much better ones out there of course, but the price quickly rises to the $3-5K range which is indeed very expensive and probably not worth it.

      • UberGerbil
      • 10 years ago

      OLED might make bezel-less monitors possible, though for ordinary use people generally like to have a “frame” around the content. But borderless might be an extra-cost option for people planning a setup like this. You know, in “3 to 5 years” when OLED arrives….

        • Usacomp2k3
        • 10 years ago

        …in TV form. It is already here in some phones and the new Zune HD, which I’m looking forward to playing with at CEDIA tomorrow.

          • UberGerbil
          • 10 years ago

          Screens that fit in your hand don’t automatically scale up to screens that fill your wall.

      • Hattig
      • 10 years ago

      Yeah, another way to minimise the border could be to have overlapping bezels on adjacent monitors (you could imagine that the monitor design would have a sloping bezel that interlocked: |/ and /| on the left and right sides).

      However, it goes to show how insanely powerful GPUs are getting. 7680×3200 indeed!

      • StashTheVampede
      • 10 years ago

      Shouldn’t be much of an issue. Just use projectors instead! It takes a bit of timing to line it all up, but that would be a near-seamless experience.

        • dpaus
        • 10 years ago

        We have an immediate application for just such a set-up.

    • flip-mode
    • 10 years ago

    Crysis?

      • ish718
      • 10 years ago

      No, just WoW.

      • TurtlePerson2
      • 10 years ago

      If it doesn’t run well on one monitor it probably won’t run well on 6.

        • flip-mode
        • 10 years ago

        and thus flew high my abbreviated attempt at ‘but can it play Crysis?’

      • Kaleid
      • 10 years ago
        • Meadows
        • 10 years ago

        “New” my ass. It’s CryEngine 2 neutered. Note the lame light sources, the blurry textures especially on the walls, and the cheaper looking waterfalls. Now that might be fine on a console, but you’re not going to sell any new Radeons with that.

        They just gave this demo a less saturated, less tropical setting. Which is fine, but they hardly created anything even resembling a new engine.

          • Kaleid
          • 10 years ago

          Yeah. Since it’s coming to consoles it will have to be a crippled version…

            • Meadows
            • 10 years ago

            So it’s not so awesome anymore, is it?

            • Kaleid
            • 10 years ago

            No, not really. But to me it still says that the new ATI GPU has plenty of power.

    • ludi
    • 10 years ago

    Low-end 22″ TN panels are regularly selling for $120-130 now. Less than four hundred bucks for a three-monitor, wrap-around driving or flight sim is actually an attainable price.

      • UberGerbil
      • 10 years ago

      Yeah, roughly equal to the bottom end of the 24″ IPS or VA monitors. Which I’d rather have, actually. But then my flight sim days are mostly over.

      • 5150
      • 10 years ago

      TRAIN-SIM!!!

        • indeego
        • 10 years ago

        HVAC AIR DUCTS SIM!

          • Scrotos
          • 10 years ago

          Man, I just played through Half Life 1 recently and I think that’s your damned HVAC duct sim right there!

      • Kaleid
      • 10 years ago

      Many of them have bad viewing angles, unfortunately.

        • zqw
        • 10 years ago

        That’s why three pointed (curved) at your face are better than a single giant display with bad viewing angles.

      • JrezIN
      • 10 years ago

      What’s the point when you can’t even see homogeneous blacks/colors on just one of them? Why add more monitors and see even worse viewing angles? Gimme more non-TN displays (especially outside the US!!!)

        • ludi
        • 10 years ago

        Given a choice between more display area and more display quality, the majority of people buy more display area before buying more display quality, especially when the former is up to several times cheaper than the latter.

        Them’s may not be your priorities but it works for me. I don’t do graphics design and I don’t like dumping $800+ into hardware for a single system unless I’m getting a pretty big ROI. “Better resolution than real life” doesn’t trigger my gottahaveit buttons — I’ve got the Colorado Rocky Mountains sitting just a mile away from my front door if I want to experience an untainted panorama of fun.
