Report: Wii U graphics based on Radeon HD 4870

Slowly but surely, more details are emerging about Nintendo’s upcoming Wii U console. We knew the system had an AMD graphics chip, and now we have a little more detail on its origins. According to Japanese site Game Watch, the console’s graphics chip is similar to the RV770 GPU that powers the Radeon HD 4870.

Although the Radeon HD 4000 series is getting on three years old now, it’s still miles ahead of the ancient GPUs in current-generation consoles from Microsoft and Sony. The Xbox 360’s Xenos GPU bears some similarities to the Radeon X1900, which is straight outta 2006. The GeForce 7800-class GPU that underpins the PlayStation 3’s RSX graphics chip can be traced all the way back to 2005.

Of course, both of those consoles are capable of playing movies from optical discs—the PlayStation 3 can handle DVDs and Blu-ray discs, while the Xbox 360 can at least manage DVD playback. Although the Wii U has an optical drive, it won’t do either. Nintendo chief Satoru Iwata explains:

The reason for that is that we feel that enough people already have devices that are capable of playing DVDs and Blu-ray, such that it didn’t warrant the cost involved to build that functionality into the Wii U console because of the patents related to those technologies.

Nintendo hasn’t been keen on selling consoles at a loss and making up the difference with game sales, and I suspect the company will follow a similar strategy with the Wii U. Let’s hope the console can at least stream video through services like Netflix.

Comments closed
    • Novum
    • 8 years ago

    Xenos isn’t based on R520, but on the canceled R500. In contrast to the X1900XT, it has a unified shader architecture and more features (e.g., texture arrays).

    • Brad Grenz
    • 8 years ago

    GameWatch appears to be just repeating the same “based on AMD’s R700 line” rumor posted on that French site months ago. Engadget misread the machine-translated article and misconstrued it as “4890 or 4870” levels of performance, but the Japanese article seems more interested in the DX10.1-level shader functionality. In all likelihood we won’t be getting more than 320 shaders, something more akin to the RV740. In fact, I expect the WiiU to use a single-chip CPU/GPU combo, in the style of AMD’s Fusion chips, only with PowerPC-based CPU cores (probably 3 of them, NOT based on Power7) and 320 shaders derived from the R700 line of GPUs. There will also be a good amount of embedded DRAM to make up for a slow 128-bit unified memory bus with no more than 1GB.

    Anyone expecting 4870 levels of performance will be sadly disappointed. Anyone expecting an appreciable difference between the WiiU and other HD consoles is also likely to be disappointed.

    • Krogoth
    • 8 years ago

    Looks like “Wii 2” is going to have some balls to it.

    A normal 4870 can still effortlessly handle two-megapixel gaming (1920×1080). A tweaked version might be even better at it. The real question is the console’s CPU; it can still make or break things performance-wise. Otherwise, the Wii 2 should easily outclass the current PS3/360.

    I would imagine that the next “PS3/360” will likely use derivatives of today’s mid-range GPUs due to cost and power requirements. Their performance will be somewhat better, but not with a gap like the current one between the Wii and the 360/PS3.

    • albundy
    • 8 years ago

    wow, those are some dated graphics, considering this console will be released in a year’s time.

      • My Johnson
      • 8 years ago

      This console has likely been in development for a very long time.

      • BestJinjo
      • 8 years ago

      HD4870-level performance is likely better than what 80% of gamers use on the PC today. Sure, the HD6970 is 2x faster, but it costs $300+ for the videocard alone. It’s not feasible to expect an HD6970-style GPU in a console in 2012 given the high cost and the extreme heat dissipation required for such an advanced GPU. The HD6970 has a TDP of 250W on 40nm, so even if you shrink that chip to 32nm, it would still consume far too much power for a small console like the Wii U. It’s time to revisit reality vs. wishful thinking.

    • potatochobit
    • 8 years ago

    and you can bake cookies at the same time!

    • DeadOfKnight
    • 8 years ago

    Really? They couldn’t at least put a 5770 in this thing? Lame. Not that it’s any faster, but the 5770 runs cooler and quieter and has the features to compete with next-gen consoles.

      • cygnus1
      • 8 years ago

      I’m pretty sure they just mean it’s going to be architecturally similar to a 4870. I can promise you it will be fabbed on a better process node than the 4870 was, and it will be optimized differently. They won’t just start printing new 4870s without any modifications.

        • Kaleid
        • 8 years ago

        Yes, Wii U pictures have surfaced, and it’s only slightly bigger than the Wii, so of course they need to shrink the process plenty to keep power requirements and temperatures down.

        Probably the only thing lost by not using a 5770 is DX11.

    • khands
    • 8 years ago

    I wonder if it’s going to be more like Juniper than RV770 once all is said and done, considering Juniper seemed a much better chip for something like a console: cooler, less power draw, and an up-to-date feature set at the same performance level.

      • swaaye
      • 8 years ago

      I’m sure a safe bet is that it’s a fairly custom chip with extra system stuff integrated, built on 28nm perhaps. There has to be zero chance of them using some old PC GPU. I also would bet on a 128-bit memory bus instead of a more costly 256-bit setup like RV770’s, and that there will be EDRAM sized for 720p or 1080p. Hopefully more EDRAM than the 360 has, because that has been troublesome for modern game engines.

    • Joerdgs
    • 8 years ago

    Seeing what kind of graphics developers can still push out of those six-year-old GPUs in the PS3 / Xbox 360 after some severe optimization, I’m pretty optimistic about what they’ll be able to do with this tech. Let’s just hope they don’t skimp on memory, since that is what seems to trouble devs the most in the long run.

      • OneArmedScissor
      • 8 years ago

      GDDR5 can reduce costs by allowing the memory interface to shrink, encouraging them to use it, and it’s already so prolific and inexpensive that even $50 cards have 1GB running at a decent speed. It will just be even cheaper and faster by the time the Wii U actually goes on sale, and they’re going to both want it and need it.

      What is much more interesting is the possibility that the GPU has its own eDRAM buffer. You’re not even going to get that with a GTX 680. E-pene shrinkage: IMMINENT.

    • derFunkenstein
    • 8 years ago

    Why is everyone celebrating? This is the bare minimum hardware requirement here. “It’ll be faster than the PS3 and 360!” Well, I should hope so; those consoles are 5-6 years old.

      • Hattig
      • 8 years ago

      It’s Nintendo – people were worried the GPU would have been worse!

    • WillBach
    • 8 years ago

    [quote<]Let's hope the console can at least stream video through services like Netflix.[/quote<] The Wii has had Netflix playback for some time, since spring of 2010. If the Wii does it, it's likely that the Wii U will do it, too.

    • Hattig
    • 8 years ago

    Hopefully this means it’s getting 800 shaders, and that they will run at a fairly decent clip too.

    The 360 has 240 shaders running at 500MHz, for a point of reference. Obviously the 4870 uses more modern shaders and has a beefier tessellator, so performance could be higher than a simple ratio of shaders and clocks would suggest. Especially if AMD folds in some of the per-shader performance improvements it made with Barts.

    Anyway, if the new GPU is made on 28nm for reduced power consumption, and therefore runs at 700MHz, we could see graphical performance along the lines of (800/240) * (700/500) = 4.7x the 360’s graphics.
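
    For what it’s worth, here’s that back-of-the-envelope math as a quick sketch; the 800-shader count and 700MHz clock are pure speculation, not confirmed specs:

    ```python
    # Rough shader-throughput ratio: rumored Wii U GPU vs. the 360's Xenos.
    # Wii U numbers are speculative; Xenos is counted as 48 ALUs * 5 lanes = 240.
    xenos_shaders, xenos_mhz = 240, 500
    wiiu_shaders, wiiu_mhz = 800, 700

    ratio = (wiiu_shaders / xenos_shaders) * (wiiu_mhz / xenos_mhz)
    print(f"~{ratio:.1f}x the 360's raw shader throughput")  # ~4.7x
    ```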

    But there are so many IFs that it’s very speculative.

      • swaaye
      • 8 years ago

      Actually, the 360 has 48 shader ALUs, and they are more like the X1900’s pixel shaders than the post-R600 VLIW5 units. I’ve always wondered how it performs compared to R600-era tech, but of course it’s very difficult to compare.

        • Hattig
        • 8 years ago

        Of course, but as the shader ALUs have 5 lanes each, the simplest comparison to modern shader counts is 48 * 5 = 240 shaders.

          • ET3D
          • 8 years ago

          Actually, the X1900 has two ALUs per core, one capable of four ADD/MUL/MADD operations and one capable of four ADD operations (see here: [url<]http://www.beyond3d.com/content/reviews/2/3[/url<]). So it can be counted as 48 * 8 = 384 shaders, but very limited in what they can do. I think it would be hard to do a direct comparison to a modern architecture.
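
          To make the two counting conventions in this sub-thread concrete, here’s a trivial sketch; the per-pipe lane counts are the commenters’ claims, not verified specs:

          ```python
          # Effective "shader" counts under the two lane-counting conventions above.
          pipes = 48
          print(pipes * 5)  # 240: vec4 + scalar lanes per Xenos ALU (Hattig's count)
          print(pipes * 8)  # 384: 4 MADD + 4 ADD lanes per X1900 pipe (ET3D's count)
          ```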

      • bcronce
      • 8 years ago

      It will probably be on 40nm, as their Power7 CPU will be on 45nm. The original 4870 was on 55nm, so 40nm is still a nice shrink, and 40nm is crazy mature right now.

      Also, it’s “based” on the 4870, so they probably made whatever basic tweaks they learned from the 68xx/69xx series, like power gating and stuff.

        • Hattig
        • 8 years ago

        Rumours are now saying it’s 32nm (e.g., [url<]http://techland.time.com/2011/06/09/wii-u-specs-disclosed-including-25gb-optical-discs/[/url<]) with 1300 GFLOPS. As such, I wouldn’t be surprised if it turns out to be the Llano graphics core with twice the shaders, or even just the Trinity GPU core. That’s because AMD has already done the work of getting those designs onto 32nm.

          • ET3D
          • 8 years ago

          The standard GFLOPS calculation for AMD is SPUs * 2 * clock (the *2 is for single-cycle MAD). 800MHz * 800 SPUs * 2 = 1280 GFLOPS, which I think is a reasonable assumption for this chip.
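
          Here’s the same arithmetic as a quick sketch; the 800 SPUs and 800MHz are assumptions worked back from the 1300 GFLOPS rumour, not announced specs:

          ```python
          # AMD VLIW5 rule of thumb: peak GFLOPS = SPUs * 2 FLOPs (single-cycle MAD) * clock in GHz.
          spus, clock_ghz = 800, 0.8  # assumed shader count and clock
          print(spus * 2 * clock_ghz, "GFLOPS")  # 1280.0, close to the rumoured 1300
          ```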

          It’s possible that they’re using the Trinity core, but nothing prevents AMD from creating another design on 32nm, so it’s certainly possible to create an RV770 variant like that.

    • lilbuddhaman
    • 8 years ago

    I feel that a 4870-based chip simply won’t be enough for the console, and that this was a “go as cheap as humanly possible” move by Nintendo. If anything, they should have worked with ATI to use up-to-date tech (6xxx-based) scaled down to fit costs. I’d rather have a crippled 6870 than an overhauled 4870.

    As far as gaming goes… I foresee ~25% of titles hitting that 1080p magic point, and it will mostly be casual titles and a few first-party titles (down the road; the launch Mario / Zelda will only be 720p, methinks).

    One thing is for sure: I won’t be buying this console. Hell, I likely won’t buy any more consoles… ever… until the day they come with a M+KB in the box.

      • swaaye
      • 8 years ago

      It might be using recent tech regardless of what is said here. A lot of these supposed reports sound more like regurgitated rumors.

      Although the main benefit of Cypress and Cayman tech (5000/6000) is really DX11, not performance per transistor. The 4870 is a little faster than a 5770, for example, and the latter has similar specs. The 5870 and 6970 are just much larger and hotter chips than the 4870.

      The question is how much heat and noise Nintendo is willing to generate, because a 40nm, 800-shader GPU is quite hot (I have a Mobility 5870 notebook).

        • lilbuddhaman
        • 8 years ago

        I’m thinking the same thing, actually. Similar to how the Xbox 360 had tessellation features added to its chip.

        My 4870 definitely ran hotter and louder than my 6870 does now, though.

      • judoMan
      • 8 years ago

      I suspect that wattage is a major consideration for them. So not only are they trying to find the “sweet spot” between cost and performance, but they’re weighing wattage incredibly heavily. Fewer watts == less heat, and thus a longer console life (e.g., no RRoD).

      • OneArmedScissor
      • 8 years ago

      Hi, I can see you’re new, so let me welcome you to real life! In this world, general purpose PCs have been using Radeon 4870s, and lesser cards, to run games at 1080p for years. Also, “up to date” cards like the 6750 and 6770 are the same thing as the 4870! Woweezoweewee!

        • judoMan
        • 8 years ago

        ignore…sorry

          • OneArmedScissor
          • 8 years ago

          M-M-M-M-MONSTER reply fail! :p

        • lilbuddhaman
        • 8 years ago

        I can see you’re also new to the video game industry. In this industry, there are things called “lazy programmers”. These guys aren’t so good at doing their job, and the end result is a game that looks worse than it should, and one that performs worse than it should.

        So when one might expect a game that looks like X to run at 1080p no problem, it doesn’t.

        Woweezoweewee!

        I certainly believe that hundreds of the [b<]earlier[/b<] Xbox 360 ports that come out will run quite nicely at 1080p, but I'm thinking many of the more graphically demanding games will instead opt for a ~45fps 720p 2x AA experience over a ~35fps 1080p 0x AA experience.

        P.S. I've been around at least this long: [url<]https://techreport.com/discussions.x/1125[/url<]. As you can see, my writing hasn't gotten any better since then.

      • designerfx
      • 8 years ago

      Do you have any idea what you’re talking about? The 4870s we use are not as specialized and don’t run code as tight as what consoles are designed for. Remember, console hardware doesn’t change quickly. A 4870 in here is a great jump, and by no means underpowered. I’d love to see a 6870 in it too, but think about this: when was this console designed? Hint: not June 2011. So of course you’re not going to see a 6870 in it.

        • lilbuddhaman
        • 8 years ago

        No, of course not a 6870, but this chip was likely in development at the same time as the 6xxx series, so I’d sooner expect the chip to be “an optimized 6xxx-series chip, featuring advanced tessellation and DX11 features, specially made for Nintendo’s newest console” (to say it in marketing speak).

        Nintendo is very scant with details (as always), so we likely won’t get to see a side-by-side comparison of the silicon until near launch day 🙁

          • Farting Bob
          • 8 years ago

          Except using a brand-new chip would likely be more expensive while offering not a whole lot more than Nintendo would want. Remember, they are still going after the casual crowd, and it’s all about the controller; the hardware inside isn’t as important to them. The next-gen Xbox and PS might use more up-to-date graphics when they are released. I expect them to want 1080p 3D capability, so a 4870 equivalent won’t be enough, even with highly optimized drivers and APIs.

            • Krogoth
            • 8 years ago

            What are you smoking?

            RV770 has always been a capable GPU. It easily beats the venerable G80-G92 dynasty and manages to rival the GT200 dynasty despite having far fewer resources at its disposal.

            It can still handle two-megapixel (1920×1080) gaming fine, which is what the Wii 2 and the PS3/360’s successors are targeting.

      • ET3D
      • 8 years ago

      At least until the other consoles are updated, all games will probably be able to run at 1080p without problems. Most modern PC games (there are some exceptions) can get 30+ FPS on a Radeon 5770 at 1080p with no AA, and the 4870 is faster than the 5770. As has been pointed out, consoles can be more efficient than PCs at taking advantage of the GPU.

      I see no reason why first-party titles won’t continue being 1080p for the foreseeable future; since they’re cartoony, it’d be easy to make them not require a great deal of GPU power. It’s all a matter of decision on Nintendo’s part. Though I think 1080p matters more to third-party hardcore gamers, so Nintendo could go either way on first party.

      • Laykun
      • 8 years ago

      Being a console, they won’t have to sit behind graphics APIs like DX and OpenGL. Also, with only one architecture to optimise for, I believe they will be able to squeeze some very impressive visuals out of the RV770. The Xenos GPU and the RSX have managed to produce some pretty amazing visuals despite their heritage.

    • JoJoBoy
    • 8 years ago

    Didn’t we hear that the Wii 2 would be using this chip almost two months ago? [url<]http://gear.ign.com/articles/116/1163325p1.html[/url<]

      • BestJinjo
      • 8 years ago

      R700 spans everything from the HD4350 with 80 shaders all the way to the HD4890 with 800 shaders. So telling us the chip is based on the R700 architecture only tells us that it won’t support DX11 and that its maximum performance tops out around an HD4890’s. Now that we know it’s ~4870, that’s at the top of the food chain for that generation.

    • l33t-g4m3r
    • 8 years ago

    Well, it’s a step forward for Nintendo. The 4870 is a very capable chip, and is actually closer to DX11 than DX10. The problem on the PC is that game developers never supported the extra features, and AMD never wrote drivers exposing the full capabilities of the card. There are hacked drivers allowing MLAA on the 4870, proving that AMD is artificially handicapping the card. I also think that AMD could have written drivers that enabled tessellation compatibility with DX11, but didn’t, to further sales of DX11 cards.

    So, while we 4870 owners never saw the full potential of the card utilized on the PC, I can easily predict it will be fully utilized in a console environment, which should then tell you how much we’ve all been screwed.

      • ET3D
      • 8 years ago

      Tessellation exists on the Xbox 360 and it’s not used, so there’s no guarantee it will be used on the Wii U. The tessellation unit is also not up to DX11 spec, and even if it were, the chip lacks enough other DX11 features that exposing it wouldn’t have been possible.

      Still, I agree that console developers could end up exploiting more features. I’m not sure this will actually be done, however. Most games will probably use the same engines developed for the other consoles, and will stick to DX9 level features, at least until the next gen arrives.

        • Novum
        • 8 years ago

        It is used. For example, Viva Piñata used tessellation for terrain rendering.

      • Novum
      • 8 years ago

      “and is actually closer to DX11 than DX10” – it’s not.

    • BiffStroganoffsky
    • 8 years ago

    I would agree that most people already have the optical playback system that they want or need and many are making the leap to wireless/ethernet/USB connectivity for their entertainment needs. DVD and Blu-Ray will soon only be relevant for ripping to that multi-terabyte RAID on your network…which will be confiscated by the authorities and used by the RIAA and MPAA lawyers to brand you as a pirate for stealing.

    ….yarrr!

      • stdRaichu
      • 8 years ago

      Pirates aren’t branded, they’re made to walk the plank! Seriously though, I agree. I’ve been ripping my optical media to disk for nearly a decade now, and more and more of my friends are finally doing the same now that NAS and HTPC-type devices are much more user-friendly.

      Nintendo’s always been against DVD playback, mostly due to the licensing fees (and the reasonable assumption that gamers already have the requisite tech), so it’ll be the same but more so with Blu-ray.

      Back OT though, I’m surprised Nintendo didn’t go with the tech used in the 5xxx series; I don’t really follow GFX tech that closely, but didn’t the 5xxx series turn out a lot more power-efficient, or was that just due to the process shrink?

    • Sahrin
    • 8 years ago

    If you think that the Xbox 720 or the POS4 will have wildly more advanced GPUs than the Wii U, let me disabuse you of that notion. Remember how much games changed after DX11 GPUs were released? It’ll be like that.

      • Mystic-G
      • 8 years ago

      What you said… it means nothing.

      • bcronce
      • 8 years ago

      Game engines tend to have long lives (~5 years). Even though DX11 came out a while back, no engine has been built from the ground up to actually use it. Most current games either just use DX9 or are “patched” to make use of one or two DX11 features.

      Now that those engines are getting phased out and new ones are coming in, we’re starting to see the beginning of DX11 games. Take a look at Civ V, which scales up to 12 CPU cores, and BF3, which is due this fall.

      THOSE are DX11 games. Now tell me there isn’t much difference.

      I’m sure that if a console ships with a GPU that has DX11-type features, games will quickly take advantage of them, especially as the dev kits mature.

      • BestJinjo
      • 8 years ago

      Yes, I do remember how games changed after DX11 GPUs were released – they didn’t change at all. It took Metro 2033 and Witcher 2 just to match Crysis’ graphics from 2007. And Witcher 2 is only a DX9 game….drumroll.

      The Wii outsold the PS3 and Xbox 360 despite far inferior hardware. Now that the Wii U’s GPU is 4-5x faster than what’s found in the current PS3 / Xbox 360, it’s fast enough to port any of the current games from the PC. The PS4 and the next Xbox won’t ship for another 2-3 years, which gives Nintendo a solid head start.

      All the other games with “advanced DX11 graphics” are a joke from a DX11 perspective: BF:BC2, Dirt 2, Lost Planet 2, Hawx 2, etc. The minor DX11 sprinkles in those games make no difference to 99% of players.

        • Sahrin
        • 8 years ago

        “Yes, I do remember how games changed after DX11 GPUs were released – they didn’t change at all. It took Metro 2033 and Witcher 2 just to match Crysis’ graphics from 2007. And Witcher 2 is only a DX9 game….drumroll.”

        This was my point. Games didn’t change… meaning that the marginal power boost the Xbox 720 and PS4 will have over the Wii U (assuming there is one at all) won’t even be a factor.

    • 5150
    • 8 years ago

    Why wouldn’t it be able to stream Netflix when the Wii currently does? Trying to throw a little FUD out there, Diss?

      • Xenolith
      • 8 years ago

      Yeah, I think you meant HD video. Netflix is streaming 480p over the Wii right now.

        • bcronce
        • 8 years ago

        Even then, the Wii only streams 480p because that’s the max res it can output.

        I don’t think he meant it as “FUD”, but it came off that way.

      • OneArmedScissor
      • 8 years ago

      I heard it can’t even play games. When they said it’s “gonna be 1080p” and has a mysterious “new experience,” they meant a 1080p static image of Mario, which keeps saying, “Woohoo!”

      To make it the most modern gaming platform of all, different discs contain absolutely no content beyond paid DLC style upgrades. For example, if you buy Mario Tennis, Mario will then be holding a tennis racket, which he swings around uncontrollably, in a manner which may or may not be triggered by the motion controller. You can never really be sure.

      But at least there’s going to be Mario FPS, where he’s got a gun, and says, “Woohoom, headshot!” and Mario Office 9 to 5, where he’s also got a gun. The Wii isn’t just for kids anymore.

        • Farting Bob
        • 8 years ago

        I know people who would buy the Wii U if all it could do was display an image of Mario.

        The guy needs to be retired; he’s wheeled out for every Nintendo game regardless of content.
