AMD GPUs could power all three next-gen consoles

Even today, AMD enjoys a rather privileged position in the console world, with its graphics processors powering two of the three current-gen consoles—Microsoft’s Xbox 360 and Nintendo’s Wii. As it turns out, AMD might end up with fingers in even more pies come the next generation. We already know the Wii U will be Radeon-based; HardOCP now reports that Microsoft’s and Sony’s next consoles will also feature AMD graphics.

Neither Microsoft nor Sony has gone on the record to elucidate the matter quite yet, but HardOCP received purportedly reliable intel from tipsters roaming the halls of the Electronic Entertainment Expo last week. Based on their information, the site says AMD’s future console dominance is already a “done deal.” There will be room for Nvidia, but only in the handheld space, where its Tegra 2 system-on-a-chip lacks a direct competitor from the AMD camp. (Future Fusion APUs might do a decent job in tablets, but I doubt they’ll be able to squeeze into something like a Nintendo DS or PSP.)

AMD might even score a win on the microprocessor side of things with the next PlayStation, HardOCP says, although IBM could just as well retain its current dominance. All three of today’s major consoles are packing Power-based processors—even the PlayStation 3, whose Cell chip has a PowerPC component sharing die space with SPE co-processors.

Now, when can we expect the next Sony and Microsoft consoles? A DigiTimes story posted earlier this month suggests PlayStation 4 production will kick off “at the end of 2011” in preparation for a launch some time in 2012. The picture seems a touch murkier on the Xbox front, though Boy Genius Report received word last month that Microsoft plans to unveil its new console at E3 2012. That doesn’t tell us when the device will be out, but I do recall Microsoft officially unveiling the original Xbox 360 about six months before the retail release.

Comments closed
    • clone
    • 8 years ago

    meh, I suspect AMD is more flexible than Nvidia, or more desperate, depending on how you’d word it, given the thrashing they are taking on the CPU side, while Nvidia is moving away from gaming and catering to the more lucrative professional market it dominates.

    if true, and AMD grabs all the consoles, it’ll likely be due to tighter margins. It’d be interesting if it happens, as all games would have to be optimized for AMD architectures, which would really take the wind out of Nvidia’s dubious TWIMTBP program.

    • ish718
    • 8 years ago

    We all know Microsoft and Nintendo are going with AMD. If Sony goes with AMD, I’m sure they wouldn’t use DirectX 11; they’ll probably go with OpenGL 4.0 or some customized OpenGL. O_O

    • spigzone
    • 8 years ago

    Leaving nVidia with … ?? … PhysX and TWIMTBP (more) dead in the water and nearly all future PC games highly optimised for AMD GPU architectures with nVidia optimization thrown in as an afterthought?

    Also, I don’t see any new consoles releasing before fall of 2013 at the earliest, or any unveiling until the E3 before the September-October fall release.

    Corporations love their cash cows, which the 360 and PS3 currently are, and are loath to mess with them until/unless it is absolutely necessary.

    • TrptJim
    • 8 years ago

    At the moment, PowerVR is the leader in the portable space. It’s used in the iPad 2 and the upcoming PlayStation Vita, and is much faster than Tegra 2. We’ll see how Nvidia’s Tegra 3 matches up.

    • ronch
    • 8 years ago

    I wonder why they would not use Nvidia. Is it because of higher power draw? I’ve always thought the extra circuitry for general compute in Fermi makes it harder to keep TDPs low, but if ATI is following the same path, then unless ATI has some nifty power-saving tech looming, I don’t see how ATI makes a better value proposition.

    Then again, maybe Microsoft’s falling-out with Nvidia over the original Xbox and Sony’s current partnership with Nvidia are the reasons. Is Nvidia hard to work with?

      • can-a-tuna
      • 8 years ago

      It must be hard to see facts with green glasses on. AMD has better TDP, better perf/die, better value for money, more advanced multi-display tech, and is up to half a year ahead in manufacturing process technology, and they don’t use nasty tactics to gain advantage and market share. I’m sure AMD has a better and more gaming-oriented GPU roadmap than Nvidia, plus they have Fusion, which will most likely be used in future console generations, if not Intel’s variation, which I doubt because of their graphics incompetence.

        • ronch
        • 8 years ago

        Funny how you jump to the assumption that I am an Nvidia fanboi. No. I have owned more ATI cards than Nvidia cards. But I am not a fan of either.

          • sweatshopking
          • 8 years ago

          he’s an AMD nazi. shameful even

      • ChangWang
      • 8 years ago

      You read my mind on the nvidia part. Last I remember, MS and nvidia didn’t really part on the best terms with the first Xbox. I’ll also bet you that nvidia is charging sony the same price now that it did at the PS3 launch. Factor in power, heat, and noise… well mostly power and heat. That doesn’t leave nvidia with much that could fit in a console like box with all the other heat generating parts…

        • Palek
        • 8 years ago

        Not only that but, as far as I know, Microsoft has to pay royalties to nVidia for every Xbox 360 sold in order to enable backwards compatibility with old Xbox games.

      • khands
      • 8 years ago

      AMD has traditionally had better licensing agreements than Nvidia (usually the console makers own the part in full vs. having to license it for as long as they need to produce the console).

    • Rakhmaninov3
    • 8 years ago

    I hope Sony and MS don’t price their new consoles into the stratosphere. Wasn’t PS3 $600 when it first came out? 360 was also expensive, and Wii mopped up because you didn’t need a second mortgage to pay for it (and it was so much fun).

    I hadn’t even heard of any work being done on the PS4 before I read this article; I was thinking it’d be at least a couple of years before we saw anything out of Sony.

    • FuturePastNow
    • 8 years ago

    I think Microsoft would be the most likely to use the complete APU. That would give new meaning to the words console port. We know Nintendo will just be using an AMD GPU with a PowerPC processor, and I don’t know what Sony will do.

      • BestJinjo
      • 8 years ago

      You realize an HD5770 (a low-end AMD GPU) has a die size of 170 mm^2, right? How do you expect them to put even that GPU together with a CPU on one die? And what if you want a faster GPU than an HD5770?

      A console’s main purpose is to play videogames. It isn’t feasible to use a modern mid-range GPU like the HD6850/GTX560 in an APU design. That probably won’t be viable until 2013 on 22nm.

        • mczak
        • 8 years ago

        HD5770 isn’t “low end”.
        I can’t see any problem putting that on one die together with a CPU. A 300-350mm² chip should be quite viable for a (high-end) next-gen console (the original PS3 had two ~240mm² chips), which leaves plenty of die area for the CPU.
        Something with the performance of an HD6870 (the Barts die is just 255mm², and GF114 is definitely quite a bit bigger) should be a piece of cake to integrate with a CPU on 28nm (which is probably what these consoles will be using).
        (Note I’m not saying it’s going to be an APU for these consoles, though I’m still fairly convinced the Wuu will use one; just that it should be doable if that’s the performance target they have in mind.)
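
        (A back-of-envelope sketch of that die-area budget in Python, using the figures quoted in this thread; the 120 mm² CPU block is a placeholder assumption, not a known design.)

          # Rough die-area check using numbers cited in this thread.
          juniper_gpu_mm2 = 170  # HD5770-class GPU die, per the comment above
          cpu_block_mm2 = 120    # assumed multi-core CPU block (placeholder)
          budget_mm2 = 350       # plausible high-end console die budget, per this comment
          combined = juniper_gpu_mm2 + cpu_block_mm2
          print(f"Combined die: {combined} mm^2, headroom: {budget_mm2 - combined} mm^2")
          # For scale, the original PS3 shipped two chips of roughly 240 mm^2 each.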

          • BestJinjo
          • 8 years ago

          HD6770 costs $95:
          [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814131434&cm_re=hd6770-_-14-131-434-_-Product[/url<]

          By definition of discrete desktop GPUs, the <$99 market segment is the lowest segment and therefore low end:
          [url<]http://www.xbitlabs.com/articles/graphics/display/nvidia-geforce-gtx-460.html[/url<]

    • sydbot
    • 8 years ago

    Wondering if the PS4 will be backwards compatible. They’d have to go with a chip that can look and behave like a Cell, because I just don’t see emulation working that well for 7 special purpose cores.

      • BestJinjo
      • 8 years ago

      Who wants to play PS2/PS3 games on their $500 PS4? If you want graphics from 2006, just buy a used PS3. I never understood the argument for backwards compatibility. If you’re still nostalgic enough to replay your old PS3 games, well, you have a PS3 for that.

        • BobbinThreadbare
        • 8 years ago

        It’s nice to just have one system, and it can throw on some filters to make older games look nicer.

        I for one am glad I got a PS3 before they got rid of PS2 compatibility completely. Although, it looks like most of the best games are getting HD releases now, which is also good in its own way. Can’t wait for MGS3 in 1080p.

      • Anonymous Coward
      • 8 years ago

      If it’s not backwards compatible, it’s really going to add to an appearance of directionlessness and disregard for customers, and make it harder for me to ever justify getting a Sony console. When I buy a game, I want to be able to play it at any time for a very long time, just like on a PC.

    • ClickClick5
    • 8 years ago

    Both Microsoft and Sony have had an… interesting time with Nvidia. Microsoft had a huge war with Nvidia over pricing, and Sony is more or less in the same boat.

    So for them to all go AMD is understandable.

    • xeridea
    • 8 years ago

    This is kind of a no-brainer, since the AMD GPUs are ~1/3 the die size, making them a lot cheaper to manufacture, which is a critical concern for consoles to keep the cost down. Also, lower cooling/power supply needs = lower cost. 3x the die for a 10% performance lead just doesn’t seem worth it, even considering the GPGPU side of things, because for equal silicon, AMD totally wins, and watt for watt, isn’t far off. With Nvidia making gargantuan GPUs, low yields, and sub-par board circuitry, I honestly don’t see how they stay in business these days.

      • Chrispy_
      • 8 years ago

      By having decent professional workstation drivers.

      Seriously, the profit margin on Quadro cards is unbelievable, it must be something like 500% or more….

      • BestJinjo
      • 8 years ago

      I agree with better power consumption (thus requiring less robust cooling), but your comment about AMD GPUs being 1/3 the size of NV’s makes no sense.

      HD6950/6970 have 2.6 Billion transistors on 40nm process, occupying a die size area of 389 mm^2.
      GTX480/570/580 have 3.0 Billion transistors on 40nm process, occupying a die size area of 520-530 mm^2.

      NV’s die size for its fastest GPU is 33-36% larger, not 3x the size!!

      The last time AMD had a really small GPU die was with the HD3870 when it was only 190 mm^2.

    • UberGerbil
    • 8 years ago

    Well, the next gen of the three current console leaders, maybe.

    Though, to quote a line, [i<]there is another[/i<]...

      • derFunkenstein
      • 8 years ago

      as much as I’d love it, Sega ain’t comin’ back. 🙁

        • CuttinHobo
        • 8 years ago

        Definitely the 3DO 2!

          • Deanjo
          • 8 years ago

          Nu uh, TurboGrafx 128

            • Palek
            • 8 years ago

            I’ll spoil your fun and throw in a more plausible contender: Apple.

            • CuttinHobo
            • 8 years ago

            Pippin 2! Someone squat on that domain name!

            • no51
            • 8 years ago

            Ya’ll crazy. Obviously Phantom is going to release their console.

      • dpaus
      • 8 years ago

      Duke (Nukem)’s [i<]twin sister[/i<]?!?

        • swaaye
        • 8 years ago

        Oh we’re not ready for the Duchess.

          • Krogoth
          • 8 years ago

          Thanks for the imagery.

          Duchess Nukem would end up in one of these two categories.

          A.) Takes all of the hot blonde chick stereotypes and rolls them into one package.

          B.) A butch chick with a hyper-feminist spin.

          Neither of them are good.

          I curse you rule #63!

    • lilbuddhaman
    • 8 years ago

    What happened to MS and Sony both stating their systems were going to last till ~2014?

      • bcronce
      • 8 years ago

      ARM announced a cellphone chip, to be released in 2013, that will be more powerful than either the 360 or the PS3. Losing to a cellphone sucks.

        • xeridea
        • 8 years ago

        Yeah, I saw that and I think it’s BS. While cell phones have come a long way and can run decent-looking games, they are nowhere near there. I think it’s just ARM being extremely optimistic/deceitful to try to get market share. Reminds me of Nvidia trying to twist words/benchmarks around to fool suckers into thinking they are better than they actually are. It is good that embedded systems are becoming faster and able to do more, but ARM needs to quit thinking they are the shiz. ARM is fine for cell phones and such because the computing requirements are super low; they suck at doing anything that actually requires real processing on a screen bigger than a credit card.

          • bcronce
          • 8 years ago

          Well, it’s not too hard to expect. The 360 will be about 8 years old in 2013. Both GPUs and CPUs have actually been outpacing Moore’s Law, with about a doubling every year. 2^8 = 256 times.

          When you are nearing a 300-times difference in performance between 2013 tech and 2005 tech, consoles start to sound kind of slow.

          edit: the normal rate was a doubling every 1.5 years (18 months); I messed up my math. Probably closer to a 200-times difference by 2013, as most of the outpacing of Moore’s Law has been in the past 2-3 generations and not every generation since 2005.
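
          (For what it’s worth, a quick Python sketch of how far apart the two doubling rates mentioned above end up over the same 8 years; this is only the compounding arithmetic, not a performance claim.)

            # Compounding check for the 2005-to-2013 window discussed above.
            years = 8
            for label, doubling_period in [("1-year doubling", 1.0), ("18-month doubling", 1.5)]:
                factor = 2 ** (years / doubling_period)
                print(f"{label}: ~{factor:.0f}x over {years} years")
            # 1-year doubling   -> ~256x (the 2^8 figure above)
            # 18-month doubling -> ~40x, so the outcome hinges on the assumed rate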

            • xeridea
            • 8 years ago

            First, you are wrong on your 1.5x figure; second, Moore’s law is only about transistor density. Smaller processes have smaller returns due to leakage and other factors (currently ~40%). At a 40% return, it takes just over 2 generations to double power efficiency. This return is shrinking every generation. ARM is capped on power. Video cards generally have been doubling in performance per generation, not 3x. Part of the increase has come from increased power usage, with single-GPU cards now using up to 300 watts. Processor increases have slowed way down, except in multi-threaded performance, and that has overhead, depending on the application. The main process in 3D APIs is hard to multi-thread due to coherency limitations; other parts can be. Point being, there are limitations here. You need to rethink your figures: cell phones are perhaps at PlayStation 1.7 level at best, and would beg for mercy @ 1080p (requiring ~6x the graphics horsepower). I am not saying they haven’t come a long way, just that ARM is way too optimistic here.
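
            (A quick sketch of the pixel-count arithmetic behind the “~6x graphics horsepower” figure above; the 800x480 WVGA handset baseline is an assumption, since the comment doesn’t name a phone resolution.)

              # Pixel-count comparison: 1080p vs. a typical 2011 handset screen.
              phone_pixels = 800 * 480      # WVGA baseline (assumed)
              console_pixels = 1920 * 1080  # 1080p target
              print(f"1080p pushes ~{console_pixels / phone_pixels:.1f}x the pixels of WVGA")
              # ~5.4x, in the same ballpark as the ~6x figure in the comment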

            • bcronce
            • 8 years ago

            I never said they’ll be 200 times faster; I’m saying that current tech could be 200 times faster, so instead make it the same speed and use less power.

            “Video cards generally have been doubling in performance per generation, not 3x.” Yes, but this next generation is going to be 2.5-3 times faster than the prior one, and the generation after that, in 2 years, will be ~5 times faster than this next one. By 2013/14, GPUs will be about 16 times faster than current for shader/compute. Last I read a roadmap: 16 times faster, 8 times less power per flop, by 2013. Current graphics is already about 64 times faster than 360/PS3-generation graphics; now throw another 16 times on top of that.

            I’m not saying they *will* be perfectly as fast as a 360/PS3, I’m saying I would not be surprised if they could pull it off. I’m sure it will be more optimized for power and the fact that they have a smaller display, but I would not doubt the peak processing power backing those chips.

      • bdwilcox
      • 8 years ago

      [url=http://www.youtube.com/watch?v=Ll3uipTO-4A<]This.[/url<]

      • ShadowTiger
      • 8 years ago

      This hasn’t changed. They have simply switched strategies from having 1 console and a “slim” version to inundating the market with accessories.

      Kinect and move were the last push to stay relevant… expect to see more junk on the horizon.

        • sweatshopking
        • 8 years ago

        I would hardly call Kinect “junk”. I think it’s going to change everything.

          • ShadowTiger
          • 8 years ago

          McDonald’s is junk (food)… it changed everything… it’s still a big influence… consider your point insufficiently detailed, sir!

            • sweatshopking
            • 8 years ago

            fair enough, but I think Kinect IS the market leader and, currently, the best tech. It works beautifully. In that case, I’d say it’s not junk, and it will change everything.

          • deinabog
          • 8 years ago

          It may not be junk but it’s not that interesting either.

      • XaiaX
      • 8 years ago

      They still make new Playstation [b<]2[/b<] games. There's no reason to think that a new console launch would mean the end of the existing console. The only time that's happened was when MS put the original Xbox to a swift death on the launch of the 360, but that had more to do with the impenetrable fixed hardware costs of the original Xbox. Namely, hard drives don't ever get cheaper, they only get bigger, so that puts a distinct price floor on your design. With modern flash memory being cheap enough to be usable as core system storage, and flash not having the same downward scaling issue as HDD, there's no reason to think that the Xbox 360 would go away upon launch of a new design. And the PS3 could take a similar approach, if it needs to get the price down to match, although their system architecture makes that quite a bit trickier since they can't afford to have a small flash storage like 4GB the way the 360 arcade does.

      • SPOOFE
      • 8 years ago

      I wouldn’t be surprised if both kept supporting their current gen for a year or two after the successors come out. The reason MS dropped support for original Xbox so quick was mostly due to licensing; they couldn’t drop the prices on system components enough because they didn’t own the IP. I don’t think that’ll be an issue this time around.

      • crabjokeman
      • 8 years ago

      $$$$$$$$$$$$$$$$

      • FuturePastNow
      • 8 years ago

      Even if their new console releases are two years away, it’s time for the design phase to begin in earnest, and that means setting major component selections in stone.

    • AlvinTheNerd
    • 8 years ago

    I can very much see any of these consoles just being an AMD APU system.

    AMD already mass-produces them. The PS3’s CPU was supposed to find other uses, but in the end the PS3 is the only major device using Cell, which means any improvement to the silicon design in reducing the node size has to be done by and for Sony alone. That gets expensive. I think it will be a lesson that Sony learns.

    Plus, Fusion is basically all they need. It’s already designed, it’s one chip, easy to build a box around, easy to cool, and it has the performance they need. And AMD might stick to an FM1-like pin setup through several generations, like they did for AM2-AM3, which would make retooling for smaller nodes, a PS4 slim, and other changes a breeze.

    As for the Xbox and Wii U, I think a slightly modified Fusion with 400 stream cores and 2 x86 cores would sell them.

    Why reinvent the wheel, or try to get different chip makers like Intel, IBM, and Nvidia working together? AMD is offering the whole package.

      • BestJinjo
      • 8 years ago

      You realize the Wii U has been confirmed to use an RV770 GPU from AMD, right?

      There are only 2 RV770 GPUs:

      4850 with 640 shaders, 40 Texture Units, 16 ROPs
      4870 with 800 shaders, 40 Texture Units, 16 ROPs

      The fastest APU AMD sells is the E6760 (The E6760 is the latest and greatest AMD embedded video card, utilizing the Turks GPU (6600/6700M) from AMD’s value lineup):
      [url<]http://www.anandtech.com/show/4307/amd-launches-radeon-e6760[/url<]

      E6760 with 480 shaders, 24 Texture Units, 8 ROPs

      If PS4 or Xbox3 launch with a current AMD APU-style GPU, then they have 0 chance of competing with the Wii U. The only possibility is to use a next-generation AMD APU (but these chips are not even available, since 28nm isn’t even in production for GPUs).

      The most reasonable expectation for PS4 or Xbox3 (if they both use an AMD GPU) is something like the 6800M/6900M mobile GPUs:
      [url<]http://www.anandtech.com/show/4475/anandtech-mobile-graphics-guide-summer-2011/2[/url<]

      They are far more power efficient than Fermi mobile offerings and cost less to manufacture. If PS4 or Xbox3 don’t use these GPUs at minimum, they’ll be a total failure imo.

        • TDIdriver
        • 8 years ago

        You, sir, are quite ignorant.

        Both the HD 4850 and HD 4870 have 800 shaders. The HD 4830 had 640.

        Nice try though.

          • derFunkenstein
          • 8 years ago

          It’s the same chip with parts turned off, though. My guess is that the Wii U’s GPU will not be made on a 55nm process, probably down to 40nm.

          • BestJinjo
          • 8 years ago

          Oh, my bad, I made a slight mistake. My point still stands: both the 4850 and 4870 are far more advanced than anything AMD has in APU format. So the notion that PS4 or Xbox3 will use an APU-style GPU is probably flawed, unless AMD designs a custom 28nm APU in 2012 for them. I doubt Microsoft and Sony will let an 800 SP RV770 in the Wii U beat them.

          But thanks for missing my point entirely and focusing on the 640 vs. 800 SP error. meh.

          • Anonymous Coward
          • 8 years ago

          [quote<]You, sir, are quite ignorant.[/quote<] Nice.

        • Geistbar
        • 8 years ago

        Using the APU to handle all the graphics grunt work would probably turn out about as poorly as you expect. However, they could use it as a processor to handle physics and the like, much as GPGPU is (slowly) trying to do on the desktop end.

          • Arag0n
          • 8 years ago

          Agree. Pairing an APU with a dedicated GPU (the discrete GPU for graphics, the APU for GPGPU work) would be a nice combo. Up to 5-6 times the performance of today’s Xbox 360 inside a cheap package.

        • JustAnEngineer
        • 8 years ago

        A next-generation console needs to be able to drive 1920×1080 at 120Hz for 3D. I’d like some anti-aliasing with that, too.

          • Voldenuit
          • 8 years ago

          3D TVs made up 2% of TV shipments in 2010. If you’re talking about installed base, it’s even lower than that. And seriously, who wants to wear 3D glasses to play a game? Especially with all 3 consoles providing some form of motion sensor controller tech.

          Manufacturers and analysts have been bullish over 3D TV sales* as it promises to increase their margins and justify large markups, but in this economy, I don’t think there are as many people biting as the executives have hoped. 3D computer displays have been around in one form or another for 15+ years now, but the market penetration is miniscule. 3D TVs may just be another flash in the pan – Sony might try to target 3D for its consoles as it makes 3D TVs, but for everyone else, supporting 3D will add to the component cost of their console without benefiting the vast majority of their prospective customers.

          * I’ve seen figures estimating 11% for 2011 and up to 52% for 2012, but you know what? Less than 1 in a 100 people I know have, or plan to get, a 3D TV.

            • JustAnEngineer
            • 8 years ago

            If the console is in the marketplace for 8 years, what do you project the adoption rate to be in that period?

            3D gaming is something that is here today with existing consoles (Playstation 3 has several titles – [i<]Tumble[/i<], for example). A next generation console needs the graphics power to push top titles at 1080p and 120 Hz (60 Hz effective with stereo shutter glasses).
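
            (A rough sketch of the pixel-throughput gap implied by that target; the 720p at 30 fps baseline for typical current-gen titles is an assumption for illustration.)

              # Pixel throughput: 1080p at 120 Hz (60 Hz per eye) vs. an assumed 720p/30 fps baseline.
              current_rate = 1280 * 720 * 30    # pixels per second, typical current-gen title (assumed)
              target_rate = 1920 * 1080 * 120   # pixels per second for the stereo-3D target
              print(f"Target needs ~{target_rate / current_rate:.0f}x the pixel throughput")
              # ~9x, before counting any extra cost for anti-aliasing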

        • Brad Grenz
        • 8 years ago

        Actually, the Wii U has not been confirmed to use anything. It has been rumored to sport a GPU descended from the R700 line of products, but that includes anything from an 80-shader budget part to the 800-shader cards you’ve mentioned. Given what developers have said and the demos shown so far, the most likely configuration is a 320-shader design, possibly in a single chip with 3 PowerPC cores, Fusion-style.

        But your point that any fusion design is too slow for a next gen PlayStation or Xbox stands. On the CPU side a Llano would be slower than either the 360 or PS3 in many tasks and on the GPU side a Fusion chip might be theoretically faster, but only marginally so and limited by memory bandwidth.

        Unfortunately, most speculation on next-generation designs is colored by TSMC’s failure to deliver a 32nm node. GPU designs have been stuck at 40nm for a long time, but the jump to 28nm will be enormous in terms of what is possible at a given die size and power envelope. By the time the PS4 and Xbox 720 are shipping, I would expect at least GTX 580 or 6970 performance levels. For the PS4, at least, it would also be pretty trivial to create a quad-Cell processor that would destroy any possible competing design for CPU. We’re talking about machines that will be far, far more powerful than any Fusion chip AMD has going.

          • swaaye
          • 8 years ago

          I think you are overrating the 360 CPU. I’ve gotten the sense from developers on Beyond3D that it is rather gimpy for most typical CPU-like tasks, even if it is a potential SIMD beast. I doubt it’s in the same class as an Athlon 64 X2 in the end. Maybe more like a 3-core Atom, considering it’s in-order.

          I think a good example of its limited performance may be Oblivion. The game is CPU-heavy and performs pretty badly on the 360. It also has limited threading, so you’re seeing the capability of one or two cores.

            • mesyn191
            • 8 years ago

            Joker and other developers who’ve worked on the X360 and PS3 have said that Xenon is better in some ways and worse in others compared to Cell; that doesn’t sound so gimpy at all.

            Up against an A64 X2 or whatever modern-ish OoOE CPU running mainstream OSes and programs, I’m sure it’d get whipped, but then the same is true for Cell. Both of those chips really require tuning to get the most out of them.

            Oblivion was also a PC port…

            • swaaye
            • 8 years ago

            I thought Oblivion was a console port. 😉

            Modern developers don’t want to go down to the metal. Projects are too big and they want their game to run well on everything with the minimal amount of effort. Oblivion is a mishmash of middleware, as many games now are.

            I’m curious whether, with the next machines, we’ll see them lay off the cost-compromised multicore chips that need tons of special attention to perform decently. Developers sure have bitched about them enough.

            • mesyn191
            • 8 years ago

            Yea, that is what I meant re: Oblivion.

            It’s true none of them want to get “down to the metal” and do ASM by hand, and the console guys don’t. But tuning doesn’t necessarily require that, just lots and lots of practical working knowledge of the hardware you’re targeting.

            I doubt that next-gen systems will be easy to work on, compared to say a PC, at least if you want to get anywhere near peak performance. I’m expecting more and simpler cores, or at least cores similar to Xenon but with bigger and badder vector FPUs, which, believe it or not, may be all they really need.

          • BestJinjo
          • 8 years ago

          Comment Fail.

          [url<]http://www.tomsguide.com/us/Cell-GPU-Bulldozer-Wii-U-PlayStation,news-11809.html[/url<]

          “Although AMD already announced that Nintendo is using a custom AMD Radeon RV770 GPU in the upcoming Wii U console....”

          Thanks.

        • swaaye
        • 8 years ago

        As far as I know, it was just a rumor some news site started. It doesn’t make any sense for Nintendo to grab a 3-year-old off-the-shelf PC GPU that doesn’t have the usual console hardware integrated. It would not be a cost-efficient design.

        I expect something with a VLIW4/5 shader design but customized, perhaps with eDRAM again, and having a northbridge built in. 28-40nm manufacturing process.

      • bluepiranha
      • 8 years ago

      If this does happen, do you think Microsoft and/or Sony will wait for a slightly upgraded Fusion APU for their purposes, or are the current parts’ compute + graphics capabilities “good enough” for console duty?

      Then again, as for getting different chip makers to get things working together…isn’t that what motherboards are essentially all about? Just a thought.

        • Farting Bob
        • 8 years ago

        The latest Fusion chips are good enough to go into current-gen consoles, no doubt, but the next round from MS and Sony will want to be able to do 1080p at 60fps (for 3D gaming). AMD APUs won’t be able to handle that for a while, even if you turn down the AA etc.

          • BestJinjo
          • 8 years ago

          Exactly!! The current GPU in Llano is a joke. It can’t even beat an HD5570 (itself a 400 SP GPU).

          [url<]http://www.xbitlabs.com/articles/cpu/display/amd-a8-3800_17.html#sect0[/url<]

          There is no way AMD is releasing a full-fledged Bulldozer 6- or 8-core processor with an 800+ SP GPU onboard by 2012. And I doubt MS or Sony will be happy to launch their next-generation console with a 400 SP GPU in an APU-style design. That won’t even beat the Wii U. They’d need a 6800M/6900M at least to be taken seriously.

    • Farting Bob
    • 8 years ago

    I thought word on the street was that we still have at least until 2013 before we see a new PS and Xbox?
    An autumn 2012 release for the PS4 doesn’t leave long for them to get things into gear with the developers, distributors, marketing, and hardware guys, and I haven’t heard much definite talk from them yet.

      • TheEmrys
      • 8 years ago

      I didn’t know Sony cared about getting things into gear.

      • xeridea
      • 8 years ago

      I am not a console gamer at all, but this is possibly good news for PC gamers, as their games might be somewhat better-looking crappy ports of console games. Like IE holding back the web, consoles hold back PC games considerably due to long life cycles. PS3/Xbox 360 looked great… 6 years ago. I don’t play many games that are console ports (maybe NFS, but not in a while; the last demo looked poorly done, as is expected from EA), so it’s not an issue for me, but I see others playing PC games that have been ported, and they are awful.

        • SPOOFE
        • 8 years ago

        [quote<] consoles hold back PC games considerably due to long life cycles.[/quote<] Incorrect; the PC has benefited greatly from a market which encompasses a much larger and diverse audience than PC gaming alone ever could. If the market numbers weren't there, the huge budgets for most AAA titles would never have materialized.

          • swaaye
          • 8 years ago

          Well we could probably do without most of those AAA games and be better off. 😉

          Besides, it’s not like the console world is a new thing. It’s more that the FPS market has exploded in the past 10 years, the consoles became the low-cost-of-entry platform for FPS gamers, and then every company in the universe needed to get in on it with their corridor-shooter clone.

            • SPOOFE
            • 8 years ago

            The FPS market started exploding in the early ’90s and is still exploding. Or maybe we call that a “burn”, like Centralia. But to attribute the rise of FPS’s to the rise of consoles? Ignorance. I would assert that the FPS craze is what knocked PC gaming off the throne in the first place.

            • swaaye
            • 8 years ago

            Yeah I attribute today’s console market to the explosive rise of shooters. That’s what people want on their consoles these days. Call of Duty, Halo, GTA, etc, etc. Even sports games weren’t able to bring in the console gamer population like those have. It really kicked in around the days of the N64 and PS1, when multiplayer 3D shooters hit the consoles. And then Xbox brought in that LAN connection….

            • SPOOFE
            • 8 years ago

            I still don’t see how it’s “explosive”. In terms of pop-cultural awareness, it’s been going on since Doom. In terms of market penetration, it’s been happening since at least Space Invaders. It’s hard to attribute a phenomenon to “something new” if that “something new” has been around for decades.

            Consoles are a result of popularity and large sales numbers. Huge volume makes things like standardization and platforms (and platform control) desirable. The late ’90s in the PC world showed obvious symptoms, like the rise of Direct X (standardization) and Big-Two dichotomy (AMD/Intel and nVidia/ATI). Throw in the ever-dropping prices of amazing computing hardware and consoles are essentially inevitable. One didn’t cause the other; they were both caused by the same thing.

          • xeridea
          • 8 years ago

          Reasons:
          Crappy console ports are a mockery to PC games.
          Dumbed down PC games to cater to the lazy console gamers.
          Dumbed down games due to limitations of consoles (not only processing/memory, but the limitation of having a controller as the only input).
          Having to develop games on 2-3 consoles then doing an afterthought PC port.
          How configuring your Xbox controller on your PC game just doesn’t seem right.
          All the crappy console-only stuff that gets ported to PC (too many save points, zero config options, a crappy hard-to-navigate UI, ignorance of the existence of a mouse/keyboard).

          Explain… or try.

            • SPOOFE
            • 8 years ago

            [quote<]Crappy console ports are a mockery to PC games.[/quote<]
            Crappy PC games are a mockery to PC games, just as it’s always been. Or do you think Daikatana was a console port? Ever hear of Die Hard: Nakatomi Plaza? How much consolitis do you think that suffered from?

            [quote<]Dumbed down PC games to cater to the lazy console gamers.[/quote<]
            No PC game appeals to lazy console gamers.

            [quote<]Dumbed down games due to limitations of consoles (not only processing/memory, but the limitation of having a controller as the only input).[/quote<]
            Because this TOTALLY prevented the PC from having any bad games before consoles started outselling.

            [quote<]Having to develop games on 2-3 consoles then doing an afterthought PC port.[/quote<]
            Biggest market gets the focus? Color me shocked, SHOCKED.

            [quote<]How configuring your Xbox controller on your PC game just doesn’t seem right.[/quote<]
            Then don’t do it. You’re complaining about having TOO MANY options thanks to consoles, now? We call that “internally contradictory”.

            [quote<]Explain... or try.[/quote<]
            It might require some effort if the things you say weren’t so stupid.

          • BobbinThreadbare
          • 8 years ago

          I’m not sure that’s true. Firstly, many AAA titles are not released on PC, or are released so late they might as well not have been (e.g., Halo, Gears of War, Forza, Gran Turismo). Secondly, a AAA title which doesn’t take advantage of PC hardware isn’t really a AAA title for PC. It might be a AAA title on a console, but when all the textures are made for 720p, the mouse support sucks, and there are no mods or dedicated servers, it’s not a AAA PC title.

          Sure it’s nice that Ford makes the awesome Focus RS in Europe, but if they don’t sell in the US it’s not very helpful to American buyers, is it?

          Edit: you are also assuming the PC market wouldn’t have grown significantly had consoles not taken up such a large space.

      • l33t-g4m3r
      • 8 years ago

      The Wii U might be encouraging them to release earlier, but if they do, it’ll be a caught-with-pants-down scramble.

        • Kurotetsu
        • 8 years ago

        Isn’t the Wii U only going to have the graphics horsepower of the PS3, roughly? It doesn’t seem like so much of a threat that Sony and MS have to throw out a brand new console to counter it.

          • BlackStar
          • 8 years ago

          No. The RV770 is at least three times faster than the PS3 GPU.

          • khands
          • 8 years ago

          The power difference between the Wii U and the PS3 is going to be something like 1/3 to 1/2 greater than the difference between the Gamecube and the Wii, so take that as you will.

            • BestJinjo
            • 8 years ago

            False. The RV770 has 800 shaders. The HD4870 is at least 4-5x faster than the 7900GS/GTX-class GPU found in the PS3.

            The HD4850/4870 destroy the GPU found in the PS3:
            [url<]https://techreport.com/articles.x/18682/6[/url<]

            The Xbox 360 isn’t much better, as it has only 48 stream processors.

      • Rurouni
      • 8 years ago

      I also don’t think Sony will launch the PS4 in 2012.
      If the story is true, then it might not be a true PS4; more likely a PS3.5. Basically a PS3 with more RAM and a faster CPU+GPU. Sony can mandate that all games must be compatible with the PS3, but newer games could render at a higher resolution and/or include better textures that take advantage of the PS3.5’s bigger RAM. Maybe with the PS3.5 you could play those stereo-3D games at the same resolution and speed as the non-S3D versions run at on the PS3.

      I don’t think a PS3 with bigger RAM and a faster CPU+GPU would be much more expensive than the current PS3. Probably a fraction more expensive, mostly due to the added RAM.

      Then again, they still have the Vita to promote, so I don’t think this PS4, or even my theoretical PS3.5, will materialize in 2012.

        • Anonymous Coward
        • 8 years ago

        [quote<]Basically a PS3 with more RAM and a faster CPU+GPU.[/quote<]

        The way things went for the PS3, they probably should be playing it conservatively, so yeah, a PS3 with more RAM, CPU, and GPU sounds sensible to me. Even if MS makes a beastly console, would it hurt Sony? Sony would have the price advantage, and most games would have to be written to the weakest platform anyway (especially true once the Wii U is thrown in). I would be really surprised to see anyone aim for the clouds this time around. Sony will of course call their next console the PS4, even if it’s a modernized PS3.

      • Corrado
      • 8 years ago

      My guess is that they ANNOUNCE them in fall of 2012 for holiday release 2013. That would make the most sense really. Gives them 2 years to get everything in place.

    • bdwilcox
    • 8 years ago

    It’s going to take Microsoft a little extra time to integrate all its cooling design flaws.

      • Farting Bob
      • 8 years ago

      this time the ring will turn purple when it fails!

        • Goty
        • 8 years ago

        That’s progress right there!

      • xeridea
      • 8 years ago

      Totally… since it won’t be using Fermi silicone of the Power Whore variety.

        • TDIdriver
        • 8 years ago

        haha, “Fermi silicone”

        • ericfulmer
        • 8 years ago

        Didn’t the FDA just lift the ban on Fermi Silicone? 😛

        • smilingcrow
        • 8 years ago

        Fermi silicone is at least better than floppy silicone!

      • TaBoVilla
      • 8 years ago

      lol
