Rumor: ‘Xbox 720’ to have Radeon HD 6670-class GPU

Either we’re looking at an inaccurate rumor here, or the next-generation Xbox won’t be the graphics powerhouse some of us expected. The folks at IGN claim to have gotten details about the “Xbox 720,” and they say the upcoming console will feature a GPU equivalent to today’s Radeon HD 6670.

Just to put things in perspective, the 6670 costs $69.99 at Newegg. It has 480 shader ALUs, a 128-bit memory interface, and the ability to process 8 pixels and 24 texels per clock cycle. IGN claims this GPU will endow the Xbox 720 with six times more graphics horsepower than the Xbox 360, which might well be the case… but the Xbox 360’s GPU is the equivalent of an archaeological relic these days.

A better comparison would be with AMD’s new Radeon HD 7970, which has 2048 shader ALUs, a 384-bit memory interface, and the capacity to process 32 pixels and 128 texels per clock. To make matters worse, IGN says the Xbox 720’s graphics chip will enter production “by the end of 2012,” and that the console itself won’t be out until the fall of 2013, almost two years from now. At that point, even the 7970 will be looking long in the tooth.
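
To put rough numbers on those per-clock specs, here’s a quick back-of-the-envelope sketch in Python. The unit counts come from the paragraphs above; the 800MHz and 925MHz clocks are the stock desktop reference clocks (a console derivative wouldn’t necessarily match them), and the 240 GFLOPS figure for the Xbox 360’s GPU is the commonly quoted one.

    # Back-of-the-envelope theoretical throughput from the per-clock specs above.
    # The clocks are stock desktop reference clocks (assumed), and "2 FLOPs per
    # ALU per clock" assumes one fused multiply-add per cycle.
    def rates(shader_alus, pixels_per_clk, texels_per_clk, clock_ghz):
        return {
            "GFLOPS": shader_alus * 2 * clock_ghz,
            "Gpixels/s": pixels_per_clk * clock_ghz,
            "Gtexels/s": texels_per_clk * clock_ghz,
        }

    hd6670 = rates(480, 8, 24, 0.800)     # -> 768 GFLOPS, 6.4 Gpix/s, 19.2 Gtex/s
    hd7970 = rates(2048, 32, 128, 0.925)  # -> ~3789 GFLOPS, 29.6 Gpix/s, 118.4 Gtex/s
    xenos_gflops = 240                    # commonly quoted Xbox 360 GPU figure

    for name, r in (("HD 6670", hd6670), ("HD 7970", hd7970)):
        print(name, r, "->", round(r["GFLOPS"] / xenos_gflops, 1), "x the 360's shader throughput")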

I suppose a 6670-class GPU might make sense if Microsoft intends the Xbox 720 to be an uber-cheap device meant to double as a Joe Everyman’s media hub (and do a little gaming on the side). Still, if there’s any truth to this rumor, games might not make a truly substantial leap in visual fidelity for a long, long while.

Comments closed
    • sschaem
    • 8 years ago

    The next gen consoles are going to be just *phenomenal*, and a 6670-class GPU is not in the cards.

    The cost to MS is a fixed license fee + fab cost. MS could double the compute performance of the system for ~$30.
    So cutting $30 of production cost to end up with a system with half the performance is out of the question.
    Their limits are power, cooling & real estate. But so far, 4+ billion transistors at ~250 watts is on the table.
    So we won’t see 7970 raw performance from the next-gen consoles because of those imposed limitations,
    but the end result in games will be no less impressive.

    What MS is putting together is a console for the next decade, one that compels current console owners to upgrade.
    And why upgrade if the system is only 2-3x faster? Games would look almost identical.
    That’s what a 6670 would deliver: 2.2x faster is just enough to run at 1080p instead of 720p, and people might not even see a difference.

    The new systems will pack about ~10x the CPU/GPU compute performance and include new features.
    So expect to see a small ‘SSD’ on all systems, no exceptions…
    (This would be used in conjunction with a version of the AMD VMEM architecture introduced in their 7 series.)
    Note: this ‘SSD’ is not a replacement for the traditional HDD.

    That’s my prediction and I’m sticking to it :)

    • PetMiceRnice
    • 8 years ago

    I haven’t read all of the posts related to this topic, but I reckon that they are concerned with keeping the amount of heat generated down to avoid massive failure rates. Don’t forget that consoles are not housed in big enclosures like a gaming tower. If the new Xbox is a few times faster overall than the 360, I would be happy, but then again I don’t really game much anymore.

      • Airmantharp
      • 8 years ago

      You can put whatever amount of gadgetry into whatever space you like, limited by the physics of available technologies alone- but that’s nothing new.

      I don’t think heat, by itself, has anything to do with it, or with the failure rates of the previous generation- it was simply bad engineering.

      As an aside, I did get to personally see the scale of BOTH companies failures during my supervisory tenure at a UPS sorting facility- at the height of the Xbox 360 return debacle, we had a dedicated line with trailers reserved specifically for them, due to so many thousands coming through on a daily basis. The PS3 side of things was a little more gradual, but still took up significant space in the system.

    • Rza79
    • 8 years ago

    I’ve read somewhere that, just like the last revision of the Xbox 360, the Xbox 720 will use a ‘Fusion-style’ CPU. Since the Xbox 720 CPU will (probably) be produced on 32nm SOI, this makes sense. AMD used a 400-shader (5-wide VLIW) GPU on Llano. If it also incorporates a triple- or quad-core CPU, then 480 shaders is about the limit they can go to without making this chip too expensive to produce. There’s a big chance, though, that MS will again make use of GDDR memory and other technologies (like they do now with the NEC add-on chip which includes the ROPs & eDRAM).
    But, and that’s a big but, it needs to be clocked very high for it to be 6x faster.
    XBox 360 GPU = 240 GFLOPS
    240 * 6 = 1440
    So a 480-shader part would need to be clocked at 1.5GHz to be 6x faster. This makes me believe the 6x figure must be for the whole system (the same way Nvidia compares T3 with T2). It probably means 6x more graphical power thanks to the upgraded CPUs, GPU, GDDR5… and whatnot.
    Anyhow, if the XBox 720 is indeed 6x faster than the XBox 360 then we’re not completely doomed. Don’t forget that these consoles can have up to 1000x less overhead from software and optimisations compared to PCs.
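
    (For the record, the 1.5GHz arithmetic above checks out; a quick check, assuming the usual 2 FLOPs per shader ALU per clock:)

        # Clock needed for 6x the Xbox 360's quoted 240 GFLOPS with 480 shader ALUs,
        # assuming 2 FLOPs (one multiply-add) per ALU per clock.
        target_gflops = 240 * 6                    # 1440
        print(target_gflops / (480 * 2), "GHz")    # 1.5 GHz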

    Edit: spelling

      • Rza79
      • 8 years ago

      Whoever voted me down, can you also explain to me why?
      If my info is wrong or my logic flawed then I want to know why.

      • Airmantharp
      • 8 years ago

      Up-vote for taking a decent stab at the problem.

      There’s doubt that AMD would actually license their CPU tech to MS, and it seems more likely that there will be IBM IP in there, but I wouldn’t rule out a scaled-down STARS (AMD X-whatever) implementation, given that AMD has moved on and it’s a proven core.

      Hell, an out of order CPU would be awesome, and almost needed to get 6x the CPU performance out of such a system at a reasonable cost.

        • Rza79
        • 8 years ago

        I never implied that they would use an AMD CPU. :)
        Just an AMD GPU.

          • Airmantharp
          • 8 years ago

          Oh I know- not placing blame on that one.

          Thing is, if they’re going straight for an integrated CPU/GPU, they might as well just grab the full AMD APU and be done with it; even if it is dumbed down on the CPU side to make way for unfettered licensing.

            • Rza79
            • 8 years ago

            Why would they? They’ve done an APU before AMD so it’s not like they need AMD’s expertise. All they need is IBM’s expertise. :p They would want to be backwards compatible and that’s easier if they stick with PowerPC cores. Anyhow I truly believe that IBM has much more to offer in CPU technology.

            • Airmantharp
            • 8 years ago

            I don’t doubt, at all, that IBM has much to offer in CPU technology.

            The only sticking point of the single general-purpose core in the Cell and the three in the Xenos was that they were in-order- essentially castrating them from being able to effectively run dynamic, branching code. That’s what I really hope they rectify, whether with IBM’s IP, AMD’s, or whoever else.

            A good example is the comparison by game developers that the Xenos with its three 3.2GHz double-threaded cores was only about twice as fast as the 800MHz Intel Celeron in the first Xbox, primarily due to the lack of out-of-order capability.

            • Rza79
            • 8 years ago

            [quote<]A good example is the comparison by game developers that the Xenos with its three 3.2GHz double-threaded cores was only about twice as fast as the 800MHz Intel Celeron in the first Xbox, primarily due to the lack of out-of-order capability.[/quote<] That comparison is old and is only true for single-threaded stuff.

            • Airmantharp
            • 8 years ago

            I agree that it is old, and that it can only really compare to single-threaded stuff because the Celeron had only one core, but compare the ARM A9 with its A15 successor. The A15 is typically quoted as having double the IPC just due to being out-of-order.

            On top of that, even out-of-order PPC cores have been shown to have lower IPC than their Intel equivalents of the same generation; great evidence comes from Apple’s switch from IBM’s PPC to Intel’s Core. And keep in mind that the Cores we use today are descended from that old P6 architecture that the original Xbox’s Celeron was based on.

            On the balance of it, while the PPC-based Xenos might wind up being more than 2x faster than its Celeron predecessor, I don’t think it’s nearly as fast as its basic specifications might suggest.

            • Anonymous Coward
            • 8 years ago

            Well, “Xenos” sucks. It appears to have come from a time when IBM thought clockspeed was the most clever idea, along with Power6. I doubt it has better per-cycle performance than Atom. In fact Atom might generally embarrass it, per clock.

            Apple switched for many reasons including the fact that they needed a reliable roadmap and not the daydreams of IBM and Moto. The “G5” was an alright design, apparently competitive with K8 on average, but Intel’s reliability of delivery seems to be way out of IBM’s league. You might be familiar with [url<]http://en.wikipedia.org/wiki/PWRficient[/url<] which was apparently destined to become the next laptop chip for Apple. It was probably a fine product, but it would have represented a huge risk for Apple compared to using Intel. Apple's switch proved nothing except that Intel is the best CPU maker on the planet.

            • Anonymous Coward
            • 8 years ago

            [quote<]Anyhow I truly believe that IBM has much more to offer in CPU technology.[/quote<] I am skeptical that IBM is ahead of AMD by any significant amount. Power7 vs BD would be an interesting showdown.

      • sschaem
      • 8 years ago

      The limit for console design is power & heat while maxing out the latest process tech.

      32nm will be a 3-year-old process by the time the new Xbox is released.
      And we already have 28nm chips (4+ billion transistors) in consumer video cards,
      so why go back 3 years to 32nm when power efficiency is key?

      The current Xbox uses <100W with 45nm chips. Naively, going back up to 200W would allow MS to allocate all of that to the new GPU,
      and going from 45nm to 28nm can increase power efficiency by 35%.
      So a 7970-class GPU is a stretch, but something very close to it is possible.
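
      (A crude sanity check on that budget; the ~250W board power of a stock desktop 7970 is my own assumed reference point, layered on the 200W and 35% figures above:)

          # Naive power-budget sketch: the 200W budget and the 35% scaling figure are
          # from the comment above; the ~250W board power of a stock desktop HD 7970
          # is an assumed reference point, not anything console-specific.
          console_budget_w = 200        # "going back up to 200W", naively all given to the GPU
          hd7970_board_w = 250          # assumed 28nm reference part
          print(console_budget_w / hd7970_board_w)   # 0.8 -> roughly 80% of a 7970-class budget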

      We don’t get 1000x less overhead on consoles, but consoles might pioneer new game engines based on newly introduced HW features.

      The best example is the 7-series VMEM architecture: it’s in HW, but there is no DirectX API layer for it yet.
      And even if there were, what game developer would use it so that their engine only works on 1 model of card from 1 manufacturer?
      On consoles, game developers can use all HW features without worry… (but this might make a lot more titles Xbox exclusives)

      I would conclude that a special AMD 7-series GPU on 28nm is the blueprint for the next Xbox, not the age-old 40nm 6670.

      • kamikaziechameleon
      • 8 years ago

      “Don’t forget that these consoles can have up to 1000x less overhead from software and optimisations compared to PC’s.”

    Not true anymore, and it hasn’t been for a while. They’re not equal, but really, you see the difference: the PS3 has entire cores allocated to overhead.

    • tikrjee
    • 8 years ago

    [url<]http://www.fudzilla.com/home/item/25619-oban-initial-product-run-is-real[/url<] SemiAccurate broke the original story, and Fudzilla did a followup. According to several other sources as well, everyone seems to be overlooking that it is possibly a modified 6000- or 7000-series part in a 32nm SoC. Like an APU, but with chutzpah. This would actually be good because that’s 1) one less part to go wrong, 2) one less component to cool, and 3) one less component to build. Cheaper, smaller, more efficient.

    If the graphics power is equivalent to the 6670, that’s not really a bad thing. A 6670 paired with an Athlon II X4 631 (2.6GHz) can run Skyrim with DX11 eye candy enabled at 1440x900 at a relatively low power draw, and most likely developers will still produce at 720p, since there are still people out there who bought an HDTV cheaply and didn’t get 1080 (or bought a 32" or smaller set and would never notice the difference between 720 and 1080 on a console anyway). A setup similar to this would prove effective at pushing pixels while still allowing for a compact form factor with minimal noise/cooling needs, and a lower production cost.

      • tikrjee
      • 8 years ago

      [url<]http://www.fudzilla.com/home/item/25717-xbox-next-to-use-blu-ray-discs[/url<] Also, this. Blu-ray discs. Yay, more room for uncompressed audio! Boo for protection against used games! Meh on the smaller controller and super-Kinect. And, as always, THESE ARE RUMORS. Not a whole lot of fact behind them. Except for the taping out of the Oban SoCs. That seems to have been confirmed.

    • Antimatter
    • 8 years ago

    It’s hard to believe that AMD’s Trinity APU may have better graphics than the next-gen Xbox, and Trinity is due in a few months.

    • Bensam123
    • 8 years ago

    Does it bother anyone else that monitors and TVs have been out that produce 1080p for like the last 6 years and it has become a baseline that people generally don’t push past? Instead they switched to huffing 3D at people and higher refresh rates. Higher refresh rates I don’t mind, but they could’ve done that before (and have with CRT monitors).

    That aside, this is quite a frivolous rumor. No one even knows when they’re going to come out. The latest I heard was 2014/15 (now end of 13?) and a 6670 would be quite antiquated by then, just what the doctor ordered for consoles…

    Really, MS and Sony are just milking the console market till they and game developers shift focus back to the PC. It costs them nothing to produce a new console till they start losing customers. They really don’t even need to do that much work to crap one out. All it would be is an HTPC with a custom case (with the rare exception of Sony spending gajillions of dollars on something they don’t need to). I’m guessing it would take the same amount of time as it takes Alienware to crap out a new custom PC, maybe a year or two. The time is more than likely all invested in making sure there are video games available for them when they launch.

      • Airmantharp
      • 8 years ago

      WRT to 3D and refresh rates:

      I’ll say that I like 3D in theaters. If you saw Avatar in Imax 3D, you know what this technology is capable of. I’d love to have that kind of experience at home, and I’m glad they’re working on it.*

      For refresh rates: CRTs only needed to go above 60Hz because of flicker associated with the technology. My eyes needed at least 75Hz to not be significantly annoyed, and 85Hz to almost eliminate the problem. LCDs as a technology don’t suffer from this, as they don’t shoot electrons through a vacuum tube at rows of phosphors.

      *I’m quite aware that the ‘active 3D’ stuff being sold today with HDTV’s is crap. My eyes are far too sensitive for the amount of ‘flicker’ involved, and while it’s nearly impossible to purchase a decent TV without the current active 3D tech built in, it’s not something that I will personally invest in or recommend to others.

        • Bensam123
        • 8 years ago

        I’ve seen 3D in theaters, it’s not all that impressive. I’ve tried it on the computer as well and it’s not impressive there either. It breaks the illusion whenever the depth of the image is inverted (like if a person turns around). The image flips to maintain the depth, but before it does it goes flat, which breaks immersion for me. That and 3D generally gives me a tolerable, but mildly annoying disconnected feeling, as well as it starting to become linked to eye and brain damage. Currently in kids, but I’m sure it will expand as it gains a larger adoption.

        3D is a parlor trick and always will be till the images are directly beamed into our brains bypassing our eyes.

        I haven’t really minded the change in refresh rate, though. You can tell that 120Hz LCDs are more fluid than 60Hz ones. Geoff (or was it Scott?) even mentioned how easy it is to tell them apart in motion scenes when they’re sitting next to each other. In my experience you don’t even need to do that; you just need to watch an action-packed Blu-ray.

          • Airmantharp
          • 8 years ago

          I respect your position on this; I guess it depends on perspective more than anything.

          If you compare the technology with an imagined (though probably quite accurate) standard of perfection, then what we have today obviously comes up quite short. This I agree with and appreciate.

          But in this case, I feel that however insufficient the technology is, it is still a step forward from what we have- and having seen what it is capable of now, I have hope that it will continue to advance in usefulness.

            • Bensam123
            • 8 years ago

            Aye, I don’t disregard 3D technology as a whole, but they will be unable to add depth to the screen without either making it a true 3D display or beaming the images into our brains. Our eyes stand in the way of reproducing such an effect well on a 2D screen.

            If you go to a play or theater, which you should try at least once, it adds an entire other dimension to things (well besides the live actors).

    • plonk420
    • 8 years ago

    i just want games to look like

    [url<]http://pouet.net/prod.php?which=51443[/url<] and [url<]http://pouet.net/prod.php?which=58262[/url<] (yes, these DO run realtime on a PC ;))

    • Austin
    • 8 years ago

    ;o) It *could* be that MS are releasing 2 separate models: an entry-level and a high-end one. Games could automatically run at medium detail on the entry-level (6670) console and at high detail on the high-end console (7970). Either way, they should factor in the ability to play Xbox 360 titles with AA and other enhancements; it would be a nice way to enjoy your older titles. I’d also like to see QUIET consoles, something an entry-level model could certainly excel at.

      • Bensam123
      • 8 years ago

      No, that’s too much like a computer. The high end models would just end up a curiosity with a couple games that actually take advantage of it.

      One of the supposed selling points of producing games for a console is that they’re all the same.

    • yammerpickle2
    • 8 years ago

    I hope the rumors are untrue and they shoot higher up the graphics food chain. Otherwise another eight years of dreadful console ports. This will cause the discrete graphics card market to erode even more. Why bother with an upgrade to the latest and greatest when old systems and cards are already being underutilized? I only hope that if this is true 4K 3D TV’s and multi-monitor gaming take off and make the next generation of consoles obsolete quicker.

    • l33t-g4m3r
    • 8 years ago

    I can think of several reasons for this. DX11 is horribly inefficient, needing either SLI or a monster like the 580 to get acceptable framerates, and it’s too difficult and expensive to make those chips in quantity. Cooling would also be a big problem. Microsoft is instead going to make a console that gives them more headroom to make the type of games they’re already making, meaning bigger levels. W/E. Graphics cards makers and game devs have brought this on themselves for making horribly inefficient hardware and games, and I’m cool with MS’s decision since that means less upgrading I have to do in the future.

    • GTVic
    • 8 years ago

    I don’t think this is well thought out, including some of the comments. If Microsoft thinks that HD 2K is not applicable to the 720 generation then they only need hardware that can handle 1080p and 3D.

    The horsepower required on a PC for current games is quite a bit different than what is required on a console. Triple-monitor spanning support is not required for example.

    Also not taken into consideration is that AMD takes a generic 6xxx and fine tunes it for workstation graphics and sells it as a FirePro device. So they could easily take a 6670 and customize it with more ALUs to fit Microsoft’s requirements.

    As seen with the original 360, Microsoft does not choose to design for future-proofing: it didn’t come with HD DVD or Blu-ray built in, and I don’t think the original version even had HDMI. Regardless of that, it was still a difficult product to build and manufacture, and they took a loss on sales for quite some time.

    What they would want this time is something that can be built using established processes, not bleeding edge and not something that costs double what they can sell it for. From that perspective a 6670 or modified 6670 may be perfectly reasonable.

      • cynan
      • 8 years ago

      You make some valid points, most notably that even if MS does use an HD 6670 “platform”, it will contain optimizations that will make its performance handily exceed that of an HD 6670 installed in a desktop. Perhaps this is why the HD 6670 is quoted as having 6x the performance of the Xbox 360 when, as others have pointed out, a desktop HD 6670 certainly doesn’t offer 6x the performance of the chip in the 360 (closer to 3x).

      I also think people tend to forget how long ago the original Xbox 360 was actually released. December 2005 was a long time ago. Consumer Blu-ray and HD DVD players were not yet available (though they were during 2006). If they had been, the Xbox 360 probably would have had one. It was only after MS saw that Sony was using the PS3 to push Blu-ray that they jumped into the HD DVD camp with Toshiba. I agree that the 360 should have had HDMI from the get-go, though.

      The important point is that, optimizations for HD 6670 aside, the Xbox 360 used cutting edge, state of the art, consumer graphics tech at the time. Dec 2005 was right in between the X1800 and X1900, which is what the chip in the Xbox 360 approximates. At the time these cards cost about as much as an HD 7970 does now. Going with a HD 6670, instead of something more current, is a vast departure from this model and it seems like they’re taking a page from Nintendo’s play book. It shows that MS is no longer interested in bringing cutting edge tech into living rooms (as they were with the first two Xboxes).

      Given the initial success of the Wii (which also used less-than-cutting-edge graphics) and the overheating problems of the first batch of Xboxes, this move is perhaps not surprising. It does mean, however, that MS is giving away the edge that made them competitive vs the PS2 and PS3 – namely faster or comparable hardware at a lower price than what Sony offered. This of course leaves Sony to come in with a monster of a PS4 and really dominate. But again, one has to wonder if Sony can commit to such a strategy, given how much money they lost just making and selling PS3s.

      So, given this rumor is credible, the outlook for next-gen consoles giving “cutting-edge” gaming performance looks pretty grim. I also agree that this does not bode well for technical advancements in gaming if the gaming industry continues to cater to these platforms. If this does come out fall 2013, it will mean the Xbox 360 will have had a product life cycle of almost 8 years. The thought of being limited to HD 6670 performance (even with a host of platform specific optimizations) in 2020, at least to me, is less than appealing.

      In summary, this rumor suggests that MS is going for a cheaper to produce (at launch), lower risk design. Instead of competing with Sony (as they did the past two generations), this time their target is Nintendo. While this might be a good strategy as far as trying to get the console into the hands of the masses, it really saps most of the “enthusiast” appeal (raw performance) out of these so-called next-gen consoles that their previous versions had. It also will inevitably further widen the gap between modern PC gaming platforms and consoles. Who knows, this could be a good thing and bring about more PC-directed development. Sadly, with all of the console ports from half-decade and older consoles deemed “good enough” by developers in recent years, I wouldn’t get my hopes up.

    • Ushio01
    • 8 years ago

    I can see this as being true for one reason: the limited scope of future process shrinkage.

    The 360 was released with a top-of-the-range GPU on the latest process, and there have been two process shrinks to reduce cost since. With the time between process shrinks increasing, there may only be one chance to reduce costs in the console’s lifetime, so they may be looking to reduce costs now.

    • Chrispy_
    • 8 years ago

    The next-gen consoles need next-gen content creation tools, not better hardware.

    Big-budget games [i<]already[/i<] take too long, too much money and too many people to make, yet they’re costing us about the same to buy and contain fewer hours of gameplay than they used to. Bioshock, Arkham Asylum and Forza look good on my 360. Not amazing, but good enough. What would make the platform better is not higher resolution or better detail, but more variety of games, more frequent releases and more emphasis on gameplay rather than graphics. Super Mario Bros. was fun. It ran at 256 x 224 resolution and had 48 colours.

      • PainIs4ThaWeak1
      • 8 years ago

      I agree with some of your points, but TBH – I want ALL those things in a game. Though I WILL sacrifice on some of those points, if the others outweigh what is lacking.

      … For example… Fallout 3 & New Vegas, though not particularly “great” looking, IMO, more than made up for it in gameplay, duration, re-playability, game environment, and just plain fun.

      On the opposite side of the spectrum, and again, IMO… Metro 2033 looked fantastic, and while, yes, it was fun, I believe it lacked in duration, re-playability, and the gaming environment (it was a wee bit too linear, though that isn’t always a BAD thing).

      Conclusion: I liked both, for different reasons.

      • phileasfogg
      • 8 years ago

      Very well put. The 2nd para in your post is 1000% true.

      • yogibbear
      • 8 years ago

      If you give devs better horsepower then they can use more of it for AI, loading cool level designs etc. without sacrificing on the shinies. What they did this gen was sacrifice on the game world size and narration by using endless “tricks” so that the game still looked good on a console. Arkham City looked pretty good but the map design was almost bat shit crazy. A big horseshoe…. hm…. totally not for level loading reasons, view distance, texture pop in etc.

        • Kaleid
        • 8 years ago

        You’re right; that’s why Arkham City runs on a measly HD 5450 512MB.

          • Anonymous Coward
          • 8 years ago

          That cheap graphics card has as much RAM on board as a whole XB360…

      • Anonymous Coward
      • 8 years ago

      Games might get a bit easier to make if devs don’t have to waste their time squeezing water from rocks.

      • kamikaziechameleon
      • 8 years ago

      The amount of content isn’t really an indicator of quality. Old games had 90 percent filler, which is not value. There has been a huge divergence in game development between independent and blockbuster titles. Games don’t have to conform to your notion of them anymore.

    • hubick
    • 8 years ago

    Forget about how outdated this GPU will be upon release; think how far behind it will be at the *end* of this console’s 7-10 year lifespan! Gah :(

      • hubick
      • 8 years ago

      I mean, even if you forget about trying to anticipate where we would like games to be a decade from now (what this should really be about)… and instead take a popular game *today*, like Battlefield 3, as a benchmark.

      If you look at 64 player Battlefield 3 running with max settings on a PC today… I want a console which can match that graphical fidelity on a 4k TV resolution.

      I’d like to think my console games a decade from now would look even better than what we have today (more detailed models/textures, etc.), but that should be a baseline, and we aren’t even gonna get near that with this GPU! :(

    • ChunΒ’
    • 8 years ago

    Weren’t there also rumors about a multi-xbox generation, too? Like a gaming Xbox and a media hub one? If those are in fact true, this will be easier to swallow.

    • swaaye
    • 8 years ago

    Interestingly, this sounds similar to WiiU’s supposed performance level.

    I imagine that part of the reason for choosing a midrange GPU is power usage and the related thermal issues. A 6670 should keep it in a similar power footprint relative to the 360. Even this hardware will be a huge improvement over the current hardware and, as such, sell as “next gen” to the audience there.

    I expect it to be a highly custom solution though, perhaps with eDRAM again along with various integrated extras to reduce the number of chips needed for the machine.

    • lilbuddhaman
    • 8 years ago

    What I’m wondering is what custom units they’ll add to the GPU. The 360’s hardware had a “next-gen” tessellation unit that was not found in the original X1950, and it ended up being extremely similar to the one in the 4xxx (or is it 5xxx?) series of hardware.

    What will we see in this GPU that the 6xxx series does not natively have ?

      • khands
      • 8 years ago

      Well, it does have multi-monitor support as well, I’m not exactly sure besides that and some non-game related enhancements.

    • gmskking
    • 8 years ago

    If Microsoft decides to go this route I promise a PS4 will be in my future because I know Sony will not bring out an inferior system.

      • Corrado
      • 8 years ago

      Sony will not tread down the road of releasing super high end, expensive hardware again. It bit them in the ass for YEARS. The rumors have the PS4 being essentially a refresh of the PS3 with more cores and higher clocks and more memory.

        • khands
        • 8 years ago

        If it doesn’t use a unified architecture it will be DOA.

        • mcnabney
        • 8 years ago

        Actually, the PS4 is going to be a decent jump forward. Full support for 4K (the BluRay standard will be making that jump as well) – so gaming and movies at higher resolution will be possible. Xbox720 will be stuck at 1080p, so Sony will have a pretty decent advantage. Plus – people will probably be more willing to buy movies at 4K since there isn’t going to be higher resolution in the future because 8MP is the limit of 99.999% of existing movie film.

          • Kaleid
          • 8 years ago

          4k won’t be the norm for years.

            • sweatshopking
            • 8 years ago

            4k won’t be the norm for at least a decade*
            There, fixed that for you.

            • travbrad
            • 8 years ago

            but it’s a big number on spec sheet. My number is bigger!

        • albundy
        • 8 years ago

        it all depends on how they leverage their cost factors. Bluray was expensive back then, and it was a proprietary drive (vs using DVD), but the fact that you could get console and Bluray together for less or equal to a standalone component was a good bet that sony would sell many units more than they anticipated. My guess would be that they made way more money because of the super high end hardware, with people buying games and movies like crazy. i cant even imagine sony’s cut on licensing, fees, etc.

        • Ushio01
        • 8 years ago

          I wonder, would putting 2 Cell processors on a single package be possible for the CPU? It would allow full backwards compatibility without having to include the PS3 chipset, like how the first-run PS3s included the PS2 chipset. Between that and much lower Blu-ray drive costs, the build price of a PS4 would be much lower, right?

          • Corrado
          • 8 years ago

            That’s the rumor I’m hearing. The Cell is inherently scalable. It’s got 7 or 8 cores for gaming now. If they could tweak it, get the IPC up a bit, do a die shrink and up the clock a touch, they could, in theory, double the CPU power while still being 100% backwards compatible.

            • Ushio01
            • 8 years ago

            IBM still offers the PowerXCell 8i, doesn’t it? I wonder if it could be based on that. I guess we will have to hope that Sony continues to use Nvidia GPUs for full backwards compatibility.

            • Geistbar
            • 8 years ago

            Cell might be core-scalable, but the games it will run will not be. Increasing the core count would likely provide very little true benefit to games; any significant benefits will have to come from clock bumps and design tweaks.

            • BobbinThreadbare
            • 8 years ago

            Well, it depends. The stuff that is CPU bound, like physics might well scale to extra cores easily.

            • evilpaul
            • 8 years ago

            PS3 games typically have the two threads of the one PowerPC core schedule and feed the 7 less capable vector cores.

            • Geistbar
            • 8 years ago

            The thing here, though, is that the things that do scale, such as physics, are rarely the limiting factor. [url=https://secure.wikimedia.org/wikipedia/en/wiki/Amdahl%27s_law<]Amdahl's Law[/url<] dictates that the least scalable factor is the one that limits the benefit from adding additional cores.

            Truth be told, I've never understood the hype for Cell; its greatest strengths don't particularly align with game code. Now, the things it is good at, it does quite impressively, no denying that. Those things aren't games, however. Xenon has similar limitations (which also confuses me as to why they picked it), but it's never gotten the hype train that Sony* built up for Cell.

            * Which also confuses me, because Sony didn't even do most of the design work; that was [url=https://secure.wikimedia.org/wikipedia/en/wiki/Cell_processor#History<]IBM and Toshiba[/url<].
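
            (A quick illustration of Amdahl's Law for the skeptical; the 50% parallel fraction below is purely an assumed, illustrative number, not a measurement from any real game:)

                # Amdahl's Law: overall speedup is capped by the serial fraction.
                # The 50% parallel fraction is an illustrative assumption only.
                def amdahl_speedup(parallel_fraction, cores):
                    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

                for cores in (2, 4, 8, 16):
                    print(cores, "cores ->", round(amdahl_speedup(0.5, cores), 2), "x")
                # 2 -> 1.33x, 4 -> 1.6x, 8 -> 1.78x, 16 -> 1.88x: doubling the core
                # count buys less and less when half the work per frame is serial.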

            • evilpaul
            • 8 years ago

            The Cell processor’s strengths and game code’s requirements are so barely related it’s perplexing. It’s a massive SIMD engine, basically, and games are branching, not-SIMD stuff. Hell, it was re-purposed as a PCI-e video encoding engine.

            • Geistbar
            • 8 years ago

            Exactly, and yet it’s hyped as the best thing for consoles since the analog stick by the gaming media and even most of the technically inclined enthusiasts. I don’t get it in the least. I originally guessed that Sony used it to try and have a high volume part to get yields up so they could use it in other devices, but I can’t think of anything else it’s been used in besides some blade servers by IBM.

            The whole manner in which cell has been treated by Sony is, as you said, perplexing.

            • BobbinThreadbare
            • 8 years ago

            Well originally, Sony intended the Cell to do everything including render graphics. With that in mind, the design makes sense. Then it turned out to be too slow, and they had to add a real GPU.

            • BobbinThreadbare
            • 8 years ago

            Most games aren’t really CPU bound at this point, on any CPU.

            • Geistbar
            • 8 years ago

            If that is the case, then they won’t see a speedup from a boost to cell either way, no?

            • Airmantharp
            • 8 years ago

            deleted

            • BobbinThreadbare
            • 8 years ago

            Not traditionally no. But it should allow for more advanced physics like I’ve been saying all along.

            Of course there are other ways to do that, like put a faster GPU in that can do physics and graphics or two GPUs or even a 3rd dedicated chip.

            I don’t really care about Cell, as I don’t think what CPU is in the consoles will really change anything. Actually I’d like to see x86 chips just because it should be easier to do ports to PC.

            All that said, I believe the way the PS3 works is that the cell does some of the rendering work, so it could lead to better looking graphics.

      • Kaleid
      • 8 years ago

      They’ll break their tradition?

      • tikrjee
      • 8 years ago

      Sony announced at this year’s CES that the PS3 will have a 10-year lifespan minimum. So… don’t expect a PS4 till 2014+. By then, who knows what the norm will be? Maybe our latest and greatest today will be laughably outdated. Maybe Moore’s law will no longer be applicable and we won’t see too many breakthroughs in transistor counts past 2013. Maybe cybernetic implants or holograms will become reality. We could even have lag-free connections with cloud-based storage systems, allowing for unlimited HD content streaming!
      The only thing that can possibly be known for sure at this point is that it will be more powerful. How much more is to be determined. Suffice it to say, then, that it will, of course, be a more powerful system than the NeXbox-whatever. But, as time has shown, the software will be governed by the less powerful system. So…
      Then again, AMD may finally overtake Intel again, SSDs may drop to $0.07/GB, and the US government may actually sit down together and get stuff accomplished. So who knows?

        • Corrado
        • 8 years ago

        The PS3 having a 10 year lifespan doesn’t mean there won’t be a PS4 in the next year or two. They are still making and selling PS2s, remember. The PS2 was stated to have a ’10 year lifespan’ as well, but Sony backtracked and said they would keep making them until people stopped making games or they stopped selling in enough quantities to justify it.

    • puppetworx
    • 8 years ago

    Real men game on a machine with a graphics card which will melt your face off. Consoles are for kids and always have been.

      • PenGun
      • 8 years ago

      Indeed, the number of console weenies here is amazing.

      I would never have expected the population of a tech site like this to be so in love with this stupid, game-destroying, for-the-technically-challenged garbage.

      • lilbuddhaman
      • 8 years ago

      Real men climb mountains, fight fires, wrestle bears, kayak the pacific, drink dos equis, transform into centaurs, and overclock their toasters.

        • PenGun
        • 8 years ago

        I have not wrestled a bear. I have one nearby, Thomas the black bear but he is huge and it would be a one hit fight. I am a werewolf in Skyrim … but no centaurs yet. My toaster already overheats.

        The rest I have done but I prefer a nice single malt.

      • Game_boy
      • 8 years ago

      Hardware is irrelevant to game quality. Come on.

        • PenGun
        • 8 years ago

        Game quality is a wide subject. As to whether sheer computing power makes a difference to game quality one could make an argument. It is however light years from Pong to Rage.

        Hardware certainly does make a difference in the possibilities one can explore with a game. One cannot play a game with intensive 3D graphics on a weak machine.

        The point of course is that, with consoles taking so much of the market, we are stuck with DirectX 9 for many new games so that they can be easily ported to the accursed console.

        Hardware is very relevant to game quality. Certainly one can make a crappy game on nice hardware, Rage springs to mind, but it would be very difficult to take Stalker in its present form and run it on a console. They are the best video games ever, in my opinion.

      • Vrock
      • 8 years ago

      Real men don’t judge manliness by how one enjoys video games. Real men don’t do that at all.

      Now, pimply faced mama’s boy geeks living in their parents’ basements, those are the guys who determine manliness based on one’s gaming preference.

      Real men, on their pinkest, most metrosexual, in-touch-with-their-feminine-side, wearing-Secret-because-they-ran-out-of-Speed-Stick emotional day, are still far manlier than those types.

      This is the part where you feel embarrassed. Goodbye.

        • Firestarter
        • 8 years ago

        I was with you up until you mentioned wearing women’s deodorant, that shit just won’t fly!

      • paulWTAMU
      • 8 years ago

      I’m mostly a PC gamer but really? Did you have to go there? Who the hell cares what you enjoy gaming on?

      • Krogoth
      • 8 years ago

      Real men don’t give a flying hoot on how other people value their hobbies.

      • kamikaziechameleon
      • 8 years ago

      I gave you a +1 simply because there is so much PC gaming hate on these forums. Your statement is a bit silly and childish. Just because PC is my #1 platform doesn’t mean I don’t love consoles or that I’m any less interested in this news.

    • ALiLPinkMonster
    • 8 years ago

    Wasn’t the 360 GPU based on the X1950, a fairly solid mid-range performer for its time? Why would they go to such a low end part this time?

      • khands
      • 8 years ago

        It was based on the 1800/1850 and got some big changes (like a unified shader architecture) that put it performance-wise on par with the 1950, which was top of the line when it came out. This is probably mostly due to the poorer initial reception that the 360 and PS3 had at launch due to price. I don’t expect any next-gen console to launch north of 300 USD.

        • StuG
        • 8 years ago

        You guys sure about this? I thought the 360 had a neutered HD3870

          • ImSpartacus
          • 8 years ago

          The HD3870 was released 2 years after the Xbox 360 was on shelves.

      • Sahrin
      • 8 years ago

      You have it backwards. The model number for Xenos was R500; basically, AMD used Microsoft’s money to fund the development of its first Unified Shader Architecture. This was then refined into the DX9.0c spec, in the form of the R520 (X1800).

      The R500 came before the X1800, not after. The X1900 was based on the R580; both are more capable GPU’s than the R500.

        • ALiLPinkMonster
        • 8 years ago

        Ah. Thanks for clearing that up.

        • l33t-g4m3r
        • 8 years ago

          Wrong. DX10 had a unified shader architecture, not DX9. The Xbox chip was a hybrid DX9/10 chip based off the X1900, or maybe vice versa. It had 48 shader processors, not 16, making it much closer to the X1900, not the X1800, which had 16.

          • khands
          • 8 years ago

           It was based off the 1800, and its performance is approximately equal to the 1950’s because of the benefits afforded to it by things like a unified architecture, more shaders, etc.

            • l33t-g4m3r
            • 8 years ago

            Really? It was based off the X1800 when the architecture is clearly the same as the X1900’s? And it is still neither, because it had a unified shader architecture. It was a DX10-style version of the X1900, not an X1800. GOOGLE IT. The shader specs are clearly copied from the X1900.

            [quote<]The X1900 cards have 3 pixel shaders on each pipeline instead of 1, giving a total of 48 pixel shader units. ATI has taken this step with the expectation that future 3D software will be more pixel shader intensive.[/quote<] How many shader units does Xenos have? That's right, 48. [quote<]48 floating-point vector processors for shader execution, divided in three dynamically scheduled SIMD groups of 16 processors each.[/quote<] Anybody who claims Xenos is an X1800 is a LIAR. [quote<]the X1800 has 16 pixel shader processors[/quote<]

            • ALiLPinkMonster
            • 8 years ago

            Thanks for clearing up the clearing up of the clearing up, of the…. clarity. Or something.

            Anyway, elaborating a tad on my original point, I really think that a 6770 should be the bare minimum for their considerations. If they could afford to shove a version of the 6850 in the presumably small case, that would be ideal. I doubt they will/could, though.

    • TurtlePerson2
    • 8 years ago

    I have difficulty believing that this is true. On the one hand, I would expect graphics equal to midrange PCs in a console, but I wouldn’t expect the graphics available in midrange PCs in 2011 to be in consoles released in 2013.

    • kamikaziechameleon
    • 8 years ago

    Well that’s disappointing.

    • sschaem
    • 8 years ago

    That conflicts with other rumors. My bet is that the new Xbox will be at minimum twice as capable as a single 6670.

    The 6670 is only about 3x faster than the current Xbox 360… I don’t see how 3x faster will be enough 6 years from now.

    240 vs 768 GFLOPS, that’s not 6x
    16 vs 24 texture filtering units, that’s not 6x
    64 vs 256 GB/s memory bandwidth, that’s not 6x
    etc…

    3x faster would mean an update to 1080p (vs the current upscaling tricks) and little left over.
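
    (A quick ratio check, using only the GFLOPS and texture-unit figures above:)

        # Ratios from the figures quoted above (Xbox 360 vs a desktop HD 6670).
        print(768 / 240)   # 3.2x the shader GFLOPS
        print(24 / 16)     # 1.5x the texture filtering units
        # Nowhere near 6x on raw specs; roughly 3x is a fair overall reading.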

    Why even bother?

    Edit: And using a 6-series GPU on 28nm would require extra effort… Makes no sense. AMD is pretty proud that they increased performance per unit of area by 150% with the 7 series… why go backward to the 6 series when it’s more effort and more heat?

    Some leaks seem to indicate the dev kits are based on a dual-GPU setup…
    I wouldn’t be surprised if the new Xbox uses a dual 7970-type GPU, and not a single 6670.

      • Antimatter
      • 8 years ago

      A dual 7970-type GPU? Power consumption for such a console would be in excess of 500W; that’s about 3 times as much as the original Xbox 360.

      • Kaleid
      • 8 years ago

      Even one 7970 would be a nightmare to have in a small box.

      • Ironchef3500
      • 8 years ago

      I agree, this does sound a bit weak. And for what I am sure it will cost I hope we get beefier hardware. I really, really hope we get to play games RENDERED at 1920×1080 this generation. I am tired of my four year old gaming PC putting my 360 to shame. I want higher AA and AF, I want more than 30fps!

    • Xenolith
    • 8 years ago

    My not-so-bold prediction… the next xbox will not be called the Xbox 720. I’m thinking Xbox 7. What other rumors can we start?

      • OneArmedScissor
      • 8 years ago

      I’m sure they’ll come up with something highly marketable, like Xbox Vista Kin.

      • yogibbear
      • 8 years ago

      XBOXOMGWTFBBQ1!!!1q! XxX LMTD XxX GT GAMER EDITION LIVE ENABLED FULL HD CAPABLE FATALITY

    • HighTech4US2
    • 8 years ago

    Why haven’t consoles gone the MXM module route that laptops have gone?

    If consoles used the MXM module system, then they could be upgraded to newer GPU technology during the console’s life.

    It would also seem like a good revenue source for the console makers, as they could have a base system (much like the integrated graphics of today) and then offer optional MXM upgrade modules with newer/faster GPUs for serious gamers.

    [url<]http://en.wikipedia.org/wiki/Mobile_PCI_Express_Module[/url<]

      • OneArmedScissor
      • 8 years ago

      Then how would it be a console? That defeats the entire purpose. They don’t care about hardware revenue. They care about game sales, and the easier it is to make games for a platform, the more there are to sell.

        • HighTech4US2
        • 8 years ago

        Because it would still be a console. Take the above-rumored Xbox 720 with the outdated 6670 GPU built into the SoC and sell it at or near cost to build a mass market.

        Now include a slot on the unit so that an MXM card inside a protected module can be inserted into said slot. Now it’s an instant upgrade to the latest GPUs of today.

        As for game software, just have developers poll the system for its capabilities and enable more features when a more powerful GPU is detected.

          • OneArmedScissor
          • 8 years ago

          At best, there is one potentially useful upgrade cycle in there before the console is dead and gone. And for taking that risk, you lose:

          1) The efficiency of a system on a chip, making every game worse.
          2) The efficiency of making games with zero hardware variables.
          3) The time it takes to create and test multiple graphics and physics profiles.
          4) Console games requiring fewer bug fixes and patches than PC games.
          5) Games staying at “only” $60, detracting from sales.

          And all you’d gain is probably the ability to turn the game from 2x AA to 4x AA. Whoopdeefreakinda.

          • Corrado
          • 8 years ago

          In the history of consoles, add-on upgrades have almost universally failed: Sega CD, Genesis 32X, N64 Disk Drive, N64 RAM pack, PS2 hard drive, Super Game Boy, Jaguar CD… should I go on? Letting people upgrade their consoles would cause ALL kinds of confusion and segregate the market. I have a current-gen Xbox, and I don’t want to have to worry about games running or not running on my system. Plus, why would they allow you to do that when they can, as I said before, re-sell you another console 6 years down the line and get all that new accessory revenue AND new, higher licensing fees?

      • Xenolith
      • 8 years ago

      This would result in a moving target. Publishers want a static target for game development.

      A moving target isn’t as big a deal in PCs because of ease of updating/patching. Also PC users are willing to put up with more bugs related to non-standard hardware.

        • HighTech4US2
        • 8 years ago

          As for game software, just have developers poll the system for its capabilities and enable more features when a more powerful GPU is detected.

          And the MXM upgrade module would come from the vendor only, not 3rd parties, so the vendor has control.

          • Xenolith
          • 8 years ago

          Publishers don’t want that extra potential for problems. They want a standardized experience when it comes to consoles.

            • HighTech4US2
            • 8 years ago

            This doesn’t seem to be a problem for those game publishers who produce a game for both the console market and the PC market. A PC game has many, many more variables than the MXM console solution I proposed.

            With the console makers continuing to use outdated graphics, those here who have stated that the next generation of consoles would kill PC gaming are absolutely wrong.

            • lilbuddhaman
            • 8 years ago

            Actually it is always a problem and is mentioned in nearly EVERY interview regarding Console Vs PC development. Not to mention the issues that the end user sees on a daily basis.

            As well, console “upgrades” rarely work, they have almost universally failed.

            • tay
            • 8 years ago

            You have no idea what you’re talking about. It’s fucking hard work getting your game to perform well with different hardware. Everyone who writes games professionally talks about it. What makes you think you’re right and they’re wrong?

            EDIT: Lilbuddhaman said the same thing above….

      • delsydsoftware
      • 8 years ago

      For one thing, console add-ons never sell as well as the actual console (see the Sega CD, 32X, extended memory carts on the N64, etc. for examples). Very few 3rd-party developers will take advantage of the additional hardware. So you’ll have a handful of 1st-party games supporting the new features, and maybe a couple of 3rd parties.

      Also, this defeats the entire purpose of having a console. The one advantage consoles have is that they all have the same hardware. A lot of innovation in game coding has come from developers working within the limitations of the current hardware, such as the data streaming tech used in the Grand Theft Auto games. Upgradable hardware will lead to lazy development.

      • PainIs4ThaWeak1
      • 8 years ago

      I agree with you. I like the idea.

      But I don’t really see how publishers would (or would WANT to) enable any decent extent of graphical improvement (aside from the post-processing techniques OneArmedScissor mentioned). I mean, the game engine would still likely be the same regardless, and retain the same polygon count per frame for a given title.

      • sschaem
      • 8 years ago

      They want to upgrade EVERYTHING: CPU, storage system, memory bus, etc., not just a discrete GPU design (which consoles don’t have to date).

      • StashTheVampede
      • 8 years ago

      Worst idea possible for a dedicated system. Consoles are dedicated units that are roughly fixed for their generation. The point of developing for them is that the maker guarantees nearly 100% compatibility between titles that shipped on launch day and titles shipped five years later, all working completely fine (with potentially a software update to the OS).

      I’m more interested to find out if it’s a single GPU. If Microsoft went with a mobile chip, they could piggyback multiple units.

        • phileasfogg
        • 8 years ago

        I think this is precisely what sschaem was saying a few posts ago. Multiple (e.g. 2) low-power mobile GPUs make sense in this context. Since semiconductor process yield (last time I looked) is inversely proportional to die area and defect density (raised to some power), the number of gdpw (good die per wafer) can be increased by using smaller die. Packaging costs are higher if you have to use 2 or more mobile GPUs on the PCB, though, and there’s the added cost of manufacturing the PCIe switch which links the GPUs to the PCIe root complex. But I’m thinking that there may be ways to save on die area and package size if they go with, for example, x8 lanes of PCIe Gen3 per GPU. That link width yields 8GB/s per direction at Gen3, to and from the root complex/switch. That’s twice the PCIe bandwidth of a Gen1 x16 GPU.
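
        (To make the die-size point concrete, here is a toy good-die-per-wafer comparison using the simple Poisson yield model; the defect density, die areas and edge-loss handling are all illustrative assumptions, not foundry data:)

            # Toy good-die-per-wafer comparison: one big die vs two half-size dies.
            # Simple Poisson yield model Y = exp(-A * D); the defect density and die
            # areas are illustrative assumptions, not real foundry numbers.
            import math

            wafer_area_mm2 = math.pi * (300 / 2) ** 2   # 300mm wafer, edge loss ignored
            defect_density = 0.004                      # 0.4 defects/cm^2, expressed per mm^2

            def good_die(die_area_mm2):
                candidates = wafer_area_mm2 / die_area_mm2
                return candidates * math.exp(-die_area_mm2 * defect_density)

            big, small = good_die(360), good_die(180)
            print(round(big), "good 360mm^2 dies vs", round(small), "good 180mm^2 dies")
            # ~47 vs ~191: smaller dies yield disproportionately better, which is the point.
            # For reference, PCIe Gen3 x8 is 8 GT/s * 8 lanes * 128/130 / 8 bits ~ 7.9 GB/s per direction.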

    • AGerbilWithAFootInTheGrav
    • 8 years ago

    Well, they’ll make the leap on PC… and it will become a clearer area on which to differentiate PC-only releases.

    • burntham77
    • 8 years ago

    Remember the 360 has 512 MB of GDDR3 memory. A bit of time has passed, so I would hope we would see at least double that on GDDR5 memory, if not more.

      • khands
      • 8 years ago

      4GB, unified architecture, minimum

        • ClickClick5
        • 8 years ago

        Seeing as there isn’t the background process overhead we PC users face, I would bet on 2GB easily. 4GB… maybe… but 2GB is still cheaper than 4GB (only by what now? $20?), so costs would go down too.

        My bet is this for the 720:
        -Quad Core POWER7 design CPU (with hyperthreading)
        -2GB GDDR5 Unified
        -6850 GPU

        If 512MB can do what we see now on a console, 2GB would just be used to fill silly things that a 1080 tv could not show clearly in the first place. Like a 16384×16384 texture on a die. So two dice and boom! But those will be some nasty high res black dots!

          • l33t-g4m3r
          • 8 years ago

          Now that would be a half decent console.

        • evilpaul
        • 8 years ago

        Skimping on RAM is a rich console tradition I don’t think we’ll see abandoned next gen.

          • odizzido
          • 8 years ago

          It’s such a shame too. You can really tell when playing console games that devs run into the ram limit a lot.

        • Anonymous Coward
        • 8 years ago

        Yeah I think a vast pool of RAM is more important than a fancy GPU or CPU. They could at least cache levels and help the worlds feel large.

        RAM is cheap and low power.

      • EsotericLord
      • 8 years ago

      The first Xbox only had 64MB of RAM. 512 was an 8x increase.

        • ClickClick5
        • 8 years ago

        They may use 4GB. But if they stick with 1080p, there will be a lot of wasted RAM. From their perspective, it is all cost. 8GB will be cheap too, but will 8GB serve the purpose of what resolution the console will output? I remember reading that Microsoft wanted to use 256MB total and Cliff Bleszinski from Epic begged them to use at least 512MB.

        Will they use 4GB? We dont know right now. I’m just not getting excited for a let down if they do not.

        What I do know is that Sony will be adding much more RAM into the PS4! One rumor I read was 6GB. Heh….we will see.

        But right, 4GB is 8x of 512MB…..so they might. Meh.

          • BobbinThreadbare
          • 8 years ago

          If it’s unified memory, it won’t be wasted. It can go to larger levels and fewer loading screens.

      • kamikaziechameleon
      • 8 years ago

      Once we start talking memory, we have to start asking: 32-bit or 64-bit? Seeing as game consoles are becoming increasingly multitask-focused (background downloading, messaging, etc.), it’s no longer fair to say consoles don’t have the overhead of PCs.

    • Corrado
    • 8 years ago

    The graphics in the Wii are god awful, yet it still sold a shit ton of systems and made a cubic buttload of money for Nintendo because they made money on the hardware. MS is in this game to make money. They’ve already established themselves in the market, so they now focus on mimicking that money printing machine that Nintendo has.

    • thefumigator
    • 8 years ago

    We should not forget that programming directly for a console has certain advantages. If programmers bypass DirectX and get direct access to the GPU registers, they could make things look better while improving performance at the same time.

      • Vrock
      • 8 years ago

      Somebody gets it, amazing.

      • sschaem
      • 8 years ago

      The Xbox uses DirectX. The advantage is not the lack of DirectX, but the fact that it’s possible to target a single HW shader implementation (among other benefits unique to console HW designs vs. ‘GPU on a card’).

      Having developers able to write shaders for a single HW implementation is NEVER going to happen on PCs, so get over it.

    • Xenolith
    • 8 years ago

    Is this only for the developer versions? Or for the consumer version? Or both?

    • gmskking
    • 8 years ago

    Somehow I doubt that it will have such a weak card.

    • yogibbear
    • 8 years ago

    Well… I thought I was looking forward to the next console cycle so that my PC games would start looking a bit better and using my hardware a bit more. I guess not. I guess we’re already ahead of them by about 5x. And they’re not coming for another 2 years… extremely sad.

    • bittermann
    • 8 years ago

    Who cares, as long as it can play games in true 1080p, right? You’d be surprised how good it can look if all you had to do was code for a 6670. The only issue is whether there is going to be enough horsepower to do AA. What was everybody expecting?

      • Zoomer
      • 8 years ago

      It will look like shit. The 128-bit bus constrains the quality and resolution of the textures being used.

        • LaChupacabra
        • 8 years ago

        I thought it being hooked up to a TV did that.

        • JMccovery
        • 8 years ago

        I don’t think memory bus widths work that way… Even if they did, the textures wouldn’t look any better, since they have to be sent from the CPU, which at best (X79) has a 256-bit bus.

        The main factors are [b<]how fast[/b<] the memory bus is (GDDR5 vs GDDR3), [b<]memory capacity[/b<] and [b<]GPU processing power[/b<] (texture decoding stages). You could have a GPU with 8GB of GDDR5 on a 1024-bit bus, but if it can’t process the textures correctly, they will still look like crap.
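
        For a rough sense of why the transfer rate matters at least as much as the width: peak theoretical bandwidth is just bus width (in bytes) times the effective data rate. The rates below are the commonly cited specs, so treat the numbers as approximate:

def peak_bandwidth_gbs(bus_width_bits, effective_rate_gtps):
    """Peak theoretical memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * effective_rate_gtps

print(peak_bandwidth_gbs(128, 1.4))   # Xbox 360 GDDR3 @ 1.4 GT/s   -> ~22.4 GB/s
print(peak_bandwidth_gbs(128, 4.0))   # Radeon HD 6670 GDDR5 @ 4.0 GT/s -> 64 GB/s
print(peak_bandwidth_gbs(384, 5.5))   # Radeon HD 7970 GDDR5 @ 5.5 GT/s -> 264 GB/s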

        • OneArmedScissor
        • 8 years ago

        The what bus? This is a system on a chip that will have its own RAM, not a PCIe card you’ve already seen before.

      • Peldor
      • 8 years ago

      After 8 years (2005 -> 2013) I certainly would expect more than a 6x increase in graphics horsepower. IMO, the next consoles would be smart to have at least a 67xx-class card, not 66xx (and preferably a 256-bit memory bus). At 28nm, which should be feasible for 2013 release, power draw would not be too high for a console (coming down from 108W at 40nm).

    • PainIs4ThaWeak1
    • 8 years ago

    So I guess the next-gen Xbox, will be called the “1080”?

    Oh, the madness of it all…

      • yogibbear
      • 8 years ago

      At least when it was a snowboarding game it made sense…

        • PainIs4ThaWeak1
        • 8 years ago

        Still remember that one, as well. Good ol N64.

        … Speaking of… THAT console name actually made sense as well. Take a hint Microsoft!

          • yogibbear
          • 8 years ago

          But if they call it the XBOX128 then then then…. MADNESS!

            • PainIs4ThaWeak1
            • 8 years ago

            touche’!

      • ludi
      • 8 years ago

      Nah, they’ll just switch to polar coordinates.

        • Geistbar
        • 8 years ago

        The Xbox 3, 70[super<]0[/super<]! It'll be twice as good as the Xbox 1.5, 70[super<]0[/super<]. Maybe for the next one they'll go all the way to spherical?

    • poulpy
    • 8 years ago

    Whether this is true or not, I don’t think anyone should be expecting the next-gen consoles to have a state-of-the-art high-end GPU consuming close to 200W at peak.

    Neither Sony nor Microsoft was breaking new ground in GPU performance last time around, and yet for both of them the hardware was a money-losing affair for years.

    And even if it ends up being a custom-made 6-series derivative from AMD, it would still be leaps and bounds better than what they currently have at their disposal, and it would also level the playing field at DX11 instead of today’s DX9.

    • PrincipalSkinner
    • 8 years ago

    So rumours previously said Sony would pull out of the next-gen race, and now this. But I also read somewhere that the next Xbox will have Avatar-like visuals.

    • brucethemoose
    • 8 years ago

    I’m still treating this as a rumor though, as saying the 6670 is 6x as powerful as a 360 might be a stretch… but it might carry some truth. A 360 GPU has 232 million transistors; a 6670 has 2x-3x that: 716 million. The 360’s GPU is clocked at 500MHz, the 6670 at 800MHz. The pixel and texture fillrates of the 6670 are roughly 2x-3x those of the 360. Memory bandwidth is doubled, memory is doubled, and the bus width is the same.

    Kinda sad, ain’t it? A $70 GPU beating a 360.
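
    The fillrate comparison above can be sanity-checked with the commonly cited unit counts (8 ROPs/16 TMUs at 500MHz for Xenos, 8 ROPs/24 TMUs at 800MHz for the 6670); treat the numbers as approximate:

def fillrates(clock_mhz, rops, tmus):
    """Theoretical (pixel, texel) fillrates in Gpixels/s and Gtexels/s: units x clock."""
    return rops * clock_mhz / 1000, tmus * clock_mhz / 1000

xenos = fillrates(500, 8, 16)    # Xbox 360 "Xenos": 4.0 Gpix/s, 8.0 Gtex/s
hd6670 = fillrates(800, 8, 24)   # Radeon HD 6670:   6.4 Gpix/s, 19.2 Gtex/s

# Ratios: ~1.6x on pixels, ~2.4x on texels -- in the 2x-3x ballpark for texturing,
# a bit less for raw pixel throughput.
print(hd6670[0] / xenos[0], hd6670[1] / xenos[1])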

      • Farting Bob
      • 8 years ago

      IGPs on $70 CPUs beat the 360 on paper. It’s an old console these days, and even when it was new it used a chip based on an already outdated GPU. It’s been die-shrunk quite a few times and has had more stuff built onto the die itself over time; the new ones are very efficient in power consumption while delivering much more than you would expect that chip to do on a desktop. A 6670 may not seem impressive to us, but it will be a huge leap forward for consoles. Anyway, people on tech sites don’t think much of $300 GPUs, while a whole console sells for less than that. Build a $300 desktop and try gaming on it. You will struggle.

        • brucethemoose
        • 8 years ago

        [url<]http://www.kmart.com/shc/s/p_10151_10104_020V004628480000P?sid=KDx20070926x00003a&ci_src=14110944&ci_sku=020V004628480000[/url<] Stick in a $70 6670 and, hardware-wise, it’s superior to a 360. The thing about consoles, though, is that due to their immense popularity, games are extremely well optimized for them. A game might run better on a 360 than it does on a rig with a Phenom II and a 6670, even though the PC has 3x or more power in every conceivable way. 7970/2500K rigs are essentially powering through unoptimized code with brute force.

        • axeman
        • 8 years ago

        It also makes a huge difference when the developer only has to optimize the rendering path for one model of GPU. They can code for its strengths and devise workarounds for its weaknesses. Devs who do that on PCs are accused of cheating or being shills for the green/red team.

    • Silus
    • 8 years ago

    Don’t doubt it. Consoles will always have weak processing capabilities when compared to current tech on the PC, mostly because it’s good enough for the target market, plus the fact that developers (by taking the direct-to-metal approach) get the most out of that hardware.

    Does this hurt games in both quality and quantity on the PC market? Yes, of course, and it will continue to hurt with no end in sight. We can only hope that some developers still give the PC the attention it needs.

    • Anonymous Coward
    • 8 years ago

    With TVs limited to 1920×1080 for the foreseeable future, it wouldn’t surprise me too much if MS toned it down a lot next time around and tried to make money on selling the console, Nintendo-style. Keep 100% backwards compatibility, add lots of entertainment center features (in software), maybe figure out a clever way to leverage software from the phone world, watch the price and go for volumes.

    Maybe they’ll find room in the budget for a useful amount of RAM.

      • cygnus1
      • 8 years ago

      By the time this comes out, 4K TVs will be on sale, and in the lifetime of this system, 4K will become standard. We’re talking roughly four times the pixels of 1080p. If the next Xbox comes out this weak, it will be the next Wii: incapable of coming anywhere close to what current TVs are capable of.

      But maybe that’s what MS is aiming for: good-enough graphics, something they can make a decent margin on right from the get-go.
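
      For the pixel math above, a quick check, assuming “4K” here means 3840×2160 UHD rather than DCI 4K:

full_hd = 1920 * 1080    # ~2.07 million pixels
uhd_4k = 3840 * 2160     # ~8.29 million pixels

print(uhd_4k / full_hd)  # 4.0 -- four times the pixels to render per frame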

        • Deanjo
        • 8 years ago

        [quote<]By the time this comes out, 4K TVs will be on sale[/quote<] I wouldn’t hold my breath on that one. There isn’t even a ratified 4K broadcast standard, and there is a lot of infrastructure work for the various service providers to do once there is one. Remember, 4K means a heck of a lot more bandwidth as well. We won’t see 4K in the same volumes as existing 1080p for a minimum of another 5 years.

        • Phishy714
        • 8 years ago

        No way. Even now, 1080p isn’t a standard across the board; only a good handful of stations offer it. 4K TVs will not even be somewhat affordable for AT LEAST another 2-3 years (read: affordable = 1-2 thousand). Then on top of that, you won’t have stations offering 4K content for AT LEAST another 3-5 years after 4K TVs have gone mainstream like LED has.

        You are seriously looking at close to 10 years before 4K TVs with 4K content become mainstream. So yeah, no point in releasing hardware able to support more than that.

          • yogibbear
          • 8 years ago

          All you need is some good p0rn to come out pushing 4k and those TVs will sell…..

            • Corrado
            • 8 years ago

            No one wants 4K porn. Have you seen HD porn? You can actually see that the women are haggard and used up meth heads

            • yogibbear
            • 8 years ago

            Point. Game. Set. Match. ’nuff said. I withdraw. πŸ™

            • burntham77
            • 8 years ago

            Yeah, I still have an appreciation for pre-HD porn where things are just grainy enough to tolerate.

          • cygnus1
          • 8 years ago

          None of you seem able to extrapolate dates. They’re talking about this thing being released almost 2 years from now, and it’s probably going to be around for 5+ years. Do you really not think manufacturers will push 4K displays as a way to differentiate their products? 4K will at least become a common top-tier option in the next 2 to 7 years.

          And who the hell cares about a broadcast standard? There were 1080i/p displays way before there was ever anything broadcast in HD.

            • Corrado
            • 8 years ago

            That’s the point. The TVs exist, but other than streaming or Blu-ray, there is no media that is even 1080p native. Broadcast is 1080i, and with the number of people I know who have a 1080p TV and REFUSE to pay the extra $5 a month for HD service, 4K as a true standard won’t happen for a long time.

            HDTV was adopted as a standard in 1996. How long did it take to become ubiquitous? 10 years or more? MS is not going to spend the money now so that a machine released in 2013 will still be current in 2019 or 2020. They’ll just make a new machine and sell it to you all over again.

        • OneArmedScissor
        • 8 years ago

        Lifetime of this system? lol! If it has a life at all, it’s not going to be the 7+ years the 360 got away with. This will be the last generation of consoles, and it will be short-lived.

        There aren’t going to be 4k TVs all over the place by 2015. There’s just no reasoning for that. And even if there were, their sales could only possibly pick up at the tail end of the “720’s” run, so it’s still irrelevant even in a hypothetical scenario.

        • Bigbloke
        • 8 years ago

        What utter tosh! Joe Schmo consumer can barely tell the difference between HD-ready 720p and Full HD 1080p at normal viewing distances; he sure as hell isn’t going to pony up for a 4K TV, especially when he realises how he was had by the whole 3D comedy parade. Flat-panel TVs are now a commodity and sell for peanuts (in relative terms), so it is going to take a brave company to push a 4K panel on us with no known content to play on it. They would end up losing a lot of money defining a market, and almost none of the current companies can take this step.

        Gamers need to take a reality pill and realise that TVs, consoles and good-enough graphics are going to be bedmates for quite a while. Go into any game store and look at the average buyer: they are moms and pops buying for their young kid. We older, more technorati folk are a small percentage of this market.

        I wish I was wrong, and I could have a 4K panel hanging on my wall (or better still a 4K projector), but that is a far-off pipedream for us all.

      • wierdo
      • 8 years ago

      Yeah, I kind of see the logic in this rumor. I mean, if the target market is 1080p at most, then there are better places to spend your development budget.

      I doubt 4k TVs will become mainstream within the next 5+ years at least, so such high resolutions wouldn’t be a big priority for consoles at this point imho.

      Also, I’m sure the design will, as usual, be custom-made to handle the specific needs of a console, like more closely coupled memory subsystems and maybe beefed-up handling of physics, etc. There’s a lot of benefit in having a system optimized for its specific niche, both hardware- and software-wise, so they can do a bit more with less.

      I hope they’re generous with the system memory size on this thing though, 2 gigs or more could be pretty handy I think.

        • DPete27
        • 8 years ago

        All this talk about 4K TVs not being widely available and supported for another 5+ years makes me sad.

        Back to the Xbox 720 (creative name, by the way… sarcasm). If this is truly the graphics performance we’ll see in this console in 2 years, it’s going to be a dark day for console gaming, and for PC gaming as well. First off, I would predict many will go back to PC gaming. Second, with console requirements being the ball-and-chain of game development, this would just further castrate the state of gaming quality going into the future.

          • OneArmedScissor
          • 8 years ago

          Sorry to throw gas on the fire, but what will actually dictate things by that point is phones.

    • Krogoth
    • 8 years ago

    A customized 6670 chip is more than sufficient for 2-megapixel gaming. I suspect the GPU will actually be based on the 7xxx architecture (76xx), mainly due to the die shrink and it being more programmable. That’s assuming TSMC’s 28nm process hasn’t run into issues.

    The CPU itself is probably going to be either Intel (Sandy Bridge has excellent power efficiency) or another PowerPC design.

      • Anonymous Coward
      • 8 years ago

        No way they’ll partner with Intel. Not only do they want more control than Intel will likely allow, but IBM has a few perfectly fine ISA-compatible CPUs to choose from.

        • Corrado
        • 8 years ago

          They went down the Intel road before and were not pleased with the results in the original Xbox. Intel decided not to give them discounts later in the lifespan, the same thing nVidia did. That pushed them to IBM/ATI for the 360.

      • Brad Grenz
      • 8 years ago

      The 76XX is just a rebadged 66XX. If this rumor is true, we’re looking at an SoC. Llano already has a 6670-class number of shaders integrated with 4 CPU cores; MS would just be swapping out the Athlon x64 cores for PowerPC-based ones (4-6?).

      It will be interesting if the 720 and Wii U end up within spitting distance of each other. Sony would be able to pretty easily come in over the top with a $400 box that is 4-8 times as powerful, potentially cornering the hardcore, Call of Duty and Madden markets.

        • OneArmedScissor
        • 8 years ago

        [quote<]If this rumor is true we're looking at a SoC. Llanos already have 6670 numbers of shaders integrated with 4 CPU cores.[/quote<] *ding ding ding* The 360 already has most of its chips moved into one 45nm package. This is just a further consolidated and shrunk down take on what they know for certain is possible. They're unlikely to be able to go further than 28nm by 2013. The doomsaying that this is way too far behind in technology is laughable. It will probably be more like Trinity's GPU than Llano's, and those should be able to swing most games at 1600x900 on a Windows laptop. An even more tightly integrated, more efficient chip that's completely dedicated to video games shouldn't have any issues with 1080p.

    • colinstu
    • 8 years ago

    Consoles are in the business of making the most money off the cheapest parts. That has been the case for a LONG time and will continue to be that way.

    The idea that console makers will fill their consoles with expensive components to allow for a nicer-looking experience is ABSURD! Corners can and will be cut (low res, low poly, upscaling, blurry/awful or no AA, etc.).

    • blastdoor
    • 8 years ago

    “Still, if there’s any truth to this rumor, games might not make a truly substantial leap in visual fidelity for a long, long while.”

    I wonder how many of the gamers who whine about this also watch such stunningly “low visual fidelity” TV shows as South Park or The Simpsons.

    What level of autism must one achieve in order to care about higher visual fidelity at this point?

      • brucethemoose
      • 8 years ago

      HD doesn’t change the content of South Park or The Simpsons. But better hardware gives developers more to play with: more AI, bigger game worlds, more action, deformable terrain, etc. The possibilities are endless… but they are limited by the hardware.

        • derFunkenstein
        • 8 years ago

        You don’t want to do most of that stuff directly on the GPU, though. The biggest limit is, believe it or not, time. They have to meet deadlines and ship products, unless we’re talking about Blizzard. Everybody wants to cry about delays, but it DOES take time to create this stuff. Hardware limits are way down the list of issues.

        • Corrado
        • 8 years ago

        All of that stuff has little to nothing to do with GPU power, and more to do with CPU power.

      • burntham77
      • 8 years ago

      Creativity is clearly more important than processing power, but it’s good to have it there to allow for flexibility. One would want the developers to be limited by their imaginations, not by the hardware. Of course, that’s what PCs are for.

    • Derfer
    • 8 years ago

    The 7970 uses more power than any next-gen Xbox will. One is a card, the other is a whole system, and consoles have relatively conservative power targets. Granted, I’d have preferred the shaders were GCN-based, but perhaps just die-shrinking a 6000-series part will be more efficient; the new architecture does seem to have some bloat targeted at applications outside of gaming. Regardless, it’ll be rendering games that look better than anything we currently have on the PC for far less money, at least for the first few years.

    Actually the parity between PCs and consoles may last longer this time since resolutions will be equal. Terms like consolitis will become limited to bad control schemes.

    • TravelMug
    • 8 years ago

    I call BS on this one.

      • colinstu
      • 8 years ago

      Nope. It’s unfortunately the truth.

    • odizzido
    • 8 years ago

    What should really happen is ATI/Nvidia start making games themselves. Assuming they wish to continue selling gaming GPUs to PC users.

    • JustAnEngineer
    • 8 years ago

    Let’s all say it again: “Consoles have killed innovation in game development.”

      • Game_boy
      • 8 years ago

      On the contrary, with a graphics and tech ceiling on each console, publishers are forced to innovate on content to get things to sell. Releasing the same game over and over doesn’t actually work (see: Guitar Hero).

        • grantmeaname
        • 8 years ago

        Yes it does (see: Rock Band).

          • derFunkenstein
          • 8 years ago

          Yes, that worked so well that EA dropped Harmonix.

            • kamikaziechameleon
            • 8 years ago

            EA??? what are you talking about, MTV owned them!

            • axeman
            • 8 years ago

            It worked well enough to make about a bazillion dollars before the novelty wore off…

        • Arclight
        • 8 years ago

        Yeah like games on rails with limited space to explore and lengthy cinematics.

        Yeah, big improvement from the devs. I guess we should also applaud the late adoption of the latest DirectX features (most importantly the bad and late adoption of tessellation).

        I’m also very glad that publishers treat console gamers as first-class citizens while PC gamers are treated like thieves, worthy of being spied on and having their drives scanned for the slightest OS or hardware change. /sarcasm

          • SPOOFE
          • 8 years ago

          [quote<] PC gamers are treated like whiny drama queens[/quote<] Fixed, 'cuz I'm such a nice guy. πŸ™‚

            • Arclight
            • 8 years ago

            Tough talk from, I presume, a console player. Try playing games like Anno 2070 and see if you still have that smiley face.

            [url<]https://techreport.com/discussions.x/22333[/url<]

          • l33t-g4m3r
          • 8 years ago

          [quote<]PC gamers are treated like thieves[/quote<] Only because we put up with it, my friend. If we all stopped buying their spyware-riddled games, they’d have to remove the DRM.

        • Phishy714
        • 8 years ago

        Yes it does: See Call of Duty.

          • Game_boy
          • 8 years ago

          No. It’s sufficiently different that people keep buying it. The core game is the same, but the setting, the maps, the single-player campaign/story, the game balance and so on do change. In fact, I would call that exactly what game companies should be doing, not taking franchises in weird directions for the sequel because they feel they need to SURPRISE people (personal example: f— Metroid Other M).

          You can’t argue with games that sell 10 million plus copies – 2D Mario, Mario Kart, Wii Sports/Fit and Call of Duty being the only recent ones. They are doing something right, the rest of the game industry should look to them.

          Guitar Hero was literally the same game content-wise, because new music was the content, not the plastic instruments people didn’t care about. When suitable music dried up, it died.

            • BobbinThreadbare
            • 8 years ago

            By this reasoning, Transformers is one of the great movie franchises of all time.

            • Game_boy
            • 8 years ago

            People want to see a bunch of CGI robots beat each other up while explosions happen in the foreground. Bay delivers perfectly.

            • Bensam123
            • 8 years ago

            This FUD should not be listened to at any level.

            You shouldn’t NOT innovate because you’re afraid of failing. This sort of thinking leads to a huge stagnation in not just video game development, but society as a whole.

            [url<]http://en.wikipedia.org/wiki/Fear,_uncertainty_and_doubt[/url<] Big successes require big risks. The very fact that MW is such a huge hit without taking any risks goes to show how few risks are actually being taken in AAA game development. There is no real competition. Even BF3 caved and became quite a bit more like MW.

        • Prestige Worldwide
        • 8 years ago

        And yet every year, the same Call of Duty game with a different number on the box breaks entertainment industry sales records time and time again.

        • kamikaziechameleon
        • 8 years ago

        Guitar Hero died because of oversaturation of the market, lol, not because of a tech ceiling. Just bad business.

        • lilbuddhaman
        • 8 years ago

        Tell that to the CoD series.

        • cynan
        • 8 years ago

        Well then, how about [i<]"Consoles have killed [b<]technical[/b<] innovation in game development."[/i<] Back in the late 90s, it seemed like every second PC game (at least those that relied heavily on 3D visuals) would come out with a brand-new engine (or at least a heavily optimized prior engine). Now, even with current games, it seems that any ol' engine that will run DirectX 9.0c is good enough. Just look at the number of current and future releases using the Unreal 3 engine. Yes, there was an improved version 2 which came in 2009. However, the fact remains that the U3 engine is still the go-to engine for many developers, and there is no sign of this slowing down over the next couple of years.

        • Bensam123
        • 8 years ago

        This is heavy sarcasm mixed with a bit of trolling, right?

        Cause Modern Warfare and a gajillion other console titles aren’t just all clones of each other, which conveniently have been ported to the PC.

        That aside, seriously, this sort of BS is getting old. Putting less money into graphics doesn’t mean developers are all of a sudden going to become highly innovative and produce Minecrafts out the wazoo for the console. You know why they aren’t? Because they haven’t! Look at all the games that have come out over the last six years, in which consoles have been the focus. We’ve had your golden-age scenario and it’s complete and utter shit.

        YOU CAN’T TRANSFORM MONEY INTO BETTER IDEAS! It doesn’t work. When it comes to game mechanics and innovation, people are the weakest link. You can have good graphics and shitty gameplay on the same budget as good graphics and good gameplay. It’s all about planning, concepts, ideology, and execution.

      • Krogoth
      • 8 years ago

      No, it’s big-budget projects funded by big suits and investors who want to avoid risk (like the MPAA/RIAA).

      There is still innovation going on in both the console and PC arenas if you venture into the indie market.

      • derFunkenstein
      • 8 years ago

      No, there’s no innovation in the big-budget market because dopes are willing to pay for the same thing year in, year out.

        • Game_boy
        • 8 years ago

        I call that craftsmanship on the part of the developers. They know what sells and they are highly experienced in creating it. Why is that bad?

        “Innovation” usually turns out to be another physics-based 2D platformer with an ‘edgy’ art design that’s angling for GOTY awards so hard.

        • Bensam123
        • 8 years ago

        Aye… if you change everything to look the same then eventually people stop expecting more because they haven’t witnessed it any other way.

        Sadly this is one of the side effects of consolization.

      • LiquidSpace
      • 8 years ago

      Not really. You have over 10 million WoW subscribers, probably on old PCs with single- or dual-core CPUs and 2005/2006 GPUs.
      Another thing: look at BF3. Without AA, both the 6970 and GTX 580 get pretty much 60-80 FPS, and that’s because DICE nerfed and optimized the game a lot so that these high-end GPUs would run it at 60 FPS.

        • Bensam123
        • 8 years ago

        Similar scenario: there haven’t been better MMOs people were willing to spend money on or upgrade for, although that’s changed in the last year or so.
