AMD’s RV770 graphics chips to show up in early June?

AMD’s next-gen graphics processors may show up in just two months, according to a report by Fudzilla. The rumor site says Taiwan Semiconductor Manufacturing Company—the foundry that handles much of the chip production for both AMD’s graphics division and Nvidia—is already at the “pilot production” stage for future AMD RV770 graphics processors. Those GPUs are expected to power Radeon HD 4000-series graphics cards.

In fact, Fudzilla claims RV770 production is so far along that the chip might be ready in time for an unveiling at the Computex trade show, which is scheduled to take place in Taipei, Taiwan in early June. The GPU might launch “shortly before or after” the show, the site elaborates.

If previous reports about future AMD GPUs are accurate, the RV770 will appear in high-end Radeon HD 4850 and Radeon HD 4870 graphics cards. Supposedly, the 4850 will run at 850MHz with 512MB of 900MHz GDDR5 memory, while the 4870 will run at 1050MHz with 1GB of 1100MHz GDDR5 RAM. Both models will apparently have 480 stream processors, although we’ve heard rumors that they may have as many as 800.
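
For rough context, here is what those rumored clocks would work out to in peak memory bandwidth. This is a back-of-the-envelope sketch, not a confirmed spec: it assumes a 256-bit bus (the report doesn’t mention bus width) and that the quoted figures are GDDR5 command clocks, which move data four times per cycle.

```python
# Peak GDDR5 bandwidth sketch from the rumored clocks. The 256-bit bus and
# the 4x transfers-per-clock multiplier are assumptions, not reported specs.
def peak_bandwidth_gbs(bus_bits: int, mem_clock_mhz: int) -> float:
    data_rate_mtps = mem_clock_mhz * 4           # GDDR5 effective rate, MT/s
    return bus_bits / 8 * data_rate_mtps / 1000  # bytes per transfer x GT/s

print(peak_bandwidth_gbs(256, 900))   # rumored HD 4850 memory: 115.2 GB/s
print(peak_bandwidth_gbs(256, 1100))  # rumored HD 4870 memory: 140.8 GB/s
```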

Comments closed
    • ecalmosthuman
    • 13 years ago

    Come on AMD, make nVidia pay for resting on its laurels for too long.

    I’ve been ready to upgrade from my 8800GTS for months now, and I’m waiting for AMD to lure me back to the red team.

      • Forge
      • 13 years ago

      Their very pleasing open-source antics have made me ready to go red again as well. I’m hoping very hard that RV770 is a worthy adversary to D10U, and that Nvidia/ATI is once again a matter of choice, with the performance lead being shared.

    • Deli
    • 13 years ago

    Is the 50% faster figure for a single RV770 chip, or for R700? If it’s the former, then OMFG.

    • Ashbringer
    • 13 years ago

    Wasn’t there a rumor that both AMD and Intel are working on hybrid CPUs that would change how PC graphics and games are made? They’d be built to run ray tracing.

    If Intel is going to release their hybrid chip later this year, then AMD can’t be too far behind. Wouldn’t that make the RV770 the last graphics card we’ll see from ATI?

      • poulpy
      • 13 years ago

      Hybrid CPUs aren’t just a rumour, they’ve been on roadmaps for quite a while, but AFAIK they won’t be a match for discrete GPUs.
      They’re going to be useful for bringing costs down, and they may even do some GPGPU tasks quicker than x86 cores can, but they won’t pack as much punch as a discrete card, nor will they be updated as often.

      • pluscard
      • 13 years ago

      Best I can tell, AMD and INTC are working in different directions.

      INTC is working on “Larrabee,” which is a discrete card, but it is attempting to build it out of x86-style CPU cores (it’s what they know).

      AMD is developing “Fusion,” which, according to Wikipedia, will “combine general processor execution as well as 3D geometry processing and other functions of today’s GPUs into a single package.”

      AMD is also developing “Torrenza,” which puts co-processors in additional sockets; those co-processors could easily be GPUs. AMD has defined a common interface so third parties can use the AMD platform to build custom solutions.

        • Flying Fox
        • 13 years ago

        The Nehalem architecture has on its roadmap possible integration with a GPU core on the package/die too. So it is a similar idea to Fusion.

    • 0g1
    • 13 years ago

    If nVidia uses GDDR3 and a 512-bit bus, that’s basically the same bandwidth as GDDR5 on a 256-bit bus. The difference is that the 512-bit bus costs a lot more.

    I hope for ATI’s sake that they’ve improved their core a lot, because they probably won’t have a memory advantage.
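
0g1’s equivalence above checks out arithmetically: GDDR3 moves two bits per pin per clock, while GDDR5 moves four, so at equal clocks a 512-bit GDDR3 bus and a 256-bit GDDR5 bus deliver the same peak bandwidth. A minimal sketch (the 1000MHz clock is illustrative, not a rumored figure):

```python
# Peak bandwidth: bus width in bytes x clock x transfers per clock.
def peak_gbs(bus_bits: int, clock_mhz: int, transfers_per_clock: int) -> float:
    return bus_bits / 8 * clock_mhz * transfers_per_clock / 1000

print(peak_gbs(512, 1000, 2))  # GDDR3 on a 512-bit bus: 128.0 GB/s
print(peak_gbs(256, 1000, 4))  # GDDR5 on a 256-bit bus: 128.0 GB/s
```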

    • matnath1
    • 13 years ago

    ATI, and then AMD, has consistently disappointed. I’ll believe it when I see it.

    • floodo1
    • 13 years ago

    Man, I hope ATI wins, because they don’t lock anyone out from CrossFire. AKA I have a CrossFire-capable board 🙂 (P35)
    nVidia should die for locking people that want SLI into their mobos 🙁 It’s questionable whether it makes good business sense, and it’s just not nice 🙁

    Other than that I really don’t care. If things keep going like they are now, I’ll end up buying a single nVidia card and being sad that I can’t run SLI 🙂

      • Meadows
      • 13 years ago

      Either that, or you wake up and buy an nForce board since you have money for SLI.

        • ChronoReverse
        • 13 years ago

        Except nForce boards right now pretty much suck (in comparison to Intel) for anything other than SLI. It’s a decision that’s simply a PITA.

    • lolento
    • 13 years ago

    If history repeats itself, TSMC tapeout doesn’t mean jack.

    R600 was taped out 6 months before the Xbox 360 was released, but the actual card didn’t materialize until a year later.

      • willyolio
      • 13 years ago

      Fudzilla reported that it was “taped out” back in January: http://www.fudzilla.com/index.php?option=com_content&task=view&id=5145&Itemid=1
      This time they said “in production.”

      • Lord.Blue
      • 13 years ago

      R600 was ONLY for the Xbox. The x1900/1950 was based on that chip, but was not the same.

        • Xaser04
        • 13 years ago

        No it wasn’t. The R600 is the HD2900XT. The R500 is what is in the 360.

        The R500 (codenamed Xenos) GPU is nothing like the X19XX generation GPUs.

        • ish718
        • 13 years ago

        R6xx is the HD 2000 and HD 3000 series.
        The Xbox’s GPU is Xenos, which R600 is based on.

    • Hattig
    • 13 years ago

    Glad to see ATI appears to finally be back on track.

    I’m sure it is 480 stream processors that perform like 800 of the previous generation stream processors.

    I’ll wait for the 240 stream processor variant.

    • danny e.
    • 13 years ago

    nvidia sucks. ati rules. ’nuff said.

    *hides 9600GT*

    • GTVic
    • 13 years ago

    Why RV770, why not R700? I thought they always produced the monster chip for the high-end cards first. Then they reduce that chip (fewer pipelines or whatever), call it the RVxxx, and sell it in the mainstream products.

      • grantmeaname
      • 13 years ago

      That’s the green team. G80>G82>G84>G86 and on and on and on.
      With ATi, the higher the number the faster the chip.

      • Game_boy
      • 13 years ago

      RV770 is one core of the dual-core R700.

      • Flying Fox
      • 13 years ago

      Things have changed.

    • wingless
    • 13 years ago

    It has 50% more shaders, so it will be 100% faster according to my fanboy mathematical equations.

    Seriously though, doubling the TMUs may actually accomplish that with double the fillrate. Maybe us ATI guys can finally turn on 8x AA!

    • no51
    • 13 years ago

    I can’t wait.

    • conlusio
    • 13 years ago

    It would be useful if people just skipped the “effective clock” when referring to DDR and referred to throughput. It makes the whole thing clearer anyway, especially when you’re trying to compare, say, GDDR3 on a 512-bit bus and GDDR5 on a 256-bit bus (which is what I think it’s limited to, if I read the whitepaper right).

    Edit: Meant to reply to #5.

    • Voldenuit
    • 13 years ago

    Why the fanboyism in this thread?

    I couldn’t care less who puts out the faster video card in the market, as it’ll spur competition, increase performance and lower prices.

    We’ve been stuck at G80 level performance for almost 2 years now, and that’s just sad.

    A rejuvenated ATI will wake up nVidia, if all goes well.

      • ew
      • 13 years ago

      l[

        • Nitrodist
        • 13 years ago

        I believe the meme is:

        You must be new here…

          • ew
          • 13 years ago

          Oh, right. I actually am new here. Sorry.

      • deruberhanyok
      • 13 years ago

      Yeah, I can’t imagine how horrible those people with 8800GTX cards must feel.

      “Ugh. I’ve actually gotten my money’s worth out of this thing and haven’t had to upgrade in more than a year! That’s so annoying!”

      Because, you know, we haven’t had anyone try to spur competition, increase performance and lower prices since the 8800GTX was released. Er, wait a minute, 8800GT / HD3870 cards are how cheap?

        • Meadows
        • 13 years ago

        You have a point.
        Actually, more than one.

    • kilkennycat
    • 13 years ago

    Better be June 2008 or earlier for the RV770 – IN HIGH VOLUME. By fall, AMD/ATI will have nothing to gloat about. Any ‘window of opportunity’ vs. nVidia’s next-gen will be minuscule. Anyway, it’s not a particularly good launch time, as summer (in the Northern Hemisphere) is always a low point for computer-peripheral purchases.

      • Flying Fox
      • 13 years ago

      Huh? What about the back-to-school season?

    • 2x4
    • 13 years ago

    I still hope that nVidia will blow AMD away with its next gen, coming in July.

      • Lord.Blue
      • 13 years ago

      I hope that they are equally powerful, so prices will fall.

      • pixel_junkie
      • 13 years ago

      I hope fanboyism will STFU.

        • VILLAIN_xx
        • 13 years ago

        Best response I’ve seen in a while.

        Haha.

      • TurtlePerson2
      • 13 years ago

      Wasn’t the 9800 GTX its next gen? That didn’t blow anyone away…

        • Lord.Blue
        • 13 years ago

        The 9800 is still the G92, so not really next gen – just a stopgap. The 8800GTX Ultra outperforms it.

      • Nitrodist
      • 13 years ago

      I hope either company comes out with an absolutely amazing card.

      • Kaleid
      • 13 years ago

      Why? ATI needs to become stronger so the fight for customers becomes more vicious. Nvidia has too much of the market share already… and Intel certainly does.

    • Helmore
    • 13 years ago

    Uhmm, you guys are a little off on memory frequency, at least according to the rumours (we might well end up with something completely different :P).
    Memory frequency for the 4870 is 2200 MHz, and thus 4400 MHz effective after taking DDR into account. The 4850 will have 1800 MHz, which means 3600 MHz DDR. This is also the reason for the use of GDDR5, as GDDR5 allows frequencies over 3 GHz (6 GHz effective (DDR)).

      • Meadows
      • 13 years ago

      If what you say is true, then TR assumed the rumour was already citing DDR numbers when it wasn’t.

      • spartus4
      • 13 years ago

      I find your numbers just a little too out there. Could you give us a link to the information that you cite here? It would be a great help.

      • paco
      • 13 years ago

      [Crossed out: 2200 MHz × 2 bit/Hz × 256 bit/interface × 2 bit/interface = 2,252,800 Mbit × 0.125 byte = 281,600 MB/s = 281.6 GB/s. Too high; however, that would be the correct bandwidth if you factor in a 512-bit interface.]

      Hynix claims its chips can do 160 GB/s on a 256-bit interface: http://en.wikipedia.org/wiki/GDDR5

      [Crossed out: 160 × 1000 / 0.125 = 1,280,000; /2/2/256 = 1250 MHz (2500 MHz effective)]

      Oops, I seem to have too many 2’s in there. It’s 160 / 2 / 32 (256 bit / 8) = 2.5, i.e. 2500 MHz (5000 effective). So 2200 MHz would give us 2.2 × 32 × 2 = 140.8 GB/s, and if they scale up to 3 GHz, about 192 GB/s of peak bandwidth on a 256-bit bus; ×2 for 512-bit :)
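
paco’s corrected arithmetic above is easy to verify. A quick sanity check, using his figures (Helmore’s rumored 2200 MHz and Hynix’s claimed 160 GB/s):

```python
# Sanity check of the corrected numbers: GB/s = bytes per transfer x GT/s.
bus_bytes = 256 // 8  # a 256-bit interface moves 32 bytes per transfer

print(160 / bus_bytes)      # Hynix's 160 GB/s -> 5.0 GT/s, a 2500 MHz "DDR" clock
print(2.2 * 2 * bus_bytes)  # 2200 MHz (4400 MT/s effective) -> 140.8 GB/s
print(6.0 * bus_bytes)      # 3 GHz (6 GT/s effective) -> 192.0 GB/s
```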

        • 0g1
        • 13 years ago

        I doubt GDDR5 would be slower than GDDR4.

          • Meadows
          • 13 years ago

          Given that GDDR4 is already slower than GDDR3 in some cases.

        • 0g1
        • 13 years ago

        Ahh, that makes sense. I assumed your calculations were correct at first, but now that I look at it again, I’m wondering what the “2 bit/interface” was for… thanks for correcting it!

      • MadManOriginal
      • 13 years ago

      Excessive bandwidth really helped the 2900xt. Oh, wait…

      Granted, with 50% more shaders these have a lot more processing power, but I still don’t know if crazy bandwidth will matter.

        • ish718
        • 13 years ago

        I’m sure the higher number of shaders will improve performance, but the excessive bandwidth won’t make a difference, since games can’t even take advantage of it. The faster GDDR5 memory should improve performance…

          • MadManOriginal
          • 13 years ago

          Yes, that’s what I was saying. Now, if GDDR5 lets them lower costs with a slimmer memory bus while still delivering high bandwidth without poor memory timings, then it would be an all-around win.

    • Gerbil Jedidiah
    • 13 years ago

    Oh please let it be true. I want Crysis in all its glory!

    And I want it with a SINGLE videocard!

      • Meadows
      • 13 years ago

      Then you’d better pay attention to nVidia’s next generation card.

        • 2x4
        • 13 years ago

        I hope you’re right, man.

          • charged3800z24
          • 13 years ago

          If games were enhanced for ATI cards, we would already have a different story on our hands. It seems everything is Intel/nVidia oriented. The 3000 series has potential but can’t utilize it; only 3DMark06 shows this in shader power.

            • piesquared
            • 13 years ago
            • TurtlePerson2
            • 13 years ago

            The reason games aren’t enhanced for AMD is that AMD doesn’t work with developers the way Nvidia does.

            • Flying Fox
            • 13 years ago

            They do, they just don’t have a catchy name for their programs. They need to send hardware to the devs for testing and work with them to fix driver bugs in games.

            • Hdfisise
            • 13 years ago

            The latest Source pack, TF2 etc., was optimised for ATI cards, so they do have some games helping them 😀

            • Meadows
            • 13 years ago

            Yeah, and HL2 has a “Runs best on ATI” logo on it, and yet TR’s reviews consistently show it running better on nVidia.

            • pluscard
            • 13 years ago

            AMD has recognized this and is working more closely with game developers now.

    • ssidbroadcast
    • 13 years ago

    That’s being /[

    • gtoulouzas
    • 13 years ago

    Cue customary f.u.d. on “theinquirer” about ATI’s next part wiping the floor with nVidia (this time for real, promise!) in 5..4..3..2..
