Report: Radeon HD 7000 series about to hit mass production

AMD might not be finished adding new cards to its Radeon HD 6000 series, but that doesn’t mean the next generation is all that far away. The folks at DigiTimes say they’ve gotten word from sources at graphics card makers that the Radeon HD 7000 series, a.k.a. Southern Islands, will hit mass production next month.

The site doesn’t reveal too many details, although an earlier story by SemiAccurate hinted that the first Radeon HD 7000-series cards would have 28-nm GPUs derived from the same architecture as Cayman chips inside current Radeon HD 6950 and 6970 cards. SemiAccurate expressed doubt about TSMC’s ability to deliver the 28-nm parts in volume, however, saying retail cards might arrive some time between July 2011 and July 2012.

In other AMD graphics news, DigiTimes says three budget Radeons will hit retail shelves in about a week’s time: the Radeon HD 6670, 6570, and 6450. We already reviewed the Radeon HD 6450, but a quick search at Newegg currently shows no matches for that card outside of a pre-built HP system. I haven’t heard about the 6670 and 6570 before, though; perhaps those cards are based on the Turks GPU we saw in AMD’s 2011 roadmap last year.

Comments closed
    • odizzido
    • 11 years ago

    I would be more excited if games worked properly with Linux. Not that games work quite right with Windows 7 either. Windows XP was lucky, since anything that didn’t work with Windows would work with DOSBox (of the games I’ve played, at least)… but Windows 7 doesn’t have anything to fill that role for the in-between games like StarCraft/VTM:BL.

    • DrCR
    • 11 years ago

    I wish AMD/ATI could get their act together and release some proper Linux drivers. Only then could I get excited.

    • derFunkenstein
    • 11 years ago

    He said the 4850 level of performance, not the 4850 itself.

    • abw
    • 11 years ago

    http://www.anandtech.com/show/907

    • PRIME1
    • 11 years ago

    RTFA! The price is $54.99

    • ImSpartacus
    • 11 years ago

    Yup. It’s unfortunate, but we won’t see any enormous changes until the consoles are updated. Two or three years and we might see a new console and a new generation of PC gaming.

    • Anomymous Gerbil
    • 11 years ago

    You’re suffering from judging the smallish iterations that come with each card and/or engine and/or DX version. To cure yourself, just compare today’s games with what you got 10, 15 and 20 years ago, and you’ll soon appreciate the gains made with each “small” advance.

    • PRIME1
    • 11 years ago

    That’s what PhysX and 3D are for. Well for modern graphics cards anyways.

    • JustAnEngineer
    • 11 years ago

    Current-generation PC gaming processors are not too powerful. Old-tech consoles are too wimpy.

    • Squeazle
    • 11 years ago

    I didn’t, I just thought they might want to put the link there, rather than before that was mentioned. Semantics really, nothing important.

    • dpaus
    • 11 years ago

    Poseurs! I was rockin’ a ‘Radeon 8514’ in the mid-90s 🙂

    • abw
    • 11 years ago

    Also have a Radeon 7500 circa 2003 on my old laptop….

    • neon
    • 11 years ago

    I already have a Radeon 7000. It has some of the sweetest 2D performance, plus HydraVision ftw!

    • potatochobit
    • 11 years ago

    They don’t need 6 different choices at $5 increments.
    Not to mention that last year’s batch still sells, and the year before that’s too.
    It’s like a math contest with product naming.

    • tejas84
    • 11 years ago

    by bcronce “Personally, I’m rooting for ATI as nVidia has been a bunch of jerks.”

    Never were truer words spoken. I used to think Nvidia could do no wrong. Now I see them as the Rambus of the GPU industry.

    They are jerks and pricks….period

    • Buzzard44
    • 11 years ago

    I know right…

    Oh wait, there’s like 90%+ of the PC market that doesn’t need $100+ graphics.

    • madmanmarz
    • 11 years ago

    There should only be like one budget card – bare minimum for casual games, HD video, etc.
    Then like a $100 GPU, $150, $200+…

    • Decibel
    • 11 years ago

    I think you missed the last line of the article:

    “With its Radeon HD 6000 series product lines fully filled, AMD is already in preparation for the next generation Radeon HD 7000 series (Southern Islands) GPUs and is set to mass produce the GPU in May this year.”

    • potatochobit
    • 11 years ago

    NO MORE BUDGET CRAP
    what is with them flooding the market with this junk
    who is buying this stuff?

    • bcronce
    • 11 years ago

    We’re in a transition unlike other transitions. DX11 is the beginning of a new way of rendering, and CPUs/GPUs are currently too powerful. The next generation of games is going to make use of many cores and actually make use of the new DX11 tech.

    DX11 allows for a whole new way of rendering compared to how everything has been done for the past 15 years. BF3 will make use of a lot of those abilities and UT3 looks to be awesome.

    Once those new games come out, any shortfalls or new ideas on where to go next will come out and Microsoft/ATI/nVidia will have new ideas on which way to go. I really think this will herald in a dramatic change in 3D graphics that will happen over the next 3-5 years.

    • can-a-tuna
    • 11 years ago

    “Radeon HD 6450, but a quick search at Newegg currently shows no matches for that card ”

    And yet you drew a price-performance conclusion about the card, saying it would suck and telling people to buy a GT430 instead. There is still no street price for the card. How confusing!

    • ronch
    • 11 years ago

    I don’t know if it’s just me, but every time a new game title or engine rolls out, it always falls somewhat short of being truly realistic. It reminds me of calculus: graphics quality always approaches reality but never quite gets there. The funny thing is, for that itty-bitty bit of graphics improvement, new games demand far more graphics horsepower. Is it all really necessary? All those rendering tricks, and for what? Surely it gets harder and harder every time, but not to the point of reducing all our high-end cards to slide show devices. Maybe the next show-stopper still isn’t here (or maybe it’s Unreal Engine 3), but you can bet your a$$ it’ll come, and you can kiss your GTX 590’s prowess goodbye.

    • bcronce
    • 11 years ago

    I think SC2 is mostly CPU bound. It only makes use of ~2 threads and pegs both quite well.

    • Johnny5
    • 11 years ago

    Sales may be slow to change, but even if ATI is doing more with less die space or whatever, the pricing will generally be competitive, so the consumer can reasonably go either way. Nvidia might have to get by on tighter margins to match ATI’s prices, but they will do it as long as they can afford to, and probably even if they can’t, just so they don’t lose market share for when they do have cheaper, faster chips to hawk. But even if sales don’t change, if ATI has much better margins then they can be more financially successful, which is a fine reward for their efforts. And sales might be slow to change, but they are changing.

    • HyperFire
    • 11 years ago

    Hopefully we will be able to play SC2 at 1920×1080@120Hz without dropping below 60fps in battles.

    • khands
    • 11 years ago

    The 4850 at this point is a long deprecated part. We’ll probably see the 6750 (if one ever launches) at $100 or less with comparable power at launch.

    • designerfx
    • 11 years ago

    The more processing power that is provided, the more it’s used.

    I was about to get a 6970, but it sounds like I may as well wait for the 7 series, in order to either a) let the 6 series drop in price, or b) get a midline 7-series card with the performance of the 6 series.

    Even new games such as Rift (MMO) are pretty demanding lately.

    • OneArmedScissor
    • 11 years ago

    What will use it is what they made clear to be their intention a few years back:

    More and more monitors. And then one day, they’ll have a holodeck.

    • michael_d
    • 11 years ago

    If mass production starts next month, then we should see the final product in late June, which coincides with the recent announcement that the 7000 series will be previewed in mid-June.

    • sweatshopking
    • 11 years ago

    When the quarter finishes, check the market shares. It might be a percent or two different, but that’s it.

    • bcronce
    • 11 years ago

    The new Unreal 3 preview video required three 580s in SLI for a smooth frame rate. I’m sure newer engines will make use of that power over the next year or two.

    • bcronce
    • 11 years ago

    I love my ATI 6950 (unlocked @ 915/1400), but I wonder how a 28nm Fermi will do. nVidia was targeting 32nm when they designed their card, and it has done quite well at 40nm, but I wonder if nVidia will be better able to take advantage of the huge die shrink.

    Either way the customers will win and both nVidia and ATI have great products. I can’t wait to see benchmarks.

    Personally, I’m rooting for ATI as nVidia has been a bunch of jerks.

    • jjj
    • 11 years ago

    Since TSMC said they expect 2-3% of revenue to come from 28nm in Q4, and no one else is talking about 28nm parts before Q4, this seems highly unlikely.
    Using GF for it is even more unlikely, since GF doesn’t have 28nm yet, and AMD has already stated that GPUs/chipsets on GF’s 32nm next year would be too soon.

    • PrincipalSkinner
    • 11 years ago

    This brings us to an interesting problem. What will use all that power? Crysis 2 won’t. Thousands of UE3 games won’t. Battlefield 3 maybe?
    Anyway, my recently bought 5850 just got older now.

    • OneArmedScissor
    • 11 years ago

    If their new strategery is kicking people every time their next master plan comes to fruition, I can’t blame them. Better to have fresh blood and keep the ball rolling than let them sit around long enough to turn into another Ruinz.

    • stdRaichu
    • 11 years ago

    TBH, I think it’s more likely to be due to AMD being the first to market with a small (cheap) die that used very little power and still packed a wallop in terms of GPU grunt. The 5000 series was designed from the ground up to be small and power-efficient (especially under load), making it and its derivatives natural choices, and nVidia didn’t respond for a loooong time.

    Disclaimer: I was an nVidia guy until the 5770 came out.

    • OneArmedScissor
    • 11 years ago

    Yeah, but now AMD is intentionally killing that market lol.

    • Flying Fox
    • 11 years ago

    And yet one has to wonder why Dirk was ousted.

    • Game_boy
    • 11 years ago

    A lot more laptops and OEM desktops have been using AMD discrete graphics since bumpgate. Notably Apple.

    • Game_boy
    • 11 years ago

    Let’s hope it brings a 4850 level of performance below $100. The 5750 was >$100 and there was no 6xxx part in that range.

    • odizzido
    • 11 years ago

    Let’s hope the jump to 28nm gives the nice boost in performance that hasn’t really happened since TSMC flopped and kept everyone at 40nm.

    • sweatshopking
    • 11 years ago

    but will it impact nvidia? probably not that much. changing markets takes years, usually. even when they were kicking ass with the 4k series, they didn’t sell much more. the only people who change are the nerds, like us. anyone who goes to futureshop/bestbuy, dell, etc. is going to get whatever the loser with corporate training tells them to. and it’s brand, not device, that wins in that world.

    market share has been pretty much the same for years. i don’t see this changing anything.

    • ImSpartacus
    • 11 years ago

    Yes, it’s awesome. Gotta love some competition, eh?

    • shank15217
    • 11 years ago

    They have been putting the pressure since 4000 series last I checked.

    • dpaus
    • 11 years ago

    The elves at AMD certainly have been busy lately… Between new CPUs, APUs and GPUs, I think AMD is introducing more new-and-significantly-improved products in the current 12-month period than they have in the last 5 years combined. If they can start to deliver 7000-series products in the second half of 2011 that’s going to put more competitive pressure on Nvidia than ATI was ever able to.

    • Squeazle
    • 11 years ago

    The first site you linked is referencing the 6000 series, 3 more cards on the way thing.
    Not that it shouldn’t. It’s just in a weird place.
