Site posts 3DMark scores for Radeon HD 6800 cards

After the rumors, the leaked specs, and the leaked photos, we’ve now come across purported 3DMark results for AMD’s upcoming Radeon HD 6800-series graphics cards. The numbers track pretty well with what I was able to infer from that comparative slide that found its way online last month.

According to Xfastest, the Radeon HD 6870 scored P16270 in 3DMark Vantage, while the slower 6850 managed to hit P14872. Those numbers are slightly below those of the existing Radeon HD 5870 and 5850, respectively, the site says. Xfastest also posted some 3DMark06 scores, which show the two newcomers pretty much matching their previous-gen siblings.

According to recent evidence, these Radeon HD 6800-series cards will feature Barts Pro and XT graphics processors, but those won’t be the highest-end GPUs in the Northern Islands family. Rumor has it that AMD is also cooking up a Radeon HD 6970 with a more powerful Cayman XT graphics chip inside. (Thanks to Expreview for the link.)

Comments closed
    • tygrus
    • 9 years ago

    If the product is virtually identical, then it should have the same name with maybe a digit on the right (least significant) changed.
    If products use the same generation of core, then they should use the same family designation (digit on the left).

    I could very easily understand a 6850 being better than a 5870 if there were a >20% performance difference. Look at the G92 and G94, and how many naming generations of Nvidia products they got reused for at the low end.

    • ryko
    • 9 years ago

    Let’s hope they are priced at $150 and $200… but that would make too much sense. Price them high and watch more and more people not care about them. A major reason many skipped the 5x00 series is that it wasn’t really a good value… especially compared to the 4x00-series cards.

    • Krogoth
    • 9 years ago

    Geez, a leaked score from a dubious source based on a synthetic benchmark is creating a lot of fuss. I guess people cannot wait for real hardware and real benchmarks.

    Anyhow, I don’t see the big deal. GPUs are running into physics problems of their own; that’s expected with exponential growth. The days of rapid performance leaps are over. Get used to tiny bumps here and there, like we’ve been seeing with CPUs since Prescott.

    If the HD 6xxx series ends up only marginally faster than the HD 5xxx series it replaces, it will likely have one key advantage for AMD: it will be cheaper to produce. Perhaps cheaper than the current Fermi derivatives. I cannot say whether or not this will translate into lower MSRPs.

    The current pricing situation with the HD 5xxx series is a combination of TSMC 40nm troubles, a lack of intense competition for most of its product life, a crummy economy, and waning demand for performance graphics.

    The PC as a gaming platform is no longer the driving force in the market. It is already playing second fiddle to gaming consoles. All recent attempts to develop a PC game that pushes the hardware envelope have been fiscal disappointments for major publishers. As an unintended side effect, indie developers are unwilling to gamble on the same strategy. The result is a sea of console ports and ultra-conservative MMORPGs, with a few exclusive titles here and there.

    The proof? An 8800 GT and its modern derivatives can still deliver acceptable performance in most current games. Nvidia and AMD are coming up with performance-taxing and ridiculous marketing gimmicks (Eyefinity, 3D Vision, PhysX) to justify their current performance lines. Their long-term strategies involve moving away from PC gaming as their bread and butter toward more promising waters.

      • YeuEmMaiMai
      • 9 years ago

      I would not call my laptop with an i5 @ 2.97 GHz running faster than my Core 2 Duo desktop @ 3.6 GHz an incremental performance increase… that’s a pretty dang good jump…

    • Bensam123
    • 9 years ago

    So the new generation is the old generation… wtf?

      • flip-mode
      • 9 years ago

      Seriously? No, it is not. It’s all new silicon, supposedly, with totally redesigned shaders. Obviously, all that can clarify things at this point is TR’s review.

    • wira020
    • 9 years ago

    Does anyone here really buy a GPU according to its supposed market segment or model number? I usually buy according to my budget, after comparing it a bit to the competition… but maybe I’m just weird…

      • Voldenuit
      • 9 years ago

      The market segment pretty much determines how much the companies charge for them.

      If AMD moves a midrange part into an upmarket segment, the performance won’t change, but you can bet they will try to charge more for it.

    • Dingmatt
    • 9 years ago

    AMD Board Meeting:

    CEO – “Crap, Nvidia’s Fermi is catching up; damn those affordable 460s. Quick, we need a new product”

    Underling 1 – “But we’ve run out of numbers in the 5000 series…”

    CEO – “Then just overclock and re-badge the old stock and market them as the 6000 series, do I have to think of everything..?”

    Underling 2 – “Didn’t Nvidia piss a lot of people off doing that?”

    CEO – “That’s why we’ll just shuffle our naming conventions around a bit to confuse the fan-base.”

    Underling 1 – “That’s brilliant!”

    Underling 2 – “There’s only so much overclocking we can get away with..?”

    CEO – “Then just spread rumours about a vaporware dual-chip card… something like the ‘6990 XXL’; we can always deny its existence in 6 months”

    Underling 2 – “I’m sold!”

      • GrimDanfango
      • 9 years ago

      That’s all hilarious, but you missed the part about these being completely newly designed chips. It’s still unclear whether they’re *actually* replacing the 58x0s with the Barts chips, or whether this is just some speculation that everyone’s taken as fact.
      But the fact remains that even if they go with that naming convention, you’ll have cards that perform similarly while costing a lot less and using a lot less power than before, and a month or two later a chip that costs and consumes about the same but runs a hell of a lot faster.

      Doesn’t seem such a cheap marketing ploy to me… seems like business as usual.

      The only issue anyone has is with this supposed segment-naming change, which is based on one leaked Chinese slide with no other sources to back it up, and which really doesn’t change anything even if it turns out to be true. Basically, everyone’s bitching because it seems like Cayman might take a month or two longer to turn up.

    • The Egg
    • 9 years ago

    wait wait wait a minute here….

    I missed the part where AMD officially put a model number on something. For all you know, they could make these the 6450 and 6470.

      • Voldenuit
      • 9 years ago

      Would you buy a new BMW if they took next year’s 1 series and named it a 3 series? And kept it at the pricing of the outgoing 3 series?

      In your analogy, people would be getting more for less. Here, they’re getting less for more*. Big difference.

      * Well, even pessimists like myself are expecting the 6800s to retail for less than the 5800s. But in the tech world, stagnation is synonymous with decline. A 6800 that performs roughly the same as a 5800 a year later is no great deal unless it comes with a bargain price. Which everyone is saying it won’t.

        • bimmerlovere39
        • 9 years ago

        Returning the 3 series to a smaller size sounds like a lovely idea, actually…

        Not that that’s at all relevant 😛

        • Corrado
        • 9 years ago

        I don’t know if you’ve priced a 1 series lately, but they’re really not much cheaper than a 3 series. A 135 reasonably optioned is over $40K.

      • Waco
      • 9 years ago

      Are you trying to introduce some logic here?

      *gasp*

      None of this is official yet everyone is getting riled up as if AMD just fucked their mother. I just don’t get it.

        • GrimDanfango
        • 9 years ago

        Hehe, yeah… I’ll laugh my arse off if this turns out to be called the 6770 after all. The only shame is that in the world of the internet there won’t be any way to laugh directly into everyone’s actual faces 😛

        Reserving judgement for judgement day… only a week to go 🙂

    • internetsandman
    • 9 years ago

    Why are they increasing the model numbers for a lower level of performance? It’d make more sense for Barts to be the 6700 series =/ What’s the deal with saying it’s a next-gen product that performs worse than its previous-gen equivalent?

      • MadManOriginal
      • 9 years ago

      Maybe they figured that last generation’s approach (no real change in price/performance, just added features at a higher price, except for the top-tier chips) wasn’t good enough, so they wanted to go for worse price/performance.

      • Voldenuit
      • 9 years ago

      Well, since the “new” 6700 is a rebadged Juniper (5700), we can rejoice that AMD is at least producing…

    • Goty
    • 9 years ago

    To settle this for all the alarmists, just make the following substitution:

    6850 → 6750
    6870 → 6770

    There, now everything is right with your world. The numbers associated with each product segment have changed, not the performance tiers associated with those segments.

      • Voldenuit
      • 9 years ago

      Can we insert our pricing of choice as well?

        • Game_boy
        • 9 years ago

        Yeah, if these are priced like 58xx it’s sad face time.

    • Voldenuit
    • 9 years ago

    I think this is a dangerous game AMD is playing. They’re taking advantage (so they think) of a lack of competition to push their midrange part into an enthusiast pricing tier.

    If, however, Nvidia manages to respond with price cuts and/or fully functional GF104 parts (hmmm… conspiracy gears whirring), they may be forced to cut prices as well. And once people become accustomed to paying $150 for an x800 part, the damage to their pricing tiers may well carry over to future generations. Oh well, I guess there’s always hexadecimal…

    New: the Radeon 7A2E!

    • Sargent Duck
    • 9 years ago

    Well, that’s disappointing. I was looking at upgrading my 4850 to a 6850, but if it only has the performance of a 5850… I don’t see a reason to upgrade yet.

    Guess I’m waiting for the 7850 :(

      • aestil
      • 9 years ago

      This is basically why AMD ‘changing’ the naming scheme is really a bad move on their part. Consumers with something like a 4850 thinking ‘I’m going to upgrade to the 6850’ will be disappointed that performance seems to have gone backwards relative to the generation they skipped.

      They won’t realize what they really want is the 6950, and that the die size of the 6850 is closer to the 6750.

      For this ‘rebranding’ to work, AMD has to educate a lot of users that are not watching the rumor forums every day.

      Also, with GTX 460 1GB cards selling for $200-220, and probably closer to $200 when this 6850/70 launches, I think AMD should go for the jugular and price Barts XT at $220 and Barts Pro at $180 MSRP. That would put a tremendous amount of pressure on the GTX 470 and the GTX 460 (both versions).

      Those would be successful and probably universally lauded as great-value GPUs. For some reason I don’t see AMD doing this.

    • flip-mode
    • 9 years ago

    It’s all about price. If the 6850 comes in over $200 or the 6870 over $250 I’ll be sad – very, very sad. Er, well, maybe not that much since I’m not playing anything that the 4850 doesn’t handle. The fact is that I just don’t need a new card. I almost want to need a new card. Well, if I make it to summer 2011 with the 4850 that’ll be pretty swell – that’d mean almost three years on a card I bought used for $150. During my progression from Radeon 9500 Pro to Geforce 7900 GS I was buying/selling cards at least once a year if not more frequently than that. (9500 Pro to 9800 Pro to 6800-128 to 7600 GT to 7900 GS) It has been really nice to have something and see it last.

    PS – I don’t get all the complaining over the benchmark used when it’s a…

      • sweatshopking
      • 9 years ago

      +1!!!!!!!!!!!!!!

    • Mystic-G
    • 9 years ago

    I prefer real world gaming benchmarking.

    • eitje
    • 9 years ago

    Oh, I didn’t realize people still used 3DMark for stuff.

      • can-a-tuna
      • 9 years ago

      What game would you suggest that would be neutral enough to evaluate Nvidia/AMD graphics performance and could be used for years to come? 3DMark is good at giving a rough performance estimate in one go.

        • GTVic
        • 9 years ago

        Duke Nukem: Manhattan Project

    • AMDguy
    • 9 years ago

    Is there an error here, or is the new generation 6870/50 actually slower than the older 5870/50?

    • TaBoVilla
    • 9 years ago

    3DMark scores? For 3 years the…

      • Spyder22446688
      • 9 years ago

      Pfffft. 1920×1200 or GTFO.

        • can-a-tuna
        • 9 years ago

        Exactly :). A 1080p TV screen has too few pixels.

          • Waco
          • 9 years ago

          I’m sure those extra 120 pixels in height really change how the game plays.

            • Dingmatt
            • 9 years ago

            Yeah, the extra 11% resolution height makes all the difference (you can see the sky and ground at the same time)

            • Waco
            • 9 years ago

            Pffft…with proper FOV settings you see the same damn thing height-wise regardless.

            • Dingmatt
            • 9 years ago

            Yep, with added fish bowl effect.

            • Waco
            • 9 years ago

            Which is how you see in real life. I fail to see the problem.

            • Krogoth
            • 9 years ago

            The problem is butthurt people who subconsciously think that bigger numbers = better, and users who were on 16:10 panels before 16:9 became mainstream and took over.

            I really don’t see the big fuss either. They are both widescreen formats.

            • Dingmatt
            • 9 years ago

            I disagree; I believe the problem stems from an inferiority complex in people who couldn’t afford the extra 120px.

            ’Cos we all know 1080 is lower down that price list.

            • GrimDanfango
            • 9 years ago

            The difference is, 1920×1200 is a professional workstation resolution, while 1920×1080 is a TV widescreen format, so 1920×1200 screens tend to be higher-quality, more expensive panels. So yeah, while there’s not a huge difference in screen space, 1920×1200 is definitely the true highest-endest awesomest of the two; 1080p sucks 😛

            I still haven’t dared to venture into the mightily expensive realms of 2560×1600 however… that’s the ultimate test of Crysis, hehe.

      • Sahrin
      • 9 years ago

      uh…not if you actually want to *play* the game. Crysis is a glorified tech demo.

        • BobbinThreadbare
        • 9 years ago

        Not really, it was a full game that was actually fun.

        Hell, the single-player is about 5 times as long as MW2’s.

          • Voldenuit
          • 9 years ago

          Yeah, had a lot of fun with Crysis in Delta mode myself. The game is a lot better than the running joke on the internet suggests.

            • sweatshopking
            • 9 years ago

            It was just too short. I beat it in like 3.5 hours on my first run-through on hard. I was like, WTF? I thought that was the first big boss, and the game was over!

            I didn’t think it was a bad game, just WAY too short, and once they brought the aliens in, it wasn’t as good.
