The many seasons of better benchmarks

Comments closed
    • mimotope
    • 8 years ago

    Inferior and expensive hardware did not stop Apple from being financially successful.

    • j1o2h3n4
    • 8 years ago

I’m really ignorant here, but who is that Nvidia spokesperson? I thought Jen-Hsun Huang was Taiwanese and wore glasses.

    • Mr Bill
    • 8 years ago

Remember back when AMD zealots would gnash their teeth because a fast Intel single core would absolutely whip a multicore AMD in video game frames per second (Quake in particular)? Now AMD brings out a mobile Llano CPU that whips in frames per second, and the focus partially shifts to encoding, encryption, and other CPU-intensive tasks, where sadly, even extra cores don’t seem to count as much as having three memory channels instead of two. So sad, too bad. LOL.

      • I.S.T.
      • 8 years ago

      What in blue blazes are you talking about? By the time AMD’s multicore CPUs began appearing, they had the IPC advantage in most applications, especially games!

      If you mean Core 2 based CPUs like single core Celerons, they were always clocked too low to compete with AMD’s Athlon 64 X2 budget offerings.

      So, I really have no idea what in Carmack’s name you mean.

        • Mr Bill
        • 8 years ago

        AMD may have had the IPC in some things but they did not have the “C”.

    • Abdulahad
    • 8 years ago

It’s quite interesting how Fusion is changing the landscape for AMD and has managed to bring it to profitability.

    • Arclight
    • 8 years ago

So Intel Pentium 4 (a bad chip, no one can deny) + Nvidia FX (a bad series of cards; I had an FX 5500 and I hated it with a passion) = AMD Fusion chips, meaning they are a combination of slow CPU and slow GPU?

    That sounds about right!

      • Firestarter
      • 8 years ago

      Northwoods weren’t bad chips at all.

        • Arclight
        • 8 years ago

Like hell they were….

          • derFunkenstein
          • 8 years ago

          No, Northwoods P4s were actually pretty good CPUs. It was Prescott that got botched.

            • travbrad
            • 8 years ago

Yep, the P4 and K7 were very closely matched during the Northwood time period; it wasn’t until the K8 that AMD had the huge performance advantage. I guess Intel only matching AMD could be considered “bad” by today’s standards, though.

            • Krogoth
            • 8 years ago

Actually, Northwood Bs and Northwood Cs pulled ahead of their AXP counterparts. The only things going for AXPs were that they were unlocked and the platform itself was more affordable.

            Prescott and A64 refresh changed that though. 😉

            • travbrad
            • 8 years ago

I guess Intel did have a slight performance advantage (while AMD had a slight price advantage) at that stage, but overall they were still very “closely matched”, with AXP even performing better in certain applications.

            I dug up an old article from that time period, and it looks pretty darn close to me. 🙂
            [url<]https://techreport.com/articles.x/4725/13[/url<]

      • Dashak
      • 8 years ago

      There’s a considerable percentage of desktops still being used that house P4s. Completely fine for their users, too.

        • FuturePastNow
        • 8 years ago

I know, right? I’d still be using one if bad caps hadn’t killed its mobo two years ago. Good enough for web browsing, able to handle (up to) 720p video, could play any version of SimCity. What more do I need? Low power consumption? Don’t make me laugh.

        The computer I have now can of course encode video much faster, and play much newer games at a much higher resolution. But if that old P4 still worked, I wouldn’t have shelled out to replace it. Although, my P4 at least had Hyperthreading and a meg of cache; those Celerons with 128k of L2 really were awful processors to use.

        • clone
        • 8 years ago

So long as the users don’t do much, that’s true, but P4s didn’t age well for anyone hoping for more.

AMD’s weren’t super amazing, but the old K8s and even the K7s have aged far better.

On a side note, for whatever reason (I suspect AMD’s lower power consumption helped), I found P4 boards failed more often than AMD’s; even the Asus ones had lots of cap problems. My sis-in-law is using a 3200+ K8, and my mother-in-law is still using a 2200+ K7 AMD HP desktop that still works OK despite every single cap, big and small, being bloated and several showing leaks. I said six months ago I didn’t think it’d last much longer, and yet it’s still rolling along doing e-mail and surfing, albeit slowly. She never shuts it off; maybe that’s why. I don’t know; I thought it’d be dead by now.

All of my P4s are dead now, although that may be due to the Asus P4Ps I was using. I don’t know why, but all of them are dead.

    • boing
    • 8 years ago

    My new AMD-based notebook has a HD-Internet on it.

      • SonicSilicon
      • 8 years ago

      High Definition Internet? Hard Drive Internet?
      I’m sorry, but “HD-Internet” has no meaning.
      Hmm, was it a typo? HD+Internet?

        • torquer
        • 8 years ago

        Humor, you lack it.

        • boing
        • 8 years ago

        No, the sticker on it does indeed say “AMD HD-Internet”.

          • derFunkenstein
          • 8 years ago

          Probably indicates GPU-accelerated flash for 720p and/or 1080p Youtube videos.

    • wierdo
    • 8 years ago

Just helped a friend get a Llano-based laptop; the price/performance (~$450) was perfect for his needs, light gaming on the go such as WoW and counter strike 2 etc., so I wouldn’t say the “total user experience” line is far-fetched in this situation.

    • tbone8ty
    • 8 years ago

    im not gonna lie….i like the effort…..but these are just not working here.

    stick to what you do best

      • Evil_Sheep
      • 8 years ago

      If this comic strip was a horse, I’d shoot it.

        • NarwhaleAu
        • 8 years ago

        It’s not that bad… it’s at least good enough to do some plowing / light haulage.

      • NarwhaleAu
      • 8 years ago

      You’ve just got to think of it more as witty commentary on the state of the industry… rather than an actual “comic” strip.

    • vvas
    • 8 years ago

    Heh, spot on. Of course it couldn’t be helped; AMD probably couldn’t risk using the brand new Bulldozer core for their mainstream APUs (hedging one’s bets and all that), which is why Llano has to make do with the tried-and-true (and by today’s standards, slow) K10 core. The real fun starts next year with Trinity, which they’ll probably try to push out the door as fast as they can, and then pretend that the current chip never existed. 🙂

    • sweatshopking
    • 8 years ago

i think the most important question, and one i haven’t seen asked yet is: Which spokesman would you rather make out with? i like the red tie on the ginger, but i think gingers and red might just be too much. i’ll probably go intel.

      • dmjifn
      • 8 years ago

      Intel /is/ the most Don-Draper of the three, so I’m going to have to agree.

      • MadManOriginal
      • 8 years ago

      But IME redheads are crazy-wild bitches in the sack.

        • sweatshopking
        • 8 years ago

        interesting, i’ve only ever been with my wife. if she dies/leaves me, i’ll have to keep that in mind!

        • dpaus
        • 8 years ago

        I can confirm that…. But brunettes can surprise you, too.

    • ronch
    • 8 years ago

AMD did something clever with Llano. Instead of throwing away a still-serviceable CPU architecture, they bolted on some capable graphics circuitry and extensive power-saving technologies. It might sound simple on paper, but the end result is a compelling product. And because they just integrated existing parts to create it, the cost of R&D is a fraction of what it would have been if each part of the chip had been built from the ground up.

    • Convert
    • 8 years ago

    Well done Fred, I’ve been following the comics and they are getting better and better.

    • WillBach
    • 8 years ago

Well, the A8 may win in frames / system price. It’s true that a lot of spokespeople use the phrase “total [user] experience” in areas where the experience is one-dimensional and measurably worse on their products. A company can get away with it if there are features / use cases they support, or if buying their product means that the consumer doesn’t have to buy a second product later.

As a thought experiment, it should be easy to see that a phone that integrates a decent music player with its other functions can frequently win out against an identically priced phone (even if that phone is a better phone) because it’s also an MP3 player replacement. I can see A8 systems being competitive overall, even if an A8 is much slower than a Core i7-2600K and a GTX 580, if it costs enough less.

      • riviera74
      • 8 years ago

Llano might be OK for a desktop, maybe. But I would seriously consider getting one (A6 or A8) for a DTR notebook, especially given that the power draw is not as high as it was with older AMD notebook processors. Either way, Bulldozer and Trinity cannot come soon enough.

    • Misel
    • 8 years ago

While the statements seem to be the same again, are you sure the A8 is gonna fail the same way the GeForce FX and the P4 did? I haven’t seen any reviews as bad as the ones “back in the day” when the others were released.

      • Corrado
      • 8 years ago

I think it’s largely because AMD didn’t come out and say “The A8 is gonna wipe the floor with everything! Just you wait and see!” Nvidia and Intel were talking smack like nobody’s business back then, only to release subpar products.

      • Game_boy
      • 8 years ago

The TR reviews of Prescott and the GeForce FX were actually fairly positive. Go back and read them. I have no idea how they got the terrible reputations they have today.

Phenom II has been further behind the competition on performance since its launch than either of those was, and it got GOOD reviews.

        • ludi
        • 8 years ago

        They weren’t bad products in absolute performance terms, but they did under-deliver relative to what was promised and the power/heat issues couldn’t be ignored.

        • rxc6
        • 8 years ago

In the case of the FX, the terrible reputation today might have something to do with the fact that it was perceived as bad back in the day:

        “But I do have a few definite opinions about the GeForce FX 5800 Ultra before it rides off to take its place alongside the 3dfx Voodoo 5 6000 in the Museum of Large Video Cards That Didn’t Quite Make It. This would have been a great product had it arrived six months earlier at this same clock speed with lower heat levels, a more reasonable cooler, and lower prices. As it stands, the GeForce FX 5800 Ultra is not a good deal, and I wouldn’t recommend buying one. Yes, it’s very fast, especially in current games. It’s also loud, expensive, and did I mention loud? ”

So yeah, the products weren’t bad, but I would still argue that the reviews were not positive, and in light of the competition, they were a bad choice.

      • Farting Bob
      • 8 years ago

The P4 failed? Holy crap, I wish just once in my life I could fail that well, because the thing made billions in revenue. It may not have been the best processor in its price range at launch, but it made vastly more money than its rivals.

        • ludi
        • 8 years ago

You may have forgotten about its sunrise and sunset products, Willamette and Prescott. Then there was this project called Timna right around the same time, which blew up in the RDRAM debacle, which itself occurred because of Intel’s over-reaching design plans for the Pentium 4 family.

Northwood was a great processor, and Cedar Mill remediated Prescott’s largest shortcoming, but the design was reaching EOL at that point.

        Intel made money anyway because that’s what Intel does. But they’ve done better.

          • Krogoth
          • 8 years ago

Timna = pure R&D project that was the spiritual predecessor of Sandy Bridge (a system on a chip). Intel tried to take a P6 design (Pentium Pro through Pentium III) and throw in a memory controller, I/O, and graphics. It is likely that the design was simply too complex and expensive to be economically feasible with the process tech at the time. The RDRAM deal was probably the final nail in the project’s coffin.

            • ludi
            • 8 years ago

[quote<]It is likely that the design was simply too complex and expensive to be economically feasible with process tech at the time.[/quote<] I'm assuming you just made that up from scratch. Timna was on track to be a mainstream product but was tied to a fully integrated RDRAM controller. By the time it became clear that RDRAM would not come down in price, thus excluding Timna from its intended lower-end markets, the product was too far along in development to justify a redesign. Intel attempted to pair the device with an external memory translator hub, but the MTH had serious bugs that ultimately ashcanned the entire project.

    • Pax-UX
    • 8 years ago

It’s like cars: they all do a good enough job, so we’ve started worrying about fuel efficiency or the extras; nobody is overly concerned with the engine any more. This is what happens when a product no longer brings anything unique to the table.

      • jorjxmckie
      • 8 years ago

      While mainstream users may not care, enthusiasts care, and many mainstream users will ask their enthusiast friends for advice when purchasing new computers.

        • OneArmedScissor
        • 8 years ago

        Then the “enthusiast” says, “Dude, get this $1,000 Core iX desktop! It has 10% higher single threaded performance!” and the “mainstream user” learns to never do that again and heads over to Best Buy to grab whatever laptop is $500.

          • thanatos355
          • 8 years ago

          It’s more likely that the enthusiast will point their friend towards a 2500k, a 2600k, or an 1100t. Only a fanboy or a wannabe would point to a $1000 processor as a good investment for grandma, Uncle Bob, or Neighbor Joe.

            • OneArmedScissor
            • 8 years ago

            WHOOOOOOOOOOOOOOOOOOOOOOOOOOOSH

            • thanatos355
            • 8 years ago

            I just saw the quotes on enthusiast…

            • BobbinThreadbare
            • 8 years ago

            It wasn’t a funny joke in the first place

            • flip-mode
            • 8 years ago

            I don’t know if I’d call that a “joke”, per se. There is no punch line, no irony. My take is that he was just pointing out that enthusiasts frequently float around outside the realm of practicality.

            • cegras
            • 8 years ago

            Which is a tired, old stereotype.

            • dpaus
            • 8 years ago

            So, these two enthusiasts walk into a bar. One says ‘Ouch!’ and the other says ‘Wasn’t that supposed to have been raised?’

        • bimmerlovere39
        • 8 years ago

And just like cars: the enthusiasts will recommend something that isn’t a “big” brand (Intel, Toyota/Honda) in favor of something that offers a better overall value (AMD, Mazda/Subaru/etc.). Some will listen, but a lot will still go for the brand name they know. Sadly.

      • Krogoth
      • 8 years ago

Indeed, CPUs have become commodity products in the eyes of the mainstream.

There’s no killer mainstream application that needs tons of cores and cache. Actually, it has been this way for years; it’s just finally sinking in.

PC gamers are hard-pressed to justify newer hardware platforms when consoles dictate the baseline. You don’t need $399+ CPUs and $299+ GPUs to get a quality gaming experience.

      The only market where more CPU performance still matters is prosumers. That’s what Bulldozer and Sandy Bridge-E are going to tackle. The early-adopters who are going to jump on their platforms are going to play “beta-tester”. 😉

        • SPOOFE
        • 8 years ago

[quote<]There's no killer mainstream application that needs tons of cores and cache.[/quote<] There's no NEW killer mainstream app that needs etc. etc. Photoshop will, and has for a long time, use whatever you throw at it... and when you're waiting almost a full minute for a weak Unsharp Mask to finish processing, you start wishing for more cores.

          • Firestarter
          • 8 years ago

          Lightroom is so much better on a fast multi-core PC!

          • Krogoth
          • 8 years ago

Photoshop = workstation-class application.

It is certainly not a mainstream product (geared for the average joe), considering its licensing costs.

We are talking about gamers and average-joe users here. You will be hard-pressed to find an application geared for them that genuinely needs the power of high-end CPUs to obtain acceptable performance.

AMD and Intel are both fighting uphill battles. The market for faster and faster CPUs has been diminishing over the years; it is becoming more and more niche. That’s why the new trend is tablets, smartphones, and netbooks: the focus is on making platforms small and portable while maintaining enough performance for mainstream tasks.
