FX-9590 now available in retail kit, at massive discount

Remember the FX-9590? AMD introduced the hot-clocked Vishera chip in June and loudly touted its 5GHz Turbo frequency. The 220W flagship was supposed to be reserved for pre-assembled systems from boutique builders. However, it soon popped up for sale as a bare processor—and with a whopping $920 price tag. By August, the FX-9590 was selling on its own at several vendors, including Newegg, which had the chip listed for $880.

So, what about now? Funny story. The FX-9590 is down to $350 at Newegg. That’s for the bare chip, but there’s also a full retail kit for $390. The kit comes with an all-in-one water cooler, so it’s a pretty good deal, relatively speaking.

The slightly slower FX-9370 has also made its way into a retail package. That chip peaks at 4.7GHz but has the same 220W TDP as the FX-9590. You can grab it with what appears to be an identical liquid cooler for $290. AMD’s previous top-of-the-line CPU, the FX-8350, still sells for $200.

In just a few months, AMD’s 5GHz monster has gone from nearly a thousand bucks to just $50 more than the Core i7-4770K. And I still can’t recommend it. But it’s been a while since we talked about CPU price cuts, and this is evidently a pretty big one.

Comments closed

    • spigzone
    • 6 years ago

[url<]http://www.forbes.com/sites/jasonevangelho/2013/10/08/why-amds-mantle-matters/[/url<]

"AMD claims Mantle was created in collaboration with several developers, but why haven’t we heard about them? It turns out DICE and EA have a timed exclusivity agreement from a PR standpoint, but an AMD representative assured me we’ll hear about several more developers and their Mantle-powered games in November during AMD’s APU13 conference."

The FX series might be looking a whole lot more competitive in gaming in a few months.

    • rwburnham
    • 6 years ago

    AMD, melting the icebergs.

      • Deanjo
      • 6 years ago

      The real source of global warming.

        • ronch
        • 6 years ago

        Not really. Intel may be ahead in terms of lower power consumption per computer these days but there were far more Intel machines in use in the history of personal computing (they’ve always held the lion’s market share), thereby making up for the lower power consumption and definitely eclipsing AMD in terms of total dissipated heat.

          • chuckula
          • 6 years ago

          All that… and of course, Pentium 4…

          • Klimax
          • 6 years ago

Hm, that would be a funny calculation… (I suspect it would be more equal than some suspect.)

          • Deanjo
          • 6 years ago

          You are completely forgetting about Opteron powered climate simulator supercomputers that have the thermal output of a white dwarf.

            • ronch
            • 6 years ago

If you’re talking about K8 Opterons, no, the K8 architecture was pretty efficient compared to Intel’s wares back then, and still probably had less than 50% server market share (IIRC, ok?). If you’re talking about K10/K10.5 or Bulldozer Opterons, the answer is still no, because AMD’s market share shrank steadily during those architectures’ runs, and the power consumption per server probably didn’t make up for the shrinking number of Opterons shipping during that time.

            • just brew it!
            • 6 years ago

            Does that mean they can simulate a white dwarf without needing any actual simulation code?

            • ronch
            • 6 years ago

            Well, no, without simulation code it can’t simulate a white dwarf. But then, I’d much rather use it to play Crysis.

          • just brew it!
          • 6 years ago

          Continuing this rather pointless tangent for yet one more post…

          Direct dissipation of heat isn’t the real issue anyway. The pollutants emitted to generate the electricity are a bigger concern. (But since the amount of electricity consumed is proportional to the heat dissipated I guess the argument still works.)

    • DarkMikaru
    • 6 years ago

AMD FANBOY.. yes I am. A fool, no I am not. I agree with many of you who posted questions such as “what were they thinking” and “who would buy this”. I love AMD: my family’s rigs, workstations at work, home servers, customer builds, etc. I build almost exclusively AMD (unless Intel would be the better choice for a given environment, task, and budget) because it gives the performance I need at a competitive price point. But this… I just can’t justify. Even now that it’s come down to $350, I can’t shell out my hard-earned money for this thing. Especially since 5GHz should be the base clock anyway, not “Turbo Mode”. Ugh.

Honestly, I also buy AMD because I like their products. And imagine a world without AMD: everyone would be paying 400 dollars for Celerons! We need them, period. IBM ain’t coming back in and has already stated they want nothing to do with hardware; the money is in services & software. They’re right! You Intel fans should think about that. Grandma won’t be able to tell the difference between an i3 and an A4 based build. They both will get her to Facebook, Candy Crush, & email just as quickly. And you’d save 70 bucks. Anyway… I’ve ranted enough. This was just a bad idea; I hope they didn’t bin a shit-ton of these thinking boutique builders and overclockers would flock to them. 🙁 Come on AMD… we need you!!!

    • spigzone
    • 6 years ago

    “In just a few months, AMD’s 5GHz monster has gone from nearly a thousand bucks to just $50 more than the Core i7-4770K. And I still can’t recommend it.”

How about if it schools the Core i7-4770K on Battlefield 4 with Mantle and EA says to expect the same for all its upcoming Mantle games?

      • Klimax
      • 6 years ago

Only in how many minutes it will heat your room… (For anything else it lacks sufficient power, like the Pentium 4 did all those years ago, of which Bulldozer is an adaptation.)

BTW: I did heat a room with a Pentium 4 Northwood clocked to 4.2GHz. (So I would know a thing or two about that use case.)

    • ronch
    • 6 years ago

You know why this failed? Because AMD tried to fool the unsuspecting buyer by taking a good old FX-8350 that goes for just $190 (an unbeatable value, BTW), overclocking it to a speed most FX-8350 chips can reach anyway, and charging a stupid price for it. In a sense, AMD tried to charge much, much more for their FX-8350 chips. Not really a good idea unless we’re all a bunch of idiots. Maybe it could work if these things clocked at 8.0GHz or if FX-8350 chips were locked.

    Late edit – corrected typo

    • kamikaziechameleon
    • 6 years ago

I don’t know how AMD is holding it together; they were in such a better position before Bulldozer came out. Part of that was making products that performed more poorly than the prior gen, and another part is Intel always moving forward. I don’t see how AMD CPUs will compete in the near future without stupid price cuts.

      • ronch
      • 6 years ago

      Competing with Intel must be one of the most difficult things any tech company can ever find itself doing. I personally applaud AMD for having stuck this long in this oligopoly, despite having made several questionable moves in recent years.

And despite Bulldozer not really blowing Intel away, I think it’s one of the most interesting CPU architectures ever designed. Can you name any other processor that features shared front ends and a ‘speed demon’ design? Bulldozer is really some piece of work. It’s not a bad architecture at all; it’s just a little ahead of its time. Game developers are well on their way to making games use as many cores as are available to them, and Bulldozer will start to look better and better.

    • ronch
    • 6 years ago

    Why even pay $350 for these? At that price the 4770k is a no-brainer. If you would rather get AMD then the FX-8350 at $190 is a no-brainer as well.

    • spigzone
    • 6 years ago

[url<]http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen[/url<]

"We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K."

It will be interesting to see how this shakes out, FX-8350 vs Core i5 3570K, on Battlefield 4 with Mantle, which will be the first real 'future proofing' test case. I wouldn't bet on the Core i5 3570K.

      • maxxcool
      • 6 years ago

      /yoda/ bait is strong with this one… mrrrmmmmmmm hehehehehe /yoda/

      • ronch
      • 6 years ago

      I bet my computing future on the FX-8350. More and more it seems like I made a good bet.

        • DarkMikaru
        • 6 years ago

        Honestly, I didn’t need to upgrade from my X6 1090T but I wanted to try the 8350. I’m a happy user, and it gives me the performance level I was prepared to pay for.

      • End User
      • 6 years ago

[quote<]It will be interesting to see how this shakes out, FX-8350 vs Core i5 3570K, on Battlefield 4 with Mantle, which will be the first real 'future proofing' test case. I wouldn't bet on the Core i5 3570K.[/quote<]

Mantle is a new low-level graphics API specifically geared for AMD’s Graphics Core Next architecture. It has nothing to do with the CPU. You will be able to pop a Graphics Core Next GPU into a 3570K rig and play Battlefield 4 using Mantle.

        • maxxcool
        • 6 years ago

They actually burbled that that part of the code is for better extra-core utilization, which I am for, despite my COMPLETE AND UTTER HATRED OF THE 8-CORE LIE that they are selling now.

What I am interested in is how well hyperthreading will benefit. 😉

        • spigzone
        • 6 years ago

        And I’ll be able to pop a Graphics Core Next GPU into a FX8350 rig and go all Hunyakian Rapinator on your Skittlebones!!

        How does ‘It has nothing to do with the CPU’ even make sense??

          • Deanjo
          • 6 years ago

          Nothing like slappin a killer card on a bus challenged platform.

      • Duck
      • 6 years ago

The price gap between the 8350 and 3570K is bigger over here too: about $32 here compared to $20 in the USA at Newegg. More value from AMD.

      • allreadydead
      • 6 years ago

AAA+ game developers, eh?

Are you talking about the ones that made games primarily for consoles and didn’t even spend the time to properly port them to PC? The ones that haven’t given a shit about PC for nearly a decade now? The developers that got criticized endlessly for their “non-PC” approach to new game development?

Yes, they own the industry. And no, I will not take a damn word from them. Big firms and big-time developers who ruined awesome PC game franchises by adapting them to consoles can die in a fire.

        • spigzone
        • 6 years ago

        For the love of god no, I was talking about the [b<] other [/b<] ones!!

    • ronch
    • 6 years ago

This is a slap in the face of AMD, figuratively speaking, and whoever dreamed up this stupid idea of trying to one-up Intel by cranking the Vishera chip all the way to its practical limits needs to be slapped in the face as well (and shown the door), literally speaking. (Wait, isn’t he the crazy marketing guy holding the 8-ball? Whatshisname again?) And what about you guys who defended this crazy thing, saying that the FX-9590 is a great product from a great company? If you thought it was a great product, why didn’t you shell out $900+ for it? Oh, hey, look! It’s down to $340 now! Why don’t you guys buy it if you think it’s a good product?

    Look guys, I love AMD and I think AMD is an awesome company, but they’re being held down by a bunch of stupid people within the company. AMD needs to kick their a$$es out the door and start hiring really talented, un-stupid people if they want to regain the respect they’ve lost due to the stupid hires they got along the way.

      • jibkat
      • 6 years ago

“Talks with focus groups” is the guy’s name.

      • truprecht
      • 6 years ago

      The only people who should be slapped are the ones who paid $920 for a Vishera-based CPU.

        • ronch
        • 6 years ago

        Well, seeing as only stupid kids who have more money than brains will even consider buying these crazy things, I bet their parents slapped them with the credit cards they used to purchase them. Too bad they don’t make laptop-sized credit cards.

    • Unknown-Error
    • 6 years ago

    At 220W I wouldn’t take even if they gave it for free. AMD’s pathetic excuse for an engineering team can shove the entire Bulldozer based line up their collective *beep*. And Rory Read really needs to pull his head out of his *beep* and breath some fresh air.

      • tipoo
      • 6 years ago

Bearing in mind their R&D budget is several times smaller than Intel’s, I wouldn’t be so harsh on the engineers. The managers, sure. There’s only so much a tiny company can do against one that easily drops 5 billion dollars on new architectures. To put that in perspective, the entire first iPhone cost only 125 million to make.

        • BobbinThreadbare
        • 6 years ago

        Intel’s R&D budget is more than the revenue AMD brings in for one year.

      • ET3D
      • 6 years ago

      You know you can underclock and undervolt it, right? I’d take one for free any day.

      • just brew it!
      • 6 years ago

      Nit pick: It’s a Piledriver based CPU, not Bulldozer.

      And while the FX-9590 and FX-9370 were indeed ludicrous “stunt” products at launch (and remain unattractive even after this price cut), the rest of the Piledriver-based line is actually pretty reasonable.

        • Unknown-Error
        • 6 years ago

I said “Bulldozer based”. Piledriver is the same ‘Bulldozer’ m-Arch with some very minor tweaks and clock bumps. Phenom II was reasonable for its day; Piledriver, not so much. The amount of power the desktop models suck in under load is inexcusable. Don’t forget Phenom II was on 45nm. Go back to the good old ‘Athlon’ days and compare the power efficiency with Intel’s NetBurst CPUs. AMD fanbois have conveniently forgotten how efficient AMD used to be. The most baffling thing is why AMD did a 180 and went for a power-hungry, low-IPC, high-frequency m-Arch, not to mention having access only to inferior fabs, unlike Intel.

Let’s see whether ‘Steamroller’ based CPUs/APUs will join the same douche club.

          • ronch
          • 6 years ago

          Not a chip architect here, but I would think designing a narrow, speed-demon architecture is less difficult and less expensive than designing a very wide CPU architecture that would require a very sophisticated scheduler capable of adequately feeding the wide array of execution units. Intel has the money and resources to do it, AMD doesn’t even if they wanted to. When planning Bulldozer AMD needed to consider their resources carefully. I think AMD did a decent job with Bulldozer considering their available resources. It’s harder to make bold decisions when you have far less resources. And of course, if you’re Intel you could easily hire the best engineers available. Having said that, I think AMD engineers deserve much credit for what they’ve done, whether they’re the best engineers available or not.

            • just brew it!
            • 6 years ago

            But… a narrow, speed demon architecture makes extreme demands on your process tech. If even Intel (acknowledged to have the best process tech in the industry) couldn’t pull it off decently with the P4, what made AMD think they could do it?

            This is likely a gross over-simplification, but my guess is that AMD:
            * Over-estimated how quickly mainstream applications would migrate to heavily multi-threaded designs.
            -and-
            * Under-estimated how much their process tech would limit their upside on clock speed and/or TDP headroom.

            • ronch
            • 6 years ago

            Regarding your second bullet point, AMD probably hoped that circuit design advances would make up for their manufacturing prowess, or lack thereof.

      • ronch
      • 6 years ago

No way, man. If they gave it away for free, I WOULD take it… Then I would down-clock it to FX-8350 speeds and probably achieve far less power consumption, maybe 125W.

        • sschaem
        • 6 years ago

Much less; an FX-8350 can undervolt at stock to ~95W.

If those are binned, you might hit <90W at stock 8350 speeds.
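The undervolting guesses in this thread can be sanity-checked with the classic first-order rule that dynamic CPU power scales with V²·f. Here's a back-of-the-envelope sketch; the 1.5V/1.2V voltages and the exact frequencies are illustrative assumptions, not measured figures:

```python
def scaled_power(p_stock, v_stock, f_stock, v_new, f_new):
    """First-order estimate of CPU power after a voltage/frequency change.

    Classic CMOS approximation: dynamic power ~ C * V^2 * f, so power
    scales by (v_new / v_stock)^2 * (f_new / f_stock).  Static leakage
    is ignored, which makes real-world savings somewhat smaller.
    """
    return p_stock * (v_new / v_stock) ** 2 * (f_new / f_stock)

# Illustrative numbers only: a 220W FX-9590 at ~1.5V / 4.7GHz base,
# dropped to FX-8350-like 4.0GHz at ~1.2V.
print(round(scaled_power(220, 1.5, 4.7, 1.2, 4.0)))  # ~120W
```

That lands in the same ballpark as the ~125W and sub-95W figures being tossed around above, at least before leakage and VRM losses are accounted for.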

          • ronch
          • 6 years ago

          I was able to undervolt my FX-8350 to a not-very-low 1.338v using my MSI 990FXA-GD65, whether it’s the chip’s fault or the board’s, I have no way of knowing. Not sure how much power savings that grants me. Any lower than that and I get crashes when maxing all cores when encoding video.

          By the way, undervolting one notch lower than that (1.32v or something) still lets my system pass Passmark’s BurnInTest Pro stability testing app, but my system hangs when transcoding video using Any Video Converter. So I don’t know just how thorough B.I.T. Pro is.

      • clone
      • 6 years ago

      I’d buy it for $250 and be perfectly happy…. $350 is still too steep… $300 with a liquid cooler maybe but I wouldn’t be feeling very excited so much as curious at that point…. at $250 I’d be excited.

        • sschaem
        • 6 years ago

Get a $145 FX-8320 and a $40 liquid cooler: are you excited that this is $185?

          • clone
          • 6 years ago

No, I just sold my FX 8320; no interest.

I’d buy just the CPU for $250 and put my own liquid cooling assembly on it; getting the cheapest available cooler adds no value.

The nicety of the 9590 is that it’s guaranteed to run at 5000MHz. While the 8320 would be cheaper, it would also be more of a gamble; for an extra $70, the guarantee would be worth it.

    • Srsly_Bro
    • 6 years ago

    Wow. What a surprise. I never thought this day would come.

      • ronch
      • 6 years ago

      Surprise? No, I fully expected this to happen.

    • Meadows
    • 6 years ago

    Well there’s a slap in the face if I’ve ever seen one. Early adopters, eh.

      • HisDivineOrder
      • 6 years ago

      The reason it dropped so hard was because there were (next to) no early adopters.

        • Meadows
        • 6 years ago

        All for nought, because I’d still rather just get an FX 8350, if I had to pick. Although the possibilities of buying the FX 9590 and then downclocking (and seriously downvolting) it to observe differences in binning are attractive, too.

    • UnfriendlyFire
    • 6 years ago

    Absolute steal for those that live in fairly cold climates (*cough* Michigan *cough*), and especially if they have cheap electricity.

      • ronch
      • 6 years ago

      Well, if the FX-9590 is a steal at $350 for dissipating 220w and helping warm your home, imagine what an FX-8350 overclocked to FX-9590 speeds would be, costing just $200 at online stores today and probably generating roughly the same amount of heat to make you feel warm and fuzzy during those cold nights.

      • Deanjo
      • 6 years ago

      *snicker* Michigan being cold *snicker*

        • UnfriendlyFire
        • 6 years ago

        Do you live in Alaska or northern Canada?

          • Deanjo
          • 6 years ago

          Saskatchewan

    • albundy
    • 6 years ago

    ” However, it soon popped up for sale as a bare processor—and with a whopping $920 price tag.”
It popped up from a no-name vendor from who knows where, and they are still selling it at that price. That’s, like, Enquirer-accurate, lol.

    • phez
    • 6 years ago

    Dat 220 jiggawatts.

      • ronch
      • 6 years ago

      Imagine how many trips across the space-time continuum that would allow.

        • dashbarron
        • 6 years ago

        Until the carburetor fails you.

      • LukeCWM
      • 6 years ago

TL;DR

    • Ryhadar
    • 6 years ago

    I wouldn’t blame you guys if you didn’t, but is there any chance of one of these coming to the labs?

    It’s an interesting offering, even if power consumption would be ridiculous. It would be fun to find out the particulars of this chip.

      • ronch
      • 6 years ago

      Not much, really. It’s just an overclocked FX-8350 with a new model number and a ridiculous price. Cherry picked? I think not.

    • tbone8ty
    • 6 years ago

    Nice! these are fun to overclock and play around with!

      • sschaem
      • 6 years ago

Those are already overclocked.

And having that kind of luck nowadays with the FX-8350 / 8320 will be much harder…

    • WaltC
    • 6 years ago

    Of course you can’t recommend it. Who could? Many other FX chips clock anywhere between 4.5GHz and 5GHz at *stock voltage*, on air, and don’t get close to consuming a 220W TDP..;) The market for the FX-9590 was over and done pretty much before it was “launched.” I guess this is one of those things that got into the product pipeline and slipped through the cracks somehow when the FX yields dramatically improved. My 6-core Vishera is a power-sipping (comparatively) 95W TDP cpu and clocks to ~4.5GHz on stock voltage and with stock cooling ROOB, and cost me a whopping $130. The whole FX-9590 deal had me scratching my head from day one–obviously something planned prior to getting the yields with FX that AMD has been getting for the last year or so with Vishera. Certainly this thing is an odd duck–only the most uninformed n00b might fall for this thing, or buy a Mac, etc…;) [OK, shouldn’t have said that, I know.] With that kind of power draw only an inexperienced person is going to buy it regardless of price–there are too many regular Vishera cpus that’ll clock nearly as high while consuming half the wattage with a max 125W TDP, and then there are 6-cores like mine with a 95W TDP, etc.

      • sschaem
      • 6 years ago

This is what happened to the last guy that tried to run an FX at 5GHz on air:
[url<]http://tinyurl.com/mdg5t6t[/url<]

The tornado that resulted from the fan spinning almost opened a portal to another dimension...

Joke aside, the FX even on stock voltage/cooling will throttle during stress tests, and the fan will spin 'out of control' at 7000+ RPM.

And for overclocking, the FX is anything but trivial; above 4.2GHz you'd better have first-class VRMs on that motherboard. The biggest hurdle is actually APM. You can overclock to 4.6GHz, but you won't get ANY real speedup vs. stock because APM will power-limit at ~140W. You need to disable Turbo & APM to truly unlock those chips. Use a tool like AIDA's stability monitor and graph the actual multiplier while running a Prime95 "max power torture test"... even at stock, the CPU will throttle.

Don't forget those earplugs, and grab a crowbar in case you do open that portal and headcrabs come dropping in...
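The multiplier-graphing check described above can be sketched as a toy function. The 200MHz reference clock is the usual AM3+ base, but the sample log and rated multiplier here are made up for illustration:

```python
REF_CLOCK_MHZ = 200  # typical AM3+ reference clock; core MHz = multiplier * ref

def throttle_events(multiplier_samples, rated_multiplier):
    """Return (sample_index, effective_MHz) for every sample where the
    multiplier dips below the rated value.  Sustained dips under full
    load suggest APM or thermal throttling is clamping the chip."""
    return [
        (i, m * REF_CLOCK_MHZ)
        for i, m in enumerate(multiplier_samples)
        if m < rated_multiplier
    ]

# Hypothetical log: a 5GHz (25x) chip sagging to 22x during a torture test.
print(throttle_events([25.0, 25.0, 23.5, 22.0, 22.0, 25.0], 25.0))
# [(2, 4700.0), (3, 4400.0), (4, 4400.0)]
```

If the list is non-empty while the load is pegged, the chip is not actually delivering its advertised clock, which is exactly the APM behavior being described.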

        • ronch
        • 6 years ago

        Your tornado joke made me chuckle. Didn’t know you had it in you, dude.

      • Srsly_Bro
      • 6 years ago

Did your keyboard come without the Return key? That wall of text hurts my eyes.

    • Airmantharp
    • 6 years ago

[quote=”Geoff Gasior”<]In just a few months, AMD's 5GHz monster has gone from nearly a thousand bucks to just $50 more than the Core i7-4770K. And I still can't recommend it.[/quote<]

And that's okay.

      • ronch
      • 6 years ago

The only way I could recommend it is if AMD sold it for the same price as the FX-8350. Then I might consider buying it instead of the FX-8350 and downclocking it to 4.0GHz.

        • ermo
        • 6 years ago

        On the assumption that the silicon for both FX-9xxx models is cherry picked, I would be okay with paying a little extra for one.

        And by ‘a little extra’ I mean like $25 or so — it shouldn’t be too much more expensive than an i5-3570K (so around $225). And then I’d tune it for 125-130W power consumption at whatever frequency it could support.

        At that price and power envelope, it’d be decent — if not stellar — value for money, especially if you have the option of recompiling the software you use to take advantage of its non-poor suits (such as they are).

          • ronch
          • 6 years ago

          [quote<]On the assumption that the silicon for both FX-9xxx models is cherry picked[/quote<] That's the thing. These chips [u<]don't[/u<] feel like they're cherry picked. AMD just wants to charge you almost a grand for them by giving it a higher stock multiplier.

            • just brew it!
            • 6 years ago

            I pretty much agree with that. All you’re really getting for the extra $ is warranty coverage for your “overclocked” chip.

    • spigzone
    • 6 years ago

***** [u<][b<]Looks like I kinda had my head up my ass on this. My apologies.[/b<][/u<] *****

[url<]http://www.anandtech.com/[/url<]

Speaking of AMD, what's with AnandTech's red-boxed, premium eye-spot placement 'AMD CENTER'? And 'Nvidia' appears nowhere. Talk about kicked to the curb and then p/ssed on.

It can only mean AnandTech is privy to information that leads them to believe Nvidia is going to become irrelevant and AMD is going to become the Belle of the Graphics Ball -> talk of the town .. as it were. They're early-birding, associating AnandTech with AMD to establish it as THE place to go for hot AMD info. This does not bode well for Nvidia.

      • Airmantharp
      • 6 years ago

I wouldn’t read too much into it. AnandTech has been blasted for showing overt bias before, only for the ‘squealers’ to quickly quiet down as history proves them wrong. It really just sounds like they’re trying to highlight AMD’s current push on a number of fronts, which is a good thing, as we’ve noted: it will push Nvidia (and Intel, and ARM) to continue to innovate and otherwise respond. And again, no one has the combined graphics and x86 CPU (and now ARM) portfolio that AMD brings to the market, even if they’re not the best in any one particular area.

        • spigzone
        • 6 years ago

        This isn’t one of those Radeon vs Geforce ‘slanted review’ arguments that would break out occasionally.

        This is an entirely different order of magnitude. This is giving AMD THE sole brand name premiere FRONT PAGE spot. On one of the most highly respected and popular tech review sites on the web.

        It’s handing AMD incredible PR value on a silver platter.

        It’s also a ginormous Nvidia diss.

        The other Tech Review sites are going to see that and go WTF??? What does AnandTech know that we don’t?

        Like EA at the Hawaiian Event … feeling free to kick Nvidia in the nuts and pushing them under a bus.

        Pretty soon JHH will be on stage doing his Rodney Dangerfield impression … a very dark and brooding Rodney Dangerfield impression.

          • khands
          • 6 years ago

          Anandtech specifically mentioned that AMD is paying for this placement. It’s pretty much a giant ad.

      • cmrcmk
      • 6 years ago

      I’d be shocked if it’s anything more than paid advertising.

        • NeelyCam
        • 6 years ago

That’s exactly what it is. Anand said so when they started it over a month ago.

        • spigzone
        • 6 years ago

Apparently so, but it’s still a radical move, as there’s no indication on the front page that that’s the case. It’s all very tastefully done, and I’d say it’s money very well spent.

          • Klimax
          • 6 years ago

Need money? Get a tech company to pay for placement of articles about their products. Every larger site needs money, and there aren’t that many sources (ads, paid content, a store, a corporate zone).

      • WaltC
      • 6 years ago

Agree with Airmantharp that it doesn’t mean much at all. Anand has been mindlessly and shamelessly plugging the Mac for a couple of years now, and Apple is still relegated to a 5% worldwide market share versus Windows (as has been the case for the last 20+ years). In fact, if Apple’s US market share weren’t at an all-time high of 10% or so, the worldwide market share for OS X would be in the neighborhood of 2%-3%. In some countries, OS X doesn’t even manage to tick over the 1% mark. So not everything Anand pimps automatically succeeds, I guess is the point.

AMD is in an enviable spot vis-a-vis nVidia because of its position not only in the burgeoning PC market (temporarily slowed a bit), but also in the exploding xBone/PS4 console market, where AMD has 100% of the action. AMD is now in an industry-leading position, and nVidia is going to find that very rough sledding.

      But…I vote with nVidia on the notion that the PC market will continue to grow–despite its temporary falloff as the result of world-wide economic conditions–which means plenty of market share for nVidia in the discrete gpu markets if nowhere else. Tegra is a small but growing segment of nVidia’s market–for instance–but its PC gpu market remains by far–by several hundred percent–nVidia’s largest money maker. As long as nVidia remains competitive and puts out competitively priced, competitively performing gpus, they’ll be around for a long time to come.

      I personally like the company’s gpu products less than those made by AMD (and ATi before the merger), but I don’t want to see nVidia out of the game–competition is good for everyone–it’s good for Intel and it’s certainly good for AMD.

        • NeelyCam
        • 6 years ago

[quote<]I don't want to see nVidia out of the game--competition is good for everyone--it's good for Intel and it's certainly good for AMD.[/quote<]

Nonsense. It would be [i<]exceedingly good[/i<] for AMD if NVidia was out of the game. Competition is only good for consumers; companies prefer monopolies.

          • spigzone
          • 6 years ago

          True dat … and Rory Read is definitely going for the kill, he’s taking a lead pipe and blowtorch to Nvidia’s GPU business.

            • Klimax
            • 6 years ago

He’d better ensure the pipe isn’t corroded and toxic and the blowtorch is undamaged, or it won’t be NVidia on the receiving end. (And I suspect that your favorite scenario won’t happen.)

            • spigzone
            • 6 years ago

            EA is going all in with AMD for a reason.

            • faramir
            • 6 years ago

EA… the Origin- and DRM-touting idiots who declined into obscurity a couple of years ago? Nobody cares about EA.

I’d be delighted to see AMD do better, but whichever way it turns out, it won’t have anything to do with EA.

            • LastQuestion
            • 6 years ago

Ya, who cares about those nobodies that publish Battlefield, Mass Effect, Crysis, and the upcoming Titanfall? I mean, they’re only the world’s largest publisher; where do they get off thinking they’re important, & stuff.

            +1

            • Klimax
            • 6 years ago

            Free money and publicity? There were no technical reasons behind that.

            • Deanjo
            • 6 years ago

            “Going all in” would imply that they will [b<]only[/b<] be doing AMD's Mantle and that is something they are definitely not doing.

        • insulin_junkie72
        • 6 years ago

[quote<]I vote with nVidia on the notion that the PC market will continue to grow--despite its temporary falloff as the result of world-wide economic conditions[/quote<]

US PC shipments DID manage to go up 0-3.5% in the third quarter (depending whose numbers you believe). Haswell got the credit. Worldwide shipments were down 8-9% again. Asus and Acer both got killed - down 23-34% this quarter.

      • chuckula
      • 6 years ago

[quote<]Speaking of AMD, what's with AnandTech's red boxed, premium eye-spot placement 'AMD CENTER'? And 'Nvidia' appears nowhere.[/quote<]

That's because illegal AMD bribery antitrust monopoly illegal!!!*

* I'm sure that the *exact* same people who say that Nvidia has committed some crime against humanity every time a product ships with an Nvidia part instead of AMD will be 100% intellectually honest and totally agree to apply the exact same standards of ethics to AMD that they do to Nvidia.... OK, I can't keep a straight face typing this anymore.

        • spigzone
        • 6 years ago

        Nvidia acted behind the curtain in douchebaggery ways. This is totally in the open. Big difference.

        • Spunjji
        • 6 years ago

Behind-the-scenes deals to pay people off for providing “opinions” on your competitor’s products, or to exclude those products from your lineup, are a little different from having a tech website openly announce that it’s been paid to host a special section for your products.

        Wait, no, it’s a /lot/ different.

        FWIW I still don’t like either.

      • peartart
      • 6 years ago

      Under the “what is this?” mouseover on the page:

[quote<]You've landed on the AMD Portal on AnandTech. This section is sponsored by AMD. It features a collection of all of our independent AMD content, as well as Tweets & News from AMD directly. AMD will also be running a couple of huge giveaways here so check back for those.[/quote<]

Reading is hard.

        • HisDivineOrder
        • 6 years ago

        Why waste time reading when you could be writing a long post about how AMD is going to take over the world with HUMA and HSA and Mantle and TrueAudio? Seriously, reading might lead to knowledge and knowledge to a SLIGHTLY less biased opinion…

        And that would take some people away from the delusion that AMD is going to magically and suddenly pop out a solution to their current financial woes through a megahit that will catapult them from disaster to king of the industry.

        It takes a lot of energy to do all that. There’s just no time for reading.

EDIT: This is especially true for someone declaring AMD the winner of the GPU wars in a forum post on an article about how AMD’s high-end CPUs, launched as AMD’s return to the highest of the high end, have just tanked in value and are being sold exactly the way AMD said they wouldn’t be.

        The desperation to balance out the hard truth with some obsessive opinions must be overwhelming.

          • spigzone
          • 6 years ago

          Missing an aspect of what’s going on at AnandTech’s site doesn’t change my assessment of what AMD is about to do to Nvidia or that there’s squat Nvidia can do to stay relevant.

          • Spunjji
          • 6 years ago

          Never missing an opportunity to inject your own counter-bias there, HDO. 😀

        • spigzone
        • 6 years ago

        Reading is easy, comprehending is hard.

        Also, I get no mouse over pop-up related to the AMD CENTER.
