Remember that 5GHz FX-series processor teased by Gigabyte? Well, it’s now official. Earlier this morning, AMD announced its "first-ever" 5GHz processor, the eight-core, Piledriver-based FX-9590.
AMD has also unveiled a slightly slower offering, the FX-9370, whose eight cores are clocked at 4.7GHz.
At least, I think that’s how fast these things run. The AMD announcement is awfully light on specifics, and while it refers to the FX-9590 as a 5GHz processor and to the FX-9370 as a 4.7GHz one, it also describes those clocks as "Max Turbo" speeds.
AMD says both the FX-9590 and the FX-9370 will be "available from system integrators globally beginning this summer." There’s no mention of retail availability—or of pricing, for that matter.
Nor is there any word about power envelopes. For what it’s worth, the rumor mill quoted a 220W TDP for the 5GHz model. That does sound a little high, but keep in mind the fastest Piledriver-based FX chip available today, the FX-8350, has a 125W thermal envelope and only hits 4.2GHz in Turbo mode. Achieving an 800MHz increase over that no doubt requires certain tradeoffs.
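For what it’s worth, the rumored figure passes a rough sanity check against the usual dynamic power relation (P ∝ f·V²). The voltages below are illustrative guesses, not AMD specs:

```python
# Back-of-envelope CPU power scaling: dynamic power goes roughly as f * V^2.
# Voltages here are illustrative assumptions, not official AMD figures.

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale a baseline power figure by clock ratio and voltage ratio squared."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# FX-8350 baseline: 125W at a 4.0GHz base clock, assuming ~1.35V stock.
# Hypothetical 4.7GHz base clock at an assumed ~1.5V:
estimate = scaled_power(125, 4.0, 1.35, 4.7, 1.50)
print(f"{estimate:.0f}W")  # ~181W from dynamic scaling alone
```

Dynamic scaling alone lands south of 220W; leakage, which climbs with both voltage and temperature, could plausibly account for the rest.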
We’ve asked AMD for some clarifications about the precise specs of these two products. We’ll update this story when we know more.
Update from Scott: In my review, I overclocked the FX-8350 to a Turbo peak of 4.8GHz. You can see how it performed. Notice that we couldn’t get the base clock speed past 4.5GHz without overheating or instability. AMD only quotes Turbo peak frequencies for these new CPUs, so I expect the limitations haven’t changed much on this silicon since our review. Also, notice that the total system power draw in our test workload was 262W with this overclocked config. That lends some credence to the rumors of a 220W TDP for these parts. Don’t expect to remove that kind of heat from a CPU quietly without some fairly exotic water cooling.
depends on what you define as a “cost,” because with the introduction of fracking, natural gas has become one of the dirtiest, most destructive, environmentally-damaging-on-a-GLOBAL-scale forms of energy around.
But yes, it is cheap.
I bought mine because it was the only platform that supported 8 sticks of RAM, and because of other advantages at the time that were unique to the LGA2011 platform. I have been rewarded 100 times over for my choice.
NeelyCam right?
Minus 50!
but at the end of the day, pushing the chips to the breaking point on real-world silicon is one of the fastest ways to identify real-world problems in the foundry, the process, and on die.
of course, these limited runs were more necessary in the 1GHz days than today.
the 3930K will probably drop in price around when this comes out anyway, to make room for the 4930K in four months
Waaaait, back circa P120 & P166 times you could only change the multiplier (although freely, on every chip) but couldn’t change the voltage, AFAIK.
Also back then the TDPs were tiny compared to modern times. 12.81 watts for the p120 for instance. The small heatsink and fan they came with should have stopped the chip getting really hot.
And then you overclock the Intel CPU… and you’re back to the norm of AMD trailing.
You do understand that even at 5GHz and 220W, AMD’s CPUs are hilariously slower than Intel’s for single-threaded tasks, right?
That’s most people’s issue, I feel. We all know that a GeForce Titan or Radeon 7970 GE burns 250W of power, but at least they’re massively faster than GPUs that consume half the power.
If a desktop user wants ‘raw power’ they’ll be running the CPU that best suits their workload.
And for most users, that CPU will be a K-series Haswell chip.
Thank you for swiftly proving that you know nothing.
Why? AMD hit 130nm not long after Intel, and their 90nm process came online with much better performance. The pain started at 65nm and below, so the 2005-frozen fanboy might well believe that AMD’s expected success would have propelled their partnership-based process tech on to even greater heights.
Pedantry aside I thought this was pretty funny.
C’mon guys, 220W *is* ridiculous. Long-term AMD fans would agree because we were busy mocking Intel for pulling the same shit in the Netburst days…
Indeed. Hardware enthusiasts in general have long been oddly conservative about CPU temperature, too. 60 degrees under load seems to have been the general consensus, even though that’s far below the rated operating temperature of many chips.
Personally I’ve not had a CPU die on me that wasn’t physically damaged (poor Athlon XP-M). Motherboards, on the other hand…
Yes; electromigration gets worse with temperature, even if the voltage is low (e.g., with a really bad cooling solution). And cycling between high and low temps can cause thermal expansion related structural damage (think NVidia’s “Bumpgate”)
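The temperature sensitivity described here is commonly modeled with Black’s equation for electromigration lifetime. A minimal sketch; the activation energy and current-density exponent below are generic textbook-style values, not numbers for any particular process:

```python
import math

# Black's equation: MTTF = A * J**(-n) * exp(Ea / (k * T)).
# Ea and n are material/process dependent; these are illustrative values only.
K_B = 8.617e-5  # Boltzmann constant in eV/K

def mttf_relative(temp_c, j_rel=1.0, ea=0.7, n=2.0):
    """Median time to failure relative to an arbitrary baseline (A cancels in ratios)."""
    return j_rel ** (-n) * math.exp(ea / (K_B * (temp_c + 273.15)))

# Lifetime at 85C relative to 60C, same current density:
ratio = mttf_relative(85) / mttf_relative(60)
print(f"{ratio:.2f}")  # roughly 0.18 -- a 25C bump cuts modeled lifetime ~5x
```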
Give your head a shake; yes, it uses more power, but we are talking a difference of $12-$18 a YEAR, max.
Power issues?? Give your head a shake, you newbs. We are talking a difference of about $12-$20 a YEAR!! These chips are for desktop users who want raw power. An overclocked FX-8350 beats a stock Haswell i7; yes, it needs cooling and uses a tad more power. Not sure if you guys are just ignorant or Intel fanboys?
This is the #1 reason the PS4 and Xbox One switched to AMD CPUs AND graphics cards: bang for buck, period. This latest chip is for people who are dubious of overclocking; if you’re not, the FX-8350 can run stable at 5.4GHz on a Zalman triple-fan air cooler with zero problems, as seen in multiple pro setups.
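For what it’s worth, the "$12-$20 a year" figure is easy to reproduce with ballpark inputs (the wattage delta, daily hours, and electricity rate below are all assumptions for illustration):

```python
# Rough annual cost of an overclocked FX's extra power draw vs. a stock i7.
# All three inputs are illustrative assumptions.
extra_watts = 95       # assumed extra draw under load
hours_per_day = 4      # assumed hours at load per day
rate_per_kwh = 0.12    # assumed electricity rate in $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
annual_cost = extra_kwh_per_year * rate_per_kwh
print(f"${annual_cost:.0f}/year")  # ~$17/year with these assumptions
```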
Yeah, if I didn’t already have a 2600K at 4.4GHz.
Higher voltage, mostly. Electromigration is still a problem, as far as I know.
true that but then ppl don’t want to put power consumption into context.
power consumption is just another tool for some to hate with which is why I mentioned in another thread I was more interested in the mobo’s that officially supported 220watts than the cpu.
1st part, you are absolutely correct and one could argue that it was a sign of things to come regarding those who focus on mhz vs actual performance.
it doesn’t change that AMD was first to 1GHz. Intel was first to 2 and 3GHz, but they did it using a garbage processor that offered poor IPC; AMD then reached 4 and 5GHz first with a CPU that, again… offers weak IPC.
the constant lesson being that focusing on MHz costs performance.
beyond that, the entire Slot A line suffered from cache errors that kept them from being validated for enterprise apps and CAD apps. Those issues were fixed once AMD went to on-die cache instead, which doesn’t change the dubious first-to-1GHz accomplishment.
looking back, did a Slot A 1GHz Athlon outperform a 1GHz P3 or not?… doesn’t matter, Intel came out too late either way.
A single overclocked high-end GPU (e.g., an HD 7970) will draw more than an overclocked FX.
You can’t have it both ways. First you argue that because this FX-9590 is some sort of halo product, it might reasonably sell for $500. Well, guess how much an i7-3930K sells for? Not a whole lot more. Since when do Corvettes cost as much as elite European yachts? Even if it is a ZR1…
Addendum: Let the [H] hate flow through you!
Remember that AMD’s manufacturing processes are fundamentally different from Intel’s, in that AMD (now GloFo) improves the process over the course of its lifetime and Intel does not. I’m not saying that the FX-9590 won’t be a hot mess, but if Vishera could hit 5GHz on an overclock, and AMD managed to tweak its process over the last 9 months, I don’t think 5GHz as a turbo clock would be unreasonable in a normal TDP window.
Somehow I don’t see Jarred’s conjecture about the price and TDP of the FX-9590 as being as contentious as some people here do, since all he’s claiming is that he thinks AMD actually wants to sell these chips.
All sites have variation in their editorial views and quality, which is why individual authors are given credit for the pieces that represent their views. Even TR – a site with very consistent writing, in my opinion – has some pieces that are better written and thought out than others.
I just think y’all are being b****y for no good reason. 🙂
IIRC, Bulldozer and its derivatives were designed to scale to high clock speeds, but much like Intel’s Prescott, heat got in the way. As AMD refines its process, I’m not surprised that it can push its high-end CPUs to high clock speeds, along with the resulting drop in efficiency.
For laptop users, temps in the 70s-80s are considered fine…
I think this really stems from there being no real data or proof on how temperatures hurt longevity. Everyone knows higher temperatures hurt longevity, but how much is the real question. Silicon has changed over the years, but people have also gained more experience working with it. Figuring out how far you can push things before you end up in the danger zone (which is still very arbitrary) is still very much up in the air.
Was this with increased voltage or increased temperatures?
It’s entirely possible for different chips to tolerate higher temperatures differently (as well as voltage). One of my friends described modern chips as being very flexible and tolerant compared to older silicon. It’s really hard to go wrong with them unless you’re really trying.
XDR is alive and well.
AMD had to cripple their off-die L2 cache to get to 1 GHz though. They had to run it at 1/3 speed and that killed performance scaling. Coppermine was fine at 1 GHz, it was 1.13 GHz that they failed to validate properly for some mysterious reason.
sorta, not really. AMD was first to 1,000MHz, but the Athlon was designed to hit 1,000MHz when it was conceived and didn’t need excessive voltage or a special motherboard to reach it.
Intel’s P3 didn’t fare so well in comparison, but then the P4 overcame that and Intel reached 2GHz and 3GHz first… did AMD hit 4 and 5GHz "officially" first?
NeelySweep detected.
release the kraken
[quote<]AMD will now and forever be the first to hit 5GHz. It has marketing value.[/quote<] For what it's worth, IBM did 5GHz with POWER6 about five years ago, and they fabbed that sucker on 65nm.
[quote<]This is a cool product. It's like a crazy factory souped up american muscle car - powerful and loud. A Man's CPU.[/quote<] Silicon doesn't quite have the same sex appeal though.
If only Ford hadn’t stopped at V10.
Anyone who looked at the fabrication process would know.
A clever move perhaps. It allows them to mostly dodge the question of value, and especially the question of power usage.
I prefer the pear ones
Should I have said "This is a [b<]hot[/b<] product." instead? I guess that'd be more factual. I think it looks sexy. I'm not buying it, though, because I'm no longer in that market - I don't have a giant gaming rig full of fans with LED lights anymore.
With an Intel i7-39** you get more cores and extra things at a reasonable TDP, while having nice overclocking… (like 40 lanes of PCIe 3.0 and quad-channel memory)
The corpse of Rdram is dead and buried, thank goodness…;)
I think you won’t have trouble running [url=http://i40.tinypic.com/eq5zll.jpg<]this[/url<] then..
Um, dude, that was a joke.
[quote<]This is a cool product.[/quote<] Oh my god, Neely! Are you feeling alright???
This is clearly a fanboy product. It is neither amazing in performance nor practical in any application. It’s the same thing as a low-end GPU with 4GB of GDDR3 memory: pointless.
AMD. Intel would never call a module with a pair of CPUs that can only execute one core at a time ‘dual core’.
Too crunchy… but I do love the bitter aftertaste… (of desperation, that is)
I have successfully booted Win7 on my FX-8350 at 5.25GHz. Managed to run a benchmark in FryBench, but it’s not really stable. I have an Asus Sabertooth and have found that overclocking through the BIOS/UEFI works better than AMD OverDrive etc. Time consuming, though. I haven’t got turbo core to work without APM master mode, so when all cores are loaded, it throttles like crazy. Maybe I should test the OverDrive thing more.
And yeah, great review as always 🙂
You will basically be buying a factory overclocked part.
In Damage’s results I found it interesting that the part locked up at around 55C. I’ve had my 2600K run stable at 85C. My theory is that the SOI process makes it more difficult to cool the chip – if the external temperature is 55C, the "internal" temperature (= transistor temperature) is much higher, messing up the operation. Meanwhile, in a bulk process (like Intel’s), there is no insulator between the transistors and the heat sink, so transistor temps are closer to the external temperature. So if temp monitoring tools are reporting 85C, it’s still OK because that’s roughly the temp of the transistor?
Does this make sense…?
That’s a V12 – it’s only for the European elite and costs more than a yacht
I’m wondering if there will be any headroom at all for overclocking when the "stock" clock is 5GHz, or if you will basically be buying a factory overclocked part.
If history is an indicator, it probably will. I’ve personally seen two chips go flaky after extended overclocking. One was a whopping 13MHz overclock, from Pentium 120 to 133. The other went belly up after a couple of years at 208MHz (stock 166).
This is like AMD fully exposing their aspirations to be the company that brings back the Netburst CPU’s in all their exploding, flaming glory. AMD is doing what Intel had planned to do back then, but couldn’t. If AMD can’t have the performance market or the low power market, then they will take Intel’s plans from years ago and THEY WILL DO WHAT INTEL COULD NOT!
Now where is Rambus when you need them?
I had a Prescott Pentium D 3.0 at 3.75, and a calculator told me it was drawing 185 watts at full tilt. I believed it; very hot and noisy, but stable!
Why are you going to a smartphone, tablet, and Apple device review site for info on AMD processors?
Should have delayed the Northern Hemisphere release till Thanksgiving. Time for a heatsink/circulator to be designed that functions as a winter-time room heater.
Do you think that nVidia wants anything to do with the AMD train-wreck? Access to x86 is probably the last thing nVidia wants at the moment. As for jumping on the console gravy-less train, nVidia got a nasty taste of that with the Xbox in 2002.
I was wondering about that- it seems people are getting much more comfortable with higher temperatures, at least as reported by the CPUs.
I know this all used to be tied to stability such as in a successful Prime95 run, likely 24 hours, where most settled on ~60c as the ‘max’ temperature CPUs should run at. But if these things are stable closer to 90c, who cares? Is it really going to die any sooner?
There’s a reason we’re here and not there :).
I mostly don’t bother with Anandtech; I read the reviews I care about for the additional data points and situational awareness, not because I like their stuff – some of their articles border on [H] editorial quality (i.e., no editor at all). And yet the [H] still puts out decent reviews if you can get around the grammar and spelling (and sentence structure, ad nauseam).
It is a recommended max. It doesn’t start throttling itself till around 100C, just like with Intel chips (although this can change depending on your board).
I mean Geoff was running his 4770k hotter and considered it stable.
lol, it will match a Haswell at 4.0 while using twice the power. Sounds like a real smart choice…
Must be some pretty revolutionary stuff from GloFo, maybe their fabled fully depleted SoI or something.
My FX-8350 will run at 4.4GHz at its stock voltage. To get it to 4.6 I have to raise Vcore beyond the stock turbo voltage, though, and at 5GHz it isn’t really stable even at around 1.55V. Easily cooled with a triple radiator, though. But as I increase the Vcore over 1.55V (to try to make it stable at 5GHz), the power consumption of the system at full Prime load goes just shy of 400W – a good ~150W over stock.
So it must be cherry-picked silicon that tops out with just one module (is just one core possible?) turboing to 5GHz. Otherwise I have a hard time seeing it on the same silicon we have today.
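The jump described above roughly tracks f·V² dynamic scaling, with leakage making up the difference. A quick check; the stock Vcore and stock CPU load draw below are assumptions, not measurements:

```python
# Dynamic CPU power scales roughly as f * V^2; leakage adds on top of that.
def dynamic_scale(f0, v0, f1, v1):
    """Multiplier on dynamic power going from (f0, v0) to (f1, v1)."""
    return (f1 / f0) * (v1 / v0) ** 2

# 4.0GHz at an assumed ~1.35V stock -> 5.0GHz at the reported ~1.55V:
mult = dynamic_scale(4.0, 1.35, 5.0, 1.55)
print(f"{mult:.2f}x")  # ~1.65x

# Against an assumed ~150W stock CPU load draw, that predicts ~97W extra;
# the reported ~150W delta suggests leakage (which grows with voltage and
# temperature) supplies the remainder.
print(f"{150 * (mult - 1):.0f}W extra predicted")
```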
The only relatively popular tech site I’ve seen that has consistently good “editorial quality” is TR.
Perhaps the base clock is 4.2GHz.
You know what? I like this CPU.
I didn’t think I would, and it’s taken me some time to digest its ‘usefulness’, but it’s growing on me (not that I’d ever buy it; no AMD CPU is fast enough).
Basically, it’s an FX-8350 with a guaranteed overclock, and the TDP rating to match. What’s not to like here? I mean, for those things that the FX-8350 is good at, like Bensam’s streaming and what-not, it’s a great idea; and hell, if it’s cheaper than Intel’s K parts while providing full virtualization support along with ECC, it’d be a steal for workstations where you’d otherwise not want to risk overclocking.
AMD can set the price at $230.
AMD can’t change the laws of physics though.
I actually agree with almost all those points.
125w and ~$230
An FX-8350 at stock clock, undervolted, is about 85W TDP; it’s not hard to imagine a low-volume, cherry-picked FX-9590 running at a 20% higher clock and 10% higher TDP.
And for the price, AMD cannot price its chips above Intel when performance is so much lower.
Remember, 5GHz is turbo only.
This is mainly a try at boosting the FX ‘single threaded’ performance.
That’s what I’ve thought more than once
Nah, I heat with natural gas; it’s about 25 times cheaper than using Ontario Hydro to heat your home.
Actually it was snowing here on the 3rd 🙁
…the two aren’t related?
Yup; and their editorial quality has gone straight downhill since Anand stopped asserting direct control.
I still like the articles he writes himself, but these other guys… some of them need help.
I could hear that damn FX5800 Ultra in a crowded conference hall at the Quakecon before it hit retail. Room full of nerds swooning to talk to Kyle and Anand (to think!) after they gave away a bunch of parts and did a Q&A.
That thing was loud. And it was slow. So much funny.
But the buck stops somewhere
IT GETS BETTER
Jarred in the comments: [b<]Worst-case, I'd expect TDP to be 135W.[/b<] LOLOL
So the FX-9590 is meant to look like it can beat the i7-4770K at stock in raw performance. It’s a shame that it still falls short of the i7-3970X where the number of threads is king.
Reminds me more of the Prescott and Smithfield-based Pentium 4 Extreme Editions back in the day.
So, Cider, which is a legit beer, made with another fruit, called apple. Interesting, that.
Go home AMD, you’re drunk.
I managed to hit 5GHz with my 8350 at 1.5V with water cooling, but the temperature under load was 60C. That’s a little too close to the 62C cutoff, so I backed it down to 4.8GHz.
How can AMD get it (the Jaguar SoC) and not get it (a 220W CPU) at the same time?
Anand didn’t write that article, Jarred Walton did.
I was kind of bummed when I built my personal Sandy Bridge system and found it ran so cool my PC didn’t double as a foot-warmer under my desk anymore. Since I have electric heat and it’s on about 7-8 months of the year — and I don’t have a/c — that means its lower power consumption probably costs me a slightly higher electric bill (since I’m now having to warm the whole room instead of just my feet).
I want that whole article archived as proof that anyone who says Anand has some evil bias against AMD is on crack*.
1. They flat out say that a 5GHz Vishera will have a 125 watt TDP just like the FX-8350.
2. They claim that AMD is going to go through this whole process to charge a whopping $10 more for the FX-9590 than they do for the FX-8350.
* Actually, maybe there is an evil bias: Anand spreads rumors that are so ridiculously favorable to AMD to create a hype bubble. Then when the truth comes out, the same truth that was brain-dead obvious to everyone else, AMD looks worse because of the artificial hype.
click the link, not a CPU in sight 😉
Anandtech: [i<]"We expect them to replace the existing FX-8350 and FX-8320 eventually, but they may initially launch at a higher price depending on how AMD and their partners feel they stack up against the competition. Considering the pricing and performance offered by Intel’s Haswell i7-4770K at $350, most likely the FX-9590 will be close to the $200 mark."[/i<] I'd like to know what drugs they are on.
This [i<]kinda[/i<] happened to me when I put together my first computer. The short of it is, I didn't do nearly enough research.

Back in 2005 I was headed off to college and wanted a new PC. I had heard all of my friends -- who didn't build PCs per se, but knew a good PC when they saw one -- laud AMD CPUs and ATI Radeons. I figured that since all my friends trumpeted their performance, AMD and ATI must be rolling in money and I wouldn't be able to afford them. I was very happy with the performance of my previous desktop machine, a Gateway Pentium 4 @ 1.5GHz and a GeForce MX 2 -- if I recall correctly -- and decided to build my own around a Pentium 4 and a new Nvidia card, figuring I was showing them some deserved consumer loyalty in the process. I also thought that since AMD wasn't Intel, building the computer and installing Windows was going to be vastly different from what I was used to, so I never even looked. I found a Pentium 4 for $175 @ 3.0GHz with Hyper-Threading (WHOA! They get up that high now?) and a 7600GT for $150 that fit into my price range. All told, I think I spent $900 on that machine thinking it was a great deal. Keep in mind, my last desktop was from 2001 and I figured $1800 was an entry-level price, so I thought $900 was a steal.

I mostly built my PC with the knowledge that my friends had (not much). I had also re-installed Windows XP dozens of times and re-built my PC from scratch a couple of times out of curiosity. After I successfully completed my build I had minor issues that I couldn't figure out (like why my 250GB hard drive only formatted to 137GB). I googled my problems and found that, hey, cool! There are people out there that actually review and benchmark hardware. I wonder how what I bought performs -- OH MY GOD I BUILT THE WORST COMPUTER EVER!!!

Well, not really, but my mistake propelled me into the world of computing hardware and I dove right in.
I know my life is very different as a result (I work as a web developer now; I went into college thinking I would be a biology teacher). What I'm saying is: you probably shouldn't take my hardware advice. 😛
This, exactly this. It doesn’t matter how absurd the chip is in common-sense terms; the chip makes sense as a niche product, and at 5GHz it makes sense in psychological marketing terms.
If AMD sells 1,000 of these chips at $500 a pop, good on AMD. If AMD sells some multiple of that number, even better.
People who buy these chips should very well know in advance exactly what they are getting: a chip that performs very decently and probably beats stock-clocked Intel chips in a number of benches, but that uses a lot of electricity and generates a lot of heat while doing so. That’s the deal, take it or leave it. Most people will certainly leave it, but there’s certainly a niche market that will buy this.
There has been talk of the AM3 platform being retired and the APU platform being moved front and centre, could this be one last hurrah for AM3+ before AMD puts it out to pasture?
There’s a joke in all this somewhere about sending it out with a bang…
Hawt.
While I was looking into OCing my 8350, a lot of people were able to hit 4.8-5GHz with 1.5V (non-turbo), whereas you guys were only able to hit 4.5GHz. So TR's OCing results may not be representative of a cherry-picked chip.
This may have more data points: [url<]http://www.overclock.net/t/1318995/official-fx-8320-fx-8350-vishera-owners-club[/url<] From what I've read online, though, the 5GHz model was 4.8GHz stock with a 5GHz turbo. Is TR attempting to get its hands on one of these for benchmarking? >:D
Make life take the lemons back! Get mad! I don’t want your damn lemons, what the hell am I supposed to do with these? Demand to see life’s manager!
These might be popular in Canada since they double as whole-house furnaces.
I think they should start using GUIDs.
That’s an excellent question and a point that I’ve raised many times:
It’s 2005 (maybe 2006, right when AMD said it was buying ATI and before the Core 2 was really out and about): a group of AMD fanboys get frozen in a block of ice and don’t see any new developments until last week, when they get thawed out.
As an experiment, you give all of the AMD fanboys full access to the paper technical specifications of Haswell and the FX-8350 or even FX-9590. Further, you give them test systems to play with and benchmark. The only caveat is that you hide the identity of the manufacturer from the fanboys so that they don’t know who made the chips, just how the chips perform. Then you take a survey of the fanboys to have them vote on who made Haswell and who made Piledriver.
Tell me with a straight face that all those AMD fanboys will pick the FX-8350 (or even FX-9590) as being made by AMD… it ain’t gonna happen. I’d put it at 99% guessing wrong, with the other 1% accidentally voting right.
Funny how you could say that after Netburst died a long time ago.
Wait, so if we were told the specs and benchmark scores of today’s Intel chips relative to this, who would you think would’ve been producing Haswell chips?
What I pay for A/C in the summer is offset by not having to buy heating oil in the winter.
I see you’ve lost a few billion bucks already. Must be tough.
Yeah, I agree. Even without considering a marketing or bragging-rights effect – and, granted, that’s certainly the main thing here – there are probably some people with workloads that resonate well with the Piledriver architecture, and if a competent OEM picks those parts up, they’ll get a fast machine with a full guarantee and probably hand-picked silicon.
AMD’s model numbering scheme makes me wanna go to the pool and drown myself.
[quote<]A Man's CPU[/quote<] This is Jerry's CPU!!!
Wow, I never thought of it that way. You’re right. Selling an overclocked part forever gives AMD bragging rights for having reached that clock speed first in an official product. Back when both Intel and AMD raced to hit 1.0GHz, power wasn’t such a big issue, but I’m sure both companies really squeezed every bit of juice from their chips, which probably jacked up their power consumption numbers too.
Right now, according to my UPS, my system (5GHz 8350, Radeon 7870, 24″ LCD) is drawing 138W. CnQ is enabled, so the CPU speed is bouncing between 1.4 and 5GHz.
Thank god.
Just hotter, not noisier. The noise would come from the fans. The CPU itself is dead silent as far as we can sense.
/mini-sarcasm
It’s the same as Intel Extreme edition chips. There’s no logical reason to buy one… people do it anyway.
[quote<]Intel with their fancy highly tuned efficient V6s and subdued exhaust tones need not apply.[/quote<] I have four numbers and a letter for you: 3930K.
Now, if only the AMD-Nvidia merger came through…
See, this has a 5GHz stock turbo clock. Of course IB/HW will have to be at stock clocks too to make the comparison fair. And yes – this chip would kill 4770k with its eight threads in those multithreaded benchmarks.
This is a cool product. It’s like a crazy factory souped up american muscle car – powerful and loud. A Man’s CPU. Intel with their fancy highly tuned efficient V6s and subdued exhaust tones need not apply.
Of course it is. I won’t be surprised if AMD produces only five of these. Only for the rich.. and too lazy to overclock a very easy to overclock chip.
Hey guys, I fell asleep for ten minutes and dreamt that this CPU has a 55w TDP!
/sarcasm
Furnace is right. Use one of these and you’ll feel like you’re in a furnace.
And as if that’s not bad enough, just wait till your power bill arrives.
But never mind all that: The logo is awesome! /s
If they can bring it in at 195W, or even 199, it’ll be more marketing fodder. The fact that they neglected to mention the TDP in the announcement suggests strongly to me that they can’t, and that it’s something north of 200W.
So, you’re trying to apply logic to overclocking fanbois? Should be interesting…
NeelyCam said it’s a pure marketing/halo play. Was NeelyCam right again? Yes, he was.
World’s First [u<]9[/u<]-core Desktop Processor 8 cores was so... 2011.
Indeed…
Yeah, and those same folks know how to overclock, so they can easily get an FX-8350, OC it, and arrive at the same performance and TDP levels for a lot less money.
I really can’t understand the point of these models. Why would anybody pay big money for what are essentially overclocked Vishera chips? If the TDPs were 95W I would’ve been impressed, because it would mean these are really cherry-picked parts. But as it is, most FX-8350 chips can probably hit these clocks and suck as much power, so there’s really nothing special about these chips apart from a higher model number and AMD’s ‘official’ nod that you own a 5.0GHz CPU, not some ordinary FX-8350 that was just pushed beyond official spec.
5GHz Ludicrous Speed… errr, Ludicrous Heat. So I’m back to waiting to see what the next generation of AMD CPUs brings
edit: (not that I thought this was a generational leap…)
Yeah, if I’m a 1-percenter, I’d not be buying AMD for anything.
That’s all fine and dandy if they’re making lemonade out of lemons. But I fear they’re trying to make beer out of sour apples!
Okay, I’ll shut up now. 😉
A grow op requires a continuous draw of a couple of kW unless you use special LED lights. There is also a smelly exhaust from the place.
Two good gfx cards in crossfire/sli will consume much more than an overclocked fx.
Try harder next time.
An FX-8350 at 5GHz is going to be faster than the FX-9590.
That’s because the base clock of the FX-9590 is only 4.7GHz (the 5GHz figure is just the max turbo)
In other but related news, authorities are having a hard time telling grow ops and FX-9590 users apart based on power meter readings.
[quote<]In all likelihood, they will beat 3770Ks/4770Ks in many benchmarks[/quote<] If the 3770K and 4770K are pegged at stock clocks and if you replace the "many" with "several highly-multithreaded integer heavy benchmarks that already play to the strengths of the FX-8350" then I agree with you. However, as Damage already noted, even with the "disastrous" overclocking capabilities of Ivy Bridge & Haswell, they can still easily overclock to levels that handle a 5GHz Piledriver even in benchmarks that are favorable to AMD's designs.
As Damage pointed out, if you looked at AMD’s marketing from 8-10 years ago and then I told you the specs for these new parts, you’d swear that they were being produced by Intel or Nvidia instead of AMD…
[Edit: ahh… I’ve hit a nerve with the AMD religious zealots. I love the smell of naked hypocrisy in the morning… it smells like desperation.]
Some folks don’t care about power consumption though, especially when that’s all Intel is working on. There’s a niche market for this kind of thing.
If AMD included the 2015 almanac with one of these things I’d buy all the FX-9590s AMD can produce.
Greed is good! XD
This would’ve given Doc Brown a heart attack and finished him off.
[url=https://www.youtube.com/watch?v=WOVjZqC1AE4<]$399[/url<], same price as the competition's product but just much much hotter and noisier.
Yep, AMD is in cahoots with the power companies.
Guess they’re desperate to catch Haswell and are squeezing every bit of juice from Vishera, even if it means making whoever’s crazy enough to buy this thing pay a billion every month on his/her power bill.
It’s a nice idea but the 1% crowd buy hex-core Intels instead because they’re outright faster at pretty much everything.
I want AMD to succeed as much as the next guy but this 5GHz nonsense would only make sense if it actually meant AMD could claim a victory in [i<]something[/i<].
I’d be OK with the 220W TDP and whatever AMD wants to call it if this thing ran at 8.0GHz base, 9.0GHz turbo.
Oh, memories.
What do you mean? You have NO IDEA how much more performance an extra 400MHz will give this chip. (Hint: less than 10%.)
LONG LIVE NETBURST! /let the hate flow through you…./
Still the same L3 clock as in 2009, long after ironing out an entirely new process, architecture, HKMG, power gating, Turbo Core, resonant clock mesh…
Nothing wrong with ultra-cherry-picked CPUs and slapping a warranty on them. This is for the 1% crowd that has to have the best and doesn’t want to deal with a roll of the dice to get a 5GHz chip… I really don’t think they care about TDP…
It’s not like releasing this chip has any impact on their other initiatives, so why not release it?
Take your comment about “preventing comments,” and apply that reasoning to AMD releasing CPUs.
OEM only :/
How about…not releasing this at all ?
And what does me not liking it and thus not buying it, have to do with commenting on it ? If not liking / buying something, prevented comments about it, then these comments sections would be almost empty.
They sure are really good at making fancy logos though. Check out my other post on this article.
Or, just click [url=http://i40.tinypic.com/eq5zll.jpg<]here[/url<].
[quote<]almost doubling the TDP for a 17% gain is nonsense.[/quote<]Actually, given the non-linearities involved, it's almost perfect sense. Unfortunately, as others noted, it's about the only option AMD has available to it at the moment given its current designs and the fabs it can use.
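To put a number on those non-linearities: dynamic power scales roughly with C·V²·f, and pushing clocks up usually means pushing voltage up too. A minimal sketch, with purely illustrative voltage figures (not AMD’s actual specs):

```python
# Rough dynamic-power model: P is proportional to C * V^2 * f.
# The voltages below are illustrative guesses, not real FX-series specs.
def relative_power(v, f, v0, f0):
    """Dynamic power relative to a (v0, f0) baseline; leakage ignored."""
    return (v / v0) ** 2 * (f / f0)

# Hypothetical: 4.2GHz at 1.30V vs. 5.0GHz needing 1.55V.
scale = relative_power(1.55, 5.0, 1.30, 4.2)
print(f"~{scale:.2f}x the dynamic power for a ~19% clock increase")
```

Against a 125W baseline, a factor like that lands in the neighborhood of the rumored 220W, which is why near-doubled power for a modest clock gain is plausible rather than nonsense.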
I often wonder WTF the naming department is thinking.
“Hey guys, we have a new SKU here, what should we call it?”
“Well, we could stick with the naming convention we’ve developed or we could throw it all away because bigger numbers are better!”
“Great idea!”
*sigh*
Well, someone has to help the power supply companies sell 1400W PSUs.
Like looking through a furnace window…
I like the red color scheme.
In related news, AMD has also come up with fancy new logos and artwork for its products, in case anyone here hasn’t seen them yet.
[url<]http://i40.tinypic.com/eq5zll.jpg[/url<] Very nice, actually. Now if only the chips themselves performed as well as those fancy graphics look.
Come on AMD! How about focusing on improving something that actually matters, like task energy!
Thanks for the reminder! I remembered that you had done some overclocking back when the FX-8350 launched, so now we have a good approximation of what to expect with these parts.
Edit: After a review of those numbers, I’m pretty sure that my 4770K even at its rather meager 4.6GHz daily-use overclock isn’t obsolete quite yet.
Well, what else can they do? A new and/or improved microarchitecture is not yet ready.
The new chips are clearly not designed for “ordinary” consumers. These chips are just for fun. If you don’t like them, don’t buy them.
AMD is really throwing out everything it can…I thought the times of touting higher frequencies as the best thing ever were over. Evidently they are not…in AMD’s RDF of course.
AMD already has a power efficiency problem and they release an even higher clocked CPU ? Ok…
Pfft, I’ve already got two 8350s @ 5GHz and another at 4.5GHz on stock voltage.
You go to war with the product line you have, not with the product line you wish to have. AMD took a lemon and made lemonade. Good move, IMO
220W???
What the…..*beep*!
Well they definitely need something brilliant, all right. Call me oldschool, but I’m thinking a brilliant PRODUCT would be more beneficial.
But still won’t [url=http://www.youtube.com/watch?v=ll7rWiY5obI<]go to 11[/url<]! Hint: Watch the last bit on that video, the part about the Marshall.
Yeah. I’m seeing red, dude.
No, it’s not just the price. It’s the power consumption. Power consumption almost made me get Intel instead of my current FX.
Essentially. But I think it’s a brilliant marketing move.
[quote<]If they wanted to totally not get laughed at they would price it at or below the 3770k[/quote<] 3770k is a high-volume product. These are low-volume, limited/special edition halo chips, and should be priced accordingly. I think there will be a (small) market for these, even if they are priced north of $500, and it's all about perception anyway: AMD will now and forever be the [i<]first[/i<] to hit 5GHz. It has marketing value. In all likelihood, they will beat 3770Ks/4770Ks in many benchmarks. We can keep arguing about TDP and efficiency and whatnot, but until Intel offers a 220W chip that beats this monster, AMD has the crown.
That’s pronounced “Jiggawatts” for those not familiar with Dr. Brown
I took an FX-8350 to a 4.8GHz Turbo peak a while back. Here’s how it performed:
[url<]https://techreport.com/review/23750/amd-fx-8350-processor-reviewed/13[/url<]
So you’re saying that these are nothing more than overclocked chips with a factory warranty?
When you push the chip outside its intended frequency range, you gotta give it some voltage. This is no different from regular overclocking.
Any word on pricing? I heard it’s OEM-only?
Shoulda called it FX-5800 Ultra.
5GHz cores with the L3 running at DDR3 speed, should perform great.
If they make the cooler look like a Mr. Fusion, this will be all win.
It’s [url=http://www.youtube.com/watch?v=SiMHTK15Pik<]over 9000[/url<]!
I’m sure that in 2085 plutonium is available in every corner drugstore, but in 2013, it’s a little hard to come by.
” Achieving an 800MHz increase over that no doubt requires certain tradeoffs.”
Yeah, but that’s not that big of an increase from 4.2GHz. Almost doubling the TDP for a 17% gain is nonsense.
Price will make or break these higher clocked FX chips depending on if they’re limited edition.
If they wanted to totally not get laughed at they would price it at or below the 3770k
Following AMD’s historical scheme of naming its FX processors….
Does this thing have 9 cores?
AMD, are you in a hurry to run out of model numbers? At most, I would’ve named this FX-8390.
It probably won’t be a great chip, but for academic purposes I’d still like to see how it performs.
So much for AMD being a GREEN company.
220 GIGAWATTS!?
[url=http://www.youtube.com/watch?v=BzSYyyyFOko<]Pow Pow Pow Pow Pow Pow Powar![/url<]