jossie wrote:Seems timely for me to revive this thread in light of
http://techreport.com/news/24976/bf4-ot ... ly-for-amd
Maybe I can make do for a little while, or pick up a 7870 when one is on sale again. I'd be interested in seeing what Steamroller has in store for us.
JohnC wrote:P.S: Reading the comments for these incorrect articles was fun, though
Airmantharp wrote:JohnC wrote:P.S: Reading the comments for these incorrect articles was fun, though
I'm still not convinced the article was incorrect; this is EA we're talking about, not DICE.
JohnC wrote:Go to IGN's article and read their corrected title:
"Update: EA says Frostbite 3 Optimization Not Exclusive to AMD"
AMD pays EA for slapping their "Gaming Evolved" sticker on the box and for early beta access (to which Nvidia can also gain access, unofficially) so they can optimize their drivers before a game based on that engine ships. They don't pay DICE for implementing hardware-specific optimizations in their engine.
anotherengineer wrote:Your cpu can still hold its own for the time being.
ronch wrote:AM3+ boards are usually quite a bit cheaper
NovusBogus wrote:More cores give very little benefit because few real-world applications are capable of using more than one or two. Always go with a smaller number of faster cores. Do keep in mind that the 4670K and 4770K are both quad-core chips; the i7 just throws in a mostly useless marketing gimmick and a not-so-useless larger L3 cache. The E8400 is a god among megahurtz, so it's probably not too much of a limiter.
smilingcrow wrote:NovusBogus wrote:More cores give very little benefit because few real-world applications are capable of using more than one or two. Always go with a smaller number of faster cores. Do keep in mind that the 4670K and 4770K are both quad-core chips; the i7 just throws in a mostly useless marketing gimmick and a not-so-useless larger L3 cache. The E8400 is a god among megahurtz, so it's probably not too much of a limiter.
Well, your user name gives you away, sir; loads of bogus info there.
End User wrote:Airmantharp wrote:We're up to the point today where we can use more than four physical cores
Fast Intel quad cores are the CPU of choice for gaming.
Airmantharp wrote:I'll agree that Nvidia has the better GPU technology (and the absolute fastest GPU), but not for the reasons you mentioned.
Nvidia's 'half-Kepler' GK104 is simply more efficient than AMD's large GPU (Tahiti). It does more work per watt AND per mm^2 of die, while using the same TSMC 28nm manufacturing process.
However, AMD isn't nearly as far behind hardware wise as they are driver wise; we're still waiting for their 'new' driver that fixes the stuttering issue with multiple cards. This is a big deal, especially with 4k coming up. AMD has a chance to be the 'budget' option again, but they're actually going to be held accountable for the user experience this time. Thanks TR!
For CPUs, it depends on where you're looking. For desktop gaming CPUs Intel has a clear lead, but that lead doesn't really extend to workloads that are power-usage agnostic; there are cases where a highly-clocked CPU with eight live x86 cores will put AMD ahead, so long as the FPUs don't get abused.
Just note that Jaguar is worlds ahead of any shipping Atom. Intel does have a competitive Atom coming, but it isn't here yet; conceivably they could have taken that Xbox/PS contract out of AMD's hands if they weren't a year behind AMD in the 'tablet' form-factor space that Jaguar competes in.
And as for 'quiet'? That's all on AMD. The Titan uses more power and dissipates more heat than an HD7970, but it also has Nvidia's latest blower, and unlike AMD, Nvidia puts effort into their blowers. That's one of the things that sold me on a pair of GTX670's. No HD7970 could be that quiet (not with AMD's blower); the HD7950's couldn't even be purchased with one, and the third-party crap everyone was slapping on AMD cards is just not suited to running more than one card in a system. Gotta get that heat out!
Nec_V20 wrote:I will admit to being prejudiced against Nvidia, because they consistently brought out cards which performed blindingly in the benchmarks but did nothing but BSOD once they were out in the wild, and that for months, whilst they fobbed customers off with "rare instance" or "new driver" excuses; it took them at least three months to get the drivers running reasonably with the hardware. This was not a one-off for a single generation of Nvidia cards but something that occurred generation after generation. For all I know they might still be doing it; I don't know and don't care any more, because their policy thoroughly sickened me off them and I don't even look at them any more.
Nec_V20 wrote:Thus I would personally not touch an Nvidia card with a bargepole out of principle. Who buys an AMD graphics adapter with a stock cooler anyway? I know I never have.
I'm not going to challenge the veracity of your statement; I fully believe your experience. I will say, though, that your experience is likely far from average, and that being more specific would help us better understand your stated experience and argument(s) that it underpins.
I bought two HD6950 2GB cards for use with Battlefield 3 (most strenuous case) on a 2560x1600 monitor. I wanted a relatively quiet and compact system, and a big part of that was making sure that the heat could actually get out of the case.
Airmantharp wrote:And this just came across the [H] thread; so much for Gaming Evolved.