sschaem wrote:From the previews it looks like the A10 is an i3-class CPU. If you all think Trinity is a POS, then the same is true for the entire Intel i3 CPU line.
The difference is that for anything GPU-related, the A10 easily beats an i7-3770K with the highest-end Intel IGP.
Let's not dismiss OpenCL and WebGL just yet; in both, Intel's IGPs are pretty deficient.
With Trinity you get an i3-class CPU but an IGP 2x faster than the HD 4000, for the price of an i3.
It's not what everyone is looking for (if you need a discrete GPU, Trinity makes no sense), but you'd need to be an Intel fanboy to buy an i3 when Trinity is available.
sschaem wrote:No contest on energy efficiency?
"Trinity got a 100W TDP!! but the i3 only got a 55W... 55W is better..." Sigh.
Which do you prefer: system A, which idles at 40 W but can consume 115 W to deliver 40 fps in a game, or
system B, which also idles at 40 W but consumes only 70 W to deliver 15 fps?
[...]
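The trade-off sschaem is arguing about is really performance per watt under load, not raw TDP. A minimal sketch of that comparison, using the illustrative figures from the quote above (these are the poster's hypothetical numbers, not measured data):

```python
# Perf-per-watt comparison using the illustrative numbers from the post above.
# These are hypothetical figures quoted in the thread, not benchmark data.
systems = {
    "A (higher-TDP APU)": {"load_watts": 115, "fps": 40},
    "B (lower-TDP CPU)":  {"load_watts": 70,  "fps": 15},
}

for name, s in systems.items():
    fps_per_watt = s["fps"] / s["load_watts"]
    print(f"{name}: {fps_per_watt:.3f} fps/W")

# With these numbers, A delivers ~0.348 fps/W vs ~0.214 fps/W for B:
# the part with the bigger TDP is the more efficient one at this workload.
```

The point being that a TDP rating bounds worst-case power draw; it says nothing by itself about work done per watt.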
sschaem wrote:Abysmal core? Same performance as an Intel i3 in most situations; check the benchmarks.
I already stated my view that Trinity makes no sense if your goal is to use a discrete GPU,
so you can't make that argument.
What is unclear is how Trinity will scale in the future as games include more compute tasks.
You can't count on it, but it's possible Trinity will get better over time because of its solid IGP.
From all I've seen, the A10 is an i3-class CPU (at least per AnandTech's review and others), but its IGP is often over 2x faster than the HD 4000 paired with a 3770K (and up to 5x faster than the i3's IGP), etc.
Unless you plan on gaming with a discrete GPU, or you're an Intel fanboy, I still don't see how an i3 makes any sense over an A10.
And seriously, if you're building a gaming PC with a discrete GPU, go with an i5.
chuckula wrote:After seeing the reviews it looks like an i3 + practically any discrete GPU from the last 2 years in the $70+ range will be the better choice for playing games. Trinity's high clocks and quad integer pipelines do give it a few wins in selected CPU benchmarks over a much lower-clocked i3, but it comes at the cost of doubling the power consumption.
SERIOUSJACK85 wrote:How is this a duplicate thread, when I am talking specifically about an A10-5800K upgrade from a 1055T processor and the thread you are referring to has a different scope (Trinity in general)?
TheEmrys wrote:Perhaps we can just all agree that the A10 is great for people who fit the demographic of:
Gamers who want:
More than an i3
Less than an i5
Will never have a discrete GPU
By these criteria, the A10 is a perfect chip..... for the 0.02% of computer users. I'm sure AMD will sell tens of them.
Chrispy_ wrote:I was at a Borderlands2 LAN this weekend, and I needed to get a laptop that could run it.
My previous laptop had a downclocked 4650 with DDR3, which is basically the same as a Llano A6, right down to the architecture, shader count and clock speed. It ran the original Borderlands very well: 30 fps with everything turned on, or 60 fps if you disabled just dynamic shadows.
Having read the Ivy Bridge reviews and seen on paper how close HD4000 is to a Llano A8, I picked up one of our two i7-3612QM machines and set about installing Borderlands2 with the expectation that it would be similar, if not slightly better than my old laptop which was the equivalent of a lowly 320-shader A6.
I couldn't have been more wrong: a complete slideshow at 2-3 fps. Granted, this was 1080p at default settings.
I know Borderlands 2 might be a little more demanding than its predecessor, but it seems to run about the same on all my other hardware. I dropped it down to 720p and turned everything off or to low. Still no good: 20-30 fps, but only in the high teens in combat, rather than the 30+ I would call multiplayer-friendly. Basically, this is Xbox performance, not PC performance. I suppose it's still technically better than an Xbox, since that doesn't even run at 720p. Anyway, that was the lowest supported resolution.
I dug out an old i3 with a 16-shader GeForce 210M, expecting similarly poor performance. Bear in mind that this is only 16 downclocked shaders of an obsolete architecture, last seen in the 240-shader GTX 280 a long time ago. Based on shaders and clock speeds, I was expecting this thing to be 15x slower than a rather old desktop card, and therefore not up to the task.
I couldn't have been more wrong: completely acceptable at 15-ish fps on the default settings, at the silly 1600x900 native resolution. So I cranked everything down to minimum at 720p and it ran at maybe 40 fps for the weekend, dropping to about 25 fps in combat. Bearable for something with such underwhelming specs.
The moral of my rather longwinded story is this:
Never underestimate Intel's incompetence when it comes to drivers, especially for games that "play best on AMD" or have Nvidia's TWIMTBP support.
Bauxite wrote:That's pretty narrow-minded; they are aiming at OEMs, not gamers.
TheEmrys wrote:OEMs don't care about gaming performance, for the most part. Those that do will use a discrete card. Even the HTPC crowd will only find this platform OK, due to power use. I see no compelling reason to use something like Trinity when my i3-2120T does everything at 35 W. The sad truth of the matter is that AMD is pinning its hopes on APUs, which serve a very small demographic. AMD has to get something going to be relevant to the mainstream. Trinity simply isn't it.
flip-mode wrote:
TheEmrys wrote:Trinity simply isn't it.
I think OEMs will love this.
vargis14 wrote:Looks great for the HTPC crowd, with 6570-with-DDR3-class graphics performance, and it's pretty dang close to the i3 series in x86 performance. In a household with more than 3 HDTVs like mine, where you already have an i5 or i7 Sandy or Ivy system for video transcoding and file serving, I think these will make fine, inexpensive HTPCs with enough graphics power for madVR or whatever you prefer, along with i3-class CPU power for the most part. You can even do some decent gaming on them, especially with older titles and Blizzard games like D3 and earlier.
ptsant wrote:Trinity is quite balanced for its price. You really need a discrete card at ~$80-100 to get something much better than Trinity (otherwise don't bother), and that added discrete card is not pocket change. Obviously, Trinity is not for me, because I'll probably get a 7970 or something like that, but when you have a $500-600 budget, the $80 graphics card is not trivial. Compare this with i7-3770K buyers. Why the hell would I care about lousy HD 4000 graphics when I'm paying $300 for a CPU? I mean, who uses a 3770K with onboard graphics?
ronch wrote:I just saw AMD's website update to include A10 chips.
http://www.amd.com/US/PRODUCTS/DESKTOP/ ... s-pib.aspx
What I really find pathetic is how AMD maintains its website. They really gotta fire the guys who maintain it. Near the bottom of the product page (just above the disclaimers), under Additional Information, click on 'Specs' and you'll be presented with a spec table that doesn't even include the new Trinity parts! No A10-5800K, no A10-5700, none. So sloppy. Is the guy responsible for maintaining and updating the AMD website tired of his job?
Actually, I just visited Intel's site (haven't seen it in a while), and it's actually quite good. You go to Menu > Intel products > 3rd Intel Corei.... > then select between the "Laptops, Desktops, Server" tabs, then the Core i category (e.g. Core i3), and it shows all the SKUs. There you click the name of a particular SKU and another page opens showing all the info you need (revisions, TJ max, etc.).