Personal computing discussed

Moderators: renee, Flying Fox, morphine

 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 3:04 pm

swaaye wrote:
Regardless of whether many people are supposedly happy to live on into eternity with Core 2 or Phenom level performance, that's not going to stop Intel from continuing to push the limits and bury AMD if they can't keep up. AMD is way behind right now and if Bulldozer flops even in servers then I think things are going to get very ugly for them. Intel is advancing on all of the same fronts and they are really only behind on GPU right now and that's mostly because they just don't take it very seriously. It hasn't stopped them from selling a bazillion GMA IGPs after all.


I'd hazard to say that they're not behind on the GPU by much- the amount of gaming that I've been able to do on my i7-2620M is staggering, and unexpected- I didn't buy the laptop to play games. I think that it'll take only a few tweaks, driver updates, and transistor infusions and they could stand side by side with AMD's APUs on graphics if they wished, while beating the ever-loving piss out of their CPUs.
 
deadrats
Gerbil XP
Posts: 301
Joined: Tue Feb 19, 2008 7:16 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 3:11 pm

JustAnEngineer wrote:
Your lackluster DX9-era graphics and your cheap 1920x1080 32" TV are quite visibly inferior when placed next to my 2560x1600 30" IPS LCD panel with titles that fully utilize the speed and the DX11 features of my Cayman GPU.


how much did that monitor cost? i picked up a 32" hdtv at walmart for $300 and last christmas the local pathmarks here in nj were selling 32" hdtv's for $199.

as for cayman, how much did that video card cost and how many games, other than a few tech demos, actually make use of dx11? and let's not forget that unless you're running win 7 you also need to spend a few hundred more upgrading the OS.

and for what? to play battlefield 3? i think maybe you guys need to actually get laid once in a while.
 
deadrats
Gerbil XP
Posts: 301
Joined: Tue Feb 19, 2008 7:16 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 3:20 pm

swaaye wrote:
Regardless of whether many people are supposedly happy to live on into eternity with Core 2 or Phenom level performance, that's not going to stop Intel from continuing to push the limits and bury AMD if they can't keep up. If Bulldozer flops even in the niches it's supposed to be great for then I think things are going to get very ugly for them. Intel is advancing on all of the same fronts and they are really only behind on GPU right now and that's mostly because they just don't take it very seriously. It hasn't stopped them from selling a bazillion GMA IGPs after all.

Today's CPU market makes me wonder where AMD would be if they didn't have that fusion marketing and ATI GPUs to sell. They have their entire consumer CPU line, including the giant hexa cores, at under $200. Not exactly raking in the dough.


i think it's precisely because they are involved in the gpu market that their cpu offerings are struggling; i.e. it's sucking resources in terms of development funds and engineers from the cpu front to the gpu front.

be that as it may, amd selling 6 core cpu's for substantially less than $200 is a huge win for consumers as it helps keep intel's cpu prices in check.

i would love it if amd released a sub $200, low clocked 12 core cpu; for multitasking, video editing, and audio work it would be awesome. in fact i would love it if both intel and amd released low clocked (1 ghz and below) multi-cored cpu's (16+ cores) because it would force lazy programmers to stop relying on higher IPC and higher clock speeds to increase the performance of their software; they would start thinking of ways to heavily thread their apps.

it would also be a huge boon to video games as programmers started exploiting the multi-threaded capabilities of the cpu's to bring us better AI (very easy to multithread) as well as better physics (again, with numerous collisions, very easy to multithread).
 
DancinJack
Maximum Gerbil
Posts: 4494
Joined: Sat Nov 25, 2006 3:21 pm
Location: Kansas

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 3:24 pm

deadrats wrote:
i would love it if amd released a sub $200, low clocked 12 core cpu; for multitasking, video editing, and audio work it would be awesome. in fact i would love it if both intel and amd released low clocked (1 ghz and below) multi-cored cpu's (16+ cores) because it would force lazy programmers to stop relying on higher IPC and higher clock speeds to increase the performance of their software; they would start thinking of ways to heavily thread their apps.

it would also be a huge boon to video games as programmers started exploiting the multi-threaded capabilities of the cpu's to bring us better AI (very easy to multithread) as well as better physics (again, with numerous collisions, very easy to multithread).


If all this multi-threading is so easy, why aren't these "lazy programmers" taking advantage of it?
i7 6700K - Z170 - 16GiB DDR4 - GTX 1080 - 512GB SSD - 256GB SSD - 500GB SSD - 3TB HDD- 27" IPS G-sync - Win10 Pro x64 - Ubuntu/Mint x64 :: 2015 13" rMBP Sierra :: Canon EOS 80D/Sony RX100
 
DancinJack
Maximum Gerbil
Posts: 4494
Joined: Sat Nov 25, 2006 3:21 pm
Location: Kansas

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 3:29 pm

deadrats wrote:
how much did that monitor cost? i picked up a 32" hdtv at walmart for $300 and last christmas the local pathmarks here in nj were selling 32" hdtv's for $199.

as for cayman, how much did that video card cost and how many games, other than a few tech demos, actually make use of dx11? and let's not forget that unless you're running win 7 you also need to spend a few hundred more upgrading the OS.

and for what? to play battlefield 3? i think maybe you guys need to actually get laid once in a while.


I'm not sure I understand this. Everyone has their own budget. It's not just opinion that an IPS panel looks better than the $300 TV that you have. The video card he has runs the games you play faster at the highest quality possible and he has the ability to take advantage of DX11 features. If you want to play your games with DX9 effects forever and view them on a TN based TV that's fine, but there isn't anything wrong with buying higher quality components either.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 3:30 pm

deadrats wrote:
JustAnEngineer wrote:
Your lackluster DX9-era graphics and your cheap 1920x1080 32" TV are quite visibly inferior when placed next to my 2560x1600 30" IPS LCD panel with titles that fully utilize the speed and the DX11 features of my Cayman GPU.


how much did that monitor cost? i picked up a 32" hdtv at walmart for $300 and last christmas the local pathmarks here in nj were selling 32" hdtv's for $199.

as for cayman, how much did that video card cost and how many games, other than a few tech demos, actually make use of dx11? and let's not forget that unless you're running win 7 you also need to spend a few hundred more upgrading the OS.

and for what? to play battlefield 3? i think maybe you guys need to actually get laid once in a while.


You do realize that your HDTV is less than half the resolution of the 30" IPS screens myself and JAE are using, right? Also, Windows 7 is a ~$99 investment for an OEM copy. And please keep the personal attacks to yourself, this isn't R&P.
 
seeker010
Gerbil First Class
Posts: 143
Joined: Sat Oct 19, 2002 8:52 am

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 4:17 pm

*ahem*, personal attacks aside....

AMD's city cores and star cores didn't set the world on fire, and it doesn't look like bulldozer is the second coming of the hammer either. the staggering amount of BS that will almost assuredly come out of AMD if Bulldozer fails to meet expectations will make Intel's "netbust" fiasco look like a walk in the park. At least Intel roared back with conroe.
 
derFunkenstein
Gerbil God
Posts: 25427
Joined: Fri Feb 21, 2003 9:13 pm
Location: Comin' to you directly from the Mothership

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 4:25 pm

swaaye wrote:
Regardless of whether many people are supposedly happy to live on into eternity with Core 2 or Phenom level performance, that's not going to stop Intel from continuing to push the limits and bury AMD if they can't keep up.

I would actually say, based on how well the Nehalem and now Sandy Bridge generations overclock, that Intel is doing quite the opposite. They're consciously slowing down their CPUs - not releasing even faster models - so that they can keep an edge on AMD but not bankrupt them. They NEED AMD to survive.
I do not understand what I do. For what I want to do I do not do, but what I hate I do.
Twittering away the day at @TVsBen
 
JustAnEngineer
Gerbil God
Posts: 19673
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 4:37 pm

deadrats wrote:
let's not forget that unless you're running win 7 you also need to spend a few hundred more upgrading the OS.
When DirectX 11 came out two years ago, Microsoft provided it for free to work with the 64-bit operating system that I had purchased two and a half years earlier for less than $100. 8) I actually had forgotten about the luddites who haven't upgraded their operating system in a decade. However, I guess someone's got to still be providing the computing power for the botnets. :lol:

It is perfectly valid to question the high cost of high-end PC gaming gear. I knew when I purchased the 4 megapixel display that it would require me to purchase more expensive graphics cards than would be needed to provide smooth gaming on a lower resolution screen. It's the gift that keeps on giving.


It's not worth getting worked up over Bulldozer performance at this point. If you've shorted AMD stock, you're probably rooting against AMD's new processors. If you're not invested in the company, just wait and see. The benchmarks will come out, the product will come out and eventually Tech Report will review it and provide the snark that some folks seem to be craving right now.

We should count on Intel adjusting their prices to match AMD's. With the manufacturing process technology moving to the 32nm node, AMD should be able to be more competitive. I expect that the biggest upshot of Bulldozer's introduction will be a slight decrease in Intel's profits on their mid-range CPUs. That's more savings for us (the consumers), whether we choose AMD or Intel.
· R7-5800X, Liquid Freezer II 280, RoG Strix X570-E, 64GiB PC4-28800, Suprim Liquid RTX4090, 2TB SX8200Pro +4TB S860 +NAS, Define 7 Compact, Super Flower SF-1000F14TP, S3220DGF +32UD99, FC900R OE, DeathAdder2
 
swaaye
Gerbil Team Leader
Posts: 281
Joined: Mon Apr 21, 2003 4:45 pm
Contact:

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 6:17 pm

derFunkenstein wrote:
swaaye wrote:
Regardless of whether many people are supposedly happy to live on into eternity with Core 2 or Phenom level performance, that's not going to stop Intel from continuing to push the limits and bury AMD if they can't keep up.

I would actually say, based on how well the Nehalem and now Sandy Bridge generations overclock, that Intel is doing quite the opposite. They're consciously slowing down their CPUs - not releasing even faster models - so that they can keep an edge on AMD but not bankrupt them. They NEED AMD to survive.

True. Otherwise the DoJ will probably turn their eye towards them. Also, with only a struggling middling competitor, Intel can make the high end as slow and as expensive as they desire. The unlocked chips reveal the crazy headroom they don't utilize in order to maximize yields.
 
deadrats
Gerbil XP
Posts: 301
Joined: Tue Feb 19, 2008 7:16 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 7:02 pm

DancinJack wrote:
If all this multi-threading is so easy, why aren't these "lazy programmers" taking advantage of it?


glad you asked:

1) the common programming language C is about 40 years old (from circa 1970) and does not have native support for threads, you need to use a threading library like pthreads or the win api and neither of those is all that easy to work with. if programming languages allowed programmers to do something like:

thread1{
commands go here};
thread2{
commands go here};

you would see much more multi-threaded code, however you have a chicken and the egg scenario, programming languages haven't traditionally offered programming constructs like that because cpu's didn't have many cores and cpu's didn't have many cores because programming languages didn't allow for easy multi-threading.
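
for illustration (editor's sketch, not from the original post): here is roughly what that hypothetical "thread1 { ... } thread2 { ... }" idea costs in plain C with pthreads circa 2011. The worker function and struct names are made up for the example; two threads each sum half of an array and the caller joins them and combines the results.

```c
#include <pthread.h>
#include <stddef.h>

/* Per-thread work description: pthreads only lets you pass a single
 * void* to a thread, so inputs and the output travel in one struct. */
typedef struct {
    const int *data;
    size_t len;
    long sum;
} job_t;

static void *sum_worker(void *arg) {
    job_t *job = arg;                 /* recover the typed argument */
    job->sum = 0;
    for (size_t i = 0; i < job->len; i++)
        job->sum += job->data[i];
    return NULL;
}

long parallel_sum(const int *data, size_t len) {
    pthread_t t1, t2;
    job_t a = { data, len / 2, 0 };
    job_t b = { data + len / 2, len - len / 2, 0 };
    pthread_create(&t1, NULL, sum_worker, &a);
    pthread_create(&t2, NULL, sum_worker, &b);
    pthread_join(t1, NULL);           /* wait for both halves to finish */
    pthread_join(t2, NULL);
    return a.sum + b.sum;
}
```

That's the boilerplate being complained about: a struct, a cast-through-void*, explicit create/join calls, and -lpthread at link time, versus two braces in the imagined syntax.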

in fact if you ever go through a comp sci program, you won't take a class on multi-threaded programming until your third or fourth year, though that may be different in colleges like penn state that have switched to teaching all their comp sci classes in java as java has multi-threading built into the language.

you are however seeing the better programmers taking advantage of OpenCL, DirectCompute and CUDA to leverage the multiple streaming cores within gpu's to allow for the calculation of physics and AI on the gpu (that's why many games only enable certain physics effects via PhysX, because a gpu with its hundreds of cores is capable of handling all the collision calculations without choking, while a cpu with just a handful of alu's would be brought to its knees if it tried to perform hundreds of collision calculations per second).

aren't you glad you asked?
 
deadrats
Gerbil XP
Posts: 301
Joined: Tue Feb 19, 2008 7:16 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 7:09 pm

DancinJack wrote:
I'm not sure I understand this. Everyone has their own budget. It's not just opinion that an IPS panel looks better than the $300 TV that you have. The video card he has runs the games you play faster at the highest quality possible and he has the ability to take advantage of DX11 features. If you want to play your games with DX9 effects forever and view them on a TN based TV that's fine, but there isn't anything wrong with buying higher quality components either.


i'm all for higher quality components but consider a guy like me, sitting in front of a system with a gts250, vista 64, 4 gigs of ddr2 and an e7400 coupled with a 19" monitor capable of 1280x1024 max resolution.

battlefield 3 comes out, does it really make sense to run out and upgrade the monitor, cpu, ram, video card and OS just to play that one game?

as for DX11, where are the games that take full advantage of DX11?

then you have this reality, we have no way of knowing if battlefield 3 will be any good, maybe it will have a horrible engine like crysis or be a pile of crap like dnf and a complete waste of money.

to me it seems totally silly to go build a high end system for the sole purpose of playing one game, but again maybe i'm just getting old and can't see things from the point of view of a 15 year old that doesn't actually have to work for his money.
 
deadrats
Gerbil XP
Posts: 301
Joined: Tue Feb 19, 2008 7:16 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 7:14 pm

Airmantharp wrote:
You do realize that your HDTV is less than half the resolution of the 30" IPS screens myself and JAE are using, right? Also, Windows 7 is a ~$99 investment for an OEM copy. And please keep the personal attacks to yourself, this isn't R&P.


i didn't realize that suggesting that getting laid once in a while would lead to clearer judgement was considered a "personal attack", how's about i set things straight and suggest that you never get laid again, all better?

as for win 7 being about 99 bucks, let's assume that's true; that's $60+tax for the game plus about 100+tax for the OS, you really think spending $160 for a game is the way to go?

and again you're ignoring the initial investment in that monitor as well as the cost of the video card needed to use that monitor to its fullest potential.

to me it just smacks of poorly thought out planning on the part of the gamer that actually goes that route.
 
deadrats
Gerbil XP
Posts: 301
Joined: Tue Feb 19, 2008 7:16 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 7:18 pm

JustAnEngineer wrote:
When DirectX 11 came out two years ago, Microsoft provided it for free to work with the 64-bit operating system that I had purchased two and a half years earlier for less than $100. 8) I actually had forgotten about the luddites who haven't upgraded their operating system in a decade. However, I guess someone's got to still be providing the computing power for the botnets. :lol:


i stand corrected, i was not aware that dx11 was released for vista, i'm updating as i type this.
 
DancinJack
Maximum Gerbil
Posts: 4494
Joined: Sat Nov 25, 2006 3:21 pm
Location: Kansas

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 7:29 pm

No one here(Air, JAE, or myself) built their computer solely for the purpose of playing BF3. I don't know why we seem to be stuck on that. Some of us have played previous BF games and will buy BF3 based on its pedigree regardless of what people say about the game. I suspect I'll buy it day one when it comes out.

All three of us have IPS monitors at 1920x1200 or above. We didn't buy these monitors so we could justify upgrading to a more powerful video card. We like the extra resolution/work space, accurate colors, and viewing angles. Having a powerful video card to power these monitors is something that we understood before we bought them in the first place. Like I said earlier - everyone has a budget that they want to stick to and we incorporate these things into it. So, it's entirely possible that for you, a high resolution monitor and $200+ video card isn't needed. Doesn't mean other people can't use it.

We don't need to assume that W7 is 99 bucks. It is.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 8:01 pm

@deadrats: continued personal attacks reported.

Also, as has been stated, if I chose to spend money on a decent monitor, CPU, and GPUs to back it up, wasn't that my choice? I do use the extra resolution and color accuracy, and I do play more than one game. With regards to BF3, it's a benchmark, not just because it's Battlefield or because it looks pretty, but for me because of just how good Bad Company 2's multiplayer is. There's no comparison; trying to play MoH MP (also made by DICE) or CoD is just silly afterwards; there's a reason it was voted to the top of multiplayer games last year.
Last edited by Airmantharp on Sat Jun 25, 2011 8:05 pm, edited 1 time in total.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jun 25, 2011 8:05 pm

deadrats wrote:
JustAnEngineer wrote:
When DirectX 11 came out two years ago, Microsoft provided it for free to work with the 64-bit operating system that I had purchased two and a half years earlier for less than $100. 8) I actually had forgotten about the luddites who haven't upgraded their operating system in a decade. However, I guess someone's got to still be providing the computing power for the botnets. :lol:


i stand corrected, i was not aware that dx11 was released for vista, i'm updating as i type this.


Right now the only real thing Vista lacks at the 'Home' level is Trim support; with Service Pack 2 on a clean install it's pretty quick as well. I have a number of copies still in use on family machines without issue, it's still an incredibly viable computing and gaming platform. If you're using XP still though- get with the times. Microsoft went way out of their way to plug the security design issues from Win32 with Vista's re-engineering, and they've largely succeeded.
 
morphine
TR Staff
Posts: 11600
Joined: Fri Dec 27, 2002 8:51 pm
Location: Portugal (that's next to Spain)

Re: Bulldozer fail prediction by AMD Staff

Sun Jun 26, 2011 12:02 am

Corrado wrote:
Does high end gaming still exist? What game today REALLY needs > 2 cores and > $200 video card?

Every one from ~3 years on. I've upgraded to a much faster quad core coming from my Core 2 Duo E6400, and the difference was night and day. Now I have smooth 50+ FPS nearly all the time. I've actually gone to the trouble of finding out why, and I found that between most game engines and video drivers and assorted trinkets, you have, on average, and by my Mark I eyeball, a 2.5 core usage in the majority of games. Since you can't use half a core, that's 3. And some games do actually use all four, though admittedly those are rare.

As for the video card, a $200 card of today can play nearly every game with maximum detail at resolutions up to 1920 give or take, and no argument there. However, that's likely to change in these next couple years (as ever, really). PC gaming may have plateau'd somewhat, but that doesn't mean that there aren't dividends to reap from good hardware, and that we aren't on the verge of new evolution.
There is a fixed amount of intelligence on the planet, and the population keeps growing :(
 
yuhong
Gerbil
Posts: 31
Joined: Mon Dec 08, 2008 6:50 pm

Re: Bulldozer fail prediction by AMD Staff

Sun Jun 26, 2011 2:25 am

I agree there are a couple of things wrong in this article. On AMD and benchmarks, not mentioned is that Intel used far sleazier tactics back in the days when AMD beat Intel, like having the CPU dispatcher in Intel's own compiler enable SSE only if a GenuineIntel CPU is detected. As I mentioned in another post, the reason AMD joined BAPCo in the first place is that the SYSmark 2002 benchmark was skewed in Intel's favor. And AMD had a point when they said that GPU performance was more important than CPU performance, as demonstrated by Llano. Not to mention Intel's infamous marketing payments that were illegal. Also not mentioned was that the poor performance of the B1 stepping was exactly why they delayed Bulldozer in the first place. That being said, I knew that buying ATI at the time Core 2 Duo came out was a mistake.
 
DeadOfKnight
Gerbil Elite
Posts: 726
Joined: Tue May 18, 2010 1:20 pm

Re: Bulldozer fail prediction by AMD Staff

Sun Jun 26, 2011 4:06 am

SC2 is also CPU intensive. It can be a demanding game, but nothing that isn't affordable to play on ultra at 1080p on a desktop.
It does tend to be CPU bottlenecked unless you're running ultra settings at 2560x1440 with AA enabled.

Basically you will notice a CPU bottleneck if playing with lower settings on a mainstream AMD laptop, and it will really suck.
Last edited by DeadOfKnight on Tue Jun 28, 2011 10:54 pm, edited 1 time in total.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
Corrado
Minister of Gerbil Affairs
Posts: 2574
Joined: Sun Feb 17, 2002 7:00 pm

Re: Bulldozer fail prediction by AMD Staff

Sun Jun 26, 2011 9:46 am

morphine wrote:
Every one from ~3 years on. I've upgraded to a much faster quad core coming from my Core 2 Duo E6400, and the difference was night and day. Now I have smooth 50+ FPS nearly all the time. I've actually gone to the trouble of finding out why, and I found that between most game engines and video drivers and assorted trinkets, you have, on average, and by my Mark I eyeball, a 2.5 core usage in the majority of games. Since you can't use half a core, that's 3. And some games do actually use all four, though admittedly those are rare.

As for the video card, a $200 card of today can play nearly every game with maximum detail at resolutions up to 1920 give or take, and no argument there. However, that's likely to change in these next couple years (as ever, really). PC gaming may have plateau'd somewhat, but that doesn't mean that there aren't dividends to reap from good hardware, and that we aren't on the verge of new evolution.



If you bought a 2 core modern CPU (i5 2390T) and whatever video card you could get for $199 right now, I venture it would play any game out @ 1920x1080. Are there instances where you want more? Absolutely. But I said REALLY NEEDS more than that. Everything will run acceptably on that and still look very nice. I'm willing to venture that BF3 will run fine and look great on that rig at that resolution as well. It's like the difference of going from 400hp to 550hp. Is it faster? Absolutely. Are you going to notice and is the gain really worth the additional cost? Not for the vast majority of people, no.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Bulldozer fail prediction by AMD Staff

Sun Jun 26, 2011 12:21 pm

Corrado wrote:
morphine wrote:
Every one from ~3 years on. I've upgraded to a much faster quad core coming from my Core 2 Duo E6400, and the difference was night and day. Now I have smooth 50+ FPS nearly all the time. I've actually gone to the trouble of finding out why, and I found that between most game engines and video drivers and assorted trinkets, you have, on average, and by my Mark I eyeball, a 2.5 core usage in the majority of games. Since you can't use half a core, that's 3. And some games do actually use all four, though admittedly those are rare.

As for the video card, a $200 card of today can play nearly every game with maximum detail at resolutions up to 1920 give or take, and no argument there. However, that's likely to change in these next couple years (as ever, really). PC gaming may have plateau'd somewhat, but that doesn't mean that there aren't dividends to reap from good hardware, and that we aren't on the verge of new evolution.



If you bought a 2 core modern CPU (i5 2390T) and whatever video card you could get for $199 right now, I venture it would play any game out @ 1920x1080. Are there instances where you want more? Absolutely. But I said REALLY NEEDS more than that. Everything will run acceptably on that and still look very nice. I'm willing to venture that BF3 will run fine and look great on that rig at that resolution as well. It's like the difference of going from 400hp to 550hp. Is it faster? Absolutely. Are you going to notice and is the gain really worth the additional cost? Not for the vast majority of people, no.


You're mostly right; the Intel dual-cores, especially the i5's that self-overclock, are a blast for most games. But given that Battlefield 3 is our benchmark, and what is known about Bad Company 2 Multiplayer (Battlefield is all about multiplayer!), getting a dual-core now, even one that has two more fake cores, just doesn't make sense. That game will tank a quad core; hell, I was testing AMD's new GPU drivers (they crash) while watching the stats, and I noticed on average 80-100% usage on a 2500k at stock. It was holding at 3.4GHz. I had been wanting to go to an i7-875k, but I'm glad I waited for the 2500k, as this thing hits 4.5GHz on stock volts, and I think that's what I'm going to need for BF3! (along with the pair of HD6950's to push my 30" panel).
 
swaaye
Gerbil Team Leader
Posts: 281
Joined: Mon Apr 21, 2003 4:45 pm
Contact:

Re: Bulldozer fail prediction by AMD Staff

Sun Jun 26, 2011 1:19 pm

Why did this thread have to turn into an argument over the value of high end PC gaming hardware? The first guy who responded to the argument shouldn't have taken the bait.
Last edited by swaaye on Sun Jun 26, 2011 1:27 pm, edited 2 times in total.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Bulldozer fail prediction by AMD Staff

Sun Jun 26, 2011 1:24 pm

swaaye wrote:
Why did this thread have to turn into an argument over the value of PC gaming hardware? The first guy who responded to the argument shouldn't have taken the bait.


Dunno, but I was thinking along the same lines earlier- but this happens with most threads it seems. Once the original idea is worked out (we should wait until the actual benchmarks come out...), whatever other ideas came up get discussed at length.
 
Ryhadar
Gerbil XP
Posts: 463
Joined: Tue Oct 21, 2008 9:51 pm

Re: Bulldozer fail prediction by AMD Staff

Sun Jun 26, 2011 3:00 pm

Meh.

I'm sure it's been said already but:
  • This doesn't explain why nVidia and Via left
  • It's a Theo Valich article... I don't usually judge journalists, as I probably wouldn't be very good at it myself, but his work is just bad. Just a few weeks ago, he had an article that said that BD integer performance is much better than sandy's.

Regardless of the outcome, I will probably buy a 4-core bulldozer whether it's competitive against sandy or not. As long as it's got better IPC than my C2D E6600 at 3.0 GHz and it's priced right I'll be happy.
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: Bulldozer fail prediction by AMD Staff

Mon Jun 27, 2011 8:22 am

deadrats wrote:
1) the common programming language C is about 40 years old (from circa 1970) and does not have native support for threads, you need to use a threading library like pthreads or the win api and neither of those is all that easy to work with. if programming languages allowed programmers to do something like:

thread1{
commands go here};
thread2{
commands go here};

you would see much more multi-threaded code, however you have a chicken and the egg scenario, programming languages haven't traditionally offered programming constructs like that because cpu's didn't have many cores and cpu's didn't have many cores because programming languages didn't allow for easy multi-threading.


1) The inelegant expression or obtuseness of the APIs is not the primary problem when it comes to threading. The difficulty of using threads is not the limiting factor, rather, it is the difficulty of using threads well. You are treating the entire concept of concurrent execution as little more than a programming language's high-level primitive, and that's ridiculous. Those are just tools. The relative ease of handling the tools has very little to do with the overall difficulty of the project you are attempting to finish with them.

2) The difficulty in using concurrent execution for single programs is coherent synchronization. Thread 1, thread 2, thread n are all trying to accomplish result 1. Your multiplicity of threads is working towards a unity of result, and this means that your program is going to have to resolve them together somehow at some point. This problem is not even remotely trivial. Race hazards suck and the methods needed to prevent them sap performance, increase development time, and require skill.
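
a minimal sketch of the synchronization problem described above (editor's illustration, not Glorious' code; the counter and function names are hypothetical): several threads cooperating on one result have to serialize their shared read-modify-write, and the mutex is exactly the kind of performance-sapping machinery being described. Drop the lock/unlock pair and increments get lost nondeterministically.

```c
#include <pthread.h>

enum { N_THREADS = 4, ITERS = 100000 };

static long counter;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

/* Each thread bumps the shared counter ITERS times. counter++ is a
 * read-modify-write, so without the mutex two threads can read the
 * same old value and one increment vanishes: a classic race hazard. */
static void *bump(void *arg) {
    (void)arg;
    for (int i = 0; i < ITERS; i++) {
        pthread_mutex_lock(&lock);    /* serialize the critical section */
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

/* Launch all workers, wait for them, and return the combined result. */
long run_counters(void) {
    pthread_t t[N_THREADS];
    counter = 0;
    for (int i = 0; i < N_THREADS; i++)
        pthread_create(&t[i], NULL, bump, NULL);
    for (int i = 0; i < N_THREADS; i++)
        pthread_join(t[i], NULL);
    return counter;
}
```

With the lock the answer is always N_THREADS * ITERS; without it, the result varies from run to run, which is why getting this right takes the skill and development time mentioned above.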

3) Your chicken & egg narrative is simply wrong. Hardware historically lacked SMP because it was prohibitively expensive. It wasn't until the gains from using transistors towards IPC began to level off that it made sense to start going after TLP. The reason you didn't see two original Pentiums on the same die is that it was uneconomical to the point of being physically infeasible.

4) Your narrative about programming language is inaccurate. Procedural programming languages are inherently uncomfortable with threading because of their paradigm; they are step-by-step depictions of the execution environment and threading completely confounds that. It's easy for you to invoke "thread 1 do this" and "thread 2 do that", but that's clearly just a description of two different programs you've merely called threads. The question isn't what they are doing separately, but what they are separately doing together. Keeping that effort coherent means either leaving the procedural concept altogether or making it thread-safe, something you cannot invoke via a primitive. In reality, it usually takes an intensive amount of effort to make code thread-safe, and that's what most of those APIs you mention are about. Even then, they merely provide the tools to do it correctly, it's ultimately up to the programmer.

deadrats wrote:
in fact if you ever go through a comp sci program, you won't take a class on multi-threaded programming until your third or fourth year, though that may be different in colleges like penn state that have switched to teaching all their comp sci classes in java as java has multi-threading built into the language.


Which is indicative of the fact that TLP isn't remotely simple. You have to be well-versed in all the basics before you can even begin.

As to using a new language like Java, you're really asking a different question at that point.

deadrats wrote:
you are however seeing the better programmers taking advantage of OpenCL, DirectCompute and CUDA to leverage the multiple streaming cores within GPUs to allow for the calculation of physics and AI on the GPU (that's why many games only enable certain physics effects via PhysX, because a GPU with its hundreds of cores is capable of handling all the collision calculations without choking, while a CPU with just a handful of ALUs would be brought to its knees if it tried to perform hundreds of collision calculations per second).


Flappy flags and moar particle effects! Whooo! I'm glad I spent all that money on "better programmers." :roll:
 
DeadOfKnight
Gerbil Elite
Posts: 726
Joined: Tue May 18, 2010 1:20 pm

Re: Bulldozer fail prediction by AMD Staff

Tue Jun 28, 2011 11:02 pm

Glorious wrote:
Flappy flags and moar particle effects! Whooo! I'm glad I spent all that money on "better programmers." :roll:

We've all fallen victim to good marketing at one point or another, all we can do is learn from it.
It's almost useless trying to convince the brainwashed consumers that they've been lied to.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
SecretMaster
Graphmaster Gerbil
Posts: 1356
Joined: Mon Jul 23, 2007 11:01 pm
Location: New York

Re: Bulldozer fail prediction by AMD Staff

Sat Jul 09, 2011 3:59 pm

So is there any word as to how soon we'll see Bulldozer? I thought it would shortly follow the Llano release.
 
Kurotetsu
Gerbil Elite
Posts: 548
Joined: Sun Dec 09, 2007 12:13 pm

Re: Bulldozer fail prediction by AMD Staff

Sat Jul 09, 2011 4:22 pm

Ryhadar wrote:
Meh.

I'm sure it's been said already but:
[list]
[*]This doesn't explain why nVidia and Via left


And this is the part that bugs me the most. EVERYONE is so dead focused on whatever huge conspiracy is behind AMD leaving, yet no one seems to have noticed that Nvidia and VIA left as well. I can understand no one giving a crap about VIA, but you'd think SOMEONE would have something to say about Nvidia leaving.
Under Construction Forever~~~
 
mav451
Gerbil
Posts: 12
Joined: Fri Nov 20, 2009 1:49 am

Re: Bulldozer fail prediction by AMD Staff

Sat Jul 09, 2011 4:54 pm

SecretMaster wrote:
So is there any word as to how soon we'll see Bulldozer? I thought it would shortly follow the Llano release.


[H] has an event next Saturday. Probably won't be releasing benchmarks, but I believe people will be able to use it hands-on. So regardless of positive/negative impression, we should have a good idea then.
http://hardforum.com/showthread.php?t=1620551
.: i5 750 @ 3.6Ghz | CM212Plus + P12 | P55-UD3R [BIOS F2] | 4GB G.Skill CL8 | eVGA GTX 260
.: 4 x 1TB WD | Corsair TX750 | Lian Li PC-A70A | X-Fi | Logitech Z-2300

http://heatware.com/eval.php?id=31362
