Glorious wrote:Interesting article, Jigar.
I can totally see managers thinking like he describes.
AMD wants to save money, so their management looks at ATI which heavily relies on automated layouts (as does Nvidia) for their graphics products and thinks:
It works for them, right? And it's cheaper, right?
So what could go wrong? It's a proven method that's cheaper, so we're obviously throwing money away!
Glorious wrote:This kind of thinking is running rampant everywhere from what I can tell. All over the place, managers & leaders are looking at numbers and treating them the same as reality, ignoring the fact that the numbers are, and will always be, an abstract representation of reality, not a depiction of reality itself. That mistake is called reification, and it's turning into a big problem.
JBI wrote:Even worse is when management tries to bend reality to their will by manipulating the numbers. I have seen situations where projects are literally managed by enabling and disabling time charge codes to control what people are (theoretically) working on! (Result: Engineers continue working on whatever they need to be working on to meet their schedules, and charge time against "overhead" instead. Management then gets royally pissed off because there's too much "overhead".)
Glorious wrote:Yup, that sort of problem is universal at this point. Managers then spend big money buying "better" tools with more granular "control" in order to better circumvent such things.
Which, as you've noticed, is entirely missing the point.
Jigar wrote:This looks like a good read.
Althernai wrote:Jigar wrote:This looks like a good read.
This is interesting, but difficult to understand. While the chip was being designed, the man all the way at the top was an engineer -- one of the best out there. This is not some corporate suit who can't tell the difference between designing a GPU and designing a CPU; he had to have known that there is a price to pay for automation. I guess the choice could have been between using this automation and getting a larger and slower chip and delaying the launch (yet again...), but they wound up delaying it anyway since their first revision was almost certainly not competitive even with Phenom II, never mind Intel.
Krogoth wrote:Bulldozer is superior to K10.
It easily beats the X4 and catches up with the X6 in multi-threaded applications, despite having two fewer "real cores" at its disposal.
AMD is already working on desktop versions of the architecture.
CBHvi7t wrote:There is no way to make this a good desktop chip.
just brew it! wrote:That seems overly pessimistic to me. If they can tweak the design/process to get the power consumption down and/or clock speeds to ramp, and drop the price a notch, it'll be competitive in the mid-range. It may never be an *excellent* desktop chip, but IMO it still has the potential to be a *good* one.