Personal computing discussed

Moderators: renee, Flying Fox, morphine

 
windwalker
Gerbil First Class
Posts: 142
Joined: Wed Mar 23, 2011 2:25 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 4:41 pm

Captain Ned wrote:
Dollars to doughnuts, walk up to any Apple store and say "here's the new MBP and it's OUR OWN NEW CHIP, NO MORE INTEL, but it costs more". The RDF faithful will shove their credit cards through any slot you have.

Since when did Apple care about business cases and pricing now that they've got the sheep hooked?

Just because other people have different preferences doesn't make them stupid or insane or brainwashed.
 
windwalker
Gerbil First Class
Posts: 142
Joined: Wed Mar 23, 2011 2:25 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 4:49 pm

just brew it! wrote:
How so? Seems to me that they just understand that there are many aspects to having a successful business plan, and that Apple seems to have hit on a winning formula.

The Apple zealots who unquestioningly shell out big $ for the latest shiny thing are weirder, IMO.

Indeed, the weird ones are not the people on the sidelines scratching their heads in bafflement and disbelief that their snide and patronising disdain is ignored, but the ones who simply spend their own money on what they want to buy and use.
 
Captain Ned
Global Moderator
Posts: 28704
Joined: Wed Jan 16, 2002 7:00 pm
Location: Vermont, USA

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 4:55 pm

windwalker wrote:
but the ones who simply spend their own money on what they want to buy and use.

Are they buying it because of documented and actual "superior performance" or are they buying it because to do so signals virtue and confirms their membership in a self-selected club?

I've watched Apple since roughly 1978. While they have clearly had periods where they were the top of the heap for PC performance, those days ended at least 15 years ago. What was friendly kidding over the RDF circa the 1990s has clearly morphed into fetishisation to signal entry into a desired social class.
What we have today is way too much pluribus and not enough unum.
 
jackbomb
Gerbil XP
Posts: 363
Joined: Tue Aug 12, 2008 10:25 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 5:41 pm

DancinJack wrote:
jackbomb wrote:
I'm giving Apple the benefit of the doubt. Intel hasn't significantly increased single core performance in years. I've been nothing but impressed with just how fast the latest round of iDevices feel. Even full desktop sites render just as quickly on an iPhone as they do on a Kaby Lake laptop. It's nuts!

This kinda confuses me. It's like somewhere in Apple world there is this contingent that thinks Apple can just CRUSH Intel on single-threaded performance whenever they want. Y'all crazy.

I wouldn't put it past them to achieve IPC parity with Intel in the next few years. They're definitely achieving much larger performance gains with each new generation. I wouldn't be at all surprised if they've met or surpassed Haswell levels of IPC with the A11.
Like a good neighbor jackbomb is there.
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 5:42 pm

windwalker wrote:
Glorious wrote:
You are saying the only difference between a "real" Mac with x86 and a "real" Mac with Apple ARM is that you won't be able to hackintosh for cheaper once Apple goes Apple ARM.

I haven't said that at all.


Literally, you did.

Here, let me help you:

windwalker wrote:
That's me. The only Mac I'm interested in is an ARM one.
I see no reason to buy a real Mac when I can hackintosh a cheap PC.


Do you understand deductive logic?

Or can you explain to me what you meant to say as opposed to what you actually did?

windwalker wrote:
Congratulations, you have vanquished the straw man.


Can you even explain what this strawman argument of mine even is?

Because, dude, I know you can't.

windwalker wrote:
Maybe to you. To me it's about price and quality, just like any other product. In the case of computing products performance is the largest part of quality.


And it'll just be faster, because?

Previously you said it "can be", now you are acting as if it is a given?

windwalker wrote:
x86 processors are not particularly useful for any task, they are general purpose.
x86 is the JavaScript of instruction sets: surpassed by most competitors and rendered wholly inadequate yet still widely used because of the cost of breaking the inertia.


Which architecture surpasses x86?

Put your cards up, brah.

windwalker wrote:
Dude, you have a serious problem with reality perception. Where did you get this silly notion that I won't be able to hackintosh any more?


Where are you going to buy an Apple ARM processor from? Currently, you can buy equivalent Intel processors+boards from basically anyone. You're not in the same situation with ARM, at all, for innumerable reasons.

Do you understand literally any of this stuff?

windwalker wrote:
That makes zero sense.
There is no reason, no business case and no market niche for an ARM Mac that is not significantly cheaper than an equivalent x86 model.


WHICH IS WHY YOU USE CHEAPER HACKINTOSHES???!?!?!?!? OMGWTFBBQ-WHO-THE-HECK-ARE-YOU?!!?!

DUDE THOSE ARE -YOUR- WORDS:

windwalker wrote:
I see no reason to buy a real Mac when I can hackintosh a cheap PC.


YOU ARE INSANE, AS I SAID FROM THE START BEFORE YOU EVEN POSTED ANYTHING.

windwalker wrote:
Apple doesn't have a huge markup on Macs. Just because PC makers have the profit margins of potato farmers doesn't make Apple cartoonish evil greedy bloodsuckers.


WHAT DIMENSION HAVE YOU COME FROM?

The Macintosh takes ~50% of ALL profits for the PC sector, year after year.

Macs constitute 5%-10% of all PCs.

YOU KNOW THIS IS TRUE, AS YOU HACKINTOSH "CHEAP PCS"

YOUR OWN WORDS, LET ME HELP YOU x2

windwalker wrote:
I see no reason to buy a real Mac when I can hackintosh a cheap PC


^ DUDE THAT GUY ABOVE WAS YOU THE WHOLE TIME!

windwalker wrote:
Just because other people have different preferences doesn't make them stupid or insane or brainwashed.


I'm not even anti-Apple, I've defended them plenty of times in the past.

You are just moon-barking insane and walking a complete contradiction.


EDIT: My head might explode, where do we find these people?!?

HOW ARE YOU EVEN REAL?
 
NovusBogus
Graphmaster Gerbil
Posts: 1408
Joined: Sun Jan 06, 2013 12:37 am

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 5:46 pm

Sometimes Motley Fool is insightful, but this is not one of those times. Setting aside the conspicuous lack of any actual sources, I question the author's understanding of how the ARM ecosystem actually works. But I guess it's good flamebait or whatever.

Anyway, here are my predictions:

-Apple will push their products to a high speed ARM implementation over the next few years, mostly because of the absolute-power thing. It will have a bunch of fast-ish cores, but still lose out to Intel on real world IPC.

-Apple will deal with the performance gap by optimizing their OS and in-house software products like Safari for their particular flavor of ARM, like what console makers and their software partners do. The fanboys will loudly trumpet cherry picked, context-free soundbytes as 'proof' that Apple's ARM is the most bestest thing ever, but no one but them will actually care for the same reasons that no one but Xbone/PS4 fanboys cares that Jaguar can run AAA console games.

-MacBook Pro sales will see a noticeable drop due to losing the buyers who installed Windows on top of the otherwise rather good hardware. Cook or his successor will rationalize this on the investor call by noting that the money saved by not having to pay Intel on the other sales more than makes up for the loss. Other product line sales will see no noticeable effect, and draw down over the next few years as the market gets increasingly saturated with 'good enough' legacy Apple hardware.
 
derFunkenstein
Gerbil God
Posts: 25427
Joined: Fri Feb 21, 2003 9:13 pm
Location: Comin' to you directly from the Mothership

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 8:34 pm

This thread got real entertaining thanks to the moonbats.
I do not understand what I do. For what I want to do I do not do, but what I hate I do.
Twittering away the day at @TVsBen
 
confusedpenguin
Gerbil Team Leader
Posts: 228
Joined: Tue Nov 15, 2011 12:50 am
Location: Potato State

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 10:05 pm

The way I understand it ARM processors are decent for performing one task at a time, and to do a different task they need to be switched to do a different task in software. Not a good way to multitask. For raw horsepower, I don't think ARM will be able to beat the x86 design, based on, well, the way ARM is fundamentally designed. https://www.youtube.com/watch?v=X4BxUiqWq8E
 
christos_thski
Gerbil
Posts: 50
Joined: Sat Sep 08, 2012 11:09 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 10:13 pm

windwalker wrote:
chuckula wrote:
It's WindWalker. He looks at the fact that x86 processors are actually useful for tasks that don't include viewing Apple-Approved content and picking from curated Apple-Approved ideas as the greatest thing that's wrong with the world today.

x86 processors are not particularly useful for any task, they are general purpose.
x86 is the JavaScript of instruction sets: surpassed by most competitors and rendered wholly inadequate yet still widely used because of the cost of breaking the inertia.


I remember when Apple zealots were making that argument back in 1994. They even had, like, graphs and ****. ALL SCIENTIFIC MAN.

Can't find the original ad on my old byte magazine stack, so here's a faithful recreation.

Image

I forget though. How did that turn out?
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 10:34 pm

confusedpenguin wrote:
The way I understand it ARM processors are decent for performing one task at a time, and to do a different task they need to be switched to do a different task in software. Not a good way to multitask. For raw horsepower, I don't think ARM will be able to beat the x86 design, based on, well, the way ARM is fundamentally designed. https://www.youtube.com/watch?v=X4BxUiqWq8E

I think you're a little misinformed here. The first half of that video is a dog's breakfast, and attempts to perpetuate RISC/CISC distinctions which are no longer true. Although x86's heritage is CISC, all modern x86 designs internally translate x86 instructions into RISC-like "micro-ops". In order to push clock speeds into the GHz range, you need to simplify the operations that the CPU cores do with each clock tick. Everybody does this now. The only remaining advantages of x86 ISA are that it reduces memory bandwidth requirements slightly (by acting as an on-the-fly object code compression algorithm for code that isn't in cache), and binary compatibility for existing x86-only applications. There's nothing inherent in the ARM architecture which locks it into "performing one task at a time".

Intel opened the door for ARM's rise when they failed to address the low-power-moderate-performance market in a timely manner when the mobile device revolution kicked into high gear. The current segmentation of ARM for mostly mobile and x86 for everything else is essentially an accident of tech industry history.
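A toy calculation makes the code-density point above concrete: variable-length x86 encodings often average fewer bytes per instruction than fixed 4-byte ARM encodings, which is the "on-the-fly compression" effect for code that misses the cache. The instruction mix below is invented purely for illustration, not measured from real binaries:

```python
# Hedged sketch: compare total fetch bytes for an invented instruction
# stream under variable-length (x86-style) vs fixed 4-byte (ARM-style)
# encodings. The lengths are illustrative assumptions, not real data.

def total_bytes(instruction_lengths):
    """Sum the encoded size of an instruction stream, in bytes."""
    return sum(instruction_lengths)

# Hypothetical mix: many short ALU ops (2-3 bytes on x86), a few longer
# memory/immediate forms (5-7 bytes).
x86_stream = [2, 3, 2, 5, 3, 2, 7, 3, 2, 2]
arm_stream = [4] * len(x86_stream)   # classic ARM: every instruction is 4 bytes

print(f"x86: {total_bytes(x86_stream)} bytes, ARM: {total_bytes(arm_stream)} bytes")
# With this (made-up) mix the x86 stream is denser; the trade-off is a
# more complex decoder, since each length must be determined sequentially.
```

The point isn't the exact numbers, just that density cuts instruction-fetch bandwidth at the cost of decode complexity.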

Edit: @christos_thski - Yeah, everything is essentially RISC at its core these days. So I guess you could say that RISC won, but Intel had the last laugh anyway in the desktop/server space by adopting the best ideas from both worlds. AMD also won a moral victory of sorts when they thwarted Intel's attempt to shift 64-bit CPUs to VLIW (Itanium), by convincing most of the industry to move to x86-64 instead.
Nostalgia isn't what it used to be.
 
chuckula
Minister of Gerbil Affairs
Posts: 2109
Joined: Wed Jan 23, 2008 9:18 pm
Location: Probably where I don't belong.

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 10:40 pm

christos_thski wrote:
windwalker wrote:
chuckula wrote:
It's WindWalker. He looks at the fact that x86 processors are actually useful for tasks that don't include viewing Apple-Approved content and picking from curated Apple-Approved ideas as the greatest thing that's wrong with the world today.

x86 processors are not particularly useful for any task, they are general purpose.
x86 is the JavaScript of instruction sets: surpassed by most competitors and rendered wholly inadequate yet still widely used because of the cost of breaking the inertia.


I remember when Apple zealots were making that argument back in 1994. They even had, like, graphs and ****. ALL SCIENTIFIC MAN.

I forget though. How did that turn out?


The funny point about what windwalker risibly refers to as "logic" is that he inevitably ends up committing heresy by calling Apple a bunch of idiots while pretending that he's their biggest fan.
4770K @ 4.7 GHz; 32GB DDR3-2133; Officially RX-560... that's right AMD you shills!; 512GB 840 Pro (2x); Fractal Define XL-R2; NZXT Kraken-X60
--Many thanks to the TR Forum for advice in getting it built.
 
christos_thski
Gerbil
Posts: 50
Joined: Sat Sep 08, 2012 11:09 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 10:44 pm

just brew it! wrote:
confusedpenguin wrote:
The way I understand it ARM processors are decent for performing one task at a time, and to do a different task they need to be switched to do a different task in software. Not a good way to multitask. For raw horsepower, I don't think ARM will be able to beat the x86 design, based on, well, the way ARM is fundamentally designed. https://www.youtube.com/watch?v=X4BxUiqWq8E

I think you're a little misinformed here. The first half of that video is a dog's breakfast, and attempts to perpetuate RISC/CISC distinctions which are no longer true. Although x86's heritage is CISC, all modern x86 designs internally translate x86 instructions into RISC-like "micro-ops". In order to push clock speeds into the GHz range, you need to simplify the operations that the CPU cores do with each clock tick. Everybody does this now. The only remaining advantages of x86 ISA are that it reduces memory bandwidth requirements slightly (by acting as an on-the-fly object code compression algorithm for code that isn't in cache), and binary compatibility for existing x86-only applications. There's nothing inherent in the ARM architecture which locks it into "performing one task at a time".

Intel opened the door for ARM's rise when they failed to address the low-power-moderate-performance market in a timely manner when the mobile device revolution kicked into high gear. The current segmentation of ARM for mostly mobile and x86 for everything else is essentially an accident of tech industry history.


I know you weren't responding to my post, but if I remember correctly, the pentium pro was already breaking down x86 instructions into "risc-like" microops while apple was playing up their imagined "x86 is doomed" record for the acolytes (which we're about to see a repetition of, if they ever switch to their own cpus).

Edit: you edited in a response as I was writing this, and we're basically in agreement. It's still funny though, as Apple (and to some degree IBM and Motorola) were turning a blind eye to the evolution of x86 technology and tilting at imaginary windmills. ;-)
Last edited by christos_thski on Tue May 29, 2018 10:47 pm, edited 1 time in total.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 10:45 pm

christos_thski wrote:
I know you weren't responding to my post, but if I remember correctly, the pentium pro was already breaking down x86 instructions into "risc-like" microops while apple was playing up their imagined "x86 is doomed" record for the acolytes (which we're about to see a repetition of, if they ever switch to their own cpus).

Heh. I edited my post just as you were posting this. :lol:

Yes, CPU tech has a complex history.
Nostalgia isn't what it used to be.
 
christos_thski
Gerbil
Posts: 50
Joined: Sat Sep 08, 2012 11:09 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 10:58 pm

just brew it! wrote:
christos_thski wrote:
I know you weren't responding to my post, but if I remember correctly, the pentium pro was already breaking down x86 instructions into "risc-like" microops while apple was playing up their imagined "x86 is doomed" record for the acolytes (which we're about to see a repetition of, if they ever switch to their own cpus).

Heh. I edited my post just as you were posting this. :lol:

Yes, CPU tech has a complex history.


Indeed it does. Trotting out the tired inaccurate "x86 cisc" argument, as they do, has become some kind of shibboleth for people who for one reason or another dream of PCs going the way of the dodo (from powerpc to the magical "cell" cpu to arm-on-desktop...). It's just.... tiresome. :)

Remember when ... linux on playstation 3 was going to be the end of home PCs? :lol:
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 11:06 pm

Some other related thoughts...

I do not consider myself to be an Apple fan, but I use a MacBook Pro at my day job since that's what the business unit I work in has standardized on.

On the plus side, it has a fantastic internal display, and the battery life is great.

On the other hand, the keyboard sucks, as does the fact that you can't upgrade the internal RAM or SSD beyond the (limited) stock configuration. And while it is aesthetically pretty, IMO it's "form over function" -- the sharp edges of the meticulously crafted textured aluminum case have cut holes in the backpack I use to carry it around, and the slim form factor means the tiny fan(s) get obnoxiously loud if I put more than a moderate CPU load on the thing.

I guess the non-replaceable battery isn't a big deal for my current use case since IT will just replace the whole laptop for me if the battery starts to go south. I hear the newer MBP keyboards suck even worse though, so I'm going to put that off for as long as possible.

Given the price, I certainly wouldn't pay for one of them with my own hard-earned money.

An ARM-based MBP would be a complete non-starter for me (as well as many of my co-workers), as a lot of us need to run x86 VMs to do our jobs...
Nostalgia isn't what it used to be.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 11:09 pm

christos_thski wrote:
Indeed it does. Trotting out the tired inaccurate "x86 cisc" argument, as they do, has become some kind of shibboleth for people who for one reason or another dream of PCs going the way of the dodo (from powerpc to the magical "cell" cpu to arm-on-desktop...). It's just.... tiresome. :)

POWER is still a thing, though it doesn't make the tech headlines much any more. Still used in aerospace, and (with POWER9) they're making another run at the datacenter. No aspirations to return to the desktop AFAIK. (Disclaimer: I currently work for IBM... nowhere near the CPU division though.)

christos_thski wrote:
Remember when ... linux on playstation 3 was going to be the end of home PCs? :lol:

Or when SteamOS/SteamBox was going to be the end of Windows gaming? :lol: (At least that was still x86...)
Nostalgia isn't what it used to be.
 
the
Gerbil Elite
Posts: 941
Joined: Tue Jun 29, 2010 2:26 am

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 11:36 pm

NovusBogus wrote:
Sometimes Motley Fool is insightful, but this is not one of those times. Setting aside the conspicuous lack of any actual sources, I question the author's understanding of how the ARM ecosystem actually works. But I guess it's good flamebait or whatever.

Anyway, here are my predictions:

-Apple will push their products to a high speed ARM implementation over the next few years, mostly because of the absolute-power thing. It will have a bunch of fast-ish cores, but still lose out to Intel on real world IPC.


I think Apple will actually surpass Intel in IPC but Intel will continue to have higher clocks, especially on the high end. Intel will flirt with 5.0 GHz turbos in desktops once things with 10 nm settle down. At base clocks on high core count chips (versus Core i9, say), Apple will be surprisingly competitive, as Intel doesn't flex their clock speed potential there due to power consumption/thermal constraints.

NovusBogus wrote:
-MacBook Pro sales will see a noticeable drop due to losing the buyers who installed Windows on top of the otherwise rather good hardware. Cook or his successor will rationalize this on the investor call by noting that the money saved by not having to pay Intel on the other sales more than makes up for the loss. Other product line sales will see no noticeable effect, and draw down over the next few years as the market gets increasingly saturated with 'good enough' legacy Apple hardware.


For better or worse, there is Windows for ARM. Those that need to dual boot will likely be able to continue.

The bigger issue will be virtualization which requires reworking hypervisors to the new platform.
Dual Opteron 6376, 96 GB DDR3, Asus KGPE-D16, GTX 970
Mac Pro Dual Xeon E5645, 48 GB DDR3, GTX 770
Core i7 [email protected] Ghz, 32 GB DDR3, GA-X79-UP5-Wifi
Core i7 [email protected] Ghz, 16 GB DDR3, GTX 970, GA-X68XP-UD4
 
HERETIC
Gerbil XP
Posts: 488
Joined: Sun Aug 24, 2014 4:10 am

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Tue May 29, 2018 11:46 pm

Captain Ned wrote:
windwalker wrote:
Glorious wrote:
That is entirely about willpower and personal ethics, not instruction sets.

Maybe to you. To me it's about price and quality, just like any other product. In the case of computing products performance is the largest part of quality.

As one who has been exposed to and consciously rejected the RDF since the late 1970s and the Apple ][ and IIe, it seems that the average Apple-bot of today assumes quality arises from price.


YUP. It's a combination of:
1. K.I.S.S.
2. Make it pretty.
3. Con purchasers into believing they're paying a high price because they're getting the best.

And they still can't get the most basic of things right: a keyboard that just works...
 
the
Gerbil Elite
Posts: 941
Joined: Tue Jun 29, 2010 2:26 am

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 12:18 am

just brew it! wrote:
confusedpenguin wrote:
The way I understand it ARM processors are decent for performing one task at a time, and to do a different task they need to be switched to do a different task in software. Not a good way to multitask. For raw horsepower, I don't think ARM will be able to beat the x86 design, based on, well, the way ARM is fundamentally designed. https://www.youtube.com/watch?v=X4BxUiqWq8E

I think you're a little misinformed here. The first half of that video is a dog's breakfast, and attempts to perpetuate RISC/CISC distinctions which are no longer true. Although x86's heritage is CISC, all modern x86 designs internally translate x86 instructions into RISC-like "micro-ops". In order to push clock speeds into the GHz range, you need to simplify the operations that the CPU cores do with each clock tick. Everybody does this now. The only remaining advantages of x86 ISA are that it reduces memory bandwidth requirements slightly (by acting as an on-the-fly object code compression algorithm for code that isn't in cache), and binary compatibility for existing x86-only applications. There's nothing inherent in the ARM architecture which locks it into "performing one task at a time".


The video is bad, but the idea that ARM gets to conserve energy with a simpler instruction decoder holds up. Having aligned instructions of (generally) the same length makes fetch and decode pretty simple. Raw performance is still based upon implementation but considering that we are currently limited by power consumption and thermals, this is an edge that ARM developers can leverage. OoO logic on the RISC side gets the benefit of fewer instruction formats which makes finding register dependencies conceptually easier. Again with the ease of implementation, the window is open to lower power consumption.

I would argue that the size of x86 code varies heavily with bloat creeping in due to newer ISA extensions increasing instruction size. This is compounded now that there are a handful of three or more register operands (FMA for example).
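The fetch/decode asymmetry described above can be sketched with a toy boundary finder. The encodings here are invented stand-ins, not any real ISA format: the point is only that fixed-width boundaries are all known up front, while variable-length boundaries form a sequential dependency chain:

```python
# Hedged sketch of why fixed-width decode is simpler. With 4-byte
# instructions every boundary is known immediately; with variable-length
# encodings, instruction N's start can't be found until 0..N-1 are decoded.
# Toy assumption: in the variable scheme, the first byte of each
# instruction encodes its own length.

def fixed_width_boundaries(code: bytes, width: int = 4):
    # All start offsets are independent -- a wide decoder can hand the
    # fetch window to parallel decode slots at once.
    return list(range(0, len(code), width))

def variable_width_boundaries(code: bytes):
    # Each boundary depends on decoding the previous instruction's
    # length first -- the serial chain real x86 decoders must break.
    offsets, pos = [], 0
    while pos < len(code):
        offsets.append(pos)
        pos += code[pos]  # toy length prefix
    return offsets

print(fixed_width_boundaries(bytes(16)))                          # [0, 4, 8, 12]
print(variable_width_boundaries(bytes([2, 0, 3, 0, 0, 5, 0, 0, 0, 0])))  # [0, 2, 5]
```

Real x86 decoders attack this with length-predecode bits in the cache and brute-force speculative decode at every byte offset, which is exactly where the extra transistors (and power) go.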
Dual Opteron 6376, 96 GB DDR3, Asus KGPE-D16, GTX 970
Mac Pro Dual Xeon E5645, 48 GB DDR3, GTX 770
Core i7 [email protected] Ghz, 32 GB DDR3, GA-X79-UP5-Wifi
Core i7 [email protected] Ghz, 16 GB DDR3, GTX 970, GA-X68XP-UD4
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 1:04 am

the wrote:
The video is bad, but the idea that ARM gets to conserve energy with a simpler instruction decoder holds up. Having aligned instructions of (generally) the same length makes fetch and decode pretty simple.

Given the complexity of modern high performance execution units, the hit from the instruction decoder is likely lost in the noise now unless you're talking about ultra-low power stuff (e.g. early Atom cores) where we're not dealing with OoO. And decoder complexity is partially offset by the reduced memory bandwidth needs to fetch instructions that don't hit the cache.

the wrote:
Raw performance is still based upon implementation but considering that we are currently limited by power consumption and thermals, this is an edge that ARM developers can leverage. OoO logic on the RISC side gets the benefit of fewer instruction formats which makes finding register dependencies conceptually easier. Again with the ease of implementation, the window is open to lower power consumption.

x86 instruction decode is a solved problem for Intel and AMD (and VIA I guess, since they're still making x86 CPUs... but are no longer relevant).

the wrote:
I would argue that the size of x86 code varies heavily with bloat creeping in due to newer ISA extensions increasing instruction size. This is compounded now that there are a handful of three or more register operands (FMA for example).

I would argue that those newer ISA extensions are still a big net win from a performance/watt perspective, and that ARM implementations aren't likely to implement them more efficiently; at best it'll be a wash, given equivalent process tech. Implementing FMA and similar instructions isn't rocket science... doing it efficiently poses similar challenges regardless of ISA.
Nostalgia isn't what it used to be.
 
Pancake
Gerbil First Class
Posts: 161
Joined: Mon Sep 19, 2011 2:04 am

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 2:25 am

confusedpenguin wrote:
The way I understand it ARM processors are decent for performing one task at a time, and to do a different task they need to be switched to do a different task in software. Not a good way to multitask. For raw horsepower, I don't think ARM will be able to beat the x86 design, based on, well, the way ARM is fundamentally designed. https://www.youtube.com/watch?v=X4BxUiqWq8E


ALL pre-emptive multitasking happens as a result of the operating system scheduler (software) deciding when a certain task gets to run for a bit. It's all software. In the old days there was one CPU executing one thing at a time. You would let a task run for, say, 10 milliseconds. A timer interrupt would occur, handing execution back to the operating system, which could then switch to a different task (a context switch). These days there are multiple cores, with some CPU implementations (ARM and x86) having simultaneous multi-threading, providing a larger number of things that can run at the same time.

The other somewhat chaotic model, used in the old Mac OS, was cooperative multi-tasking, where a task ran for as long as it wanted until it decided to hand control back to the operating system, which could then pass execution on to another task. That was a really rubbish way of doing things, because a buggy or crashy app could hang your entire system and lose all your work.

But it's all software (with some help from hardware timer interrupts).

Why, as a young boy I implemented pre-emptive multitasking on my Commodore 64 allowing me to run multiple BASIC (or machine language) programs at the same time - each getting an even share of the CPU. I boggled at why Apple could only come up with such a rubbish multi-tasking model on their much more powerful hardware. If you know anything about software and hardware you'll see Apple has a long, long history of half-baked rubbish solutions which still attract adoring fans.
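The timer-driven context switching described above can be modelled in miniature. This is an illustrative sketch, not OS code: generators stand in for tasks, and a counted "tick" stands in for the hardware timer interrupt that forces the switch in a real kernel:

```python
# Minimal round-robin scheduler sketch. Each task yields once per unit of
# work; the scheduler (not the task) decides when to switch -- the essence
# of pre-emption, here simulated with a tick budget per time slice.
from collections import deque

def task(name, steps):
    """A toy task: each yield marks one unit of work done."""
    for i in range(steps):
        yield f"{name} step {i}"

def round_robin(tasks, slice_len=2):
    """Run each task for slice_len ticks, then context-switch to the next."""
    ready = deque(tasks)
    trace = []
    while ready:
        current = ready.popleft()
        for _ in range(slice_len):      # the 'timer' fires after slice_len ticks
            try:
                trace.append(next(current))
            except StopIteration:       # task finished early: drop it
                break
        else:
            ready.append(current)       # slice expired: back of the queue
    return trace

trace = round_robin([task("A", 3), task("B", 3)])
print(trace)
# Each task advances two steps before the other runs again, regardless of
# whether the tasks themselves ever volunteer to give up the CPU.
```

Cooperative multitasking is the degenerate case where the switch happens only when a task chooses to yield, which is why one stuck task could hang the whole classic Mac OS.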
 
ptsant
Gerbil XP
Posts: 397
Joined: Mon Oct 05, 2009 12:45 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 3:03 am

windwalker wrote:
Glorious wrote:
there'd be a non-trivial portion of basically crazy people who would buy the darn thing just -BECAUSE- it was ARM.

That's me. The only Mac I'm interested in is an ARM one.
I see no reason to buy a real Mac when I can hackintosh a cheap PC.
The ARM Mac can be both price and performance competitive with Intel PCs. If wanting that makes me crazy I don't want any of your sanity.


We'll see about performance. Maybe it will be competitive with the low end. Maybe.

But price? No. There is no way you'll be getting a cheap Apple product. Apple does all this effort to increase margins, not to sell cheaper. Apple never competes on price.
 
ptsant
Gerbil XP
Posts: 397
Joined: Mon Oct 05, 2009 12:45 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 3:13 am

NovusBogus wrote:
-Apple will push their products to a high speed ARM implementation over the next few years, mostly because of the absolute-power thing. It will have a bunch of fast-ish cores, but still lose out to Intel on real world IPC.

-Apple will deal with the performance gap by optimizing their OS and in-house software products like Safari for their particular flavor of ARM, like what console makers and their software partners do. The fanboys will loudly trumpet cherry picked, context-free soundbytes as 'proof' that Apple's ARM is the most bestest thing ever, but no one but them will actually care for the same reasons that no one but Xbone/PS4 fanboys cares that Jaguar can run AAA console games.

-MacBook Pro sales will see a noticeable drop due to losing the buyers who installed Windows on top of the otherwise rather good hardware. Cook or his successor will rationalize this on the investor call by noting that the money saved by not having to pay Intel on the other sales more than makes up for the loss. Other product line sales will see no noticeable effect, and draw down over the next few years as the market gets increasingly saturated with 'good enough' legacy Apple hardware.


I agree with your predictions. Apple can be competitive on (selected) benchmarks and (selected) use cases by optimizing the hell out of the vertically integrated software stack. From the CPU to the libraries, everything can be tuned to the maximum. When the typical differences between similar CPUs (as constrained by process, cost and power) are small fractions, accumulated gains from software optimization can easily make up for any hardware deficiencies. The proof is in the console market, as you say.

I would also like to note that I have a friend who works at Apple and codes a fundamental part of the OS core. His code already runs on everything that Apple makes, from the Apple Watch to the iPhone to the Mac Pro. So, Apple is definitely able to transition very big chunks of the OS painlessly to ARM. In fact, the differences between OS X and iOS on a tablet are smaller than we think. That doesn't mean that Apple WILL migrate everything to ARM, but they certainly want to be able to play that card, even if it only serves a small part of their product line or as a negotiating strategy against Intel.
 
Pancake
Gerbil First Class
Posts: 161
Joined: Mon Sep 19, 2011 2:04 am

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 3:43 am

ptsant wrote:
NovusBogus wrote:
-Apple will push their products to a high speed ARM implementation over the next few years, mostly because of the absolute-power thing. It will have a bunch of fast-ish cores, but still lose out to Intel on real world IPC.

-Apple will deal with the performance gap by optimizing their OS and in-house software products like Safari for their particular flavor of ARM, like what console makers and their software partners do. The fanboys will loudly trumpet cherry-picked, context-free soundbites as 'proof' that Apple's ARM is the most bestest thing ever, but no one but them will actually care, for the same reasons that no one but Xbone/PS4 fanboys cares that Jaguar can run AAA console games.

-Macbook Pro sales will see a noticeable drop due to losing the buyers who installed Windows on top of the otherwise rather good hardware. Cook or his successor will rationalize this on the investor call by noting that the money saved by not having to pay Intel on the other sales more than makes up for the loss. Other product lines will see no noticeable effect, and will draw down over the next few years as the market gets increasingly saturated with 'good enough' legacy Apple hardware.


I agree with your predictions. Apple can be competitive on (selected) benchmarks and (selected) use cases by optimizing the hell out of the vertically integrated software stack. From the CPU to the libraries, everything can be tuned to the maximum. When the typical differences between similar CPUs (as constrained by process, cost and power) are small fractions, accumulated gains from software optimization can easily make up for any hardware deficiencies. The proof is in the console market, as you say.

I would also like to note that I have a friend who works at Apple and codes a fundamental part of the OS core. His code already runs on everything that Apple makes, from the Apple Watch to the iPhone to the Mac Pro. So, Apple is definitely able to transition very big chunks of the OS painlessly to ARM. In fact, the differences between OS X and iOS on a tablet are smaller than we think. That doesn't mean that Apple WILL migrate everything to ARM, but they certainly want to be able to play that card, even if it only serves a small part of their product line or as a negotiating strategy against Intel.


I predict the opposite. Apple ARM will beat x86 senseless on general-purpose code. Not through tight optimisation of some software stack, but by straight-out beating x86 on general-purpose mixed integer/FP code like what I might write and use: the general-purpose data structures and algorithms used in web browsers, compilers, spreadsheets and word processors. Contrary to what you might wish for, the only place x86 can avoid a hiding is where its extremely wide FPUs run vector code for audio/video processing or other quite specific types of computation. x86 will absolutely have its teeth kicked out. It might be the required stimulus to reignite a new CPU war (to the benefit of us users).
 
ptsant
Gerbil XP
Posts: 397
Joined: Mon Oct 05, 2009 12:45 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 6:16 am

Pancake wrote:
I predict the opposite. Apple ARM will beat x86 senseless on general-purpose code. Not through tight optimisation of some software stack, but by straight-out beating x86 on general-purpose mixed integer/FP code like what I might write and use: the general-purpose data structures and algorithms used in web browsers, compilers, spreadsheets and word processors. Contrary to what you might wish for, the only place x86 can avoid a hiding is where its extremely wide FPUs run vector code for audio/video processing or other quite specific types of computation. x86 will absolutely have its teeth kicked out. It might be the required stimulus to reignite a new CPU war (to the benefit of us users).


Well, I don't claim to have the technical knowledge, but I don't see how ARM is intrinsically better. I suppose it comes down to whether Apple can outdo decades of x86 experience, talent and huge patent libraries. And even though I understand that Apple is ridiculously rich, so is Intel.

So, to refine my prediction a bit, I think it is possible that Apple wins in raw performance over a big chunk of the CPU space (from low-power to high-perf) after 2-3 iterations if they make a series of brilliant decisions. But certainly not with the first iteration of the chip.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 6:23 am

Pancake wrote:
I predict the opposite. Apple ARM will beat x86 senseless on general-purpose code. Not through tight optimisation of some software stack, but by straight-out beating x86 on general-purpose mixed integer/FP code like what I might write and use: the general-purpose data structures and algorithms used in web browsers, compilers, spreadsheets and word processors. Contrary to what you might wish for, the only place x86 can avoid a hiding is where its extremely wide FPUs run vector code for audio/video processing or other quite specific types of computation. x86 will absolutely have its teeth kicked out. It might be the required stimulus to reignite a new CPU war (to the benefit of us users).

No idea where you're getting this from. As already noted, x86 is RISC under the hood now; ARM and x86 are using the same tricks to squeeze out additional IPC. I expect things to get increasingly competitive, but there's no reason to expect that ARM will "beat x86 senseless".
Nostalgia isn't what it used to be.
 
the
Gerbil Elite
Posts: 941
Joined: Tue Jun 29, 2010 2:26 am

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 6:47 am

just brew it! wrote:
the wrote:
The video is bad, but the idea is that ARM gets to conserve energy by having a simpler instruction decoder. Having aligned instructions of (generally) the same length makes fetch and decode pretty simple.

Given the complexity of modern high performance execution units, the hit from the instruction decoder is likely lost in the noise now unless you're talking about ultra-low power stuff (e.g. early Atom cores) where we're not dealing with OoO. And decoder complexity is partially offset by the reduced memory bandwidth needed to fetch instructions that don't hit the cache.


There are several reasons why Atom lost in the ultra low power market, and this is indeed one of them. The point here is that many ARM instructions fly straight through their decoders: there doesn't need to be a difference between the micro-op and the incoming instruction. For ARM, decoders are still necessary to handle instructions that are implemented via microcode. ARM being RISC didn't inherently help on the performance side, but it certainly helped keep power consumption in check to dominate in that area.
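To make the fetch/decode difference concrete, here's a toy Python sketch. The encodings are entirely made up (not real ARM or x86 formats), but they show why fixed-width instruction boundaries fall out of simple index arithmetic, while variable-length boundaries force a serial scan:

```python
# Toy illustration: hypothetical encodings, not real ARM or x86 formats.

def fixed_width_boundaries(code: bytes, width: int = 4) -> list[int]:
    # Fixed-width ISA: every instruction boundary is known from the index
    # alone, so a wide front end can start decoding several in parallel.
    return list(range(0, len(code), width))

def variable_length_boundaries(code: bytes) -> list[int]:
    # Variable-length ISA: the length of each instruction depends on its
    # first byte (a stand-in for x86 prefix/ModRM/SIB parsing), so boundary
    # N+1 isn't known until instruction N has been at least partly decoded.
    length_of = {0x01: 1, 0x02: 2, 0x03: 3, 0x05: 5}  # toy opcode -> length
    boundaries, i = [], 0
    while i < len(code):
        boundaries.append(i)
        i += length_of[code[i]]
    return boundaries

print(fixed_width_boundaries(bytes(12)))   # [0, 4, 8]
print(variable_length_boundaries(bytes([0x02, 0x00, 0x05, 0, 0, 0, 0, 0x01])))  # [0, 2, 7]
```

Real x86 front ends claw back parallelism with tricks like predecode/boundary-marking bits, which is part of the extra hardware cost being discussed.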

The Core series does have one major difference from Atom in this area: the decoded instruction cache. This small cache permits the system to bypass the decoder logic, which increases performance and decreases power consumption. The Core series also has the benefit of doing the reverse: it can combine two instructions into a fused micro-op.
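As a rough sketch of what that cache buys (a toy model, not Intel's actual mechanism): decode each address once, then serve repeat fetches of a hot loop from the cache while the decode logic stays idle.

```python
# Toy model of a decoded micro-op cache; not Intel's actual mechanism.

decode_count = 0  # stands in for the power/latency cost of the decoder

def decode(raw):
    global decode_count
    decode_count += 1
    return ("uop", raw)  # pretend translation into micro-ops

uop_cache = {}

def fetch(addr, raw):
    if addr not in uop_cache:        # miss: pay for a trip through the decoder
        uop_cache[addr] = decode(raw)
    return uop_cache[addr]           # hit: the decode logic stays idle

# A hot loop executing the same three instructions 100 times:
for _ in range(100):
    for addr in (0x10, 0x14, 0x18):
        fetch(addr, addr)

print(decode_count)  # 3; the other 297 fetches skipped the decoder entirely
```

Since hot loops dominate most workloads, even a small cache like this keeps the decoder idle the vast majority of the time.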

I would differ in that the execution units in the backend are relatively simple compared to the OoO logic and dispatching that is done in the front end. The front end is all about keeping those execution units busy by figuring out the optimal order in which operations are performed. This also includes breaking up a single instruction into several micro-ops that are executed by the backend.

just brew it! wrote:
the wrote:
Raw performance is still based upon implementation but considering that we are currently limited by power consumption and thermals, this is an edge that ARM developers can leverage. OoO logic on the RISC side gets the benefit of fewer instruction formats which makes finding register dependencies conceptually easier. Again with the ease of implementation, the window is open to lower power consumption.

x86 instruction decode is a solved problem for Intel and AMD (and VIA I guess, since they're still making x86 CPUs... but are no longer relevant).


Intel's decoders are not static. One factor is necessity: Intel is continually adding new instructions, and the decoders have to adapt to these changes. However, Intel is still making more changes beyond that. Previously mentioned are instruction fusion and the decoded micro-op cache. Decoders also have to adapt to the execution hardware (e.g. on the AMD side, one 256-bit AVX instruction is broken into two 128-bit wide micro-ops). There has at least been research into fusing multiple 128-bit operations into larger 256-bit or 512-bit operations for execution as well, though I haven't heard of this actually being implemented in hardware.

AMD also tried to solve the decoder problem differently with their Bulldozer family. That didn't turn out well, but it highlights that the best way to do decode is still being explored.

just brew it! wrote:
the wrote:
I would argue that the size of x86 code varies heavily, with bloat creeping in due to newer ISA extensions increasing instruction size. This is compounded now that there are a handful of instructions with three or more register operands (FMA, for example).

I would argue that those newer ISA extensions are still a big net win from a performance/watt perspective, and that ARM implementations aren't likely to implement them more efficiently; at best it'll be a wash, given equivalent process tech. Implementing FMA and similar instructions isn't rocket science... doing it efficiently poses similar challenges regardless of ISA.


I generally agree here, as the execution side of things is a narrower context of what is being done. The main efficiency to be gained on the execution side is being able to gate the unused part of operations (i.e. only half of a 256-bit unit needs to be active for 128-bit operations). Both x86 and ARM tend to do this, but as SIMD size grows, this will be increasingly important. We have yet to see an ARM SVE implementation in the wild. However, it is the front end that dictates overall utilization of these units. I would argue that this has a greater impact on performance/watt than what can be extracted purely from the execution units in the backend.

The other factor here is that these wider SIMD extensions also put greater pressure on the cache, in terms of both bandwidth and capacity. If the cache can't keep up, the wide execution units can start thrashing power states, which hurts performance/watt.
Dual Opteron 6376, 96 GB DDR3, Asus KGPE-D16, GTX 970
Mac Pro Dual Xeon E5645, 48 GB DDR3, GTX 770
Core i7 [email protected] Ghz, 32 GB DDR3, GA-X79-UP5-Wifi
Core i7 [email protected] Ghz, 16 GB DDR3, GTX 970, GA-X68XP-UD4
 
Glorious
Gerbilus Supremus
Posts: 12343
Joined: Tue Aug 27, 2002 6:35 pm

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 7:43 am

the wrote:
There are several reasons why Atom lost in the ultra low power market, and this is indeed one of them. The point here is that many ARM instructions fly straight through their decoders: there doesn't need to be a difference between the micro-op and the incoming instruction. For ARM, decoders are still necessary to handle instructions that are implemented via microcode. ARM being RISC didn't inherently help on the performance side, but it certainly helped keep power consumption in check to dominate in that area.


No, the biggest reason was that they were paired with chipsets (which were on ancient process nodes as normal, with no real power management, as normal, etc... all of which was typical and largely fine for desktops but a huge problem for mobile) and other things that literally drew more power than the processors.

Mobile and ultra-low power aren't just the CPU, they are -EVERYTHING-. You *HAVE* to focus on the entire design, from the display, to the support chips, to the support discrete electronics, to the software stack at ALL levels.

Apple does that, they do it extremely well, and they have been doing it for a very, very long time.

Intel, well, they didn't even do that for their own support chips, at least initially, and they never really could for the whole device, and they didn't have much to do with software either.

Apple, again, controls ALL of that.

^ That is the biggest reason BY far.

I mean, the first Atom desktops had like <5w CPUs and like ~20w northbridges and crazy stuff like that.

the wrote:
The Core series does have one major difference between Atom in this area: decoded instruction cache. This small cache permits the system to by-pass the decoder logic to increase performance and decrease power consumption. The Core series also has the benefit of doing the reverse: it can combine two instructions into a fused micro-op.


Uh, there are a lot of differences...? They are completely different implementations of the ISA.

Which is sort of the point I (and JBI) have been trying to pound into your skull for years: to a large extent, the ISA is just window dressing. What's actually going on in the shop is largely what matters.

*yes, yes, the window dressing has effects. But, as we also keep trying to tell you, the irregularity/complexity of the instruction format for x86 isn't just a drawback. CISC was designed like that to save memory, and by saving memory we save bandwidth and trips to memory.

That has significant power implications in the present day.
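The density trade can be put in toy numbers (the byte counts below are invented for illustration, not real x86 or ARM encodings): give common operations short encodings and the same instruction stream takes fewer bytes, which means fewer instruction-cache lines and less fetch bandwidth.

```python
# Invented byte counts for illustration; not real x86 or ARM encodings.

# CISC-style: common operations get short encodings.
variable_len = {"push": 1, "pop": 1, "mov_rr": 2, "add_ri": 3, "load": 4, "call": 5}

program = ["push", "mov_rr", "load", "add_ri", "mov_rr", "pop", "call"]

fixed_bytes = 4 * len(program)                       # RISC-style: 4 bytes each
var_bytes = sum(variable_len[op] for op in program)  # CISC-style total

print(fixed_bytes, var_bytes)  # 28 18
```

Scaled up to real code footprints, those saved bytes are saved cache capacity and saved memory traffic, which is where the power implication comes from.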

You routinely ignore this to rail about how x86 MUST BE DECODED!!!!!!!

Look, I know you're obsessed with *JUST* all this CPU blather of yours, but they just aren't the only thing that matters when it comes to the power draw of a device. These days, they increasingly aren't even the most important part: when Linux finally mainlined the SATA link power-management stuff in 4.15, that was a 25% power savings at idle.

https://patchwork.kernel.org/patch/9952739/

NOT. JUST. THE. CPU. DECODE. LOGIC.

ARGGHHHHHHHHHHHHH

the wrote:
I would differ in that the execution units in the backend are relatively simple compared to the OoO logic and dispatching that is done in the front end. The front end is all about keeping those execution units busy by figuring out the most optimal order operations are performed. This also includes breaking up a single instruction into several micro-ops that are executed by the backend.


The point is that these things are all integrally wrapped together, and the decode logic is essentially noise as JBI discussed.

The decode logic in the Pentium Pro, a nearly QUARTER OF A CENTURY OLD PROCESSOR, was 40% of the die. That was on an 800nm process (check my zeros, I think that's right; it was expressed in microns[!!!] back then)!

https://arstechnica.com/features/2004/07/pentium-1/7/

By the time of the Pentium 4, it was "well under 10%"

https://arstechnica.com/features/2004/07/pentium-1/2/

^ that article, itself, is nearly 15 years old!

This is not an issue unless you are in very, very low power designs. Basically sub-watt. We've talked about this before, and you just insist you are right and I'm wrong, except that I continually cite all sorts of evidence and research and you do nothing but reiterate your own personal feelings.

Well, I don't really care about your personal feelings, and since I've caught you making significant factual mistakes in these sorts of claims before, I really just don't think you're the expert you seem to think you are.

the wrote:
Intel's decoders are not static. One factor is necessity as Intel is continually adding new instructions which is a given that the decoders have to adapt to these changes. However, Intel is still making more changes. Previously mentioned are instruction fusion and the decoded micro-op cache. Decoders also have to adapt to the execution hardware (ie. one 256 bit AVX instruction is broken into two 128 bit wide micro-ops on the AMD side for example). There has at least been research into fusing multiple 128 bit operations into larger 256 bit or 512 bit operations for execution as well, though I haven't heard of this actually being implemented in hardware.


That's not even remotely what JBI said or implied.

He was pointing out that they've been doing it for well over 25 years (the original Pentium had decode then, ARM has decoding now; this isn't as ridiculously Manichean as you propose). So, yes, it's a solved problem, because despite people saying that x86 was a dead-end in the late 80s and early 90s, it's now basically the ubiquitous architecture for non-mobile computers. Die? Dude, it murdered virtually everything else. All those competitors, plenty of which were RISC, are now either on life support or hiding in the hills in extreme niche applications.

ARM was originally a desktop processor. Apple basically bought into it and focused everything about that ecosystem towards mobile. It wasn't because ARM was magically better; the only part of "ARM" that really contributed, sui generis, was the simplicity (primarily for cost reasons) and its lack of various cruft at the time.

Now, of course, there are all sorts of cruft and other complexities. We have profiles for the architectures, semi-native bytecode execution, "thumb" instructions, etc...
 
leor
Maximum Gerbil
Posts: 4878
Joined: Wed Dec 11, 2002 6:34 pm
Location: NYC
Contact:

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 12:17 pm

I'm not an expert (or a doctor), but given that physics and thermal constraints are the current bottleneck for getting faster chips, my guess would be that if Apple REALLY wanted to produce a competitive chip that was better than Intel in SOME things and competitive in others, they could do it.
 
Buub
Maximum Gerbil
Posts: 4969
Joined: Sat Nov 09, 2002 11:59 pm
Location: Seattle, WA
Contact:

Re: How Apple Dethroned Intel As the World's Most Innovative Chipmaker

Wed May 30, 2018 1:18 pm

Pancake wrote:
The other somewhat chaotic model used in the old Mac OS was cooperative multi-tasking where a task ran for as long as it wanted until it decided to hand control back to the operating system which could then decide to pass execution on to another task. Which was a really rubbish way of doing things because a buggy or crashy app could hang your entire system and lose all your work.

That would describe Windows 3.1 and earlier, as well.

Windows 95 and such had preemptive multitasking, but didn't do memory protection very well, which is why it was still a lot more "crashy" than Windows NT and newer.
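The cooperative model Pancake describes can be sketched in a few lines of Python, using generators as tasks. The scheduler only regains control at a task's yield points, so a task that never yields hangs every other task, which is exactly the old Mac OS / Windows 3.x failure mode:

```python
def task(name, steps):
    # A cooperative task: does a bit of work, then voluntarily yields.
    for i in range(steps):
        print(f"{name}: step {i}")
        yield  # hand control back to the scheduler

def run(tasks):
    # Round-robin scheduler: advance each task until it yields or finishes.
    queue = list(tasks)
    while queue:
        t = queue.pop(0)
        try:
            next(t)
            queue.append(t)   # it yielded; put it back in the rotation
        except StopIteration:
            pass              # it finished; drop it

run([task("A", 2), task("B", 3)])
# Prints, interleaved: A: step 0 / B: step 0 / A: step 1 / B: step 1 / B: step 2
```

Replace the `yield` in one task with an infinite busy loop and `run` never gets control back, taking every other task down with it; preemptive kernels avoid this by interrupting tasks on a timer instead of waiting for them to cooperate.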
