I've always thought of the PC as somewhat less efficient than other platforms. For example, I've always wondered how a system like the PlayStation 1 could make do with a 'slow' 33MHz CPU and still push games such as Gran Turismo, with all those pretty (for the time) car models. Back then, a PC could only run those kinds of games on a much faster Pentium at, say, 166MHz.
Why? Because the PC used to do all the work on the CPU, with only very minor help (sound, for instance) from other components. Only in the early 90s (or late 80s at best) did we get simple things like not having the CPU handle every little disk transfer and sound buffer fill - an advancement that came courtesy of DMA.
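To make the DMA point concrete, here's a toy back-of-envelope model (purely illustrative numbers, not real hardware timings): with programmed I/O the CPU spends a cycle on every byte it moves, while with DMA it pays only a small setup cost and the transfer overlaps with its other work.

```python
def pio_cycles(bytes_to_move, work_cycles):
    # Programmed I/O: the CPU copies every byte itself
    # (assume 1 cycle per byte), then does its real work.
    return bytes_to_move * 1 + work_cycles

def dma_cycles(bytes_to_move, work_cycles, setup=10):
    # DMA: the CPU pays a small setup cost, then the transfer
    # runs in parallel with the CPU's work, so total time is
    # bounded by whichever of the two takes longer.
    return setup + max(bytes_to_move, work_cycles)

# Moving a 4 KB sound buffer while doing 4096 cycles of game logic:
# with DMA the copy effectively disappears behind the CPU's work.
assert dma_cycles(4096, 4096) < pio_cycles(4096, 4096)
```

The numbers (1 cycle/byte, a 10-cycle setup) are made up for the sketch; the point is only that the copy cost stops adding to the CPU's workload once a DMA controller takes it over.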
On the other hand, with older consoles (and the Amiga / Atari ST / etc.), the CPU was always a relatively minor player. All of them relied on their "GPUs", sound chips and other helpers - all fixed-function - to do the heavy lifting. This was both good and bad: good because you could make a lot happen with very little hardware, but it made for difficult programming and porting, and offered little flexibility when you hit a situation the helper chips couldn't handle.
PS - This is one of the reasons you should never, ever compare MHz except between different versions of the same chip, whatever that chip is.
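A quick arithmetic sketch of why (with purely hypothetical IPC figures, not measurements of any real chip): useful throughput is roughly clock speed times instructions per cycle, and that's before counting whatever work gets offloaded to fixed-function chips. A slower-clocked chip with a better IPC can outrun a faster-clocked one.

```python
def throughput_mips(clock_mhz, instructions_per_cycle):
    # Very rough model: millions of instructions per second is
    # clock rate (MHz) times average instructions retired per cycle.
    return clock_mhz * instructions_per_cycle

# Hypothetical chips: 33 MHz at 1.0 IPC beats 66 MHz at 0.4 IPC,
# even though the second has double the clock speed.
assert throughput_mips(33, 1.0) > throughput_mips(66, 0.4)
```

The IPC values here are invented for illustration; real chips vary wildly by workload, which is exactly why the MHz number alone tells you so little.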