I rarely post here anymore (I mostly lurk in my free time). But I read this article the other day that seems right up the alley of more knowledgeable Gerbils:
http://www.nature.com/news/the-chips-ar ... aw-1.19338
The whole article is a pretty good read, I think. One snippet in particular caught my eye:
The industry road map released next month will for the first time lay out a research and development plan that is not centred on Moore's law. Instead, it will follow what might be called the More than Moore strategy: rather than making the chips better and letting the applications follow, it will start with applications — from smartphones and supercomputers to data centres in the cloud — and work downwards to see what chips are needed to support them. Among those chips will be new generations of sensors, power-management circuits and other silicon devices required by a world in which computing is increasingly mobile.
Does that seem a little backwards to anyone else? Or maybe I'm misinterpreting it. It almost seems like it would be more costly and troublesome to design a chip for each application. I guess that's already done with smartphones, but I get the impression that it amounts to "re-inventing the wheel" each time. I'm way out of my depth here. Thoughts?