I look at Moore’s Law in two ways. One is the traditional sense of getting more onto a chip. The other is the overall power of a PC, which includes the GPU as well as the CPU. Unless you’re doing heavy number crunching, adding more CPU power doesn’t really do much for you. You can’t type any faster. Displaying text already seems immediate, and so on. But for graphically intensive work like games or photo editing, the GPU becomes much more important, and GPUs seem to be getting noticeably more powerful all the time.
I think we’ll see some company, probably not Apple, make a leap with smart watches like Apple did with the iPhone. Google Glass was introductory technology. In some ways it’s like the Apple Watch: too noticeable for many people. Now it’s a question of making the concept work in the real world. I can see an unobtrusive add-on for people who already wear glasses or sunglasses, maybe connected to your smartphone for the heavy lifting. And at some point, the circuitry will be small enough to add to contact lenses. Bob
But I’ve been editing high-res photos on cheap laptops without too much hassle for a long while… similarly, while it’s steps behind the highest-end PCs, I’m hard-pressed to spot what’s different between games on my Xbox One and my Xbox 360. Things feel like they’ve really hit a plateau – some of that’s a “well, things are ‘good enough,’ let’s focus on power consumption” attitude, and some of that’s hitting hard quantum limits on how small you can etch pathways on a chip… Kirk
These two guys are great when they get chatting.