Check out Google Glass. It’s the next big thing: wearable, context-aware computing.
Google is doing this with Glass, while Apple is chasing the same goal with its rumored smartwatch.
This is the next web, surpassing the semantic web. It’s contextual computing, and everything is headed there. If post-PC computing (iPhones, tablets) was the last gold rush, context-aware computing is the next. I like Apple’s chances, because it has a huge installed base of post-PC devices (iPhones, iPads), and no doubt an ‘iWatch’ would integrate with your iPhone, combining their sensors and computing power into a single user experience.
For the record, I like Google Glass too, but I’m not keen on having to wear an optical computer around all the time. Then again, I’ve never used one, so maybe my impressions will change. I do think the optical approach gives users a true HUD experience, something that won’t be matched by a wrist-mounted computer.
We’ll see. Whatever happens, it will be exciting.