This post originally appeared in Street Fight.
Apple just entered augmented reality, without most people really noticing. Though the iPhone 7 was met with a collective ‘meh,’ the real impact is below the surface, where the world’s biggest company collides with tech’s biggest opportunity.
Disappointment stemmed from Apple’s failure to launch a VR product, or at least a blatantly obvious one (read: headset). The common sentiment is that Apple is behind in the VR arms race. But with AR — the later and bigger opportunity — it’s right on time, and a bit subtle.
While watching the iPhone 7 launch, two things jumped out at me, rooted in the sensory targets of all media: sight and sound. Sight is where most people think about AR (a la graphical overlays through wearable glasses). But sound is where AR’s form factor could secretly reside.
First, with sight: the iPhone 7 Plus's dual camera system is explicitly about capturing more artful photography through greater depth of field (the bokeh effect, etc.). But the hidden implication is depth mapping, a key component of computer vision and “true AR.”
As I discussed last month, true AR goes beyond graphics that are just slapped on the physical world, as in Snapchat stickers and Pokemon Go. AR’s potential rather involves graphics that interact with the real world in dimensionally accurate ways. And depth mapping is the first step.
It’s also notable that Apple isn’t going a wearable (glasses) route. Like Mark Zuckerberg, it believes the nearer-term play is with the smartphone — which also happens to be its core business. For better or worse, the consumer populace is still attached to slabs of metal and glass.
But moving away from the smartphone is where Apple’s second big AR play comes in: sound. Wireless AirPods will not only replace the tangled nest of white rubber that’s standard issue for middle-class pockets and purses; they could also represent a new form factor for local discovery.
Unpacking that a bit: the AirPods’ sleekness and portability could condition a use case where you essentially leave them in your ears at all times. From there, they become a new channel that’s all about ambient audio. This could make AR’s informational “overlays” less graphical and more audible.
When watching the AirPods’ unveiling, the first thing I thought of was Google’s construct of “micro moments.” These are the user-prompted content snacking moments in the grocery line or subway — pulling out your phone for a quick fix of email, Facebook or Snapchat.
Could AirPods create a new class of micro moments that are audible instead? One advantage is discreetness: It’s less cumbersome than pulling out your phone, and certainly less prone to assuming “glasshole” status. Then again, AirPod theft and loss are going to be a thing.
Of course, visual media won’t go away; it remains better suited to many content formats. But audio could take over a certain share of micro moments, like getting informed about your surroundings. Think: local discovery, news, shopping and social pings. It’s going to kill on trivia night.
All of this also aligns with the rise of personal assistant apps. The arms race between Siri, Alexa, and Cortana has accelerated voice processing innovation. So during the 2-3 hardware cycles before AirPods reach ubiquity, voice interfaces will continue to get ready for prime time.
So what does this all mean for local players? Just as I’ve said for VR and AR more broadly, experiment early and often. Be ready to pivot or port, rather than get caught off-guard as happened to many local media players during the advent of the commercial internet and smartphone.
The good news is that if any of the above pans out, there’s time to develop strategies. Apple has just given us hardware (and hiring) hints of an AR development platform that builds on the existing app economy. The clues are all there, though subtle, both seen and heard.