On the road to AR, Facebook snagged some attention this week with the arrival of its first smart glasses. As Apple inches toward the introduction of its own augmented spectacles, are there lessons here about what these things do and how they should work?
What Facebook’s glasses do
First, let’s take a glance at Facebook’s smart glasses. They offer a pair of cameras to take photos and videos, boast a microphone and a speaker, and are controlled using a voice assistant.
Most observers have compared them to Snapchat’s Spectacles. Facebook has called them Stories, so now you know what that Facebook feature most of us never use is for. The glasses are made in partnership with Ray-Ban and look like Wayfarers. They cost $299, and Facebook says it wants them to “create a sense of social presence.”
They aren’t AR glasses. Instead, they will let you film everything that happens around you, presumably so Facebook can analyze your habits and serve you advertising, which appears to be the social media company’s basic business plan.
The glasses look cool and the touch controls seem cool, but all you’re really getting is music on the move, a voice assistant that also takes calls, and the chance to take photos and videos on demand. These are accessories at best and are designed to work with an app on a smartphone. These smart glasses seem a little dumb.
What can Apple learn from all this? Here are a few suggestions.
I doubt Apple’s glasses will cost $299. They will network with an iPhone to deliver a range of high-end tools and will be supported by a set of app development tools designed to support what we already know are the key markets for AR: location-based games such as Pokémon Go; enterprise and medical use; and on-the-go access to information and entertainment. I’ve come across recent claims that Apple’s first-generation glasses won’t support AR, but I suspect an on-lens information system remains plausible.
Facebook’s offering is just a data-capture device with a voice assistant. Apple will offer more, and probably charge more, too.
If you wear Facebook glasses, don’t be surprised to find yourself asked to leave them at the box office when you attend a live performance, take part in a meeting, visit court, or even wander around some shopping malls. As word spreads, it will become a social faux pas to wear them at social gatherings. People don’t trust Facebook on privacy.
What Apple learns: We know Apple supports privacy (or used to). When it chooses to introduce its own glasses, it will need location-based technologies, likely integrated with Maps, that let venues prevent filming from taking place.
It also seems likely an activity light will indicate when filming or recording is taking place. Privacy advocates will demand the recording light, while certain venues will require the location-based restrictions.
There doesn’t yet appear to be a developer environment around Facebook’s glasses. That’s going to limit what they can do. Apple knows that to create compelling experiences for a vast audience, it must enable a developer ecosystem. This is the approach it takes across all its products, bar some accessories.
Apple’s glasses focus is likely to revolve around augmented reality experiences and information access, such as guides to shopping malls, tourist guides, or even overlaid information to guide medical procedures. (Arguably, medical and some industrial deployments became the key success stories around Google Glass.)
Apple’s work in Arcade means it also has the potential to create its own interactive gaming experiences to support the products it eventually ships. Plus, why not sit back and watch Apple Music or TV+ videos on your glasses when you travel?
Ultimately, Facebook’s strategy seems flawed by a lack of preparation. It is only now investing in processor technologies to support its efforts. It has Oculus technologies, but these seem to be on a different tack, focused on immersive VR experiences. Integration with the smartphone seems quite limited; Facebook has failed to build a solid reputation around privacy and doesn’t (yet) appear to have built much in the way of a supporting development environment. To be fair, beyond Oculus, Facebook does have several assets it could leverage, and I have no doubt it is already iterating more sophisticated glasses in its labs.
Apple, meanwhile, already seems to be working on health sensors for its future wearables. It is fair to note that most market speculation at present sees Apple introducing a less fully featured first iteration of its glasses next year, before a more sophisticated model ships a few years down the line.
What it calls to mind
Facebook’s new offering calls to mind Samsung’s early move to rush out its own smartwatch before Apple basically defined the category with Apple Watch. Samsung won a lot of support for its move at the time, but Apple blew it out of the water when it followed suit.
Samsung’s Galaxy Watch still has its advocates (not least because some models are round, a shape many people prefer), but while Apple Watch holds about 28% of the market, Samsung has just 7.6%, according to Counterpoint.
It’s an open question how far Facebook will be able to leverage its tarnished brand to grab share in the emerging smart glasses market as Apple and others bring the products they have been developing to market in the next year or so.
Facebook’s push seems unlikely to dent the universe, but it certainly shows the growing competition between the two companies as the battle for the multiverse intensifies.