At the Facebook Connect conference on virtual and augmented reality (AR) in September, Facebook CEO Mark Zuckerberg announced that, through a partnership with Luxottica, the company will begin selling smart glasses in 2021.1 These glasses are a first step toward more advanced AR glasses, which will take many more years of development. The upcoming release of Facebook’s smart glasses represents much more than the arrival of just another gadget, however. Zuckerberg’s company is preparing for a new computer age, one in which data and privacy will become more delicate topics than ever before.
Imagine yourself in a noisy café. A friend walks up to you and speaks; you can hear him clearly because the background noise is muted. Someone else approaches your table. In your field of vision appears a simple question: “Add to the conversation?” With a subtle movement of your finger, you choose yes. The three of you have a quiet conversation; the background noise remains muted. A new question appears in your field of vision: “Accept a video call from Jane Doe?” Moments later, an avatar that looks exactly like Jane Doe sits down in the empty chair at your table. Yours is now a party of four.
If Michael Abrash, chief scientist at Facebook Reality Labs, and his team are successful, that scenario will largely be a reality by 2030. The friends in that café wear smart glasses, probably Luxottica-made Ray-Bans, and wristbands. The software technology comes from Facebook.
We have been using a graphical user interface (GUI), icons, and click-and-drag options on all kinds of computers for decades. We’ve also gotten used to stumbling through reality with our heads down, fixated on our smartphones. In this new AR smart glasses era, we will stand proudly in the middle of reality with our heads held high. Everything we need to know will be whispered into our ears and appear subtly in our field of vision.
New technological developments will make it possible to control all of these features with simple finger and hand movements. The GUI we are accustomed to will make way for an ultralow-friction, contextualized AI interface: an invisible butler that anticipates our needs and asks questions only when in doubt. We will communicate with this invisible butler through subtle finger movements, too.
But how would that work? Abrash, who is enthusiastic about reading the motor nerve signals that reach the wrist, explained during the Facebook Connect conference that finger movements of just 1 mm can be detected and that even the intention to move a finger can be read. If that proves true, the computer can become a near-natural extension of thinking and movement.
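Facebook has not published how its wristband prototypes work, but the general idea of spotting faint motor activity at the wrist can be illustrated with a toy example. The sketch below is a hypothetical, single-channel electromyography (EMG) detector written purely for illustration; the sample rate, window length, threshold, and function names are my own assumptions, not Facebook’s method.

```python
import numpy as np

# Hypothetical illustration only: detect a subtle "micro-gesture" as a brief
# burst of muscle activity whose smoothed amplitude rises well above the
# resting baseline. Assumes a single-channel EMG signal sampled at 1 kHz.

SAMPLE_RATE_HZ = 1_000
WINDOW_MS = 50  # smoothing window for the signal envelope


def detect_micro_gesture(emg: np.ndarray, threshold_factor: float = 3.0) -> bool:
    """Return True if the signal contains a burst well above resting activity."""
    window = SAMPLE_RATE_HZ * WINDOW_MS // 1_000
    # Rectify the signal and smooth it to obtain an amplitude envelope.
    envelope = np.convolve(np.abs(emg), np.ones(window) / window, mode="valid")
    baseline = np.median(envelope)  # typical resting muscle tone
    return bool(envelope.max() > threshold_factor * baseline)


# Simulated data: background noise, and the same noise with a ~100 ms burst.
rng = np.random.default_rng(0)
resting = rng.normal(0, 1, 2_000)
burst = resting.copy()
burst[900:1_000] += rng.normal(0, 8, 100)  # brief burst of motor activity

print(detect_micro_gesture(resting))  # False: no intended movement
print(detect_micro_gesture(burst))    # True: a subtle "click" detected
```

A real wrist interface would of course use many channels, machine-learned decoding, and per-user calibration; this sketch only shows why even a very small movement, or the muscle activity preceding it, leaves a measurable trace.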
Another crucial aspect of Facebook’s project is called Live Maps. This technology consists of three layers:
- The location layer that indicates where things are;
- The index layer that indicates what is known about physical things (eg, this tree is an oak); and
- The ontological layer that indicates why physical, virtual, and conceptual things are important in a given context.
Live Maps is a virtual model of your physical existence on this planet. With the help of the AR glasses, this flow of data is collected and organized from your personal perspective, so that the glasses can act as your personal butler.
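To make the three layers concrete, here is a minimal, purely illustrative sketch of how a single mapped object might carry location, index, and ontological information. Facebook has not published a Live Maps data model; the class and field names below are invented for this example.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: one object (a tree near the café) described in
# the three layers outlined above. All names are assumptions.


@dataclass
class MappedObject:
    location: tuple[float, float, float]                     # location layer: where it is
    index: dict[str, str] = field(default_factory=dict)      # index layer: what is known about it
    ontology: dict[str, str] = field(default_factory=dict)   # ontological layer: why it matters in context


oak = MappedObject(
    location=(51.2194, 4.4025, 0.0),                          # e.g., latitude, longitude, elevation
    index={"category": "tree", "species": "oak"},             # "this tree is an oak"
    ontology={"navigation": "landmark marking the café entrance"},
)

print(oak.index["species"])       # -> "oak"
print(oak.ontology["navigation"])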
All-knowing personal butlers tend to be creepy. It will take years before true AR glasses reach consumers, giving us time to get used to the idea and to think carefully about the ethics and privacy concerns this technology raises. For Facebook in particular, that reflection is anything but an unnecessary luxury.
Erik L. Mertens, MD, FEBOphth
Physician CEO, Medipolis-Antwerp Private Clinic, Antwerp, Belgium
Chief Medical Editor
1. Hatmaker T. Facebook is launching smart glasses in 2021, its ‘next step’ to an AR device. TechCrunch. September 16, 2020. Accessed October 1, 2020. https://techcrunch.com/2020/09/16/facebook-ar-glasses-2021/