Facebook Reality Labs is marching forward on the path to augmented reality (AR) with a pair of artificial intelligence-driven glasses, in what it calls a “paradigm shift in how humans interact with computers.”
“Imagine a world where a lightweight, stylish pair of glasses could replace your need for a computer or smartphone,” posits the Facebook Reality Labs blog. The blog promotes the benefits of the device as “contextually-aware AI to help you navigate the world around you, as well as rich 3D virtual information within arm’s reach.” It also capitalizes on current sentiments surrounding social isolation under the harsh lockdown measures imposed by governments in many countries, saying the technology will give users some much-needed reprieve by granting the “ability to feel physically present with friends and family — no matter where in the world they happened to be.”
Facebook Reality Labs compares its innovation to the development of the mouse and the graphical user interface, the foundation of operating systems like Windows, first developed in the 1960s, which ultimately paved the way for the home and personal computing we rely on for nearly all facets of our lives today.
Facebook says the “all-day wearable AR glasses” will be a new paradigm “because they will be able to function in every situation you encounter in the course of a day. They need to be able to do what you want them to do and tell you what you want to know when you want to know it, in much the same way that your own mind works — seamlessly sharing information and taking action when you want it, and not getting in your way otherwise.”
The social media giant says it is developing the “next computing platform centered around people,” which will rely on an all-new way for humans to interface with computers, and says it has developed a “set of principles for responsible innovation” to drive development in a “responsible, privacy-centric way.”
The principles are composed of four parts:
- Never Surprise People
- Provide Controls That Matter
- Consider Everyone
- Put People First
The company also says that “teams are required to include privacy goals as part of their planning process” in AR development.
Facebook Reality Labs will change human life with artificial intelligence data mining
The team at Facebook Reality Labs gives an example scenario for how its all-day wearable AR glasses would function in the real world. The project is sold on idealistic grounds: you put on the AR glasses and a “soft wristband” before leaving home for the day. Your artificial intelligence “Assistant” asks if you want to listen to a favorite podcast, and you click a virtual “play” button with a subtle flick of your wrist, picked up by the wristband.
The Assistant, driven by AI data mining of your behavior and tendencies, asks whether you would like to order your usual 12-ounce Americano as you enter a local coffee shop. You then sit down at a table to begin work, no laptop required.
“You head to a table, but instead of pulling out a laptop, you pull out a pair of soft, lightweight haptic gloves. When you put them on, a virtual screen and keyboard show up in front of you and you begin to edit a document. Typing is just as intuitive as typing on a physical keyboard and you’re on a roll,” says the blog.
In what was perhaps a teaser of things to come, Andrew Bosworth, VP of Facebook Reality Labs, posted in May on Twitter about floating monitor-like displays that can be resized with gesture control, adding in a reply to his original tweet, “This is real footage using prototype headsets.”
The example continues, “but the noise from the cafe makes it hard to concentrate,” revealing that the Assistant, using the technology in the glasses, will also be able to shape what you hear around you. “…special in-ear monitors (IEMs) and active noise cancellation [will] soften the background noise. Now it’s easy to focus. A server passing by your table asks if you want a refill. The glasses know to let their voice through, even though the ambient noise is still muted, and proactively enhance their voice using beamforming. The two of you have a normal conversation while they refill your coffee despite the noisy environment — and all of this happens automatically.”
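To make the idea concrete, the sketch below shows, in rough terms, how delay-and-sum beamforming (the technique the blog names) lets a microphone array favor a voice arriving from one direction while uncorrelated background noise is averaged down. The array geometry, sample rate, and function names here are illustrative assumptions, not details from Facebook’s design.

```python
# Minimal delay-and-sum beamformer sketch (assumed parameters, not Facebook's implementation).
import numpy as np

SPEED_OF_SOUND = 343.0  # meters per second

def delay_and_sum(signals, mic_positions, propagation_dir, sample_rate):
    """Steer a microphone array toward a talker and sum the aligned channels.

    signals:         (num_mics, num_samples) time-domain recordings
    mic_positions:   (num_mics, 3) coordinates in meters
    propagation_dir: unit vector of sound travel, from the talker toward the array
    """
    d = np.asarray(propagation_dir, dtype=float)
    d /= np.linalg.norm(d)
    # Arrival-time offset at each mic for a plane wave travelling along `d`.
    delays = mic_positions @ d / SPEED_OF_SOUND
    delays -= delays.min()                          # relative, non-negative delays
    shifts = np.round(delays * sample_rate).astype(int)

    aligned = np.zeros(signals.shape, dtype=float)
    for ch, s in enumerate(shifts):
        # Advance later-arriving channels so the target wavefront lines up.
        n = signals.shape[1] - s
        aligned[ch, :n] = signals[ch, s:]
    # Coherent average: the steered voice adds up, noise from other directions partly cancels.
    return aligned.mean(axis=0)
```

The design choice is simple: sound from the steered direction arrives in phase after the per-channel delays and reinforces itself, while sound from elsewhere stays misaligned and is attenuated by the averaging.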
The blog also explains how the “soft wristband” is designed to work with a “range of neural input options, including electromyography (EMG).”
“While several directions have potential, wrist-based EMG is the most promising. This approach uses electrical signals that travel from the spinal cord to the hand, in order to control the functions of a device based on signal decoding at the wrist.”
The company says EMG is so precise it can detect a movement as fine as a single millimeter and that “ultimately it may even be possible to sense just the intention to move a finger.”
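The passage above describes decoding muscle signals at the wrist into input events. The sketch below is a minimal, hypothetical illustration of that idea: windowed EMG samples are reduced to per-channel amplitude features and mapped to a discrete “click.” The channel count, window length, sample rate, threshold, and function names are all assumptions made for illustration; Facebook’s actual decoder is not public.

```python
# Toy wrist-EMG "click" decoder sketch (all parameters are assumptions for illustration).
import numpy as np

SAMPLE_RATE = 1000                      # Hz, typical for surface EMG (assumed)
WINDOW_MS = 200                         # analysis window length
WINDOW_SAMPLES = SAMPLE_RATE * WINDOW_MS // 1000
CLICK_THRESHOLD = 0.15                  # would be calibrated per user (assumed)

def rms_features(window):
    """Root-mean-square amplitude per channel for one (channels, samples) window."""
    centered = window - window.mean(axis=1, keepdims=True)   # remove DC offset
    return np.sqrt((centered ** 2).mean(axis=1))

def decode_click(window):
    """Return True when muscle activity in the window looks like a 'click' pinch."""
    # Real decoders use trained models over many features; a fixed threshold on
    # the most active channel stands in for that here.
    return rms_features(window).max() > CLICK_THRESHOLD

# Usage with synthetic data: a quiet wrist versus a burst of simulated muscle activity.
rng = np.random.default_rng(0)
quiet = rng.normal(0.0, 0.02, size=(8, WINDOW_SAMPLES))
pinch = rng.normal(0.0, 0.30, size=(8, WINDOW_SAMPLES))
print(decode_click(quiet), decode_click(pinch))               # expected: False True
```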
In September, during a live stream from Facebook Connect, CEO Mark Zuckerberg announced that Facebook Reality Labs would partner with Ray-Ban to make the glasses.
According to a CNBC report, the tech giant began rolling out a prototype to its own employees for use around the office. While the Project Aria prototype used by staff was not for sale to the public and did not possess AR features, “the devices will capture video, audio, eye-tracking and location data that Facebook can use to aid its development of augmented reality smart glasses.”
“It’s a research device that will help us understand how to build the software and hardware necessary for real, working AR glasses,” Bosworth told CNBC.