Following the unexpected success of the Meta Ray-Ban smart glasses released at the end of 2023, the smart glasses space is clearly gaining traction: both Meta and Snap unveiled new prototypes in recent days, each attempting to take an early lead in what could become a major computing platform.
Despite all its features (phone calls, music streaming, photos and now an AI assistant), the Meta Ray-Ban is not yet the long-awaited pair of augmented reality (AR) glasses: its lenses are just lenses, not displays capable of overlaying digital images on real-world environments. But it has been obvious since day one that the Meta Ray-Ban was only a first iteration of a far more ambitious device that would eventually incorporate AR/spatial computing technology.
It is no wonder, then, that Meta unveiled an AR glasses prototype this week, called Orion, that integrates a holographic display, offers multitasking windows and big-screen entertainment and, naturally, comes with an AI assistant that understands the world around it.
Meanwhile, Snap’s Spectacles make heavy use of AR filters over real-world objects, operate via voice commands and hand gestures, and are launching (for developers only) with apps including a golf simulator and a virtual pet.
Neither device will be available to consumers for now, as Meta and Snap still have to tackle various challenges before turning them into consumer products: both prototypes are bulky (especially compared with the Meta Ray-Ban) and need to be slimmed down, the AR displays need to improve (relatively low resolution for Orion, narrow field of view for the Spectacles) and battery life is still limited (45 minutes for the Spectacles).
But clearly, we are getting closer to fully functional AR glasses that could hit the market in the next couple of years and achieve the stated goal of smart glasses: giving users hands-free access to their everyday digital apps – in other words, complementing or even replacing the smartphone.
Apple’s initiatives in the space should also be monitored, as the tech giant is likely to eventually redirect its Vision Pro spending and spatial computing technologies towards smart glasses, which have far more mainstream potential.
In conclusion, more than ten years after Google’s pioneering Google Glass initiative, smart glasses could soon become reality thanks to a flurry of advances in display/audio technology, spatial computing and AI, and turn into an everyday device/computing platform, just like the smartphone. Should this scenario unfold, AR glasses would open up a massive market for connectivity/AI chip makers (e.g. Qualcomm provides the main chip for the Ray-Ban Meta), optical/photonics specialists working on the optical waveguide (essentially a set of reflective surfaces that relay images from a tiny projector in the glasses’ frame to the user’s eyes) and display makers (Sony’s displays have been selected for both the Apple Vision Pro and Xreal).
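For the technically curious, a back-of-the-envelope sketch of why such waveguides work at all: light stays trapped inside the glass by total internal reflection whenever it strikes the glass–air interface beyond the critical angle

\[ \theta_c = \arcsin\left(\frac{n_{\text{air}}}{n_{\text{glass}}}\right) \approx \arcsin\left(\frac{1.0}{1.8}\right) \approx 34^\circ \]

(taking \(n_{\text{glass}} = 1.8\) as an illustrative value for high-index waveguide glass; neither Meta nor Snap has published such specifications). Light injected at steeper angles bounces along the guide until embedded reflective or diffractive structures redirect it out towards the user’s eye.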