Google Says First AI Glasses With Gemini Will Arrive in 2026

Project Aura, a prototype pair of smart glasses developed by Google and Xreal, runs the same Android XR software as Samsung’s recently launched Galaxy XR headset.

By Chris Welch
December 8, 2025 at 1:00 PM EST
Alphabet Inc.’s Google said it’s working to create two different categories of artificial intelligence-powered smart glasses to compete next year with existing models from Meta Platforms Inc.: one with screens, and another that’s audio-focused.
The first AI glasses that Google is collaborating on will arrive sometime in 2026, it said in a blog post Monday. Samsung Electronics Co., Warby Parker and Gentle Monster are among its early hardware partners, but the companies have yet to show any final designs.
Google also outlined several software improvements coming to Samsung’s Galaxy XR headset, including a travel mode that will allow the mixed-reality device to be used in cars and on planes.
Competition is heating up in the emerging category of AI and augmented reality, or AR, glasses. Meta has been the most prolific. It sells reasonably affordable glasses under the Ray-Ban and Oakley brands, which have been met with positive reviews and decent sales, and also recently introduced a more expensive pair with an integrated display. Snap Inc.’s first AR glasses for consumers are also set to debut next year, and Apple Inc. is plotting an entry into the space around the same time.
For Google, the new products — and its new Android XR operating system — represent a more refined, calculated approach to smart glasses compared with Google Glass, an ahead-of-its-time product that flopped with consumers a decade ago due to its bizarre design, poor battery life and privacy concerns.
During a demo at one of Google’s New York City offices, I was able to try several different pairs of prototype AI glasses along with an early sample of glasses the search giant is working on with Xreal, codenamed Project Aura.
Like Meta’s popular Ray-Bans, most of these glasses connect wirelessly to a smartphone and rely on the handset for processing requests, whether that’s asking Google’s Gemini AI assistant to play a song from YouTube Music or analyzing ingredients in front of you to come up with a recipe. Having the phone handle so much of the heavy lifting is what enables these glasses to be slim and light enough to pass for regular spectacles.
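To make that split concrete, here is a minimal Kotlin sketch of the tethered pattern described above: the frames capture input and present output, while the paired phone does the expensive work. Every type name, message shape and stub response below is an assumption for illustration, not a Google or Android XR API.

```kotlin
// Hypothetical message envelope sent over the wireless link to the phone.
data class GlassesRequest(val kind: String, val payload: ByteArray)
data class GlassesResponse(val text: String)

// Phone side: runs the heavy work (speech recognition, assistant calls,
// image analysis) on the handset's hardware. All responses are stubs.
class PhoneCompanion {
    fun handle(request: GlassesRequest): GlassesResponse = when (request.kind) {
        "voice_query" -> GlassesResponse(runAssistant(request.payload))
        "camera_frame" -> GlassesResponse(analyzeScene(request.payload))
        else -> GlassesResponse("unsupported request")
    }

    private fun runAssistant(audio: ByteArray) =
        "Playing your song from YouTube Music"        // stand-in for ASR + assistant
    private fun analyzeScene(frame: ByteArray) =
        "Ingredients found: tomatoes, basil"          // stand-in for a vision model
}

// Glasses side stays thin: capture, forward, then speak or display the reply.
class ThinGlasses(private val phone: PhoneCompanion) {
    fun onWakeWord(audio: ByteArray) {
        val reply = phone.handle(GlassesRequest("voice_query", audio))
        speak(reply.text)   // audio-focused glasses read the answer aloud
    }
    private fun speak(text: String) = println("glasses TTS: $text")
}

fun main() {
    ThinGlasses(PhoneCompanion()).onWakeWord(ByteArray(0))
}
```

The point of the pattern is that the frames never run a model themselves, which is what keeps them slim and light enough to pass for regular spectacles.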
In a hands-on demo ahead of Monday’s launch, I tried two different smart glasses prototypes with built-in displays: one was monocular, meaning it had a single screen built into the right lens, and the other was binocular, with a display for each eye. Both types support augmented-reality overlays for apps such as Google Maps and Google Meet, but the binocular design offered a bigger virtual display.
Photo (Google): Google Maps will be a major use case on the upcoming smart glasses.

“We want to give you the freedom to choose the right balance of weight, style and immersion for your needs,” the company said in the blog post.
The company’s goal is to make as many software experiences as possible work well across both formats. In the demo, I tested a real-time translation feature that displayed the conversation as onscreen captions — but I could also shut off the display and focus on the translation coming through the speakers.
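That toggle maps naturally onto a small output-routing switch. The sketch below is a guess at the shape of such a feature, with hypothetical types, not Google’s implementation.

```kotlin
// Where translated phrases are routed: the in-lens display, the
// open-ear speakers, or both. Names are invented for this sketch.
enum class OutputMode { CAPTIONS, AUDIO, BOTH }

class TranslationSession(var mode: OutputMode = OutputMode.BOTH) {
    fun onTranslatedPhrase(phrase: String) {
        if (mode != OutputMode.AUDIO) renderCaption(phrase)   // draw on the lens
        if (mode != OutputMode.CAPTIONS) speak(phrase)        // play through speakers
    }
    private fun renderCaption(text: String) = println("[caption] $text")
    private fun speak(text: String) = println("[speaker] $text")
}

fun main() {
    val session = TranslationSession()
    session.onTranslatedPhrase("Where is the train station?")
    session.mode = OutputMode.AUDIO   // user shuts off the display mid-conversation
    session.onTranslatedPhrase("Two blocks ahead, on the left.")
}
```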
The screen in particular has its merits. When using Google Maps, the possibilities go well beyond augmented turn-by-turn directions. I could look downward for a larger top-down map of my current location, complete with a compass showing which direction I was facing. When taking a photo and asking Gemini to spruce it up using Google’s Nano Banana Pro generative AI model, I was able to look at a preview of the final result without needing to pull out my phone.
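The look-down gesture suggests a simple head-pitch gate between two map presentations. The threshold, type names and maneuver string below are invented for illustration; Google has not described how the feature is implemented.

```kotlin
// Two hypothetical map presentations: glanceable directions when looking
// ahead, a larger top-down map with a compass heading when looking down.
sealed interface MapView
data class TurnByTurn(val nextManeuver: String) : MapView
data class TopDownMap(val headingDegrees: Float) : MapView

// Pitch is degrees below horizontal; past an assumed threshold, swap views.
fun selectMapView(pitchDegrees: Float, compassHeading: Float): MapView =
    if (pitchDegrees > 35f) TopDownMap(compassHeading)
    else TurnByTurn("Turn left in 200 m")

fun main() {
    println(selectMapView(pitchDegrees = 5f, compassHeading = 270f))   // looking ahead
    println(selectMapView(pitchDegrees = 50f, compassHeading = 270f))  // looking down
}
```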
The Xreal eyewear is unique in that it’s a standalone experience: Project Aura runs Android XR like Samsung’s bulkier headset — but in a sleeker form that resembles the Chinese company’s other glasses, including the One Pro. Similar to the Galaxy XR, Project Aura must be plugged into an external battery pack at all times to function.
The glasses will offer a 70-degree field of view, which is more immersive than Xreal’s current lineup. Navigating around Project Aura felt familiar and intuitive, since it uses the same hand tracking as Samsung’s headset, albeit with fewer cameras.
Galaxy XR Improvements

Google and Samsung are reacting to early feedback about the $1,800 Galaxy XR headset, introducing several features that users and reviewers have been asking for. One is the forthcoming travel mode, which makes it easier to use the headset while in motion. Before, windows would go flying past you in a car or plane, which made watching a movie on a flight needlessly difficult.
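One plausible mechanism for such a travel mode: in a moving vehicle, the headset’s sensors register the vehicle’s motion as if the wearer were moving, so world-anchored windows recede behind you; re-anchoring content to the cabin cancels the drift. The Kotlin model below is illustrative only, not Samsung’s or Google’s code.

```kotlin
// Position of a virtual window, expressed in the wearer's local frame.
data class Pose(val x: Float, val y: Float, val z: Float) {
    operator fun minus(o: Pose) = Pose(x - o.x, y - o.y, z - o.z)
}

enum class AnchorFrame { WORLD, CABIN }

// Where to render a window, given where it was placed and how far the
// vehicle has moved since placement.
fun renderedPose(placedPose: Pose, vehicleMotion: Pose, frame: AnchorFrame): Pose =
    when (frame) {
        AnchorFrame.WORLD -> placedPose - vehicleMotion  // recedes as the vehicle moves
        AnchorFrame.CABIN -> placedPose                  // travel mode: stays where placed
    }

fun main() {
    val window = Pose(0f, 0f, 2f)        // placed two meters in front of the user
    val planeMoved = Pose(0f, 0f, 250f)  // vehicle displacement since placement
    println(renderedPose(window, planeMoved, AnchorFrame.WORLD))  // flew far behind
    println(renderedPose(window, planeMoved, AnchorFrame.CABIN))  // still 2 m ahead
}
```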
Google is also releasing a PC Connect app that will let any Windows PC connect to the Galaxy XR and mirror its screen in the virtual environment. The app, which is currently in beta, works for gaming, too, which could go a long way toward broadening the headset’s appeal. This is an improvement over the situation at launch, when customers needed one of Samsung’s own Galaxy Book laptops to use the virtual desktop mode. Google is also working on a macOS version, it said.
Additionally, Google is rolling out a new “Likeness” avatar style that’s far more realistic than Samsung’s cartoonish designs. With the help of a smartphone app, Galaxy XR owners will be able to scan their face for an accurate representation of themselves on video calls. Likenesses mimic facial movements and hand gestures. Like the PC Connect app, this feature is launching in beta. Both will be available starting this week.
bloomberg.com

. . .

ChatGPT:
According to recent reporting, Google’s “AI glasses” slated for 2026 will use chips from Qualcomm. (36Kr; Daily CyberSecurity)
More specifically:
- The “new” AI-glasses project (not just the developer-focused Project Aura) is reportedly being manufactured by Foxconn, with a reference design from Samsung and Qualcomm supplying the chipset. (36Kr)
- For Project Aura itself (the AR glasses shown at I/O 2025), Google and partner Xreal plan to use Qualcomm’s Snapdragon XR-series silicon, optimized for spatial computing. (Yahoo Tech; LinkedIn)
If you like, I can check which exact Qualcomm chip model (e.g. XR2, XR2+, XR1 or similar) is expected in Google’s 2026 glasses; that might be helpful for comparing with other AR/VR devices.

. . .

What we know so far
- Google’s 2026 AI glasses (distinct from Project Aura) are being developed with Foxconn (manufacturer), Samsung (reference-design partner) and Qualcomm (chipset supplier). (36Kr; Daily CyberSecurity)
- The broader XR ecosystem from Google and Samsung is built around Android XR, an OS designed for advanced AR/VR/XR devices. (blog.google; Qualcomm)
- For related XR products, Qualcomm’s current flagship XR chipset is the Snapdragon XR2+ Gen 2. That SoC is promoted for next-generation mixed-reality hardware (AR/VR/MR). (TechCrunch; XR Today)
- For example, Samsung’s recently released Galaxy XR headset, which runs Android XR, uses the XR2+ Gen 2. (UploadVR)
What is not yet confirmed
- The insider report on Google’s 2026 smart glasses said only that “Qualcomm supplies the chips.” It did not specify whether that means the XR2+ Gen 2 or some other (possibly future) Snapdragon XR SoC. (36Kr)
- Because the glasses are reportedly still in proof-of-concept and small-batch testing, Google might change chipsets before the final release, possibly moving to a newer XR-class chip when one becomes available. (KrASIA; Daily CyberSecurity)
What makes sense, but remains speculative

Given the track record (Android XR plus XR-class hardware) and what powers companion XR products today, the most plausible candidate for Google’s 2026 AI glasses is the Snapdragon XR2+ Gen 2. That chip is explicitly designed for immersive XR: high-resolution per-eye output, on-device AI and low-latency passthrough, all of which align with what one expects from “AI glasses.” (TechCrunch; Qualcomm)
Additionally, wearables like Xreal’s Project Aura, another Android XR-based glasses project, use a combination: a custom “X1S” chip inside the glasses frames plus a separate computing “puck” powered by a Qualcomm Snapdragon chip. (South China Morning Post; Android Authority) So Google’s final AI glasses might adopt a similar hybrid, or opt for a single powerful XR SoC.
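For what that hybrid might look like structurally, here is a toy Kotlin sketch of the split. The interface names and responsibilities are placeholders; the actual division of labor between the in-frame chip and the Snapdragon puck has not been made public.

```kotlin
// Hypothetical split: a small in-frame chip handles display composition,
// while a tethered compute puck runs the apps. Names are placeholders.
interface FrameChip { fun composeFrame(surface: String): String }
interface ComputePuck { fun runApp(name: String): String }

class InFrameChip : FrameChip {
    // Low-power work that must stay on the face: tracking, composition.
    override fun composeFrame(surface: String) = "lenses showing <$surface>"
}

class SnapdragonPuck : ComputePuck {
    // Heavy work lives off the face, where battery and thermals allow it.
    override fun runApp(name: String) = "$name frame buffer"
}

fun main() {
    val puck: ComputePuck = SnapdragonPuck()
    val frames: FrameChip = InFrameChip()
    println(frames.composeFrame(puck.runApp("Google Maps")))
}
```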
Bottom line: the chipset is publicly described only as a “Qualcomm” XR-class chip. The best educated guess is the Snapdragon XR2+ Gen 2, but nothing is confirmed.