Mark Gurman, senior tech journalist at Bloomberg, revealed in a recent interview that Apple’s smart glasses will launch this year, designed explicitly to counter Meta’s Ray-Ban dominance and, in his words, to pull the rug out from under the social media giant’s momentum in wearables. This aggressive timeline marks Apple’s first major push into smart glasses since the company shelved earlier AR headset concepts, signaling a shift from the Vision Pro’s full-display approach toward something far lighter and more practical.
Key Takeaways
- Apple smart glasses launching 2026 with cameras, no on-lens display in first generation
- Glasses designed to work with Apple Intelligence for visual queries via Siri
- Camera quality emphasized as superior to Meta Ray-Bans, includes LiDAR-style environmental sensing
- Related wearables include AirPods with cameras and an AI pendant for voice-only interactions
- Device will be iPhone-dependent, functioning as an accessory rather than standalone
What Apple Smart Glasses Launch Will Actually Look Like
Gurman’s description of the Apple smart glasses launch strategy reveals a pragmatic first-generation product that mirrors Meta’s Ray-Ban formula rather than attempting full augmented reality displays from day one. The glasses will feature cameras mounted around the lenses and on the sides of the frame, enabling users to photograph and record video, but critically, they will not project information directly onto the lenses in front of your eyes. Instead, users will ask Siri questions about what they are looking at, and responses will come through audio or on a connected iPhone screen, not on the glasses themselves. This approach sidesteps the technical challenges that have stalled AR display development for years while still delivering practical AI-powered vision features.
The camera system is where Apple intends to differentiate. Rather than simply matching Meta’s camera quality, Gurman indicates Apple is building higher-end optics with a dual-camera setup: one primary camera for video capture and a second sensor, similar to the iPhone’s LiDAR, for environmental context and depth sensing. This configuration enables features like live translation, landmark identification, plant and animal recognition, and turn-by-turn directions delivered through audio cues or the iPhone. The glasses will include microphones for voice control and likely an LED indicator to show when the camera is actively recording, addressing the privacy concerns that dogged Google Glass years ago.
Apple Smart Glasses Launch Powered by Custom Silicon and Siri
The processing power behind the Apple smart glasses launch comes from a custom system-on-chip based on Apple Watch architecture, allowing on-device AI processing for certain tasks while offloading heavier computations to a paired iPhone. This design choice keeps the glasses lightweight and battery-efficient—critical for all-day wearability—while ensuring that Apple Intelligence features like visual understanding, translation, and object recognition benefit from the computational power of the iPhone ecosystem. The glasses will support multiple frame styles in metal and plastic finishes with color options, matching Apple’s approach to the Watch, where fashion and customization drive consumer appeal beyond pure functionality.
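None of this pipeline is public, but the split-compute pattern Gurman describes is easy to picture in code. The Swift sketch below is purely illustrative: every type in it (VisualQuery, GlassesProcessor, PairedPhoneSession) is hypothetical and none corresponds to a real Apple API. It simply shows how a watch-class SoC might handle trivial tasks locally while handing heavier visual queries to the paired iPhone.

```swift
import Foundation

// Hypothetical types only; this is not a real Apple API.
// It sketches the on-device/offload split described above.

enum VisualQuery {
    case recordingIndicator      // trivial, handled on the glasses' SoC
    case objectLabel(Data)       // lightweight on-device model
    case sceneDescription(Data)  // heavy; offloaded to the paired iPhone
    case liveTranslation(Data)   // heavy; offloaded to the paired iPhone
}

struct GlassesProcessor {
    // Small tasks run locally to save radio time and battery.
    func runLocally(_ query: VisualQuery) -> String? {
        switch query {
        case .recordingIndicator:
            return "LED on"
        case .objectLabel:
            return "plant: monstera" // stand-in for a tiny on-device model
        default:
            return nil // too heavy for a watch-class SoC
        }
    }
}

struct PairedPhoneSession {
    // Stand-in for shipping the captured frame to the iPhone.
    func offload(_ query: VisualQuery) -> String {
        return "iPhone result (Apple Intelligence)"
    }
}

func answer(_ query: VisualQuery,
            glasses: GlassesProcessor,
            phone: PairedPhoneSession) -> String {
    // Try the glasses' SoC first; fall back to the phone for heavy work.
    glasses.runLocally(query) ?? phone.offload(query)
}

let result = answer(.sceneDescription(Data()),
                    glasses: GlassesProcessor(),
                    phone: PairedPhoneSession())
print(result) // "iPhone result (Apple Intelligence)"
```

The design choice the sketch captures is the one Gurman emphasizes: keeping heavy computation off the glasses is what makes all-day battery life and a lightweight frame plausible.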
Controls will be straightforward: tap gestures on the frame to capture photos or trigger actions, and voice commands to Siri for queries about what you are viewing. This simplicity contrasts with more complex gesture systems and avoids the social awkwardness that derailed Google Glass, which was perceived as intrusive because it looked like you were constantly recording. Apple’s glasses, by design, invite less suspicion: they are meant to look like ordinary stylish eyewear rather than futuristic surveillance devices.
Apple Smart Glasses Launch as Part of a Broader Wearable AI Strategy
The Apple smart glasses launch does not exist in isolation. Gurman revealed that Apple is simultaneously developing AirPods equipped with cameras for AI-powered awareness of your surroundings, and an AI pendant or pin worn on clothing for voice-based interactions that require no display or glasses at all. This ecosystem approach lets users adopt Apple Intelligence at different price points and use cases: the glasses for visual tasks, AirPods for ambient awareness, and the pendant for pure voice commands. The pendant is particularly interesting as an entry-level product, offering AI assistance without the complexity or cost of smart glasses.
All three devices remain tethered to the iPhone, functioning as accessories rather than independent computers. This dependency is both a limitation and a strength: it keeps costs down, ensures security through Apple’s ecosystem, and leverages the processing power users already carry in their pocket. The glasses may integrate with Find My, Apple’s location-tracking network, adding another layer of utility.
How Apple Smart Glasses Launch Compares to Meta Ray-Bans
Meta’s Ray-Ban smart glasses have captured early momentum in the smart eyewear category by shipping a functional, fashion-forward product that works as a standalone camera device with basic AI features. Apple’s approach borrows the camera-centric, display-free formula from Meta but intends to differentiate through superior optics, tighter iPhone integration, and deeper Apple Intelligence features. Where Meta Ray-Bans excel at simplicity and independence, the Apple smart glasses launch will emphasize visual AI—identifying plants, translating signs, describing scenes—powered by the computational resources of the iPhone and Apple’s AI models.
The key strategic insight from Gurman is that Apple is not trying to leapfrog Meta with a full AR display; instead, it is acknowledging that first-generation AR glasses with on-lens displays remain technically immature and socially awkward. By launching a camera-first product, Apple can ship something useful immediately while building the software ecosystem and gathering the data that will inform future generations with actual AR displays. It is a pragmatic admission that display technology is not yet ready for what consumers ultimately want, while the cameras and AI are ready now.
When Will Apple Smart Glasses Launch?
Gurman stated the glasses will launch “this year,” meaning 2026, though some reports suggest production may begin in 2026 with a full market launch potentially slipping to 2027. No official Apple announcement has confirmed these timelines, and the company has not disclosed pricing or exact availability. The urgency reflects Apple’s recognition that Meta is building an early lead in smart glasses adoption, and that waiting another two years would cede the category entirely to a competitor.
Will the Apple smart glasses have a display on the lenses?
No, not in the first generation. Gurman explicitly stated that full AR displays on the lenses will not happen initially, though they may arrive in future versions. The first-generation glasses will be camera-only, with all visual feedback delivered through audio or an iPhone screen. This design choice prioritizes practicality and battery life over the flashy AR experiences that have proven difficult to execute at scale.
Can you use Apple smart glasses without an iPhone?
The glasses will be heavily iPhone-dependent for full functionality, though some on-device AI capabilities may work independently. This tethering strategy keeps the glasses lightweight and affordable while ensuring that premium features like translation, object recognition, and detailed scene descriptions leverage the iPhone’s processing power. Standalone operation is not a design goal for the first generation.
What other Apple AI wearables are coming alongside smart glasses?
Apple is developing AirPods with built-in cameras for environmental awareness and an AI pendant for voice-only interactions without a display. These products expand the smart glasses launch into a broader wearable AI ecosystem, allowing different users to engage with Apple Intelligence at varying levels of complexity and cost.
Apple’s smart glasses launch represents a calculated competitive move against Meta’s early dominance, not a revolutionary leap forward in AR technology. By shipping a camera-first, AI-powered device that integrates smoothly with the iPhone, Apple can capture market share while avoiding the display pitfalls that have plagued AR development. The real battle will not be won by the first-generation glasses themselves, but by how quickly Apple can build software features that make them indispensable. If the company executes on visual AI and ecosystem integration, the glasses could redefine how people interact with information in the physical world. If it stumbles on software or pricing, they will be a footnote in the history of wearables. The 2026 launch will reveal which path Apple has chosen.
This article was written with AI assistance and editorially reviewed.
Source: Tom's Guide