Apple has begun an aggressive push to develop AI-powered smart glasses that would compete head-to-head with Meta’s Ray-Ban line, according to reporting from Bloomberg’s Mark Gurman, with an internal target of shipping the product by late 2026. As part of that effort, the company has canceled a planned Apple Watch model with a built-in camera, redirecting engineering talent toward the glasses project instead.
If the timeline holds, Apple would be entering a product category where it has zero shipping hardware and where Meta already has years of real-world sales, user feedback, and manufacturing partnerships behind it. That makes this one of the riskier bets Apple has placed in recent memory, and one of the most consequential for the future of face-worn computing.
Meta’s head start is substantial
Meta did not stumble into smart glasses. The company launched its first pair, Ray-Ban Stories, in 2021 through a partnership with eyewear giant EssilorLuxottica. It followed up with the Ray-Ban Meta glasses in 2023, adding Meta AI voice features and improved cameras, then expanded the lineup further in 2024 with new styles and live AI capabilities powered by its multimodal Llama models.
At its Connect 2024 event in September, Meta went further still. The company showed off Orion, a prototype pair of true AR glasses with a built-in display and a neural wristband for gesture control. While Orion is not yet a consumer product, the demo signaled where Meta’s roadmap is headed: glasses that can overlay information on the real world while reading subtle hand movements, not just glasses that listen and respond through a speaker.
That multi-generational head start gives Meta advantages Apple cannot replicate overnight. Meta has real-world battery-life data from millions of users, a locked-in manufacturing relationship with the world’s largest eyewear company, and a developer community already building for the platform. Apple would need to match or surpass that foundation at launch to avoid being dismissed as a latecomer.
What Apple reportedly brings to the fight
Apple’s pitch, based on Gurman’s reporting, centers on the same ecosystem leverage the company has used to dominate in other categories. Tight integration with iPhone, iCloud, Apple Watch, and AirPods could make Apple glasses feel like a natural extension of a system hundreds of millions of people already use daily. Apple’s on-device machine learning, which powers features like Live Text, Visual Look Up, and the company’s expanding Apple Intelligence suite, would presumably form the AI backbone of any glasses product.
Privacy could be another differentiator. Apple has spent years marketing itself as the company that processes data on your device rather than sending it to the cloud. For a product category that puts cameras on your face in public spaces, that positioning matters. Consumers and regulators alike are already uneasy about always-on recording, and Apple’s established privacy reputation could ease adoption in ways Meta’s data-heavy business model cannot.
But translating those advantages into a glasses form factor is a distinct engineering challenge. A pair of glasses has a fraction of the battery capacity, thermal headroom, and processing power of an iPhone. Running meaningful AI workloads under those constraints is not a solved problem for anyone.
The technical ceiling is still low
A recent preprint published on arXiv in April 2026 helps illustrate just how hard this is. Researchers built an always-on AI agent that runs on Meta’s current Ray-Ban smart glasses, testing continuous visual understanding, language processing, and context tracking throughout a user’s day. The results were sobering: the system worked, but battery life dropped sharply under sustained AI workloads, interaction latency spiked during complex tasks, and the researchers flagged significant unresolved questions about what data gets captured and how bystanders are affected.
The paper, which has not yet undergone full peer review, tested Meta’s existing hardware rather than anything from Apple. But its findings apply broadly. Any company shipping AI-powered glasses in 2026 or 2027 will face the same physics: small batteries, limited cooling, and a user who expects the product to last a full day without a midafternoon charge. Apple’s custom silicon expertise, honed through years of A-series and M-series chip development, could help on the efficiency front, but the gap between a phone chip and a glasses chip is enormous.
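The battery math above can be made concrete with a back-of-envelope calculation. The figures below are illustrative assumptions for a glasses-class device, not measured values from the paper or from any shipping product, but they show why sustained AI workloads and all-day battery life are in direct tension:

```python
# Back-of-envelope runtime estimate for always-on AI on smart glasses.
# All numbers are illustrative assumptions, not measured specifications.

def runtime_hours(battery_mah: float, voltage_v: float, draw_mw: float) -> float:
    """Hours of runtime given battery capacity and average power draw."""
    capacity_mwh = battery_mah * voltage_v  # energy stored in the cell
    return capacity_mwh / draw_mw

BATTERY_MAH = 160   # assumed glasses-class cell (a phone carries ~4,000+ mAh)
VOLTAGE_V = 3.85    # typical nominal voltage for a Li-ion cell

# Light use (occasional voice queries) vs. sustained camera + on-device
# inference; both power-draw figures are hypothetical.
print(round(runtime_hours(BATTERY_MAH, VOLTAGE_V, 50), 1))   # → 12.3 hours
print(round(runtime_hours(BATTERY_MAH, VOLTAGE_V, 600), 1))  # → 1.0 hours
```

Under these assumptions, the same battery that survives a workday of light use is exhausted in about an hour of continuous AI processing, which is the core constraint any 2026-era glasses product must design around.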
Pricing and the Vision Pro lesson
Apple’s recent experience with Vision Pro looms over the glasses project. The mixed-reality headset launched in February 2024 at $3,499, a price that attracted developers and enthusiasts but kept mainstream buyers on the sidelines. Sales slowed quickly enough that analysts began questioning whether Apple had misjudged the market’s willingness to pay a premium for first-generation spatial computing hardware.
Meta took the opposite approach with Ray-Ban Meta glasses, pricing them starting at $299, squarely in consumer electronics territory rather than developer-kit pricing. That strategy built a larger installed base faster, which in turn attracted more app development and more media attention.
Apple has not disclosed any pricing for its rumored glasses, and Gurman’s report did not include details on that front. But the company faces a genuine strategic tension: price high and risk another slow-adoption cycle, or price aggressively and accept thinner margins on hardware that may need years of iteration to reach its full potential. How Apple resolves that tension will shape whether the product reaches tens of millions of users or remains a niche accessory for committed Apple loyalists.
Regulatory pressure is building for everyone
Neither Apple nor Meta will ship AI glasses in a regulatory vacuum. The European Union’s AI Act, which began phased enforcement in 2025, imposes new obligations on AI systems that process biometric data or operate in public spaces. In the United States, state-level privacy laws continue to multiply, and federal legislation remains a moving target. Apple’s fiscal year 2025 annual report acknowledges these pressures broadly, citing evolving data-protection rules and competitive intensity across its product lines, though it does not mention smart glasses by name.
For glasses specifically, the regulatory questions are pointed. How do you notify bystanders that a camera is active? Where does AI processing happen, on the device or in the cloud? What happens to the visual data after it has been analyzed? These are not hypothetical concerns. Google learned this the hard way with Google Glass more than a decade ago, when public backlash over recording in social settings helped kill the consumer product before it ever reached wide release.
Apple’s privacy-first branding gives it a rhetorical advantage here, but rhetoric alone will not satisfy regulators. The company will need to demonstrate concrete technical safeguards, and those safeguards will need to work within the power and processing constraints of a glasses form factor.
A wider race is forming
Apple and Meta are not the only companies eyeing this space. Google announced Android XR in late 2024, a platform designed to power both headsets and smart glasses, with Samsung as a hardware partner. Snap has continued iterating on its Spectacles line, most recently releasing a developer-focused AR version. And startups like Brilliant Labs and Even Realities are shipping lightweight smart glasses with AI features at lower price points.
But the Apple-Meta rivalry is the one that will define the category for most consumers. Both companies have the scale, the AI investment, and the distribution to turn smart glasses from a curiosity into a mass-market product. The question is whether Apple can compress years of learning into a single product launch, or whether Meta’s iterative approach of shipping early, learning fast, and improving with each generation proves to be the more durable strategy.
What to watch between now and late 2026
Several signals will clarify the picture in the months ahead. If Apple begins filing glasses-related patents at an accelerated pace, or if supply-chain reports from East Asia identify specific component orders, the late-2026 timeline will gain credibility. A mention of new wearable categories at WWDC in June 2026 would be an even stronger indicator. Conversely, if Apple stays silent through the summer, a delay into 2027 or beyond becomes more likely.
On Meta’s side, watch for updates to the Ray-Ban Meta line and any movement toward shipping a consumer version of the Orion display glasses. Every month Meta spends as the only major tech company with AI glasses on store shelves is another month of user data, developer relationships, and brand association that Apple will have to overcome.
For now, the confirmed facts are narrow but meaningful: Apple is reportedly building AI smart glasses, Meta already sells them, and the technical and regulatory challenges facing both companies are real and unresolved. Everything else, including launch dates, pricing, and which AI assistant ends up on your face, remains an open question worth tracking closely.
*This article was researched with the help of AI, with human editors creating the final content.*