Morning Overview

Apple smart glasses may outshine Meta Ray-Bans with an advanced weapon

Apple is preparing to ship smart glasses by the end of 2026, aiming squarely at Meta’s Ray-Ban smart glasses in a wearable AI market that has tripled in size over the past year. The company’s reported edge is a camera-driven AI system called Visual Intelligence that can identify objects, translate text, and act on what the wearer sees, all without pulling out a phone. That capability, already live on the iPhone, could reshape the competitive balance in a product category where software is rapidly eclipsing hardware as the primary source of value.

Apple Bets on Eyewear Over a Camera Watch

Apple has accelerated prototype development for smart glasses as part of a broader push into AI-powered wearables, according to Bloomberg reporting. The company reportedly shelved plans for a camera-equipped Apple Watch in favor of concentrating resources on eyewear, a decision that signals where Apple sees the highest return for on-device AI. Supply-chain sources described a ramp-up in prototype activity, with a target window by the end of 2026 for a product designed to compete directly with Meta’s Ray-Ban partnership. While Apple has a history of entering categories late, the company typically waits until core technologies and component costs align with its preferred margins and user-experience standards.

The shift from wrist to face is not arbitrary. A camera mounted on glasses sits at eye level, capturing exactly what the wearer is looking at, which gives AI models far richer context than a wrist-mounted sensor ever could. Apple’s broader strategy of pivoting toward camera-centric AI features, such as pointing a device at an object to instantly pull up relevant information, depends on that alignment between the camera’s field of view and the user’s attention. Glasses solve that problem by default, turning the wearer’s perspective into a continuous input stream. That makes eyewear a more natural host for Visual Intelligence than a watch, particularly for tasks like translation, navigation, and real-time assistance in unfamiliar environments.

Visual Intelligence as the Competitive Edge

The feature Apple calls Visual Intelligence already works on the iPhone 16 lineup, and its capabilities hint at what a glasses version could deliver hands-free. According to Apple documentation, the system uses the device camera to identify objects and places, translate and summarize text in real time, and convert physical media like event flyers directly into calendar entries. It also includes on-screen analysis that can read and interpret whatever the camera captures, from restaurant menus to product labels. Porting those functions to a wearable worn on the face would eliminate the friction of raising a phone to scan something, turning a deliberate action into a passive, always-available capability that runs in the background until needed.

That distinction matters because Meta’s Ray-Ban glasses, while popular, still route much of their AI processing through Meta’s own assistant and lack the deep operating-system integration Apple controls on iOS. An Apple glasses user who already owns an iPhone, iPad, or Mac would inherit a unified ecosystem where Visual Intelligence results feed directly into native apps like Calendar, Maps, Safari, and Translate. Meta has no equivalent closed loop. For the roughly one billion active iPhone users worldwide, that tight integration could make Apple’s offering stickier than any competitor relying on third-party software bridges. It also positions Apple to monetize the experience over time through services and upgrades to its broader hardware line, rather than treating glasses as a standalone gadget.

A Market Tripling in Size, With Tensions at the Top

The commercial stakes are rising fast. Smart-glasses sales tripled to more than 7 million units in 2025, according to Financial Times analysis of the wearables market. EssilorLuxottica, the eyewear giant that manufactures Meta’s Ray-Ban smart glasses, confirmed that figure in its own earnings commentary, framing AI glasses as a significant growth driver. But that growth comes with friction. Meta and EssilorLuxottica are now sparring over pricing for the next generation of Ray-Ban AI glasses, a dispute that could slow product updates or push retail prices higher at precisely the moment Apple enters the category. If Meta pushes for lower prices to accelerate adoption while EssilorLuxottica seeks to protect premium margins, the partnership could struggle to keep pace with a vertically integrated rival.

The Financial Times analysis also identified a structural shift worth watching. Value in the eyewear industry is migrating away from frames and brand names toward the software ecosystems running on those frames. That trend favors companies that control both the hardware platform and the AI stack, which is exactly the position Apple occupies with iOS and Apple Intelligence. EssilorLuxottica makes the physical product, but Meta controls the software, and the two companies now disagree on how to split the economics. Apple, by contrast, would own the entire vertical, from chip design through operating system to AI model, eliminating the partnership friction that is currently slowing Meta’s roadmap.

Why the Coverage May Overstate Apple’s Readiness

Most reporting on Apple’s glasses project, including the Bloomberg supply-chain account, relies on unnamed sources and prototype-stage details rather than confirmed product specifications. Apple has not made any public statement about smart glasses, nor has any executive tied Visual Intelligence explicitly to an eyewear product on the record. The connection between the iPhone’s existing Visual Intelligence features and a future glasses product is an analytical inference, not a confirmed plan. Readers should weigh that gap. The technology exists and works on the phone, but whether Apple can miniaturize it into a comfortable, all-day wearable at a consumer price point by late 2026 is an engineering and manufacturing question no leak has yet answered. Battery life, heat dissipation, and privacy-preserving camera design remain open challenges for every company in the category.

The user interface for Apple’s glasses would reportedly depend on speakers, microphones, and a dedicated camera to give the device environmental context, a setup that raises its own design and regulatory hurdles. Subtle audio cues and voice input have to be balanced against social norms around speaking to a device in public, while always-on cameras invite scrutiny from privacy advocates and policymakers. Apple has emphasized on-device processing and data minimization in its broader AI narrative, and any glasses launch would likely lean heavily on those themes to differentiate from rivals. At the same time, the company will be competing not just for consumers but for the AI and hardware engineering talent every rival in the category is also chasing.

What Apple’s Move Means for the Broader Wearables Landscape

If Apple does ship smart glasses by 2026, the move will reverberate well beyond consumer electronics. A successful launch would validate camera-first wearables as a mainstream computing platform, encouraging developers to build software that assumes a continuous visual feed rather than sporadic phone checks. That, in turn, could accelerate adoption in fields like logistics, field service, and healthcare, where hands-free access to instructions or patient data offers clear productivity gains. Enterprises that have hesitated to roll out head-mounted displays may find it easier to justify pilots once a consumer-grade product from Apple normalizes the form factor. For consumers, everyday tasks such as navigating a city, comparison shopping in a store, or documenting home repairs could quietly shift from phone screens to subtle overlays and audio prompts delivered through glasses.

The ripple effects will also be felt in media, advertising, and subscription businesses that depend on understanding how people interact with digital content. As more interactions move from touchscreens to ambient interfaces, publishers and platforms will experiment with new formats optimized for quick glances and spoken responses. In a world where smart glasses become another gateway to information, the competition will not only be over who ships the sleekest hardware, but over which ecosystems can deliver the most useful, trustworthy, and context-aware experiences without overwhelming users or eroding their privacy.

*This article was researched with the help of AI, with human editors creating the final content.