Morning Overview

Report: Apple is planning a Siri camera mode and upgraded visual AI for iOS 27

Apple plans to add Siri as a dedicated shooting mode inside the iPhone Camera app with iOS 27, according to Bloomberg’s Mark Gurman, citing people with knowledge of the plan. The change would place Siri alongside familiar options like Photo, Video, and Portrait, turning the camera into a real-time AI assistant that can identify objects, scan nutrition labels, and pull contact details from business cards. If the plans hold, it would be Apple’s most significant rethinking of the Camera app in years and a direct answer to visual AI tools already offered by Google and Samsung.

What Apple is reportedly building

The core idea is simple: instead of requiring a press-and-hold gesture on the Camera Control button to launch Visual Intelligence, Apple would bring the feature into the Camera app itself, accessible with a swipe. A new Siri-branded mode would sit in the mode carousel, giving users a way to point their phone at something and get information about it without leaving the app they already open dozens of times a day.

Gurman’s reporting identifies several specific capabilities tied to the upgraded mode. Nutrition label scanning would let users aim their camera at food packaging and get ingredient or calorie breakdowns on screen. Contact extraction would turn a photographed business card, conference badge, or flyer into a saved entry in the Contacts app, cutting out the tedious step of typing names, phone numbers, and email addresses by hand.
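
Apple has not said how the reported contact extraction would be built, but the building blocks already exist in public, on-device frameworks. As a rough illustrative sketch (the function name and the first-line-is-the-name heuristic are assumptions, not details from the report), Vision's text recognition plus NSDataDetector can already turn a photographed business card into a draft Contacts entry:

```swift
import Foundation
import CoreGraphics
import Vision
import Contacts

// Illustrative sketch only: this is not Apple's reported implementation.
// It shows how today's public Vision and Contacts frameworks can turn a
// photographed business card into a draft contact, entirely on device.
func contact(from businessCard: CGImage, completion: @escaping (CNMutableContact) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        // Collect the recognized lines of text from the card.
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        let text = lines.joined(separator: "\n")

        let contact = CNMutableContact()
        // Simple heuristic for the sketch: treat the first line as the name.
        contact.givenName = lines.first ?? ""

        // NSDataDetector finds phone numbers and mailto: links in the raw text.
        let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link]
        if let detector = try? NSDataDetector(types: types.rawValue) {
            let range = NSRange(text.startIndex..., in: text)
            for match in detector.matches(in: text, options: [], range: range) {
                if let phone = match.phoneNumber {
                    contact.phoneNumbers.append(
                        CNLabeledValue(label: CNLabelWork,
                                       value: CNPhoneNumber(stringValue: phone)))
                } else if let url = match.url, url.scheme == "mailto" {
                    let address = url.absoluteString.replacingOccurrences(of: "mailto:", with: "")
                    contact.emailAddresses.append(
                        CNLabeledValue(label: CNLabelWork, value: address as NSString))
                }
            }
        }
        completion(contact)
    }
    request.recognitionLevel = .accurate

    // Run text recognition against the photographed card.
    try? VNImageRequestHandler(cgImage: businessCard, options: [:]).perform([request])
}
```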

These additions build on what Visual Intelligence already does. Apple’s own support documentation confirms the feature can identify plants and animals, translate text, summarize documents, read content aloud, and create calendar events from photographed invitations. The underlying recognition and text-processing systems are already running on recent iPhones. What changes in iOS 27, based on the reporting, is how Apple packages and surfaces those tools.
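
The calendar-event capability Apple already documents maps onto public APIs in much the same way. A minimal sketch, assuming calendar access has already been granted and that the invitation text has already been recognized (the function and parameter names are illustrative, not Apple's):

```swift
import Foundation
import EventKit

// Illustrative sketch only: Apple does not document how Visual Intelligence
// builds events from photographed invitations. Here, NSDataDetector pulls a
// date out of recognized text and EventKit saves an event for it. The
// recognizedText parameter stands in for the output of an OCR pass like the
// one sketched above.
func addEvent(from recognizedText: String, title: String, to store: EKEventStore) throws {
    let detector = try NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
    let range = NSRange(recognizedText.startIndex..., in: recognizedText)

    // Take the first date-like string the detector finds, e.g. "June 9 at 10 a.m."
    guard let start = detector.firstMatch(in: recognizedText, options: [], range: range)?.date else {
        return
    }

    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = start
    event.endDate = start.addingTimeInterval(60 * 60) // assume a one-hour event
    event.calendar = store.defaultCalendarForNewEvents
    try store.save(event, span: .thisEvent)
}
```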

This is not an isolated move. In January 2026, Gurman reported that Apple was rebuilding Siri as a conversational chatbot across iPhone and Mac, framing the effort as a response to pressure from OpenAI and other AI competitors. The camera-specific details reported in late April fit within that broader overhaul, suggesting the Siri mode is one visible piece of a much larger assistant redesign rather than a standalone experiment.

Apple itself has publicly signaled ambitions along these lines. At WWDC 2025, the company described plans to extend Visual Intelligence to on-screen content and to enable actions like searching for similar items through third-party services. The reported iOS 27 changes represent an acceleration of that trajectory, not a departure from it.

Why this matters for iPhone users

The most immediate impact would be discoverability. Right now, Visual Intelligence is tucked behind a hardware gesture that many iPhone owners have never tried. Moving it into the Camera app’s main interface would expose the feature to the hundreds of millions of people who open that app daily to take photos and videos but have no idea their phone can also identify a dog breed, translate a restaurant menu, or pull text from a whiteboard.

Consider a practical scenario: you are at a grocery store comparing two cereal boxes. Today, you would need to know about the Camera Control button, press and hold it, and hope Visual Intelligence can parse the label. Under the reported iOS 27 design, you would open the Camera app, swipe to the Siri mode, and point. The friction drops considerably, and the feature goes from a power-user trick to something anyone might stumble into.

The competitive context matters, too. Google Lens has offered real-time object identification, text extraction, and visual search for years, and Samsung has built similar AI camera features into its Galaxy phones. Apple’s current approach, burying Visual Intelligence behind a button shortcut, has left it feeling like an afterthought compared to those rivals. Placing Siri directly in the Camera app would close that gap in a way users can immediately feel.

It is also worth noting that Apple integrated ChatGPT into Siri with iOS 18.2, creating an unusual dynamic where a competitor’s AI powers parts of Apple’s own assistant. Building out Siri’s visual capabilities natively, through on-device recognition and Apple’s own models, could reduce that dependence over time and give Apple more control over the experience.

What is still unclear

No Apple executive has confirmed the Siri camera mode or the Visual Intelligence relocation. The reporting relies on anonymous sources, and Apple has not commented. Internal plans of this kind can shift before a public announcement, which for iOS 27 would likely come at WWDC in June 2026, followed by a public release in the fall.

Key implementation questions remain unanswered. Gurman’s reporting does not specify which iPhone models would support the new mode, whether it would require the same hardware as the Camera Control button, or how it would work on older devices that lack that button entirely. The nutrition scanning feature has no confirmed technical partner or database behind it, leaving open questions about regional availability, dietary standards, and data accuracy.

No leaked code, prototype screenshots, or patent filings have surfaced to corroborate the interface changes. That is not unusual for Apple software features months before release, but it means the reporting rests on a single outlet, albeit a highly credible one. Individual capabilities, like the depth of nutrition breakdowns or the reliability of contact parsing, are more speculative than the existence of a Siri mode itself.

Privacy is another open question. Deeper assistant integration into the camera viewfinder would raise fresh concerns about how much of what the camera sees is processed on-device versus sent to Apple’s servers or third-party cloud services. Apple has historically emphasized on-device processing for sensitive data, but the company has not addressed how that principle would apply to a persistent Siri mode analyzing live camera feeds. Until Apple speaks to this directly, it remains one of the biggest unknowns.

How credible is the reporting?

Mark Gurman has accurately previewed Apple software and hardware changes before official announcements on numerous occasions over more than a decade of covering the company. His sourcing to multiple people familiar with the plans adds weight, though it still represents pre-announcement intelligence rather than a confirmed product commitment. Apple’s own public statements and existing feature set align with the direction described, which makes the reporting plausible on its face.

The safest read: the Siri camera mode is likely in development and on track for iOS 27, but the specific feature list and final interface could change before Apple shows anything publicly. WWDC 2026, expected in June, will be the first real test of these claims. If Apple walks onstage and swipes to a Siri mode in the Camera app, the reporting will be validated. If not, it may simply mean the timeline slipped rather than that the plan was wrong.

For now, the report offers the clearest picture yet of where Apple wants to take the iPhone camera: from a tool for capturing moments to a tool for understanding the world in front of you, with Siri as the guide.


*This article was researched with the help of AI, with human editors creating the final content.