Apple is rebuilding Siri as the Campos chatbot this fall, reports suggest

Siri has always been part of the iPhone, but rarely the part that feels like the future. It surfaces for a timer or a quick command, then fades out of view again, leaving the heavy lifting to apps, taps, and searches.
Now Apple is preparing a shift that would turn Siri into a true front door for the operating system. Bloomberg reports Apple is gearing up to overhaul Siri later this year into its first full AI chatbot, code-named Campos, in an effort to join the generative AI race led by OpenAI and Google.
The rollout is set up in two phases.
First comes an earlier Siri update that keeps the current interface while adding long-promised Apple Intelligence features, like analyzing on-screen content and using personal data to assist in tasks. That update is pegged to iOS 26.4 in the coming months.
Then comes the larger change later in the year: Campos, a chat-style Siri designed for back-and-forth dialogue and deeper task completion, closer to what users expect from ChatGPT or Google’s Gemini.
Apple aims to preview it at WWDC in June and release it in September as the headline feature of the next major iOS, iPadOS, and macOS releases.
From voice assistant to OS layer
Campos is positioned as more than a smarter assistant. It is described as being deeply embedded across the iPhone, iPad, and Mac, replacing the current Siri interface while keeping familiar triggers like the “Siri” command or holding the side button.
The goal is not simply to provide answers. It is to sit across Apple’s workflows. Campos is expected to search the web, create content, generate images, summarize information, and analyze uploaded files, while integrating into core apps like Mail, Photos, Music, Podcasts, and TV. If done properly, it could be much more than a chatbot.
Siri finally gets agency
The most telling detail is that Campos is described as capable of interpreting what’s on the screen. It can analyze open windows, suggest next actions, and take commands that move beyond text replies. It’s also expected to control device features, letting users place calls, set timers, or launch the camera.
If that works, Siri stops being a feature you occasionally use and becomes something you rely on to control the device. It even raises a bigger possibility: Campos could eventually take over Spotlight, absorbing device search into a single chat interface.
The Google partnership under the hood
Another twist is the technology. Campos is said to rely heavily on a custom AI model developed by Google’s Gemini team.
At the same time, the report refers to the systems behind Siri as “Apple Foundation Models,” with different versions tied to the rollout. The near-term update reportedly uses “Apple Foundation Models version 10,” while Campos runs a higher-end “version 11” comparable to Gemini 3.
The details are blurry on purpose. The reporting does not fully spell out whether Apple is licensing Gemini, fine-tuning a Google model, or training a custom model of its own on Google infrastructure. What it does make clear is that Apple’s chatbot plan depends heavily on Google’s models, and perhaps its compute too.
Privacy tension: memory and where compute runs
One issue under discussion is how much the chatbot should remember about its users. Chatbots get more useful when they retain context, but Apple is said to be considering sharply limiting memory to preserve privacy.
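The report doesn’t say how such a limit would work in practice, but the simplest way to picture the trade-off is a conversation memory capped at a fixed number of turns, where older context is dropped instead of accumulating. The Swift sketch below is purely illustrative; the type, its name, and the limit are invented for this example, not anything Apple has described.

```swift
import Foundation

// Purely illustrative: a conversation memory capped at a fixed number of
// turns, so older context is discarded rather than kept indefinitely.
// The names and the limit here are invented for this example.
struct ConversationMemory {
    private(set) var turns: [String] = []
    let maxTurns: Int

    init(maxTurns: Int = 10) {
        self.maxTurns = maxTurns
    }

    /// Record a new turn, dropping the oldest ones once the cap is hit.
    mutating func remember(_ turn: String) {
        turns.append(turn)
        if turns.count > maxTurns {
            turns.removeFirst(turns.count - maxTurns)
        }
    }

    /// The context that would accompany the next request.
    var contextWindow: String {
        turns.joined(separator: "\n")
    }
}

// A tighter cap preserves more privacy; a looser one makes replies more
// personal. That is essentially the dial the report says Apple is weighing.
var memory = ConversationMemory(maxTurns: 3)
memory.remember("User: remind me about my flight tomorrow")
memory.remember("Assistant: noted, SFO at 9am")
print(memory.contextWindow)
```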
There is also a possible infrastructure shift. While the earlier update reportedly runs on Apple’s Private Cloud Compute, Apple and Google are discussing hosting the more advanced Campos chatbot directly on Google servers using TPUs. That would be a significant shift in optics for a company that has built its brand on privacy.
Apple is also designing Campos so the underlying models can be swapped out over time, giving it the option to move away from Google later if it chooses.
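The report doesn’t explain how that swap mechanism would be built, but the underlying pattern is a familiar one: hide the model provider behind a narrow interface so the backend can be replaced without touching the rest of the system. Here is a minimal, purely hypothetical Swift sketch of that idea; every name in it (ModelBackend, GeminiBackend, AppleFoundationBackend, Assistant) is invented for illustration and is not a real Apple or Google API.

```swift
import Foundation

/// A minimal abstraction any chatbot backend could conform to.
protocol ModelBackend {
    var name: String { get }
    func respond(to prompt: String) async throws -> String
}

/// Placeholder standing in for a Google-hosted model.
struct GeminiBackend: ModelBackend {
    let name = "gemini"
    func respond(to prompt: String) async throws -> String {
        // In reality this would call a remote inference service.
        "Gemini-style reply to: \(prompt)"
    }
}

/// Placeholder standing in for an in-house model.
struct AppleFoundationBackend: ModelBackend {
    let name = "apple-foundation"
    func respond(to prompt: String) async throws -> String {
        "Foundation-model reply to: \(prompt)"
    }
}

/// The assistant depends only on the protocol, so the backend can be
/// swapped without changing any calling code.
struct Assistant {
    var backend: any ModelBackend

    func ask(_ prompt: String) async throws -> String {
        try await backend.respond(to: prompt)
    }
}

// Usage: start on one backend, switch later with a single assignment.
var assistant = Assistant(backend: GeminiBackend())
// ... later, swap providers without touching call sites:
assistant.backend = AppleFoundationBackend()
```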
There is one supporting signal pointing in the same direction. The Information reports that Apple is working on an AirTag-sized AI wearable pin with cameras, microphones, and a speaker, and it could ship as soon as 2027. If Campos really becomes an AI layer across the OS, a small wearable starts to look like a new surface built around an assistant that understands context, screens, and surroundings.
Why Apple is making the turn now
Apple has argued for a long time that users prefer AI woven directly into features rather than a standalone chat experience. But conversational AI has become the default expectation.
The move appears to be a response to the rocky Apple Intelligence rollout in 2024. The urgency is compounded by OpenAI, which is trying to become an AI platform, building new devices with Jony Ive, and reportedly poaching dozens of Apple engineers.
The integration is the product
Apple can make a chatbot. What's harder is whether it will feel like Apple.
If Campos is fast, truly useful inside core apps, and privacy-credible even while leaning on Google models, it becomes the first Siri upgrade in years that changes behavior instead of just adding a checklist feature. If it isn’t, Siri doesn’t merely fall further behind. It remains the place where Apple users notice the gap every single day, as they already do.
Y. Anush Reddy is a contributor to this blog.



