With just a few weeks left until WWDC, new reports have detailed the major changes coming to Siri in iOS 27. Apple is preparing a completely rebuilt virtual assistant powered by Google Gemini, marking one of the most significant updates to the platform in years. The revamped Siri will introduce an always-on, agentic experience that can tap into personal data and take actions across apps, along with a new system-wide search gesture and a dedicated chatbot-like interface.
An always-on agent powered by Gemini
According to the latest reporting, the Gemini-powered Siri in iOS 27 will function as an 'always-on agent' that can access personal data and perform actions across various apps without requiring constant user prompts. This represents a fundamental shift from the current Siri, which primarily responds to direct queries. The new assistant will be capable of proactive suggestions, context-aware automation, and seamless integration with third-party services. For example, it could automatically pull up a calendar event, send a message to a contact, or adjust smart home settings based on time of day or location.
Apple has been working on this overhaul for several years, with early indications appearing in rumored partnerships with Google. The decision to use Gemini, Google's advanced multimodal AI model, signals Apple’s recognition of the need for more powerful natural language processing and reasoning capabilities. Unlike previous versions of Siri, which relied on Apple's own smaller models, the Gemini integration allows for far more complex interactions, including understanding context from previous conversations, handling multi-step tasks, and even generating content like email drafts or summaries.
New system-wide search gesture
One of the most visible user-facing changes in iOS 27 is a new gesture that allows users to invoke a system-wide 'Search or Ask' feature from anywhere. When the user swipes down from the top center of the screen, a bar appears in the Dynamic Island with the option to 'Search or Ask'. From there, users can tap a microphone icon to activate voice mode or start typing. The report notes that this interface is similar to the current Spotlight Search but can show more advanced results and additional data from within apps. For instance, a search for 'weather' might display a mini app card with the forecast, or a search for a contact could show recent messages and upcoming meetings.
This gesture represents Apple’s attempt to unify search and assistant functions into a single, intuitive interaction. The Dynamic Island, introduced with the iPhone 14 Pro, will now serve as the central hub for Siri interactions, replacing the previous full-screen takeover. When activated from the power button or via the wake word, Siri will appear in the Dynamic Island with options to search or ask, keeping the user’s current task in view. The transparent results card that appears after invoking Siri can be swiped down to enter a full chatbot conversation mode, which looks similar to a text message thread. This mode includes inline mini app cards to display results for weather, appointments, notes, and more.
Dedicated Siri app with chatbot interface
For the first time, Apple is planning a standalone Siri app with a chatbot-like conversational interface. This app will include features like uploads, history, and pins, allowing users to have ongoing conversations with the assistant. The chatbot view closely resembles popular AI chat interfaces from competitors, such as ChatGPT or Gemini, but is deeply integrated with the system. Users can upload images and documents to Siri via a '+' button, and the assistant will be able to analyze and respond to them. This opens up possibilities for tasks like summarizing a document, extracting text from an image, or even generating captions for photos.
The report also indicates that users will be able to switch between Siri, ChatGPT, and Gemini as their default engine within the search bar. This flexibility is a departure from Apple's usual walled-garden approach and suggests a growing recognition that users want choice in their AI assistants. The toggle will be easily accessible, allowing users to switch based on the task at hand. For example, ChatGPT might be preferred for creative writing, while Gemini could be better for data analysis or search.
Background and historical context
Siri was originally launched in 2011 as one of the first consumer-facing virtual assistants, but it quickly fell behind competitors like Amazon’s Alexa, Google Assistant, and later GPT-based services. Apple has made incremental improvements over the years, including on-device processing, improved language support, and integration with HomeKit, but the assistant has been widely criticized for its limited capabilities. The iOS 27 overhaul is seen as Apple’s attempt to catch up and redefine Siri for the age of generative AI.
The partnership with Google is particularly notable given the competitive dynamics between the two companies. Apple has previously used Google as the default search engine on iOS, and this deal extends that relationship into the AI space. However, Apple has also been working on its own large language models, and some analysts believe the Gemini integration is a stopgap until Apple’s own models catch up. Regardless, the move positions Siri as a more versatile and powerful tool for users.
WWDC 2026 is expected to be dominated by AI-focused announcements, with Siri being the centerpiece. The conference’s artwork reportedly features a new Siri animation, which will likely debut alongside the software. The new design is said to be more fluid and expressive, using gradients and light effects that are consistent with iOS 27’s Liquid Glass aesthetic. This visual overhaul extends to other parts of the system, including the Camera app, which will reportedly gain AI-powered features such as real-time object recognition and enhanced computational photography.
Analysis and implications
The new Siri represents a strategic pivot for Apple. By embracing a third-party AI model, the company can offer a competitive assistant without needing to develop the underlying technology from scratch. This allows Apple to focus on user experience and integration while leveraging Google’s AI infrastructure. For users, the benefits are clear: a more capable, context-aware assistant that can handle complex tasks and retrieve information from deep within apps.
However, the reliance on Google raises privacy concerns. Apple has long positioned itself as a champion of user privacy, with features like on-device processing and differential privacy. The Gemini integration likely involves sending some data to Google’s servers, though Apple may implement privacy measures such as anonymization or real-time processing with zero data retention. The company is expected to share more details about privacy safeguards at WWDC.
Another implication is the potential impact on app developers. The new Siri will have deeper access to app data, which could enable new kinds of integrations but also requires developers to adopt new APIs. Apple is likely to introduce new developer tools to facilitate this, and apps that embrace Siri integration could see increased user engagement.
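Apple has not published the new Siri APIs, but its existing App Intents framework already lets apps expose actions and data to Siri and Spotlight, and the deeper integration described in the report would plausibly build on it. As a rough sketch, here is what exposing an in-app action looks like with today's framework; the intent name, parameter, and dialog text are hypothetical, not from the report:

```swift
import AppIntents

// Hypothetical example: an app exposing a "check weather" action to the
// system assistant via Apple's existing App Intents framework.
struct CheckWeatherIntent: AppIntent {
    // The phrase shown to users in Siri, Spotlight, and Shortcuts.
    static var title: LocalizedStringResource = "Check Weather"

    // A typed parameter the assistant can fill from the user's request.
    @Parameter(title: "City")
    var city: String

    // Runs when the assistant invokes the action; returns a spoken/shown reply.
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query its own data or a weather service here.
        let forecast = "Sunny and 72°F in \(city)"
        return .result(dialog: "\(forecast)")
    }
}
```

Apps that already adopt App Intents would presumably be best positioned for the new Siri, since the framework is how the system learns what actions an app offers and what data those actions need.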
Finally, the addition of a standalone Siri app and chatbot mode positions Apple to compete directly with services like ChatGPT, Gemini, and Microsoft Copilot. While these services are available as separate apps, Siri’s deep system integration gives it an advantage. Users can invoke the assistant from anywhere, get results inline, and continue conversations across devices. This could make Siri the default AI assistant for many iPhone users, potentially reducing the need for third-party apps.
The report also hints at future capabilities, such as real-time translation, advanced automation through Shortcuts, and even proactive health suggestions. As Apple continues to refine Siri’s AI, the assistant may become an indispensable part of daily life, handling everything from scheduling to content creation. For now, the focus is on getting the fundamentals right: a fast, accurate, and private assistant that truly understands and anticipates user needs.
Source: 9to5Mac News