Apple had been falling behind its rivals in artificial intelligence, and the company has decided to make its move in partnership with Google. The alliance between the two companies will bring Gemini to the heart of Siri, opening a new chapter for the iPhone assistant and for everything Apple groups under the Apple Intelligence umbrella.
In the coming weeks we will begin to see the first tangible results: a version of Siri powered by Gemini that will debut as a test in iOS 26.4 and serve as an appetizer for a much deeper overhaul, planned for iOS 27 and the WWDC developers conference in June.
What will the new Siri with Gemini be like in iOS 26.4?

In early January, Apple confirmed a strategic collaboration with Google around Gemini to take a leap forward with its assistant. According to multiple leaks, the company plans to release a prototype of the new Siri with Gemini this February, with the intention of launching it to the public in March or April alongside iOS 26.4.
This first wave won't be a major revolution yet, but it will mark a clear change in daily use. Among the features in development, one standout is World Knowledge Answers, a feature designed to provide answers based on information from the Internet with explicit references, in a style similar to that of advanced chatbots like ChatGPT.
Another key point will be memory of context and previous conversations. Apple had already experimented with something similar years ago, but it was never officially launched. With Gemini in the background, Siri should be able to follow the thread of a conversation, understand references to what's on screen, and adapt better to the user's needs at any given moment.
Deeper integration with the system and third-party applications is also expected. The idea is that Siri can perform complex actions within apps using only your voice: from searching for a specific photo in the gallery, opening a note to jot down a shopping list, or composing a work message, to more specific tasks depending on each compatible application.
Alongside these new features for Siri, iOS 26.4 will include smaller, more classic changes: a new emoji pack (with icons including an orca, a trumpet, and a treasure chest), improvements to tracking lost AirPods via the Find My app, additional security checks before signing in with Apple ID or iCloud, and support for autofilling cards saved in the Passwords app within third-party applications.
The roadmap: from the February beta to the big leap in iOS 27
The version of Siri we'll see in iOS 26.4 is actually an intermediate step. Leaks like those from journalist Mark Gurman suggest that Apple will present a much more conversational Siri at WWDC in June, reinforced by both Gemini and Apple's own cloud infrastructure.
In that next phase, the assistant should be able to search the web, generate text and images, help with programming tasks, summarize long documents or analyze files that the user uploads to the device, all with an interaction style much closer to what is expected of a modern chatbot today.
Meanwhile, rumors suggest that Apple is working on a custom chatbot, codenamed Campos, intended to be integrated into iOS 27, iPadOS 27, and macOS 27. The idea is that Campos will eventually replace Siri's current interface for certain tasks, letting users both type and dictate commands flexibly, and that it will become one of the highlights of this year's WWDC.
If this plan comes to fruition, 2026 would be the year in which Siri goes from being a relatively limited voice assistant to a comprehensive assistance system with advanced automation capabilities, context understanding, and improved access to online information, combining cloud power with local processing on the latest devices.
However, as was the case with the first Apple Intelligence features, the most demanding features of this new Siri will likely be limited to recent iPhone models, which could make a noticeable difference between users depending on the device they have in their pocket.
Apple, Google and the fine print of the Gemini alliance

The agreement between Apple and Google to use Gemini has not been publicly detailed, but several analysts suggest that Apple could pay around $1 billion a year in exchange for access to a customized version of this AI model and associated cloud services.
This move is reminiscent of the agreement by which Google already pays Apple billions to be the default search engine within the Apple ecosystem. The difference, in this case, is that Apple wants to leverage the power of Gemini without relinquishing brand prominence: Gemini will not appear as a standalone product on the iPhone, but rather as an engine that powers features under the umbrella of Siri and Apple Intelligence.
From Cupertino, the message is that the company will maintain strict control over privacy and data. Although it relies on Google's models, Apple has no intention of opening the door to a "pure" Gemini within iOS, and it emphasizes that the integration will be done its way, with its own security layer and with sensitive information processed under its own standards.
In an internal statement, Apple reportedly went so far as to say that "Google's technology provides the strongest foundation for Apple's core models," while expressing confidence that the collaboration will lead to new experiences that fit its product philosophy.
Not everyone welcomes this approach. Figures like Elon Musk have publicly criticized the concentration of power Google gains by controlling both Android and Chrome, as well as a significant portion of the AI infrastructure Apple will use. Furthermore, Apple's alliance with OpenAI and now with Google leaves other players, such as xAI's Grok, clearly outside the spotlight of the major platforms.
Privacy and execution: on the device and in the private cloud
One of the points that generated the most doubt was exactly where the new Siri with Gemini would run. Apple had presented its Private Cloud Compute as the solution for bringing advanced AI features to the cloud without sacrificing privacy, but it was later leaked that the company was considering using Google's infrastructure more directly.
In the latest call with investors, Apple CEO Tim Cook sought to put the matter to rest by explaining that the models will run both on the device itself and in a private cloud designed to maintain the privacy standards the company considers a differentiating factor. He hasn't provided many technical details, but he emphasized that these principles will remain "industry-leading."
Cook also clarified that, although Apple will continue to develop some of its models independently, the customized version of Siri arriving with Gemini is a direct result of the collaboration with Google. The stated goal is to offer a more personalized assistant, deeply integrated with apps and capable of understanding natural language more effectively.
Regarding return on investment, Apple is confident that incorporating AI into established products, such as the iPhone, will generate considerable added value and open new business opportunities in its services. For now, however, the company has not specified what percentage of users can currently access Apple Intelligence or how this impacts sales of new devices.
What does seem clear is that the schedule is fairly well defined: Siri's first improved features arrive in mid-February with the iOS 26.4 beta; the completely revamped version will be unveiled during WWDC in June and launched at scale alongside iOS 27 later on, in a rollout adjusted according to compatible markets and devices.
Siri's role compared to other assistants and Gemini's arrival in the car
Although the Siri refresh with Gemini points to a significant leap forward, Apple isn't abandoning the approach it has maintained in other areas. In the car, for example, the company is working to ensure that third-party AI assistants work within CarPlay. This would open the door to talking to ChatGPT, Gemini, or Claude from the vehicle's own interface.
The idea being considered is that these assistants could handle complex queries and natural conversations while driving, with an interface adapted to the car's screen and CarPlay controls. Questions like "what are the differences between a hybrid and an electric car?" or "give me a summary of today's news" could be answered without touching the phone, with responses shown directly on the dashboard.
Despite this, Apple does not plan to replace Siri as the main control layer in CarPlay. The company's assistant will continue to be responsible for native tasks: sending messages, making calls, playing music, handling navigation with Apple Maps, and even controlling advanced vehicle functions in the most complete versions like CarPlay Ultra.
According to leaks, Apple will not allow third-party assistants to replace the physical Siri button or the wake word either. The official justification is safety: the company wants to rely on technologies it controls from beginning to end for any function that directly affects driving or the operation of integrated systems.
There's no firm date for when third-party AI will arrive in CarPlay, but sources point to a window of "coming months" before support is enabled. From there, it will be up to OpenAI, Google, and other companies to update their apps to offer a truly well-integrated in-car experience, at least in those markets where regulations allow it.
A change of course in Apple's AI strategy
For quite some time, there was a feeling that Apple was lagging behind in artificial intelligence, without a clear roadmap compared to the push from other tech companies. The combination of the alliance with Google, the bet on Apple Intelligence, and internal changes within the AI team points to a more decisive shift.
Back in 2024, the company had already begun making progress with the integration of ChatGPT into certain Siri functions, allowing access to OpenAI's models to improve responses and offer image generation, although these features were limited to the most recent iPhones and did not reach the entire catalog.
The move to Gemini 3 and the new agreement with Google signal a more stable strategy, with Apple acknowledging its need to rely on well-established partners while developing its own models. This doesn't mean relinquishing control over the user experience, but rather accepting that the AI race is progressing at a pace that demands collaborations of this kind.
For users in Spain and the rest of Europe, the key will be to see which features actually reach each region, in which languages, and with what limitations imposed by data and competition regulations. Historically, some features linked to voice assistants have taken longer to arrive outside the United States, and it remains to be seen whether the rollout of Siri with Gemini will be more uniform.
What is emerging, in any case, is a scenario in which Siri goes from being the somewhat rigid assistant many users had abandoned to becoming the gateway to a whole AI services layer integrated into the operating system, with Google as the driving force behind the scenes but with Apple setting the pace, the interface, and, at least on paper, the rules of the game when it comes to privacy.

