The biggest takeaway from Apple’s splashy WWDC event earlier this year was the next evolution of Siri in the age of AI. Unfortunately, many of those promising upgrades have yet to become available to the general public, and what is already available isn’t exactly groundbreaking.
Still, the road ahead doesn’t look entirely bleak, even if redemption is some way off.
According to Bloomberg, Apple is internally working on LLM Siri, an assistant built on a more advanced AI stack that will allow it to hold back-and-forth conversations and handle more complex queries.
LLM, short for Large Language Model, is the secret sauce behind conversational AI products like OpenAI’s ChatGPT and Google’s Gemini. Apple’s intention with LLM Siri doesn’t stray far from that formula, as the company wants the assistant to behave in much the same way as Gemini.

“The updated Siri will rely on new Apple AI models to communicate more like a human,” the report claims, adding that an announcement will take place sometime in 2025, followed by a release in spring 2026.
That timeline hardly amounts to catching up with the competition. You can already experience many of these benefits on an iPhone today, thanks to the Siri-ChatGPT integration that is now live in iOS test builds.
Google recently released a standalone Gemini app for iPhones, which brings the Gemini Live conversational voice mode to Apple smartphones as well. This raises an important question: why wait longer than a year when competing products already offer this convenience?
It’s also worth noting that Apple plans to officially add support for more third-party language models beyond ChatGPT as part of the Apple Intelligence bundle. According to Bloomberg, a Google Gemini integration is already in the queue.

Right now, Apple’s attempts to catch up with rival virtual assistants are alarmingly slow. Google has already shifted many of Google Assistant’s responsibilities over to Gemini, and its integration with tools like Gmail and Docs is already quite rewarding.
OpenAI has also launched ChatGPT Search, which lets users find information on the web in a far more conversational way than Google Search. And that’s not all: the Microsoft-backed company’s next step is a web browser. Perplexity, meanwhile, has launched its own search and shopping products.
The most notable upgrade for LLM Siri will reportedly be the ability to interact with apps. “It will also make more extensive use of App Intents, which will enable more precise management of third-party apps,” the Bloomberg report says.
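For context, third-party apps already expose actions to Siri and Shortcuts through the App Intents framework. The sketch below is a minimal, hypothetical example of such an intent; the intent name, parameter, and GroceryStore helper are illustrative rather than anything from the Bloomberg report, but they show the kind of hooks a smarter, LLM-driven Siri would presumably be able to invoke with more precision.

```swift
import AppIntents

// Hypothetical App Intent a third-party app might expose today.
// An LLM-driven Siri would, per the report, call hooks like this more capably.
struct AddGroceryItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Grocery Item"

    // Siri/Shortcuts can prompt for this value if the user leaves it out.
    @Parameter(title: "Item Name")
    var itemName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app this would write to the app's own data store.
        GroceryStore.shared.add(itemName)
        return .result(dialog: "Added \(itemName) to your grocery list.")
    }
}

// Minimal stand-in for the app's storage layer (illustrative only).
final class GroceryStore {
    static let shared = GroceryStore()
    private(set) var items: [String] = []
    func add(_ item: String) { items.append(item) }
}
```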

From a user perspective, letting Siri perform tasks across apps has long been a hit-and-miss affair. That future now seems within reach, even if it is still over a year away. But again, Apple won’t be the only contender in this race.
Android Authority reports that Gemini in Android 16 (which already has a Developer Preview in the wild) could gain the ability to perform tasks in third-party apps. So far, Gemini’s in-app capabilities have largely been limited to Workspace tools such as Gmail, Docs, and Calendar.
Apple certainly appears to be making the right moves with its plans for LLM Siri. But by the time all these promises come to fruition, the competition will have raced far ahead, with a proven track record of solid conversational AI chops.