Apple Throws In The Towel, Asks Google To Design A Custom Gemini LLM For Siri


🕒︎ 2025-11-03



Apple appears to have conceded defeat regarding its in-house Siri revamp strategy and is now leaning on Google to design a custom Gemini-based Large Language Model (LLM) to power the new Siri in the cloud.

Mark Gurman: Apple is paying Google to design a custom Gemini AI model for its Private Cloud Compute framework

Bloomberg's Mark Gurman, the well-known Apple tipster, reported in his latest 'Power On' newsletter that the Cupertino giant seems to have thrown in the proverbial towel when it comes to creating an in-house AI model to power the revamped Siri's upcoming features, all couched under the Apple Intelligence banner. Instead, Apple is now reportedly paying Google to create a custom Gemini-based AI model for its Private Cloud Compute framework, under which relatively simple AI tasks are handled by the device's own computational resources, while more complex tasks are offloaded to Apple's private cloud servers using encrypted, stateless data.

Do note that the upcoming Siri revamp will have three major components:

- Query planner: Siri's decision-making layer, which chooses the most efficient route to fulfill a given user request. Available options include a web search, accessing personal data such as calendar entries or photos, or using a third-party app via App Intents, the framework that makes an app discoverable and usable by Siri so that users can perform certain in-app actions without having to open the app.

- Knowledge search system: Siri will gain a general knowledge database to answer trivia queries without having to resort to ChatGPT, other third-party AI integrations, or web-based results.

- Summarizer: a core tool within Apple Intelligence, allowing Siri to leverage third-party AI models such as ChatGPT to summarize a given text or audio snippet, including:
  - Notification summaries
  - Mail and Messages summaries
  - Safari webpage summaries
  - Writing tools
  - Audio summaries

Under Apple Intelligence's evolving architecture, all on-device AI processing will leverage Apple's bespoke foundation models. Complex queries, however, would be offloaded to Apple's private servers using encrypted, stateless data to maintain user privacy, where Google's custom Gemini-based AI model would process them.

This comes as Bloomberg reported back in August that Apple engineers were struggling to ensure that Siri performed adequately across apps and in critical scenarios such as banking. With this new approach, Apple appears to be compensating for critical gaps in its own AI expertise. Of course, the Cupertino giant is still expected to market the revamped Siri as a core Apple technology, one that uses Apple's backend servers and a bespoke interface.

Apple has also been working to introduce a number of key Apple Intelligence features with its Spring 2026 iOS update (most likely iOS 26.4). These include:

- In-app Actions: Siri would be able to carry out context-based tasks within supported apps via voice commands, such as adding an item to a grocery list, sending a message through a messaging app, or playing music (a minimal App Intents sketch follows below).

- Personal Context Awareness: Siri would be able to leverage personal data to offer tailored services, such as scouring the Messages app to find a specific podcast mentioned in a text conversation.

- On-Screen Awareness: Siri will gain the ability to understand the content on the screen and perform a number of agentic tasks based on it.
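For readers curious how App Intents exposes such in-app actions to Siri, below is a minimal Swift sketch. It is illustrative only: the AddGroceryItemIntent type, its "Item" parameter, and the GroceryStore helper are invented for this example, and only the AppIntents protocol requirements (a title and a perform() method) come from Apple's framework; the revamped Siri's actual integration details have not been disclosed.

```swift
import AppIntents

// Hypothetical in-memory store standing in for a real app's data layer.
final class GroceryStore {
    static let shared = GroceryStore()
    private(set) var items: [String] = []
    func add(_ item: String) { items.append(item) }
}

// A minimal App Intent: Siri or the Shortcuts app can invoke this action
// without the user opening the app. All names here are illustrative.
struct AddGroceryItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Grocery Item"
    static var description = IntentDescription("Adds an item to the grocery list.")

    // Siri resolves this parameter from the user's spoken request.
    @Parameter(title: "Item")
    var item: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        GroceryStore.shared.add(item)
        return .result(dialog: "Added \(item) to your grocery list.")
    }
}
```

In a shipping app, such an intent is typically surfaced to Siri and the Shortcuts app by also declaring an AppShortcutsProvider with spoken trigger phrases.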
