What Extensions are
"Extensions" is Apple's internal name for the system that will let users access generative AI capabilities from installed apps, on demand, through Apple Intelligence — Siri, Writing Tools, Image Playground and more.
In practice: when you ask Siri to draft an email or when you use Writing Tools, you'll be able to choose whether Apple Intelligence (Apple's on-device model) or Claude, Gemini, ChatGPT or another external model generates the response. The choice can be made per task or set as a default.
Why Apple is changing strategy
The reason is simple: Apple Intelligence failed to convince. Launched in iOS 18.1 (October 2024), it drew a lukewarm market response. The on-device models were limited, the revamped Siri kept slipping, and the ChatGPT integration (launched as a stopgap) was clunky.
With competitors applying pressure (Google building Gemini natively into Android, Samsung with Galaxy AI), Apple opted for a move that would have been unthinkable in 2024: letting the rivals in.
Which models are coming
Current testing includes Google Gemini and Anthropic Claude. ChatGPT (via OpenAI) has been integrated since iOS 18. There's no official confirmation, but Mistral and Meta's open-weight Llama are expected as well.
Apple Intelligence will remain the "first step": it processes the request on-device or in Private Cloud Compute. If Apple's model decides the task needs a more capable model, it triggers the "extension" to the chosen provider (with user consent and privacy safeguards).
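The escalation path described above can be sketched as a routing decision. This is a speculative illustration, not Apple API: the type names (`ModelTarget`, `AIRequest`, `route`) and the complexity thresholds are invented to show the consent-gated, on-device-first flow.

```typescript
// Hypothetical sketch of the "extension" routing step.
// All names and thresholds here are illustrative assumptions.

type ModelTarget =
  | { kind: "onDevice" }                 // Apple Intelligence, local
  | { kind: "privateCloudCompute" }      // Apple's server-side model
  | { kind: "external"; model: string }; // user-chosen third-party model

interface AIRequest {
  complexity: number;     // 0–10, rough difficulty estimate
  userConsented: boolean; // consent required before leaving Apple's stack
  preferredModel: string; // e.g. "Claude", "Gemini"
}

function route(req: AIRequest): ModelTarget {
  // Simple tasks stay on-device.
  if (req.complexity <= 3) return { kind: "onDevice" };
  // Harder tasks, or anything without consent, stay in Apple's cloud.
  if (req.complexity <= 6 || !req.userConsented) {
    return { kind: "privateCloudCompute" };
  }
  // Only with explicit consent does the request leave Apple's stack.
  return { kind: "external", model: req.preferredModel };
}
```

The key property the sketch captures is that the external model is never the first hop: escalation happens only when the local model opts out and the user has consented.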
Other iOS 27 features
Redesigned Photos: new editing tools with Apple Intelligence. Three key options: Extend (generate content beyond the frame), Enhance (auto-adjust color and light with AI) and Reframe (change perspective of spatial photos). All on-device.
Visual Intelligence expanded: scan nutrition labels, add printed phone numbers and addresses directly to Contacts.
Wallet: scan physical tickets and cards to digitize them.
Safari: automatically name Tab Groups based on content.
Siri at last
Apple's big AI move of the year is the complete redesign of Siri, promised for spring 2026 (June if WWDC tradition holds). It will be more conversational and capable of completing multi-step tasks.
Internal testing points to Siri using Apple Intelligence as an orchestration layer and delegating to Claude or Gemini for complex reasoning. The result should be comparable in capability to ChatGPT Voice or Gemini Live, but with the advantage of full device control.
Implications for developers
For Apple developers, the change opens two opportunities. The first: App Intents already connects apps with Siri; with Extensions, those same intents could be executed with reasoning from the user's chosen external model.
The second: companies will be able to ship apps that assume any model is available, without lock-in to a specific provider. It's a paradigm shift from the de facto OpenAI dependency Apple had imposed.
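The provider-agnostic pattern can be sketched with dependency injection: the app's intent logic stays fixed while the model backend is swapped. This is a simplified, hypothetical illustration — the real App Intents framework is Swift-only, and `ModelBackend`, `makeBackend`, and `SummarizeIntent` are invented names.

```typescript
// Hypothetical sketch of provider-agnostic intent handling.
// These types are illustrative, not the App Intents API.

interface ModelBackend {
  name: string;
  complete(prompt: string): string;
}

// Stand-in backend that just tags the prompt with the model name.
function makeBackend(name: string): ModelBackend {
  return { name, complete: (prompt) => `[${name}] ${prompt}` };
}

// The intent holds the app's logic; the backend is injected, so the
// same intent runs against whichever model the user selected.
class SummarizeIntent {
  constructor(private text: string) {}
  perform(backend: ModelBackend): string {
    return backend.complete(`Summarize: ${this.text}`);
  }
}

const intent = new SummarizeIntent("Meeting notes");
const viaClaude = intent.perform(makeBackend("Claude"));
const viaGemini = intent.perform(makeBackend("Gemini")); // no code change
```

The design choice this illustrates is the one the article attributes to Extensions: the app never names a provider, so switching the user's default model requires no app update.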
Conclusion
iOS 27 marks the end of Apple's pretense of doing everything in-house in AI. It's a pragmatic concession: the market evolved faster than the internal AI team could execute. For users, the added choice is a net win. For Anthropic and Google, it's consumer-market validation. For OpenAI, it's an alarm bell: it stops being the only default.