TL;DR
- Apple introduced real-time translation, content-aware assistance, and enhanced AI tools during WWDC 2025.
- ChatGPT is now more deeply integrated into iOS, iPadOS, and macOS through a privacy-first design.
- The company is opening its on-device AI models to developers, enabling broader AI experiences across apps.
- While Apple’s AI rollout is measured, it signals a significant shift in how the company approaches innovation.
At the ongoing 2025 Worldwide Developers Conference (WWDC 2025), Apple revealed a sweeping set of AI features designed to reshape how users interact with its ecosystem.
Notably, a central theme was the expansion of ChatGPT integration, which now plays a larger role in Apple’s operating systems. From real-time translation in FaceTime and Phone calls to context-aware assistance across apps, Apple made clear it is stepping more assertively into the AI arena.
These features will arrive in iOS 26, iPadOS 26, macOS 26, and visionOS 26 later this year. Users can expect AI to enhance everything from basic messaging to productivity and fitness. For instance, a new feature allows users to analyze on-screen content and either query ChatGPT directly or launch a related Google search. This brings AI to the forefront of everyday user activity in a way Apple has not attempted before.
ChatGPT Comes to Life in Apple Ecosystem
Perhaps the most talked-about feature is the seamless integration of ChatGPT into native Apple apps. Users can now access ChatGPT across a wide array of functions, such as writing assistance, content explanations, and image generation in Apple’s revamped Image Playground app. Crucially, this integration was built to preserve Apple’s longstanding commitment to privacy. OpenAI will not receive any data unless users explicitly grant permission, and the majority of processing remains on-device.
You can now use ChatGPT to create images in Image Playgrounds. #WWDC25 pic.twitter.com/PR1lENE6Om
— AppleInsider (@appleinsider) June 9, 2025
Craig Federighi, Apple’s senior vice president of software engineering, emphasized that Apple’s AI architecture was designed from the ground up with user control in mind. The company is relying on a 3-billion-parameter language model that runs directly on devices, allowing many AI functions to operate without an internet connection. This privacy-first design aligns with Apple’s brand but has raised concerns among analysts who feel the company may be underpowered compared to cloud-driven rivals like Google and Microsoft.
A Shift in Strategy
Meanwhile, a notable shift emerged at this year’s WWDC. Apple is now inviting third-party developers to integrate AI features directly into their apps by offering access to its foundational models. This includes a new set of tools within Xcode 26 that use large language models to assist with code writing, testing, and documentation. In the past, Apple was known for maintaining a tightly closed ecosystem. This new openness signals a change in philosophy, acknowledging that broader collaboration is essential to keep pace in the AI space.
Developers will be able to leverage on-device AI capabilities without compromising user privacy, maintaining Apple’s core values while accelerating innovation. The move positions Apple not just as a creator of AI features but as a platform on which developers can build their own AI experiences. This could foster a new wave of innovation within Apple’s ecosystem, much like what happened during the early days of the App Store.
Despite the splashy announcements, market response was muted. Some analysts described Apple’s new AI features as incremental rather than groundbreaking. They noted that many capabilities, such as live translations and smart content suggestions, have long been available on Android devices. Apple’s slower, more cautious rollout reflects its internal struggle between advancing innovation and preserving its reputation for privacy and reliability.