Your iPhone, iPad, Mac and Apple TV will soon look different from what you have been used to for years. AirPods will be able to do more than just channel music into your ears, and the iPad is set to become a more versatile tool than ever before with "windowing". Here are some of our favourite updates announced at Apple's annual developers conference, WWDC 2025, at Apple Park in Cupertino.
A new name, a different feel
Apple has introduced a new version number for its many operating systems — it will now be associated with the year. The next set of operating systems will be called iOS 26, iPadOS 26, watchOS 26, macOS Tahoe 26, tvOS 26 and visionOS 26.
Apple has chosen 26, for 2026, because the new versions of its operating systems usually arrive in the fall and remain the current releases through most of the following year. There will now be consistency across platforms. Before the change, the version numbers were scattered: iOS 18, watchOS 11, macOS 15 and visionOS 2.
Apple has also confirmed that its next Mac operating system is called Tahoe, as in Lake Tahoe.
The current iPhone model is iPhone 16, which came out in September 2024. The next series, most probably called iPhone 17, is expected to be announced in September, but there is no word yet on whether it will follow the new scheme and arrive as iPhone 26 instead.

macOS Tahoe 26 lets users customise the desktop and Dock with new looks for app icons and widgets crafted from multiple layers of Liquid Glass
Liquid Glass magic
The theme running through all the new operating systems is Liquid Glass, the biggest design change since iOS 7 (released in 2013). And each time somebody says Liquid Glass, someone may be tempted to go hic-hic; the name may even become the theme of some Instagram Reels.
The design update brings the overall design philosophy closer to that of visionOS. “This is our broadest software design update ever. Meticulously crafted by rethinking the fundamental elements that make up our software, the new design features an entirely new material called Liquid Glass. It combines the optical qualities of glass with a fluidity only Apple can achieve, as it transforms depending on your content or context. It lays the foundation for new experiences in the future and, ultimately, it makes even the simplest of interactions more fun and magical,” said Alan Dye, Apple’s vice-president of human interface design.
The company said the Liquid Glass-powered user interface (UI) can communicate with surrounding content and then intelligently adapt to light and dark modes.
The design element is being implemented throughout the system. For example, when you swipe up on the iOS 26 lock screen, there is a glass edge. Or take the Camera app: its menus are transparent, and controls are overlaid on top of the live camera feed.
To implement the design, Apple had to redesign all the controls, toolbars and navigation UI. Liquid Glass uses real-time rendering to react to movement. The result can be felt on buttons, switches, sliders, text, media controls and even tab bars and sidebars.

Apple's Worldwide Developers Conference began at Apple Park on June 9. Picture: Getty Images
The company is publishing an updated set of APIs to allow developers to begin updating their apps before the feature rolls out to the public at large in the fall.
Controls, toolbars, and navigation within apps have been redesigned. Previously configured for rectangular displays, they now fit perfectly concentric with the rounded corners of modern hardware and app windows.
By using Liquid Glass materials and the new and updated controls, developers have the opportunity to refresh the design of their apps to make every user interaction even more intuitive and delightful.
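For developers, adopting the material is meant to be largely a matter of moving to the updated controls and applying the glass treatment where it makes sense. Below is a minimal SwiftUI sketch of what that could look like, based on the glassEffect modifier Apple previewed for iOS 26; the names, parameters and defaults shown are from the developer preview and may change before the public release.

```swift
import SwiftUI

// Hypothetical media-control bar adopting the Liquid Glass material.
// The glassEffect(_:in:) modifier is the API Apple previewed at WWDC 2025;
// treat the names and defaults here as provisional until the SDK ships.
struct PlaybackControls: View {
    @State private var isPlaying = false

    var body: some View {
        HStack(spacing: 24) {
            Button {
                isPlaying.toggle()
            } label: {
                Image(systemName: isPlaying ? "pause.fill" : "play.fill")
                    .font(.title2)
            }
            Button {
                // skip forward
            } label: {
                Image(systemName: "forward.fill")
                    .font(.title2)
            }
        }
        .padding()
        // Draws the controls on a translucent, light-adaptive glass surface,
        // clipped to the rounded capsule shape described above.
        .glassEffect(.regular, in: .capsule)
    }
}
```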
Microsoft, for its part, has offered transparency effects in Windows since the launch of Windows Vista in 2007.
Better multitasking on iPad
Apple is dialling the multitasking experience up to 11 on the iPad with the upcoming iPadOS 26. The move blurs the line between its operating systems, and better multitasking has been a request from users for many years.
At the heart of the multitasking improvements is a new windowing system. Apps continue to launch full-screen by default, but users can then resize them into windows using a new grab handle. Apps will remember the size and position of their windows when closed or minimised, and an app last used in a windowed state will reopen the same way next time.
Users will be able to flick windows towards the screen edges to tile them, while a long press reveals more tiling options. The windowing system also works within Stage Manager and across external displays.
The change should help users work on the iPad more efficiently and take on more demanding tasks.
Making it easier to manage multiple apps, Exposé is coming to iPad. It offers an overview of all open windows, allowing quick switching. There is also a new gesture: perform the home swipe twice to minimise all open apps and windows and return to the Home Screen.
Another helpful feature: a menu bar will be visible at the top of the screen, like in macOS, allowing quick access to key app functions and controls. As on the Mac, the active app gets familiar menu entries such as File, Edit, View, Window, Format, Arrange and Help.
The Files app is also being updated with features like List view, which displays more information about your documents. It features resizable columns and collapsible folders. You can also choose default apps for specific file types. Apple offered an example at WWDC: You can edit photos with Photoshop, Darkroom or Pixelmator.
Boost for Apple Intelligence
Apple is giving its artificial intelligence platform, Apple Intelligence, a boost by opening up the underlying technology to app developers.
Developers will be able to build tools and features using the company's large language models. This fits with Apple's focus on practical uses of AI, such as translating conversations in real time.
Apple software chief Craig Federighi said that opening its AI models to developers will "ignite a whole new wave of intelligent experiences in the apps users rely on every day".
Since the models run locally on Apple devices, without sending data to cloud servers, there is a significant emphasis on privacy. Federighi offered the example of a puzzle app creating a personalised quiz for a user based on the notes stored on their device.
Access to Apple’s AI models will at first be limited to the smaller ones that work locally on its devices.
“With access to the on-device Apple Intelligence foundation model and new intelligence features in Xcode 26, we’re empowering developers to build richer, more intuitive apps for users everywhere,” said Susan Prescott, Apple’s vice-president of worldwide developer relations.
Federighi said in a statement: “Last year, we took the first steps on a journey to bring users intelligence that’s helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we’re integrating features in even more places across each of our operating systems.”
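The developer-facing side of this is a new Foundation Models framework for talking to the on-device model from Swift. The sketch below shows roughly how Federighi's quiz example could be wired up; it assumes the LanguageModelSession API Apple demonstrated at WWDC 2025, and the exact types and method names may differ in the shipping SDK.

```swift
import FoundationModels

// A rough sketch of Federighi's example: a quiz app asking the on-device
// Apple Intelligence model to turn a user's note into a question.
// Based on the Foundation Models framework previewed at WWDC 2025;
// exact types and method names may differ in the shipping SDK.
func quizQuestion(from note: String) async throws -> String {
    // The on-device model is only available when Apple Intelligence is
    // enabled and supported on the device, so check before using it.
    guard case .available = SystemLanguageModel.default.availability else {
        return "The on-device model is not available on this device."
    }

    // A session wraps a conversation with the local model. Because the model
    // runs entirely on the device, the note never leaves it.
    let session = LanguageModelSession(
        instructions: "Write one short quiz question about the user's note."
    )
    let response = try await session.respond(to: note)
    return response.content
}
```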
Apple also said it is bringing Visual Intelligence, the company's AI-powered image analysis technology, to the iPhone screen in iOS 26. For example, if you open a social media app and see a jacket, you can use Visual Intelligence to run an image search for it. To start the search, just press the same buttons you would use to take a screenshot.
Fun elements

The powerful and intuitive new windowing system lets users fluidly resize app windows, place them exactly where they want, and open even more windows at once
Mix two emoji: With Genmoji, you’ll be able to combine two emoji into one. For example, you might merge the sloth and light bulb emoji if you want to poke fun at yourself for being slow to understand a joke.
Shortcuts versatility: Shortcuts will be able to use Apple Intelligence models to improve your workflows.
Live translation: Apple is bringing live translation to Messages, FaceTime and the Phone app. The company's on-device AI models will translate a message into your recipient's preferred language as you type, and when they respond, each message is translated back into yours. In FaceTime, you will see live captions as the person you are chatting with speaks, and on a phone call, Apple Intelligence will generate a spoken translation.
Visual Intelligence upgrade: Besides working with your iPhone’s camera, the tool can scan what’s on your screen. It will also benefit from deeper integration with ChatGPT, allowing you to ask the chatbot questions about what you see.