What’s New in the iOS SDK

Learn about the key technologies and capabilities available in the iOS SDK, the toolkit you use to build apps for iPhone, iPad, or iPod touch. For detailed information on API changes in the latest released versions, including each beta release, see the iOS & iPadOS Release Notes.

iOS 15 SDK

With the iOS 15 SDK, you can build apps that create new kinds of shared experiences with SharePlay and the Group Activities API. Swift 5.5 introduces concurrency support, built into the language with async/await and actors. Focus and notification updates help users concentrate on what matters most, and new APIs let your app differentiate which notifications users most need to see. ARKit and RealityKit provide powerful customization capabilities to help your AR experiences look even more convincing. Create ML gets easier and more powerful with Swift and playground integration, as well as on-device training. And web extensions come to Safari on iOS and iPadOS for even more flexible and powerful browsing experiences.

SharePlay and Group Activities

SharePlay offers a new way for people to share your app. Media streaming apps can let users share content through the new Group Activities API with full-fidelity video and all syncing handled by the system. And for shared experiences beyond media streaming, the GroupSessionMessenger API offers a secure data channel that syncs information between multiple instances of your app across multiple users.

View documentation
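A minimal sketch of defining and activating a group activity; the MovieNight type and its title are hypothetical, and in a real app you would observe the resulting GroupSession to coordinate state:

```swift
import GroupActivities

// A hypothetical watch-together activity. A GroupActivity is Codable
// metadata the system uses to start a SharePlay session.
struct MovieNight: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Movie Night"
        meta.type = .watchTogether
        return meta
    }
}

// Activating the activity starts (or joins) a session during a FaceTime call.
func startMovieNight() async {
    switch await MovieNight().prepareForActivation() {
    case .activationPreferred:
        // The system recommends activating; errors are ignored in this sketch.
        _ = try? await MovieNight().activate()
    default:
        break   // activation disabled or cancelled by the user
    }
}
```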

Focus and notifications

With Focus, users can have notifications delivered at times that work best for them, and with the Interruption Levels API, you can provide more nuanced delivery with one of four interruption levels (including new Passive and Time-Sensitive levels). Notifications from communication apps now have a distinctive appearance, and these apps can — with user permission — sync their status to reflect the user’s current system-level Focus status.

View documentation
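Setting an interruption level is a one-line change on the notification content. This sketch uses a hypothetical delivery notification:

```swift
import UserNotifications

// Hypothetical delivery-status notification.
let content = UNMutableNotificationContent()
content.title = "Order update"
content.body = "Your delivery arrives in 10 minutes."

// .timeSensitive can break through Focus when the user allows it;
// .passive delivers quietly without waking the screen.
content.interruptionLevel = .timeSensitive

let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
let request = UNNotificationRequest(identifier: "delivery-eta",
                                    content: content,
                                    trigger: trigger)
UNUserNotificationCenter.current().add(request)
```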


SwiftUI

SwiftUI brings new features, such as improved list views, better search experiences, and support for control focus areas. Gain more control over lower-level drawing primitives with the new Canvas API, a modern, GPU-accelerated equivalent of drawRect. And with the new Accessibility Representation API, your custom controls easily inherit full accessibility support from existing standard SwiftUI controls.

Learn about SwiftUI
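A sketch combining both APIs: Canvas draws a custom bar visualization, and accessibilityRepresentation lends it the accessibility behavior of a standard control. The WaveformView name and its sample data are hypothetical:

```swift
import SwiftUI

// Hypothetical view that draws audio levels (0...1) as bars.
struct WaveformView: View {
    var samples: [Double]

    var body: some View {
        Canvas { context, size in
            // Immediate-mode drawing into a GraphicsContext.
            let step = size.width / CGFloat(max(samples.count, 1))
            for (i, level) in samples.enumerated() {
                let h = CGFloat(level) * size.height
                let bar = CGRect(x: CGFloat(i) * step,
                                 y: size.height - h,
                                 width: step * 0.8,
                                 height: h)
                context.fill(Path(bar), with: .color(.accentColor))
            }
        }
        // Expose this custom drawing to VoiceOver as a standard control.
        .accessibilityRepresentation {
            ProgressView(value: samples.last ?? 0)
        }
    }
}
```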


UIKit

UIKit introduces sheet presentation controllers, which let you present your view controller as a customizable, resizable sheet. UIKit also provides new APIs for configuring buttons, displaying pop-up buttons, a new chromeless bar appearance, image decoding, and creating thumbnail versions of images. And starting in iOS 15, drag and drop on iPhone is enabled by default.

View documentation
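Configuring a resizable sheet takes a few lines on the presented controller's sheetPresentationController; the options screen here is a hypothetical stand-in:

```swift
import UIKit

// Present a view controller as a resizable sheet (iOS 15).
func presentOptions(from host: UIViewController) {
    let options = UIViewController()   // hypothetical options screen

    if let sheet = options.sheetPresentationController {
        sheet.detents = [.medium(), .large()]   // user can resize between these
        sheet.prefersGrabberVisible = true      // show the drag grabber
        sheet.prefersScrollingExpandsWhenScrolledToEdge = false
    }
    host.present(options, animated: true)
}
```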

Keyboard layout guides

The new keyboard layout guide gives you an easy way to adapt your app's layout to the keyboard's size and position. The tracking layout guide in UIKit automatically enables and disables constraints when the keyboard is docked to the bottom of the screen, undocked, or floating over your app, letting you provide a great text input experience.

View Adjust Your Layout with Keyboard Layout Guide
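A sketch of pinning a text field above the keyboard with the new guide; the helper function and field are hypothetical:

```swift
import UIKit

// Keep a text field just above the keyboard using keyboardLayoutGuide (iOS 15).
func pinAboveKeyboard(_ field: UITextField, in view: UIView) {
    field.translatesAutoresizingMaskIntoConstraints = false
    view.keyboardLayoutGuide.followsUndockedKeyboard = true   // track floating keyboards too

    NSLayoutConstraint.activate([
        field.leadingAnchor.constraint(equalTo: view.layoutMarginsGuide.leadingAnchor),
        field.trailingAnchor.constraint(equalTo: view.layoutMarginsGuide.trailingAnchor),
        // The guide's top edge follows the keyboard; with the keyboard
        // dismissed, it rests at the bottom of the safe area.
        field.bottomAnchor.constraint(equalTo: view.keyboardLayoutGuide.topAnchor,
                                      constant: -8)
    ])
}
```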

Core Location UI

CoreLocationUI is a brand-new framework that introduces the location button, which lets people grant your app temporary authorization to access their location at the moment it’s needed. This button interacts securely with Core Location to request authorization to access location data.

View documentation
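In SwiftUI, the location button is a single system-drawn view; the surrounding view and the bare CLLocationManager here are hypothetical (a real app would set a delegate to receive the location):

```swift
import SwiftUI
import CoreLocation
import CoreLocationUI

struct NearbyView: View {
    let manager = CLLocationManager()   // hypothetical; assign a delegate for updates

    var body: some View {
        // Tapping grants temporary "allow once" authorization,
        // after which the location request can proceed.
        LocationButton(.currentLocation) {
            manager.requestLocation()
        }
        .symbolVariant(.fill)
        .labelStyle(.titleAndIcon)
    }
}
```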


Accessibility

The Accessibility framework introduces audio graphs, a new way to represent data in your charts and graphs that allows VoiceOver to construct and play an audible representation of it. This framework also adds an API to query information relevant for MFi hearing devices, such as streaming preferences, streaming capabilities, and paired hearing devices.

View documentation for Audio Graphs

View documentation for Hearing Device Support

Augmented Reality

ARKit 5

ARKit 5 brings Face Tracking support to the Ultra Wide camera in the latest iPad Pro (5th generation). Face Tracking lets you track up to three faces at once to power front-facing camera experiences like Memoji and Snapchat.

Learn about ARKit

RealityKit 2

Turn photos from your iPhone or iPad into high-quality 3D models that are optimized for AR in minutes using the new Object Capture API on macOS. And brand-new capabilities give you more control over your AR objects and scene with custom render targets and materials, customizable loading for assets, player-controlled characters, and more.

Learn about RealityKit

Machine learning

Create ML

Create ML is now available as a Swift framework on iOS and iPadOS, in addition to macOS. You can programmatically experiment and automate model creation in Swift scripts or playgrounds. Build dynamic app features that leverage Create ML APIs to train models directly from user input or behavior on-device, allowing you to provide personalized and adaptive experiences while preserving user privacy.

Create ML adds the Hand Pose and Hand Action classifier tasks to both the Create ML API and the developer tool included with Xcode. These classifiers recognize hand positions in still images and hand movements in videos, respectively.

View documentation for MLHandPoseClassifier

View documentation for MLHandActionClassifier

View documentation for Create ML

Core ML

Core ML adds ML Packages, a new, future-looking model format that provides the flexibility to edit metadata and visibility to track changes with source control. Core ML also adds ML Programs, a new model type that compiles more efficiently, decouples a model’s architecture from its weights, and offers more control over the computational precision of its intermediate tensors. The new MLShapedArray API lets you work with multidimensional data using idiomatic Swift that improves the code’s type safety and readability.

View Updating a Model File to a Model Package

View documentation for MLShapedArray
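A short sketch of MLShapedArray; the values and shape are arbitrary examples:

```swift
import CoreML

// Build a 2 x 3 shaped array of Floats from a flat, row-major scalar buffer.
let array = MLShapedArray<Float>(scalars: [1, 2, 3, 4, 5, 6], shape: [2, 3])

print(array.shape)     // the dimensions, [2, 3]
print(array.scalars)   // the underlying scalars in row-major order

// Bridge to MLMultiArray for APIs that still expect the older type.
let multiArray = MLMultiArray(array)
```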

Tabular Data. Tabular Data makes it easy to programmatically import information from JSON and CSV files and prepare datasets for Core ML, Create ML, or your own custom solution. Use Tabular Data’s central DataFrame API to sort, join, group, split, encode, decode, explode, filter, slice, combine, and transform the rows and columns of your tabular data to meet your needs.

View documentation for TabularData

View documentation for DataFrame
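A sketch of loading and shaping a DataFrame; the ratings.csv file and its score column are hypothetical:

```swift
import TabularData

// Load a hypothetical CSV of ratings into a DataFrame.
let url = URL(fileURLWithPath: "ratings.csv")
var frame = try DataFrame(contentsOfCSVFile: url)

// Keep only the rows whose "score" column exceeds 4.0.
let top = frame.filter(on: "score", Double.self) { ($0 ?? 0) > 4.0 }
print(top)

// Sort the full frame in place by score, highest first.
frame.sort(on: "score", order: .descending)
```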

Sound Analysis. Sound Analysis adds a new sound classifier that your apps can use to identify over 300 unique sounds from live audio or an audio file. A new time window duration API lets you tune prediction accuracy against time precision.

View documentation



GameKit

GameKit provides new ways to discover and invite players to participate in a game. Players can now invite contacts, message groups, and anyone with a phone number or email address. Players see the status of other players receiving and accepting invitations and, optionally, start with a minimum number of players while waiting for others to join.

View documentation

Game Controller

The Game Controller framework adds virtual controllers, software emulations of physical controllers that users interact with in the same way. You choose the configuration and the input elements to display specifically for your game.

View documentation

StoreKit 2

StoreKit’s new In-App Purchase API provides a simple, powerful, and secure way to work with your app’s products and transactions. The new API takes advantage of modern Swift features, such as concurrency, to simplify your in-app purchase workflow. Its cryptographically signed transaction and subscription information uses the JSON Web Signature (JWS) format, which is secure and simple to parse on the client. A new entitlements API makes it easy to determine which content and services your app should unlock for users. Use the new StoreKit API throughout the in-app purchase process — from displaying in-app purchases, to managing access to content and providing customer service within your app.

View documentation for StoreKit

View documentation for In-App Purchase
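A sketch of the async purchase flow; the "com.example.pro" product identifier is hypothetical, and error handling is reduced to the essentials:

```swift
import StoreKit

// Purchase a single product with the StoreKit 2 async API.
func buyPro() async throws {
    guard let product = try await Product.products(for: ["com.example.pro"]).first
    else { return }

    let result = try await product.purchase()
    switch result {
    case .success(let verification):
        // Transactions arrive cryptographically signed (JWS);
        // StoreKit verifies the signature for you.
        if case .verified(let transaction) = verification {
            // Unlock content here, then tell the App Store it was delivered.
            await transaction.finish()
        }
    case .userCancelled, .pending:
        break
    @unknown default:
        break
    }
}
```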

Apple Pay

Give users more options by adding coupons, deferred payments, recurring payments, shipping dates, and read-only pickup addresses to your Apple Pay transactions.

View Offering Apple Pay in Your App

View documentation for Apple Pay on the web

Safari Web Extensions

Safari Web Extensions use HTML, CSS, and JavaScript to offer powerful browser customizations and new functionality across the web. With iOS 15, Safari Web Extensions are now available on all Apple devices that support Safari.

View documentation

Screen Time

Apps with parental controls can support a wider range of tools for parents with the Screen Time API. You can use key features, such as core restrictions and device activity monitoring, in a way that puts privacy first.

ManagedSettings and ManagedSettingsUI. Use ManagedSettings to define usage policies and settings constraints on a parent or guardian’s device and apply them on other devices in the Family Sharing group. ManagedSettingsUI provides an opportunity for you to customize Screen Time API’s shielding views to match your app’s branding and style.

FamilyControls. FamilyControls gives control to parents or guardians in Family Sharing groups by requiring them to authorize parental controls on a device signed into a child’s iCloud account. FamilyControls provides a secure environment where only family members in the Family Sharing group can authorize access. It also provides a secure way to select apps, web domains, and categories that protects the user’s privacy.

DeviceActivity. Device Activity provides a privacy-preserving way for an app to monitor a user’s app and website activity.

View documentation for the Screen Time API


ShazamKit

Enrich your app experience with audio recognition. Match music to the millions of songs in Shazam’s vast catalog or make any prerecorded audio recognizable by building your own custom catalog using audio from video, podcasts, and more.

Learn about ShazamKit


MusicKit

Easily integrate Apple Music into your iOS and iPadOS apps using Swift. The MusicKit framework provides a new model layer for accessing music items in Swift, as well as playback support so you can add music to your app.

Learn about MusicKit

Nearby Interaction

Build apps that interact with accessories simply by being in close proximity to an Apple device that includes the U1 chip. Taking advantage of Ultra Wideband technology lets you create more precise, directionally aware app experiences.

View documentation


HomeKit

HomeKit APIs in the iOS 15 SDK automatically work with Matter-enabled accessories. Start testing your smart home apps with Matter, the unifying open-connectivity standard designed to increase the compatibility of smart home accessories, so they work seamlessly with your devices.

Learn about HomeKit


HealthKit

HealthKit adds the ability to request one-time access to a verifiable clinical record. These records bundle information about the user’s identity with clinical data, like an immunization record or a lab test result. The organization that produced the data cryptographically signs the bundle, which HealthKit apps can access and verify.

View documentation for HKVerifiableClinicalRecord

View documentation for HKVerifiableClinicalRecordQuery


CloudKit

CloudKit builds on top of the new async/await support in Swift 5.5, making the asynchronous API easier to use and more configurable. CloudKit adds Record Zone Sharing, which builds on the existing sharing infrastructure to let users share the entire contents of a record zone with other iCloud users. You can now encrypt a record’s values using new APIs on CKRecord, helping you offer strong privacy guarantees to your users. The new CloudKit Schema Language allows you to retrieve and upload textual representations of your CloudKit schema, which means you can now version it using the same tools as your app’s source code.

View documentation for CloudKit
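A sketch combining the new encrypted values API with the async save overload; the "Note" record type and "body" field are hypothetical:

```swift
import CloudKit

// Save a record whose field is end-to-end encrypted for the user.
func saveNote(text: String) async throws {
    let record = CKRecord(recordType: "Note")   // hypothetical record type
    // Values set through encryptedValues are encrypted before upload.
    record.encryptedValues["body"] = text

    let database = CKContainer.default().privateCloudDatabase
    _ = try await database.save(record)   // async overload, no completion handler
}
```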

CloudKit Console. CloudKit improves your workflows with a brand-new CloudKit Console, an intuitive web-based control panel that you can use throughout the development lifecycle of your app, along with the cktool command-line interface.

Learn about the console

Core Data. Core Data provides new APIs that facilitate the sharing of managed objects with other iCloud users, specifically for CloudKit-backed persistent stores. In addition, you can now choose to encrypt an entity’s attributes before they’re saved to iCloud. Spotlight integration has also been enhanced, with additional APIs that allow for fine-grained control over what data is added to the index and when.

View documentation for Core Data

Virtual conference extension

Apps that provide virtual conference services can use this new app extension in EventKit to integrate directly into users’ calendar events. You’ll be able to provide custom locations for events, a link that lets people join the conference with a single tap, and additional information, like dial-in details.

View documentation

iOS 14 SDK

With the iOS 14 SDK, users can more easily discover your app’s core functionality through app clips. SwiftUI introduces a new app life cycle and new view layouts. It supports the new WidgetKit framework, which allows your app to display information directly on the iOS Home screen. Machine learning adds style transfer and action classification models, and offers a CloudKit-based deployment solution. Vision API additions help your app analyze images and video more thoroughly. ARKit advances promote an even tighter integration with the world around the device, and you can include markup in your emails and websites that helps Siri Event Suggestions surface your events.

App Clips

An app clip is a lightweight version of your app that offers users some of its functionality. It’s discoverable at the moment it’s needed, fast, and quick to launch. Users discover and open app clips from a number of places, including Safari, Maps, and Messages, as well as in the real world through QR codes and NFC tags. App clips also provide opportunities for users to download the full app from the App Store. To learn how to create your own app clips, see the app clips documentation.


Widgets

Widgets give users quick access to timely, at-a-glance information from your app right on the iOS Home screen. iOS 14 offers a redesigned widget experience. Your app can present widgets in multiple sizes, allow user customization, include interactive features, and update content at appropriate times. To learn about designing widgets, see the Human Interface Guidelines. To learn how to support widgets in your app, see the WidgetKit framework.


SwiftUI

SwiftUI provides a selection of new built-in views, including a progress indicator and a text editor. It also supports new view layouts, like grids and outlines. Grids and the new lazy version of stacks load items only as needed.

Starting in Xcode 12, you can now use SwiftUI to define the structure and behavior of an entire app. Compose your app from scenes containing the view hierarchies that define an app’s user interface. Add menu commands, handle life-cycle events, invoke system actions, and manage storage across all of your apps. By incorporating WidgetKit into your app, you can also create widgets that provide quick access to important content right on the iOS Home screen or the macOS Notification Center. For more information, see App Structure and Behavior.
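A minimal sketch of the SwiftUI app life cycle; the app name and root view are hypothetical placeholders:

```swift
import SwiftUI

// The @main entry point composes the entire app from scenes.
@main
struct NotesApp: App {
    @Environment(\.scenePhase) private var scenePhase

    var body: some Scene {
        WindowGroup {
            Text("Hello")   // stand-in for your root view hierarchy
        }
        .onChange(of: scenePhase) { phase in
            if phase == .background {
                // Handle a life-cycle event, e.g. persist unsaved state.
            }
        }
    }
}
```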


ARKit 4

ARKit adds Location Anchors, which leverage the new high-resolution map data in Apple Maps to enable rear-camera AR experiences in specific geographic locations. A new Depth API lets you access even more precise distance and depth information captured by the LiDAR Scanner on iPad Pro. To learn more about these features, see the ARKit framework documentation.

Machine Learning

Your machine learning apps gain new functionality, flexibility, and security with the updates in iOS 14. Core ML adds model deployment with a dashboard for hosting and deploying models using CloudKit, so you can easily make updates to your models without updating your app or hosting the models yourself. Core ML model encryption adds another layer of security for your models, handling the encryption process and key management for you. The Core ML converter supports direct conversion of PyTorch models to Core ML.

The Create ML app’s new Style Transfer template stylizes photos and videos in real time, and the new Action Classification template classifies a single person’s actions in a video clip. For more information, see the Core ML and Create ML developer documentation.


Vision

With iOS 14, the Vision framework adds APIs for trajectory detection in video, hand and body pose estimation for images and video, contour detection to trace the edges of objects and features in images and video, and optical flow to define the pattern of motion between consecutive video frames. To learn more about these features, see the Vision framework documentation. In particular, read Building a Feature-Rich App for Sports Analysis to find out how these features come together in a sample app.
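A sketch of hand pose estimation on a still image; the function name and hand count are illustrative choices:

```swift
import Vision

// Detect hand landmarks in a still image.
func detectHands(in cgImage: CGImage) throws -> [VNHumanHandPoseObservation] {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2   // ignore additional hands

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    let observations = request.results ?? []
    // Each observation exposes named joints, e.g.:
    // let tip = try observations.first?.recognizedPoint(.indexTip)
    return observations
}
```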

Natural Language

The Natural Language framework has new API to provide sentence embedding that creates a vector representation of any string; word tagging to train models that classify natural language, customized for your specific domain; and confidence scores that rank the framework’s predictions. For more information, see the Natural Language framework documentation.
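A sketch of the sentence embedding API; the example sentences are arbitrary:

```swift
import NaturalLanguage

// Map whole sentences to vectors and compare them in embedding space.
if let embedding = NLEmbedding.sentenceEmbedding(for: .english) {
    let vector = embedding.vector(for: "The quick brown fox jumps over the lazy dog.")

    // Smaller distances indicate more similar meanings.
    let distance = embedding.distance(between: "How do I reset my password?",
                                      and: "I forgot my login credentials.")
    print(vector?.count ?? 0, distance)
}
```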

App Store Privacy Information

Privacy is at the core of the entire iOS experience, and new privacy information in the App Store gives users even more transparency and control over their personal information. On iOS 14, apps will be required to ask users for permission to track them across apps and websites owned by other companies. Later this year, the App Store will help users understand apps’ privacy practices, and you’ll need to enter your privacy practice details into App Store Connect for display on your App Store product page.

Siri Event Suggestions Markup

You can use the Siri Event Suggestions Markup to provide event details on a webpage and in email. Siri parses travel arrangements, movies, sporting events, live shows, restaurant reservations, and social events. Once parsed, Siri can suggest driving directions, a ride share to a scheduled event, or activation of Do Not Disturb just before a show starts. To learn how to integrate your own events with Siri, see the Siri Event Suggestions Markup documentation.


PencilKit

PencilKit now enables handwriting recognition inside text fields. Using gestures, users can also select or delete text, and join or break up words. You can add data detection to your app, as well as text and shape recognition and selection. For more information, see the PencilKit framework documentation.


Accessibility

A new Accessibility framework lets your app dynamically deliver a subset of accessible content to a user based on context.


MetricKit

MetricKit adds Diagnostics, a new type of payload that tracks specific app failures, such as crashes or disk-write exceptions. For more information, see the MetricKit framework documentation.

Family Sharing for In-App Purchases

Family Sharing is a simple way for users to share subscriptions, purchases, and more with everyone in their household. And with iOS 14, you can choose to offer Family Sharing for your users’ in-app purchases and subscriptions so their whole family can enjoy the added benefits. See the SKProduct and SKPaymentTransactionObserver for the new APIs.

Screen Time

iOS 14 includes Screen Time APIs for sharing and managing web-usage data and observing changes a parent or guardian makes. For more details, see the Screen Time framework documentation.

Uniform Type Identifiers

Use the new Uniform Type Identifiers framework to describe file formats and in-memory data for transfer, such as to the pasteboard, and to identify resources, such as directories, volumes, and packages.
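A short sketch of the typed identifiers; the "md" extension lookup is an illustrative example:

```swift
import UniformTypeIdentifiers

// Work with typed identifiers instead of raw UTI strings.
let png = UTType.png
print(png.preferredMIMEType ?? "")   // "image/png"

// Look up a type from a filename extension.
if let markdown = UTType(filenameExtension: "md") {
    // Conformance checks replace string comparisons against the type tree.
    print(markdown.conforms(to: .text))
}
```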

File Compression

Use the new Apple Archive framework to perform fast, multithreaded, lossless compression of directories, files, and data in iOS.