Privacy is a more important issue than ever. Learn about Apple's privacy pillars, our approach to privacy, and how to adopt the latest features on our platforms that can help you earn customer trust, create more personal experiences, and improve engagement. Explore the transparency iOS provides when your app is recording using the microphone or camera, control over location with approximate location, tracking transparency and permissions, and much more.
Hi, I'm Rohith, and I'm joined today by my colleague Brandon. Today, we're going to take you through how we build trust with our users through better privacy. I'll walk through our approach to privacy at Apple and then go through how we can apply our privacy principles to mitigate user tracking in our ecosystem.
So, what is our approach to privacy? At Apple, we have four fundamental privacy pillars that guide the products and features that we make. On-device processing-- processing data locally, without sending it to a server. Data minimization-- requesting and using only the data that you actually need. Security protections, which enforce the privacy protections on our platform. And transparency and control-- giving users understanding of, and control over, their data.
These four pillars help us build strong privacy protections into our features to continue building trust with our users.
Let's go through these one by one as I showcase how each influences changes that we've made this year, starting with on-device processing.
So, what is the benefit of operating on data without sending it off a user's device to a remote server? When you send data to a remote server, the user loses the ability to control who can access it, who the data will in turn be shared with, and what it may be used for. It also requires extra work on your part to secure customer data against breaches and other threats.
But sometimes you need to collect data to train a machine-learning model.
So we're creating more ways to leverage Core ML to build and train models on-device. It's as easy as dragging and dropping these models into your Xcode project. And keeping data local automatically takes advantage of the strong security protections we have on our devices.
We've been leveraging on-device learning for many of our features. Since iOS 13, we have been using private federated learning, or PFL, to build machine-learning models on potentially sensitive data.
PFL works by having devices send differentially private model updates instead of sending the user data. This way, we can build centralized models on our servers without ever having access to user data. Last year, we used this technology to improve models for QuickType and Siri voice recognition from users who opt in to improve our products.
And this year, we're bringing new private federated learning use cases. We leverage the powerful Neural Engine in our chips, allowing us to improve machine-learning models without revealing user data to Apple or any third parties.
To learn more about using machine learning in your apps, check out the following talks.
Additionally, we have brought dictation on-device for many languages. So when you dictate on your phone, your voice data will be processed locally, right on your device. And if you write apps that use dictation, you should specify the on-device dictation model to take advantage of this new technology.
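If your app performs its own speech recognition with the Speech framework, there's a related control for keeping audio on-device. A minimal sketch, assuming you've already obtained speech-recognition and microphone authorization (the locale is just an example):

```swift
import Speech

// Hedged sketch: keep speech recognition entirely on-device where supported.
func makeOnDeviceRequest() -> SFSpeechAudioBufferRecognitionRequest? {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        return nil // Fall back to server-based recognition, or disable the feature.
    }
    let request = SFSpeechAudioBufferRecognitionRequest()
    // With this flag set, audio never leaves the device; recognition fails
    // rather than contacting a server.
    request.requiresOnDeviceRecognition = true
    return request
}
```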
This year, we also introduced a new HomeKit feature, face recognition, so that your cameras and video doorbells can let you know who is there based on the people that you've tagged in your Photos app or from recent visitors you identify in the Home app. If you opt in to this feature, your home hub will leverage on-device intelligence to recognize people who walk by.
On-device processing is essential to many of the new features we are introducing this year. You should look for new opportunities to use on-device processing and reconsider your existing features as our devices continue to become more powerful. This is a great tool to build trust with your users by minimizing the data that you send off the device.
Let me now hand it off to Brandon to talk more about our next privacy pillar, data minimization.
Thanks, Rohith. Privacy is about building trust with your users. One of the best ways to do this is to use only the data you need to get the job done.
You can think of this as a careful balance between what data your app has access to and what it will use to deliver the feature.
Asking a user to share a lot of personal information for a feature that will only take advantage of a little forces a bigger choice than they might be ready for, just to use that feature. Trust is built over time, and showing respect for users' data by asking for as little access as possible is a great first step.
We know it's not always simple in practice, so this year we're introducing new tools that will make it easier for you to balance the amount of information your app has access to with a simple user experience. Today we'll talk about three types of data that are accessible in new ways in iOS 14: Photos, Location and Contacts.
Let's look at these in the context of a simple photo-sharing app.
The app needs access to Photos in order to pick the photo you want to share, Location, to find friends around you to share with, and Contacts, if you want to pick a friend to share with manually. Let's see how this might be implemented on iOS 13.
To select a photo to send, the user will be asked to give the app access to their whole photos library.
The app will ask for location permission to find friends in the area. Or if a user wants to send the photo to a specific friend, the app will request access to all of their contacts, so the user can pick one to share with.
To look at the whole flow, this is a lot of access and a lot of user friction. To share a photo, the user had to answer three prompts, and in the process gave access to all of their photos, their precise location and all of their contacts. This is not great from a privacy perspective. While the user did have control over sharing their information, they couldn't use the core functionality of the app without saying yes to a lot of sharing.
It's also not great from a usability perspective. But we can do better. This year, we're introducing a few new technologies to help you improve this experience in your apps, starting with Photos.
For all apps that access the photos library on iOS 14, we're introducing the "Limited Photos Library." Users can give apps access to only a limited selection of their photos instead of their entire photos library. When the user taps "Select Photos," they can pick just what they want to share with the app.
This is the new prompt for all apps that ask for photos access on iOS 14. Please test your apps for compatibility, and for more information, check out the "Limited Photos Library" session. But this still requires a prompt, and we can do even better. Most apps use only a small number of photos, like in this case, the photo you want to share with a friend.
PHPicker is a new framework that replaces UIImagePickerController that you can adopt to avoid prompting for photos access entirely. It features search and multi-select to help users find what they're looking for and doesn't require the user to grant Photos Library access.
PHPicker runs in a separate process from your app, but is rendered on top of it. Your app can't access the picker directly or take screenshots of the picker content. Only what the user actually selected is passed back to your app.
Unless your app has a strong need for access to all photos, such as to provide a backup service, you should use PHPicker. For more details on PHPicker, check out the "Meet the New Photos Picker" session.
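Presenting the picker takes only a configuration and a delegate. A sketch, where the filter and selection limit are choices for this hypothetical sharing flow, not requirements:

```swift
import UIKit
import PhotosUI

// Hedged sketch: present the system photo picker without requesting
// photo-library access; only what the user picks reaches the app.
func presentPicker(from host: UIViewController & PHPickerViewControllerDelegate) {
    var configuration = PHPickerConfiguration()
    configuration.filter = .images   // Offer photos only, not videos.
    configuration.selectionLimit = 1 // This sharing flow needs a single photo.
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = host
    host.present(picker, animated: true)
}
```

In the delegate's `picker(_:didFinishPicking:)` callback, you load images from the returned item providers; the app never sees anything the user didn't select.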
So let's look at what that means in practice for our photo-sharing app. To pick a photo to share, the user is presented with the PHPicker, where they can select just the photos they want. This is a much faster user experience than going through a prompt and results in much better privacy.
Next, let's look at Location. In iOS 14, we're adding the ability for a user to share only their approximate location with an app.
Like Limited Photos Library, this is the new access prompt for all apps that ask for location on iOS 14. When the user taps the pill, the app will receive only approximate location updates.
Many apps today have features that are a natural fit for approximate location, like this photo-sharing app for finding friends in your area.
You can request approximate location by default when asking for location authorization by setting the NSLocationDefaultAccuracyReduced key in your Info.plist.
And if your app has features that really need precise location, like turn-by-turn navigation, you can ask the user for a temporary upgrade to precise location. In this photo-sharing app, we don't need it, so we'll ask for just approximate location.
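A sketch of both sides of this, assuming NSLocationDefaultAccuracyReduced is set to YES in Info.plist; the "FriendFinding" purpose key is hypothetical and would live in your NSLocationTemporaryUsageDescriptionDictionary:

```swift
import CoreLocation

let manager = CLLocationManager()

// With NSLocationDefaultAccuracyReduced set in Info.plist, this prompt
// offers approximate location only, with no precise-location toggle.
manager.requestWhenInUseAuthorization()

// If a feature genuinely needs precise location, ask for a one-time upgrade.
func upgradeAccuracyIfNeeded() {
    if manager.accuracyAuthorization == .reducedAccuracy {
        manager.requestTemporaryFullAccuracyAuthorization(withPurposeKey: "FriendFinding")
    }
}
```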
You should take a careful look at how your app will respond to approximate location. Respect the user's intent by keeping as many features available as possible, and where appropriate, consider asking for approximate location only.
To learn more about the Core Location API changes, see "What's New in Core Location." And for a thoughtful look at the design choices Apple Maps made to adopt approximate location, see "Design for Location Privacy." Finally, let's look at how access to Contacts can be simplified in iOS 14.
This year, we've made the proactive keyboard even more intelligent. QuickType will suggest contact details from the on-device contacts database.
When a user starts typing a name in the "Enter name" field, the keyboard automatically populates the correct information for that name.
You can annotate text fields to give clues to the keyboard about what type of contact information should be suggested.
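Setting `textContentType` is enough to give the keyboard that clue. A sketch:

```swift
import UIKit

// Hedged sketch: annotated fields let QuickType suggest contact details
// without the app ever requesting Contacts access.
let nameField = UITextField()
nameField.textContentType = .name

let phoneField = UITextField()
phoneField.textContentType = .telephoneNumber
phoneField.keyboardType = .phonePad

let emailField = UITextField()
emailField.textContentType = .emailAddress
emailField.keyboardType = .emailAddress
```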
Most apps, like this photos app, can have a great experience without needing to ask for access to all of the user's contacts just to fill in a phone number or e-mail field. To learn more about how this works, see the "Autofill Everywhere" session.
Now let's recall how we started here. To share a photo, the user had to answer three prompts and grant a lot of access. Not great for privacy, and not great for ease of use.
But now with the new tools available to your app in iOS 14, you can provide a significantly better experience.
It's seamless for the user to share only the photos they want, easy to help them find friends in the area without sharing their precise location, and simple to type in a friend's name without granting access to all contacts.
Great for privacy and great for usability.
Regardless of the individual design choices your app might make with these new API, it's important to remember the principle behind them.
Data minimization is about asking for only what you need, when you need it, and using new tools to build great functionality with the minimum data possible.
It's a great way to start building trust with your users, from their first moments using your app.
Back to Rohith to talk about our next privacy principle, security.
Thanks, Brandon. Security is another fundamental pillar that grounds and enforces privacy in our ecosystem.
This year, we're leveraging security to address the problem of server-name tracking in Internet protocols.
One way that server names are exposed is with DNS queries.
When a device accesses a website, the system sends a DNS query to turn that name into an IP address.
The DNS server the device uses is usually automatically configured by the Internet provider, wireless carrier, enterprise or other network operator.
And since plain DNS queries provide neither confidentiality nor authenticity, the queries and server addresses can be read or even modified by third parties or your network operator. They can collect, monitor, retain and share information about the types of DNS queries your device sends.
But with an encrypted connection, third parties can no longer observe DNS queries.
Starting this year, Apple platforms natively support two standard encrypted DNS protocols: DNS over HTTPS (DoH) and DNS over TLS (DoT).
The supported standard encrypted DNS protocols protect both the confidentiality and the authenticity of server names. This will coexist with corporate VPNs and MDM configuration profiles.
iOS 14 also supports automatic DoH server discovery, so that devices can automatically use encrypted DNS protocols. If you host web content, you can direct devices to resolve DNS queries using your servers securely.
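Within an app, you can also require encrypted name resolution for your own connections through the Network framework. A hedged sketch, where the DoH resolver URL and hostnames are placeholders, not real services:

```swift
import Network

// Hedged sketch: require encrypted DNS for a connection, falling back to a
// specific DoH resolver if the system has none configured.
let privacyContext = NWParameters.PrivacyContext(description: "EncryptedDNS")
if let url = URL(string: "https://doh.example.net/dns-query") {
    privacyContext.requireEncryptedNameResolution(
        true,
        fallbackResolver: .https(url, serverAddresses: [
            .hostPort(host: "doh.example.net", port: 443)
        ]))
}

// Attach the privacy context to the connection's parameters.
let parameters = NWParameters.tls
parameters.setPrivacyContext(privacyContext)
let connection = NWConnection(host: "www.example.com", port: 443, using: parameters)
```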
Apple server names will automatically resolve DNS queries using Apple's DoH server. For more information about enabling these protocols, see the "Enable Encrypted DNS" talk. Another way that server names are revealed when you're browsing is through the TLS session establishment handshake. TLS is a protocol that is used to encrypt traffic on the web. Even if you've enabled encrypted DNS to make name resolution more private, TLS session establishment includes a plain-text Server Name Indication, or SNI. Just like with DNS queries, the SNI can be observed by a third party on the network, telling them the name of the server you are making a connection with.
We're currently working with the IETF to standardize methods for encrypting even more of the TLS handshake so that third parties can't snoop on your traffic.
With these updates, encryption prevents network operators and third parties from tracking your activity on the web.
This is just one of the many ways that security enforces privacy protections in our systems. Users depend on your apps to keep their data secure.
For more information about securing your apps to maintain user trust in iOS 14, please see the following talks. We've now talked through three of our privacy pillars-- on-device processing, data minimization and security protections. In addition to giving developers better ways to request access for only what they need, we're adding transparency for users to understand when apps access what data. This year, we have updates to help users better understand what data is collected by apps and on the web.
Starting with the App Store.
Starting in fall 2020, when you submit your app to the App Store, you will need to fill out a questionnaire to describe how your app uses user data.
The information you provide will be shown to users directly on your store page.
This gives users the ability to see what an app does before they download it. They will be able to see if you collect a little data or a lot of data and if any of that data is used to track them.
This information will be readily available on your App Store product page.
It will also be shown for apps in the App Stores on all platforms.
If you have installed any third-party code, such as an analytics or advertising SDK, you will also need to declare what data it collects and how it is used. Remember that SDKs run in-process with your app, so they have access to the same permissions that your app does. And as a developer, you are responsible for the handling of user data in your app and for the trust users place in it, including any code that you include. And, SDK developers, this would be a great time to update your documentation to make sure that developers understand what your SDK does. The developer documentation and questionnaire will be available later. In the meantime, reach out to your SDKs' developers to make sure you understand how they may collect and use data.
On the web, we're making changes to let you see how Intelligent Tracking Prevention, or ITP, has been protecting you.
ITP has been protecting users since iOS 11 and Safari 11.
Now you can see what known trackers ITP is protecting you from right from Safari's toolbar. And you can also dive deeper and view a full report of tracking that Safari has prevented across all browsing in the last 30 days.
We're also adding increased transparency within apps. Users may copy and paste a lot of sensitive information, like passwords, photos or even their credit-card number.
This information should only be accessed when a user wants to share their clipboard data with an app. You may not even realize that code you include is accessing the clipboard.
In iOS 14, we're making it clear to developers and users when apps access pasteboard items from another app. This includes programmatic access as well as when a user manually pastes using the keyboard, menu items or the callout bar.
This serves to provide user confirmation when they paste an item and also warn of apps that may be excessively accessing pasteboard information.
We're also bringing increased transparency to the camera and microphone.
When an app turns on the camera or mic, an indicator will appear in the status bar so users always know when apps are recording.
Control Center will additionally show which app is currently using the camera or mic or which app has recently used it. This transparency also applies to websites' use of the camera and mic permissions.
Make sure you understand how your app uses the pasteboard, microphone and camera, so that each is only accessed when a feature requires it and a user expects it. Be careful with patterns like pre-warming the pasteboard, microphone or camera to make your app more responsive: a user may be surprised if an app immediately starts recording on launch.
Make sure to include UI clues that make it clear when and why your app will access the pasteboard or start recording.
And remember that SDKs are part of your app too. You're responsible for the code in your app and the relationship with the end user. If an SDK in your app uses the pasteboard, mic or camera, it looks as if your app did so to the end user. Maintaining transparency reinforces trust the users have with your app and its use of their data.
Control over data that's shared with your app goes hand-in-hand with transparency. So let's go through the following updates we're introducing this year, starting with networking.
Devices on a local network can observe and gather information about network activity, and apps can gather information about users from the networks that they use.
Every network is unique, and it represents a user's surrounding environment. This means that apps and other observers can get information such as if they are at home and who is around them.
They can also build a profile based on connected devices, such as TVs and other accessories.
This year, we're adding control over access to the local network. When your app tries to access the local network, such as with a Bonjour or mDNS scan, the user will be prompted to grant it access. You should declare which Bonjour services your app requires in your Info.plist so that the system knows which services to provide information from.
As before, make sure you provide a usage string to explain why your app needs access, and make sure to have UI clues so that the prompt is not unexpected.
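In Info.plist, that means a usage string plus the Bonjour service types your app browses for. A sketch, where the service type and wording are hypothetical:

```xml
<!-- Hedged sketch: hypothetical usage string and Bonjour service type. -->
<key>NSLocalNetworkUsageDescription</key>
<string>Finds speakers on your network to stream shared photos to.</string>
<key>NSBonjourServices</key>
<array>
    <string>_photoshare._tcp</string>
</array>
```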
For more information about local networking in iOS 14, see the "Support Local Network Privacy in Your App" talk.
MAC addresses are identifiers used to address specific devices on a network, but they were not created for the purpose of tracking devices. In iOS 8, we introduced MAC-address randomization. This prevents users from being tracked from their MAC address when they are not connected to Wi-Fi.
But when users connect to a network, their physical Wi-Fi MAC address leaves a trail of their connectivity because the address doesn't change. As they move around from one network to another, network operators can combine the data from their Wi-Fi MAC address to create a more complete user profile which includes where they've been and details about their network activity. So this year, we're introducing private Wi-Fi address. iOS 14 will automatically manage Wi-Fi MAC addresses when joining networks. This way, the MAC address will not tie to a user's identity, and it can't be used to track them from one network to the next. A new MAC address will also be generated for networks every 24 hours, and the new private address will be used when the user leaves and rejoins the network. Users are always in control over this feature and can adjust this in their Wi-Fi settings.
So with private Wi-Fi addresses, users get a per-network address that is not linked to their identity, generated daily. And users are always in control.
We're also introducing the Nearby Interaction framework this year that allows you to take advantage of the ultra-wideband ranging we introduced with the U1 chip. Nearby Interaction is a great way to get distance and direction information for interactive games and other peer-to-peer use cases, such as confirming that a nearby phone is the one you are looking for.
When using this framework, there is no need to prompt for Bluetooth or network access. Instead, your app will prompt for session-based access.
So the data will be available while the app continues to be used in the foreground. Make sure to include a clear explanation when you prompt the user for permission, and prompt when there's context for why your app is requesting it. For more information, see the "Meet Nearby Interaction" talk.
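A sketch of the session flow described above; how you exchange discovery tokens with the peer (for example, over MultipeerConnectivity) is up to you:

```swift
import NearbyInteraction

// Hedged sketch: range against a peer whose discovery token you already
// exchanged out of band.
final class RangingController: NSObject, NISessionDelegate {
    let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        // Running the session triggers the session-scoped user prompt.
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let distance = object.distance {
                print("Peer is \(distance) meters away") // direction may also be available
            }
        }
    }
}
```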
New in iOS 14, app clips are a great way to introduce users to experiences in your application. We designed app clips from the ground up to be private, so users can feel at ease when trying them. If a user doesn't upgrade from an app clip to a full app, iOS cleans up any unused app clips so they leave no trace behind.
We've also designed new privacy-friendly location access specifically for app clips' unique use cases to help you practice data minimization. You can check to make sure that a tag hasn't moved or that a user is getting on the wait list for the right restaurant. Location confirmation reveals just enough information to accomplish this without the need for full location access.
Users can control this in the app clip card. When you ask the system if the app clip was invoked at a specific location, there is no extra prompt. But a failed check may not indicate that the user isn't nearby. The user may simply have location services turned off or they may have declined to allow location confirmation.
You should provide a way for users to complete the action, such as with additional confirmation.
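A sketch of checking the invocation location from the activation payload; the coordinates and region identifier are placeholders, and the app clip's Info.plist must include NSAppClipRequestLocationConfirmation:

```swift
import AppClip
import CoreLocation

// Hedged sketch: confirm the app clip was invoked near the expected venue.
func handleActivation(_ userActivity: NSUserActivity) {
    guard let payload = userActivity.appClipActivationPayload else { return }
    let venue = CLCircularRegion(
        center: CLLocationCoordinate2D(latitude: 37.334, longitude: -122.009),
        radius: 100,
        identifier: "restaurant-waitlist")
    payload.confirmAcquired(in: venue) { inRegion, error in
        if !inRegion {
            // May mean the user isn't nearby, or that location services are
            // off or confirmation was declined: offer a manual fallback.
        }
    }
}
```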
For more information, see the "Streamline Your App Clip" and "Design Great App Clips" talks.
New in Safari 14, Safari web extensions put users in direct control over their extensions and use of their data. Users will be able to select which websites a Safari web extension gets access to and customize it to their needs. You should configure your extension to request the minimum permissions necessary. For example, the active-tab permission allows you to run script on the current web page after the user invokes the extension without additional user confirmation.
To learn more about Safari web extensions and privacy, check out the "Meet Safari Web Extensions" talk.
And this year on the Mac, we're bringing many of the technologies and privacy protections you're familiar with on iOS to macOS. Users can now control Bluetooth access for macOS apps that use Core Bluetooth.
And if you use Catalyst to build your apps, users can also take advantage of the Limited Photos Library, HomeKit, which is available with the same authorization model as iOS, access to Media and Apple Music and access to CNCopyCurrentNetworkInfo. And just as in iOS, you will need to provide a purpose string to help users understand why you need their data.
We've built great new tools this year for you to practice transparency and control in your apps. It's important to explain and show users how their data is used and make sure that users are always in control of their data.
These four privacy pillars put users in charge of their data and their devices. Each one continues to build trust with your users, starting from their first moments in your app. Back to Brandon to talk about how these privacy pillars come together to prevent user tracking.
Since the early days of Safari, we've had tracking prevention built in to protect users as they browse the web. We've worked hard to continually update these protections, from blocking third-party cookies by default to this year's new transparency in Intelligent Tracking Prevention. This year, we're bringing this same tracking prevention mentality to apps, which is that we believe tracking should always be transparent and under our users' control.
Moving forward, App Store policy will require apps to ask before tracking users across apps and websites owned by other companies.
Your app must display this prompt and only track users across apps and websites owned by other companies if they tap "Allow Tracking." This includes tracking for targeted advertising, advertising measurement or sharing with data brokers. Let's take a closer look at what tracking across companies means in practice.
For example, if your app knows that I like bagels and don't like grapes, and then you share this with a data broker who knows that I want to be an astronaut, this counts as tracking. Importantly, this counts even if it isn't tied directly to my name, but also if it's tied to an identifier about a particular user, such as a user ID, identifier for advertising, device ID, fingerprinted ID or profile. There are a few specific circumstances where calling the tracking API isn't required.
For instance, if the linking is done solely on the user's device. That means the data can't be sent off the device in a way that can identify that user or device. Or if data is being shared with a data broker, but only for fraud detection or prevention or for security purposes. And regardless, the use must be exclusively on your behalf, not for the data broker's purposes.
To ask for permission to track a user, call the AppTrackingTransparency framework, which will result in a prompt that looks like this.
This framework requires the NSUserTrackingUsageDescription key to be filled out in your Info.plist.
You should add a clear description of why you're asking to track the user.
The IDFA is one of the identifiers that is controlled by the new tracking permission.
To request permission to track the user, call the AppTrackingTransparency framework. If the user selects "Ask App Not to Track," the IDFA API will return all zeros.
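A sketch of that call, assuming an NSUserTrackingUsageDescription string is in Info.plist:

```swift
import AppTrackingTransparency
import AdSupport

// Hedged sketch: request tracking permission before reading the IDFA.
func requestTrackingIfNeeded() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Only now does this return a real identifier.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // The IDFA reads as all zeros: stop tracking and fall back to
            // privacy-preserving measurement such as SKAdNetwork.
            break
        @unknown default:
            break
        }
    }
}
```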
The AppTrackingTransparency framework is only available in the iOS 14 SDK.
This means that if you haven't built your app against iOS 14, the IDFA will not be available, and the API will return all zeros.
Additionally, users can choose to not be asked by any app for tracking permissions.
Limit Ad Tracking is migrating to this switch. As an upgrading user would expect, if they had Limit Ad Tracking enabled, apps on iOS 14 will continue to read the IDFA as zeros, and the Request to Track switch will be off.
Just like Limit Ad Tracking, the switch will be off and disabled entirely for child accounts and shared iPads, and it can be switched off and disabled with an MDM profile.
Finally, you should keep in mind that users can disallow tracking permissions at any time.
Call the AppTrackingTransparency framework every time your app is launched before you want to use the IDFA. As is the case today, you shouldn't cache or store the IDFA.
And think about what changes you should make to stop tracking a user if they switch tracking off.
We believe privacy is a chance to innovate. With this substantial change to how tracking works, we've put together an innovative solution to help you answer a key question that normally involves tracking-- advertising attribution.
For many developers, tracking is a side effect of trying to answer business questions like "Which advertising campaign is most effective?" Apple's Search Ads attribution system doesn't track users across companies today, but what about other attribution systems? This year we're announcing major improvements to SKAdNetwork, a framework that gives you a privacy-friendly way to answer questions about advertising performance.
SKAdNetwork is engineered with privacy at its core. It uses on-device intelligence and aggregation to provide conversion measurement without users being tracked. And because it's engineered to not track users, there's no need to request permission to track. Here's how attribution works. You pay for an ad for your app to appear in a news app. If a user clicks on that ad, they can see your app in the App Store, decide to download it, launch it and end up new users of your app.
SKAdNetwork can help you answer questions like "Which advertising campaign led to the most app installations?" Here it looks like the orange background is more effective, so it's probably a good idea to run more of Campaign 1 and less of Campaign 2. It can also help you understand which apps that the ad appeared in led to the most new users of your app. Here you should probably focus on running more campaigns in the sports app. So how does this work today? When the user clicks on an ad, the advertising SDK sends a message to their ad network with a campaign ID and an identifier associated with the user, like the IDFA.
Then when the user has installed and launched the app, an advertising SDK in that app sends up the same identifier to tell the ad network. The ad network puts these pieces together and informs the developer who bought the ad that Campaign 89 in the news app resulted in a successful conversion. So what does the ad network learn? They learn a lot. They learn that this user has installed both the news app and your app, that they clicked on this ad, and that the conversion was successful.
Now let's see what this would look like without tracking.
If the user denies permission to track, the user IDs here are gone, so the ad network isn't able to learn anything about the conversion.
This is a problem for conversion measurement the traditional way.
But the ad network is only trying to measure conversions, which don't include user-specific information. We built SKAdNetwork to make it possible to learn just conversion information without user specifics.
Let's take a look at how this works under the hood. Instead of the ad SDK calling the ad network directly, let's have it call the StoreKit framework on-device and pass it the campaign ID. This information is stored by the App Store client, which also records the app download.
When the app is launched, the ad network inside your app needs to tell StoreKit that it has been successfully launched, which in turn informs the App Store client.
To prevent a unique combination of the source and destination app from identifying a user, the conversion data is sent to Apple, which checks if a sufficient number of other users have made this same conversion. If the check passes, the device is notified so that it can send the conversion data to the ad network, reporting a successful conversion from the news app to your app via the campaign you ran. And to ensure the integrity of this information, cryptographic signatures are used throughout.
Now let's look at what the ad network learns here-- only that Campaign 89 led a number of users from the news app to your app, which is what they were trying to measure all along. SKAdNetwork is a great example of applying privacy engineering techniques, building creative technical solutions to bridge the gap between features and privacy.
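On the advertised app's side, adoption amounts to two StoreKit calls. A sketch:

```swift
import StoreKit

// Hedged sketch of the advertised app's side of SKAdNetwork.
// Register after first launch so the install can be attributed.
func registerForAttribution() {
    SKAdNetwork.registerAppForAdNetworkAttribution()
}

// Optionally report a coarse conversion value. Valid values are 0–63;
// the clamping is a defensive choice in this sketch, not API behavior.
func reportConversion(_ value: Int) {
    SKAdNetwork.updateConversionValue(min(max(value, 0), 63))
}
```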
There are several groups of developers that can take steps to adopt SKAdNetwork. If you have advertising in your app, or your app is advertised in another app, pick an ad network that supports SKAdNetwork so you don't have to prompt for permission to track and will get conversion data regardless of user tracking choices. If you're an ad network, go to developer.apple.com to learn how to adopt SKAdNetwork. Please check out the "What's New with In App Purchases" session to learn more about SKAdNetwork and adoption.
To wrap up, we talked about the four key privacy pillars that guide product decisions at Apple and described how they apply to our new features this year, including new tracking protections. We hope this will inspire you to treat privacy as a chance to innovate and use some of the tools we talked about today to build trust with your users through great privacy. Thank you.