The Nearby Interaction framework streams distance and direction between opted-in Apple devices containing the U1 chip. Discover how this powerful combination of hardware and software allows you to create intuitive spatial interactions based on the relative position of two or more devices. We'll walk you through this session-based API and show you how to deliver entirely new interactive experiences — all with privacy in mind.
Hello and welcome to the session "Meet Nearby Interaction." My name is Yagil, and I'm an engineer from the Location team here at Apple. Today we will talk about how to build a completely new class of user interactions that are based on spatial awareness in iOS. Spatial awareness, in essence, means understanding the physical world around you.
Such understanding in a device you're holding in your hand can translate into fluid interactions with the environment.
This is AirDrop in iOS 13. The top selection that I'm highlighting here uses spatial sensing technologies available for the first time in the iPhone 11. The user can orient their device toward someone else's device to give a strong hint to the share sheet for presenting the most relevant sharing selection.
This sort of high-fidelity interactivity is only possible thanks to the Apple-designed U1 chip.
We wanted to empower you to use such capabilities in your application. So today, we are introducing the Nearby Interaction framework, a powerful yet easy-to-use interface to spatial awareness in iOS. Let's start by talking about user control and transparency. Say your app wants to run a Nearby Interaction session with another user's device. The first thing the system will do is present a permission prompt to both users.
The user can choose between not allowing your app to interact with a nearby device, or granting it a one-time permission that remains in effect until your app is exited.
If users on both sides choose to grant permission, their devices can start to understand how far apart they are and in which relative direction.
This kind of rich spatial information enables you to create user experiences that are more natural and more intuitive in your application. Nearby Interaction will be available to your apps on U1-equipped devices running iOS 14.
Here's what we'll cover today. I will talk about spatial awareness in iOS with Nearby Interaction. Then I will take you through what it takes to start streaming relative position updates. And I'll close with some best practices for you to adopt in your application. So let's jump right in.
Nearby Interaction provides your app with two main types of output: a measurement of distance between devices, and a measurement of relative direction from one to the other. So when your app is running a Nearby Interaction session, it is able to get a continuous stream of updates containing distance and direction information. As this slide shows, these updates are bi-directional, and both sides of the session are learning about each other's relative position simultaneously. Your app is not limited to interacting with just one device. Each device can run several sessions at the same time, each session with one other peer. In this graphic, I'm showing four devices interacting with one another by each running three sessions in parallel.
Let's talk about the interface for making this happen.
You start with creating a Nearby Interaction session object. All Nearby Interactions are encapsulated in sessions. Similar to patterns in other Apple frameworks such as ARKit, you provide your session object a configuration you would like it to run with. So let's say that there are two or more users running your app, and they want to interact with one another in some spatial manner. Before this can happen, your app needs to let the system know, on both sides, how to identify the other device when it is nearby.
Peer discovery is a key concept. Devices can discover each other in a privacy-preserving manner by using something we call a discovery token.
A discovery token is a randomly generated identifier for a given device in a particular Nearby Interaction session.
The discovery token has a limited time period for which it can be used, and that time period is exactly as long as the session itself, meaning that if you invalidate the session or if the user exits your app, any session and its associated token get invalidated.
The token is generated by the system, and you receive it in your application through the session object. Each session you create has its own unique associated discovery token. Finally, your app needs to copy the discovery token from the session object and then share it between users that want to interact.
Let's go into a little bit more detail about what that means.
Let's say that your app is running on both of these devices, and that your app has some networking layer over which the devices can talk to one another. Use your networking layer to send a discovery token from one device to the other, and do exactly the same on the other side.
How do you do this? The discovery token type conforms to the NSSecureCoding protocol, which means you can easily encode it and then shuttle it over using whatever transport technology is available to your app. For example, you can use Apple's MultipeerConnectivity framework, or, if applicable to your app, via the cloud.
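Because the token conforms to NSSecureCoding, serializing it for your networking layer is straightforward. Here is a sketch; the function names are hypothetical, and how you transport the resulting bytes is up to your app:

```swift
import NearbyInteraction

// Sketch: encode this session's discovery token so it can be sent over
// your app's own networking layer. NIDiscoveryToken conforms to
// NSSecureCoding, so NSKeyedArchiver can serialize it.
func encodedToken(for session: NISession) -> Data? {
    guard let token = session.discoveryToken else { return nil }
    return try? NSKeyedArchiver.archivedData(withRootObject: token,
                                             requiringSecureCoding: true)
}

// On the receiving side, decode the peer's token from the raw bytes.
func decodedToken(from data: Data) -> NIDiscoveryToken? {
    try? NSKeyedUnarchiver.unarchivedObject(ofClass: NIDiscoveryToken.self,
                                            from: data)
}
```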
Coming back to the block diagram. The discovery token your app exchanged goes into the session configuration, and the configuration is provided to the session via the "run" method.
Let's jump into the code.
This short code snippet here shows you all you need to do to get a session going in your app. First, create a session object. Behind the scenes, this call allocates all the needed resources for running a session, including a discovery token.
Next, to be able to receive callbacks from the system, set the session's delegate.
The next step is exchanging the session-specific discovery token. This needs to be done over your app's networking layer. When the token exchange step is complete, proceed to create a session configuration. And finally, run the session with the configuration.
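The slide's snippet is not reproduced in this transcript, but the four steps above can be sketched like this. Note that `mySessionDelegate` and `peerDiscoveryToken` are hypothetical placeholders for your delegate object and the token received from the peer:

```swift
import NearbyInteraction

// 1. Create a session object. Behind the scenes, this allocates the
//    resources for the session, including a discovery token.
let session = NISession()

// 2. Set the delegate to receive callbacks from the system.
session.delegate = mySessionDelegate   // conforms to NISessionDelegate

// 3. Exchange session.discoveryToken with the peer over your app's
//    networking layer. `peerDiscoveryToken` stands for the token you
//    received back from the peer.

// 4. Create a configuration with the peer's token and run the session.
let config = NINearbyPeerConfiguration(peerToken: peerDiscoveryToken)
session.run(config)
```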
Let's keep building out this diagram to match the code.
The session takes a delegate to provide updates to, and after calling "run," your delegate starts getting updates about nearby objects. That's the basic structure of a Nearby Interaction session.
Time for a demo. Let's see a fun little app that makes use of distance and direction: Monkey Time.
I have an app here that is also running on the phone behind me on the table over there, and I already started the session. Let's see what's on the screen. We see a distance measurement at the top, and you see it's updating. And we also have our little friend, the monkey here, and he's covering his eyes. So I'm going to head over towards the other phone and see what's gonna happen.
All right. I went through a lot of new stuff, so let's recap. To run a session, first create a session object for each peer and exchange tokens with that peer.
Once discovery tokens are exchanged, create a session configuration. And to start streaming updates, run the session with the configuration. You can also pause a session and resume it later by calling the "run" method again. I mentioned that after calling "run," you will start getting updates from the session. With that in mind, let's go into a little more detail about the delegate callbacks you can implement in your app.
This is the session delegate protocol. Your app receives updates about nearby devices by implementing the didUpdate callback. There are a few more interesting methods here.
The system will notify your app whenever it is no longer interacting with a nearby object. This notification comes with a reason associated with it.
The two reasons you can expect are "timeout" or "peerEnded." They are different in some important ways. Timeout means that there wasn't any activity in the session for some period of time. This may happen, for example, if the devices are too far apart.
PeerEnded, on the other hand, means that the session was explicitly invalidated on the other end. A word of caution here. This notification is delivered on a best-effort basis and may not always be received.
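Handling the two removal reasons might look like this sketch of the delegate method. The retry-on-timeout strategy is an assumption, not something the framework prescribes:

```swift
import NearbyInteraction

// Inside your NISessionDelegate conformer. The retry-on-timeout choice
// below is an assumption; pick whatever fits your app.
func session(_ session: NISession,
             didRemove nearbyObjects: [NINearbyObject],
             reason: NINearbyObject.RemovalReason) {
    switch reason {
    case .timeout:
        // No session activity for a while -- the devices may be too far
        // apart. The tokens are still valid, so one option is to run again.
        if let config = session.configuration {
            session.run(config)
        }
    case .peerEnded:
        // The peer explicitly invalidated its session; its token can no
        // longer be used. Tear down and start over with a fresh session
        // and a new token exchange.
        session.invalidate()
    @unknown default:
        break
    }
}
```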
Those were the nearby object removal reasons. The final three delegate callbacks have to do with the session life cycle. Whenever there are conditions preventing your session from running, your delegate will get a suspension notification.
The session will be suspended, for example, when your app is no longer in the foreground. You need to wait for a notification that the suspension ended before you can use the session again.
When you finally get a callback that the suspension ended, your session will not resume automatically. This is to allow you to decide what to do with the session at this point. If you want to resume the session, you can call "run" on both sides, just like you did when you first started the session, but this time, there's no need to exchange discovery tokens.
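In code, the suspension pair of callbacks might look like this sketch:

```swift
import NearbyInteraction

// Inside your NISessionDelegate conformer.
func sessionWasSuspended(_ session: NISession) {
    // Updates have stopped, e.g. because the app left the foreground.
    // Update your UI to reflect that the peer's position is stale.
}

func sessionSuspensionEnded(_ session: NISession) {
    // The session does not resume on its own. Re-run it with its previous
    // configuration; no new discovery-token exchange is needed.
    if let config = session.configuration {
        session.run(config)
    }
}
```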
Last but not least is the sessionDidInvalidate callback, which notifies your app about session invalidation.
The session will be invalidated with an associated error code upon certain error conditions or resource constraints. Sessions that are invalidated cannot be run again, and their associated tokens cannot be used anymore. To restart interaction, you'll need to create a new session and re-exchange discovery tokens.
These were the different delegate callbacks your app might get from the system.
Back to the full diagram. This is illustrating a single session. I mentioned that your app can run several simultaneous sessions. This is what it would look like with multiple sessions.
You create a session for each peer, and you can re-use other parts of your app whenever it makes sense.
For example, here, I'm using the same delegate for all sessions.
Maintaining the principle of "one session for one peer" makes managing multiple sessions straightforward. You can interact with multiple nearby devices by creating an interaction session for each one. You can think about it like a conversation with that device. We recommend storing sessions in a data structure like a dictionary in order to map your app's notion of peer users to their respective Nearby Interaction sessions.
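A sketch of that dictionary-based bookkeeping, where `PeerID` is a hypothetical identifier type from your app's networking layer:

```swift
import NearbyInteraction

// One NISession per peer, keyed by your app's own notion of a peer user.
typealias PeerID = String
var sessions: [PeerID: NISession] = [:]

func startSession(with peer: PeerID,
                  peerToken: NIDiscoveryToken,
                  delegate: NISessionDelegate) {
    let session = NISession()
    session.delegate = delegate       // the same delegate can serve every session
    session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    sessions[peer] = session
}

func endSession(with peer: PeerID) {
    sessions[peer]?.invalidate()      // invalidated sessions cannot be run again
    sessions[peer] = nil
}
```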
All right. You've constructed and run a session or two, and you start getting updates about nearby objects. Now I'll dive into what nearby objects contain and how you can use them in your application.
Each nearby object comes with three properties. A discovery token that you provided to the session in the configuration. This property is here to allow you to tie back these updates to the user from which you received the token in the first place.
The next property is distance. It contains a distance measurement in meters, indicating how far apart the two devices are. And the final property is direction. It contains a three-dimensional unit vector pointing at the other device, relative to the local device itself.
This is what it looks like in code. A discovery token identifying this nearby object, a distance measurement in meters between you and the object, and a vector expressing the relative direction to the object. I'd like to call your attention to the fact that the distance and direction properties are nullable. This is an excellent segue to some best practices we'd like you to keep in mind. First, always verify hardware support.
Choose a strategic place in your app to check whether Nearby Interaction is supported. Make sure to fall back to a different user experience when necessary. Next, become familiar with the concept of the directional field of view.
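The support check itself is a one-liner; a minimal sketch:

```swift
import NearbyInteraction

// Check once, at a strategic point such as app launch, and fall back
// to a different user experience on unsupported hardware.
if NISession.isSupported {
    // Present the Nearby Interaction driven experience.
} else {
    // Fall back, e.g. to a purely network-based experience.
}
```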
Similar to other hardware sensors, like the camera, for example, the hardware used in Nearby Interaction is also subject to a field of view.
The field of view, notionally, is a cone that looks just like this coming out of the back of the phone. It roughly corresponds with the Ultra Wide camera's field of view on the iPhone 11.
When devices you're interacting with are inside this field of view, both distance and direction updates are expected to be produced with high confidence. But if a device is outside that directional field of view, your app may get distance updates about it but not direction updates. Remember this when developing your apps and designing your user experiences. Next, understand the impact of physical device orientation. For optimal performance, devices should be held in the portrait orientation.
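When a direction vector is available, apps typically convert it into angles for UI, such as rotating an arrow toward the peer. Here's a small sketch; the axis convention (+x right, +y up, -z straight out the back of the device) is an assumption for illustration:

```swift
import Foundation

// A stand-in for the unit direction vector delivered in updates.
struct Direction { var x: Double; var y: Double; var z: Double }

// Angle left/right of straight ahead, in radians.
// asin of the x component of a unit vector gives the horizontal angle.
func azimuth(_ d: Direction) -> Double { asin(d.x) }

// Angle above/below straight ahead, in radians (convention is an assumption).
func elevation(_ d: Direction) -> Double { atan2(d.z, d.y) }
```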
One device in portrait and the other in landscape would lead to limited measurement availability. This is something you absolutely want to avoid, so make sure that your app does not lead users to this situation.
Moving on, be mindful of occlusions. Devices that are in clear line of sight of each other will achieve optimal performance. You can think about it in terms of the devices looking at one another. Brick walls, people, or any body or object in between devices may reduce measurement availability.
Like I mentioned, the distance and direction properties of a nearby object are nullable. So whenever either one cannot be produced due to the scenarios I just talked about, these fields are expected to be nil. Be prepared for this and handle nullability accordingly in your application. As for best practices around your development process, take advantage of the natively integrated Simulator support in Xcode.
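Handling that nullability in the update callback might look like this sketch:

```swift
import NearbyInteraction

// Inside your NISessionDelegate conformer.
func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
    for object in nearbyObjects {
        if let distance = object.distance {
            // Distance in meters between the two devices.
            print("Peer is \(distance) meters away")
        }
        if let direction = object.direction {
            // Unit vector pointing at the peer, relative to this device.
            print("Direction to peer: \(direction)")
        } else {
            // The peer is likely outside the directional field of view,
            // occluded, or held in an unfavorable orientation.
            // Fall back to a distance-only experience here.
        }
    }
}
```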
The same application code that runs on actual devices will get triggered with distance and direction updates between two or more Simulator windows. We're happy to be able to ship this to you, and we hope that this functionality will simplify and accelerate your development process.
That brings us to our final best-practice recommendation. Leverage the native Xcode support for simulating Nearby Interactions.
Today we introduced Nearby Interaction. It enables you to add spatial awareness to your application and create new kinds of user interactions that are based on knowledge of relative device position. Thank you for joining the session today, and we truly can't wait to see what you'll build with this.