What's new in AVKit
Learn about enhancements to Picture in Picture and full screen improvements on macOS. Explore the new content source API, and learn how AVPictureInPictureController supports AVSampleBufferDisplayLayer, as well as recommended steps for an app to provide a seamless full screen experience on macOS or in a Mac Catalyst app.
- Have a question? Ask with tag wwdc21-10290
♪ Bass music playing ♪
Marty Pye: Welcome to "What's new in AVKit."
My name is Marty Pye and I'm an engineer on the AVKit team.
Today, I'd like to talk about some of the enhancements we've made to Picture in Picture -- or PiP for short -- as well as to the full-screen experience on macOS.
Let's start with Picture in Picture.
With Picture in Picture, users can continue to enjoy their video content while multitasking with their device.
For example, if you're watching a video full screen and you receive a message, you can briefly reply to that message while continuing to watch your content.
The video will automatically enter PiP, and once you've finished replying, you can quickly resume full-screen playback.
This makes for a really seamless viewing experience, and we think users will expect this behavior whenever they're watching videos.
For more information on how to integrate PiP into your own apps, I encourage you to watch this 2019 session on delivering intuitive media playback with AVKit.
New this year, if your video is playing inline, you can optionally allow it to automatically enter PiP when a user swipes back to the Home screen.
You enable this behavior via the canStartPictureInPictureAutomaticallyFromInline property.
This property is available both on AVPlayerViewController for apps using our native controls and on AVPictureInPictureController for apps implementing their own custom UI.
Make sure to only set this flag to true when the playing content is intended to be the user's primary focus.
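Putting the pieces above together, a minimal sketch might look like this. The streaming URL and surrounding setup are placeholders, not from the session:

```swift
import AVKit

// Hypothetical setup: an inline player presented with native controls.
let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
let playerViewController = AVPlayerViewController()
playerViewController.player = player

// Opt in only when this video is the user's primary focus: the inline
// player may then enter PiP automatically when the user swipes Home.
playerViewController.canStartPictureInPictureAutomaticallyFromInline = true
```

The same property exists on AVPictureInPictureController for apps with custom playback UI.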
If you're using AVPlayerViewController to present video content, PiP is handled for you.
There's nothing you need to do.
If you're not using AVPlayerViewController, you can still use AVPictureInPictureController to bring the native PiP experience to your app.
First, you need to configure your app's audio session category for playback and enable the PiP background mode.
Then, all you need to do is create a pictureInPictureController, passing in a reference to the playerLayer.
Then, when a user attempts to toggle Picture in Picture using the button you provide, you just need to call startPictureInPicture or stopPictureInPicture on the controller object.
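The custom-controls path described above could be sketched like this. The function names are illustrative; `playerLayer` is assumed to be the AVPlayerLayer rendering your video, and the PiP background mode must also be enabled in your target's capabilities:

```swift
import AVKit
import AVFoundation

// 1. Configure the audio session and create the PiP controller.
func makePiPController(for playerLayer: AVPlayerLayer) -> AVPictureInPictureController? {
    try? AVAudioSession.sharedInstance().setCategory(.playback)
    guard AVPictureInPictureController.isPictureInPictureSupported() else {
        return nil  // e.g. unsupported hardware
    }
    return AVPictureInPictureController(playerLayer: playerLayer)
}

// 2. Toggle PiP from your own button's action.
func togglePiP(_ controller: AVPictureInPictureController) {
    if controller.isPictureInPictureActive {
        controller.stopPictureInPicture()
    } else {
        controller.startPictureInPicture()
    }
}
```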
Up until now, our Picture in Picture experience was built around AVPlayer-based content.
Today, I'm excited to announce the same level of support for AVSampleBufferDisplayLayer.
Instead of creating the Picture in Picture controller with a player layer, you first create a ContentSource, which you set up with either an AVPlayerLayer or -- as shown here -- with an AVSampleBufferDisplayLayer.
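As a sketch of that setup, assuming `sampleBufferDisplayLayer` renders your media and `self` conforms to the playback delegate protocol discussed next:

```swift
import AVKit

// Create a content source wrapping the sample buffer display layer,
// then hand it to the PiP controller.
let contentSource = AVPictureInPictureController.ContentSource(
    sampleBufferDisplayLayer: sampleBufferDisplayLayer,
    playbackDelegate: self)
let pipController = AVPictureInPictureController(contentSource: contentSource)

// The AVPlayer-based equivalent uses the other initializer:
// AVPictureInPictureController.ContentSource(playerLayer: playerLayer)
```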
For a user, the Picture in Picture experience will be identical.
For you as a developer, there are some new responsibilities associated with supporting PiP for AVSampleBufferDisplayLayer.
Let's take a look at this playback delegate.
Because media playback is not managed by an AVPlayer, we have to rely on playback state information provided via the new AVPictureInPictureSampleBufferPlaybackDelegate in order to render the PiP UI.
When the user attempts to control media from the PiP UI, we forward those commands to the delegate to handle.
Let's go through the five individual callbacks one by one.
The setPlaying function is called when the user presses the Play/Pause button in the PiP window.
The skipByInterval function is called when the user presses one of the skip buttons.
Use these callbacks to control your media accordingly.
The timeRangeForPlayback function allows you to specify the currently playable time range.
This allows us to render the timeline and show where the playhead is currently.
Time ranges with a finite duration should always contain the current time of the sample buffer display layer's timebase.
Use a time range with an infinite duration to indicate live content.
The didTransitionToRenderSize function is called when the Picture in Picture window changes size, such as during pinch-to-zoom.
Take this render size into account when choosing media variants in order to avoid unnecessary decoding overhead.
The isPlaybackPaused function is called periodically and informs the Picture in Picture UI whether to reflect a paused or playing state.
This is conceptually the equivalent of timeControlStatus on AVPlayer.
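The five callbacks described above could be implemented along these lines. `CustomPlaybackController` and its `engine` (a stand-in for whatever object drives your sample buffer playback) are hypothetical:

```swift
import AVKit
import CoreMedia

extension CustomPlaybackController: AVPictureInPictureSampleBufferPlaybackDelegate {

    // Called when the user presses Play/Pause in the PiP window.
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    setPlaying playing: Bool) {
        playing ? engine.play() : engine.pause()
    }

    // Called when the user presses one of the skip buttons.
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    skipByInterval skipInterval: CMTime,
                                    completion completionHandler: @escaping () -> Void) {
        engine.seek(by: skipInterval) { completionHandler() }
    }

    // A finite range must contain the display layer timebase's current
    // time; an infinite duration indicates live content.
    func pictureInPictureControllerTimeRangeForPlayback(
        _ controller: AVPictureInPictureController) -> CMTimeRange {
        engine.isLive
            ? CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
            : CMTimeRange(start: .zero, duration: engine.duration)
    }

    // Called when the PiP window changes size, e.g. pinch-to-zoom.
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    didTransitionToRenderSize newRenderSize: CMVideoDimensions) {
        engine.selectVariant(for: newRenderSize)  // avoid oversized decoding
    }

    // Polled periodically; the PiP UI mirrors this state.
    func pictureInPictureControllerIsPlaybackPaused(
        _ controller: AVPictureInPictureController) -> Bool {
        engine.isPaused
    }
}
```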
Next, let's take a look at some of the improvements we've made to the full-screen experience on macOS.
In Big Sur, when you took a video full screen in a Mac Catalyst app, the video filled the entire window but not the entire screen.
Now in macOS Monterey, the video will take up the entire screen.
You end up with a true full-screen experience for both native macOS and Mac Catalyst apps.
The playback controls look the same for both.
All Mac Catalyst apps will get this new behavior automatically.
Just like in any native macOS full-screen experience, the user can swipe back to the app window.
A placeholder will be shown instead of the original video, indicating that the content is playing full screen.
This is very similar to the placeholder shown when the video is playing in Picture in Picture.
In a scenario where you present a player view controller full screen after a user selects some content, the view controller will still present in full window.
However, new in macOS Monterey, users can detach to a true full-screen playback experience by pressing the green full screen button in the top left of the window.
The full screen life cycle can be explicitly managed to provide a better user experience based on your application's needs.
Let's take a look at an example.
As we've already shown, a user should be able to take a video full screen and then swipe back to your app while playback continues.
They should be able to navigate your app freely, even if that results in the player view controller being removed from your view hierarchy.
At any point in time, they should be able to either swipe or use Mission Control to navigate back to the full-screen video.
So let's take a look at how to make that work.
You are responsible for the playerViewController's life cycle.
In order to achieve an optimal experience, you need to make sure to keep the playerViewController alive even if it's not in your app's view hierarchy.
Otherwise, when the user navigates away from the page with the video, full-screen playback will end as the playerViewController is released.
All you need to do is keep a strong reference to the playerViewController when you receive the willBeginFullScreenPresentation callback.
Then, once the user exits full screen, you'll receive the willEndFullScreenPresentation callback.
This is your opportunity to let go of the playerViewController you were keeping alive, assuming the user has navigated away from the original view it was presented from.
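On Mac Catalyst, that retention pattern might be sketched as follows. `PlaybackCoordinator` and its `fullScreenPlayerVC` property are hypothetical names:

```swift
import AVKit

extension PlaybackCoordinator: AVPlayerViewControllerDelegate {

    func playerViewController(
        _ playerViewController: AVPlayerViewController,
        willBeginFullScreenPresentationWithAnimationCoordinator
            coordinator: UIViewControllerTransitionCoordinator) {
        // Hold a strong reference so full-screen playback survives the
        // view controller leaving the app's view hierarchy.
        fullScreenPlayerVC = playerViewController
    }

    func playerViewController(
        _ playerViewController: AVPlayerViewController,
        willEndFullScreenPresentationWithAnimationCoordinator
            coordinator: UIViewControllerTransitionCoordinator) {
        // Release it once the user exits full screen, assuming the
        // originating view is no longer in the hierarchy.
        fullScreenPlayerVC = nil
    }
}
```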
The same applies for native macOS.
You can use the new playerViewDelegate to keep the playerView alive until you receive the playerViewWillExitFullScreen callback.
When a user exits full screen, you will also receive this restoreUserInterface callback.
This is an opportunity for your app to navigate back to the original page containing the video, assuming that's appropriate for your use case.
This is very similar to the existing callback you receive when a user stops Picture in Picture.
Make sure to call this completionHandler as quickly as possible so as not to block the transition from full screen to inline.
Returning false indicates that restoration failed or isn't possible, in which case the content exits full screen without an animation.
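A minimal sketch of that restoration handler on Catalyst, where `navigateBackToVideoPage` is a hypothetical helper that restores your app's original page:

```swift
import AVKit

func playerViewController(
    _ playerViewController: AVPlayerViewController,
    restoreUserInterfaceForFullScreenExitWithCompletionHandler
        completionHandler: @escaping (Bool) -> Void) {
    // Call the handler promptly so the full screen → inline
    // transition isn't blocked.
    navigateBackToVideoPage { success in
        completionHandler(success)  // false: exit without animation
    }
}
```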
With that, I would like to wrap up today's session.
We saw how to use the new content source API to add Picture in Picture support to your app when using AVSampleBufferDisplayLayer instead of AVPlayerLayer.
For macOS and Mac Catalyst, we went over the enhanced full screen experience, and outlined the necessary steps for your code to integrate seamlessly.
I hope you enjoyed today's session and I look forward to seeing some of these features integrated into your apps.
Enjoy the rest of the conference.