Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

AVKit Documentation

Posts under AVKit tag

89 Posts
Post not yet marked as solved
3 Replies
2.5k Views
AVPlayerViewController has the functionality to keep playing audio when the app is backgrounded or the device is locked. Picture in Picture obviously improves the backgrounded case, but we lose that functionality when the device is locked, since locking stops the audio. The user can hit play on the Lock Screen to continue playback, but that doesn't seem ideal. Is there any way around this, or is this the expected behaviour?
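For context, the configuration that normally keeps AVPlayer audio running under the Lock Screen is an AVAudioSession with the .playback category plus the Audio background mode capability, and remote commands registered so the Lock Screen controls target the app. A minimal sketch, assuming the Audio background mode is enabled and that player is the AVPlayer behind the AVPlayerViewController:

    import AVFoundation
    import MediaPlayer

    // Without the .playback category, locking the device pauses audio.
    func configureBackgroundAudio() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .moviePlayback)
        try session.setActive(true)
    }

    // Register Lock Screen transport controls so the system keeps the
    // app as the current "now playing" client.
    func configureRemoteCommands(for player: AVPlayer) {
        let center = MPRemoteCommandCenter.shared()
        center.playCommand.addTarget { _ in player.play(); return .success }
        center.pauseCommand.addTarget { _ in player.pause(); return .success }
    }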
Posted
by
Post marked as solved
5 Replies
1.9k Views
I am currently working on a macOS app which will be creating very large video files with up to an hour of content. However, generating the images and adding them to AVAssetWriter leads to VTDecoderXPCService using 16+ GB of memory and kernel_task using 40+ GB (the max I saw was 105 GB). It seems like the generated video is not streamed onto the disk but rather kept in memory so it can be written to disk all at once. How can I force it to stream the data to disk while the encoding is happening? By the way, my app itself consistently needs around 300 MB of memory, so I don't think I have a memory leak here. Here is the relevant code:

    func analyse() {
        self.videoWritter = try! AVAssetWriter(outputURL: outputVideo, fileType: AVFileType.mp4)
        let writeSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: videoSize.width,
            AVVideoHeightKey: videoSize.height,
            AVVideoCompressionPropertiesKey: [
                AVVideoAverageBitRateKey: 10_000_000,
            ]
        ]
        self.videoWritter!.movieFragmentInterval = CMTimeMake(value: 60, timescale: 1)
        self.frameInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: writeSettings)
        self.frameInput?.expectsMediaDataInRealTime = true
        self.videoWritter!.add(self.frameInput!)
        if self.videoWritter!.startWriting() == false {
            print("Could not write file: \(self.videoWritter!.error!)")
            return
        }
    }

    func writeFrame(frame: Frame) {
        /* some more code to determine the correct frame presentation time stamp */
        let newSampleBuffer = self.setTimeStamp(frame.sampleBuffer, newTime: self.nextFrameStartTime!)
        self.frameInput!.append(newSampleBuffer)
        /* some more code */
    }

    func setTimeStamp(_ sample: CMSampleBuffer, newTime: CMTime) -> CMSampleBuffer {
        var timing: CMSampleTimingInfo = CMSampleTimingInfo(
            duration: CMTime.zero,
            presentationTimeStamp: newTime,
            decodeTimeStamp: CMTime.zero
        )
        var newSampleBuffer: CMSampleBuffer?
        CMSampleBufferCreateCopyWithNewTiming(
            allocator: kCFAllocatorDefault,
            sampleBuffer: sample,
            sampleTimingEntryCount: 1,
            sampleTimingArray: &timing,
            sampleBufferOut: &newSampleBuffer
        )
        return newSampleBuffer!
    }

My specs: MacBook Pro 2018, 32 GB memory, macOS Big Sur 11.1
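Not part of the original post, but the standard way to keep AVAssetWriter's memory bounded for offline encoding is to let the writer pull frames, with expectsMediaDataInRealTime set to false so it can throttle the producer while it drains to disk. A minimal sketch; writerInput, queue, and nextSampleBuffer() are hypothetical stand-ins for the poster's own plumbing:

    import AVFoundation

    // Pull-style writing: append only while the input reports capacity.
    func startPullWriting(writerInput: AVAssetWriterInput,
                          queue: DispatchQueue,
                          nextSampleBuffer: @escaping () -> CMSampleBuffer?) {
        // false for offline work; true tells the writer to accept data
        // continuously, which encourages in-memory buffering.
        writerInput.expectsMediaDataInRealTime = false
        writerInput.requestMediaDataWhenReady(on: queue) {
            while writerInput.isReadyForMoreMediaData {
                guard let buffer = nextSampleBuffer() else {
                    writerInput.markAsFinished() // no more frames
                    return
                }
                writerInput.append(buffer)
            }
            // Control returns here when the input is full; the block is
            // invoked again after the writer flushes data to disk.
        }
    }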
Posted
by
Post not yet marked as solved
1 Reply
1.3k Views
I implemented a custom player and used AVPictureInPictureController. The PiP button appears, and its action calls startPictureInPicture(). controller.isPictureInPicturePossible returns true, but the AVPictureInPictureController delegate methods are never called. Any help?

    func pictureInPictureControllerWillStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        print("PIP will start")
    }

I have enabled Picture in Picture in the background modes capability, set the AVAudioSession category to playback in the app delegate, and called AVAudioSession.sharedInstance().setActive(true).
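One frequent cause of silent PiP delegates is that the AVPictureInPictureController instance is not kept alive: it holds no strong reference to itself, so a locally scoped controller is deallocated before any callbacks fire. A minimal sketch under that assumption; the class and property names are hypothetical:

    import AVKit

    final class PlayerViewController: UIViewController, AVPictureInPictureControllerDelegate {
        private let playerLayer = AVPlayerLayer(player: AVPlayer())
        // Strong reference for the lifetime of the screen.
        private var pipController: AVPictureInPictureController?

        override func viewDidLoad() {
            super.viewDidLoad()
            view.layer.addSublayer(playerLayer)
            if AVPictureInPictureController.isPictureInPictureSupported() {
                pipController = AVPictureInPictureController(playerLayer: playerLayer)
                pipController?.delegate = self
            }
        }

        func pictureInPictureControllerWillStartPictureInPicture(
            _ pictureInPictureController: AVPictureInPictureController) {
            print("PIP will start")
        }
    }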
Posted
by
Post marked as solved
4 Replies
6k Views
Hello there, our team was asked to add the possibility to manually select the video quality. I know that HLS is an adaptive stream and that, depending on the network conditions, it chooses the best quality that fits the current situation. I also tried some settings with preferredMaximumResolution and preferredPeakBitRate, but none of them worked once the user was watching the stream. I also tried replacing the currentPlayerItem with the new configuration, but this only allowed me to downgrade the quality of the video. When I wanted to set it, for example, to 4K, it did not change to that track even if I set very high values for both parameters mentioned above. My question is whether there is any method which would allow me to force a certain quality from the manifest file. I already have some extraction code which can parse the manifest file and provide all the available information, but I still couldn't figure out how to make the player reproduce a specific stream with my desired quality from the available playlist.
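For reference, a minimal sketch of the bitrate-capping approach the poster mentions; the bitrate value is hypothetical and would come from the parsed manifest. Note that these AVPlayerItem properties only cap quality downward (the player picks the best variant at or below the limits), which matches the downgrade-only behaviour described above:

    import AVFoundation

    // Cap the player to a variant at or below a given bandwidth.
    // `bitsPerSecond` would come from the parsed HLS manifest.
    func capQuality(of item: AVPlayerItem, to bitsPerSecond: Double) {
        item.preferredPeakBitRate = bitsPerSecond
        // A value of zero removes the cap and restores full adaptivity.
    }

    // Usage: limit playback to a 1080p variant's advertised bandwidth.
    // capQuality(of: player.currentItem!, to: 6_000_000)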
Posted
by
Post not yet marked as solved
1 Reply
1.1k Views
Due to legal restrictions I need to prevent my app's users from skipping and fast-forwarding the content that is played by AVPlayerViewController. I use the playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:) and playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:) delegate methods to control the skipping behaviour. However, those delegate methods are only triggered for skip +/- 10, but not for fast-forwarding/rewinding. Is there a way to prevent fast-forwarding in addition to skipping in AVPlayerViewController? Here is an example of the code I use:

    class ViewController: UIViewController {
        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            setUpPlayerViewController()
        }

        private func setUpPlayerViewController() {
            let playerViewController = AVPlayerViewController()
            playerViewController.delegate = self
            guard let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_ts/master.m3u8") else {
                debugPrint("URL is not found")
                return
            }
            let playerItem = AVPlayerItem(url: url)
            let player = AVPlayer(playerItem: playerItem)
            playerViewController.player = player
            present(playerViewController, animated: true) {
                playerViewController.player?.play()
            }
        }
    }

    extension ViewController: AVPlayerViewControllerDelegate {
        public func playerViewController(_ playerViewController: AVPlayerViewController, willResumePlaybackAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) {
            // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
            print("playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:)")
        }

        public func playerViewController(_ playerViewController: AVPlayerViewController, timeToSeekAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) -> CMTime {
            // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
            print("playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:)")
            return targetTime
        }
    }
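One blunter tool worth noting alongside the delegate approach: AVPlayerViewController's requiresLinearPlayback property hides the seeking affordances (scrubber and skip buttons) entirely, so users can only play and pause. A minimal sketch:

    import AVKit

    // Force linear playback: the system removes scrubbing and skip
    // controls, preventing any user-driven seeking.
    func makeLinearPlayerController(for player: AVPlayer) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.requiresLinearPlayback = true
        return controller
    }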
Posted
by
Post marked as solved
3 Replies
2.0k Views
I have a Catalyst application that uses (as expected) MPNowPlayingInfoCenter to set the now-playing info and MPRemoteCommandCenter to get the media events for play/pause/stop/favorite/etc. The code is shared on iOS, tvOS and watchOS, and it works correctly there. It seems not to work on macOS (the app is compiled as a Catalyst application) on Big Sur (and Monterey, fwiw). The media keys on the keyboard start the Music app, and the music section of Control Center does not show the now-playing info (nor do the media controls there send messages to the app). I seem to remember that it used to work on Catalina (at least the media-key part), and it definitely worked in a previous version of the same app that was a UIKit one. Is this a bug (worth filing feedback with Apple) or something wrong on my side? Did I forget some magic capability for macOS? The app is sandboxed and uses the hardened runtime, in case that is significant. Thank you for any hint!
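One commonly suggested difference on the Mac side, for reference: MPNowPlayingInfoCenter exposes a playbackState property on macOS and Mac Catalyst that the system consults when deciding which app owns the now-playing controls; on iOS the playback state is inferred instead. A minimal sketch under that assumption:

    import MediaPlayer

    func publishNowPlaying(title: String, isPlaying: Bool) {
        let center = MPNowPlayingInfoCenter.default()
        center.nowPlayingInfo = [MPMediaItemPropertyTitle: title]
        #if targetEnvironment(macCatalyst)
        // On Catalyst/macOS the system also checks this explicit state;
        // leaving it at its default can keep media keys pointed at Music.
        center.playbackState = isPlaying ? .playing : .paused
        #endif
    }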
Posted
by
Post not yet marked as solved
3 Replies
1.7k Views
I have a music app that can play in the background, using AVQueuePlayer. I'm in the process of adding support for CloudKit sync of the Core Data store, switching from NSPersistentContainer to NSPersistentCloudKitContainer. The initial sync can be fairly large (10,000+ records), depending on how much the user has used the app. The issue I'm seeing is this: ✅ When the app is in the foreground, CloudKit sync uses a lot of CPU, nearly 100% for a long time (this is expected during the initial sync). ✅ If I AM NOT playing music, when I put the app in the background, CloudKit sync eventually stops syncing until I bring the app to the foreground again (this is also expected). ❌ If I AM playing music, when I put the app in the background, CloudKit never stops syncing, which leads the system to terminate the app after a certain amount of time due to high CPU usage. Is there any way to pause the CloudKit sync while the app is in the background, or any other way to mitigate this?
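As far as public API goes, NSPersistentCloudKitContainer has no pause switch, but its sync activity can at least be observed, for example to defer other heavy work until the initial import settles. A minimal sketch using the container's event notification:

    import CoreData

    // Observe sync activity from NSPersistentCloudKitContainer. Each
    // event describes a setup, import, or export pass; endDate is set
    // when the pass finishes.
    func observeCloudKitSync() -> NSObjectProtocol {
        NotificationCenter.default.addObserver(
            forName: NSPersistentCloudKitContainer.eventChangedNotification,
            object: nil, queue: .main
        ) { note in
            guard let event = note.userInfo?[
                NSPersistentCloudKitContainer.eventNotificationUserInfoKey
            ] as? NSPersistentCloudKitContainer.Event else { return }
            if event.endDate != nil {
                print("CloudKit \(event.type) event finished")
            }
        }
    }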
Posted
by
Post not yet marked as solved
1 Reply
993 Views
I know that if you want background audio from AVPlayer you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have all that squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController. If you want PiP to behave as expected when you put your app into the background by switching to another app or going to the Home Screen, you can't perform the detachment operation, otherwise the PiP display fails. On an iPad, if PiP is active and you lock your device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses. If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the Lock Screen controls; this is the same on iPad and iPhone. My questions are: Is there a way to keep background-audio playback going when PiP is inactive and the device is locked (iPhone and iPad)? Is there a way to keep background-audio playback going when PiP is active and the device is locked (iPhone)?
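For readers unfamiliar with the detachment trick mentioned above, a minimal sketch; the idea, per Apple's long-standing guidance for background audio with AVPlayerLayer, is that a player attached to a rendering layer is paused when the app leaves the foreground, so the layer gives up the player while backgrounded:

    import AVFoundation
    import UIKit

    final class BackgroundAudioCoordinator {
        let player = AVPlayer()
        let playerLayer = AVPlayerLayer()

        init() {
            playerLayer.player = player
            let nc = NotificationCenter.default
            // Detach on background so audio keeps running...
            nc.addObserver(forName: UIApplication.didEnterBackgroundNotification,
                           object: nil, queue: .main) { [weak self] _ in
                self?.playerLayer.player = nil
            }
            // ...and reattach on foreground so video resumes rendering.
            nc.addObserver(forName: UIApplication.willEnterForegroundNotification,
                           object: nil, queue: .main) { [weak self] _ in
                self?.playerLayer.player = self?.player
            }
        }
    }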
Posted
by
Post not yet marked as solved
1 Reply
1.4k Views
I am trying to support video playback for remote video files in iOS. It works for a regular video, but there is a problem with slow-motion videos: the slow-motion effect is lost, i.e. a slow-motion video plays back like a regular-speed video. Here is what I am doing:

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [asset.resourceLoader setDelegate:self.urlDelegate
                                queue:[self.urlDelegate getDispatchQueue]];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    AVPlayerViewController *controller = [[AVPlayerViewController alloc] init];
    controller.player = player;
    [player play];
    [self presentViewController:controller animated:false completion:^{
        <snipped>
    }];

Note, the url points to a QuickTime (.MOV) video file on an HTTP(S) server. Again, the above code plays a slow-motion video just like a regular video, without any slow motion. Is it because AVURLAsset does not support slow-motion video? What am I missing to let AVPlayer play slow motion? I am targeting iOS 12 and above. Thanks!
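A point of context, not from the post: slow-motion captures in the Photos library are stored as a normal-rate movie plus time-mapping metadata, and Photos hands them out as an AVComposition with scaled time ranges, whereas a bare .MOV fetched over HTTP carries only the frames. A minimal Swift sketch of applying such a ramp manually; the time range and 4x factor are made-up values standing in for the original capture metadata:

    import AVFoundation

    // Rebuild a slow-motion ramp on top of a flat asset by scaling a
    // time range of a mutable composition.
    func makeSlowMotionComposition(from asset: AVAsset) throws -> AVMutableComposition {
        let composition = AVMutableComposition()
        try composition.insertTimeRange(
            CMTimeRange(start: .zero, duration: asset.duration),
            of: asset, at: .zero)
        let window = CMTimeRange(start: CMTime(seconds: 1, preferredTimescale: 600),
                                 duration: CMTime(seconds: 2, preferredTimescale: 600))
        // Stretch 2 s of media to 8 s of playback: a 4x slow-motion ramp.
        composition.scaleTimeRange(window, toDuration: CMTime(seconds: 8, preferredTimescale: 600))
        return composition
    }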
Posted
by
Post not yet marked as solved
2 Replies
1.2k Views
We have a QR-scanner feature implemented in a web view (WKWebView). When it's dark, we want to light the QR code using the flashlight on the iPhone. In general, this feature works, but without the flashlight. The problem is: if we turn on the torch, the camera preview disappears; if we turn the torch off, the preview appears again. Do you have any idea why this is so? And how can we sort it out? Thanks
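For reference, a minimal sketch of the usual torch toggle. It is not specific to WKWebView, and the conflict described above likely lies in how the page's camera capture and the torch share the device, which this sketch does not address:

    import AVFoundation

    // Toggle the torch on the default back camera.
    func setTorch(on: Bool) {
        guard let device = AVCaptureDevice.default(for: .video),
              device.hasTorch else { return }
        do {
            try device.lockForConfiguration()
            if on {
                try device.setTorchModeOn(level: AVCaptureDevice.maxAvailableTorchLevel)
            } else {
                device.torchMode = .off
            }
            device.unlockForConfiguration()
        } catch {
            print("Torch configuration failed: \(error)")
        }
    }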
Posted
by
Post not yet marked as solved
3 Replies
1.5k Views
AVPlayer was working on a previous version of Xcode, but when Xcode updated yesterday, AVPlayer stopped working in the simulator. Here is the error message: [connection] nw_connection_add_timestamp_locked_on_nw_queue [C2] Hit maximum timestamp count, will start dropping events
Posted
by
Post not yet marked as solved
1 Reply
963 Views
Is it still possible to author a QuickTime movie with hyperlinks? I'm building a website, and I know at one point you could author a QuickTime movie that supported links inside the video, either to other timestamps in the video or to other web pages. I don't want to use a custom player; I'd prefer to use the system-level one. I've seen a really amazing example of this on the mobile version of the Memory Alpha (Star Trek nerds!) website. There is a movie that plays at the top of pages that is fully interactive. Is that still supported? Is it possible to author that way? I'm not making anything insanely complicated, I just thought it would be a nice way to build a website with tools I'm more comfortable working in.
Posted
by
Post marked as solved
3 Replies
1.9k Views
When capturing RAW (not ProRAW) photos using AVCapturePhotoOutput, the resulting images are subject to a strange overexposed effect when viewed in Apple software. I have been able to recreate this in multiple iOS apps which allow RAW capture. Some users report previously normal images transforming over the span of a few minutes. I have actually watched this happen in real time: if you observe the camera roll after taking a few RAW photos, the highlights in some will randomly blow out of proportion after whatever is causing this issue kicks in. The issue can also be triggered by zooming in to one of these images from the stock Photos app. Once the overexposure happens on a given image, there doesn't appear to be a way to get it to display normally again. However, if you AirDrop an image to a different device and then back, you can see it display normally at first and then break once more. Interestingly, the photo displays completely fine when viewed in Affinity Photo or random photo viewers on Ubuntu. Sometimes the issue is not that bad, but it is often egregious, resulting in completely white areas of a previously balanced photo (see https://discussions.apple.com/thread/254424489). This definitely seems like a bug, but is there any way to prevent it? Could there be an issue with color profiles? This is not the same issue in which users think RAW photos are broken because they are viewing the associated JPG – this happens even with photos that have no embedded JPG or HEIC preview. Very similar (supposedly fixed) issue on macOS: https://www.macworld.com/article/1444389/overexposed-raw-image-export-macos-monterey-photos-fixed.html Numerous similar complaints: https://discussions.apple.com/thread/254424489 https://discussions.apple.com/thread/253179308 https://discussions.apple.com/thread/253773180 https://discussions.apple.com/thread/253954972 https://discussions.apple.com/thread/252354516
Posted
by
Post not yet marked as solved
2 Replies
1k Views
Hi, in my SwiftUI app I'm using AVPlayer and a UIViewRepresentable view to play back a video from a network server. There are many videos on the server, so I have a LazyHStack for the video thumbnails. Only one video is shown on screen at a time. When the user scrolls between videos, I create a new AVPlayerItem and replace the current player item in the AVPlayer. The problem is: the first video plays back without any problems, but when the user scrolls over to the next video, the new video plays for 1 second and then freezes while the audio track continues. The video track resumes after a few seconds. One note: if I drag the video view a bit without fully scrolling to another video, the video track resumes after I release the drag. So maybe the AVPlayerLayer was somehow not rendering? Related code pieces:

    struct PlayerView: UIViewRepresentable {
        let player: AVPlayer
        let width: CGFloat
        let height: CGFloat
        @Binding var updateCount: Int

        func makeUIView(context: Context) -> UIView {
            print("\(#function)")
            let playerLayer = AVPlayerLayer(player: player)
            let view = UIView()
            view.frame = CGRect(x: 0, y: 0, width: width, height: height)
            playerLayer.frame = view.bounds
            view.layer.addSublayer(playerLayer)
            return view
        }

        func updateUIView(_ uiView: UIView, context: Context) {
            print("\(#function)")
            guard let playerLayer = uiView.layer.sublayers?.first as? AVPlayerLayer else { return }
            uiView.frame = CGRect(x: 0, y: 0, width: width, height: height)
            if updateCount > 0 {
                playerLayer.player = player
                playerLayer.frame = uiView.bounds
            }
        }
    }

updateCount was added to force updateUIView to be called when the parent view increases the count, but it doesn't seem to help much. My question: is there any known issue with replacing the current item of an AVPlayer in a SwiftUI app? Here is my code for replacing the current item:

    let asset = AVURLAsset(url: url)
    asset.resourceLoader.setDelegate(urlDelegate, queue: urlDelegate.resourceLoaderQueue)
    let assetKeys = ["playable", "hasProtectedContent"]
    let playerItem = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: assetKeys)
    player.replaceCurrentItem(with: playerItem)

Note that I have a custom delegate to handle the resource loading for the URLAsset. This delegate is the same for all videos. The videos always play well when selected the first time (i.e. without replacing another video). One more data point: I have an Objective-C version of the same design, and it works without problems.
Posted
by
Post marked as solved
2 Replies
681 Views
I need to get the video duration in seconds. If I use the code below

    guard let playerItem = self.player?.currentItem else { return 0 }
    print("playerItem.duration", playerItem.duration, playerItem.duration.seconds, playerItem.duration.toString)

it prints

    CMTime(value: 37568, timescale: 1000, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0) 37.568 00:00:37

I wonder why it's 37.568 instead of 37.000 if the video is 00:00:37? Does it mean it is 37.5 s long?
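A short worked example of the CMTime arithmetic involved, independent of the post's player: the seconds value is simply value divided by timescale, so a display that shows 00:00:37 is just truncating the fractional part.

    import CoreMedia

    // A CMTime is a rational number: value / timescale seconds.
    let duration = CMTime(value: 37568, timescale: 1000)
    print(duration.seconds)       // 37.568  (37568 / 1000)
    print(Int(duration.seconds))  // 37      -> what "00:00:37" shows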
Posted
by
Post not yet marked as solved
1 Reply
898 Views
I am learning SwiftUI. I want to observe an AVPlayer's status so I know whether the video is paused or not. My current approach is more or less like this: I have a VideosView that holds a list of videos (in ZStack cards). VideosView has a VideosViewModel, and in VideosView's onAppear I call VideosViewModel.getItems...

    struct ItemModel: Identifiable, Codable, Hashable, Equatable {
        var id: String
        var author: String // video owner
        var url: URL? // url to the video
        var player: AVPlayer? // AVPlayer created based on self.url...

        mutating func setPlayer(_ avPlayer: AVPlayer) {
            self.player = avPlayer
        }
    }

    // vm
    class FeedViewModel: ObservableObject {
        @Published private(set) var items: [ItemModel] = []

        func getItems() async {
            do {
                // fetch data from the API
                let data = try await dataService.fetchFeeds()
                // download and attach videos
                downloadFeedVideos(data)
            } catch {
                // ....
            }
        }

        private func downloadFeedVideos(_ feeds: [ItemModel]) {
            for index in feeds.indices {
                var item = feeds[index]
                if let videoURL = item.url {
                    self.downloadQueue.queueDownloadIfFileNotExists(
                        videoURL,
                        DownloadOperation(
                            session: URLSession.shared,
                            downloadTaskURL: videoURL,
                            completionHandler: { [weak self] (localURL, response, error) in
                                guard let tempUrl = localURL else { return }
                                let saveResult = self?.fileManagerService.saveInTemp(tempUrl, fileName: videoURL.lastPathComponent)
                                switch saveResult {
                                case .success(let savedURL):
                                    DispatchQueue.main.async {
                                        // maybe this is a wrong place to have it?
                                        item.setPlayer(AVPlayer(url: savedURL))
                                        self?.items.append(item)
                                        if self?.items.count ?? 0 > 1 {
                                            // once the first video is downloaded, use all device
                                            // cores to fetch the next videos (newest iOS devices
                                            // have 6 cores)
                                            self?.downloadQueue.setMaxConcurrentOperationCount(.max)
                                        }
                                    }
                                case .none:
                                    break
                                case .failure(_):
                                    EventTracking().track("Video download fail", [
                                        "id": item.id,
                                        "ulr": videoURL.absoluteString.decodeURL()
                                    ])
                                }
                            }),
                        { fileCacheURL in
                            // file already downloaded
                            DispatchQueue.main.async {
                                item.setPlayer(AVPlayer(url: fileCacheURL))
                                self.items.append(item)
                            }
                        })
                }
            }
        }
    }

I found this article with some pseudo-code on how to track video playback state, but I'm not sure how to implement it in my code: https://developer.apple.com/documentation/avfoundation/media_playback/observing_playback_state
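A minimal sketch of one common way to do the observation, using the Combine KVO publisher that AVPlayer inherits from NSObject; where the cancellable is stored (e.g. in the view model above) is left open:

    import AVFoundation
    import Combine

    final class PlaybackObserver {
        private var cancellable: AnyCancellable?

        func observe(_ player: AVPlayer) {
            // timeControlStatus distinguishes paused, playing, and
            // "waiting to play" (buffering) states.
            cancellable = player.publisher(for: \.timeControlStatus)
                .receive(on: DispatchQueue.main)
                .sink { status in
                    switch status {
                    case .paused: print("paused")
                    case .playing: print("playing")
                    case .waitingToPlayAtSpecifiedRate: print("buffering")
                    @unknown default: break
                    }
                }
        }
    }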
Posted
by
Post not yet marked as solved
0 Replies
689 Views
I want to use AVPlayerViewController to display the video, but it should be in auto-play mode. Previously I was using AVPlayer for that and listening for the .AVPlayerItemDidPlayToEndTime notification, but I wonder if there is a better way, e.g. using AVPlayerLooper, so I don't have to use .AVPlayerItemDidPlayToEndTime anymore. I wrote something like this, but it is not working: I get a black screen with video controls, probably because the AVPlayerViewController does not have any playable content...

    struct VideoPlayerQueuedView: UIViewControllerRepresentable {
        let videoUrl: URL

        func makeUIViewController(context: Context) -> AVPlayerViewController {
            let queuePlayer = AVQueuePlayer()
            let playerViewController = AVPlayerViewController()

            // Create an AVPlayerItem from the videoUrl
            let playerItem = AVPlayerItem(url: videoUrl)

            // Create an AVPlayerLooper with the queuePlayer and the playerItem as the template item
            let playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)

            // Set the player property of AVPlayerViewController
            playerViewController.player = queuePlayer

            return playerViewController
        }

        func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
            // Update the video player if needed
        }
    }
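One detail worth noting from AVPlayerLooper's documentation: the client must keep a strong reference to the looper, and a local constant inside makeUIViewController goes away as soon as the method returns. A minimal sketch that parks the looper in the representable's Coordinator; apart from that, the structure mirrors the post's code:

    import AVKit
    import SwiftUI

    struct LoopingPlayerView: UIViewControllerRepresentable {
        let videoUrl: URL

        // The Coordinator outlives makeUIViewController, so it can hold
        // the looper for the lifetime of the view.
        final class Coordinator {
            var looper: AVPlayerLooper?
        }

        func makeCoordinator() -> Coordinator { Coordinator() }

        func makeUIViewController(context: Context) -> AVPlayerViewController {
            let queuePlayer = AVQueuePlayer()
            let item = AVPlayerItem(url: videoUrl)
            context.coordinator.looper = AVPlayerLooper(player: queuePlayer, templateItem: item)
            let controller = AVPlayerViewController()
            controller.player = queuePlayer
            queuePlayer.play() // auto-play on appearance
            return controller
        }

        func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
    }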
Posted
by