Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

AVKit Documentation

Posts under AVKit tag

89 Posts
Post not yet marked as solved
1 Reply
1k Views
We have created an HLS playback framework, and a lot of our clients are reporting this error:

{"code": -19152, "domain": "CoreMediaErrorDomain", "localizedDescription": "The operation couldn’t be completed. (CoreMediaErrorDomain error -19152 - The operation couldn’t be completed. (CoreMediaErrorDomain error -19152.))", "localizedFailureReason": "", "localizedRecoverySuggestion": ""}

We are unable to reproduce this issue on our end, but our data shows the same error happening at a significant rate. Any help or hints are welcome. Thanks
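Since the error only shows up in the field, one option is to capture AVPlayerItem's error log alongside the failure and ship it with your analytics; its entries carry the CoreMediaErrorDomain status codes, URIs, and server comments. A minimal sketch, with the report(_:) hook standing in for whatever reporting pipeline the framework already has:

import AVFoundation

// Sketch: forward HLS error-log entries to analytics when playback fails.
final class PlaybackDiagnostics {
    private var observer: NSObjectProtocol?

    func attach(to item: AVPlayerItem) {
        observer = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemNewErrorLogEntry,
            object: item,
            queue: .main
        ) { [weak self] notification in
            guard let item = notification.object as? AVPlayerItem,
                  let event = item.errorLog()?.events.last else { return }
            // `report` is a hypothetical hook into your own analytics pipeline.
            self?.report([
                "domain": event.errorDomain,
                "code": event.errorStatusCode,
                "comment": event.errorComment ?? "",
                "uri": event.uri ?? ""
            ])
        }
    }

    private func report(_ payload: [String: Any]) {
        print("HLS error:", payload)
    }
}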
Posted by
Post not yet marked as solved
2 Replies
1.4k Views
I cannot seem to create an AVAudioFile from a URL to be played in an AVAudioEngine. Here is my complete code, following the documentation.

import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {
    let audioEngine = AVAudioEngine()
    let audioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        streamAudioFromURL(urlString: "https://samplelib.com/lib/preview/mp3/sample-9s.mp3")
    }

    func streamAudioFromURL(urlString: String) {
        guard let url = URL(string: urlString) else {
            print("Invalid URL")
            return
        }
        let audioFile = try! AVAudioFile(forReading: url)
        let audioEngine = AVAudioEngine()
        let playerNode = AVAudioPlayerNode()
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
        playerNode.scheduleFile(audioFile, at: nil, completionCallbackType: .dataPlayedBack) { _ in
            /* Handle any work that's necessary after playback. */
        }
        do {
            try audioEngine.start()
            playerNode.play()
        } catch {
            /* Handle the error. */
        }
    }
}

I am getting the following error on let audioFile = try! AVAudioFile(forReading: url):

Thread 1: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.coreaudio.avfaudio Code=2003334207 "(null)" UserInfo={failed call=ExtAudioFileOpenURL((CFURLRef)fileURL, &_extAudioFile)}

I have tried many other .mp3 file URLs as well as .wav and .m4a, and none seem to work. The documentation makes this look so easy, but I have been trying for hours to no avail. If you have any suggestions, they would be greatly appreciated!
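For context on the error above: AVAudioFile(forReading:) expects a local file URL, and handing it an http(s) URL fails in ExtAudioFileOpenURL exactly like this. One workaround, sketched below under the assumption that the file can be downloaded in full before playback, is to fetch it to disk first and then open the local copy (for true streaming, AVPlayer is usually the simpler tool):

import Foundation
import AVFoundation

// Sketch: download the remote audio to a local file, then schedule it on the engine.
func downloadAndPlay(remoteURL: URL,
                     engine: AVAudioEngine,
                     playerNode: AVAudioPlayerNode) {
    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else {
            print("Download failed: \(String(describing: error))")
            return
        }
        do {
            // Move the download out of its short-lived temporary location.
            let localURL = FileManager.default.temporaryDirectory
                .appendingPathComponent(remoteURL.lastPathComponent)
            try? FileManager.default.removeItem(at: localURL)
            try FileManager.default.moveItem(at: tempURL, to: localURL)

            let audioFile = try AVAudioFile(forReading: localURL) // local file URL works
            engine.attach(playerNode)
            engine.connect(playerNode, to: engine.outputNode,
                           format: audioFile.processingFormat)
            playerNode.scheduleFile(audioFile, at: nil)
            try engine.start()
            playerNode.play()
        } catch {
            print("Playback setup failed: \(error)")
        }
    }.resume()
}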
Posted by
Post not yet marked as solved
3 Replies
1.3k Views
Hello,
We use HLS in our iOS and tvOS streaming applications. The streams are DRM protected, but we want to add another security layer: a CDN token. The token can be passed either in a header or in query parameters; both work on our CDN side. The problem is on the client side: we need to send that token and refresh it at a given interval.

We currently attach the token at setup time with

let asset = AVURLAsset(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])

and add an interceptor with asset.resourceLoader.setDelegate. That works seamlessly. Using AVAssetResourceLoaderDelegate we can intercept only the master playlist and the media playlists via

func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool

so we can refresh the CDN token only at the playlist level. The token can be in query params or in a header; it does not matter. For example, assume this is our .m3u8 file for a given live playlist:

#EXTM3U
#EXT-X-VERSION:3
#EXTINF:10.0
https://chunk1?cdntoken=A.ts
#EXTINF:10.0
https://chunk2?cdntoken=A.ts
#EXTINF:10.0
https://chunk3?cdntoken=A.ts
#EXTINF:10.0

It has three chunks with the CDN token in the query params. If we hand these chunks to AVPlayer, it plays them in order. When the CDN token changes, the CDN adds the new token to each chunk URL, the chunk URLs change, and our player stalls. The next playlist then looks like this:

#EXT-X-VERSION:3
#EXTINF:10.0
https://chunk4?cdntoken=B.ts
#EXTINF:10.0
https://chunk5?cdntoken=B.ts
#EXTINF:10.0
https://chunk6?cdntoken=B.ts
#EXTINF:10.0

The token is changed from A to B on the CDN side, and the player stalls. Is there any way to keep the player from stalling when the chunk URLs change like this?

If we put the new token in a header instead, the chunk URLs do not change, but AVPlayer does not let us intercept the chunk requests; before it calls https://chunk1?cdntoken=A.ts, we would like to intercept the request and add the new token to its header. Is there any way to intercept chunk URLs the way we intercept playlists? Thanks for answers in advance
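One pattern sometimes used to get segment requests through the delegate as well is to load the playlist under a custom URL scheme, rewrite the segment URIs inside each playlist to that scheme, and fetch the real URLs yourself with the current token attached. A rough sketch; the token-hls scheme and the currentToken hook are made up for illustration:

import AVFoundation

// Sketch of the custom-scheme interception pattern, not a drop-in solution.
// The master playlist would be loaded as token-hls://host/path.m3u8 so that every
// request (playlists and segments) is routed through this delegate.
final class TokenizingResourceLoader: NSObject, AVAssetResourceLoaderDelegate {

    // Hypothetical hook returning the latest CDN token.
    var currentToken: () -> String = { "A" }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var components = URLComponents(url: customURL, resolvingAgainstBaseURL: false) else {
            return false
        }

        // Map the custom scheme back to https and refresh the token query item.
        components.scheme = "https"
        var items = components.queryItems ?? []
        items.removeAll { $0.name == "cdntoken" }
        items.append(URLQueryItem(name: "cdntoken", value: currentToken()))
        components.queryItems = items
        guard let realURL = components.url else { return false }

        URLSession.shared.dataTask(with: realURL) { data, _, error in
            guard var data = data, error == nil else {
                loadingRequest.finishLoading(with: error)
                return
            }
            // If this is a playlist, rewrite child URIs to the custom scheme so
            // the segment requests come back through this delegate as well.
            if let text = String(data: data, encoding: .utf8), text.hasPrefix("#EXTM3U") {
                data = Data(text.replacingOccurrences(of: "https://", with: "token-hls://").utf8)
            }
            // Content-information handling is omitted here for brevity.
            loadingRequest.dataRequest?.respond(with: data)
            loadingRequest.finishLoading()
        }.resume()

        return true
    }
}

The delegate would be installed with asset.resourceLoader.setDelegate(_:queue:) on an AVURLAsset whose URL uses the custom scheme. Taking over segment loading like this bypasses some of AVFoundation's own caching and ABR handling, so treat it as a starting point rather than a recommendation.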
Posted by
Post not yet marked as solved
1 Reply
694 Views
Hi, I am having an issue with sound on a network video stream; the stream is loaded from an m3u. During playback there is no audio from the device, however when using headphones / AirPlay the audio works correctly. The other peculiar thing is that the Simulator works fine; this may be related to AirPlay working, but I don't know. Below is the view handling the playback. I'm not sure where the issue is. I can also play the videos fine when embedding the AVPlayer in its own view, but that looks messy when you have to dismiss a second window when closing the video.

#if os(iOS)
import SwiftUI
import AVKit
import MediaPlayer

struct iOSVideoLibraryView: View {
    @ObservedObject var videoLibrary: VideoLibrary
    @State private var isPlayerDismissed = false

    let LiveStreams = [GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible())]
    let VODStreams = [GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible()), GridItem(.flexible())]

    var body: some View {
        NavigationView {
            ScrollView {
                LazyVGrid(columns: LiveStreams, spacing: 20) {
                    ForEach(videoLibrary.videos, id: \.title) { video in
                        if video.type == "LIVE" {
                            Button(action: {
                                isPlayerDismissed = false // Reset the dismissal flag
                                presentVideoPlayer(videoURL: video.referenceURL)
                            }) {
                                VStack {
                                    Image(systemName: "play.circle.fill")
                                        .font(.system(size: 30)) // icon
                                        .foregroundColor(.blue)
                                    Text(video.title)
                                        .frame(width: 100, height: 50) // title bounds
                                        .font(Font.caption)
                                        .background(Color.blue)
                                        .foregroundColor(.white)
                                        .cornerRadius(3)
                                }
                                .frame(width: 70) // main button container
                                .padding()
                                .background(Color.blue.opacity(0.2))
                                .cornerRadius(10)
                            }
                        } else {
                            // Handle non-LIVE videos
                        }
                    }
                }
                .padding()
            }
            .navigationBarTitle("Live Streams")
        }
    }

    private func presentVideoPlayer(videoURL: URL) {
        let playerViewController = CustomAVPlayerViewController()
        let player = AVPlayer(url: videoURL)
        playerViewController.player = player
        player.isMuted = false
        player.play()
        DispatchQueue.main.async {
            playerViewController.modalPresentationStyle = .fullScreen
            UIApplication.shared.windows.first?.rootViewController?.present(playerViewController, animated: true, completion: nil)
        }
    }
}

class PlayerManager: NSObject, AVPictureInPictureControllerDelegate {
    static let shared = PlayerManager()

    func pictureInPictureControllerWillStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        // Perform any necessary actions when picture-in-picture starts
    }

    func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        // Perform any necessary actions when picture-in-picture stops
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: Error) {
        // Perform any necessary actions when picture-in-picture fails to start
    }
}

class CustomAVPlayerViewController: AVPlayerViewController {
    let playerManager = PlayerManager.shared
    let customPlayer = AVPlayer()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        if AVPictureInPictureController.isPictureInPictureSupported() {
            if let playerItem = customPlayer.currentItem {
                let playerLayer = AVPlayerLayer(player: customPlayer)
                playerLayer.videoGravity = .resizeAspectFill
                let pictureInPictureController = AVPictureInPictureController(playerLayer: playerLayer)
                pictureInPictureController?.delegate = playerManager
                if let pictureInPictureController = pictureInPictureController, pictureInPictureController.isPictureInPicturePossible {
                    pictureInPictureController.startPictureInPicture()
                }
            }
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        customPlayer.addObserver(self, forKeyPath: "currentItem", options: .new, context: nil)
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        customPlayer.removeObserver(self, forKeyPath: "currentItem")
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == "currentItem" {
            if let playerItem = customPlayer.currentItem {
                // Handle player item change
            }
        }
    }
}
#endif
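One thing worth ruling out for the speaker-silent / headphones-fine symptom: if the app never configures its audio session, the default ambient behavior is muted by the Ring/Silent switch, which would also explain why the Simulator and AirPlay are unaffected. A minimal sketch of configuring the session before playback (the category and mode here are assumptions about the intended behavior):

import AVFoundation

// Sketch: configure the shared audio session for movie playback so audio
// plays through the speaker even when the Ring/Silent switch is on silent.
func configureAudioSessionForPlayback() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .moviePlayback)
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}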
Posted by
Post not yet marked as solved
3 Replies
1.2k Views
When trying to present an AVPlayerViewController I am getting this error:

-AVSystemController- +[AVSystemController sharedInstance]: Failed to allocate AVSystemController, numberOfAttempts=1

and when setting the AVPlayer on it, I get

<<<< AVError >>>> AVLocalizedErrorWithUnderlyingOSStatus: Returning error (AVFoundationErrorDomain / -11,800) status (-12,746)

None of this happens on iOS 16 or lower.
Posted by
Post not yet marked as solved
0 Replies
653 Views
I need to manually set the frame of an NSTextField, but it seems that when it is added to AVPlayerView.contentOverlayView the frame gets resized so that it hugs the text. This doesn't happen when the text field is added to a simple NSView instead. Is this a bug? Is there a workaround?

class ViewController: NSViewController {
    override func loadView() {
        view = NSView()

        let text = NSTextField(frame: CGRect(x: 0, y: 0, width: 200, height: 30))
        text.translatesAutoresizingMaskIntoConstraints = false
        text.isEditable = false
        text.backgroundColor = .red
        let paragraph = NSMutableParagraphStyle()
        paragraph.alignment = .center
        text.attributedStringValue = NSAttributedString(string: "asdf", attributes: [.paragraphStyle: paragraph])
        view.addSubview(text)

        // commenting out the following 3 lines solves the issue
        let playerView = AVPlayerView(frame: CGRect(x: 0, y: 0, width: 200, height: 200))
        view.addSubview(playerView)
        playerView.contentOverlayView!.addSubview(text)

        // uncommenting the following 5 lines also solves the issue, but the wrong
        // text field frame is briefly visible before it resizes to the correct width
        // DispatchQueue.main.async {
        //     print(text.frame)
        //     text.frame.size.width = 200
        //     text.removeConstraints(text.constraints)
        // }
    }
}
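Since translatesAutoresizingMaskIntoConstraints is false here, the overlay is free to size the field from its intrinsic content size once it manages layout; one workaround to try (a sketch, not a confirmed explanation of the AVPlayerView behavior) is to pin explicit constraints instead of relying on the frame:

import AppKit
import AVKit

// Sketch: constrain the overlay text field explicitly so the content overlay
// can't shrink it to its intrinsic, text-hugging size.
func addFixedSizeLabel(to playerView: AVPlayerView) {
    guard let overlay = playerView.contentOverlayView else { return }
    let text = NSTextField(labelWithString: "asdf")
    text.translatesAutoresizingMaskIntoConstraints = false
    text.backgroundColor = .red
    text.drawsBackground = true
    text.alignment = .center
    overlay.addSubview(text)
    NSLayoutConstraint.activate([
        text.widthAnchor.constraint(equalToConstant: 200),
        text.heightAnchor.constraint(equalToConstant: 30),
        text.leadingAnchor.constraint(equalTo: overlay.leadingAnchor),
        text.bottomAnchor.constraint(equalTo: overlay.bottomAnchor)
    ])
}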
Posted by
Post not yet marked as solved
0 Replies
470 Views
Dear AVKit Engineers, I see a strange bug in AVKit. Even after the view controller hosting AVPlayerViewController is dismissed, I see the CPU spiking to over 100%, caused by animation code that keeps running after the AVPlayerViewController no longer exists ([AVMobileChromelessControlsViewController__animateSliderToTintState:duration:completionHandler:]). How does this code continue to run after the AVPlayerViewController is gone? And what can I do to fix it?
Posted by
Post not yet marked as solved
0 Replies
569 Views
I am trying to set the .commonIdentifierTitle, .iTunesMetadataTrackSubTitle, and .commonIdentifierDescription metadata on an AVPlayerItem's externalMetadata property, but unfortunately only the title and subtitle show up in the AVPlayerViewController UI. According to the WWDC22 "Create a great video playback experience" video, a description with a chevron should appear. The example code I used is exactly the same as outlined in the video: https://developer.apple.com/videos/play/wwdc2022/10147/?time=248

// Setting content external metadata
let titleItem = AVMutableMetadataItem()
titleItem.identifier = .commonIdentifierTitle
titleItem.value = // Title string

let subtitleItem = AVMutableMetadataItem()
subtitleItem.identifier = .iTunesMetadataTrackSubTitle
subtitleItem.value = // Subtitle string

let infoItem = AVMutableMetadataItem()
infoItem.identifier = .commonIdentifierDescription
infoItem.value = // Descriptive info paragraph

playerItem.externalMetadata = [titleItem, subtitleItem, infoItem]

Does anyone have a solution for this, or is this a bug in AVPlayerViewController? Thanks
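One thing that may be worth checking, offered as an assumption rather than a confirmed cause: Apple's metadata samples usually set extendedLanguageTag on each item and hand the player an immutable copy, and items without a language tag are sometimes not surfaced. A sketch of that item-building pattern:

import AVFoundation

// Sketch: build an external-metadata item; "und" marks the language as undetermined.
func makeMetadataItem(_ identifier: AVMetadataIdentifier, value: String) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.identifier = identifier
    item.value = value as NSString
    item.extendedLanguageTag = "und"
    return item.copy() as! AVMetadataItem
}

// Usage (values are placeholders):
// playerItem.externalMetadata = [
//     makeMetadataItem(.commonIdentifierTitle, value: "Title"),
//     makeMetadataItem(.iTunesMetadataTrackSubTitle, value: "Subtitle"),
//     makeMetadataItem(.commonIdentifierDescription, value: "Longer description")
// ]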
Posted by
Post not yet marked as solved
0 Replies
411 Views
I am currently showing video over LL-HLS using AVPlayer and AVPictureInPictureController. The video plays just fine when using only AVPlayer, but whenever the app enters Picture in Picture mode the player crashes or stops, emitting the errors below. The server's m3u8 file contains rendition-report and discontinuity tags. The log about ECN doesn't seem to be critical, because other sample m3u8 playlists work just fine. The difference between my server and the sample m3u8 is that mine supports multiple resolutions for various bandwidths (ABR).

======================================================
errorDate Optional(2023-07-08 14:07:42 +0000)
errorStatusCode -15410
errorDomain CoreMediaErrorDomain
errorComment Optional("Server should support ECN for Low Latency")
======================================================
======================================================
errorDate Optional(2023-07-08 14:07:47 +0000)
errorStatusCode -15418
errorDomain CoreMediaErrorDomain
errorComment Optional("Dropping out of Low Latency: missing Rendition Report")
======================================================
======================================================
errorDate Optional(2023-07-08 14:07:52 +0000)
errorStatusCode -12642
errorDomain CoreMediaErrorDomain
errorComment Optional("#EXT-X-DISCONTINUITY-SEQUENCE is no longer in Media Playlist")
======================================================
======================================================
errorDate Optional(2023-07-08 14:07:52 +0000)
errorStatusCode -12642
errorDomain CoreMediaErrorDomain
errorComment Optional("Playlist parse error")
======================================================
Posted by
Post not yet marked as solved
0 Replies
739 Views
If I disable the playback controls for an AVPlayer (showsPlaybackControls), some features of MPNowPlayingInfoCenter no longer work (play/pause, skip forward and backward). I need custom video and audio controls for the AVPlayer in my app, which is why I disabled the iOS playback controls. But I also need the features of MPNowPlayingInfoCenter. Is there another way to achieve this?
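With custom controls, the usual route is to publish the now-playing info yourself and register handlers with MPRemoteCommandCenter; a minimal sketch (the title, skip interval, and update timing are placeholders, and the audio session still needs to be active for the commands to arrive):

import MediaPlayer
import AVFoundation

// Sketch: wire up remote commands and now-playing info for a custom player UI.
func configureNowPlaying(for player: AVPlayer, title: String) {
    let commandCenter = MPRemoteCommandCenter.shared()

    commandCenter.playCommand.addTarget { _ in
        player.play()
        return .success
    }
    commandCenter.pauseCommand.addTarget { _ in
        player.pause()
        return .success
    }
    commandCenter.skipForwardCommand.preferredIntervals = [15]
    commandCenter.skipForwardCommand.addTarget { _ in
        let target = player.currentTime() + CMTime(seconds: 15, preferredTimescale: 600)
        player.seek(to: target)
        return .success
    }

    var info: [String: Any] = [MPMediaItemPropertyTitle: title]
    if let duration = player.currentItem?.duration.seconds, duration.isFinite {
        info[MPMediaItemPropertyPlaybackDuration] = duration
    }
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = player.currentTime().seconds
    info[MPNowPlayingInfoPropertyPlaybackRate] = player.rate
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}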
Posted by
Post not yet marked as solved
0 Replies
378 Views
I can display a floating (Picture in Picture) window on iOS 16, but not on iOS 15. Why?

@try {
    NSError *error = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                            mode:AVAudioSessionModeMoviePlayback
                                         options:AVAudioSessionCategoryOptionInterruptSpokenAudioAndMixWithOthers
                                           error:&error];
    // Note: this passes an orientation constant where a category is expected.
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionOrientationBack error:&error];
    [[AVAudioSession sharedInstance] setActive:YES error:&error];
} @catch (NSException *exception) {
    NSLog(@"AVAudioSession error");
}

self.pipVC = [[AVPictureInPictureController alloc] initWithPlayerLayer:self.playerLayer];
self.pipVC.delegate = self;
[self.pipVC setValue:@1 forKey:@"controlsStyle"];
if (![self.pipVC isPictureInPictureActive]) {
    [self.pipVC startPictureInPicture];
}
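One difference that often bites on iOS 15 is starting PiP immediately after creating the controller, before isPictureInPicturePossible becomes true. A Swift sketch of waiting for that flag via KVO before calling startPictureInPicture (class and property names are made up):

import AVKit

final class PiPStarter: NSObject {
    private var pipController: AVPictureInPictureController?
    private var possibleObservation: NSKeyValueObservation?

    // Sketch: create the controller, then start PiP only once the system
    // reports that picture in picture is actually possible.
    func startPiP(with playerLayer: AVPlayerLayer) {
        guard AVPictureInPictureController.isPictureInPictureSupported(),
              let controller = AVPictureInPictureController(playerLayer: playerLayer) else { return }
        pipController = controller
        possibleObservation = controller.observe(\.isPictureInPicturePossible,
                                                 options: [.initial, .new]) { controller, _ in
            if controller.isPictureInPicturePossible, !controller.isPictureInPictureActive {
                controller.startPictureInPicture()
            }
        }
    }
}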
Posted by
Post not yet marked as solved
0 Replies
552 Views
I'm trying to use the resourceLoader of an AVAsset to progressively supply media data, but I'm unable to, because the data request asks for the full content (requestsAllDataToEndOfResource = true).

class ResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        if let ci = loadingRequest.contentInformationRequest {
            ci.contentType = // public.mpeg-4
            ci.contentLength = // GBs
            ci.isEntireLengthAvailableOnDemand = false
            ci.isByteRangeAccessSupported = true
        }
        if let dr = loadingRequest.dataRequest {
            if dr.requestedLength > 200_000_000 {
                // memory pressure
                // dr.requestsAllDataToEndOfResource is true
            }
        }
        return true
    }
}

I also tried using a fragmented MP4 created with AVAssetWriter, but that didn't work either. Please let me know if it's possible for the AVAssetResourceLoader not to ask for the full content.
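For what it's worth, requestsAllDataToEndOfResource being true does not require delivering everything in one respond(with:) call; the data request can be fed in smaller chunks and finished (or abandoned) later. A sketch under that assumption, with readChunk standing in as a hypothetical source of bytes:

import AVFoundation

// Sketch: feed an AVAssetResourceLoadingRequest in chunks instead of one big buffer.
// `readChunk` returns the next piece of data starting at the given offset,
// or nil when the source is exhausted.
func serve(_ loadingRequest: AVAssetResourceLoadingRequest,
           chunkSize: Int = 1_000_000,
           readChunk: (Int64, Int) -> Data?) {
    guard let dataRequest = loadingRequest.dataRequest else { return }
    var offset = dataRequest.currentOffset

    while !loadingRequest.isCancelled {
        guard let chunk = readChunk(offset, chunkSize), !chunk.isEmpty else { break }
        dataRequest.respond(with: chunk)   // deliver partial data
        offset += Int64(chunk.count)

        // Stop once a bounded request has been satisfied.
        if !dataRequest.requestsAllDataToEndOfResource,
           offset >= dataRequest.requestedOffset + Int64(dataRequest.requestedLength) {
            break
        }
    }
    loadingRequest.finishLoading()
}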
Posted by
Post not yet marked as solved
1 Reply
682 Views
AVSpeechSynthesisVoice.speechVoices() returns voices that are no longer available after upgrading from iOS 16 to iOS 17 (although this has been an issue for a long time, I think). To reproduce:
1. On iOS 16, download one or more enhanced voices under Accessibility > Spoken Content > Voices.
2. Upgrade to iOS 17.
3. Call AVSpeechSynthesisVoice.speechVoices() and note that the voices installed in step 1 are still present, yet they are no longer downloaded and therefore don't work. There is also no property on AVSpeechSynthesisVoice to indicate whether a voice is still available.
This is a problem for apps that allow users to choose among the available system voices. I receive many support emails about this issue around iOS upgrades, and I have to tell users to re-download the voices, which is not obvious to them. I've created a feedback item for this as well (FB12994908).
Posted by
Post marked as solved
1 Reply
533 Views
Hello, I am trying to assign some metadata to my AVAssetExportSession so that the output files have the same metadata attached as the export session. I have tried import AVFoundation, import AVKit, and a combination of the two. AVMutableMetadataItem.value cannot be assigned because the compiler resolves it to the NSObject.value property or its NSObject.setValue(forKey:) method. I have also tried creating a playground and just creating the mutable metadata item there, without hitting the same errors. When I try to subclass AVMutableMetadataItem and override its open var value property, the compiler complains that the value being overridden is read-only, referring to its non-mutable counterpart AVMetadataItem.value. Does a radar need to be filed, or is there something I am not doing correctly when creating an AVMutableMetadataItem?
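In Swift, AVMutableMetadataItem.value is typed as (NSCopying & NSObjectProtocol)?, so assigning a plain Swift String trips over NSObject's value/setValue(forKey:) members; casting the value to NSString (or NSNumber/NSData as appropriate) is usually enough, with no subclassing needed. A sketch, with the export session left as an assumption:

import AVFoundation

// Sketch: build a metadata item for an export session; the cast to NSString
// satisfies the (NSCopying & NSObjectProtocol)? type of `value`.
let titleItem = AVMutableMetadataItem()
titleItem.identifier = .commonIdentifierTitle
titleItem.value = "My movie title" as NSString
titleItem.extendedLanguageTag = "und"

// exportSession is assumed to be an existing AVAssetExportSession:
// exportSession.metadata = [titleItem]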
Posted by
Post not yet marked as solved
0 Replies
635 Views
In our application, we play video-on-demand (VOD) content and display subtitles in different languages. The format we prefer for subtitles is WebVTT. We are planning to enhance caption styling (text color, background color, font weight, etc.) in our WebVTT files. In our current flow, subtitles and images are loaded in 6-second chunks. Below is an example of one of the subtitle parts we use:

WEBVTT
X-TIMESTAMP-MAP=MPEGTS:0,LOCAL:00:00:00.000
Posted by
Post not yet marked as solved
1 Reply
588 Views
Hi guys,
Setting AVPlayerViewController.transportBarCustomMenuItems is not working on tvOS. I still see the 2 default icons for Audio and Subtitles.

let menuItemAudioAndSubtitles = UIMenu(
    image: UIImage(systemName: "heart")
)
playerViewController.transportBarCustomMenuItems = [menuItemAudioAndSubtitles]

The WWDC 2021 video is insufficient to make this work: https://developer.apple.com/videos/play/wwdc2021/10191/ The video doesn't say what exactly I need to do. Do I need to disable the subtitle options?

viewController.allowedSubtitleOptionLanguages = []

This didn't work, and I still see the default icon loaded by the player. Do I need to create a subclass of AVPlayerViewController? I just want to replace those 2 default icons with 1 icon as a test, but I was unsuccessful after many hours of work. Is it mandatory to define child menu items for the main item? Or do I perhaps need to define a UIAction? The documentation and video are insufficient in providing guidance on how to do that. I did something similar more than 3 years ago, and audio and subtitles were showing at the top of the player screen as tabs, if I remember correctly. Is transportBarCustomMenuItems perhaps deprecated? Is it possible that when an AVPlayerItem is loaded and audio and subtitle tracks are detected in the stream, the AVPlayerViewController menu is automatically reset? How do I suppress this behavior? I'm currently loading AVPlayerViewController into a SwiftUI interface. Is that perhaps the problem? Should I write a SwiftUI player overlay from scratch?
Thanks, Robert
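One thing the post already suspects is worth testing first: a UIMenu with no children has nothing to display. A sketch of a transport bar item built from child UIActions (titles, icons, and handlers are placeholders; this adds a custom menu but does not remove the built-in audio/subtitle buttons):

import UIKit
import AVKit

// Sketch: add a custom menu with actionable children to the transport bar.
func addCustomTransportMenu(to playerViewController: AVPlayerViewController) {
    let favorite = UIAction(title: "Favorite",
                            image: UIImage(systemName: "heart")) { _ in
        // Handle the action.
    }
    let quality = UIMenu(title: "Quality",
                         options: [.displayInline],
                         children: [
                            UIAction(title: "Auto") { _ in },
                            UIAction(title: "1080p") { _ in }
                         ])
    let customMenu = UIMenu(title: "",
                            image: UIImage(systemName: "heart"),
                            children: [favorite, quality])
    playerViewController.transportBarCustomMenuItems = [customMenu]
}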
Posted by
Post not yet marked as solved
0 Replies
590 Views
Hey guys! I have a question about PiP (Picture in Picture) mode. Is there a way to hide controls like the play/pause and step forward buttons when using AVPictureInPictureController in UIKit? I know there is the option to set requiresLinearPlayback = true; using it, we just disable our controls. I found a possible solution by setting pipController.setValue(1, forKey: "requiresLinearPlayback"), but that seems to be part of a private API, and I'm not sure it will pass App Store review. I'm looking forward to any advice on this case and how I can handle it.
Posted by
Post not yet marked as solved
0 Replies
319 Views
Hi, We have a tvOS App with a custom player and we're getting some crashes trying to remove a periodicTimeObserver on a player instance:

Incident Identifier: 3FE68C1C-126D-4A16-BBF2-9F8D1E395548
Hardware Model: AppleTV6,2
Process: MyApp [2516]
Path: /private/var/containers/Bundle/Application/B99FEAB0-0753-48FE-A7FC-7AEB8E2361C1/MyApp.app/MyApp
Identifier: pt.appletv.bundleid
Version: 4.9.5 (2559)
AppStoreTools: 15A240a
AppVariant: 1:AppleTV6,2:16
Beta: YES
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Coalition: pt.appletv.bundleid [317]
Date/Time: 2023-09-21 18:49:39.0241 +0100
Launch Time: 2023-09-21 18:38:34.6957 +0100
OS Version: Apple TVOS 16.6 (20M73)
Release Type: User
Report Version: 104
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: MyApp [2516]
Triggered by Thread: 0

Last Exception Backtrace:
0 CoreFoundation 0x1914c12c8 __exceptionPreprocess + 160 (NSException.m:202)
1 libobjc.A.dylib 0x190cfc114 objc_exception_throw + 56 (objc-exception.mm:356)
2 AVFCore 0x1c432b89c -[AVPlayer removeTimeObserver:] + 176 (AVPlayer.m:0)
3 CustomPlayer 0x10549f670 MyPlayerViewController.removePlayerObservers(_:) + 248 (MyPlayerViewController.swift:252)
4 CustomPlayer 0x10549c978 closure #1 in MyPlayerViewController.player.didset + 68 (MyPlayerViewController.swift:98)
5 CustomPlayer 0x10549be60 thunk for @escaping @callee_guaranteed () -> () + 28 (<compiler-generated>:0)
6 libdispatch.dylib 0x190e5eef4 _dispatch_call_block_and_release + 24 (init.c:1518)
7 libdispatch.dylib 0x190e60784 _dispatch_client_callout + 16 (object.m:560)
8 libdispatch.dylib 0x190e6dd34 _dispatch_main_queue_drain + 892 (queue.c:7794)
9 libdispatch.dylib 0x190e6d9a8 _dispatch_main_queue_callback_4CF + 40 (queue.c:7954)
10 CoreFoundation 0x19142b038 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12 (CFRunLoop.c:1780)
11 CoreFoundation 0x19142569c __CFRunLoopRun + 2080 (CFRunLoop.c:3147)
12 CoreFoundation 0x191424a3c CFRunLoopRunSpecific + 584 (CFRunLoop.c:3418)
13 GraphicsServices 0x1980cab0c GSEventRunModal + 160 (GSEvent.c:2196)
14 UIKitCore 0x1da6fe6ec -[UIApplication _run] + 868 (UIApplication.m:3782)
15 UIKitCore 0x1da702bc4 UIApplicationMain + 148 (UIApplication.m:5372)
16 MyApp 0x104418268 main + 176 (main.swift:12)
17 dyld 0x1ddd81744 start + 1832 (dyldMain.cpp:1165)

Thread 0 name:
Thread 0 Crashed:
0 libsystem_kernel.dylib 0x0000000190fe69a8 __pthread_kill + 8 (:-1)
1 libsystem_pthread.dylib 0x000000019109e440 pthread_kill + 208 (pthread.c:1670)
2 libsystem_c.dylib 0x0000000190f5f8dc __abort + 124 (abort.c:155)
3 libsystem_c.dylib 0x0000000190f5f860 abort + 132 (abort.c:126)
4 libc++abi.dylib 0x0000000190da1fe0 abort_message + 128 (:-1)
5 libc++abi.dylib 0x0000000190d92be8 demangling_terminate_handler() + 300
6 libobjc.A.dylib 0x0000000190cda7d4 _objc_terminate() + 124 (objc-exception.mm:498)
7 FirebaseCrashlytics 0x0000000105118754 FIRCLSTerminateHandler() + 340 (FIRCLSException.mm:452)
8 libc++abi.dylib 0x0000000190da15c0 std::__terminate(void (*)()) + 12 (:-1)
9 libc++abi.dylib 0x0000000190da1570 std::terminate() + 52
10 libdispatch.dylib 0x0000000190e60798 _dispatch_client_callout + 36 (object.m:563)
11 libdispatch.dylib 0x0000000190e6dd34 _dispatch_main_queue_drain + 892 (queue.c:7794)
12 libdispatch.dylib 0x0000000190e6d9a8 _dispatch_main_queue_callback_4CF + 40 (queue.c:7954)
13 CoreFoundation 0x000000019142b038 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12 (CFRunLoop.c:1780)
14 CoreFoundation 0x000000019142569c __CFRunLoopRun + 2080 (CFRunLoop.c:3147)
15 CoreFoundation 0x0000000191424a3c CFRunLoopRunSpecific + 584 (CFRunLoop.c:3418)
16 GraphicsServices 0x00000001980cab0c GSEventRunModal + 160 (GSEvent.c:2196)
17 UIKitCore 0x00000001da6fe6ec -[UIApplication _run] + 868 (UIApplication.m:3782)
18 UIKitCore 0x00000001da702bc4 UIApplicationMain + 148 (UIApplication.m:5372)
19 MyApp 0x0000000104418268 main + 176 (main.swift:12)
20 dyld 0x00000001ddd81744 start + 1832 (dyldMain.cpp:1165)

The code is:

@objc public dynamic var player: AVPlayer? {
    willSet {
        removeThumbnails()
    }
    didSet {
        DispatchQueue.main.async { [weak self] in
            guard let self else { return }
            self.removePlayerObservers(oldValue)
            self.addPlayerObservers(self.player)
        }
    }
}

func removePlayerObservers(_ player: AVPlayer?) {
    if let periodicTimeObserver = periodicTimeObserver {
        player?.removeTimeObserver(periodicTimeObserver)
        self.periodicTimeObserver = nil
    }
}

What could be the problem? Thank you
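The exception -[AVPlayer removeTimeObserver:] throws in this backtrace is typically the "time observer added by a different instance" case: by the time the async block runs, oldValue may no longer be the player that created the observer, or the block may run more than once. One mitigation is to remove the observer synchronously in willSet, from the exact player it was added to. A sketch (names follow the post; treat it as a direction, not a verified fix):

import UIKit
import AVFoundation

// Sketch: remove the periodic time observer from the *old* player, synchronously,
// before the property changes, so removeTimeObserver is always called on the
// same AVPlayer instance that created the observer.
final class MyPlayerViewController: UIViewController {
    private var periodicTimeObserver: Any?

    @objc public dynamic var player: AVPlayer? {
        willSet { removePlayerObservers(player) }   // old value, still on the main thread
        didSet { addPlayerObservers(player) }
    }

    private func addPlayerObservers(_ player: AVPlayer?) {
        guard let player = player else { return }
        periodicTimeObserver = player.addPeriodicTimeObserver(
            forInterval: CMTime(seconds: 1, preferredTimescale: 600),
            queue: .main
        ) { _ in
            // Update the UI with the current time.
        }
    }

    private func removePlayerObservers(_ player: AVPlayer?) {
        if let observer = periodicTimeObserver {
            player?.removeTimeObserver(observer)
            periodicTimeObserver = nil
        }
    }
}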
Posted by
Post not yet marked as solved
1 Reply
434 Views
I want to show the user actual start and end dates of the video played on the AVPlayer time slider, instead of the video duration data. I would like to show something like this: 09:00:00 ... 12:00:00 (which indicates that the video started at 09:00:00 CET and ended at 12:00:00 CET), instead of: 00:00:00 ... 02:59:59. I would appreciate any pointers to this direction.
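AVPlayerViewController's built-in time labels are not customizable, so this generally means drawing your own time display; if the HLS stream carries EXT-X-PROGRAM-DATE-TIME tags, AVPlayerItem.currentDate() maps the playhead to wall-clock time. A sketch of deriving the labels (the time zone and the use of the seekable range are assumptions):

import AVFoundation

// Sketch: derive wall-clock labels for a custom time display.
func wallClockLabels(for item: AVPlayerItem) -> (current: String, start: String, end: String)? {
    guard let currentDate = item.currentDate() else { return nil }   // needs EXT-X-PROGRAM-DATE-TIME

    let formatter = DateFormatter()
    formatter.dateFormat = "HH:mm:ss"
    formatter.timeZone = TimeZone(identifier: "CET")   // placeholder time zone

    // Wind back/forward from the current date using the seekable range.
    guard let range = item.seekableTimeRanges.last?.timeRangeValue else { return nil }
    let offsetFromStart = item.currentTime() - range.start
    let startDate = currentDate.addingTimeInterval(-offsetFromStart.seconds)
    let endDate = startDate.addingTimeInterval(range.duration.seconds)

    return (formatter.string(from: currentDate),
            formatter.string(from: startDate),
            formatter.string(from: endDate))
}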
Posted by