Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

AVKit Documentation

Posts under AVKit tag

89 Posts
Post not yet marked as solved
2 Replies
447 Views
I download an mp4 file into the app's data folder. I am able to run an AVPlayer and view the video in the app. After some time the video won't play. If I enable UIFileSharingEnabled and check whether the file exists, I can see it's there, but it won't play even when I access it through the Files app. If I copy the file from iOS to a Mac, I can play the video on the Mac, but no longer on iOS. If I delete the app, reinstall it, and download the video again with UIFileSharingEnabled on, it plays both in the app and in the Files app, but after some time it becomes unplayable again... I can see only the first frame. I can even see the length of the video. It happens with multiple videos. Any clues?
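A quick way to narrow down whether the file or the player is at fault is to ask AVFoundation directly whether the asset is still playable, and to check the file's data-protection class (a diagnostic sketch; the file URL is an assumption):

import AVFoundation

// Hypothetical URL of the downloaded file inside the app container.
let fileURL = URL(fileURLWithPath: "/path/to/video.mp4")

// A restrictive data-protection class can make a file unreadable later on.
if let attributes = try? FileManager.default.attributesOfItem(atPath: fileURL.path),
   let protection = attributes[.protectionKey] {
    print("file protection: \(protection)")
}

let asset = AVURLAsset(url: fileURL)
Task {
    do {
        // Load the properties AVPlayer needs before playback can start.
        let (isPlayable, duration) = try await asset.load(.isPlayable, .duration)
        print("playable: \(isPlayable), duration: \(duration.seconds)s")
    } catch {
        // Failure here points at the file/asset, not the player UI.
        print("asset failed to load: \(error)")
    }
}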
Posted by 7h4r05. Last updated.
Post not yet marked as solved
0 Replies
511 Views
I'm working on a custom spatial video player that uses AVSampleBufferDisplayLayer as its render layer. When I feed it CMSampleBuffers output from a VTCompressionSession using the new encoding API, it displays normally, but I don't know whether it will work on Vision Pro. Does anyone have an idea?
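For reference, the feeding pattern AVSampleBufferDisplayLayer expects is the same backpressure-driven loop on every platform that offers the class, so a sketch like the one below is a reasonable baseline (the buffer source is an assumption):

import AVFoundation

let displayLayer = AVSampleBufferDisplayLayer()
displayLayer.videoGravity = .resizeAspect

// Hypothetical source of CMSampleBuffers (e.g. from your encode/decode path).
func nextSampleBuffer() -> CMSampleBuffer? { return nil }

let feedQueue = DispatchQueue(label: "display.feed")
displayLayer.requestMediaDataWhenReady(on: feedQueue) {
    // Honor the layer's backpressure instead of enqueueing blindly.
    while displayLayer.isReadyForMoreMediaData {
        guard let buffer = nextSampleBuffer() else { return }
        displayLayer.enqueue(buffer)
    }
}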
Posted by fsjack. Last updated.
Post not yet marked as solved
3 Replies
666 Views
Loading a video that played on tvOS 17 won't now play on tvOS 17.2. It isn't true for all videos, or even all videos of a certain type. This code works fine on tvOS 17, but not on 17.2:

import SwiftUI
import AVKit

struct ContentView: View {
    var body: some View {
        let player = AVPlayer(url: URL(string: "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")!)
        VideoPlayer(player: player)
            .onAppear {
                player.play()
            }
    }
}

I have tried reloading the metadata. I tried making the player from an AVAsset rather than a URL. I can't seem to see what makes it work with some videos and not all, or what is different from tvOS 17 to 17.2.
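When a stream plays on one OS release and not another, the first thing worth logging is the AVPlayerItem's status and error, which usually names the failing component; a small observation sketch (using the URL from the post):

import AVKit
import Combine

final class PlayerDiagnostics {
    let player: AVPlayer
    private var cancellables = Set<AnyCancellable>()

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)

        // .failed carries the underlying error explaining why playback stopped.
        item.publisher(for: \.status)
            .sink { status in
                if status == .failed {
                    print("item failed: \(String(describing: item.error))")
                }
            }
            .store(in: &cancellables)
    }
}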
Posted. Last updated.
Post marked as solved
1 Reply
457 Views
I am creating a camera app where I would like music from another app (Apple Music, Spotify, etc.) to continue playing once the app is opened. Currently I am using .mixWithOthers to do this in my viewDidLoad:

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSession.Category.playback, options: [.mixWithOthers])
    try audioSession.setActive(true)
} catch {
    print("error trying to record and play audio")
}

However, I am running into an issue where the music only plays if you resume music playback once you start recording a video. Otherwise, when you open the app, the music stops when you see the preview. The interesting thing is that if you start playing music while recording, then once you stop, music continues to play in the preview view. If you close the app (not force close) and reopen it, music playback continues as expected. However, once you force close the app, it returns to the original behavior. I've tried to research this and have not been able to find anything. Any help is appreciated. Let me know if more details are needed.
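One variable worth ruling out in a recording app (a sketch of something to test, not a confirmed fix): by default AVCaptureSession reconfigures the app's audio session when it starts running, which can override a category set in viewDidLoad; opting out keeps the .mixWithOthers configuration in force.

import AVFoundation

let captureSession = AVCaptureSession()
// Stop the capture session from replacing our audio session configuration
// when the camera preview / recording starts.
captureSession.automaticallyConfiguresApplicationAudioSession = false

do {
    try AVAudioSession.sharedInstance().setCategory(
        .playAndRecord,
        options: [.mixWithOthers, .defaultToSpeaker]
    )
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("audio session setup failed: \(error)")
}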
Posted by MGTech. Last updated.
Post not yet marked as solved
0 Replies
418 Views
Hi Team, We see an issue with this version of CoreMedia requesting multiple qualities at all times for a stream. We don't see this issue on 1.0.0.21C62. We are unsure what would be causing this.

[2024-01-05 16:53:51] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1145 2529 1090199 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1146 2396 1013356 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_0.fmp4 HTTP/1.0" 200 1139 24975 1013385 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1145 2603 998670 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_1.fmp4 HTTP/1.0" 200 1138 40534 998739 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1145 2677 835327 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_2.fmp4 HTTP/1.0" 200 1138 57656 835207 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1146 2458 986038 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_1.fmp4 HTTP/1.0" 200 1139 24700 986032 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1145 2751 1013257 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_3.fmp4 HTTP/1.0" 200 1138 55900 1013324 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1146 2520 1016693 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_2.fmp4 HTTP/1.0" 200 1139 25014 1016717 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1145 2825 917753 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_4.fmp4 HTTP/1.0" 200 1138 103745 917903 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1146 2582 958102 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_3.fmp4 HTTP/1.0" 200 1139 24782 958195 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1145 2899 931101 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_5.fmp4 HTTP/1.0" 200 1138 112113 931228 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1146 2644 935550 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_4.fmp4 HTTP/1.0" 200 1139 24824 937720 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1146 2706 895680 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_5.fmp4 HTTP/1.0" 200 1139 24843 895734 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=631&_HLS_part=0 HTTP/1.0" 200 1145 2529 907045 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
Posted by jnolla. Last updated.
Post not yet marked as solved
1 Reply
433 Views
The following type of problem has appeared. I need to flip through videos. Imagine you have, say, 3 videos, and you can scroll through them and choose which video you want to watch. For this, I decided to use a TabView with the .page style, but it didn't work, and I found myself at a dead end. The TabView itself lags, the scrolling lags, the videos don't start the first time, and sometimes the control panel doesn't even appear on some videos, which makes it impossible to expand the video to full screen. The code is below. Maybe someone has encountered this problem and solved it, or maybe there are other ways to build similar logic?

let videos: [String] = ["burpee", "squat", "step-up", "sun-salute"]

var body: some View {
    TabView {
        ForEach(videos, id: \.self) { videoName in
            VideoPlayerView(videoName: videoName)
                .clipShape(RoundedRectangle(cornerRadius: 25))
        }
    }
    .frame(width: 375, height: 230)
}

struct VideoPlayerView: View {
    let videoName: String

    var body: some View {
        if let videoURL = Bundle.main.url(forResource: videoName, withExtension: "mp4") {
            VideoPlayerWrapper(player: AVPlayer(url: videoURL))
        } else {
            Text("No Video \(videoName)")
        }
    }
}

#Preview {
    VideoPlayerView(videoName: "squat")
}

struct VideoPlayerWrapper: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.showsPlaybackControls = true
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
}
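One pattern that can reduce stutter in a paged setup like this (a sketch, assuming the cost comes from rebuilding players on every page swipe): create each AVPlayer once and hand the cached instance to the wrapper, instead of constructing a new one in every body evaluation.

import AVKit
import SwiftUI

// Keeps one AVPlayer per video name for the lifetime of the owning view.
final class PlayerCache: ObservableObject {
    private var players: [String: AVPlayer] = [:]

    func player(for name: String) -> AVPlayer? {
        if let cached = players[name] { return cached }
        guard let url = Bundle.main.url(forResource: name, withExtension: "mp4") else {
            return nil
        }
        let player = AVPlayer(url: url)
        players[name] = player
        return player
    }
}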
Posted. Last updated.
Post not yet marked as solved
2 Replies
485 Views
Crash seems to be in a private Apple framework. There are some other reports of this floating around, but no solutions so far. Any ideas?

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[WebAVPlayerLayer startRedirectingVideoToLayer:forMode:]: unrecognized selector sent to instance 0x6000037033c0'
*** First throw call stack:
(
0  CoreFoundation      0x0000000187d56800 __exceptionPreprocess + 176
1  libobjc.A.dylib     0x000000018784deb4 objc_exception_throw + 60
2  CoreFoundation      0x0000000187e083bc -[NSObject(NSObject) __retain_OA] + 0
3  CoreFoundation      0x0000000187cc0a84 forwarding + 1572
4  CoreFoundation      0x0000000187cc03a0 _CF_forwarding_prep_0 + 96
5  AVKit               0x00000001bdc81f30 -[__AVPlayerLayerView startRoutingVideoToPictureInPicturePlayerLayerView] + 156
6  AVKit               0x00000001bdcf1d48 -[AVPictureInPicturePlatformAdapter(Common) _setRoutingVideoToHostedWindow:pictureInPictureViewController:source:] + 84
7  AVKit               0x00000001bdcd952c -[AVPictureInPicturePlatformAdapter startPictureInPicture] + 380
8  AVKit               0x000000022883000c -[AVPictureInPicturePlatformAdapterAccessibility startPictureInPicture] + 44
9  AVKit               0x00000001bdcddea0 -[AVPictureInPictureController startPictureInPicture] + 216
10 WebCore             0x00000001c75277c8 -[WebAVPlayerViewController startPictureInPicture] + 128
11 libdispatch.dylib   0x0000000102c64f14 _dispatch_call_block_and_release + 32
Posted by doggkruse. Last updated.
Post marked as solved
1 Reply
443 Views
On macOS Sonoma I have a SwiftUI app that correctly plays remote video files and local video files from the app bundle. Where I'm having trouble is setting up the AVPlayer URL for a UVC camera device directly connected to the Mac.

let url = URL(string: "https://some-remote-video.mp4")!
player = AVPlayer(url: url)
player.play()

Is there some magic to using a UVC device with AVPlayer, or do I need to access the UVC device differently? Thanks, Grahm
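For context, a directly connected UVC camera is a capture device rather than a media resource, so it is normally driven through AVCaptureSession and previewed with AVCaptureVideoPreviewLayer instead of AVPlayer; a minimal sketch (the .external device type is the macOS 14 name; earlier systems used .externalUnknown):

import AVFoundation

let session = AVCaptureSession()

// Discover externally connected cameras such as UVC devices.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)

if let camera = discovery.devices.first,
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
    session.startRunning()

    // Show the live feed with a preview layer rather than an AVPlayer.
    let previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.videoGravity = .resizeAspect
}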
Posted. Last updated.
Post not yet marked as solved
0 Replies
378 Views
I'm creating an app with a video player streaming video from a URL. The AVPlayer only takes the URL address of the video, but I'd like to create a custom URLRequest and then use that to stream the video from online. So the current situation is this:

let videourl = URL(string: "videofoobar.com")
let player = AVPlayer(url: videourl)
player.play()

And I would like to go to something like this:

let videourl = URL(string: "videofoobar.com")
var request = URLRequest(url: videourl)
let player = AVPlayer(request: request) // This obviously fails, any workaround?
player.play()

I know it is not possible to do it like this, as AVPlayer only takes a URL or AVPlayerItem as a parameter. Is there any workaround to make a URLRequest and then give it to AVPlayer for online streaming?
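The supported interception point is AVAssetResourceLoaderDelegate: give the asset a URL whose scheme AVFoundation doesn't recognize, and each load is handed to your delegate, where you can issue whatever URLRequest you like. A simplified sketch (the custom scheme, header value, and whole-file loading are all assumptions; a real player would need byte-range handling):

import AVFoundation
import UniformTypeIdentifiers

final class RequestLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Swap the custom scheme back to https and build our own request.
        guard let url = loadingRequest.request.url,
              var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
            return false
        }
        components.scheme = "https"
        var request = URLRequest(url: components.url!)
        request.setValue("Bearer <token>", forHTTPHeaderField: "Authorization") // assumed header

        URLSession.shared.dataTask(with: request) { data, response, error in
            if let error {
                loadingRequest.finishLoading(with: error)
            } else if let data {
                if let mimeType = response?.mimeType,
                   let type = UTType(mimeType: mimeType) {
                    loadingRequest.contentInformationRequest?.contentType = type.identifier
                }
                loadingRequest.contentInformationRequest?.contentLength = Int64(data.count)
                loadingRequest.contentInformationRequest?.isByteRangeAccessSupported = false
                loadingRequest.dataRequest?.respond(with: data)
                loadingRequest.finishLoading()
            }
        }.resume()
        return true
    }
}

// "custom-https" is a made-up scheme that forces loads through the delegate.
let asset = AVURLAsset(url: URL(string: "custom-https://videofoobar.com/video.mp4")!)
let loader = RequestLoader()
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "loader"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))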
Posted by EliasH. Last updated.
Post not yet marked as solved
0 Replies
385 Views
I’ve got some code that creates an AVPlayerItem from a URL, then creates an AVQueuePlayer from it. If I check the player item's status after that, it's still unknown. According to the docs, it'll remain unknown until it is associated with an AVPlayer, and then it "immediately begins enqueuing the item’s media and preparing it for playback." But checking the status right after that, I still get unknown, which tells me it’s not quite immediate. Is there any way to test if the player item will work immediately after creation? In this case, the problem is that my app doesn't have permission, due to it being a bookmark saved in a sandboxed app.
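Since preparation is asynchronous, the reliable approach is to observe the status rather than poll it right after association; a small sketch (the file URL is an assumption):

import AVFoundation
import Combine

let item = AVPlayerItem(url: URL(fileURLWithPath: "/path/to/video.mp4"))
let player = AVQueuePlayer(items: [item])

// The status moves off .unknown asynchronously; observe the transition.
let cancellable = item.publisher(for: \.status)
    .sink { status in
        switch status {
        case .readyToPlay:
            print("ready to play")
        case .failed:
            // A sandbox/permission problem surfaces here as item.error.
            print("failed: \(String(describing: item.error))")
        default:
            break
        }
    }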
Posted by JetForMe. Last updated.
Post not yet marked as solved
2 Replies
882 Views
In my SwiftUI/SwiftData application, I want to store videos in SwiftData objects (using external storage). To display them, I need to instantiate an AVPlayer (for use in a VideoPlayer view). But AVPlayer expects a URL, not a Data object. Obviously, I can solve this problem via a file-caching scheme (i.e., by creating a local file when needed, using an LRU cache to control its lifetime), but this results in an extra copy of the data (besides the hidden local file managed by SwiftData/CoreData). However, since videos can be quite large, I would prefer not to do that. Does anyone have thoughts on how I can avoid the extra data copy?
Posted. Last updated.
Post not yet marked as solved
0 Replies
353 Views
Is there a way to play a specific rectangular region of interest of a video in an arbitrarily-sized view? Let's say I have a 1080p video but I'm only interested in a sub-region of the full frame. Is there a way to specify a source rect to be displayed in an arbitrary view (SwiftUI view, ideally), and have it play that in real time, without having to pre-render the cropped region? Update: I may have found a solution here: img DOT ly/blog/trim-and-crop-video-in-swift/ (Apple won't allow that URL for some dumb reason)
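One real-time option (a sketch; the crop rect and asset are assumptions): an AVMutableVideoComposition assigned to AVPlayerItem.videoComposition is applied live during playback, so translating the track and shrinking renderSize crops without any pre-rendering.

import AVFoundation

func croppedPlayerItem(from asset: AVAsset, cropRect: CGRect) async throws -> AVPlayerItem {
    guard let track = try await asset.loadTracks(withMediaType: .video).first else {
        throw CocoaError(.fileReadUnknown)
    }
    let duration = try await asset.load(.duration)

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = cropRect.size
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: duration)

    // Shift the frame so the crop rect's origin lands at (0, 0);
    // renderSize then clips everything outside the region of interest.
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    layerInstruction.setTransform(
        CGAffineTransform(translationX: -cropRect.minX, y: -cropRect.minY),
        at: .zero
    )
    instruction.layerInstructions = [layerInstruction]
    videoComposition.instructions = [instruction]

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = videoComposition
    return item
}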
Posted by JetForMe. Last updated.
Post not yet marked as solved
0 Replies
577 Views
Hi, I'm trying to play multiple video/audio files with AVPlayer using AVMutableComposition. Each video/audio file can play simultaneously, so I put each video/audio in an individual track. I use only local files.

let second = CMTime(seconds: 1, preferredTimescale: 1000)
let duration = CMTimeRange(start: .zero, duration: second)
var currentTime = CMTime.zero

for _ in 0...4 {
    let mutableTrack = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid
    )
    try mutableTrack?.insertTimeRange(
        duration,
        of: audioAssetTrack,
        at: currentTime
    )
    currentTime = currentTime + second
}

When I set many audio tracks (maybe more than 5), the first part sounds a little different from the original when it starts. It seems like the audio's front part is skipped. But when I set only two tracks, AVPlayer plays the same as the original file.

avPlayer.play()

How can I fix it? Why do audio tracks that don't have any playing parts affect the start? Please let me know.
Posted by Mandeuk. Last updated.
Post not yet marked as solved
0 Replies
430 Views
Hi, I've started learning SwiftUI a few months ago, and now I'm trying to build my first app :) I am trying to display VTT subtitles from an external URL on a streaming video using AVPlayer and AVMutableComposition. I have been trying for a few days, checking online and Apple's documentation, but I can't manage to make it work. So far, I managed to display the subtitles, but there is no video or audio playing... Could someone help? Thanks in advance, I hope the code is not too confusing.

// EpisodeDetailView.swift
// OroroPlayer_v1
//
// Created by Juan Valenzuela on 2023-11-25.

import AVKit
import SwiftUI

struct EpisodeDetailView4: View {
    @State private var episodeDetailVM = EpisodeDetailViewModel()
    let episodeID: Int
    @State private var player = AVPlayer()
    @State private var subs = AVPlayer()

    var body: some View {
        VideoPlayer(player: player)
            .ignoresSafeArea()
            .task {
                do {
                    try await episodeDetailVM.fetchEpisode(id: episodeID)
                    let episode = episodeDetailVM.episodeDetail
                    guard let videoURLString = episode.url else {
                        print("Invalid videoURL or missing data")
                        return
                    }
                    guard let subtitleURLString = episode.subtitles?[0].url else {
                        print("Invalid subtitleURLs or missing data")
                        return
                    }
                    let videoURL = URL(string: videoURLString)!
                    let subtitleURL = URL(string: subtitleURLString)!
                    let videoAsset = AVURLAsset(url: videoURL)
                    let subtitleAsset = AVURLAsset(url: subtitleURL)
                    let movieWithSubs = AVMutableComposition()
                    let videoTrack = movieWithSubs.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
                    let audioTrack = movieWithSubs.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
                    let subtitleTrack = movieWithSubs.addMutableTrack(withMediaType: .text, preferredTrackID: kCMPersistentTrackID_Invalid)

                    if let videoTrackItem = try await videoAsset.loadTracks(withMediaType: .video).first {
                        try await videoTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)), of: videoTrackItem, at: .zero)
                    }
                    if let audioTrackItem = try await videoAsset.loadTracks(withMediaType: .audio).first {
                        try await audioTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)), of: audioTrackItem, at: .zero)
                    }
                    if let subtitleTrackItem = try await subtitleAsset.loadTracks(withMediaType: .text).first {
                        try await subtitleTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)), of: subtitleTrackItem, at: .zero)
                    }

                    let playerItem = AVPlayerItem(asset: movieWithSubs)
                    player = AVPlayer(playerItem: playerItem)
                    let playerController = AVPlayerViewController()
                    playerController.player = player
                    playerController.player?.play()
                    // player.play()
                } catch {
                    print("Error: \(error.localizedDescription)")
                }
            }
    }
}

#Preview {
    EpisodeDetailView4(episodeID: 39288)
}
Posted by JuanV. Last updated.
Post marked as solved
5 Replies
1.9k Views
I am currently working on a macOS app which will be creating very large video files with up to an hour of content. However, generating the images and adding them to AVAssetWriter leads to VTDecoderXPCService using 16+ GB of memory and the kernel task using 40+ GB (the max I saw was 105 GB). It seems like the generated video is not streamed onto the disk but rather held in memory, to be written to disk all at once. How can I force it to stream the data to disk while the encoding is happening? By the way, my app itself consistently needs around 300 MB of memory, so I don't think I have a memory leak here. Here is the relevant code:

func analyse() {
    self.videoWritter = try! AVAssetWriter(outputURL: outputVideo, fileType: AVFileType.mp4)
    let writeSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: videoSize.width,
        AVVideoHeightKey: videoSize.height,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 10_000_000,
        ]
    ]
    self.videoWritter!.movieFragmentInterval = CMTimeMake(value: 60, timescale: 1)
    self.frameInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: writeSettings)
    self.frameInput?.expectsMediaDataInRealTime = true
    self.videoWritter!.add(self.frameInput!)
    if self.videoWritter!.startWriting() == false {
        print("Could not write file: \(self.videoWritter!.error!)")
        return
    }
}

func writeFrame(frame: Frame) {
    /* some more code to determine the correct frame presentation time stamp */
    let newSampleBuffer = self.setTimeStamp(frame.sampleBuffer, newTime: self.nextFrameStartTime!)
    self.frameInput!.append(newSampleBuffer)
    /* some more code */
}

func setTimeStamp(_ sample: CMSampleBuffer, newTime: CMTime) -> CMSampleBuffer {
    var timing: CMSampleTimingInfo = CMSampleTimingInfo(
        duration: CMTime.zero,
        presentationTimeStamp: newTime,
        decodeTimeStamp: CMTime.zero
    )
    var newSampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(
        allocator: kCFAllocatorDefault,
        sampleBuffer: sample,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleBufferOut: &newSampleBuffer
    )
    return newSampleBuffer!
}

My specs: MacBook Pro 2018, 32 GB memory, macOS Big Sur 11.1
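For offline encoding, the usual way to keep memory flat is to let the writer input pull frames via its readiness callback instead of pushing with expectsMediaDataInRealTime = true; a sketch of the pull loop (the frame source closure is an assumption):

import AVFoundation

func drive(input: AVAssetWriterInput, nextBuffer: @escaping () -> CMSampleBuffer?) {
    // For non-real-time work this should be false so the writer can throttle us.
    input.expectsMediaDataInRealTime = false

    let feedQueue = DispatchQueue(label: "writer.feed")
    input.requestMediaDataWhenReady(on: feedQueue) {
        // Only append while the input can take more; this backpressure lets
        // encoded data drain to disk instead of piling up in memory.
        while input.isReadyForMoreMediaData {
            guard let buffer = nextBuffer() else {
                input.markAsFinished()
                return
            }
            input.append(buffer)
        }
    }
}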
Posted by nbde. Last updated.
Post not yet marked as solved
0 Replies
339 Views
Use AVPlayer to play and AVAssetResourceLoaderDelegate to read data. The following errors occasionally occur during playback:

-11819: Cannot Complete Action
-11800: The operation could not be completed
-11829: Cannot Open
-11849: Operation Stopped
-11870: This operation could not be completed
-1002: unsupported URL
-11850: Operation stopped
-1: unknown error
-17377
Posted. Last updated.
Post not yet marked as solved
0 Replies
292 Views
When attempting to present an AVPlayerViewController without animations, the video orientation does not function as expected. However, when the animated parameter is set to true, the video orientation works correctly. The following code does not produce the desired video orientation behavior when animation is disabled:

parentViewController.present(playerViewController, animated: false)

In contrast, the desired video orientation is achieved with animation enabled:

parentViewController.present(playerViewController, animated: true)
Posted. Last updated.
Post not yet marked as solved
0 Replies
345 Views
How can the playback (forward and backward) button values be changed in AVPictureInPictureController? My backward and forward buttons have a 15-second value by default (a screenshot from my app is attached), but I've found other apps have 10 seconds (for instance, the Apple TV iOS app). In this Apple forum discussion I've read that AVPlayerViewController adapts its capabilities and controls to the asset being played. But the backward/forward values in PiP seem to stay the same for all videos, independent of duration, in both my app and the apps I've checked. I can't find a way to change them.
Posted by GipsySh. Last updated.
Post not yet marked as solved
0 Replies
401 Views
Hi, We were using the capture systems of AVKit to take photos in our app, and we need to zoom the camera to a certain limit. If we configure a zoomFactor on the AVCaptureDevice, we receive awkward video frames (blurred images) through the camera. Our app works fine on all iPhone/iPad devices except devices that support Center Stage. Looking into Apple's default Camera app, we understood that it was implemented using UIImagePickerController. We tried multiple combinations of AVCaptureDevice.Format/AVCaptureSession.Preset, but nothing helped. We want to achieve zoom (front camera) through AVKit. The code snippet we used is below; please help with this.

session.sessionPreset = AVCaptureSession.Preset.photo

var bestFormat: AVCaptureDevice.Format?
var bestFrameRateRange: AVFrameRateRange?

for format in device.formats {
    for range in format.videoSupportedFrameRateRanges {
        if range.maxFrameRate > bestFrameRateRange?.maxFrameRate ?? 0 {
            bestFormat = format
            bestFrameRateRange = range
        }
    }
}

if let bestFormat = bestFormat, let bestFrameRateRange = bestFrameRateRange {
    do {
        try device.lockForConfiguration()

        // Set the device's active format.
        device.activeFormat = bestFormat

        // Set the device's min/max frame duration.
        let duration = bestFrameRateRange.minFrameDuration
        device.activeVideoMinFrameDuration = duration
        device.activeVideoMaxFrameDuration = duration

        device.videoZoomFactor = 2.0

        device.unlockForConfiguration()
    } catch {
        // Handle error.
    }
}
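Two things that may be worth isolating here (a sketch of experiments, not a confirmed fix): Center Stage can be disabled under app control to rule it out, and zoom is often smoother when ramped and clamped to the active format's limit.

import AVFoundation

// Take app control of Center Stage and turn it off for the test.
AVCaptureDevice.centerStageControlMode = .app
AVCaptureDevice.isCenterStageEnabled = false

func applyZoom(_ factor: CGFloat, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        // Clamp to what the active format actually supports.
        let clamped = min(max(factor, 1.0), device.activeFormat.videoMaxZoomFactor)
        // Ramping avoids the abrupt jump a direct videoZoomFactor set can cause.
        device.ramp(toVideoZoomFactor: clamped, withRate: 2.0)
        device.unlockForConfiguration()
    } catch {
        print("zoom configuration failed: \(error)")
    }
}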
Posted. Last updated.
Post not yet marked as solved
0 Replies
429 Views
Hey Apple! I'm just wondering if there are any recommendations on best practices for supporting AV experiences in SwiftUI? As far as I know, VideoPlayer is the only API in AVKit (https://developer.apple.com/documentation/avkit) supported directly in SwiftUI, without the need for UIViewRepresentable / UIViewControllerRepresentable bridging of AVPlayer into SwiftUI. However, there are many core video and audio experiences that a modern audience expects which are not supported in VideoPlayer, e.g. PiP. Is there a roadmap for support in SwiftUI directly? Thanks!
Posted by em_walks. Last updated.