Integrate photo, audio, and video content into your apps.

Media Documentation

Posts under Media tag

61 Posts
Post marked as solved
3 Replies
2k Views
I have a Catalyst application that uses (as expected) MPNowPlayingInfoCenter to set the now playing info and MPRemoteCommandCenter to get the media events for play/pause/stop/favorite/etc. The code is shared between iOS, tvOS, and watchOS and it works correctly there. It does not seem to work on macOS (the app is compiled as a Catalyst application) on Big Sur (or Monterey, fwiw): media keys on the keyboard start the Music app, and the music section of Control Center does not show now playing info (nor do the media controls there send messages to the app). I seem to remember that it used to work in Catalina (at least the media key part), and it definitely worked in a previous version of the same app that was a pure UIKit one. Is this a bug (worth a feedback to Apple) or something wrong on my side? Did I forget some magic capability for macOS? The app is sandboxed and uses the hardened runtime, in case that is significant. Thank you for any hint!
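For reference, a minimal sketch of the kind of setup being described, with placeholder metadata and handler bodies (this is illustrative, not the poster's actual code):

import MediaPlayer

// Publish now playing metadata.
MPNowPlayingInfoCenter.default().nowPlayingInfo = [
    MPMediaItemPropertyTitle: "Some Track",               // placeholder title
    MPMediaItemPropertyPlaybackDuration: 215.0,           // placeholder duration
    MPNowPlayingInfoPropertyElapsedPlaybackTime: 0.0,
    MPNowPlayingInfoPropertyPlaybackRate: 1.0
]

// Register for remote command events (media keys, Control Center).
let commandCenter = MPRemoteCommandCenter.shared()
commandCenter.playCommand.addTarget { _ in
    // resume playback here
    return .success
}
commandCenter.pauseCommand.addTarget { _ in
    // pause playback here
    return .success
}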
Posted
by
Post not yet marked as solved
1 Reply
1k Views
I know that if you want background audio from AVPlayer you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have all of that squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController. If you want PiP to behave as expected when you put your app into the background by switching to another app or going to the home screen, you can't perform the detachment operation, otherwise the PiP display fails. On an iPad, if PiP is active and you lock the device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses. If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the lock screen controls; this is the same on iPad and iPhone. My questions are: Is there a way to keep background audio playback going when PiP is inactive and the device is locked (iPhone and iPad)? Is there a way to keep background audio playback going when PiP is active and the device is locked (iPhone)?
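For context, the detachment being described usually looks roughly like the sketch below; the class, URL, and lifecycle method names are assumptions for illustration, not the poster's code:

import AVFoundation

final class PlayerController {
    let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/video.mp4")) // placeholder URL
    let playerLayer = AVPlayerLayer()

    func configureSession() {
        // Requires the "Audio, AirPlay, and Picture in Picture" background mode.
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .moviePlayback, options: [])
        try? AVAudioSession.sharedInstance().setActive(true)
    }

    // Detach the player from its layer on backgrounding so audio keeps playing,
    // and reattach on return. As noted above, this detachment is what conflicts with PiP.
    func didEnterBackground()  { playerLayer.player = nil }
    func willEnterForeground() { playerLayer.player = player }
}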
Posted
by
Post not yet marked as solved
1 Reply
956 Views
CVPixelBuffer.h defines:
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420', /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
But when I set either of these formats as the camera output format, I find that the output pixel buffer's values exceed the stated range: I see [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange. Is this a bug, or is something wrong with my output? If it is not a bug, how do I choose the correct matrix to convert the YUV data to RGB?
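For reference, a minimal sketch of requesting the video-range format on a capture output (assuming an AVCaptureVideoDataOutput; the session and connection setup are omitted):

import AVFoundation

let output = AVCaptureVideoDataOutput()
// Request the 8-bit bi-planar video-range format; per the CVPixelBuffer.h comment,
// delivered buffers should then keep luma in [16, 235] and chroma in [16, 240].
output.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String:
        kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
]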
Posted
by
Post not yet marked as solved
3 Replies
1.8k Views
I am trying to save an image to the user's photo library using PHPhotoLibrary and set the image file name at the time of saving using the code below. This works the first time, but if I then try to save the same image again with a different file name, it saves with the same file name as before. Is there something I need to add to let the system know to save a new version of the image with a new file name? Thank you

PHPhotoLibrary.shared().performChanges({
    let assetType: PHAssetResourceType = .photo
    let request: PHAssetCreationRequest = .forAsset()
    let createOptions: PHAssetResourceCreationOptions = PHAssetResourceCreationOptions()
    createOptions.originalFilename = "\(fileName)"
    request.addResource(with: assetType, data: image.jpegData(compressionQuality: 1)!, options: createOptions)
}, completionHandler: { success, error in
    if success == true && error == nil {
        print("Success saving image")
    } else {
        print("Error saving image: \(error!.localizedDescription)")
    }
})
Posted
by
Post not yet marked as solved
4 Replies
1.9k Views
Hi, I'm trying to add a video to my first iOS app. From the tutorials I've read online, this seemed to be a simple process of creating an AVPlayer, providing a URL to the video file, and using onAppear to start the video playing when the view is shown. Below is a simplified version of the code I'm using in my app:

import SwiftUI
import AVKit

struct ContentView: View {
    let avPlayer = AVPlayer(url: Bundle.main.url(forResource: "Intro", withExtension: "mp4")!)

    var body: some View {
        VStack {
            VideoPlayer(player: avPlayer)
                .onAppear {
                    avPlayer.play()
                }
        }
    }
}

When I run this code, the video plays, but when it finishes playing I receive the following errors in the Xcode output window:
2023-01-27 11:56:39.530526+1100 TestVideo[29859:2475750] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2023-01-27 11:56:39.676462+1100 TestVideo[29859:2475835] [TextIdentificationService] Text LID dominant failure: lidInconclusive
2023-01-27 11:56:39.676822+1100 TestVideo[29859:2475835] [VisualTranslationService] Visual isTranslatable: NO; not offering translation: lidInconclusive
2023-01-27 11:56:40.569337+1100 TestVideo[29859:2476091] Metal API Validation Enabled
I have googled each of these error messages but have not been able to find any information explaining exactly what they mean or how to eliminate them. I am using Xcode 14.2 and testing on iOS 16.2. If anyone could please point me in the right direction of how to understand and eliminate these errors, I'd really appreciate it. Thanks!
Posted
by
Post not yet marked as solved
1 Reply
967 Views
Is it still possible to author a QuickTime movie with hyperlinks? I'm building a website, and I know at one point you could author a QuickTime movie that supported links inside the video, either to other timestamps in the video or to other web pages. I don't want to use a custom player; I'd prefer to use the system-level one. I've seen a really amazing example of this on the mobile version of the Memory Alpha (Star Trek nerds!) website: there is a movie that plays at the top of pages that is fully interactive. Is that still supported? Is it possible to author that way? I'm not making anything insanely complicated, I just thought it would be a nice way to build a website with tools I'm more comfortable working in.
Posted
by
Post not yet marked as solved
0 Replies
725 Views
Bracketed photo capture with iPhone 14 Pro produces photos with oddly clipped highlights. Rather than pixel values clipping normally to white when too bright, there is a sudden and harsh jump from gray to white. See the image below, which compares a normal photo capture (using qualityPrioritization .speed) and a bracketed capture. Note the massive difference in clipping behavior. Here's how I am configuring the bracketed capture:

let bracketedStillImageSettings = AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: 0.0)
let bracketSettings = AVCapturePhotoBracketSettings(rawPixelFormatType: 0 as OSType,
                                                    processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg],
                                                    bracketedSettings: [bracketedStillImageSettings])
photoOutput.capturePhoto(with: bracketSettings, delegate: photoCaptureProcessor)

Things I've tried that don't make a difference:
Changing AVCapturePhotoBracketSettings.photoQualityPrioritization to values of .speed, .balanced, or .quality
Changing the device type: .builtInTripleCamera, .builtInUltraWideCamera, .builtInWideCamera, .builtInDualCamera, etc.
Changing the capture format from JPEG to HEVC
Any ideas?
Posted
by
Post not yet marked as solved
0 Replies
499 Views
item.loadTransferable(type: Data.self) { result in
    switch result {
    case .success(let data):
        guard let data = data, let image = UIImage(data: data) else { break }
        imageView.image = image
    case .failure(let error):
        ...
    }
}

I load a RAW-format photo with the code above, but the displayed image is very blurry and messed up. Not all RAW photos end up like this, though. What could be the reason?
Posted
by
Post not yet marked as solved
1 Reply
953 Views
Background
I am building a web app where users can talk to a microphone and watch dynamic video that changes depending on what they say. Here is a sample flow:
User accesses the page
Mic permission popup appears, user grants it
User presses the start button, the standby video starts playing and the mic turns on
User speaks something, the speech gets turned to text and analyzed, and based on that the src changes
The new video plays, the mic turns on, the loop continues
Problem
The problem on iOS is that, if mic access is granted, the volume goes down dramatically, to a level where it is hard to hear even at max volume. The iPhone's volume up/down buttons don't help much either. That results in a terrible UX. I think the OS is forcefully keeping the volume down, but I could not find any documentation about it. On the contrary, if mic permission is not granted, the volume does not change and the video plays at a normal volume.
Question
Why does this happen, and what can I do to prevent the volume from going down automatically? Any help would be appreciated. This does not happen on PC (macOS, Windows) or Android. Has anyone had a similar experience before?
Context
For context: I have two video tags (position: absolute, width and height 100%) that are switched (by toggling z-indexes) to appear one on top of the other. This is to hide load, buffer, and black screens from the user for better UX. If the next video is loaded and can play, the two are switched. Both tags have playsinline to enable inline playback as required by WebKit. Both tags start out muted; muted is removed after play starts. video.play() is initiated after the user grants mic permission.
Tech stack
Next.js with TypeScript, latest versions
Testing on latest Chrome and Safari on iOS 16, fully updated
Posted
by
Post not yet marked as solved
0 Replies
643 Views
I am currently working on a SwiftUI video app. When I load a slow-motion video recorded at 240 fps (239.68), I use "asset.loadTracks" and then ".load(.nominalFrameRate)", which returns 30 fps (29.xx), asset being AVAsset(url:). And the duration from asset.load(.duration) is also 8 times larger than the original duration. Do you know how to get the 239.68 that the Apple Photos app displays? Is it stored somewhere in the video metadata or is it computed?
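A sketch of inspecting the per-track timing, in case it helps narrow down where the value comes from (videoURL is an assumed constant, and whether minFrameDuration actually reflects the 239.68 figure would need to be verified):

import AVFoundation

let asset = AVAsset(url: videoURL) // videoURL defined elsewhere (assumed)

Task {
    if let track = try await asset.loadTracks(withMediaType: .video).first {
        let nominal = try await track.load(.nominalFrameRate)
        let minDuration = try await track.load(.minFrameDuration)
        print("nominalFrameRate:", nominal)
        print("1 / minFrameDuration:", 1.0 / minDuration.seconds)
    }
}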
Posted
by
Post not yet marked as solved
3 Replies
761 Views
https://developer.apple.com/documentation/shazamkit/shazamkit_dance_finder_with_managed_session
The song detection is successful; however, with the new APIs I can't get this demo working with SHLibrary, which is expected to display the RecentDanceRowView. I wonder if I missed any steps or if SHLibrary is not ready yet.
Posted
by
Post not yet marked as solved
1 Reply
728 Views
I am trying to create an album for an app with PhotoKit and store images in it. Is there any way to do this under the NSPhotoLibraryAddUsageDescription permission alone? At first glance, NSPhotoLibraryAddUsageDescription seems to be the right choice for this app, since I will not be loading any images. However, there are two album operations that appear to require NSPhotoLibraryUsageDescription.
Creating an album: even though creating an album does not involve loading media, it requires NSPhotoLibraryUsageDescription, i.e. asking users for permission to read their media. This is a bit unconvincing.
Saving images into the created album: before saving, you must check whether the app has already created the album, which means fetching it by name. This is where NSPhotoLibraryUsageDescription is needed. I understand that NSPhotoLibraryUsageDescription is needed for fetching, but if iOS forbade creating albums with the same name and ignored attempts to create an album with an already existing name, this fetch would not be necessary.
In summary, I just want to create an album for my app and store media in it, but in order to do so I need to ask the user for permission to read their photos, which goes against the idea that the permissions I request should be minimal and only what I need. If there is a way to do this under the NSPhotoLibraryAddUsageDescription permission I would like to know. I am new to Swift, so sorry if I am wrong.
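For reference, the two operations in question look roughly like this (a sketch with a placeholder album name; as described above, it is the fetch step that appears to require full read access):

import Photos

let albumName = "MyAppAlbum" // placeholder

// 1. Creating the album.
PHPhotoLibrary.shared().performChanges({
    PHAssetCollectionChangeRequest.creationRequestForAssetCollection(withTitle: albumName)
}, completionHandler: { success, error in
    print("created:", success, error?.localizedDescription ?? "")
})

// 2. Finding the existing album again before saving into it.
let existing = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .any, options: nil)
existing.enumerateObjects { collection, _, _ in
    if collection.localizedTitle == albumName {
        // add the new asset to this collection
    }
}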
Posted
by
Post not yet marked as solved
0 Replies
442 Views
Are there plans to expose the cinematic frames (e.g. disparity) to an AVAsynchronousCIImageFilteringRequest? I want to use my own lens blur shader on the cinematic frames. Right now it looks like the cinematic frames are only available in an AVAsynchronousVideoCompositionRequest, like this:

guard let sourceFrame = SourceFrame(request: request, cinematicCompositionInfo: cinematicCompositionInfo) else { return }
let disparity = sourceFrame.disparityBuffer
Posted
by
Post not yet marked as solved
0 Replies
722 Views
If I disable the playback controls for an AVPlayer (showsPlaybackControls), some features of MPNowPlayingInfoCenter no longer work (play/pause, skip forward and backward). I need custom video and audio controls on my AVPlayer in my app; that's why I disabled the iOS playback controls. But I also need the features of MPNowPlayingInfoCenter. Is there another solution to achieve this?
Posted
by
Post not yet marked as solved
3 Replies
1.3k Views
Hi community, I'm developing an application for macOS and I need to capture the mic audio stream. Currently, using Core Audio in Swift, I am able to capture the audio stream using IO procs, and I have applied AUVoiceProcessing to prevent echo from the speaker device. I was able to connect the audio unit and perform the echo cancellation. The problem I'm getting is that when I use AUVoiceProcessing, the gain of the two devices gets reduced, which affects the volume of both devices (microphone and speaker). I have tried to disable the AGC using the property kAUVoiceIOProperty_VoiceProcessingEnableAGC, but the results are the same. Is there any option to disable the gain reduction, or is there a better approach to get the echo cancellation working?
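For anyone following along, the AGC attempt mentioned above would typically look something like this (a sketch; voiceIOUnit is an assumed, already-created VoiceProcessingIO audio unit, and the scope/element may need adjusting):

import AudioToolbox

// Try to disable automatic gain control on the voice-processing IO unit.
// voiceIOUnit: AudioUnit created elsewhere with the VoiceProcessingIO subtype (assumed).
var enableAGC: UInt32 = 0
let status = AudioUnitSetProperty(voiceIOUnit,
                                  kAUVoiceIOProperty_VoiceProcessingEnableAGC,
                                  kAudioUnitScope_Global,
                                  0,
                                  &enableAGC,
                                  UInt32(MemoryLayout<UInt32>.size))
// A nonzero status means the property could not be set.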
Posted
by
Post not yet marked as solved
0 Replies
562 Views
Hi, Replay doesn't work for HLS videos (.m3u8) on iOS 16 Safari when you get to the end of a video. It works for .mp4s, .movs, etc. I have written a GitHub issue on the video.js repo here: https://github.com/videojs/video.js/issues/8345 But I'm starting to think it's the new native iOS 16 player that is causing the issue and not the library itself.
Posted
by
Post not yet marked as solved
1 Reply
937 Views
I was playing a PCM (24 kHz, 16-bit) file with AudioQueue, mixing it with another sound (192 kHz, 24-bit), called sound2. The AVAudioSession is set up with category AVAudioSessionCategoryPlayback and options AVAudioSessionCategoryOptionMixWithOthers | AVAudioSessionCategoryOptionDuckOthers. When the PCM plays, sound2 should be ducked to a lower volume as configured. But there is an absolute mute of about 0.5 seconds when sound2 is ducked, and only after that 0.5 s mute does the ducked sound come through. This only happens in the sound2 situation (192 kHz, 24-bit); if sound2 is of lower quality, everything is OK.
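The session setup being described is roughly this (a minimal sketch of the stated category and options, not the poster's full code):

import AVFoundation

let session = AVAudioSession.sharedInstance()
// Playback category, mixing with and ducking other audio, as described above.
try? session.setCategory(.playback, options: [.mixWithOthers, .duckOthers])
try? session.setActive(true)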
Posted
by