Integrate photo, audio, and video content into your apps.

Media Documentation

Posts under Media tag

61 Posts
Post not yet marked as solved
0 Replies
93 Views
Hello, we have noticed a change in the last few weeks in how Mail Privacy Protection (MPP) is operating. Specifically, MPP pre-caches images within email newsletters that are protected via Private Relay. The end result of the pre-caching is that every image in the newsletter is retrieved from our servers even if the user does not open the newsletter. This has been in place since 2021. What we've noticed in the last month or so is that the amount of pre-caching has dropped significantly, on the order of 20-25%. We can compare this with newsletters opened in non-MPP environments to confirm that email sends are consistent; it is only the pre-cache events that seem to have changed. Does anyone know of any changes to the logic of Private Relay / MPP that would impact how it pre-caches data from email newsletters? Thank you.
Posted by mpisula. Last updated.
Post not yet marked as solved
1 Reply
416 Views
I'm working on a game that uses HDR display output for a much brighter range. One of the features of the game is the ability to export in-game photos, and the only appropriate format I found for this is OpenEXR. The built-in Photos app is capable of showing HDR photos on an HDR display. However, if I drop an EXR file with a large range into Photos, it is not properly displayed in HDR mode with the full range. At the same time, pressing Edit on the file makes it HDR-displayable, and it remains displayable if I save the edit with any change, even a tiny one. Moreover, if the EXR file is placed next to a 'true' HDR photo (or an EXR 'fixed' as above), then during a scroll between the files, the broken EXR magically fixes itself at the exact moment the other HDR image slides onto the screen. I tested different files with various internal formats; it seems to be a common problem for all of them. Tested on the latest iOS 17.0.3. Thank you in advance.
Posted by Goryh. Last updated.
Post not yet marked as solved
0 Replies
93 Views
Hello everyone! I'm new to this forum and I have a question about playing MP3 files from our homepage on Apple devices. We have a service section with the latest news, traffic, and weather on our site. The problem is that the MP3 files can be played, but the player shows 0:00 and cannot seek to the middle or end of the file. This happens ONLY on iPhone, in our app and in Safari. I am recording the MP3 with ffmpeg. The strange thing is that no bitrate is shown in the file details, and no length either. The file is auto-recorded with a script.
Posted by TheBroon. Last updated.
Post not yet marked as solved
4 Replies
232 Views
iOS 17.4.1, iPhone 15 Pro. I pick photos from the user's photo library using:

    ...
    .photosPicker(isPresented: $addPhotos, selection: $pickedPhotos, matching: .images)
    .onChange(of: pickedPhotos) {
        import(photoItems: pickedPhotos)
    }

The picker UI works OK, but then when I import the photos:

    private func import(photoItems: [PhotosPickerItem]) {
        for photoItem in photoItems {
            Log.debug("picked: \(photoItem)")
            Task {
                do {
                    let imageData = try await photoItem.loadTransferable(type: Data.self)
                    guard let imageData else {
                        Log.error("failed to load image data")
                        return
                    }
                    guard let image = UIImage(data: imageData) else {
                        Log.error("failed to create image from data")
                        return
                    }
                    // use image ...
                } catch {
                    Log.error("failed to load image data: \(error)")
                }
            }
        }
    }

Logging the picked photo gives:

    PhotosPickerItem(_itemIdentifier: "C7E2F753-43F6-413D-BA42-509C60BE9D77/L0/001", _shouldExposeItemIdentifier: false, _supportedContentTypes: [<_UTCoreType 0x1ebcd1c10> public.jpeg (not dynamic, declared), <_UTCoreType 0x1ebcd1d70> public.heic (not dynamic, declared), <UTType 0x300fe0430> com.apple.private.photos.thumbnail.standard (not dynamic, declared), <UTType 0x300fe03f0> com.apple.private.photos.thumbnail.low (not dynamic, declared)], _itemProvider: <PUPhotosFileProviderItemProvider: 0x303fdff00> {types = ( "public.jpeg", "public.heic", "com.apple.private.photos.thumbnail.standard", "com.apple.private.photos.thumbnail.low" )})

Looks like there's a valid photo? But then the loadTransferable() call fails with:

    5C9D59CB-3606-48C1-9B37-1F18D642B3AD grantAccessClaim reply is an error: Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x308244f30 {Error Domain=PFPAssetRequestErrorDomain Code=0 "The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)" UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/36CF50FB-38FC-440E-9662-35C23B5E636C/File%20Provider%20Storage/photospicker/uuid=C7E2F753-43F6-413D-BA42-509C60BE9D77&library=1&type=1&mode=2&loc=true&cap=true.jpeg, NSLocalizedDescription=The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)}}}
    Error loading public.data: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.jpeg" UserInfo={NSLocalizedDescription=Cannot load representation of type public.jpeg, NSUnderlyingError=0x3081a2550 {Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x308244f30 {Error Domain=PFPAssetRequestErrorDomain Code=0 "The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)" UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/36CF50FB-38FC-440E-9662-35C23B5E636C/File%20Provider%20Storage/photospicker/uuid=C7E2F753-43F6-413D-BA42-509C60BE9D77&library=1&type=1&mode=2&loc=true&cap=true.jpeg, NSLocalizedDescription=The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)}}}}}
    2024-04-03 12:16:07.8010 error PhotosView.import: failed to load image data: importNotSupported

As usual I rebooted my phone, as these things tend to be pretty buggy in iOS, but I get the same error. Note this is not in the simulator (which seems to have long-standing bugs related to photo picking); this is on a freshly upgraded 17.4.1 device. I can't find any documentation related to these errors, and all googling turns up a few other cases but no solutions. Should this API actually work, or is it better to go back to the old UIKit stuff?
I use loadTransferable(type: Data.self) because UIImage.self is not Transferable, and this workaround has seemed to work OK for some months.
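For what it's worth, one workaround pattern (a sketch, not a confirmed fix for the 4101/helper error above) is a small Transferable wrapper so the picker item decodes straight into a UIImage instead of going through raw Data; the type name PickedImage below is made up for illustration.

    import CoreTransferable
    import UniformTypeIdentifiers
    import UIKit

    // Hypothetical wrapper type so loadTransferable can return a decoded image.
    struct PickedImage: Transferable {
        let image: UIImage

        static var transferRepresentation: some TransferRepresentation {
            DataRepresentation(importedContentType: .image) { data in
                guard let image = UIImage(data: data) else {
                    throw CocoaError(.fileReadCorruptFile)
                }
                return PickedImage(image: image)
            }
        }
    }

    // Usage: let image = try await photoItem.loadTransferable(type: PickedImage.self)?.image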
Post not yet marked as solved
1 Reply
181 Views
I'm trying to create a new playlist on the device that appears in Apple Music, and to add a selection of MP3s to it from a small iOS app. The MP3s are either a stream of bytes or a flat file already stored on the device (the app itself generates these; they aren't downloaded, they are created in-app and then stored in its local storage space). The idea is that created tracks can show up in a specific playlist on the device. There appears to be some confusion as to which framework I need to use. I've found MPMediaPlayer, which appears to allow me to create a playlist using the getPlaylist call, although the documentation seems pretty sparse and there aren't many examples of how to use it. It looks like a UUID is passed in, but there is no documentation on what this UUID is or where it comes from. If I want to create a new playlist, I presume I need to generate a UUID and then store it locally in order to access that playlist again later, yes? There's an addItem call that looks like the way to add a track to a playlist, but there's no documentation on how to generate an entry. The documentation for this function talks about a product ID without describing what the product ID is or where it needs to come from. Is this a GUID? Is it a name/description? Does it have to be unique? I'm assuming this product ID refers to the item being added to the playlist, but the documentation doesn't explain what it refers to: is it a media item, or is that what is created when whatever entity the product ID refers to is added to the playlist? I can create an NSURL of the stored file that is actually the MP3 sample, but what I do with that in order to add it as a playlist entry is unknown. I'm sure there is a mechanism to do this; it's just not clear what it is, and some illumination of the process would be helpful.
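For reference, a hedged sketch of the MediaPlayer flow being asked about: the UUID is one you generate and persist yourself (the UserDefaults key "appPlaylistUUID" and the playlist name below are illustrative), getPlaylist(with:creationMetadata:) creates the playlist on first use and returns the same one for that UUID later, and addItem(withProductID:) expects an Apple Music / iTunes Store product identifier, so it likely does not cover locally generated MP3 files.

    import MediaPlayer

    // A sketch, not a full solution: the UUID is app-generated and persisted so
    // the same playlist can be reopened later; productID is a store identifier.
    func appendStoreTrackToAppPlaylist(productID: String) {
        let defaults = UserDefaults.standard
        let uuid: UUID
        if let stored = defaults.string(forKey: "appPlaylistUUID"),
           let existing = UUID(uuidString: stored) {
            uuid = existing
        } else {
            uuid = UUID()
            defaults.set(uuid.uuidString, forKey: "appPlaylistUUID")
        }

        let metadata = MPMediaPlaylistCreationMetadata(name: "My App Tracks")
        MPMediaLibrary.default().getPlaylist(with: uuid, creationMetadata: metadata) { playlist, error in
            guard let playlist else {
                print("getPlaylist failed: \(String(describing: error))")
                return
            }
            // Note: this takes an Apple Music / store product ID, not a local file URL.
            playlist.addItem(withProductID: productID) { error in
                if let error {
                    print("addItem failed: \(error)")
                }
            }
        }
    }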
Post not yet marked as solved
0 Replies
302 Views
Hi guys, I'm implementing FairPlay support for a video streaming application. I've managed to get as far as generating the SPC and acquiring a license from the license server. However, when it comes to parsing the license (CKC) returned from the server, the FPS module returns error code -42671. Has anyone else faced this before and/or knows what the fix is? I thought passing it the license should be enough, unless additional data is required?
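In case it helps, a minimal sketch of handing the CKC back, assuming an AVContentKeySession flow and that ckcData is the raw license bytes from the key server; -42671 is sometimes reported when the CKC is still base64- or JSON-wrapped or does not match the SPC's key request, so confirming the bytes are the decoded payload may be worth a try.

    import AVFoundation

    // Sketch of the happy path only: pass the decoded CKC bytes straight back
    // to the key request that produced the SPC.
    func handle(keyRequest: AVContentKeyRequest, ckcData: Data) {
        let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckcData)
        keyRequest.processContentKeyResponse(response)
    }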
Posted by ThetaSeg. Last updated.
Post not yet marked as solved
0 Replies
267 Views
Is there any way to play panoramic or 360° videos in an immersive space without using VideoMaterial on a sphere? I've tried local videos in 4K and 8K quality, and all of them look pixelated with this approach. I tried both the simulator and the real device, and I can never get high-quality playback. If the video is played in a regular 2D player, on the other hand, it shows the expected quality.
Post marked as solved
1 Reply
472 Views
This is my H5 code:

    <video id="myVideo" src="xxxapp://***.***.xx/***/***.mp4" style="object-fit:cover;opacity:1;width:100%;height:100%;display:block;position:absolute;" type="video/mp4"></video>

I want to load a large local video, so I use WKURLSchemeHandler:

    - (void)webView:(WKWebView *)webView startURLSchemeTask:(id<WKURLSchemeTask>)urlSchemeTask {
        NSURLRequest *request = [urlSchemeTask request];
        NSURL *url = request.URL;
        NSString *urlString = url.absoluteString;
        NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"***" ofType:@"mp4"];
        NSData *videoData = [NSData dataWithContentsOfFile:videoPath options:0 error:nil];
        NSURLResponse *response = [[NSURLResponse alloc] initWithURL:url MIMEType:@"video/mp4" expectedContentLength:videoData.length textEncodingName:nil];
        [urlSchemeTask didReceiveResponse:response];
        [urlSchemeTask didReceiveData:videoData];
        [urlSchemeTask didFinish];
    }

But it doesn't work: the data is not nil, but the video does not play. I would greatly appreciate it if someone could help me find a solution! PS: can make it, but we cannot use it due to some reasons.
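One possibility, offered as an assumption rather than a confirmed diagnosis: the <video> element usually issues byte-range requests, and a scheme handler that always returns the whole file with a plain NSURLResponse may not satisfy them. Below is a hedged Swift sketch (the bundled file name "sample.mp4" is a placeholder) that answers Range requests with 206 Partial Content responses.

    import WebKit

    // Sketch: serve byte ranges for a bundled video over a custom scheme.
    final class VideoSchemeHandler: NSObject, WKURLSchemeHandler {
        func webView(_ webView: WKWebView, start urlSchemeTask: WKURLSchemeTask) {
            guard let url = urlSchemeTask.request.url,
                  let path = Bundle.main.path(forResource: "sample", ofType: "mp4"),
                  let data = FileManager.default.contents(atPath: path) else {
                urlSchemeTask.didFailWithError(URLError(.fileDoesNotExist))
                return
            }

            // Parse a "Range: bytes=start-end" header if the web view sent one.
            var start = 0
            var end = data.count - 1
            let rangeHeader = urlSchemeTask.request.value(forHTTPHeaderField: "Range")
            if let rangeHeader, let bytes = rangeHeader.components(separatedBy: "=").last {
                let parts = bytes.components(separatedBy: "-")
                start = Int(parts.first ?? "") ?? 0
                end = Int(parts.last ?? "") ?? end
                if end >= data.count { end = data.count - 1 }
            }
            let chunk = data.subdata(in: start..<(end + 1))

            let headers = [
                "Content-Type": "video/mp4",
                "Accept-Ranges": "bytes",
                "Content-Length": "\(chunk.count)",
                "Content-Range": "bytes \(start)-\(end)/\(data.count)"
            ]
            // 206 Partial Content when a range was requested, 200 otherwise.
            let status = rangeHeader == nil ? 200 : 206
            guard let response = HTTPURLResponse(url: url, statusCode: status,
                                                 httpVersion: "HTTP/1.1", headerFields: headers) else {
                urlSchemeTask.didFailWithError(URLError(.badServerResponse))
                return
            }
            urlSchemeTask.didReceive(response)
            urlSchemeTask.didReceive(chunk)
            urlSchemeTask.didFinish()
        }

        func webView(_ webView: WKWebView, stop urlSchemeTask: WKURLSchemeTask) {}
    }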
Post not yet marked as solved
1 Reply
926 Views
CVPixelBuffer.h defines:

    kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
    kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420', /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */

But when I set the above formats for the camera output, I find that the output pixel buffer's values exceed the stated ranges: I can see [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange. Is this a bug, or is something wrong with the output? If not, how can I choose the correct matrix to convert the YUV data to RGB?
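On the matrix question: the conversion to use is normally indicated by the buffer's kCVImageBufferYCbCrMatrixKey attachment rather than guessed. As an illustration only, here is a sketch of the BT.709 video-range conversion for the 8-bit case; values outside the nominal ranges can appear in practice and are simply clamped after conversion.

    // BT.709, video range: luma is nominally [16, 235], chroma [16, 240].
    func bt709VideoRangeToRGB(y: UInt8, cb: UInt8, cr: UInt8) -> (r: UInt8, g: UInt8, b: UInt8) {
        let yf  = (Float(y)  - 16.0)  / 219.0   // normalize video-range luma
        let cbf = (Float(cb) - 128.0) / 224.0   // normalize video-range chroma
        let crf = (Float(cr) - 128.0) / 224.0

        let r = yf + 1.5748 * crf
        let g = yf - 0.1873 * cbf - 0.4681 * crf
        let b = yf + 1.8556 * cbf

        func clamp(_ v: Float) -> UInt8 { UInt8(max(0, min(255, v * 255.0 + 0.5))) }
        return (clamp(r), clamp(g), clamp(b))
    }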
Posted by vrsure. Last updated.
Post not yet marked as solved
0 Replies
245 Views
Hello, I have a song on Apple Music. When I search for this song on Shazam, I want it to appear with a clip like the link I provided below. Is there any way you can help with this? Example: https://www.youtube.com/watch?v=St8smx2q1Ho My music: https://music.apple.com/us/album/tam-ba%C4%9F%C4%B1ms%C4%B1z-t%C3%BCrkiye/1689395789?i=1689395790 Thanks.
Posted by yasirb_. Last updated.
Post not yet marked as solved
1 Reply
278 Views
Hello, we are embedding a PHPickerViewController with UIKit (adding the view controller as a child, embedding its view, calling didMoveToParent) in our app using the compact mode. We are disabling the following capabilities: .collectionNavigation, .selectionActions, .search. One of our users on iOS 17.2.1 with an iPhone 12 encountered a crash with the following stack trace:

    Crashed: com.apple.main-thread
    0  libsystem_kernel.dylib  0x9fbc __pthread_kill + 8
    1  libsystem_pthread.dylib 0x5680 pthread_kill + 268
    2  libsystem_c.dylib       0x75b90 abort + 180
    3  PhotoFoundation         0x33b0 -[PFAssertionPolicyCrashReport notifyAssertion:] + 66
    4  PhotoFoundation         0x3198 -[PFAssertionPolicyComposite notifyAssertion:] + 160
    5  PhotoFoundation         0x374c -[PFAssertionPolicyUnique notifyAssertion:] + 176
    6  PhotoFoundation         0x2924 -[PFAssertionHandler handleFailureInFunction:file:lineNumber:description:arguments:] + 140
    7  PhotoFoundation         0x3da4 _PFAssertFailHandler + 148
    8  PhotosUI                0x22050 -[PHPickerViewController _handleRemoteViewControllerConnection:extension:extensionRequestIdentifier:error:completionHandler:] + 1356
    9  PhotosUI                0x22b74 __66-[PHPickerViewController _setupExtension:error:completionHandler:]_block_invoke_3 + 52
    10 libdispatch.dylib       0x26a8 _dispatch_call_block_and_release + 32
    11 libdispatch.dylib       0x4300 _dispatch_client_callout + 20
    12 libdispatch.dylib       0x12998 _dispatch_main_queue_drain + 984
    13 libdispatch.dylib       0x125b0 _dispatch_main_queue_callback_4CF + 44
    14 CoreFoundation          0x3701c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
    15 CoreFoundation          0x33d28 __CFRunLoopRun + 1996
    16 CoreFoundation          0x33478 CFRunLoopRunSpecific + 608
    17 GraphicsServices        0x34f8 GSEventRunModal + 164
    18 UIKitCore               0x22c62c -[UIApplication _run] + 888
    19 UIKitCore               0x22bc68 UIApplicationMain + 340
    20 WorkAngel               0x8060 main + 20 (main.m:20)
    21 ???                     0x1bd62adcc (Missing)

Please share if you have any ideas as to what might have caused that, or what to look at in such a case. Unfortunately, I haven't been able to reproduce this myself.
Post not yet marked as solved
1 Reply
249 Views
I use a PHPickerViewController for the user to select profile images, and I want the order in which the photos were selected to be respected. However, the returned results do not reflect the order in which I selected them. I've double-checked my configuration, but no solution has worked, and I would appreciate any guidance that does not involve a third-party library! The Apple documentation states that simply setting the .selection property to .ordered should respect the user's selected order, but it does not!

    // Setup code
    var config = PHPickerConfiguration()
    config.selectionLimit = 3
    config.filter = .images
    config.selection = .ordered
    let picker = PHPickerViewController(configuration: config)
    picker.delegate = self

    // Delegate handler
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        guard !results.isEmpty else {
            picker.dismiss(animated: true)
            return
        }
        self.photos = []
        var tempImages: [Int: UIImage] = [:]
        let dispatchGroup = DispatchGroup()
        for (index, result) in results.enumerated() {
            dispatchGroup.enter() // Enter the group
            result.itemProvider.loadObject(ofClass: UIImage.self) { [weak self] object, error in
                defer { dispatchGroup.leave() }
                guard let self = self else { return }
                if let image = object as? UIImage {
                    tempImages[index] = image
                }
            }
        }
        dispatchGroup.notify(queue: .main) { [weak self] in
            guard let self = self else { return }
            for index in 0..<tempImages.keys.count {
                if let image = tempImages[index] {
                    self.photos?.append(image)
                }
            }
        }
        picker.dismiss(animated: true)
    }
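If it helps, one alternative (a workaround sketch, not a documented fix) is to load the results sequentially with async/await so the output array is appended in the same order as the results array, at the cost of some loading speed:

    import PhotosUI
    import UIKit

    // Loads each result one at a time so the output order matches the input order.
    func loadImagesInOrder(from results: [PHPickerResult]) async -> [UIImage] {
        var images: [UIImage] = []
        for result in results {
            let image: UIImage? = await withCheckedContinuation { (continuation: CheckedContinuation<UIImage?, Never>) in
                result.itemProvider.loadObject(ofClass: UIImage.self) { object, _ in
                    continuation.resume(returning: object as? UIImage)
                }
            }
            if let image {
                images.append(image)
            }
        }
        return images
    }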
Posted by ccynn22. Last updated.
Post not yet marked as solved
1 Reply
970 Views
I know that if you want background audio from AVPlayer, you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have that all squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController. If you want PiP to behave as expected when you put your app into the background (by switching to another app or going to the home screen), you can't perform the detachment operation, otherwise the PiP display fails. On an iPad, if PiP is active and you lock your device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses. If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the lock-screen controls; this is the same on iPad and iPhone. My questions are: Is there a way to keep background-audio playback going when PiP is inactive and the device is locked (iPhone and iPad)? Is there a way to keep background-audio playback going when PiP is active and the device is locked (iPhone)?
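For reference, a minimal sketch of the audio-session setup the post refers to, assuming a .playback category; the lock-screen differences described above are observed even with a configuration like this, so it is shown only for context:

    import AVFoundation

    // Configure the shared audio session for long-form playback.
    func configureBackgroundAudio() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .moviePlayback)
        try session.setActive(true)
    }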
Posted by jblaker. Last updated.
Post not yet marked as solved
0 Replies
645 Views
Hello. Does anyone have any ideas on how to work with the new iOS 17 Live Photos? I can save the Live Photo, but I can't set it as wallpaper; I get the error "Motion is not available in iOS 17". There are already applications that allow you to do this (VideoToLive and the like). What should I use to implement this in Swift? Most likely the metadata needs to be changed, but I'm not sure.
Posted by MRKIOS. Last updated.
Post not yet marked as solved
2 Replies
393 Views
Hello! I'm trying to save videos asynchronously. I've already used performChanges without the completion handler, but it didn't work. Can you give me an example? Consider that the variable with the file URL is named fileURL. What would this look like asynchronously?
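A minimal sketch of the async form, assuming fileURL points at a video file the app can read; PHPhotoLibrary.performChanges(_:) has an async throwing variant (iOS 15 and later):

    import Photos

    // Saves the video at fileURL into the photo library and throws on failure.
    func saveVideo(at fileURL: URL) async throws {
        try await PHPhotoLibrary.shared().performChanges {
            _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
        }
    }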
Post not yet marked as solved
1 Reply
550 Views
    if UIImagePickerController.isSourceTypeAvailable(.camera) {
        let imagePicker = UIImagePickerController()
        imagePicker.delegate = self
        imagePicker.allowsEditing = false
        imagePicker.sourceType = .camera
        self.present(imagePicker, animated: true, completion: nil)
    }

This code crashes on an M2 Mac (Designed for iPad) with the following exception:

    <<<< FigCaptureCameraParameters >>>> Fig assert: "success" at bail (FigCaptureCameraParameters.m:249) - (err=0)
    An uncaught exception was raised
    *** -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]: attempt to insert nil object from objects[0]
    (
    0  CoreFoundation          0x0000000180b02800 __exceptionPreprocess + 176
    1  libobjc.A.dylib         0x00000001805f9eb4 objc_exception_throw + 60
    2  CoreFoundation          0x0000000180a1a724 -[__NSPlaceholderDictionary initWithObjects:forKeys:count:] + 728
    3  CoreFoundation          0x0000000180a1a420 +[NSDictionary dictionaryWithObjects:forKeys:count:] + 52
    4  AVFCapture              0x000000019de90374 -[AVCaptureFigVideoDevice _cameraInfo] + 200
    5  AVFCapture              0x000000019de90278 -[AVCaptureFigVideoDevice updateStreamingDeviceHistory] + 36
    6  AVFCapture              0x000000019deec8c0 -[AVCaptureSession _startFigCaptureSession] + 464
    7  AVFCapture              0x000000019def0980 -[AVCaptureSession _buildAndRunGraph:] + 1936
    8  AVFCapture              0x000000019deecc00 -[AVCaptureSession _setRunning:] + 120
    9  AVFCapture              0x000000019deec46c -[AVCaptureSession startRunning] + 452
    10 libRPAC.dylib           0x00000001051c9024 _replacement_AVCaptureSession_startRunning + 104
    11 libdispatch.dylib       0x000000010509cf14 _dispatch_call_block_and_release + 32
    12 libdispatch.dylib       0x000000010509eb4c _dispatch_client_callout + 20
    13 libdispatch.dylib       0x00000001050a7cd8 _dispatch_lane_serial_drain + 864
    14 libdispatch.dylib       0x00000001050a8dcc _dispatch_lane_invoke + 416
    15 libdispatch.dylib       0x00000001050b877c _dispatch_root_queue_drain_deferred_wlh + 652
    16 libdispatch.dylib       0x00000001050b7a54 _dispatch_workloop_worker_thread + 444
    17 libsystem_pthread.dylib 0x0000000105147d9c _pthread_wqthread + 288
    18 libsystem_pthread.dylib 0x000000010514fab4 start_wqthread + 8
    )
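One defensive workaround, offered as an assumption rather than a fix for the underlying AVFCapture issue: detect the "Designed for iPad" on Mac case with ProcessInfo.isiOSAppOnMac and avoid presenting the camera there.

    import UIKit

    // Skips the camera when the iOS build is running on Apple silicon Macs,
    // where the capture path above has been observed to throw.
    func presentCameraPicker(from viewController: UIViewController & UIImagePickerControllerDelegate & UINavigationControllerDelegate) {
        guard UIImagePickerController.isSourceTypeAvailable(.camera),
              !ProcessInfo.processInfo.isiOSAppOnMac else {
            // Fall back to the photo library (or hide the camera option) on Mac.
            return
        }
        let picker = UIImagePickerController()
        picker.delegate = viewController
        picker.sourceType = .camera
        picker.allowsEditing = false
        viewController.present(picker, animated: true)
    }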
Post not yet marked as solved
0 Replies
473 Views
Animated AVIF is rendered slowly in Safari. Tested with a MacBook Pro (16", 2019) and Safari (Version 17.0 - 19616.1.27.211.1), and also on several iPhone models (14, 15 Pro) via BrowserStack. When using the same MacBook Pro with Chrome (Version 120.0.6099.129), it renders fine. Example (720p @ 25 FPS): https://res.cloudinary.com/yaronshmueli/image/upload/cases/animated_AVIF_Apple/world_flight_fast_decode_tile_clmn_btiolg.avif
Posted by yaronsh. Last updated.
Post not yet marked as solved
1 Reply
1.1k Views
Starting with Sonoma 14.2, it is no longer possible to connect Canon cameras to an app via USB using Canon's EDSDK framework. This worked fine up to Sonoma 14.1. The app using the EDSDK is not crashing, but the SDK no longer reports any connected cameras. The camera is connected and can be seen in the System Report, as well as in e.g. gphoto2 and even in the EOS Utility software. It seems that 14.2 introduced some breaking change to camera access from within apps. I've tried upgrading to the newest EDSDK version and checked with and without App Sandbox. There is no way to find the camera on 14.2.
Posted by dom1001. Last updated.