Integrate photo, audio, and video content into your apps.

Media Documentation

Posts under Media tag

62 Posts
Post not yet marked as solved
0 Replies
553 Views
I'm trying to use the resourceLoader of an AVAsset to progressively supply media data, but I'm unable to because the loading request asks for the full content (requestsAllDataToEndOfResource = true).

class ResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        if let ci = loadingRequest.contentInformationRequest {
            ci.contentType = "public.mpeg-4"
            ci.contentLength = // GBs
            ci.isEntireLengthAvailableOnDemand = false
            ci.isByteRangeAccessSupported = true
        }
        if let dr = loadingRequest.dataRequest {
            if dr.requestedLength > 200_000_000 {
                // memory pressure
                // dr.requestsAllDataToEndOfResource is true
            }
        }
        return true
    }
}

I also tried using a fragmented MP4 created with AVAssetWriter, but that didn't work either. Please let me know whether it's possible for AVAssetResourceLoader to not ask for the full content.
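A minimal sketch of the chunked-response approach, assuming the bytes come from a hypothetical fetchChunk(at:length:) helper (not a real API): even when requestsAllDataToEndOfResource is true, respond(with:) can be called repeatedly with partial ranges, and finishLoading() is called once enough data has been supplied.

import AVFoundation

// Hypothetical helper (not a real API): produces the next chunk of bytes for
// the resource, e.g. from disk or the network.
func fetchChunk(at offset: Int64, length: Int64, completion: @escaping (Data) -> Void) {
    completion(Data()) // placeholder
}

// Sketch: respond to a data request in partial ranges instead of satisfying
// the whole requested range at once.
func serve(_ loadingRequest: AVAssetResourceLoadingRequest) {
    guard let dataRequest = loadingRequest.dataRequest else { return }

    let offset = dataRequest.currentOffset
    let chunkSize: Int64 = 2_000_000 // ~2 MB per response

    fetchChunk(at: offset, length: chunkSize) { data in
        // Partial responses are allowed even when
        // requestsAllDataToEndOfResource is true; the request stays open
        // until finishLoading() is called.
        dataRequest.respond(with: data)

        let suppliedEnd = offset + Int64(data.count)
        let requestedEnd = dataRequest.requestedOffset + Int64(dataRequest.requestedLength)
        if suppliedEnd >= requestedEnd {
            loadingRequest.finishLoading()
        }
    }
}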
Post not yet marked as solved
2 Replies
648 Views
I know that I can uniquely identify a PHAsset on a given device using localIdentifier, but if that asset is synced (through iCloud, say) to another device, how do I uniquely identify it across multiple devices? My app allows users to store their images in the standard photo gallery, but I have no way of referring to them when users sync their app profile to another iOS device with my app installed.
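A possible direction on iOS 15 and later, sketched below under the assumption that photo-library access is already authorized: PHCloudIdentifier is meant to identify the same iCloud-synced asset across devices, and PHPhotoLibrary can map between local and cloud identifiers.

import Photos

// Sketch: map device-specific local identifiers to PHCloudIdentifier strings,
// which are intended to stay stable for the same iCloud-synced asset on
// different devices (iOS 15+). Requires photo library authorization.
func cloudIdentifierStrings(forLocalIdentifiers localIDs: [String]) -> [String] {
    let mappings = PHPhotoLibrary.shared().cloudIdentifierMappings(forLocalIdentifiers: localIDs)
    return mappings.values.compactMap { try? $0.get().stringValue } // persist these in the app profile
}

// Later, on another device, resolve the stored strings back to local identifiers.
func localIdentifiers(forCloudIdentifierStrings strings: [String]) -> [String] {
    let cloudIDs = strings.map { PHCloudIdentifier(stringValue: $0) }
    let mappings = PHPhotoLibrary.shared().localIdentifierMappings(for: cloudIDs)
    return mappings.values.compactMap { try? $0.get() }
}

The cloud identifier string is what would be stored in the synced app profile; local identifiers remain device-specific.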
Post not yet marked as solved
3 Replies
558 Views
The documentation for this API mentions: "The system uses the current representation and avoids transcoding, if possible." What are the scenarios in which transcoding takes place? The reason for asking is that we've had a user reach out saying they selected a video file from their Photos app, which resulted in a decrease in size from ~110 MB to 35 MB. We find it unlikely that it's transcoding-related, but we want to gain more insight into the possible scenarios.
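For reference, a minimal sketch of requesting the current representation so the picker avoids transcoding where it can; whether transcoding actually explains the reported 110 MB to 35 MB drop is not something this alone can confirm.

import PhotosUI

// Sketch: ask the picker for the asset's current representation so the system
// avoids transcoding where possible (e.g. handing back the original HEVC file
// instead of an H.264 re-encode made for compatibility).
func makeVideoPicker() -> PHPickerViewController {
    var config = PHPickerConfiguration(photoLibrary: .shared())
    config.filter = .videos
    config.preferredAssetRepresentationMode = .current
    return PHPickerViewController(configuration: config)
}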
Post not yet marked as solved
0 Replies
475 Views
Hi there, I'm wondering how to get the GroupSessionJournal API to work. I have gone through the WWDC23 session "Share files with SharePlay" and have been unsuccessful at getting the DrawTogether example app to sync images using GroupSessionJournal as described and shown in the session. When I run the DrawTogether example app with the GroupSessionJournal code in it, the two devices can see one another and the strokes update across both devices in real time (they use GroupSessionMessenger), but the image code doesn't cause images loaded on either side to sync to the other device. Is GroupSessionJournal still in beta, or am I missing something? Cheers! j*
Post not yet marked as solved
0 Replies
251 Views
Is there any royalty fee required if I develop a camera that saves video in the MOV format?
Post not yet marked as solved
0 Replies
368 Views
I am trying to locate information or documentation on how to pull in photos from iCloud Shared Albums, but have not been able to find anything yet. Dakboard is currently doing it, so I know it is possible, but I cannot find an API or any documentation covering how to access the photos in a Shared Album for incorporation into web applications. Can anyone help?
Post not yet marked as solved
2 Replies
347 Views
Hi, I am trying to store persistent changes to the photo library. I know I can use a PHPersistentChangeToken to get the changes since that token, but I am not sure how to serialize this object and save it into a file. I am also wondering if you can serialize a PHPersistentChangeFetchResult and store that in a file. Cheers.
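A minimal sketch of one way to persist the token, on the assumption (implied by its purpose of surviving app launches) that PHPersistentChangeToken supports NSSecureCoding, so NSKeyedArchiver can round-trip it to disk.

import Photos

// Sketch: archive the change token to a file and read it back on the next launch.
func save(_ token: PHPersistentChangeToken, to url: URL) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: token,
                                                requiringSecureCoding: true)
    try data.write(to: url, options: .atomic)
}

func loadToken(from url: URL) throws -> PHPersistentChangeToken? {
    let data = try Data(contentsOf: url)
    return try NSKeyedUnarchiver.unarchivedObject(ofClass: PHPersistentChangeToken.self,
                                                  from: data)
}

For PHPersistentChangeFetchResult, the more common pattern seems to be storing only the token and calling PHPhotoLibrary's fetchPersistentChanges(since:) again on the next launch, rather than archiving the fetch result itself.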
Post not yet marked as solved
1 Reply
496 Views
I'm working on a game which uses HDR display output for a much brighter range. One feature of the game is the ability to export in-game photos, and the only appropriate format I found for this is OpenEXR. The built-in Photos app is capable of showing HDR photos on an HDR display. However, if I drop an EXR file with a large range into Photos, it isn't properly displayed in HDR mode with the full range. At the same time, pressing Edit on the file makes it HDR-displayable, and it remains displayable if I save the edit with any change, even a tiny one. Moreover, if the broken EXR file is placed next to a 'true' HDR one (or an EXR 'fixed' as above), then during scrolling between the files, the broken EXR magically fixes itself at the exact moment the other HDR image comes up on the screen. I tested different files with various internal formats; it seems to be a common problem for all of them. Tested on the latest iOS 17.0.3. Thank you in advance.
Post not yet marked as solved
0 Replies
704 Views
Hello, I'm developing a PWA (Progressive Web App). This app needs to use some features of the Audio API. One feature is MediaSource, which I've managed to get to work on all devices except the iPhone. According to Can I use (https://caniuse.com/?search=MediaSource), it isn't supported on iPhone. Are there plans to support this in the future? The application that I'm building creates an audioContext and an audio element. After that, it creates a buffer for the type 'audio/mp4; codecs="mp4a.40.2"' and tries to fetch chunks of data into it. After the first chunk is loaded, it starts playing. It works on my Mac and iPad; I also tested it on Android.
Post not yet marked as solved
0 Replies
380 Views
I have designed a media player app in which I need to implement live TV, movies, and series, so the URLs can be of many types: .ts for live TV, .mp4, .mov, etc. I am also going to work with m3u playlists, but AVPlayer does not support all of these URLs. Can I get some suggestions and solutions for this? What would be the best practice, and how should I work with all of these kinds of URLs?
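As a rough rule, AVPlayer handles HLS streams (.m3u8) and progressive MP4/MOV, while bare MPEG-TS (.ts) files and plain .m3u playlists are not supported directly. Below is a small sketch of probing a URL before playing it; the fallback branch is an assumption about the app, not AVFoundation behaviour.

import AVFoundation

// Sketch: check whether AVFoundation can play a given URL before handing it
// to AVPlayer.
func playIfPossible(_ url: URL, with player: AVPlayer) async throws {
    let asset = AVURLAsset(url: url)
    let isPlayable = try await asset.load(.isPlayable)

    if isPlayable {
        player.replaceCurrentItem(with: AVPlayerItem(asset: asset))
        player.play()
    } else {
        // Fall back to another pipeline (for example, an HLS repackaging of
        // the .ts source) for formats AVPlayer can't handle directly.
        print("AVPlayer can't play \(url)")
    }
}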
Post not yet marked as solved
0 Replies
404 Views
We have logic in our SDK which stops playback when the outputObscuredDueToInsufficientExternalProtection event is fired by the player. Our initial understanding was that this event fires only when DRM blocks the video playback. However, in the present case we see that it is fired even when playback is successful (playback with an external screen connected). To determine whether playback still functions when the outputObscuredDueToInsufficientExternalProtection event is triggered, we temporarily disabled the playback-stop implementation that runs after the event is triggered.

code snippet -

Observations:

After this event was triggered during mirrored playback using a Lightning to HDMI connector, our expectation was that the playback would result in a black screen. However, to our surprise, the playback worked perfectly, indicating that this event is triggered even when there are no DRM restrictions for that asset's playback.

Another scenario we tested involved using a VGA connector. In this case, we observed that the outputObscuredDueToInsufficientExternalProtection event was triggered. Initially, playback started as expected when we commented out the playback-stop implementation. However, after a few seconds of playback, the screen went black.

In the first scenario, it was unexpected for the outputObscuredDueToInsufficientExternalProtection event to trigger, as playback worked without issues even after the event was triggered. In the second scenario, the event was triggered as expected. The issue we identified is that this event is triggered irrespective of the presence of DRM restrictions for the asset.

In another scenario, we attempted to distinguish between the VGA and HDMI connectors to determine whether such a distinction was possible. However, we found that the VGA cable was also recognized as an HDMI port on iOS. We also tested on an older iOS version (iOS 14.6.1) to see if the problem persisted. Surprisingly, we found that the outputObscuredDueToInsufficientExternalProtection event was triggered even on the older OS version.

Conclusion: in our analysis, we have identified that the outputObscuredDueToInsufficientExternalProtection flag always remains true even though the output is not obscured.

Working case log:
default 13:23:19.096682+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281284930> change kind = { kind = 1; new = 1; old = 0; }

Non-working case log:
default 13:45:21.356857+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281c071e0> change kind = { kind = 1; new = 1; old = 0; }

We searched through the related documentation and did a Google search, but we couldn't find any information or references related to this behavior of the outputObscuredDueToInsufficientExternalProtection event. It would be really appreciated if anyone can help us with this!
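For reference, a minimal block-based KVO sketch equivalent to the observeValueForKeyPath logging above; it only shows how the flag can be observed, not a fix for the behaviour described.

import AVFoundation

// Sketch: observe the flag with block-based KVO (the ObjC KVO key is
// "outputObscuredDueToInsufficientExternalProtection", as in the logs above).
// Whether to stop playback on this event is up to the app.
var obscuredObservation: NSKeyValueObservation?

func observeObscuredFlag(on player: AVPlayer) {
    obscuredObservation = player.observe(\.isOutputObscuredDueToInsufficientExternalProtection,
                                         options: [.old, .new]) { _, change in
        print("outputObscuredDueToInsufficientExternalProtection:",
              change.oldValue ?? false, "->", change.newValue ?? false)
    }
}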
Post not yet marked as solved
0 Replies
462 Views
Our initial understanding was that this event fires only when DRM blocks the video playback. However, in the present case we see that it is fired even when playback is successful (playback with an external screen connected). To assess whether playback remains functional when the outputObscuredDueToInsufficientExternalProtection event is triggered, we conducted two specific scenario tests: 1) playing an asset without any DRM restrictions, and 2) playing an asset with DRM restrictions.

Result: in our analysis, we have identified that the outputObscuredDueToInsufficientExternalProtection flag always remains set to one, even when playback is successful. However, it is expected to be set to zero when playback is successful.

Working case log, when playback is successful:
default 13:23:19.096682+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281284930> change kind = { kind = 1; new = 1; old = 0; }

Non-working case log, when playback came up as a black screen:
default 13:45:21.356857+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281c071e0> change kind = { kind = 1; new = 1; old = 0; }

We searched through the related documentation and did a Google search, but we couldn't find any information or references related to this behavior of the outputObscuredDueToInsufficientExternalProtection event. It would be really appreciated if anyone can help us with this!
Post not yet marked as solved
2 Replies
791 Views
We're experiencing an issue on an iPhone 15 (iOS 17.1) where some video files can't be loaded from the results of a PHPickerViewController.

results[index].itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier)

gives the error: Cannot load representation of type public.movie

Video info (taken from Mac Finder): H.264, MPEG-4 AAC, HD (1-1-1), 480x848 px, file type .MP4. Origin: recorded on an iPhone 14, sent over WhatsApp, and auto-saved from WhatsApp to the iPhone 15.

The iPhone 15 has iCloud enabled, and the failing videos are frequently viewed and used in testing, so they are likely to be downloaded/cached locally. I've tried changing the PHPickerConfiguration preferredAssetRepresentationMode to .current with no difference in the error. I've also tried using the openInPlace alternative, but it complains in the debug output that it's not supported.
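For comparison, a minimal sketch of the usual loadFileRepresentation pattern, where the provided file is copied out of its temporary location before the completion handler returns; this only shows the call shape and does not by itself explain the public.movie error.

import PhotosUI
import UniformTypeIdentifiers

// Sketch: load a movie from a PHPickerResult and copy it out of the
// provider's temporary location (only valid inside the handler).
func loadMovie(from result: PHPickerResult,
               completion: @escaping (URL?, Error?) -> Void) {
    let provider = result.itemProvider
    provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
        guard let url else {
            completion(nil, error) // "Cannot load representation of type public.movie" lands here
            return
        }
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(url.lastPathComponent)
        do {
            try? FileManager.default.removeItem(at: destination)
            try FileManager.default.copyItem(at: url, to: destination)
            completion(destination, nil)
        } catch {
            completion(nil, error)
        }
    }
}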
Post not yet marked as solved
0 Replies
410 Views
I'm trying to show likeCommand and dislikeCommand on the Lock Screen of a music player, without success. Are they still supported? Is there any special configuration on the player or the track needed for them to show? My current code, which works for playCommand, looks like this:

MPRemoteCommandCenter.shared().likeCommand.addTarget { [unowned self] _ in
    if isPlaying {
        return .success
    }
    return .commandFailed
}
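For what it's worth, likeCommand and dislikeCommand are MPFeedbackCommand instances with additional state (isEnabled, isActive, localizedTitle). Below is a sketch of configuring them more fully, on the assumption that missing configuration is the issue rather than an OS-level change in what the Lock Screen shows.

import MediaPlayer

// Sketch: configure the feedback commands more fully. Whether the Lock
// Screen surfaces them on current iOS versions is exactly the open question.
func configureFeedbackCommands(isCurrentTrackLiked: Bool) {
    let center = MPRemoteCommandCenter.shared()

    center.likeCommand.isEnabled = true
    center.likeCommand.isActive = isCurrentTrackLiked
    center.likeCommand.localizedTitle = "Like"
    center.likeCommand.addTarget { _ in
        // mark the current track as liked here
        return .success
    }

    center.dislikeCommand.isEnabled = true
    center.dislikeCommand.localizedTitle = "Dislike"
    center.dislikeCommand.addTarget { _ in .commandFailed }
}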
Post not yet marked as solved
0 Replies
453 Views
I wish to parse the bitstream of HEVC video with alpha (specific video format reference WWDC2019: https://developer.apple.com/videos/play/wwdc2019/506). Taking the 'puppets_with_alpha_hevc.mov' file from 'Using HEVC Video with Alpha' as an example, I would first extract the HEVC bitstream, then parse its fields. When it comes to the VPS field, as I reach the vps_extension, I find that the bitstream in 'puppets_with_alpha_hevc.mov' does not conform to the HEVC standard document, preventing further parsing. Besides the 'HEVC Video with Alpha Interoperability Profile.pdf', are there any more detailed documents describing the HEVC video with alpha format? Also, is there anyone who can encode or decode HEVC with alpha videos on systems other than macOS?
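One way to get at the raw VPS/SPS/PPS bytes without writing an MP4 demuxer is to read them out of the track's format description via Core Media; a sketch follows (the file URL is a placeholder for the sample movie). This at least yields the exact VPS bytes to compare against the HEVC spec and the alpha interoperability profile.

import AVFoundation
import CoreMedia

// Sketch: dump the HEVC parameter sets (VPS, SPS, PPS, ...) of the first
// video track of a movie file.
func dumpHEVCParameterSets(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return }
    let descriptions = try await track.load(.formatDescriptions)

    for desc in descriptions {
        // First call only asks how many parameter sets there are.
        var count = 0
        var status = CMVideoFormatDescriptionGetHEVCParameterSetAtIndex(desc, parameterSetIndex: 0,
            parameterSetPointerOut: nil, parameterSetSizeOut: nil,
            parameterSetCountOut: &count, nalUnitHeaderLengthOut: nil)
        guard status == noErr else { continue }

        for index in 0..<count {
            var pointer: UnsafePointer<UInt8>?
            var size = 0
            status = CMVideoFormatDescriptionGetHEVCParameterSetAtIndex(desc, parameterSetIndex: index,
                parameterSetPointerOut: &pointer, parameterSetSizeOut: &size,
                parameterSetCountOut: nil, nalUnitHeaderLengthOut: nil)
            if status == noErr, let pointer {
                let bytes = Data(bytes: pointer, count: size)
                print("parameter set \(index): \(bytes as NSData)") // VPS, SPS, PPS, ...
            }
        }
    }
}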
Post not yet marked as solved
0 Replies
468 Views
Hello, I'm currently investigating the possibility of accessing my photos stored on my iCloud via a dedicated API, in order to create a photo portfolio. However, after extensive research, I haven't found any documentation or public API allowing such access. I wonder if there are any future plans to make such an API available to third-party developers. I would be grateful if you could provide me with information regarding the possibility of accessing an API for Apple Photos or any other solution you might suggest. Thank you for your attention and assistance. Yours sincerely Owen
Post not yet marked as solved
0 Replies
530 Views
I'm involved in development on an iOS app for home security and alarm systems. Since iOS 17, there has been a lot of negative feedback from customers about how low the notification sounds are. Much of the feedback centers around the inability to control the volume of the notification sounds. My question is: if our app uses custom notification sounds, are these affected by the volume changes made in iOS 17? I know previous versions of iOS let you control the "Ringtone and Alert" volume in Settings (with a volume slider). Is this same control still available for custom notification sounds within our app?
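For context, a sketch of how a custom sound is attached to a local notification (the sound file name is a placeholder); as I understand it, custom sounds delivered this way play at the same Ringtone and Alerts volume as the default sound, which appears to be the setting the iOS 17 feedback is about.

import UserNotifications

// Sketch: schedule a local notification with a custom sound. "alarm.caf" is a
// placeholder; the file must be bundled with the app (or placed in the app's
// Library/Sounds container) and be under 30 seconds long.
func scheduleAlarmNotification() {
    let content = UNMutableNotificationContent()
    content.title = "Alarm triggered"
    content.body = "Motion detected at the front door."
    content.sound = UNNotificationSound(named: UNNotificationSoundName("alarm.caf"))

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 1, repeats: false)
    let request = UNNotificationRequest(identifier: UUID().uuidString,
                                        content: content,
                                        trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}

Critical alerts, which specify their own playback volume, are a separate mechanism and require an entitlement from Apple.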