Media Player


Find and play songs, audio podcasts, audio books, and more from within your app using Media Player.

Media Player Documentation

Posts under Media Player tag

85 Posts
Post not yet marked as solved
0 Replies
404 Views
Our SDK contains logic that stops playback when the player fires the outputObscuredDueToInsufficientExternalProtection event. Our initial understanding was that this event is fired only when DRM blocks the video playback. However, in the present case we see that it fires even when playback is successful (playback with an external screen connected). To determine whether playback still functions when the outputObscuredDueToInsufficientExternalProtection event is triggered, we temporarily disabled the playback-stop implementation that normally runs after the event. code snippet -

Observations: After this event was triggered during mirrored playback using a Lightning to HDMI connector, we expected the playback to result in a black screen. To our surprise, playback worked perfectly, indicating that this event is triggered even when there are no DRM restrictions on the asset. In another scenario, using a VGA connector, the outputObscuredDueToInsufficientExternalProtection event was also triggered. With the playback-stop implementation commented out, playback initially started as expected, but after a few seconds the screen went black.

In the first scenario it was unexpected for the event to trigger, since playback continued without issue afterwards; in the second scenario the event behaved as expected. The issue we have identified is that the event is triggered regardless of whether the asset has DRM restrictions. We also attempted to distinguish the VGA connector from the HDMI connector, but found that iOS reports the VGA cable as an HDMI port as well. Finally, we tested on an older iOS version (iOS 14.6.1) to see if the problem persisted and, surprisingly, the outputObscuredDueToInsufficientExternalProtection event was triggered there too.

Conclusion: in our analysis, the outputObscuredDueToInsufficientExternalProtection flag always remains true even though the output is not obscured.

Working case log:
default 13:23:19.096682+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281284930> change kind = { kind = 1; new = 1; old = 0; }

Non-working case log:
default 13:45:21.356857+0530 AMC ||| observeValueForKeyPath = "outputObscuredDueToInsufficientExternalProtection" object = <AVPlayer: 0x281c071e0> change kind = { kind = 1; new = 1; old = 0; }

We searched the related documentation and the web, but could not find any information about this behavior of the outputObscuredDueToInsufficientExternalProtection event. It would be really appreciated if anyone can help us with this!
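For reference, a minimal Swift sketch of observing this flag with block-based KVO, mirroring the string-based observation in the logs above (the player and stream URL are placeholders):

    import AVFoundation

    // Placeholder player; substitute the AVPlayer the SDK manages.
    let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
    var obscuredObservation: NSKeyValueObservation?

    obscuredObservation = player.observe(\.isOutputObscuredDueToInsufficientExternalProtection,
                                         options: [.old, .new]) { _, change in
        // Log the transition before deciding whether to stop playback, so transient
        // triggers (e.g. while mirroring over HDMI) can be distinguished from real DRM blocks.
        print("obscured changed: \(change.oldValue ?? false) -> \(change.newValue ?? false)")
    }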
Posted by vinay1234. Last updated.
Post not yet marked as solved
0 Replies
354 Views
The application gets stuck when a video has just started playing in AVPlayer and the iPhone receives a phone call. I have tried to handle this case in the applicationWillResignActive: delegate method, but I do not receive that callback when a phone call comes in. Please help.
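For reference, a minimal sketch (not a confirmed fix): incoming phone calls reach audio apps as AVAudioSession interruptions rather than through applicationWillResignActive:, so observing the interruption notification is the usual place to pause and resume an AVPlayer. The player variable here is a placeholder:

    import AVFoundation

    NotificationCenter.default.addObserver(
        forName: AVAudioSession.interruptionNotification,
        object: AVAudioSession.sharedInstance(),
        queue: .main
    ) { note in
        guard let info = note.userInfo,
              let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }

        switch type {
        case .began:
            player.pause()                                  // call started
        case .ended:
            let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
            if AVAudioSession.InterruptionOptions(rawValue: optionsValue).contains(.shouldResume) {
                player.play()                               // call ended, resume allowed
            }
        @unknown default:
            break
        }
    }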
Posted. Last updated.
Post not yet marked as solved
2 Replies
725 Views
We have a web application that is hitting a stalled HTML video element issue in iOS Safari (iOS 17.0.3). The HTML page contains an inline script tag, for example <script> ... </script>, which runs immediately when the page loads. The script does the following: videoElement.src = url; videoElement.load(); The url is an HLS manifest URL. After the DOM elements are created, we attach the videoElement to a <div> and expect the video to eventually reach canplaythrough and start playing.

Actual behavior: the video never plays. The videoElement's readyState and networkState are both stuck at a value of 1. We found that a "suspend" event was triggered on the video element, and we are not sure what is triggering it.

Temporary mitigation: when the video is stalled, if we call videoElement.load() manually in the Safari JS console, readyState and networkState increase, HLS video segments start being fetched, and the video eventually reaches canplaythrough.

This happens only in iOS Safari, not macOS Safari. We suspect it is because, at the beginning, when videoElement.src is set and .load() is called, the videoElement is not yet attached to any div, so iOS decides to stop loading to save battery. But that is a completely uneducated guess and any help would be appreciated. Thanks!
Posted by tyrelltle. Last updated.
Post not yet marked as solved
0 Replies
701 Views
Hello, I'm developing a PWA (Progressive Web App). This app needs to use some features of the Audio API. One feature is MediaSource, which I've managed to get to work on all devices except the iPhone. According to Can I use (https://caniuse.com/?search=MediaSource), it isn't supported on iPhone. Are there plans to support this in the future? The application that I'm building creates an audioContext and an audio element. After that, it creates a buffer for the type 'audio/mp4; codecs="mp4a.40.2"' and tries to fetch chunks of data into it. After the first chunk is loaded, it starts playing. It works on my Mac and iPad; I also tested it on Android.
Posted. Last updated.
Post not yet marked as solved
2 Replies
1.2k Views
We noticed iOS 16 doesn't seem to support these commands anymore: MPRemoteCommandCenter.shared().likeCommand MPRemoteCommandCenter.shared().dislikeCommand MPRemoteCommandCenter.shared().bookmarkCommand Or is there another way to show a menu in lieu of the previous button on the lock screen?
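For context, a minimal sketch of how these feedback commands are typically registered; whether iOS 16 still surfaces them on the lock screen when they are enabled is exactly the open question here:

    import MediaPlayer

    let center = MPRemoteCommandCenter.shared()

    center.likeCommand.isEnabled = true
    center.likeCommand.localizedTitle = "Like"
    center.likeCommand.addTarget { _ in
        // mark the current item as liked in your own model
        return .success
    }

    center.dislikeCommand.isEnabled = true
    center.dislikeCommand.addTarget { _ in .success }

    center.bookmarkCommand.isEnabled = true
    center.bookmarkCommand.addTarget { _ in .success }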
Posted by chival. Last updated.
Post not yet marked as solved
0 Replies
564 Views
I have an AVPlayerViewController in my app to play custom audio+image or video content streamed from an online service. If I set avplayerController.updatesNowPlayingInfoCenter = false, I am able to add info to the nowPlayingInfo dictionary. This works for the iOS Control Center, which correctly displays my custom album name, artwork, etc. But when using AirPlay for audio, it only displays the track name while playing; it doesn't display the artwork, the album, etc. However, if I set avplayerController.player?.allowsExternalPlayback = false, it does correctly display artwork, title, album, etc. This, however, disables the AirPlay button that is built into the player. I would like this button to remain, but need the artwork to be displayed while AirPlaying. How do I achieve this?
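For what it's worth, a minimal sketch of the manual now-playing setup described above (the function name, strings, and image are placeholders); the question is why this metadata reaches Control Center but not the AirPlay device while allowsExternalPlayback is true:

    import MediaPlayer
    import UIKit

    func updateNowPlaying(title: String, album: String, artworkImage: UIImage) {
        // Wrap the image so the system can request it at the size it needs.
        let artwork = MPMediaItemArtwork(boundsSize: artworkImage.size) { _ in artworkImage }
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: title,
            MPMediaItemPropertyAlbumTitle: album,
            MPMediaItemPropertyArtwork: artwork
        ]
    }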
Posted by simonmcl. Last updated.
Post not yet marked as solved
2 Replies
2.1k Views
I'm developing a media player for Mac (AppKit, not Catalyst) that plays local and remote content. I have AirPlay working with AVPlayer (with an AVRoutePickerView assigning the route), but while the metadata I set on MPNowPlayingInfoCenter shows up on the AirPlay device (a TV in this case), I don't get any album art there (I do get it in the macOS now playing menu bar/Control Center applet). It looks like this (imgur link because I can't get it to upload in the forum): https://i.imgur.com/2JBIYCw.jpg

My code for setting the metadata:

    NSImage *artwork = [currentTrack coverImage];
    CGSize artworkSize = [artwork size];
    MPMediaItemArtwork *mpArtwork = [[MPMediaItemArtwork alloc] initWithBoundsSize:artworkSize requestHandler:^NSImage * _Nonnull(CGSize size) {
        return artwork;
    }];
    [songInfo setObject:mpArtwork forKey:MPMediaItemPropertyArtwork];

I noticed that the handler doesn't resize the image, but macOS at least doesn't seem to care. I tried modifying the code to resize the artwork in the callback, but that also doesn't change anything. In the logs I see a message about a missing entitlement:

2023-01-29 14:00:37.889346-0400 Submariner[42682:9794531] [Entitlements] MSVEntitlementUtilities - Process Submariner PID[42682] - Group: (null) - Entitlement: com.apple.mediaremote.external-artwork-validation - Entitled: NO - Error: (null)

...however, this seems to be a private entitlement and the only reference I can find to it is in WebKit. Using it makes LaunchServices very angry at me, and I presume it's a red herring.
Posted. Last updated.
Post not yet marked as solved
1 Replies
849 Views
Hi! I'm currently developing an app that can play music stored locally. It was working fine previously, but after updating my device to iOS 17, I started getting error -54 when I try to play a file. I also noticed that when getting the list of files via MPMediaQuery.songs(), I encounter the following errors: I suspect it might be an issue with file permissions, but I can't figure out what I am missing. I have already checked that MPMediaLibrary.authorizationStatus() is authorized. Does anyone know what the issue might be? Thank you
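Not an answer to the -54 error itself, but a minimal sketch of the permission check and query path being described, in case it helps narrow things down:

    import MediaPlayer

    MPMediaLibrary.requestAuthorization { status in
        guard status == .authorized else {
            print("Media library access not granted: \(status.rawValue)")
            return
        }
        let songs = MPMediaQuery.songs().items ?? []
        for item in songs {
            // assetURL can be nil for cloud-only or DRM-protected items, which can
            // also surface as playback errors when the URL is handed to a player.
            print(item.title ?? "?", item.assetURL?.absoluteString ?? "no local asset URL")
        }
    }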
Posted by LogicUI. Last updated.
Post not yet marked as solved
1 Replies
705 Views
Hi, I'm looking at using MusicKit in my watchOS app, but I don't seem to have any way to play the audio, because the recommended MPMusicPlayerController isn't available on watchOS. This approach works fine on iOS and iPadOS but not on watchOS, which seems bizarre considering we have full access to MusicKit but no way to actually play any audio. I'm trying to build an app that integrates Apple Music through MusicKit, but have no way to actually play the audio. Is there a technical reason for this, and if so, is there any other way to play audio from MusicKit on watchOS? The docs for MPMusicPlayerController can be found here: https://developer.apple.com/documentation/mediaplayer/mpmusicplayercontroller
Posted by Superbro. Last updated.
Post not yet marked as solved
1 Replies
583 Views
Hello, I have a music player application that uses MPMusicPlayerController to play Apple Music songs. Now I want to add a widget that can trigger playback and pause on iOS 17. How should I achieve this? I can get the playbackStoreID of the song in the Widget Extension and use MPMusicPlayerController's setQueue(with:) to play it. How do I pass the playbackStoreID from the widget to the main application for playback, or should I create a new MPMusicPlayerController in the Widget Extension for playback?
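As a starting point, a minimal sketch of playing a given store ID in the main app; how the ID travels from the widget to the app (App Group, deep link, or the intent itself) is exactly the open question and is not shown here:

    import MediaPlayer

    func play(storeID: String) {
        // applicationMusicPlayer plays within this app's process.
        let player = MPMusicPlayerController.applicationMusicPlayer
        player.setQueue(with: [storeID])
        player.play()
    }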
Posted by tbfungeek. Last updated.
Post not yet marked as solved
1 Replies
347 Views
We have received a lot of user feedback saying that our app causes videos in the user's system photo album to stop playing. We did reproduce this after operating some modules of our app many times. While monitoring the device log, tapping a video in the system album produced roughly the following abnormal output:

VideoContentProvider received result:<AVPlayerItem: 0x281004850, asset = <AVURLAsset: 0x28128fce0, URL = file:///var/mobile/Media/DCIM/100APPLE/IMG_0085.MP4>>, info:{ PHImageResultRequestIDKey = 316; }, priority:oneup automatic, strategy:<PXDisplayAssetVideoContentDeliveryStrategy: 0x2836c3000>quality: medium+(med-high), segment:{ nan - nans }, streaming:YES, network:YES, audio:YES, targetSize:{1280, 1280}, displayAsset:8E30C461-B089-4142-82D9-3A8CFF3B5DE9

<PUBrowsingVideoPlayer: 0xc46a59770> Asset : <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0

VideoSession : <PXVideoSession 0xc48a1ec50> { Content Provider: <PXPhotoKitVideoContentProvider: 0x282d441e0>, Asset <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0 , Media Provider: <PUPhotoKitMediaProvider: 0x28104da70> Desired Play State: Paused Play State: Paused Stalled: 0 At Beginning: 1 End: 0 Playback: ‖ Paus √ b0 a0 s0 l1 f0 e0 r0.0 0.000/60.128 VideoOutput: (null) Got First Pixel Buffer: NO Pixel Buffer Frame Drops: 0 Buffering: 0 }: Starting disabling of video loading for reason: OutOfFocus

<PUBrowsingVideoPlayer: 0xc46de66e0> Asset : <PHAsset: 0xc48f5f1d0> 11ECA95E-0B79-4C7C-97C6-5958EE139BAB/L0/001 mediaType=2/0, sourceType=1, (1080x1920), creationDate=2023-09-21 上午7:54:46 +0000, location=1, hidden=0, favorite=0, adjusted=0 VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus

We think this message is the important one: VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus. Restarting the iPhone resolves the anomaly. Does anyone know the reason, or how to avoid this bug? Similar reports: https://discussionschinese.apple.com/thread/254766045 https://discussionschinese.apple.com/thread/254787836
Posted by YangMike. Last updated.
Post not yet marked as solved
6 Replies
2k Views
Turn on Address Sanitizer in Xcode, use a real device, and put a Test.mp3 file in the Xcode project. It will then crash when you initialise an AVAudioPlayer with the mp3 file (with a wav file it works fine). I have made an entry in Feedback Assistant -> FB12425453.

    var player: AVAudioPlayer?

    func playSound() {
        if let url = Bundle.main.url(forResource: "Test", withExtension: "mp3") {
            self.player = try? AVAudioPlayer(contentsOf: url) // --> deallocation of non-allocated memory problem --> with a "wav" file it works
        }
    }
Posted. Last updated.
Post not yet marked as solved
6 Replies
1.6k Views
Am I the only one having this problem? The Music app on the iOS 17 beta does not cleanly segue between songs that are intended to have no gap between them. When gapless songs are played (e.g., Pink Floyd's "Dark Side of the Moon", The Beatles' "Abbey Road"), a noticeable gap is heard between songs. These are songs in my music library that I sync from my computer, not Apple Music streams. It should be noted that this bug also exists in iOS 16.6, and I've verified that the problem does not occur on an older phone running iOS 15.7.8.

More importantly, when I put the device in Airplane Mode, the songs segue correctly, without any gaps. I suspect that the Music app is phoning home to Apple (a bad practice in and of itself) and something is interrupting playback queuing. Even when I have cellular access turned off for the Music app and am not connected to Wi-Fi, the problem persists. The only way to make gapless playback work is to turn off all of the device radios via Airplane Mode.

Understand that this is NOT a cross-fade issue. It is a gapless issue. And it's not an Apple Music streaming issue. The problem seems to be more prevalent for songs encoded as 128 kbps AAC. In comparison, the Music app on macOS (Ventura, 13.5) operates correctly; it's only iOS that no longer performs gapless playback. I've filed bug reports (FB12992049 and FB13019931), but have not heard anything from Apple. Like I asked at the beginning, am I the only one having this problem? It's extremely maddening that Apple can't get this right. I can't play my Pink Floyd, Alan Parsons, and my other AOR playlists, including my late-'60s Beatles. Steve Jobs would be rolling in his grave.
Posted by rshuston. Last updated.
Post marked as solved
11 Replies
3.7k Views
I'm playing library items (MPMediaItem) and Apple Music tracks (Track) in MPMusicPlayerApplicationController.applicationQueuePlayer, but I can't use the actual queue functionality because I can't figure out how to get both media types into the same queue. If there's a way to get both types in a single queue, that would solve my problem, but I've given up on that one. Because I can't use a queue, I have to be able to detect when a song ends so that I can put the next song in the queue and play it. The only way I can figure out to detect when a song ends is by watching the playbackState, and I've actually got that pretty much working, but it's really ugly, because you also get a playbackState of paused when a Bluetooth speaker disconnects, etc. The only answer I've been able to find on the internet is to watch for MPMusicPlayerControllerNowPlayingItemDidChange and, when that fires and the nowPlayingItem is nil, a song has ended. But that's not the case: when a song ends, the nowPlayingItem remains the same. There's got to be an answer to this problem, right?
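For discussion, a minimal sketch of the notification-based approach mentioned above; the "play next song" step is a placeholder, and the end-of-track heuristic is admittedly imperfect, which is the point of the question:

    import MediaPlayer

    let player = MPMusicPlayerController.applicationQueuePlayer
    player.beginGeneratingPlaybackNotifications()

    NotificationCenter.default.addObserver(
        forName: .MPMusicPlayerControllerPlaybackStateDidChange,
        object: player,
        queue: .main
    ) { _ in
        // A paused/stopped state near the end of the item's duration is the usual
        // (imperfect) signal that the track finished rather than was interrupted.
        if player.playbackState != .playing,
           let item = player.nowPlayingItem,
           player.currentPlaybackTime >= item.playbackDuration - 1 {
            // enqueue and play the next song here
        }
    }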
Posted by samhall. Last updated.
Post not yet marked as solved
0 Replies
299 Views
Typically, when adding ePubs to Books, all that was required was to drag them either into the app directly or onto the Books icon. In macOS Sonoma, Books accepts ePubs but rejects them a minute or two later. Curious about this behavior, I went to the folder where ePubs are stored and moved the books directly into it instead of via the Books app. Once you move them into the folder, the same behavior you see in the app happens with the folder (books appear in the folder, then vanish a minute or two later).
Posted by bmcneal3. Last updated.
Post not yet marked as solved
0 Replies
434 Views
Hello! On our website, we allow members to follow private RSS feeds in their Podcasts app with the click of a button. Underneath the button is a link in the format: podcast://example.rssurl.com # Example. With this URL scheme, the Podcasts app launches correctly and shows the "Follow a Show by URL" input box, with the private RSS feed URL prefilled. However, the prefilled URL uses HTTP instead of HTTPS, so the example link above (podcast://example.rssurl.com), when clicked, is converted to http://example.rssurl.com in the "Follow a Show by URL" input box in the Podcasts app. We noticed the HTTP form of the URL does not work well, as it sometimes fails to follow the show: clicking "Follow" simply closes the box and nothing happens, not even an error message. We then tested with HTTPS by manually inserting the URL, like https://example.rssurl.com, and this always works. However, this is not ideal, as our users would have to manually paste in the URL every time. It would be great if the Podcasts URL scheme above would automatically prefill the "Follow a Show by URL" input box with the HTTPS form instead. Is there a way to force this behavior, or is this intended on Apple's side? Thank you!
Posted. Last updated.
Post not yet marked as solved
1 Replies
593 Views
I've integrated MPVolumeView into my view, and it correctly responds to hardware volume changes as expected. However, once I initiate audio streaming using AVAudioEngine to capture microphone audio and an AudioUnit for decoding, the MPVolumeView ceases to reflect changes made with the hardware volume buttons. Additionally, even when I adjust the volume using the slider on the MPVolumeView, it doesn't change the system volume. Has anyone else encountered this issue? What might be causing MPVolumeView to stop responding to hardware volume changes once streaming starts? For the AVAudioSession mode, I use the default setting, because using .voiceChat permanently prevents the MPVolumeView from being updated by device volume changes.

    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord, options: [.allowBluetooth])
        try session.setActive(true)
    } catch {
        print(error.localizedDescription)
    }
Posted. Last updated.
Post not yet marked as solved
1 Replies
469 Views
Hello everyone, I'm working on creating an audio playback widget for my app, aiming for functionality similar to the Apple Music widget. Specifically, I've implemented a play button in my widget that triggers an AudioPlaybackIntent. This intent then interacts with a singleton class that manages my AVAudioPlayer. The issue I'm facing is that the AVAudioPlayer instance in my main app and the instance in my widget extension don't seem to share the same state. I've noticed that the Apple Music widget is capable of showing real-time changes (e.g., stopping music from Control Center is reflected in the widget UI), which implies that it has some way of sharing the playback state between the main app and the widget. How can I achieve shared state between the main app and the widget for my AVAudioPlayer instance? Is there a specific approach or API that Apple Music uses to make this happen?
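For reference, a minimal sketch of a widget play/pause intent of the kind described above; AudioManager is a hypothetical singleton owning the AVAudioPlayer, and the widget extension and the app run as separate processes, which is why the two player instances don't share state on their own:

    import AppIntents

    struct TogglePlaybackIntent: AudioPlaybackIntent {
        static var title: LocalizedStringResource = "Toggle Playback"

        func perform() async throws -> some IntentResult {
            // AudioManager is a hypothetical shared class; in practice its state
            // still needs to be communicated across the app/extension boundary.
            AudioManager.shared.togglePlayback()
            return .result()
        }
    }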
Posted by DavJwk. Last updated.
Post not yet marked as solved
0 Replies
635 Views
In our application, we play video-on-demand (VOD) content and display subtitles in different languages. The subtitle format we prefer is WebVTT. We are planning to enhance caption styling (text color, background color, font weight, etc.) in the WebVTT files. In our current flow, subtitles and images are loaded in 6-second chunks. Below is an example of one of the subtitle parts we use:

    WEBVTT
    X-TIMESTAMP-MAP=MPEGTS:0,LOCAL:00:00:00.000
Posted by rushly. Last updated.