Integrate music and other audio content into your apps.

Posts under Audio tag

81 Posts
Post not yet marked as solved
0 Replies
349 Views
Does anyone know if there will be a problem with publishing an app that contains music under a CC (royalty-free) license? Will crediting the author be sufficient for the review team?
Post not yet marked as solved
6 Replies
2k Views
Turn on Address Sanitizer in Xcode, run on a real device, and put a Test.mp3 file in the Xcode project. The app then crashes when you initialize an AVAudioPlayer with the mp3 file (with a wav file it works fine). I have filed this in Feedback Assistant: FB12425453.

var player: AVAudioPlayer?

func playSound() {
    if let url = Bundle.main.url(forResource: "Test", withExtension: "mp3") {
        // --> "deallocation of non allocated memory" reported here; with a "wav" file it works
        self.player = try? AVAudioPlayer(contentsOf: url)
    }
}
Post not yet marked as solved
0 Replies
475 Views
Using the Objective-C framework AVFAudio, I obtain PCM data from the device's microphone (32-bit), and the resulting samples are all between -1 and 1 (the actual level of the fixed noise source exceeds 100 decibels). According to the conversion 20 * log10(PCM / 0.00002), the result can never reach 100 decibels. Does anyone know what the reason or problem is? Output: -0.82569194 -0.82774025 -0.83398014 -0.87197787 -0.90468484 -0.9037836 0.9085202
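A note that may help (a hedged sketch, not an authoritative answer): the floats AVFAudio delivers are normalized full-scale sample values, not sound pressure in pascals, so the 20 µPa reference in 20 * log10(PCM / 0.00002) does not apply to them directly. A common approach is to compute the RMS of a buffer, express it in dBFS, and then add a per-device calibration offset (measured against a known source) to estimate dB SPL. The helper below is a minimal Swift sketch under those assumptions; decibelsFullScale and calibrationOffset are illustrative names, not part of any Apple API.

import Accelerate
import AVFoundation

// Converts one buffer of normalized float samples to dBFS (0 dBFS = full scale).
func decibelsFullScale(from buffer: AVAudioPCMBuffer) -> Float? {
    guard let channel = buffer.floatChannelData?[0] else { return nil }
    let frameCount = Int(buffer.frameLength)
    guard frameCount > 0 else { return nil }

    // Root mean square over the whole buffer.
    var rms: Float = 0
    vDSP_rmsqv(channel, 1, &rms, vDSP_Length(frameCount))

    // dBFS, guarding against log of zero.
    return Float(20 * log10(Double(max(rms, Float.leastNonzeroMagnitude))))
}

// Estimated sound pressure level; calibrationOffset must be measured for the device.
func estimatedSPL(from buffer: AVAudioPCMBuffer, calibrationOffset: Float) -> Float? {
    guard let dBFS = decibelsFullScale(from: buffer) else { return nil }
    return dBFS + calibrationOffset
}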
Post not yet marked as solved
1 Reply
400 Views
I'm experiencing a lot of crackling when watching a video or listening to music on my MacBook. I see a lot of similar reports but no solution. So my question to Apple is: how can this be fixed?
Post not yet marked as solved
0 Replies
414 Views
I'm planning to create a music game app, similar to Guitar Hero, and I plan to use songs from beginner producers (they don't have any publishers or distributors). If I get their legal consent through a contract to use their songs, does Apple allow music that is not from iTunes, provided I have a legal license from the producers to use their songs in my app? Will Apple allow songs that don't have publishers or distributors, or will App Review flag it and cause an issue for me?
Post not yet marked as solved
1 Reply
715 Views
On iPhone 14 Pro, the mic icon remains active in the Dynamic Island. Every time, a restart is required to make it disappear. After restarting, it remains stuck in the Dynamic Island again after a few phone calls.
Post not yet marked as solved
0 Replies
644 Views
I have a Flutter mobile app and I'm using the record Flutter package for recording audio. I'm facing an issue recording audio while the phone is locked.

App behavior: First we start the app and connect it to a Bluetooth device. The app then waits for a trigger value of 1 from the connected device. On receiving the trigger it starts recording, while the phone is locked and the app is running in the background. I'm getting this error when AVAudioSession sets the category:

AVAudioSession_iOS.mm:2367 Failed to set category, error: '!int'
Failed to set up audio session: Error Domain=NSOSStatusErrorDomain Code=560557684 "(null)"

My app is for user-security purposes, so it needs to record in the background. Let me know how I can achieve this functionality.
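For reference, error code 560557684 is the four-character code '!int' (AVAudioSessionErrorCodeCannotInterruptOthers), which typically appears when a non-mixable audio session tries to activate while the app is in the background. Below is a minimal Swift sketch of the native session setup that usually avoids it, assuming the app also declares the audio value in UIBackgroundModes; configureRecordingSession is an illustrative name, and the record Flutter plugin may already perform an equivalent setup, so treat this as a diagnostic starting point rather than a drop-in fix.

import AVFoundation

func configureRecordingSession() throws {
    let session = AVAudioSession.sharedInstance()
    // .playAndRecord (or .record) plus .mixWithOthers lets a backgrounded app
    // activate the session without needing to interrupt other audio.
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.mixWithOthers, .allowBluetooth])
    try session.setActive(true)
}

Activating the session while the app is still in the foreground (for example, right after the Bluetooth device connects) also tends to be more reliable than activating it only after the trigger arrives in the background.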
Post not yet marked as solved
1 Reply
729 Views
I'm developing a WebView app using JavaScript with Ionic. Adding the worklet module looks fine, but when I then try to connect the AudioWorkletProcessor to the AudioContext, iOS gives me errors like this:

2023-07-24 11:35:57.436444+0900 CHeKT[27066:10627891] [assertion] Error acquiring assertion: <Error Domain=RBSServiceErrorDomain Code=1 "(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)" UserInfo={NSLocalizedFailureReason=(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)}>
2023-07-24 11:35:57.436491+0900 CHeKT[27066:10627891] [ProcessSuspension] 0x1060089f0 - ProcessAssertion::acquireSync Failed to acquire RBS assertion 'WebKit Media Playback' for process with PID=27071, error: Error Domain=RBSServiceErrorDomain Code=1 "(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)" UserInfo={NSLocalizedFailureReason=(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)}
2023-07-24 11:35:57.436947+0900 CHeKT[27066:10627891] [assertion] Error acquiring assertion: <Error Domain=RBSServiceErrorDomain Code=1 "(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)" UserInfo={NSLocalizedFailureReason=(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)}>
2023-07-24 11:35:57.436980+0900 CHeKT[27066:10627891] [ProcessSuspension] 0x106008ae0 - ProcessAssertion::acquireSync Failed to acquire RBS assertion 'WebKit Media Playback' for process with PID=27066, error: Error Domain=RBSServiceErrorDomain Code=1 "(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)" UserInfo={NSLocalizedFailureReason=(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)}
2023-07-24 11:35:57.437323+0900 CHeKT[27066:10627891] [assertion] Error acquiring assertion: <Error Domain=RBSServiceErrorDomain Code=1 "(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)" UserInfo={NSLocalizedFailureReason=(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)}>
2023-07-24 11:35:57.437354+0900 CHeKT[27066:10627891] [ProcessSuspension] 0x106008bd0 - ProcessAssertion::acquireSync Failed to acquire RBS assertion 'WebKit Media Playback' for process with PID=27072, error: Error Domain=RBSServiceErrorDomain Code=1 "(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)" UserInfo={NSLocalizedFailureReason=(originator doesn't have entitlement com.apple.runningboard.assertions.webkit AND originator doesn't have entitlement com.apple.multitasking.systemappassertions)}

Even with that error, the AudioWorkletProcessor still runs, but when I try to access the microphone with getUserMedia(), the AudioWorkletProcessor's output breaks up like a robot voice. Does AudioWorklet not work on iOS? I need to build two-way audio using AudioWorklet, but it doesn't seem possible on iOS (it works well on Android). Please let me know if you have any feedback or solutions. Thanks. Bobby.
Post not yet marked as solved
0 Replies
831 Views
Hello developers, we have an issue opening an Apple MPEG-4 audio file that apparently has a correct header but no actual audio data. The file is 594 bytes, completely freezes the app's main thread, and never returns from any of these calls:

NSURL *fileURL = [NSURL fileURLWithPath:filePath];
NSError *error;

// freezes (call stack below)
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error];

// freezes
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:fileURL error:&error];

// freezes
AudioFileID audioFileID;
OSStatus result = AudioFileOpenURL((__bridge CFURLRef)fileURL, kAudioFileReadPermission, 0, &audioFileID);

Pausing in the debugger reveals where it is stuck:

#0 0x00007ff81b7683f9 in MP4BoxParser_Track::GetSampleTableBox() ()
#1 0x00007ff81b76785a in MP4BoxParser_Track::GetInfoFromTrackSubBoxes() ()
#2 0x00007ff81b93fde5 in MP4AudioFile::UseAudioTrack(void*, unsigned int, unsigned int) ()
#3 0x00007ff81b93ab2c in MP4AudioFile::OpenFromDataSource() ()
#4 0x00007ff81b72ee85 in AudioFileObject::Open(__CFURL const*, signed char, int) ()
#5 0x00007ff81b72ed9d in AudioFileObject::DoOpen(__CFURL const*, signed char, int) ()
#6 0x00007ff81b72e1f0 in AudioFileOpenURL ()
#7 0x00007ffa382e8183 in -[AVAudioPlayer initWithContentsOfURL:fileTypeHint:error:] ()

With each of the three calls the call stack is a little different, but all of them end up stuck forever in MP4BoxParser_Track::GetSampleTableBox(). I'm attaching the incriminated audio file to the post (just rename it back to .m4a): Audio_21072023_10462282.crash

How can we avoid this and verify that an audio file is openable and playable? Before, we checked whether a file we believed to be audio contained any data; if so, we created an AVAudioPlayer with it and checked that it returned no errors and that the duration was greater than 0. This bug breaks that fundamental logic, so for now we added a hotfix hack that checks whether the data is at least 600 bytes long. How do we solve this correctly when none of the methods above return an error but instead all hang?
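Not a fix for the underlying Core Audio hang, but one way to keep the main thread safe is to probe the file asynchronously and treat a timeout as "not playable". Below is a minimal Swift sketch, assuming a short timeout is acceptable for your workflow; probePlayability is an illustrative name, and whether the asynchronous key loading avoids the same parser path for this particular file is something you would need to verify. If it does not, the background work may still hang, but the caller is never blocked.

import AVFoundation

func probePlayability(of url: URL,
                      timeout: TimeInterval = 2.0,
                      completion: @escaping (Bool) -> Void) {
    let done = DispatchSemaphore(value: 0)
    var playable = false
    let asset = AVURLAsset(url: url)

    // Asynchronous key loading; for a well-formed file this calls back quickly.
    asset.loadValuesAsynchronously(forKeys: ["playable", "duration"]) {
        var error: NSError?
        playable = asset.statusOfValue(forKey: "playable", error: &error) == .loaded
            && asset.isPlayable
            && asset.duration.seconds > 0
        done.signal()
    }

    DispatchQueue.global(qos: .utility).async {
        // Treat a timeout as "not playable" so the caller never waits forever.
        let finished = done.wait(timeout: .now() + timeout) == .success
        completion(finished && playable)
    }
}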
Post not yet marked as solved
1 Reply
542 Views
Whenever I try to make a playlist with a lot of songs all at once, it gets to a point where it almost freezes. Normally, after I tap "add to playlist" for each song, a notice comes up saying "song added", but it stops doing that and the song doesn't show up in the playlist. Then maybe 2 or 3 minutes later the notice appears saying the song has been added. Anyone else dealing with this? It's frustrating when I'm trying to build big playlists and I have to come back the next day to add the rest.
Post not yet marked as solved
0 Replies
448 Views
Could you provide guidance on how to add chapter marks to an M4A file? I've been attempting this. From what I've read, it requires AVMetadataKey.quickTimeUserDataKeyChapter, track.addTrackAssociation(to: ... type: .chapterList), or both. I've looked into AVTimedMetadataGroup, but I haven't found a way to add it based on the documentation. I also haven't found anyone who has used native Swift to add chapter marks; they've always given in and used ffmpeg or some other external solution.

inputURL is the file being read in, outputURL is for the final file, and chapters is an array of (time, title) pairs, where time is the start of each chapter and title is its name in the list. The target is macOS.

import AVFoundation

class AudioChapterCreator {
    // Function to create an audio file with chapters and a chapter list
    func createAudioFileWithChapters(inputURL: URL, outputURL: URL, chapters: [(time: CMTime, title: String)]) {
        let options = [AVURLAssetPreferPreciseDurationAndTimingKey: true]
        let asset = AVURLAsset(url: inputURL, options: options)
        let durationInSeconds = CMTimeGetSeconds(asset.duration)
        print("asset durationInSeconds: \(durationInSeconds)")

        guard let audioTrack = asset.tracks(withMediaType: .audio).first else {
            print("Error: Unable to find audio track in the asset.")
            return
        }

        // Create metadata items for chapters
        let chapterMetadataItems = chapters.map { chapter -> AVMetadataItem in
            let item = AVMutableMetadataItem()
            // this duration is just for testing
            let tempDur = CMTime(seconds: 100, preferredTimescale: 1)
            item.keySpace = AVMetadataKeySpace.quickTimeUserData
            item.key = AVMetadataKey.quickTimeUserDataKeyChapter as NSString
            item.value = chapter.title as NSString
            item.time = chapter.time
            item.duration = tempDur
            return item
        }

        // Create an AVAssetExportSession for writing the output file
        guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
            print("Error: Unable to create AVAssetExportSession.")
            return
        }

        // Configure the AVAssetExportSession
        exportSession.outputFileType = .m4a
        exportSession.outputURL = outputURL
        exportSession.metadata = asset.metadata + chapterMetadataItems
        exportSession.timeRange = CMTimeRangeMake(start: CMTime.zero, duration: asset.duration)

        // Export the audio file
        exportSession.exportAsynchronously {
            switch exportSession.status {
            case .completed:
                print("Audio file with chapters and chapter list created successfully.")
            case .failed:
                print("Error: Failed to create the audio file.")
            case .cancelled:
                print("Export cancelled.")
            default:
                print("Export failed with unknown status.")
            }
        }
    }
}
Post not yet marked as solved
1 Reply
708 Views
Problem description: In this HLS video, https://lf3-vod-cdn-tos.douyinstatic.com/obj/vodsass/hls/main.m3u8, noise starts at 22 seconds when it is played directly in Safari on macOS 12.6.6, and the same happens in Safari on iOS 16.5.1. However, there is no noise when playing it via MSE on the Mac with a third-party open-source web player such as hls.js in Safari.

Test tool: hls.js test demo: https://hlsjs.video-dev.org/demo/
Post not yet marked as solved
1 Reply
547 Views
Hello, I have struggled to resolve the issue in the question above. I can speak an utterance while my iPhone's screen is on, but when the app goes into the background (screen off), it doesn't speak any more. I think it should be possible to play audio or speak an utterance, because I can play music in the background in YouTube. Any help please??
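In case it helps: speech from AVSpeechSynthesizer generally stops once the app is backgrounded unless the app declares the audio value in UIBackgroundModes and keeps a playback audio session active. A minimal Swift sketch under that assumption (speakInBackground is an illustrative name, and the exact session options your app needs may differ):

import AVFoundation

let synthesizer = AVSpeechSynthesizer() // keep a strong reference so speech isn't cut off

func speakInBackground(_ text: String) {
    do {
        let session = AVAudioSession.sharedInstance()
        // A playback session with the spokenAudio mode keeps speech eligible
        // to continue while the app is in the background.
        try session.setCategory(.playback, mode: .spokenAudio)
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
    synthesizer.speak(AVSpeechUtterance(string: text))
}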
Post not yet marked as solved
0 Replies
501 Views
I am writing a watchOS app where I have some audio files that I want to play at various points. I am using AVAudioPlayer. It all works in the simulator, and it also works if I have AirPods connected to my watch via Bluetooth. However, I get no sound if there isn't a paired set of earphones. In that case I would like the sounds to play from the physical watch speaker. I can't seem to find any documentation on how to make that happen. Any hints or tips are appreciated.
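A hedged suggestion: on watchOS, the output route depends on the audio session's route sharing policy. With .longFormAudio the system only plays to Bluetooth headphones, while the .default policy can use the built-in speaker on recent watchOS versions. Below is a minimal sketch of activating such a session before calling play(); prepareSpeakerPlayback is an illustrative name, and I have not verified this on every watch model, so treat it as something to try rather than a guaranteed fix.

import AVFoundation

func prepareSpeakerPlayback() {
    let session = AVAudioSession.sharedInstance()
    do {
        // The default route sharing policy allows the built-in speaker;
        // .longFormAudio would force routing to Bluetooth headphones.
        try session.setCategory(.playback,
                                mode: .default,
                                policy: .default,
                                options: [])
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
}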
Post not yet marked as solved
0 Replies
539 Views
How can I record audio when the app is in the background? I tried on Android and it's working, but on iOS it only records in the foreground. It's an Expo app and I am using the Expo AV library. I have already allowed the permissions and set UIBackgroundModes with audio:

"infoPlist": {
    ...
    "UIBackgroundModes": [
        "audio"
    ]
}

And in the code as well:

await Audio.setAudioModeAsync({
    allowsRecordingIOS: true,
    playsInSilentModeIOS: true,
    staysActiveInBackground: true
});

But once the app is in background mode it fails to start recording. Can anyone help me fix this?
Post not yet marked as solved
0 Replies
535 Views
I'm developing a macOS app and I'm trying to access the microphone without directly triggering the default permission dialog. Instead, I've managed to programmatically open System Settings, specifically the Privacy & Security -> Microphone section, allowing users to manually grant permission. However, there's an issue: even after the user manually toggles on the microphone permission for my app in System Settings, AVCaptureDevice.authorizationStatus(for: .audio) still returns .notDetermined.

To clarify, I'm avoiding AVCaptureDevice.requestAccess(for: .audio) because it prompts the default permission dialog. But when I do use it, the app correctly recognizes changes in permission status. The problem arises only when trying to detect permission changes made directly from System Settings. Here is my code:

struct SystemSettingsHandler {
    static func openSystemSetting(for type: String) {
        guard type == "microphone" || type == "screen" else { return }
        let microphoneURL = "x-apple.systempreferences:com.apple.preference.security?Privacy_Microphone"
        let screenURL = "x-apple.systempreferences:com.apple.preference.security?Privacy_ScreenCapture"
        let urlString = type == "microphone" ? microphoneURL : screenURL
        if let url = URL(string: urlString) {
            NSWorkspace.shared.open(url)
        }
    }
}

private func requestMicrophonePermission(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        print("authorized")
        completion(true)
    case .notDetermined:
        print("notDetermined")
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            if granted {
                completion(granted)
            } else {
                completion(granted)
            }
        }
    case .denied, .restricted:
        print("denied")
        SystemSettingsHandler.openSystemSetting(for: "microphone")
        completion(false)
    @unknown default:
        print("unknown")
        completion(false)
    }
}

Thank you for reading this post!
Post not yet marked as solved
1 Reply
508 Views
I've been using AVAssetExportSession to trim audio files for the past 2 years, and suddenly it stopped working properly. It still works fine when I run my app on a phone running iOS 16, but on my iOS 17 phone it exports an incorrect duration (for example, I'll provide a file with a 2-second duration, ask it to trim to 0 - 1.7 s, and it returns the file over-trimmed to 1.58 s or something like that). The AVURLAsset returns the correct duration, and I've already tried AVURLAssetPreferPreciseDurationAndTimingKey; it's useless to me, as the error happens somewhere during the export.

guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
    completion(false, nil)
    return
}

let startTime = CMTimeMakeWithSeconds(floor(startPoint * 100) / 100.0, preferredTimescale: 44100)
let stopTime = CMTimeMakeWithSeconds(ceil(endPoint * 100) / 100.0, preferredTimescale: 44100)
let exportTimeRange = CMTimeRange(start: startTime, end: stopTime)

exportSession.timeRange = exportTimeRange
exportSession.outputFileType = .m4a
exportSession.outputURL = targetURL
AudioHelper.deleteFile(at: exportSession.outputURL)
exportSession.exportAsynchronously {
    ...
}

I've managed to somewhat mitigate the damage by adding silence to the file and continuously trimming it until I get it close to the required duration, but it's an extremely ugly hack and it's breaking down the whole functionality of my app.
Post not yet marked as solved
1 Reply
432 Views
Translated Report (Full Report Below)

Process: Logic Pro X [1524]
Path: /Applications/Logic Pro X.app/Contents/MacOS/Logic Pro X
Identifier: com.apple.logic10
Version: 10.7.7 (5762)
Build Info: MALogic-5762000000000000~2 (1A85)
App Item ID: 634148309
App External ID: 854029738
Code Type: ARM-64 (Native)
Parent Process: launchd [1]
User ID: 502

Date/Time: 2023-10-10 12:52:02.8675 +0100
OS Version: macOS 13.0 (22A380)
Report Version: 12
Anonymous UUID: D3A4AE8C-2CA2-CC80-A569-39459CA10192

Time Awake Since Boot: 4400 seconds

System Integrity Protection: enabled

Crashed Thread: 0 Dispatch queue: com.apple.main-thread

Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000010
Exception Codes: 0x0000000000000001, 0x0000000000000010

Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [1524]

VM Region Info: 0x10 is not in any region. Bytes before following region: 105553518919664
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
---> MALLOC_NANO (reserved) 600018000000-600020000000 [128.0M] rw-/rwx SM=NUL ...(unallocated)

Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 Logic Pro X 0x1045c2794 0x10412c000 + 4810644
1 Logic Pro X 0x1045c273c 0x10412c000 + 4810556
2 Logic Pro X 0x1045c37c8 0x10412c000 + 4814792
3 Logic Pro X 0x10492bc40 0x10412c000 + 8387648
4 Logic Pro X 0x10492bd30 0x10412c000 + 8387888
5 Logic Pro X 0x1045c23d0 0x10412c000 + 4809680
6 Logic Pro X 0x1049838ac 0x10412c000 + 8747180
7 Logic Pro X 0x10498340c 0x10412c000 + 8745996
8 Logic Pro X 0x1044feb18 0x10412c000 + 4008728
9 CoreFoundation 0x191222fd0 NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK + 24
10 CoreFoundation 0x19125f4b4 -[__NSDictionaryM enumerateKeysAndObjectsWithOptions:usingBlock:] + 212
11 Logic Pro X 0x104ae37b4 0x10412c000 + 10188724
12 Logic Pro X 0x1044fe7ac 0x10412c000 + 4007852
13 Logic Pro X 0x104ae9eb8 0x10412c000 + 10215096
14 Logic Pro X 0x10498987c 0x10412c000 + 8771708
15 Logic Pro X 0x104989434 0x10412c000 + 8770612
16 Logic Pro X 0x10521497c 0x10412c000 + 17729916
17 Foundation 0x19219c67c __NSFireTimer + 104
18 CoreFoundation 0x191277578 CFRUNLOOP_IS_CALLING_OUT_TO_A_TIMER_CALLBACK_FUNCTION + 32
19 CoreFoundation 0x191277220 __CFRunLoopDoTimer + 940
20 CoreFoundation 0x191276d78 __CFRunLoopDoTimers + 356
21 CoreFoundation 0x19125c760 __CFRunLoopRun + 1896
22 CoreFoundation 0x19125b8a4 CFRunLoopRunSpecific + 612
23 HIToolbox 0x19a8cf3bc RunCurrentEventLoopInMode + 292
24 HIToolbox 0x19a8cf200 ReceiveNextEventCommon + 672
25 HIToolbox 0x19a8cef48 _BlockUntilNextEventMatchingListInModeWithFilter + 72
26 AppKit 0x1944b4630 _DPSNextEvent + 632
27 AppKit 0x1944b37c0 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 728
28 Logic Pro X 0x1055198b8 0x10412c000 + 20895928
29 AppKit 0x1944a7bf0 -[NSApplication run] + 464
30 AppKit 0x19447f058 NSApplicationMain + 880
31 Logic Pro X 0x104a6a7a8 0x10412c000 + 9693096
32 dyld 0x190e53e50 start + 2544

Thread 1:: caulk.messenger.shared:17
0 libsystem_kernel.dylib 0x19113ed6c semaphore_wait_trap + 8
1 caulk 0x19a5f6cfc caulk::mach::semaphore::wait_or_error() + 28
2 caulk 0x19a5d9634 caulk::concurrent::details::worker_thread::run() + 56
3 caulk 0x19a5d9278 void* caulk::thread_proxy<std::__1::tuple<caulk::thread::attributes, void (caulk::concurrent::details::worker_thread::*)(), std::__1::tuple<caulk::concurrent::details::worker_thread*> > >(void*) + 96
4 libsystem_pthread.dylib 0x19117e06c _pthread_start + 148
5 libsystem_pthread.dylib 0x191178e2c thread_start + 8

Thread 2:: com.apple.NSEventThread
0 libsystem_kernel.dylib 0x19113edf0 mach_msg2_trap + 8
1 libsystem_kernel.dylib 0x1911508d8 mach_msg2_internal + 80
2 libsystem_kernel.dylib 0x191147638 mach_msg_overwrite + 540
3 libsystem_kernel.dylib 0x19113f16c mach_msg + 24
4 CoreFoundation 0x19125dbdc __CFRunLoopServiceMachPort + 160
5 CoreFoundation 0x19125c4c8 __CFRunLoopRun + 1232
6 CoreFoundation 0x19125b8a4 CFRunLoopRunSpecific + 612
7 AppKit 0x1945de248 _NSEventThread + 172
8 libsystem_pthread.dylib 0x19117e06c _pthread_start + 148
9 libsystem_pthread.dylib 0x191178e2c thread_start + 8

Thread 3:: MIDIClientNotificationThread
0 libsystem_kernel.dylib 0x19113edf0 mach_msg2_trap + 8
1 libsystem_kernel.dylib 0x1911508d8 mach_msg2_internal + 80
2 libsystem_kernel.dylib 0x191147638 mach_msg_overwrite + 540
3 libsystem_kernel.dylib 0x19113f16c mach_msg + 24
4 CoreFoundation 0x19125dbdc __CFRunLoopServiceMachPort + 160
5 CoreFoundation 0x19125c4c8 __CFRunLoopRun + 1232
6 CoreFoundation 0x19125b8a4 CFRunLoopRunSpecific + 612
7 Foundation 0x192163e58 -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
8 Foundation 0x1921d83b4 -[NSRunLoop(NSRunLoop) runUntilDate:] + 100
9 Logic Pro X 0x1045a8a74 0x10412c000 + 4704884
10 libsystem_pthread.dylib 0x19117e06c _pthread_start + 148
11 libsystem_pthread.dylib 0x191178e2c thread_start + 8

Thread 4:: SeqTimer
Post marked as solved
1 Reply
785 Views
Hello everyone, I'm using Flutter and the just_audio package. When a user receives a push notification, the app plays audio in the background. I've tested this functionality on an iPhone 6s and an iPhone 13. It works correctly on the iPhone 6s: the app plays the sound when the push notification is received. However, on the iPhone 13 the app receives the notification and starts the background process, but fails to play the sound with these errors:

mediaserverd(MediaExperience)[17680] <Notice>: -CMSUtilities- CMSUtility_IsAllowedToStartPlaying: Client sid:0x45107e5, Runner(28933), 'prim' with category MediaPlayback and mode Default and mixable does not have assertions to start mixable playback
mediaserverd(MediaExperience)[17680] <Notice>: -CMSessionMgr- MXCoreSessionBeginInterruption_WithSecTaskAndFlags: CMSessionBeginInterruption failed as client 'sid:0x45107e5, Runner(28933), 'prim'' has insufficient privileges to take control
mediaserverd(AudioSessionServer)[17680] <Error>: AudioSessionServerImp.mm:405 { "action":"cm_session_begin_interruption", "error":"translating CM session error", "session":{"ID":"0x45107e5","name":"Runner(28933)"}, "details":{"calling_line":879,"error_code":-16980,"error_string":"Operation denied. Cannot start playing"} }

From what I understand of these errors, additional permissions must be required on the newer iPhones. Does anyone have any idea how I can fix this?
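The "does not have assertions to start mixable playback" message usually means the process is not currently allowed to begin audio while in the background (no audio background mode in effect and no already-active playback session). Below is a hedged sketch of the native side only, assuming the app declares audio in UIBackgroundModes; just_audio normally manages the audio session itself, so activatePlaybackSession is an illustrative diagnostic experiment, not a drop-in fix.

import AVFoundation

func activatePlaybackSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default)
        // Activating the session before playback is attempted is what the
        // missing "assertions to start playback" typically points at.
        try session.setActive(true)
    } catch {
        print("Could not activate audio session: \(error)")
    }
}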