Integrate music and other audio content into your apps.

Posts under Audio tag

86 Posts
Post not yet marked as solved
0 Replies
340 Views
Is there a way for an FXPlug to access the source audio? Or do we need to make an AU plugin, apply it to an audio source [either a video or an audio track], and feed the info via shared memory to an FXPlug? Is there an AU plugin that lets external processes "listen" to the audio?
Posted by belisoful. Last updated.
Post not yet marked as solved
0 Replies
328 Views
Hi all, I have created a QuickLook preview for my custom data type in my app. I use SwiftUI wrapped in UIKit for the preview. My issue is that when I try to play audio using AVAudioPlayer, I receive an OSStatus -50 error. Does anyone know if there are separate permissions I need to request before being able to do this? Here are the errors I get while trying to set my audio session active and play on the AVAudioPlayer. Thanks for your help and advice!

The operation couldn’t be completed. (OSStatus error -50.)
nwi_state: registration failed (9)
connection <connection: 0x100e0b270> { name = com.apple.audio.AudioQueueServer, listener = false, pid = 0, euid = 4294967295, egid = 4294967295, asid = 4294967295 } : error <dictionary: 0x251524530> { count = 1, transaction: 0, voucher = 0x0, contents = "XPCErrorDescription" => <string: 0x2515246c8> { length = 18, contents = "Connection invalid" } }
auto-cancelling <connection: 0x100e0b270> { name = com.apple.audio.AudioQueueServer, listener = false, pid = 0, euid = 4294967295, egid = 4294967295, asid = 4294967295 }
0x2816bf680 reply: XPC_ERROR_CONNECTION_INVALID
throwing swix::exception: !(is_valid())
AQ_API_V2Impl.cpp:134 AudioQueueNew: <-AudioQueueNew failed -302
rebuilding null connection
0x2816bf680 reply: XPC_ERROR_CONNECTION_INVALID
connection <connection: 0x100822a90> { name = com.apple.audio.AudioQueueServer, listener = false, pid = 0, euid = 4294967295, egid = 4294967295, asid = 4294967295 } : error <dictionary: 0x251524530> { count = 1, transaction: 0, voucher = 0x0, contents = "XPCErrorDescription" => <string: 0x2515246c8> { length = 18, contents = "Connection invalid" } }
throwing swix::exception: !(is_valid())
auto-cancelling <connection: 0x100822a90> { name = com.apple.audio.AudioQueueServer, listener = false, pid = 0, euid = 4294967295, egid = 4294967295, asid = 4294967295 }
AQ_API_V2Impl.cpp:134 AudioQueueNew: <-AudioQueueNew failed -302
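A minimal sketch of the playback path being described, assuming a hypothetical bundled test file named preview.m4a; activating the session before creating the player is the usual order, and OSStatus -50 is the classic paramErr (an invalid parameter somewhere in that chain):

import AVFoundation

final class PreviewAudioPlayer {
    private var player: AVAudioPlayer?

    // Hypothetical helper: plays a bundled file named "preview.m4a".
    func playPreviewSound() {
        guard let url = Bundle.main.url(forResource: "preview", withExtension: "m4a") else { return }
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback)   // plain playback needs no purpose string
            try session.setActive(true)

            player = try AVAudioPlayer(contentsOf: url)
            player?.prepareToPlay()
            player?.play()
        } catch {
            print("Audio playback failed: \(error)")
        }
    }
}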
Posted by emilea. Last updated.
Post not yet marked as solved
0 Replies
302 Views
Each time you listen to music, you are streaming from a server frequently powered by coal or gas, and rarely by green energy. As a developer on iOS, I request that Apple provide downloading of audio files in our audio apps. The goal is not to resell the audio or violate authors' rights; YouTube already does this. It is time to find ways to reduce energy consumption, especially in data broadcasting and the needless streaming of the same song again and again. Is it possible to change the API to reflect this reality?
Posted by tinolayco. Last updated.
Post not yet marked as solved
0 Replies
373 Views
I'm exploring speech-to-command processing for a game, and would like to build a baseline of voice recognition within it that allows two people in close proximity to interact without interfering with each other's voice commands to the system. (It's for an accessible game idea.)
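A minimal sketch of the baseline recognition part using the Speech framework, assuming speech and microphone authorization have already been granted (NSSpeechRecognitionUsageDescription and NSMicrophoneUsageDescription in Info.plist); note that this API does not separate two nearby speakers on its own, so the per-speaker part would need its own approach:

import Speech
import AVFoundation

final class CommandListener {
    private let engine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true
        self.request = request

        // Feed microphone buffers into the recognition request.
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }

        task = recognizer?.recognitionTask(with: request) { [weak self] result, error in
            if let text = result?.bestTranscription.formattedString {
                print("Heard: \(text)")   // map this transcription to game commands
            }
            if error != nil { self?.stop() }
        }

        engine.prepare()
        try engine.start()
    }

    func stop() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
        request?.endAudio()
        task?.cancel()
    }
}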
Posted by heckj. Last updated.
Post not yet marked as solved
1 Reply
613 Views
Hi, I was trying to design the above UI using the below code with CPListImageRowItem:

func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                              didConnect interfaceController: CPInterfaceController) {
    self.interfaceController = interfaceController

    // Create a list row item with images
    let item = CPListImageRowItem(text: "Category",
                                  images: [UIImage(named: "cover.jpeg")!,
                                           UIImage(named: "cover2.jpeg")!,
                                           UIImage(named: "discover.jpeg")!,
                                           UIImage(named: "thumbnail.jpeg")!])

    // Create a list section
    let section = CPListSection(items: [item])

    // Create a list template with the section
    let listTemplate = CPListTemplate(title: "Your Template Title", sections: [section])

    // Set the template on the interface controller
    interfaceController.setRootTemplate(listTemplate, animated: true)
}

I am getting only the header and the image items below it, but there is no way to set the detailed text under the images. Can anyone help me with this?
Posted by TarakJaan. Last updated.
Post not yet marked as solved
0 Replies
459 Views
I have an app that is getting rejected from TestFlight because of this error: ITMS-90683: Missing purpose string in Info.plist - Your app’s code references one or more APIs that access sensitive user data, or the app has one or more entitlements that permit such access. The Info.plist file for the “TurtleTuner.app” bundle should contain a NSCameraUsageDescription key with a user-facing purpose string explaining clearly and completely why your app needs the data. If you’re using external libraries or SDKs, they may reference APIs that require a purpose string. While your app might not use these APIs, a purpose string is still required. For details, visit: https://developer.apple.com/documentation/uikit/protecting_the_user_s_privacy/requesting_access_to_protected_resources. The app does not use the camera, only the microphone. I cannot find references to the camera in any of the third party libraries I'm using. What are some ways to troubleshoot this beyond looking for "camera" in the few dependencies? For context, this commit allows the app to get through successfully to TestFlight: https://github.com/tsargent/turtle-tuner/commit/67d4a52e62839ad6c2a49848bea9c408d983f17a While this following commit, which reverts the commit, fails on TestFlight with the mentioned camera permission error: https://github.com/tsargent/turtle-tuner/commit/c95b0b16c4e85d77e625d36b816ed53faa826cf5
Posted by twsargent. Last updated.
Post not yet marked as solved
0 Replies
287 Views
I have a question about the Apple Music preview app for Windows 11. It has a setting called Sound Check. Is that feature available on the Apple Music web player and the Apple Music Android app? If not, is that a planned feature for those?
Posted by Jmcinvale. Last updated.
Post not yet marked as solved
1 Reply
2.4k Views
Does anyone have a working example of how to play OGG files with Swift? I've been trying for over a year now. I was able to wrap the C Vorbis library in Swift and used it to parse an OGG file successfully. I then had to use Objective-C++ to fill the PCM, because that method seems to be available only in C++, and that part hangs my app for a good 40 seconds to several minutes depending on the audio file; it then plays for about 2 seconds and crashes. I can't get the examples on the Vorbis site to work in Objective-C, and I tried every example on GitHub I could find (most of which are for iOS; I want to play the files on Mac). I also tried the Cricket Audio framework below. https://github.com/sjmerel/ck It has a Swift example and can play its proprietary soundbank format, and it is also supposed to play OGG, but it simply does nothing when asked to play OGG, as you can see in the posted issue: https://github.com/sjmerel/ck/issues/3 Right now I believe every player that can play OGGs on Mac is written in Objective-C or C++. Anyway, any help or advice is appreciated. The OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGGs through the Mono framework, but I really want to stay in Swift.
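For the playback side, a minimal sketch of feeding decoded PCM into AVAudioEngine from Swift, assuming a hypothetical decodeOgg(url:) helper (for example the Vorbis wrapper mentioned above) that returns interleaved Float32 samples; the point is that once decoding is done, playback needs no Objective-C++:

import AVFoundation

// Hypothetical decoder: returns (interleaved samples, sample rate, channel count).
func decodeOgg(url: URL) throws -> ([Float], Double, AVAudioChannelCount) {
    fatalError("plug in your libvorbis wrapper here")
}

final class OggPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    func play(url: URL) throws {
        let (samples, sampleRate, channels) = try decodeOgg(url: url)

        guard let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                         sampleRate: sampleRate,
                                         channels: channels,
                                         interleaved: false),
              let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(samples.count) / channels)
        else { return }

        // De-interleave the decoded samples into the buffer's channel pointers.
        let frames = Int(buffer.frameCapacity)
        for ch in 0..<Int(channels) {
            let dst = buffer.floatChannelData![ch]
            for frame in 0..<frames {
                dst[frame] = samples[frame * Int(channels) + ch]
            }
        }
        buffer.frameLength = buffer.frameCapacity

        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.scheduleBuffer(buffer, completionHandler: nil)
        player.play()
    }
}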
Posted. Last updated.
Post not yet marked as solved
0 Replies
376 Views
I am working on a design that requires connecting an iOS device to two audio output devices, specifically headphones and a speaker. I want the audio driver to switch the output device without user action. Is this manageable via the iOS SDK?
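A minimal sketch of the route-switching part with AVAudioSession, assuming the goal is to flip between the built-in speaker and the default route (headphones when they are connected); iOS normally routes to a single output at a time, so playing to both devices simultaneously is a separate question:

import AVFoundation

enum PreferredOutput {
    case builtInSpeaker
    case defaultRoute   // headphones or another attached output when available
}

func switchOutput(to output: PreferredOutput) {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetoothA2DP])
        try session.setActive(true)
        switch output {
        case .builtInSpeaker:
            try session.overrideOutputAudioPort(.speaker)   // force the built-in speaker
        case .defaultRoute:
            try session.overrideOutputAudioPort(.none)      // fall back to the system's preferred route
        }
    } catch {
        print("Route change failed: \(error)")
    }
}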
Posted by AnLiu. Last updated.
Post not yet marked as solved
0 Replies
348 Views
How does visionOS play MP4 audio as Spatial Audio through SwiftUI or RealityKit? Note: since I can only test the app in the Simulator, and I want to make sure my Spatial Audio plays correctly in the space, please tell me how to display the location of the Spatial Audio source in the space. Also, how do I delete this view after the test? Thank you!
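A minimal RealityKit sketch, assuming a hypothetical bundled file named sample.mp4; the small sphere is only a visible stand-in for the emitter's position, and both it and the whole view can be removed once testing is done:

import SwiftUI
import RealityKit

struct SpatialAudioTestView: View {
    var body: some View {
        RealityView { content in
            // Emitter entity positioned in front of and slightly above the user.
            let emitter = Entity()
            emitter.position = [0, 1.5, -2]
            emitter.components.set(SpatialAudioComponent())

            // Visible marker so the Simulator shows where the sound comes from.
            let marker = ModelEntity(mesh: .generateSphere(radius: 0.05),
                                     materials: [SimpleMaterial(color: .red, isMetallic: false)])
            emitter.addChild(marker)

            // AudioFileResource reads the audio track of a bundled .mp4.
            if let resource = try? AudioFileResource.load(named: "sample.mp4") {
                emitter.playAudio(resource)
            }
            content.add(emitter)
        }
    }
}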
Posted by lijiaxu. Last updated.
Post not yet marked as solved
1 Reply
415 Views
I have created a recording application, but if the user switches off the device or kills the application, how can I save the ongoing recording at that point?
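A minimal sketch, assuming the recording is done with AVAudioRecorder: stopping the recorder when the app goes to the background or is about to terminate finalizes the file on disk, so whatever was captured up to that point survives. A force-kill or power-off gives no callback, so saving early and often is the practical approach:

import AVFoundation
import UIKit

final class RecordingSaver {
    private let recorder: AVAudioRecorder

    init(recorder: AVAudioRecorder) {
        self.recorder = recorder

        // Finalize the file whenever the app leaves the foreground or is about to exit.
        let center = NotificationCenter.default
        center.addObserver(self, selector: #selector(saveNow),
                           name: UIApplication.didEnterBackgroundNotification, object: nil)
        center.addObserver(self, selector: #selector(saveNow),
                           name: UIApplication.willTerminateNotification, object: nil)
    }

    @objc private func saveNow() {
        if recorder.isRecording {
            recorder.stop()   // writes and closes the file at recorder.url
        }
    }
}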
Posted by Rajesh_S. Last updated.
Post not yet marked as solved
0 Replies
375 Views
Developing for iPhone/iPad/Mac. I have an idea for a music training app, but I need to know of supporting libraries for recognizing a musical note's fundamental frequency in close to real time (about 100 ms delay). Accuracy should be within a few cents (hundredths of a semitone). A search for "music" turned up the Core MIDI library, which is fine if I want to take input from MIDI, but I want to be open to audio input too. I also found MusicKit, which seems to be a programmer's API for digging into Apple Music content rather than audio analysis. Meta questions: Should I be using different search terms? Where are libraries listed? What are the relevant third-party libraries?
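A minimal sketch, assuming microphone input via AVAudioEngine: a roughly 100 ms tap plus naive autocorrelation gives a crude fundamental-frequency estimate; a real tuner would want something like YIN or an FFT-based method (e.g. via Accelerate) to reach few-cent accuracy:

import AVFoundation

// Rough fundamental-frequency estimate of one mono buffer via naive autocorrelation.
func estimatePitch(buffer: AVAudioPCMBuffer, sampleRate: Double) -> Double? {
    guard let samples = buffer.floatChannelData?[0] else { return nil }
    let n = Int(buffer.frameLength)
    // Search lags corresponding to roughly 60 Hz ... 1200 Hz.
    let minLag = Int(sampleRate / 1200)
    let maxLag = Int(sampleRate / 60)
    guard n > maxLag, minLag > 0 else { return nil }

    var bestLag = 0
    var bestScore: Float = 0
    for lag in minLag...maxLag {
        var score: Float = 0
        for i in 0..<(n - lag) { score += samples[i] * samples[i + lag] }
        if score > bestScore { bestScore = score; bestLag = lag }
    }
    return bestLag > 0 ? sampleRate / Double(bestLag) : nil
}

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// 4096 frames at 44.1 kHz is roughly 93 ms of audio per callback.
input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    if let hz = estimatePitch(buffer: buffer, sampleRate: format.sampleRate) {
        print(String(format: "~%.1f Hz", hz))
    }
}

do { try engine.start() } catch { print("Engine failed to start: \(error)") }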
Posted. Last updated.
Post not yet marked as solved
0 Replies
566 Views
Hi, I'm trying to play multiple video/audio files with AVPlayer using AVMutableComposition. Each video/audio file may play simultaneously, so I put each one in an individual track. I use only local files.

let second = CMTime(seconds: 1, preferredTimescale: 1000)
let duration = CMTimeRange(start: .zero, duration: second)
var currentTime = CMTime.zero
for _ in 0...4 {
    let mutableTrack = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid
    )
    try mutableTrack?.insertTimeRange(
        duration,
        of: audioAssetTrack,
        at: currentTime
    )
    currentTime = currentTime + second
}

When I set many audio tracks (maybe more than 5), the first part sounds a little different from the original when it starts; it seems like the front part of the audio is skipped. But when I set only two tracks, AVPlayer plays the same as the original file.

avPlayer.play()

How can I fix it? Why do audio tracks that have nothing to play at the start still affect playback? Please let me know.
Posted by Mandeuk. Last updated.
Post not yet marked as solved
1 Reply
477 Views
I was watching a video when I noticed a periodic popping sound. I thought it was a problem with the video, but the issue persists on all other videos and sound tracks. It gets very annoying! Quite disappointed to find this flaw in a brand-new MacBook Pro.
Posted by IvanPolik. Last updated.
Post not yet marked as solved
0 Replies
495 Views
I'm involved in development on an iOS app for home security and alarm systems. Recently there has been a lot of negative feedback from customers about how quiet notification sounds are since iOS 17. Much of the feedback centers around the inability to control the volume of the notification sounds. My question is: if our app uses custom notification sounds, are these impacted by the volume changes made in iOS 17? I know previous versions of iOS let you control "Ringtone and Alert" volume in Settings (with a volume slider). Is this same control still available for custom notification sounds within our app?
Posted. Last updated.
Post not yet marked as solved
0 Replies
368 Views
I just ripped a CD into my iTunes, and when the CD itself plays, it plays gapless as intended, but once ripped and uploaded to my iTunes and iPhone, there is a gap. I hate it. I know you could previously have gapless playback. Can you PLEASE bring it back? It is a simple fix: update your software and bring back gapless playback, please. People have complained about it before; DO something about it.
Posted by ToniBZE. Last updated.
Post not yet marked as solved
0 Replies
412 Views
Hello, I want to fetch all local music files from an iPhone device. I tried MPMediaQuery, but I can only get files that are in the document folder. If we use UIDocumentPicker we can fetch all files from the iPhone (Downloads, the Files app) after the user selects them. I want to fetch all music files the way UIDocumentPicker does, but without user interaction. Thanks in advance for your guidance.
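For the music-library side, a minimal sketch with MPMediaQuery, assuming media library authorization has been granted; this only covers items in the user's Music library (and assetURL is nil for DRM-protected or cloud-only tracks), so it is not a way to enumerate arbitrary files without user interaction:

import MediaPlayer

// Requires an NSAppleMusicUsageDescription entry in Info.plist.
func fetchLibrarySongs() {
    MPMediaLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        let songs = MPMediaQuery.songs().items ?? []
        for song in songs {
            let title = song.title ?? "Untitled"
            let url = song.assetURL?.absoluteString ?? "no local asset"
            print(title, url)
        }
    }
}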
Posted. Last updated.
Post not yet marked as solved
2 Replies
748 Views
Hello, I started to set up stereo audio recording (both audio and video are recorded), and the audio quality seems to be lower than the quality obtained with the native Camera application (configured for stereo). Using Console to check the logs, I found a difference between the Camera app and mine regarding MXSessionMode (of mediaserverd): the Camera application gives MXSessionMode = SpatialRecording, while mine gives MXSessionMode = VideoRecording. How can I configure my capture session to get MXSessionMode = SpatialRecording? Any suggestions? Best regards
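A minimal sketch of the documented stereo-capture setup on the audio session side, assuming an iPhone whose built-in microphone supports the stereo polar pattern; whether this alone is what moves mediaserverd to the SpatialRecording session mode is something to verify against the Camera app's behavior:

import AVFoundation

func configureStereoCapture() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .videoRecording, options: [])
    try session.setActive(true)

    // Pick the built-in microphone and a data source that supports the stereo polar pattern.
    guard let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) else { return }
    try session.setPreferredInput(builtInMic)

    if let dataSource = builtInMic.dataSources?.first(where: {
        $0.supportedPolarPatterns?.contains(.stereo) == true
    }) {
        try dataSource.setPreferredPolarPattern(.stereo)
        try builtInMic.setPreferredDataSource(dataSource)
        // Match the stereo field to the orientation used while recording.
        try session.setPreferredInputOrientation(.portrait)
    }
}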
Posted by ftristani. Last updated.