Siri and Voice


Help users quickly accomplish tasks related to your app using just their voice.

Posts under Siri and Voice tag

42 Posts
Post not yet marked as solved
5 Replies
3.2k Views
We are currently experiencing a usability issue in our app, and we have observed the same issue on sites in Safari. While using VoiceOver on iOS 13.3+, we've discovered that VoiceOver skips all tables that use a caption. This occurs when a user swipes to read the contents of the page. We also observed that when using the rotor and choosing Tables, VoiceOver does not recognize that there is a table on the page. This has been reproduced by multiple users on different devices. Our testing also covered VoiceOver on macOS Catalina, where it worked as expected for every table we tested. Has anyone else come across this issue?
Posted by
Post not yet marked as solved
1 Reply
906 Views
I was trying out SiriKit Media Intents and found that with iOS 14, Media Intents can be handled in-app instead of in an extension. Given my current project structure, in-app intent handling suits my purpose better than handling it in an extension. My question is about how this intent will be handled in a watchOS app. Is in-app intent handling supported on watchOS as well (and if so, are there any examples I can refer to)? If not, can I create an extension for Media Intents and trigger it on watchOS, while triggering the in-app handling on iOS alone? Please point me to any documentation or reference I may have missed that addresses this.
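For context, on iOS 14 the in-app route works by vending a handler object from the app delegate. A minimal sketch, where MediaIntentHandler is a hypothetical class conforming to INPlayMediaIntentHandling (not from the post above):

```swift
import UIKit
import Intents

// Hypothetical handler; a real one would also implement the
// INPlayMediaIntentHandling resolution methods.
final class MediaIntentHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Start playback here, then report success.
        completion(INPlayMediaIntentResponse(code: .success, userActivity: nil))
    }
}

class AppDelegate: UIResponder, UIApplicationDelegate {
    // iOS 14+: return a handler object to process the intent in-app
    // instead of in an Intents extension.
    func application(_ application: UIApplication,
                     handlerFor intent: INIntent) -> Any? {
        intent is INPlayMediaIntent ? MediaIntentHandler() : nil
    }
}
```

Whether watchOS offers an equivalent entry point is exactly the open question in this post.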
Posted by
Post not yet marked as solved
63 Replies
24k Views
I tried to call my girlfriend using Siri, and there's a bug with that: Siri keeps trying to place the call but fails with every command I give her. Also, while using AirPods, Siri doesn't appear at all. I'm currently on an iPhone 13 Pro.
Posted by
Post not yet marked as solved
4 Replies
2.3k Views
Using the write method from AVSpeechSynthesizer produces the following error:

[AXTTSCommon] TTSPlaybackEnqueueFullAudioQueueBuffer: error -66686 enqueueing buffer

This issue was first seen on iOS 16. More information and a code snippet: https://stackoverflow.com/questions/73716508/play-audio-buffers-generated-by-avspeechsynthesizer-directly
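For readers without the link, the API in question is AVSpeechSynthesizer's buffer-writing path. A minimal sketch of the kind of call that reportedly triggers the error on iOS 16 (the utterance text and buffer handling here are illustrative, not from the linked post):

```swift
import AVFoundation

let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Hello world")

// Instead of speaking aloud, write(_:toBufferCallback:) delivers the
// synthesized audio as buffers; on iOS 16 this path can log
// "error -66686 enqueueing buffer".
synthesizer.write(utterance) { buffer in
    guard let pcmBuffer = buffer as? AVAudioPCMBuffer,
          pcmBuffer.frameLength > 0 else {
        return // an empty buffer signals the end of synthesis
    }
    // Hand pcmBuffer to an AVAudioPlayerNode or write it to a file here.
}
```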
Posted by
Post not yet marked as solved
1 Reply
966 Views
Hi, the "Siri & Search" option is listed in my app's settings and I'd like to know how to remove it. I'm pretty sure it wasn't listed when I first started developing my app over a year ago, and I never added any Siri-related entitlements to my app, so I'm not sure when it appeared. It seems half of the apps on my device (from other developers) have this listed and the other half do not, so it's not there by default. I've checked my entitlements file: nothing related to Siri. I checked capabilities: nothing there related to Siri either. I know my users could go and manually block the app's data from being accessible, but I'd rather remove the option entirely, and I definitely don't want it on by default. It shows up on at least iOS 15 and iOS 16. Thanks! Colin
Posted by
Post not yet marked as solved
1 Reply
1.2k Views
We are developing an app for iOS 16 using App Intents and Siri. On iOS 16.4 devices, we have been testing the behavior of opening another App Intent by passing the opensIntent argument to the result function of an App Intent. We found that the dialog text specified in the result argument of the intent being opened is not displayed. We also found that when invoked from Siri, the intent passed to the opensIntent argument is invoked twice. Are these behaviors bugs? Any ideas? Sample code and logs follow.

import AppIntents

struct PrimaryIntent: AppIntent {
    static var title: LocalizedStringResource = "Primary"

    func perform() async throws -> some OpensIntent {
        print("\(String(describing: Self.self)).\(#function): invoked")
        return .result(opensIntent: SecondaryIntent())
    }
}

struct SecondaryIntent: AppIntent {
    static var title: LocalizedStringResource = "Secondary"

    func perform() async throws -> some ProvidesDialog {
        print("\(String(describing: Self.self)).\(#function): invoked")
        return .result(dialog: .init(stringLiteral: "test"))
    }
}

struct ShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(intent: PrimaryIntent(), phrases: ["\(.applicationName)"])
    }
}

Logs from the Shortcuts app:

PrimaryIntent.perform(): invoked
SecondaryIntent.perform(): invoked
# dialog is not displayed...

Logs from Siri:

PrimaryIntent.perform(): invoked
SecondaryIntent.perform(): invoked
SecondaryIntent.perform(): invoked
# dialog is not displayed...
# SecondaryIntent invoked twice...
Posted by
Post not yet marked as solved
3 Replies
1.3k Views
We have a simple AppIntent to let users ask a question in our app. The intent has a single parameter, Prompt, which is retrieved by a requestValueDialog. Users have reported that when using Siri, the dialog for "What would you like to ask?" appears, but if they respond with phrases such as "What is the last album by Sting", it just presents the dialog for the prompt again. Testing it, I find that I can reproduce the behavior if I include words like "recent" or "last". Just providing those words in isolation causes the dialog to be presented over and over again. Using the same intent from Shortcuts does not seem to have that limitation; it's only when providing the spoken words for speech recognition. All of that interaction happens outside our code, so I can't see any way to debug or identify why the prompts are being rejected. Is there a different way to specify the @Parameter to indicate that prompt: String can include any arbitrary text, including "recent" or "last"? My hunch is those words are triggering a system response that is stepping on the requestValueDialog. Here are the basics of how the intent and parameter are set up:

struct AskAI: AppIntent {
    static var title: LocalizedStringResource = "Ask"
    static var description: IntentDescription = IntentDescription("This will ask the A.I. app")
    static var openAppWhenRun = false

    @Parameter(title: "Prompt",
               description: "The prompt to send",
               requestValueDialog: IntentDialog("What would you like to ask?"))
    var prompt: String

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        var response = ""
        ...
        response = "You asked: \"\(prompt)\" \n"
        ...
        return .result(dialog: "\(response)")
    }

    static var parameterSummary: some ParameterSummary {
        Summary("Ask \(\.$prompt)")
    }
}
Posted by
Post not yet marked as solved
0 Replies
640 Views
Hi there, after a crash, the log contained the following events that look, to my newbie eyes, like a call to Siri for my configuration files. Searching for other mentions of this crash log message brought me here to the developer forums and turned up not only what appears to be a possible call to Siri, but also a possible call for a specific message to be "dictated" to Siri. I don't know what this means, but my *** radar is going off. Original crash log entries (Logic Pro X crashing):

"legacyInfo" : {
  "threadTriggered" : {
    "queue" : "com.apple.main-thread"
  }
},
"logWritingSignature" : "76f7997036f9874763e329de45d64ba64b5b5964",
"trialInfo" : {
  "rollouts" : [
    {
      "rolloutId" : "62699e1ec1ff297(***)",
      "factorPackIds" : {
        "SIRI_FIND_MY_CONFIGURATION_FILES" : "631f72d1de5591(***)"
      },
      "deploymentId" : 240000023
    },
    {
      "rolloutId" : "6391cacc75b072(***)",
      "factorPackIds" : {
        "COREOS_ICD" : "63957ec73127f(***)"
      },
      "deploymentId" : 240000007
    }
  ],
  "experiments" : [ ]
}

In the log above, please note "SIRI_FIND_MY_CONFIGURATION_FILES". In another error log posted in this Apple forum (https://developer.apple.com/forums/thread/706908), SIRI_FIND_MY_CONFIGURATION_FILES appears as well, followed by a call to "SIRI_DICTATION_ASSETS". Excuse me, wut?
Posted by
Post not yet marked as solved
3 Replies
1.2k Views
Hi all, I'm trying to update my app to use the AppIntents framework to play an audio file (loaded from the main bundle). I tried implementing a PlayMusicIntent using the AudioStartingIntent protocol, but have had no luck. The intent does run, and provides the dialog response, but the music doesn't start.

import AppIntents
import AVFoundation

struct PlayMusicIntent: AudioStartingIntent {
    static let title: LocalizedStringResource = "Play Music"

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        guard let musicPath = Bundle.main.url(forResource: "moonglow", withExtension: "mp3") else {
            fatalError("Could not load music file")
        }
        do {
            let musicPlayer = try AVAudioPlayer(contentsOf: musicPath)
            musicPlayer.numberOfLoops = -1
            musicPlayer.play()
            return .result(dialog: "Now Playing Moonglow")
        } catch {
            print(error)
            return .result(dialog: "There was an error playing the music: \(error.localizedDescription)")
        }
    }
}

And then I add this in my App Shortcuts provider:

AppShortcut(intent: PlayMusicIntent(), phrases: [
    "Play music with \(.applicationName)"
])

I do have the Audio background mode enabled. I'm thinking I need to do something with the intent's return type, but there is not a lot of documentation or many online examples implementing AudioStartingIntent. Any suggestions would be appreciated. Thanks, Scott
Posted by
Post not yet marked as solved
1 Reply
596 Views
Hi, I have an issue: all our apartment homes run on a custom insecure cert (which we cannot change), and customers want to create Siri commands to automate the AC with the Shortcuts app. It seems that if a customer tries to run a shortcut that makes a POST request with headers and a body to an insecure URL, they get the error message "There was a problem running the shortcut." Does the Apple Shortcuts app not allow POST requests to insecure URLs? Is there a way to get around it? Regards, Robert
Posted by
Post not yet marked as solved
2 Replies
828 Views
I want to add Shortcuts and Siri support using the new AppIntents framework. Running my intent from Shortcuts or from Spotlight works fine, as the touch-based UI for the disambiguation is shown. However, when I ask Siri to perform this action, she gets into a loop, repeatedly asking me the question to set the parameter. My AppIntent is implemented as follows:

struct StartSessionIntent: AppIntent {
    static var title: LocalizedStringResource = "start_recording"

    @Parameter(title: "activity", requestValueDialog: IntentDialog("which_activity"))
    var activity: ActivityEntity

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        let activityToSelect: ActivityEntity = self.activity
        guard let selectedActivity = Activity[activityToSelect.name] else {
            return .result(dialog: "activity_not_found")
        }
        ...
        return .result(dialog: "recording_started \(selectedActivity.name.localized())")
    }
}

The ActivityEntity is implemented like this:

struct ActivityEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "activity")
    typealias DefaultQuery = MobilityActivityQuery
    static var defaultQuery: MobilityActivityQuery = MobilityActivityQuery()

    var id: String
    var name: String
    var icon: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(self.name.localized())", image: .init(systemName: self.icon))
    }
}

struct MobilityActivityQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [ActivityEntity] {
        Activity.all()?.compactMap({ activity in
            identifiers.contains(where: { $0 == activity.name })
                ? ActivityEntity(id: activity.name, name: activity.name, icon: activity.icon)
                : nil
        }) ?? []
    }

    func suggestedEntities() async throws -> [ActivityEntity] {
        Activity.all()?.compactMap({ activity in
            ActivityEntity(id: activity.name, name: activity.name, icon: activity.icon)
        }) ?? []
    }
}

Does anyone have an idea what might be causing this and how I can fix this behavior? Thanks in advance
Posted by
Post not yet marked as solved
0 Replies
584 Views
I have an app intent like the one below. If the intent is run via Siri when the phone is locked, the user is asked to unlock their phone, but the shortcut is interrupted. Is it possible to delay opening the app until the result dialog is returned?

import AppIntents

struct MyIntent: AppIntent {
    static var title: LocalizedStringResource = "MyApp"
    static var description = IntentDescription("Captures data")
    static var openAppWhenRun: Bool = true

    @Parameter(title: "The Data")
    var theData: String

    @MainActor
    func perform() async throws -> some IntentResult {
        _ = try await RESTClient.postData(theData: theData)
        return .result(dialog: "Thank you!")
        // TODO: Now try to open app
    }
}
Posted by
Post not yet marked as solved
1 Reply
633 Views
I have a Driving Task CarPlay entitlement. While connected to CarPlay, I can ask Siri to open any of the apps on the CarPlay console. Granted, I suspect none of the apps I have fall under the Driving Task entitlement: Audible, Audiobooks, and Music would likely be Audio apps; WhatsApp and Telegram would be Communication; Waze and Maps would be Navigation. I have SiriKit intents to do various tasks in the app. When I use Siri and issue those commands, they all work, regardless of whether I'm connected to CarPlay or not. What baffles me is that I can't ask Siri to simply open my app on the CarPlay console. She keeps responding with "Sorry, I can't do that while you're driving." I've read through the CarPlay App Programming Guide. The guide mentions that all voice interaction must be handled using SiriKit (with the exception of CarPlay navigation apps). Under additional guidelines for communication apps, it mentions that your app must support VoIP calling features via specific SiriKit intents. Under additional guidelines for navigation apps, it mentions that voice control must be limited to navigation features. I don't see any special mention of voice control related to Driving Task apps. When I ask Siri to open my app on the iPhone, it just happens; I didn't have to do anything specific to get this functionality. Why can't I open my app with Siri on the CarPlay console?
Posted by
Post not yet marked as solved
1 Reply
524 Views
I have a few custom intents defined in an intents definition file. One of these intents allows the user to capture an expense. Naturally, one of the parameters for the expense is the amount charged. When I initially created these intents in November 2022, this intent worked brilliantly. The user specified the amount in the way you would usually communicate a currency amount, by including words like "dollar" or "cents", and all worked well. About a month ago, a user informed me about a problem with the intent. All parameter values resolved except for the expense amount parameter. No matter what you provided for the expense amount, Siri would reply with "expense amount cannot be negative". Investigating the issue confirmed the problem. I'm unable to determine exactly when this started happening, as most of our users don't use the Siri integration and not all users inform us about issues they find. I implemented a temporary workaround and finally have the time to resolve this problem. I added a breakpoint to the intent handler's "resolveParameterExpenseAmount" method to see where the breakdown in communication was. At first, the breakpoint was never hit; Siri would still reply with "expense amount cannot be negative". I realised the validation errors defined in the parameter's settings were catching the value before I was asked to resolve it in the intent handler. So I removed all of the validation errors (you can't remove them with the minus sign; I just hit backspace on each one). This time, the breakpoint was hit, but the expense amount was nil. To make sure I wasn't messing something up with this parameter being part of a whole bunch of other parameters, I decided to create a simple Test intent.
(Screenshots omitted: the properties of the custom intent; its single parameter's configuration; the section under the Shortcuts app; the section under Suggestions; the response section, which I kept very basic; and the intent handler for this test intent.)
This is a very basic test. Once again, after invoking the intent, Siri asks me for the test amount. Upon giving it to her, the reply: "Test Amount can't be negative". To change things up a little regarding the negative-number validation, I changed the minimum value to -1, allowing the value to be negative. This time, no matter the amount I specify (including amounts like "one" and "zero"), the standard reply is: "Test Amount can't be higher than one hundred thousand". The one thing I'm uncertain about is the "Currency Codes" section in the parameter settings: I don't know if I have to add the currencies the user can possibly talk about there. I can't find any proper documentation on the parameter settings specifically related to Currency Amount. For now, the workaround remains in place: I changed my expense amount to type Decimal. The drawback is that Siri cannot infer $542.62 from "five hundred and forty-two dollars and sixty-two cents". I had to change the prompt to: "What is the expense amount? Specify only the numeric value and do not include currency metrics like dollar and cents." Really? That's just ridiculous, especially since the Currency Amount parameter worked absolutely fine a few months ago. Are my settings wrong in the definition file? Is there an issue with SiriKit's validation of currency amounts? Must I specify currency codes somewhere? I'm not sure what else to try. Any advice will be greatly appreciated.
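For reference, once the definition file's validation errors are cleared, a custom-intent handler can take over amount validation in code. A minimal sketch, assuming a generated intent class named TestIntent with a Currency Amount parameter testAmount (both names hypothetical, mirroring the shape of Xcode's generated code):

```swift
import Intents

// Hypothetical resolution method for a Currency Amount parameter
// named "testAmount" on a custom intent "TestIntent".
func resolveTestAmount(for intent: TestIntent,
                       with completion: @escaping (INCurrencyAmountResolutionResult) -> Void) {
    guard let amount = intent.testAmount,
          let value = amount.amount else {
        // Siri did not deliver a value (the nil case described above); ask again.
        completion(.needsValue())
        return
    }
    if value.doubleValue < 0 {
        // Reject negative amounts in code rather than via
        // definition-file validation errors.
        completion(.unsupported())
        return
    }
    completion(.success(with: amount))
}
```

This only relocates the validation; it does not explain why Siri hands the handler a nil INCurrencyAmount in the first place, which is the unresolved issue in this post.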
Posted by
Post not yet marked as solved
2 Replies
1.2k Views
Hello there, I recently updated both my iPhone and HomePod to the latest iOS 17 Public Beta and have been encountering an issue with Siri activation. In particular, I am unable to activate Siri just by saying "Siri". Specifically, I have adjusted the settings on my HomePod to allow activation via either "Siri" or "Hey Siri" (in my case, "Oye Siri", since I'm using Siri in Spanish). After saving these settings and rebooting all of my devices, I have found that Siri will not activate unless I use the full "Hey Siri" prompt ("Oye Siri"). I would expect that, following the update and the adjustment to these settings, I should be able to activate Siri just by saying just "Siri" but this does not appear to be the case. I would appreciate any assistance with rectifying this issue. To summarize: I've updated to iOS 17 Public Beta on all of my Apple devices. The "Activate via "Siri" or "Hey Siri"" option is set to "Yes". Even after saving and restarting, Siri only activates with "Hey Siri" ("Oye Siri"). If anyone has encountered similar issues or has any advice or potential solutions, I would greatly appreciate your input. Also, if you feel I should be reporting this as a possible bug in the beta software, please let me know. Thank you all for your time and help.
Posted by
Post not yet marked as solved
0 Replies
459 Views
I have "upgraded" from INStartAudioCallIntent to INStartCallIntent because of deprecation warnings about INStartAudioCallIntent. However, with INStartCallIntent, I need to UNLOCK the device when it proceeds to make the VoIP call. This defeats the purpose of using Siri, where you may be driving and do not want to touch the iPhone in order to make a call. Is there any way this can be avoided? Is there a property list key or phone setting that allows the call to proceed without unlocking? With INStartAudioCallIntent I did not need to unlock the device, so this is a step backwards for users.
Posted by
Post not yet marked as solved
1 Reply
589 Views
Hey there, I implemented Siri and CarPlay. INStartCallIntent works on iOS, but not when initiating a voice command via CarPlay. The error from INIntentDeliverer is: "Unable to find implementation of resolution method for facade slot name (null)". From what I can see, I implemented all methods declared on INStartCallIntentHandling, but none is called. Does someone know what's missing?

2023-08-29 11:34:52.551834+0200 MyApp[64559:4844776] [Intents] -[INIntentDeliverer _resolveIntentParameter:forIntent:intentHandler:updateIntent:withCompletion:]_block_invoke Unable to find implementation of resolution method for facade slot name (null) on intent <INStartCallIntent: 0x282a71830> {
Posted by
Post not yet marked as solved
1 Reply
459 Views
Good morning, my company needs to develop a screen reader (more or less like VoiceOver) with added custom features for blind people. I wanted to know whether there is any possibility of implementing a third-party screen reader on iOS, or if someone could suggest a workaround for it. Thank you
Posted by