Handle requests for your app’s services from users using Siri or Maps.

SiriKit Documentation

Posts under SiriKit tag

42 Posts
Post not yet marked as solved
0 Replies
94 Views
Problem Statement: The Siri intents for the "Next", "Previous", and "Repeat" commands do not work as expected with the Speech framework.

Steps to Reproduce: Open the app, tap the Siri button to activate voice input, say "Next" to trigger the intended action, and observe that the action is not executed correctly.

In our demo app the flow is:
1. Open Siri and speak "Check In". In response, Siri opens a dialog: "What does the user want? 1) One 2) Next 3) Yes 4) Goodbye".
2. Speak "Next". In response, Siri repeats the same dialog (step 2).
3. Speak "Yes", "One", or "Goodbye". In response, Siri moves on to the next dialog.

Expected Behavior: The "Next" value should be delivered to the SiriKit intent or App Intent.

Actual Behavior: The SiriKit intent receives the keyword from the previous user input instead, and the App Intent repeats the dialog recursively.

Device, region, and language: iPhone 11, iOS 17.4.1, region US, language English (US).

Impact: Users cannot use an iterative dialog within one context.

Additional: How the different commands behave in the SiriKit intent and the App Intent on different devices (in no particular order):

| No | Device and scenario | SiriKit intent | App Intent |
| --- | --- | --- | --- |
| 1 | ISG iPhone 11 - Next | No | No |
| 2 | ISG iPhone 11 - Yes | No | Yes (but using an enum) |
| 3 | ISG iPhone 11 - GoodBye | No | Yes (but using an enum) |
| 4 | ISG iPhone 11 - One | Yes | Yes |
| 5 | iPad - Next | No | No |
| 6 | iPad - One | Yes | Yes |
| 7 | iPad - GoodBye | No | Yes |
| 8 | iPad - Yes | No | Yes |
| 9 | Simulator - iPhone 15 - Next, Yes, One, GoodBye | Yes | Yes |

Please help me with this.
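For reference, a minimal App Intents sketch of modeling dialog options like those above as an enum parameter and re-prompting with needsValueError. The type names DialogOption and CheckInIntent are hypothetical, and this only illustrates the parameter-resolution API; it is not a confirmed fix for the "Next" behaviour described in the post:

```swift
import AppIntents

// Hypothetical types for illustration only; not the poster's project.
enum DialogOption: String, AppEnum {
    case one, next, yes, goodbye

    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Dialog Option"
    static var caseDisplayRepresentations: [DialogOption: DisplayRepresentation] = [
        .one: DisplayRepresentation(title: "One"),
        .next: DisplayRepresentation(title: "Next"),
        .yes: DisplayRepresentation(title: "Yes"),
        .goodbye: DisplayRepresentation(title: "Goodbye")
    ]
}

struct CheckInIntent: AppIntent {
    static var title: LocalizedStringResource = "Check In"

    @Parameter(title: "Option")
    var option: DialogOption?

    func perform() async throws -> some IntentResult & ProvidesDialog {
        guard let option else {
            // Re-prompt so the spoken answer is resolved against the enum cases.
            throw $option.needsValueError("What do you want? One, Next, Yes, or Goodbye?")
        }
        return .result(dialog: "You said \(option.rawValue).")
    }
}
```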
Posted
by Infosense.
Last updated
.
Post not yet marked as solved
0 Replies
160 Views
I added a custom intents extension to my project. All seems to be correctly implemented:
- Siri appears in the target's Signing & Capabilities
- NSUserActivityTypes in Info.plist specifies the custom intent
- Siri is specified in the project entitlements
- NSExtensionPrincipalClass in the intent's Info.plist specifies the intent class

The app asks permission to use Siri, and I confirm permission. The app implements an "Add to Siri" button, and I add the shortcut. If I start the shortcut manually, I'm able to perform all the provided actions. If I start the intent by selecting the app executable on the simulator, I'm able to perform all the actions using Siri with my voice. But... if I start the intent by selecting the app executable on the device, when I call Siri it responds "hasn't added support for that with Siri". If I try to call the shortcut directly with Siri on the device, Siri responds "hasn't added support for that with Siri". I googled on the Internet, but I'm still not able to understand what is going wrong. Are there some settings that I'm forgetting? Any help would be appreciated. Thank you!
Posted Last updated
.
Post not yet marked as solved
0 Replies
192 Views
Hello, we're interested in using the PTT framework with our PTT-capable hardware, as the framework intends. The problem is that activating Siri with any of our specified intents doesn't work when the phone is locked. The iPhone always says "You'll have to unlock your iPhone first". Reading up on the problem, it seems pretty common: Apple doesn't allow Siri intents to be executed while the phone is locked. It's a sensible precaution by default, but there are countless threads describing real use cases where users want to use Siri without unlocking (with PTT, or without). There appear to be no options in PTT to enable this, no flags on the Siri intent to allow benign app actions or queries, and no user-facing configuration through Settings -> Siri & Search to manually allow it even when the phone is locked. Neither are there any entitlements (that I'm aware of) that would allow trivial and non-secure Siri App Intents. The only advice we have for our users (albeit against the intention of the limitation in the first place) is to disable Auto-Lock, disable Face ID, and disable the passcode. It is, in fact, 2024, and users do expect a better experience than this with Siri. Or am I missing something?
Posted Last updated
.
Post not yet marked as solved
0 Replies
230 Views
I want to make an Apple Watch app with SiriKit, so I integrated the Siri intents and code in Xcode, but I hit the following build error while debugging the intents. Please advise me.

Failed to install the app on the device.
Domain: com.apple.dt.CoreDeviceError
Code: 3002
User Info: { DVTErrorCreationDateKey = "2024-03-11 02:38:02 +0000"; IDERunOperationFailingWorker = IDEInstallCoreDeviceWorker; NSURL = "file:///Users/cion/Library/Developer/Xcode/DerivedData/Hydrator-dqlkxgxzgyjnzwhgguipzumyovfi/Build/Products/Debug-iphoneos/Hydrator.app/"; }

"Hydrator" cannot be installed
Domain: IXUserPresentableErrorDomain
Code: 6
Failure Reason: This app was not created for this device.
Recovery Suggestion: This app was not built to support this device family; app is compatible with ( 1, 2 ) but this device supports ( 4 )

This app was not built to support this device family; app is compatible with ( 1, 2 ) but this device supports ( 4 )
Domain: MIInstallerErrorDomain
Code: 10
User Info: { FunctionName = MIIsApplicableToCurrentDeviceFamilyWithError; LegacyErrorString = DeviceFamilyNotSupported; SourceFileLine = 86; }

Event Metadata: com.apple.dt.IDERunOperationWorkerFinished : { "device_isCoreDevice" = 1; "device_isWireless" = 1; "device_model" = "Watch5,2"; "device_osBuild" = "10.3.1 (21S651)"; "device_platform" = "com.apple.platform.watchos"; "dvt_coredevice_version" = "355.7.7"; "dvt_mobiledevice_version" = "1643.60.2"; "launchSession_schemeCommand" = Run; "launchSession_state" = 1; "launchSession_targetArch" = "arm64_32"; "operation_duration_ms" = 4026; "operation_errorCode" = 6; "operation_errorDomain" = "com.apple.dt.CoreDeviceError.3002.IXUserPresentableErrorDomain"; "operation_errorWorker" = IDEInstallCoreDeviceWorker; "operation_name" = IDERunOperationWorkerGroup; "param_debugger_attachToExtensions" = 1; "param_debugger_attachToXPC" = 1; "param_debugger_type" = 1; "param_destination_isProxy" = 0; "param_destination_platform" = "com.apple.platform.watchos"; "param_diag_MainThreadChecker_stopOnIssue" = 0; "param_diag_MallocStackLogging_enableDuringAttach" = 0; "param_diag_MallocStackLogging_enableForXPC" = 1; "param_diag_allowLocationSimulation" = 1; "param_diag_checker_tpc_enable" = 1; "param_diag_gpu_frameCapture_enable" = 0; "param_diag_gpu_shaderValidation_enable" = 0; "param_diag_gpu_validation_enable" = 0; "param_diag_memoryGraphOnResourceException" = 0; "param_diag_queueDebugging_enable" = 1; "param_diag_runtimeProfile_generate" = 0; "param_diag_sanitizer_asan_enable" = 0; "param_diag_sanitizer_tsan_enable" = 0; "param_diag_sanitizer_tsan_stopOnIssue" = 0; "param_diag_sanitizer_ubsan_stopOnIssue" = 0; "param_diag_showNonLocalizedStrings" = 0; "param_diag_viewDebugging_enabled" = 1; "param_diag_viewDebugging_insertDylibOnLaunch" = 1; "param_install_style" = 0; "param_launcher_UID" = 2; "param_launcher_allowDeviceSensorReplayData" = 0; "param_launcher_kind" = 0; "param_launcher_style" = 0; "param_launcher_substyle" = 2; "param_runnable_appExtensionHostRunMode" = 0; "param_runnable_productType" = "com.apple.product-type.app-extension"; "param_structuredConsoleMode" = 1; "param_testing_launchedForTesting" = 0; "param_testing_suppressSimulatorApp" = 0; "param_testing_usingCLI" = 0; "sdk_canonicalName" = "watchos10.2"; "sdk_osVersion" = "10.2"; "sdk_variant" = watchos; }

System Information
macOS Version 14.3.1 (Build 23D60)
Xcode 15.2 (22503) (Build 15C500b)
Timestamp: 2024-03-11T11:38:02+09:00
Posted Last updated
.
Post not yet marked as solved
0 Replies
214 Views
So I'm working on a logging app that uses Siri to log diaper changes for babies. There are 3 types of diaper changes: wet, dirty, and both. I created an enum for these values in the intent definition file and made it configurable and resolvable. In the resolve function, I added this code:

public func resolveDiaperType(for intent: DiaperIntentIntent, with completion: @escaping (DiaperTypeResolutionResult) -> Void) {
    let needsValue = intent.diaperType == .unknown
    if needsValue {
        completion(.needsValue())
    } else {
        completion(.success(with: intent.diaperType))
    }
}

But as soon as .needsValue() is called, the UI asks the user to select one value and then the app crashes. I tried removing a lot of different params and code blocks; needsValue is the only thing that crashes for me. If I make the default diaperType parameter .dirty instead of .unknown, it works. Basically it won't let me work with an empty enum parameter. I get a SIGABRT error and the app crashes. I have 4 intents, and 3 of them use enums. All 3 crash on the enum input UI, and all 3 work correctly when the enum is given a value instead of .unknown. The problem is, I NEED to ask the user for the type. If I give it a default value and resolve it with .needsValue(), it still crashes, so I cannot ask the user for a value. I have made Siri intents with enum inputs before, and those intents STILL WORK; they were just made with older Xcode versions. Is this an Xcode bug? Testing on the iOS 17.2 simulator with Xcode 15.2.
Posted Last updated
.
Post not yet marked as solved
0 Replies
268 Views
I have an intent definition file for a Custom Intent that I want to convert to an AppIntent. The Custom Intent has the checkbox "Configurable in Shortcuts" unchecked, and therefore the "Convert to App Intent" button is greyed out. I can, however, still do a conversion using the menu item "Editor" -> "Convert to App Intent". The intent has a number of parameters that are not configurable but were set in code. This way it was possible to donate shortcuts with the parameters (and even the title) set in code. The automatic conversion using the menu item, however, produces a result that does not match the legacy Custom Intent (parameters appear in the Shortcuts app, etc.). I also did not find any way to create AppIntents that have parameters that can be set in code before the intent is donated. I would leave the old legacy Custom Intents as they are, but as soon as I make use of any of the new iOS 16 Shortcuts features (App Shortcuts), the existing donated Custom Intents disappear from the Shortcuts app. Given the apparent inability to convert them into AppIntents due to the missing code-set parameters, I would be happy for any advice on potential solutions.
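Not an answer to the conversion question, but for comparison, a minimal sketch (assuming iOS 16+ App Intents) of donating an App Intent whose parameter is set in code before donation. LogVisitIntent and its "place" parameter are hypothetical stand-ins, not types from the original project:

```swift
import AppIntents

// Hypothetical intent; "place" stands in for the code-set parameters of the legacy custom intent.
struct LogVisitIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Visit"

    @Parameter(title: "Place")
    var place: String

    func perform() async throws -> some IntentResult {
        // ... perform the real work here ...
        return .result()
    }
}

// Donate the intent with its parameter filled in from code, similar to the legacy donation flow.
func donateVisit(place: String) async {
    var intent = LogVisitIntent()
    intent.place = place
    do {
        _ = try await IntentDonationManager.shared.donate(intent: intent)
    } catch {
        print("Donation failed: \(error)")
    }
}
```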
Posted
by hhtouch.
Last updated
.
Post not yet marked as solved
1 Reply
198 Views
guard let fileURL = intent.attachments?.first?.audioMessageFile?.fileURL else {
    print("Couldn't get fileNameWithExtension from intent.attachments?.first?.audioMessageFile?.fileURL?.lastPathComponent")
    return failureResponse
}
defer {
    fileURL.stopAccessingSecurityScopedResource()
}
let fileURLAccess = fileURL.startAccessingSecurityScopedResource()
print("FileURL: \(fileURLAccess)")
let tempDirectory = FileManager.default.temporaryDirectory
let tempFileURL = tempDirectory.appendingPathComponent(UUID().uuidString + "_" + fileURL.lastPathComponent)
do {
    // Check if the file exists at the provided URL
    guard FileManager.default.fileExists(atPath: fileURL.path) else {
        print("Audio file does not exist at \(fileURL)")
        return failureResponse
    }
    fileURL.stopAccessingSecurityScopedResource()
    // Check if the temporary file already exists and remove it if necessary
    if FileManager.default.fileExists(atPath: tempFileURL.path) {
        try FileManager.default.removeItem(at: tempFileURL)
        print("Removed existing temporary file at \(tempFileURL)")
    }
    // Copy the audio file to the temporary directory
    try FileManager.default.copyItem(at: fileURL, to: tempFileURL)
    print("Successfully copied audio file from \(fileURL) to \(tempFileURL)")
    // Update your response based on the successful upload
    // ...
} catch {
    // Handle any errors that occur during file operations
    print("Error handling audio file: \(error.localizedDescription)")
    return failureResponse
}
guard let audioData = try? Data(contentsOf: tempFileURL), !audioData.isEmpty else {
    print("Couldn't get audioData from intent.attachments?.first?.audioMessageFile?.data")
    return failureResponse
}

Error:
FileURL: false
Audio file does not exist at file:///var/mobile/tmp/SiriMessages/BD57CB69-1E75-4429-8991-095CB90959A9.caf

Is there something I'm missing?
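One thing that stands out in the snippet above is that stopAccessingSecurityScopedResource is called both in the defer block and again before the copy, and that startAccessingSecurityScopedResource returns false. It may not be the cause of this particular failure, but a minimal sketch of the usual ordering looks like this; copyAudioFile(from:) is a made-up helper name:

```swift
import Foundation

// Pattern: start access once, keep it until every file operation on the URL has
// finished, then stop it exactly once (and only if it actually started).
func copyAudioFile(from fileURL: URL) throws -> URL {
    let didStartAccess = fileURL.startAccessingSecurityScopedResource()
    defer {
        if didStartAccess {
            fileURL.stopAccessingSecurityScopedResource()
        }
    }

    let tempFileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString + "_" + fileURL.lastPathComponent)

    if FileManager.default.fileExists(atPath: tempFileURL.path) {
        try FileManager.default.removeItem(at: tempFileURL)
    }
    try FileManager.default.copyItem(at: fileURL, to: tempFileURL)
    return tempFileURL
}
```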
Posted Last updated
.
Post not yet marked as solved
0 Replies
282 Views
When I use the wake-up word to invoke my Siri Shortcut extension, the IntentHandler is automatically released by the system once the request exceeds 10 seconds. Is it true that a Siri Shortcut extension cannot run for longer than 10 seconds? Is there any way to extend the timeout period?
Posted
by tlyf0824.
Last updated
.
Post not yet marked as solved
5 Replies
1.9k Views
Hi, according to this WWDC session https://developer.apple.com/wwdc22/10170: "App Shortcuts are defined in Swift code, by implementing the AppShortcutsProvider protocol. To implement the protocol, I'll simply create a single getter that returns all the app shortcuts I want to set up for the user. Note that in total, your app can have a maximum of 10 app shortcuts. However, most apps only need a few." So there is a limit of up to 10 AppShortcuts. Could you please clarify how that limit is handled? 🤔 (e.g. the project fails to build / the app crashes or malfunctions / only 10 shortcuts are handled, chosen randomly or in order by iOS) I suppose there is some way to manage the number of shortcuts, but I see no details in the documentation yet.
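For context, this is the shape of the provider the session describes; a minimal sketch with a single hypothetical intent (OpenFavoritesIntent is made up so the snippet compiles on its own), where the 10-shortcut limit applies to the total number of AppShortcut values returned by appShortcuts:

```swift
import AppIntents

// Made-up example intent so the sketch is self-contained.
struct OpenFavoritesIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Favorites"
    func perform() async throws -> some IntentResult { .result() }
}

struct MyAppShortcuts: AppShortcutsProvider {
    // The single static getter the session describes; keep the total count at 10 or fewer.
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenFavoritesIntent(),
            phrases: ["Open my favorites in \(.applicationName)"]
        )
    }
}
```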
Posted
by vk_arc.
Last updated
.
Post not yet marked as solved
0 Replies
323 Views
My iPhone application has been using INSendPaymentIntent since 2017. It has worked perfectly so far, but with iOS 17 it can no longer trigger the application after the user gives the amount and the person to send money to. I could not see any update to the intent. Has anyone faced the same problem?
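For reference, a minimal sketch of the hand-off pattern this flow typically relies on, where the extension asks the system to launch the app instead of completing the payment itself. This only illustrates the API shape and is not a fix for the iOS 17 behaviour; the activity type string is hypothetical, and the response code assumes the standard requiring-app-launch case for this intent:

```swift
import Intents

// Sketch only: hand the payment off to the app via a user activity.
class SendPaymentHandler: NSObject, INSendPaymentIntentHandling {
    func handle(intent: INSendPaymentIntent,
                completion: @escaping (INSendPaymentIntentResponse) -> Void) {
        // "com.example.sendPayment" is a hypothetical activity type; the app would
        // pick this up in application(_:continue:restorationHandler:).
        let activity = NSUserActivity(activityType: "com.example.sendPayment")
        completion(INSendPaymentIntentResponse(code: .failureRequiringAppLaunch,
                                               userActivity: activity))
    }
}
```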
Posted
by Onder.
Last updated
.
Post not yet marked as solved
0 Replies
406 Views
Project info: a music player iOS app with a watchOS app embedded.
Project structure:
- app target: music
- intent extension: intent (for the iOS platform)
- watchOS app target: watchKit
- watchOS extension: watchKit Extension

The iOS app uses the intent extension to support SiriKit with the play media intent, and it works very well. Now I want to support Siri in the watchOS app, but I don't know how. I have tried to add a new watch extension target, but it doesn't work; it keeps saying "my app doesn't support *** instruction". Please share if I have missed some documentation or reference that solves this problem.
Posted
by Highmore.
Last updated
.
Post not yet marked as solved
0 Replies
359 Views
Hello, let me introduce myself: my name is Maxime, and I would like to create artificial intelligence for my research and development needs. I want to use Siri applications with data from a scientific portal established by my university. The APIs are public; however, access to the portal and the network is reserved for university researchers. How can you help me?
Posted Last updated
.
Post not yet marked as solved
1 Reply
371 Views
INSendMessageIntent has no recipient when replying to a message provided by an INSearchForMessagesIntentHandling provider. A user would expect that if Siri has just read them a message from an app implementing INSearchForMessagesIntentHandling, they would be able to reply directly without having to look up the recipient. When handling INSearchForMessagesIntentHandling I find the messages in my local DB and create INMessage objects that have INPerson objects embedded in them. We have our own internal contacts, so I fill out the INPerson object as follows:

INPerson(
    personHandle: INPersonHandle(value: "Name", type: .unknown),
    nameComponents: nil,
    displayName: "Name",
    image: nil,
    contactIdentifier: nil,
    customIdentifier: "localContactIdentifier"
)

After reading every conversation Siri asks "Would you like to reply?", and if the user answers in the affirmative, Siri always answers "To who?" because my INSendMessageIntentHandling.resolveRecipients never gets any recipients. I have attempted to donate all of my contacts using INVocabulary.shared().setVocabulary, but that didn't help.
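For reference, a minimal sketch of the resolveRecipients step the post refers to, with the local contact lookup faked by a hard-coded match so the snippet is self-contained. In the failing case described above the intent arrives with no recipients at all, so the needsValue branch is the one that runs:

```swift
import Intents

final class MessagesIntentHandler: NSObject, INSendMessageIntentHandling {

    // Stand-in for a real lookup in the app's own contact database.
    private func localMatch(for person: INPerson) -> INPerson? {
        guard person.displayName == "Name" else { return nil }
        return INPerson(personHandle: INPersonHandle(value: "Name", type: .unknown),
                        nameComponents: nil,
                        displayName: "Name",
                        image: nil,
                        contactIdentifier: nil,
                        customIdentifier: "localContactIdentifier")
    }

    func resolveRecipients(for intent: INSendMessageIntent,
                           with completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            // The branch the post describes: Siri hands over no recipients at all.
            completion([.needsValue()])
            return
        }
        completion(recipients.map { person -> INSendMessageRecipientResolutionResult in
            if let match = localMatch(for: person) {
                return .success(with: match)
            }
            return .unsupported(forReason: .noValidHandle)
        })
    }

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```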
Posted Last updated
.
Post not yet marked as solved
0 Replies
394 Views
I'm developing a CarPlay interface for a messaging application but couldn't find how the root CPTemplate, a grid template with buttons in my case, could activate SiriKit to let the user choose between several actions, as you can see in WhatsApp running on CarPlay. There is CPVoiceControlTemplate, which seems to do the job, but it is only allowed for the navigation app category, not for messaging and VoIP. Currently my app can activate Siri to compose a message to a selected contact represented by a CPMessageListItem in a CPListTemplate, but I couldn't find how to code a CPGridTemplate that activates Siri...
Posted
by VladX06.
Last updated
.
Post not yet marked as solved
0 Replies
471 Views
I've gotten the following error messages a few times; does anyone know anything about them? I currently have a WidgetExtension and suspect that it is the cause.

Your delivery was successful, but you may wish to correct the following issues in your next delivery:

ITMS-90626: Invalid Siri Support - No example phrase was provided for INSearchForMessagesIntent in the 'en' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSetMessageAttributeIntent in the 'ko' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSetMessageAttributeIntent in the 'en' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSendMessageIntent in the 'ko' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSearchForMessagesIntent in the 'ko' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSendMessageIntent in the 'en' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'

After you've corrected the issues, you can upload a new binary to App Store Connect. Best regards,
Posted Last updated
.
Post not yet marked as solved
6 Replies
5.7k Views
Issue Summary
Hi all, I'm working on an Intents Extension for my app; however, when I try to run an intent, Xcode pops up the following error:
Could not attach to pid: "965" attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)
An image of the error:
This only happens when I try debugging the Intent Extension. Running the main app target or another extension target (e.g. notifications) doesn't produce this error.

Build Setup
Here are the details of my build setup:
- Mac Mini M1
- Xcode 13
- Building to iPhone 11 Pro Max, iOS 15.0.2. I've also tried building to my iPad Pro 12.9 w/ iOS 15.1 and hit the same issue.

Things I've tried:
- Make sure "Debug executable" is unchecked in the scheme
- I've tried changing the Launch setting to "Automatic" and "Wait for the executable to be launched"
- I've made sure to run sudo DevToolsSecurity -enable on my mac
- Rebooted iPhone devices + mac mini
- Uninstalled / reinstalled the app
- Deleted derived data
- Removing / reinstalling the development certs in my keychain --> this actually seemed to work initially, but then the problem came back and now it doesn't work anymore.

Console Logs
I've looked at the console logs while this error occurs to see if they can shed light on the issue. Here are the ones that seemed notable to me. These logs seem to show that Siri is trying to save / write to a file that it does not have access to. Seems very suspicious.

error 11:42:38.341470-0800 kernel System Policy: assistantd(31) deny(1) file-read-metadata /private/var/mobile/Library/com.apple.siri.inference
error 11:42:38.342204-0800 assistantd failed to save contact runtime data. error=Error Domain=NSCocoaErrorDomain Code=512 "The file “com.apple.siri.inference” couldn’t be saved in the folder “Library”." UserInfo={NSFilePath=/var/mobile/Library/com.apple.siri.inference, NSUnderlyingError=0x100fb03a0 {Error Domain=NSPOSIXErrorDomain Code=5 "Input/output error"}}
error 11:42:38.342403-0800 assistantd InferenceError<errorId=crSaveToRunTimeDBFailed file=/Library/Caches/com.apple.xbs/Sources/SiriInference/SiriInference-3100.49.3.1.2/SiriInference/SiriInference/ContactResolver/ContactResolver.swift function=logRunTimeData(runTimeData:config:) line=378 msg=>
error 11:42:38.465702-0800 kernel 1 duplicate report for System Policy: assistantd(31) deny(1) file-read-metadata /private/var/mobile/Library/com.apple.siri.inference

Looking for "debugserver" entries, as the error suggests, shows these logs:

default 11:42:44.814362-0800 debugserver error: [LaunchAttach] MachTask::TaskPortForProcessID task_for_pid(965) failed: ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure)
default 11:42:44.814476-0800 debugserver 10 +0.011525 sec [03c6/0103]: error: ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure) err = ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure) (0x00000005)
default 11:42:44.825704-0800 debugserver error: MachTask::StartExceptionThread (): task invalid, exception thread start failed.
default 11:42:44.825918-0800 debugserver error: [LaunchAttach] END (966) MachProcess::AttachForDebug failed to start exception thread attaching to pid 965: unable to start the exception thread
default 11:42:44.826025-0800 debugserver error: Attach failed
default 11:42:44.828923-0800 debugserver error: Attach failed: "Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.".

I've also attached the full details of the error below via a text file if it helps. Any help with this issue would be great, and I'm happy to provide more information if needed. Thanks in advance!
Xcode Attach Full Error Details
Posted Last updated
.
Post not yet marked as solved
1 Reply
855 Views
I was trying out SiriKit Media Intents and found that with iOS 14, Media Intents can be handled in-app instead of in an extension. Given my current project structure, in-app intent handling suits my purpose better than handling the intent in an extension. But my question is about how this intent will be handled in a watchOS app. Is in-app intent handling supported on watchOS as well (and if so, are there any examples that I can refer to)? If not, can I create an extension for Media Intents and trigger it for watchOS while triggering the in-app handling for iOS alone? Please share if I have missed some documentation or reference that solves this problem.
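For reference, this is the iOS-side in-app route the post mentions (iOS 14+); a minimal sketch only, and it does not answer the watchOS part of the question. MediaIntentHandler is a made-up stub, not a framework class:

```swift
import Intents
import UIKit

// Made-up stub so the sketch is self-contained.
final class MediaIntentHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        completion(INPlayMediaIntentResponse(code: .success, userActivity: nil))
    }
}

class AppDelegate: UIResponder, UIApplicationDelegate {
    // iOS 14+: return an intent handler from the app itself instead of an extension.
    func application(_ application: UIApplication, handlerFor intent: INIntent) -> Any? {
        if intent is INPlayMediaIntent {
            return MediaIntentHandler()
        }
        return nil
    }
}
```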
Posted Last updated
.
Post not yet marked as solved
1 Reply
1.1k Views
I am receiving an error in some cases when calling getAllVoiceShortcutsWithCompletion: on INVoiceShortcutCenter.

[[INVoiceShortcutCenter sharedCenter] getAllVoiceShortcutsWithCompletion:^(NSArray<INVoiceShortcut *> * _Nullable voiceShortcuts, NSError * _Nullable error) {
    if (error) {}
}];
Posted
by yleson.
Last updated
.
Post not yet marked as solved
0 Replies
493 Views
Apple's new Journal app was introduced with the iOS 17.2 beta. In the release notes, the following is mentioned: If your app donates activities or interactions to SiriKit or CallKit or if someone authorizes your app to save data to HealthKit, some data might show up as part of Journaling Suggestions. Is there any documentation on how this works exactly? What kind of activities can be featured in Journal? How does the system decide what to feature? For instance, if I have an app that allows the user to create art images, can I somehow make those images appear in the Journaling Suggestions?
Posted Last updated
.