Natural Language


Analyze natural language text and deduce its language-specific metadata using Natural Language.

Natural Language Documentation

Posts under Natural Language tag

13 Posts
Post not yet marked as solved
0 Replies
51 Views
I have a macOS application with a minimum deployment target of macOS 12.0. I need to be able to get the current keyboard region designator. Example: the user selects an input source of English (Canadian); what I want as a result is the en-CA locale identifier. I get the current keyboard language with the following code:

    func keyboardLanguage() -> String? {
        let keyboard = TISCopyCurrentKeyboardInputSource().takeRetainedValue()
        let languagesPtr = TISGetInputSourceProperty(keyboard, kTISPropertyInputSourceLanguages)!
        let languages = Unmanaged<AnyObject>.fromOpaque(languagesPtr).takeUnretainedValue() as? [String]
        return languages?.first
    }

This returns the language as en, but I don't see how I can get the region from Text Input Sources. I can get the input source ID:

    let keyboard = TISCopyCurrentKeyboardInputSource().takeRetainedValue()
    let idPtr = TISGetInputSourceProperty(keyboard, kTISPropertyInputSourceID)!
    let id = Unmanaged<AnyObject>.fromOpaque(idPtr).takeUnretainedValue() as? String
    print(String(describing: id))

This prints com.apple.keylayout.Canadian, which points to the Canadian region but is not a region designator. I could possibly parse this ID and map it to a region designator, but I'm not sure I would capture all of the regions, and what happens if the format of the ID changes? If someone can point me to the correct API to use, it will be much appreciated.
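A sketch of the ID-parsing fallback mentioned above, for reference; it is a workaround, not an official API, and the layout-ID table is an illustrative assumption that would need to be filled in by hand:

    import Carbon
    import Foundation

    // Sketch: prefer a region-qualified BCP-47 tag from the input source if one
    // is reported; otherwise fall back to a hand-maintained map from keyboard
    // layout IDs to region designators. The map below is hypothetical and
    // incomplete, and the ID format is not guaranteed to be stable.
    func keyboardRegionDesignator() -> String? {
        let keyboard = TISCopyCurrentKeyboardInputSource().takeRetainedValue()

        if let languagesPtr = TISGetInputSourceProperty(keyboard, kTISPropertyInputSourceLanguages) {
            let languages = Unmanaged<AnyObject>.fromOpaque(languagesPtr).takeUnretainedValue() as? [String]
            if let tag = languages?.first, tag.contains("-") {
                return tag // some layouts already report e.g. "en-CA"
            }
        }

        guard let idPtr = TISGetInputSourceProperty(keyboard, kTISPropertyInputSourceID),
              let id = Unmanaged<AnyObject>.fromOpaque(idPtr).takeUnretainedValue() as? String else {
            return nil
        }
        let regionByLayoutID: [String: String] = [   // hypothetical, incomplete
            "com.apple.keylayout.Canadian": "en-CA",
            "com.apple.keylayout.British": "en-GB",
            "com.apple.keylayout.Australian": "en-AU",
        ]
        return regionByLayoutID[id]
    }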
Posted. Last updated.
Post not yet marked as solved
0 Replies
115 Views
I'm trying to create an NLModel within a MessageFilterExtension handler. The code works fine in the main app, but when I try to use it in the extension it fails to initialize. Even the single line below fails with the error shown. SMS_Classifier is the class Xcode generated for my model; this line works fine in the main app.

    let mlModel = try SMS_Classifier(configuration: MLModelConfiguration()).model

Error:

    Unable to locate Asset for contextual word embedding model for local en.
    MLModelAsset: load failed with error Error Domain=com.apple.CoreML Code=0 "initialization of text classifier model with model data failed" UserInfo={NSLocalizedDescription=initialization of text classifier model with model data failed}

Any ideas?
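For reference, a minimal sketch of the full load-and-predict path, assuming the poster's generated SMS_Classifier class; wrapping the load in do/catch at least surfaces the underlying Core ML error in the extension's log rather than crashing the handler:

    import CoreML
    import NaturalLanguage

    // Sketch: load the generated Core ML class, wrap it in NLModel, and classify
    // a message. SMS_Classifier is the generated class named in the post.
    func classify(_ message: String) -> String? {
        do {
            let mlModel = try SMS_Classifier(configuration: MLModelConfiguration()).model
            let nlModel = try NLModel(mlModel: mlModel)
            return nlModel.predictedLabel(for: message)
        } catch {
            // In a MessageFilterExtension, logging here helps surface the
            // underlying asset/initialization error.
            print("Model load/predict failed: \(error)")
            return nil
        }
    }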
Posted by poiesb. Last updated.
Post not yet marked as solved
0 Replies
319 Views
Is there a way to extract the list of words recognized by the Speech framework? I'm trying to filter out words that won't appear in the transcription output, but to do that I'll need a list of words that can appear. SFSpeechLanguageModel.Configuration can be initialized with a vocabulary, but there doesn't seem to be a way to read it, and while there are ways to create custom vocabularies, I have yet to find a way to retrieve one. I added the Natural Language tag in case the framework might contribute to a solution.
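Absent an API for reading the recognizer's built-in vocabulary, here is a sketch of the filtering described above, assuming the app maintains its own allow-list of words:

    import Speech

    // Sketch: keep only transcription segments whose text appears in a
    // caller-supplied allow-list. The allow-list is assumed to come from the
    // app; no API for reading the recognizer's own vocabulary is shown here.
    func filterTranscription(_ result: SFSpeechRecognitionResult,
                             allowedWords: Set<String>) -> [String] {
        result.bestTranscription.segments
            .map { $0.substring.lowercased() }
            .filter { allowedWords.contains($0) }
    }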
Posted by wmk. Last updated.
Post not yet marked as solved
4 Replies
2.4k Views
I am working on an app that pulls data from WeatherKit, including the conditionCode property, the content of which is displayed to the user. I wish to localize the data pulled from WeatherKit, but when pulling data from:

    weatherkit.apple.com/api/v1/weather/de/{latitude}/{longitude}

the conditionCode and other strings are in English. The same is true if the language parameter is set to es, ja, or something else. Am I doing something wrong, or is localization yet to be supported in WeatherKit? I can't find any documentation on this.
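A common workaround, sketched below: treat conditionCode as a stable key and localize it client-side with your own strings table. The condition codes and key format shown are assumptions for illustration, not an exhaustive or official list:

    import Foundation

    // Sketch: map a WeatherKit conditionCode string to a client-side localized
    // string, e.g. "weather.condition.MostlyCloudy" in Localizable.strings.
    func localizedCondition(for conditionCode: String) -> String {
        let key = "weather.condition.\(conditionCode)"
        let localized = NSLocalizedString(key, comment: "WeatherKit condition code")
        // Fall back to the raw code if no translation is present.
        return localized == key ? conditionCode : localized
    }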
Posted. Last updated.
Post marked as solved
5 Replies
1.5k Views
I am using NLTagger to tag the lexical classes of words, but it suddenly just stopped working. I boiled my code down to the most basic version, but it never executes the closure of the enumerateTags() function. What do I have to change, or what should I try?

    for e in sentenceArray {
        let cupcake = "I like you, have a cupcake"
        tagger.string = cupcake
        tagger.enumerateTags(in: cupcake.startIndex..<cupcake.endIndex,
                             unit: .word,
                             scheme: .nameTypeOrLexicalClass) { tag, range in
            print("TAG")
            return true
        }
    }
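One thing worth checking, sketched below: whether the assets backing the tag scheme are actually present on the device, and requesting them if not. This assumes the cause is a missing on-device asset rather than the code itself:

    import NaturalLanguage

    // Sketch: verify the tag scheme is available for the text's language and
    // ask the system to fetch the backing assets if needed (iOS 14 / macOS 11+).
    func checkTaggerAssets(for text: String) {
        let tagger = NLTagger(tagSchemes: [.nameTypeOrLexicalClass])
        tagger.string = text

        if let language = tagger.dominantLanguage {
            print("Available schemes:",
                  NLTagger.availableTagSchemes(for: .word, language: language))
            NLTagger.requestAssets(for: language, tagScheme: .nameTypeOrLexicalClass) { result, error in
                print("Asset request result: \(result), error: \(String(describing: error))")
            }
        }
    }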
Posted by yoKurt. Last updated.
Post not yet marked as solved
0 Replies
384 Views
Does iOS provide an API for getting text predictions based on previous text? I tried UITextChecker.completions like so:

    let str = "Hello"
    let range = NSMakeRange(str.utf16.count, 0)
    let tc = UITextChecker()
    let completions = tc.completions(forPartialWordRange: range, in: str, language: "en-US")
    print(completions)

However, this only works for completing words, not sentences. Does iOS have a way of doing this? I read somewhere that macOS does. If not, what workarounds/alternatives would you recommend?
Posted by edvilme. Last updated.
Post not yet marked as solved
1 Reply
844 Views
In the Explore Natural Language multilingual models video (https://developer.apple.com/videos/play/wwdc2023/10042/), it's said at 6:24 that there are three models. I wonder if it is possible to find semantic similarity across models? For example, English and Japanese belong to different models (Latin and CJK); can we compare the vectors produced from the different models to find out whether two sentences have similar meanings?
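Whichever embedding produces the vectors, the comparison step itself usually comes down to cosine similarity; a minimal sketch in plain Swift, which says nothing about whether vectors from the different models actually share a common space (that is the open question above):

    // Cosine similarity between two embedding vectors of equal length.
    // Returns a value in [-1, 1]; closer to 1 means more similar directions.
    func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double? {
        guard a.count == b.count, !a.isEmpty else { return nil }
        let dot = zip(a, b).reduce(0) { $0 + $1.0 * $1.1 }
        let normA = (a.reduce(0) { $0 + $1 * $1 }).squareRoot()
        let normB = (b.reduce(0) { $0 + $1 * $1 }).squareRoot()
        guard normA > 0, normB > 0 else { return nil }
        return dot / (normA * normB)
    }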
Posted by oldhuhu. Last updated.
Post not yet marked as solved
4 Replies
2.0k Views
When I try to get the sentence embedding with NLEmbedding.sentenceEmbedding(for: .english), I get nil returned, and the message in the console is: "Unable to locate Asset for contextual word embedding model for local en." I've also tried different languages, and NLEmbedding.wordEmbedding(for: .english) doesn't work either. This only occurs on my iPhone X after updating to iOS 14.4; before, it worked fine. It also works in the Simulator on iOS 14.4. Did somebody face a similar problem? Or does somebody know how to force iOS to download these embeddings again?
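For reference, a minimal sketch of guarding the embedding lookup so a missing on-device asset fails gracefully; it does not force a re-download (I'm not aware of a public API for that):

    import NaturalLanguage

    // Sketch: safely obtain a sentence embedding vector, returning nil when the
    // underlying asset is unavailable on the device.
    func sentenceVector(for text: String) -> [Double]? {
        guard let embedding = NLEmbedding.sentenceEmbedding(for: .english) else {
            print("Sentence embedding asset unavailable for English on this device.")
            return nil
        }
        return embedding.vector(for: text)
    }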
Posted by aleeeex. Last updated.
Post not yet marked as solved
0 Replies
537 Views
The code at https://developer.apple.com/documentation/naturallanguage/identifying_people_places_and_organizations works in Swift Playgrounds 4.3.1 under iPadOS, but under Xcode 14.3.1 the named entities are not recognized, and for every word I get ["Other": 1.0]. Strangely, I had the code work in an iOS app under Xcode 14.3.1, but there seems to be some level of unpredictability as to when it works or not. Any hints?
Posted. Last updated.
Post marked as solved
1 Reply
772 Views
I am examining the ability of Natural Language to identify place names. I ran across the example code in the developer documentation here: https://developer.apple.com/documentation/naturallanguage/identifying_people_places_and_organizations. When I execute this example exactly as stated, it does not function as expected. Running Xcode 14.3.1 on macOS Ventura 13.4, the code does not find a single tag of any kind in the sample text. Can someone offer an explanation?
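For comparison, a minimal standalone version of the name-type tagging that the sample demonstrates; this is a sketch of the usual pattern, not the documentation sample verbatim:

    import NaturalLanguage

    // Sketch: tag people, places, and organizations in a string with NLTagger.
    let text = "The American Red Cross was established in Washington, D.C., by Clara Barton."
    let tagger = NLTagger(tagSchemes: [.nameType])
    tagger.string = text

    let options: NLTagger.Options = [.omitWhitespace, .omitPunctuation, .joinNames]
    let interestingTags: Set<NLTag> = [.personalName, .placeName, .organizationName]

    tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                         unit: .word,
                         scheme: .nameType,
                         options: options) { tag, range in
        if let tag = tag, interestingTags.contains(tag) {
            print("\(text[range]): \(tag.rawValue)")
        }
        return true
    }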
Posted by SwampDog. Last updated.
Post not yet marked as solved
0 Replies
1.1k Views
Hi community. I was looking for a way to do word segmentation of a string without delimiters or spaces. Apple's Natural Language framework does not provide this (which I find strange). I want to achieve something like this:

    Input: "makesenseofthis"
    Output: ["make", "sense", "of", "this"]

Is there any third-party library to do this? Or maybe there is an Apple API? Thanks in advance.
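In the absence of a framework API, here is a sketch of the classic dynamic-programming word-break approach, assuming you supply your own dictionary of valid words:

    // Sketch: segment a string without spaces into dictionary words using
    // dynamic programming. `dictionary` is assumed to be supplied by the caller.
    func segment(_ input: String, dictionary: Set<String>) -> [String]? {
        guard !input.isEmpty else { return [] }
        let chars = Array(input)
        let n = chars.count
        // best[i] holds a valid segmentation of the first i characters, if any.
        var best: [[String]?] = Array(repeating: nil, count: n + 1)
        best[0] = []

        for end in 1...n {
            for start in 0..<end where best[start] != nil {
                let word = String(chars[start..<end])
                if dictionary.contains(word) {
                    best[end] = best[start]! + [word]
                    break
                }
            }
        }
        return best[n]
    }

    // Example:
    // segment("makesenseofthis", dictionary: ["make", "sense", "of", "this"])
    // → ["make", "sense", "of", "this"]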
Posted by JesusMG. Last updated.
Post not yet marked as solved
2 Replies
2.3k Views
What tools are folks using to create the JSON file needed to train a custom word tagger model? I've tried Doccano, but it exports JSONL, which is very different from what Create ML is expecting (example of the required format here: https://developer.apple.com/documentation/naturallanguage/creating_a_word_tagger_model). Are there standard tools or utilities that export/convert to the Create ML format? Thanks.
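Lacking an off-the-shelf converter, a small sketch of writing the word tagger training file yourself; this assumes the tokens/labels layout shown in the linked documentation and leaves the Doccano-specific span-to-token conversion to you:

    import Foundation

    // One training record in the word tagger format: parallel arrays of tokens
    // and labels (check the linked documentation for the exact key names your
    // Create ML project expects).
    struct TaggedSentence: Codable {
        let tokens: [String]
        let labels: [String]
    }

    // Sketch: encode records to a JSON file for Create ML.
    func writeTrainingData(_ sentences: [TaggedSentence], to url: URL) throws {
        let encoder = JSONEncoder()
        encoder.outputFormatting = [.prettyPrinted]
        let data = try encoder.encode(sentences)
        try data.write(to: url)
    }

    // Example:
    // let record = TaggedSentence(tokens: ["Apple", "is", "in", "Cupertino"],
    //                             labels: ["ORG", "NONE", "NONE", "PLACE"])
    // try writeTrainingData([record], to: URL(fileURLWithPath: "train.json"))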
Posted by Means. Last updated.