iOS is the operating system for iPhone.

iOS Documentation

Posts under iOS tag

2,828 Posts
Post not yet marked as solved
0 Replies
24 Views
Howdy, I have a nasty feeling that the answer to my question is "Y'all cain't do that!", but I figure I'll ask, anyway.

THE SAD STORY (GET YOUR HANKY): We have an app that implements Sign [up|in] with Apple. It does it pretty well, with no password visible to the user, and a pretty smooth UX. The issue is what happens when users bork their install. We don't think it will happen often, but we want to give the user the best way out, if possible. With the regular (non-SiiA) method, they bonk on a "Forgot Password" button, and the app sends them a new password. We can't do that with SiiA. The password is stored in the app (in the keychain, so it's very persistent, and shared across devices), and it would be a Very Bad Security Hole to allow users to simply send a new password to the server (the other method generates a rando on the server), which is what would happen with our method of handling the password. It would be equally bad if the server could simply send a new password to the user, directly to their device (the other method sends an email, based on the sign-in information on the server). So the user needs to delete their keychain data completely, which we can easily do, but that does not deal with their SiiA stuff stored on Apple's server. This is what Apple tells us to do, to delete that.

WHICH BEGS THE QUESTION: Is there a URL scheme that I can use to directly open that panel? If so, it would allow us to create a screen that helps the user do all the deletions (on the device, our server, and the Apple server).
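[Editor's note: one documented building block for such a cleanup screen is the credential-state check, which at least lets the app detect a revoked Sign in with Apple credential and trigger its own cleanup. A minimal sketch, not from the post; storedUserID is a placeholder for the identifier the app saved at sign-up:]

    import AuthenticationServices

    // Minimal sketch: detect a revoked or missing Sign in with Apple
    // credential so the app can run its own cleanup flow.
    // `storedUserID` is a placeholder for the user identifier saved from
    // the original ASAuthorizationAppleIDCredential.
    func checkAppleIDCredential(storedUserID: String) {
        let provider = ASAuthorizationAppleIDProvider()
        provider.getCredentialState(forUserID: storedUserID) { state, _ in
            switch state {
            case .authorized:
                break // Credential is still valid.
            case .revoked, .notFound:
                // A safe point to wipe the app's keychain entry and tell the
                // server to delete its copy; the Apple-side entry still has
                // to be removed by the user in their Apple ID settings.
                break
            default:
                break
            }
        }
    }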
Posted. Last updated.
Post not yet marked as solved
0 Replies
34 Views
We have been using a Supervision Identity successfully over the last few years to allow us to Xcode-debug on managed devices. Something has changed in the last few months, and we are finding it hard to find a consistent solution. We can't determine whether it's Xcode 15- or iOS-version related. The behaviour we see is documented below:

- All Macs with the Supervision Identity installed can open the device in Apple Configurator without a trust acknowledgement on the device. Configurator can open and stream the device console.
- All problematic devices are also registered as development devices with Apple.
- We have hit-and-miss connectivity in Finder. On failure it indicates that the trust prompt on the device needs to be accepted. No trust prompt is displayed.
- Developer Mode on the device can't be enabled until we attempt to connect to Xcode.
- Devices that pair successfully in Xcode do so almost immediately.
- If a device pairs in Xcode, it is also visible in the native macOS Console application.
- If a device fails to pair in Xcode, we get a spinner and the message "Xcode has already started pairing with 'iPhone-X'. Select Trust on iPhone-X to complete pairing". The macOS Console app reports "The user has not responded to the pairing request on 'iPhone-X'". In neither case is a trust prompt displayed on the device.
- We are not able to Xcode-pair any managed device running iOS 17.x. This includes devices that were successfully paired on iOS 16.x and then upgraded.
- iOS 16.x devices that pair with one Mac successfully will not pair with another Mac with the same Supervision Identity installed.
- The same behaviour is seen on Ventura and Sonoma Macs.
- Clearing Trusted Computers in iOS Developer Mode has no effect.
- Non-supervised iOS devices pair successfully on iOS 16 and 17.

Has anyone else witnessed similar issues and found a workaround?
Posted by slider42. Last updated.
Post not yet marked as solved
2 Replies
65 Views
Hello everyone, I want to create a VPN app with SwiftUI but I don't know where to start. I have already created some beautiful UI, but now I want to configure the networking so the app can connect to a server in another country. Any suggestions?
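[Editor's note: the usual starting point for this is the NetworkExtension framework. A minimal sketch, assuming an IKEv2 server and the Personal VPN entitlement; the server address and username below are placeholders:]

    import NetworkExtension

    // Minimal sketch: configure and start a system IKEv2 VPN tunnel.
    // Requires the Personal VPN entitlement; server and account values
    // are placeholders.
    func connectVPN() {
        let manager = NEVPNManager.shared()
        manager.loadFromPreferences { loadError in
            guard loadError == nil else { return }

            let ikev2 = NEVPNProtocolIKEv2()
            ikev2.serverAddress = "vpn.example.com"     // placeholder
            ikev2.remoteIdentifier = "vpn.example.com"  // placeholder
            ikev2.username = "demo-user"                // placeholder
            ikev2.useExtendedAuthentication = true      // EAP with username/password
            // Real credentials go in passwordReference (a keychain persistent
            // ref); certificate or shared-secret auth is also possible via
            // authenticationMethod.

            manager.protocolConfiguration = ikev2
            manager.localizedDescription = "My VPN"
            manager.isEnabled = true
            manager.saveToPreferences { saveError in
                guard saveError == nil else { return }
                try? manager.connection.startVPNTunnel()
            }
        }
    }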
Posted by lukakkyk. Last updated.
Post not yet marked as solved
3 Replies
154 Views
Hello, Our app has an internal job-processing queue. All jobs are built as NSOperations that involve a network request, and they are added to an NSOperationQueue. When the app is closed while a request is being sent, the app sometimes crashes, and it then keeps crashing whenever we rebuild the operation and retry it. This happens rarely, but we can systematically reproduce it after a few tries with many jobs. This issue blocks the queue in our app. I understand if this is an issue deep within the framework, but it would be very useful to at least find a way to work around it so the queue can continue processing other jobs. The full crash report is attached. I also submitted a bug report: FB13734737

There seems to be an internal assertion fired in CFNetwork:

    Assertion failed: (CFReadStreamGetStatus(_stream.get()) == kCFStreamStatusNotOpen)
    function _onqueue_setupStream_block_invoke, file HTTPRequestBody.cpp, line 878.

    Crashed: com.apple.NSURLConnectionLoader
    0  libsystem_kernel.dylib   0xa974    __pthread_kill + 8
    1  libsystem_pthread.dylib  0x60ec    pthread_kill + 268
    2  libsystem_c.dylib        0x75b80   abort + 180
    3  libsystem_c.dylib        0x74e70   err + 282
    4  CFNetwork                0x1f73b8  CFHTTPCookieStorageUnscheduleFromRunLoop + 278252
    5  libdispatch.dylib        0x3dd4    _dispatch_client_callout + 20
    6  libdispatch.dylib        0x786c    _dispatch_block_invoke_direct + 288
    7  CFNetwork                0x259ab0  estimatedPropertyListSize + 33724
    8  CoreFoundation           0x24b34   CFArrayApplyFunction + 72
    9  CFNetwork                0x2599a0  estimatedPropertyListSize + 33452
    10 CFNetwork                0x25c084  estimatedPropertyListSize + 43408
    11 CoreFoundation           0x3762c   __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
    12 CoreFoundation           0x368a8   __CFRunLoopDoSource0 + 176
    13 CoreFoundation           0x35058   __CFRunLoopDoSources0 + 244
    14 CoreFoundation           0x33d88   __CFRunLoopRun + 828
    15 CoreFoundation           0x33968   CFRunLoopRunSpecific + 608
    16 CFNetwork                0x25ac48  estimatedPropertyListSize + 38228
    17 Foundation               0x9ca9c   __NSThread__start__ + 732
    18 libsystem_pthread.dylib  0x2a90    _pthread_start + 136
    19 libsystem_pthread.dylib  0x1fcc    thread_start + 8

This is how we build the operation:

    - (NSOperation *)operationForRequest:(Job *)job {
        NSURL *url = [NSURL URLWithString:job.url];
        NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
        [request setValue:@"application/json, application/xml, text/plain" forHTTPHeaderField:@"Accept"];
        [request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];
        [request setValue:@"no-cache" forHTTPHeaderField:@"Cache-Control"];
        [request setValue:[NSString stringWithFormat:@"Bearer %@", [self getToken]] forHTTPHeaderField:@"Authorization"];
        [request setHTTPMethod:job.method];

        NSData *bodyData = [job.payload dataUsingEncoding:NSUTF8StringEncoding];
        [request setHTTPBody:bodyData];

        return [[NetworkOperation alloc] initWithRequest:request
                                                     uuid:job.jobId
                                        completionHandler:^(NSString *jobId, NSData *data, NSURLResponse *response, NSError *error) {
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
                @autoreleasepool {
                    RLMRealm *realm = [RLMRealm defaultRealm];
                    Job *opJob = [Job objectInRealm:realm forPrimaryKey:jobId];
                    [self processJobResponse:opJob response:response data:data error:error realm:realm];
                }
            });
        }];
    }

This is how the NetworkOperation executes the request:

    - (void)main {
        NSURLSession *session = [NSURLSession sharedSession];
        NSURLSessionTask *task = [session dataTaskWithRequest:self.request
                                            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            if (self.networkOperationCompletionBlock) {
                self.networkOperationCompletionBlock(self.uuid, data, response, error);
                self.networkOperationCompletionBlock = nil;
            }
            [self completeOperation];
        }];
        [task resume];
        self.task = task;
    }

crashlog3.crash
Posted by nikilic. Last updated.
Post not yet marked as solved
0 Replies
68 Views
We've encountered a problem that only occurs on the iPhone 15 Pro Max running iOS 17.4.1. Specifically, while we are operating an app, the phone suddenly goes to a black screen, then the system's loading animation appears, and after the animation ends, the lock screen appears. The whole phenomenon looks like the phone has restarted, but we haven't found any panic-full files in the system logs, so we are at a loss as to how to diagnose this problem. Can you offer some assistance? This is a video of the issue occurring: https://watch.wave.video/yg2Oph4n4bfdMQeF
Posted by luminary. Last updated.
Post not yet marked as solved
26 Replies
6.1k Views
I find it odd that App Store Connect still requires 5.5" iPhone screenshots for the iPhone 8 Plus, given that this specific phone is no longer supported by the latest release, iOS 17. I am well aware that the iPhone SE has a similar screen ratio and is still supported by iOS 17, but it doesn't have the same pixel requirements (1242 x 2208), which means that in order for my app (an iOS 17+ exclusive) to even be reviewed, I'm going to have to create images that will then be upscaled to the right dimensions. Am I missing something here, or did Apple miss this detail?
Posted by 1amChris. Last updated.
Post not yet marked as solved
1 Reply
88 Views
I know Apple updated their policy related to sign-in (see https://developer.apple.com/news/?id=f1v8pyay, "More flexibility for sign in options in apps" section), but the wording of the guidelines (https://developer.apple.com/app-store/review/guidelines/#login-services) is a bit difficult to understand:

Apps that use a third-party or social login service (such as Facebook Login, Google Sign-In, Sign in with Twitter, Sign In with LinkedIn, Login with Amazon, or WeChat Login) to set up or authenticate the user's primary account with the app must also offer as an equivalent option another login service with the following features:
- the login service limits data collection to the user's name and email address;
- the login service allows users to keep their email address private as part of setting up their account; and
- the login service does not collect interactions with your app for advertising purposes without consent.

As far as I can tell, Facebook, Google, Amazon, etc. do not offer these protections. Would Sign in with Apple still be required in this case?
Posted by ZOlbrys. Last updated.
Post not yet marked as solved
3 Replies
100 Views
When sending HTTP requests to APIs, header names are now capitalized.

Old (iOS 17.4 and before):

    "headers": { "origin": "https://example.com" }

New (iOS 17.5):

    "headers": { "Origin": "https://example.com" }

Is this a bug or intentional? I couldn't find this change documented anywhere.
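[Editor's note: header field names are case-insensitive per the HTTP spec (RFC 9110), so neither side should depend on the exact casing iOS puts on the wire. A small Swift illustration of Foundation's own case-insensitive handling:]

    import Foundation

    // Foundation treats HTTP header field names case-insensitively,
    // regardless of how the OS capitalizes them when sending.
    var request = URLRequest(url: URL(string: "https://api.example.com")!)
    request.setValue("https://example.com", forHTTPHeaderField: "origin")

    // Reading the field back with different casing still works:
    let origin = request.value(forHTTPHeaderField: "Origin")
    print(origin ?? "missing") // prints "https://example.com"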
Posted. Last updated.
Post not yet marked as solved
1 Reply
127 Views
I'm developing an iOS app that displays store locations on a map using Apple Maps (MapKit). I've limited the number of icons that can be displayed on the map to 100, but there are still huge performance issues and the app is very laggy, even on modern iPhone models. What's the best practice for displaying a large number of icons on a map: should the icons be small-resolution (~10 KB) PNGs, or vector (SVG) for best performance? And should I use the SwiftUI MapKit API from iOS 17 or the UIKit approach?
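[Editor's note: icon format usually matters less than annotation view reuse and clustering. A minimal UIKit (MKMapView) sketch; StoreAnnotation, StoreMapDelegate, and the identifiers are placeholder names:]

    import MapKit

    // Placeholder annotation type for the stores.
    final class StoreAnnotation: NSObject, MKAnnotation {
        let coordinate: CLLocationCoordinate2D
        init(coordinate: CLLocationCoordinate2D) {
            self.coordinate = coordinate
        }
    }

    final class StoreMapDelegate: NSObject, MKMapViewDelegate {
        func mapView(_ mapView: MKMapView,
                     viewFor annotation: MKAnnotation) -> MKAnnotationView? {
            guard annotation is StoreAnnotation else { return nil }
            let reuseID = "store"
            // Reuse annotation views instead of allocating one per pin.
            let view = mapView.dequeueReusableAnnotationView(withIdentifier: reuseID)
                as? MKMarkerAnnotationView
                ?? MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: reuseID)
            view.annotation = annotation
            // Pins that overlap at the current zoom level collapse into a
            // single cluster marker instead of rendering individually.
            view.clusteringIdentifier = "storeCluster"
            return view
        }
    }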
Posted by Filip27. Last updated.
Post not yet marked as solved
25 Replies
20k Views
We are working on a new iOS application utilizing the new iOS 17 APIs, and I have updated Xcode to Xcode 15 Beta, and my iPhone 12 Pro to iOS 17 Beta 2, though this issue was also present on iOS 17 Beta 1. In Xcode, under "Signing and Capabilities" I have my Team set to my personal team, with "Automatically manage signing" ticked. While the app will build and install on my phone, I immediately receive this error, with no popup to trust the developer. Going to Settings > General > VPN and Device Management, I can see my development team and am able to trust it. When I then try Verify App(s), it tells me it will use my internet connection to verify the application, but it then does nothing, with no error, regardless of how many times I attempt to verify. Trying to open the app from my home screen results in the repeated "Unable to Verify" error. Resetting network settings does not change this behavior, nor does a reset of the phone. I have tried four different high-quality Wi-Fi networks, as well as a working AT&T LTE cellular connection, and still receive this error. I am running out of diagnostic scenarios, and I'm curious whether anyone has found a resolution to this?
Posted by covt. Last updated.
Post not yet marked as solved
0 Replies
48 Views
Hi, We are developing an iOS app using React Native, with VS Code as the IDE. When we build the project in Xcode, the build is successful. But when we run it on the Xcode simulator, the recent changes are not visible; the build contains only the older changes. Please guide us in resolving this.
Posted by Lakshmish. Last updated.
Post not yet marked as solved
1 Reply
56 Views
Hi all Apple devs! I am a young developer who is completely new to programming. I am currently trying to develop an app where I want to use VisionKit, but I can't for the life of me figure out how to implement its features. I've been stuck on this for several days, so I am now resorting to asking all of you experts for help! Your assistance would be immensely appreciated!

I started developing the app trying to use SwiftUI exclusively, to future-proof my app. Upon figuring out what VisionKit is, my understanding is that it is more compatible with UIKit? So I rewrote the part of my code that will use VisionKit into a UIKit-based view, to simplify the integration of VisionKit's features. It might just have overcomplicated my code? Can VisionKit be easily implemented using only SwiftUI? I noticed that in the demo in the video tutorial the code is in a view controller, not a ContentView; is this what makes my image unresponsive? My image is not interactable like her demo in the video. Where in my code do I go wrong? Help a noob out!

The desired user flow is like this:
1. User selects an image through the "Open Camera" or "Open Camera Roll" buttons.
2. Upon selection, the UIKit-based view opens and the selected image is displayed on it. (This is where I want to implement the VisionKit features.)
3. User interacts with the image by touching it; if touching a subject, the subject should be lifted out of the rest of the image and assigned to editedImage, which in turn displays only the subject, without the background, in the ContentView. (For now the image is assigned to editedImage by long-pressing, without any subject lifting, since I can't get VisionKit to work as I want.)

Anyways, here's a code snippet of my peculiar effort to implement subject lifting and VisionKit into my app:
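[Editor's note: the poster's snippet did not survive above. For reference, a minimal sketch of the usual bridge: as far as we know, ImageAnalysisInteraction is UIKit-only, so subject lifting can't be driven from pure SwiftUI and a UIViewRepresentable wrapper is the standard route. Assumes iOS 17 for the .imageSubject interaction type; SubjectLiftView is a made-up name, and this is the editor's illustration, not the poster's code:]

    import SwiftUI
    import VisionKit

    // Minimal sketch (iOS 17+): wrap a UIImageView with an
    // ImageAnalysisInteraction so subjects can be lifted by touch.
    struct SubjectLiftView: UIViewRepresentable {
        let image: UIImage

        func makeUIView(context: Context) -> UIImageView {
            let imageView = UIImageView(image: image)
            imageView.contentMode = .scaleAspectFit
            imageView.isUserInteractionEnabled = true  // required for the interaction

            let interaction = ImageAnalysisInteraction()
            interaction.preferredInteractionTypes = .imageSubject
            imageView.addInteraction(interaction)

            // Analysis is asynchronous; attach the result once it's ready.
            Task {
                let analyzer = ImageAnalyzer()
                let config = ImageAnalyzer.Configuration([.visualLookUp])
                if let analysis = try? await analyzer.analyze(image, configuration: config) {
                    interaction.analysis = analysis
                }
            }
            return imageView
        }

        func updateUIView(_ uiView: UIImageView, context: Context) {}
    }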
Posted by emol. Last updated.
Post not yet marked as solved
0 Replies
53 Views
In MXMetricPayload, the number of launches recorded in histogrammedTimeToFirstDrawKey does not equal the number of exits recorded in applicationExitMetrics.

Case 1:

    "applicationExitMetrics": {
      "backgroundExitData": {
        "cumulativeNormalAppExitCount": 3,
        "cumulativeMemoryPressureExitCount": 4
      },
      "foregroundExitData": {
        "cumulativeMemoryResourceLimitExitCount": 3
      }
    },
    "applicationLaunchMetrics": {
      "histogrammedTimeToFirstDrawKey": {
        "histogramNumBuckets": 14,
        "histogramValue": {
          "0":  { "bucketCount": 1, "bucketStart": "1400 ms", "bucketEnd": "1409 ms" },
          "1":  { "bucketCount": 1, "bucketStart": "1430 ms", "bucketEnd": "1439 ms" },
          "2":  { "bucketCount": 1, "bucketStart": "1440 ms", "bucketEnd": "1449 ms" },
          "3":  { "bucketCount": 2, "bucketStart": "1490 ms", "bucketEnd": "1499 ms" },
          "4":  { "bucketCount": 1, "bucketStart": "1500 ms", "bucketEnd": "1509 ms" },
          "5":  { "bucketCount": 1, "bucketStart": "1580 ms", "bucketEnd": "1589 ms" },
          "6":  { "bucketCount": 1, "bucketStart": "1620 ms", "bucketEnd": "1629 ms" },
          "7":  { "bucketCount": 1, "bucketStart": "1650 ms", "bucketEnd": "1659 ms" },
          "8":  { "bucketCount": 1, "bucketStart": "1660 ms", "bucketEnd": "1669 ms" },
          "9":  { "bucketCount": 1, "bucketStart": "1730 ms", "bucketEnd": "1739 ms" },
          "10": { "bucketCount": 1, "bucketStart": "1740 ms", "bucketEnd": "1749 ms" },
          "11": { "bucketCount": 1, "bucketStart": "1780 ms", "bucketEnd": "1789 ms" },
          "12": { "bucketCount": 1, "bucketStart": "1860 ms", "bucketEnd": "1869 ms" },
          "13": { "bucketCount": 1, "bucketStart": "1990 ms", "bucketEnd": "1999 ms" }
        }
      }
    }

In this case the app cold-started 15 times, but quit only 10 times. Why?

Case 2:

    "applicationExitMetrics": {
      "backgroundExitData": {
        "cumulativeMemoryPressureExitCount": 1
      },
      "foregroundExitData": {
        "cumulativeMemoryResourceLimitExitCount": 3,
        "cumulativeNormalAppExitCount": 1
      }
    },
    "applicationLaunchMetrics": {
      "histogrammedTimeToFirstDrawKey": {
        "histogramNumBuckets": 3,
        "histogramValue": {
          "0": { "bucketCount": 1, "bucketStart": "1490 ms", "bucketEnd": "1499 ms" },
          "1": { "bucketCount": 1, "bucketStart": "1680 ms", "bucketEnd": "1689 ms" },
          "2": { "bucketCount": 1, "bucketStart": "1880 ms", "bucketEnd": "1889 ms" }
        }
      }
    }

Here the app cold-started 3 times, but the exit counts total 5. Why?
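[Editor's note: one possible explanation, not confirmed: the histogram only counts launches that reached first draw, while the exit counters also cover processes that never drew (background launches, prewarming), and launches and exits near the edges of the 24-hour payload window can land in different payloads, so the totals need not match one-to-one. A minimal Swift sketch for totalling both sides of the comparison; the function name is the editor's, and only the counters appearing in the payloads above are summed:]

    import MetricKit

    // Minimal sketch: total the launch histogram and the exit counters
    // from a single payload so the two can be compared.
    func summarize(_ payload: MXMetricPayload) {
        var launches = 0
        if let histogram = payload.applicationLaunchMetrics?.histogrammedTimeToFirstDraw {
            let buckets = histogram.bucketEnumerator
            while let bucket = buckets.nextObject() as? MXHistogramBucket<UnitDuration> {
                launches += bucket.bucketCount
            }
        }

        var exits = 0
        if let exitMetrics = payload.applicationExitMetrics {
            let bg = exitMetrics.backgroundExitData
            let fg = exitMetrics.foregroundExitData
            // MXBackgroundExitData / MXForegroundExitData have more counters
            // than the four shown in the payloads above.
            exits += bg.cumulativeNormalAppExitCount + bg.cumulativeMemoryPressureExitCount
            exits += fg.cumulativeNormalAppExitCount + fg.cumulativeMemoryResourceLimitExitCount
        }

        print("launches with first draw: \(launches), recorded exits: \(exits)")
    }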
Posted. Last updated.
Post not yet marked as solved
0 Replies
50 Views
Hello, When I build an Xcode project on an Apple Silicon Mac, I have some issues. The project contains Pods and Swift Packages. I cannot run the application at all and always get the following error:

    Could not find module '***' for target 'x86_64-apple-ios-simulator'; found: arm64, arm64-apple-ios-simulator, at: ***

I have tried to resolve this issue with:
- Always Embed Swift Standard Libraries = YES
- Build Active Architecture Only = YES
- UIRequiredDeviceCapabilities = armv7
- Excluded Architectures > Debug > Any iOS Simulator SDK: add arm64, and Open Using Rosetta
- Excluded Architectures > Debug > Any iOS Simulator SDK: remove arm64

However, the issue persists. :( Do you have any solution for this problem? Thank you in advance!
Posted by Gooood. Last updated.
Post not yet marked as solved
1 Reply
602 Views
As of iOS 17, SFSpeechRecognizer.isAvailable returns true even when recognition tasks cannot be fulfilled and immediately fail with the error "Siri and Dictation are disabled". The same speech recognition code works as expected on iOS 16. On iOS 16, neither Siri nor Dictation needed to be enabled for speech recognition to be available, and it works as expected. In the past, once permissions were given, only an active network connection was required for functional speech recognition.

There seem to be two issues in play:
1. On iOS 17, SFSpeechRecognizer.isAvailable incorrectly returns true when it can't fulfil requests.
2. On iOS 17, Dictation or Siri must be enabled to handle speech recognition tasks, while on iOS 16 this wasn't the case.

If issue 2 is expected behaviour (I surely hope not), there is no way to query whether Siri or Dictation is enabled, so there is no way to handle those cases properly in code and inform the user why speech recognition doesn't work.

Expected behaviour:
- Speech recognition is available when Siri and Dictation are disabled.
- SFSpeechRecognizer.isAvailable correctly returns false when no speech recognition requests can be handled.

iOS Version 17.0 (21A329), Xcode Version 15.0 (15A240d). Anyone else experiencing the same issues or have a solution? Reported this to Apple as well -> FB13235751
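[Editor's note: given the behaviour described, a defensive pattern is to treat the recognition task's error as the source of truth rather than isAvailable alone. A minimal sketch; the function name is the editor's:]

    import Speech

    // Minimal sketch: surface the task error to the user instead of
    // trusting isAvailable, since on iOS 17 it can report true while
    // requests fail (as described above).
    func recognize(fileURL: URL) {
        SFSpeechRecognizer.requestAuthorization { status in
            guard status == .authorized, let recognizer = SFSpeechRecognizer() else { return }
            let request = SFSpeechURLRecognitionRequest(url: fileURL)
            let task = recognizer.recognitionTask(with: request) { result, error in
                if let error {
                    // On iOS 17 this is where "Siri and Dictation are
                    // disabled" surfaces; inform the user here.
                    print("Recognition failed: \(error.localizedDescription)")
                    return
                }
                if let result, result.isFinal {
                    print(result.bestTranscription.formattedString)
                }
            }
            _ = task // keep a reference if you need to cancel later
        }
    }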
Posted by GeertB. Last updated.
Post not yet marked as solved
2 Replies
82 Views
Running through the tutorial on how to sign data using Security.framework, I was trying to understand the format Apple is using and wanting for signatures (as this isn't documented anywhere): https://developer.apple.com/documentation/security/certificate_key_and_trust_services/keys/signing_and_verifying?language=objc

I've learned that the signatures are just ASN.1 objects, with EC signatures being a SEQUENCE of the R and S coordinates as ASN.1 INTEGERs. However, I am noticing that with SecKeyCreateSignature, either the R or the S value will always be prepended with an extra byte. For example:

    30 45
    02 20 66 B7 4C FB FC A0 26 E9 42 50 E8 B4 E3 A2 99 F1 8B A6 93 31 33 E8 7B 6F 95 D7 28 77 52 41 CC 28
    02 21 00 E2 01 CB A1 4C AD 42 20 A2 66 A5 94 F7 B2 2F 96 13 A8 C5 8B 35 C8 D5 72 A0 3D 41 81 90 3D 5A 91
          ^^ why is this here?

This is an ASN.1 SEQUENCE: first a 32-byte INTEGER, then a 33-byte INTEGER. Why is that 00 byte being prepended to the integer? Why is it sometimes on R and sometimes on S? Removing it causes SecKeyVerifySignature to fail, so obviously it's required, but I need to know the logic here, as I'm having to hand-craft these ASN.1 objects and all I have are the raw R and S values.
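[Editor's note: the rule here is standard DER. INTEGERs are encoded as signed, big-endian, two's-complement values, so whenever the leading byte of R or S is 0x80 or above, a 0x00 pad byte is prepended to keep the value positive; which integer gets the pad depends only on its own leading byte. A minimal Swift sketch of hand-crafting the structure from raw R/S values; function names are the editor's, and short-form DER lengths are assumed (total under 128 bytes, which holds for P-256):]

    // Minimal sketch: wrap raw big-endian R and S values as DER INTEGERs.
    func derInteger(_ raw: [UInt8]) -> [UInt8] {
        var bytes = Array(raw.drop { $0 == 0 })  // strip redundant leading zeros
        if bytes.isEmpty { bytes = [0] }
        if bytes[0] & 0x80 != 0 {
            bytes.insert(0x00, at: 0)            // keep the value positive
        }
        return [0x02, UInt8(bytes.count)] + bytes
    }

    func derECDSASignature(r: [UInt8], s: [UInt8]) -> [UInt8] {
        let body = derInteger(r) + derInteger(s)
        return [0x30, UInt8(body.count)] + body  // SEQUENCE { r INTEGER, s INTEGER }
    }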
Posted by ecnepsnai. Last updated.