Posts

Post not yet marked as solved
0 Replies
31 Views
I am fairly new to 3D model rendering and do not know where to start. I am trying to scan an environment, ideally with ARKit & RealityKit or SceneKit. This includes:
- applying realistic textures to the model;
- being able to save it as a .usdz file (so it can be opened within the app itself);
- once it is saved, doing post-processing measurements within the model.
I would prefer to accomplish this using a mesh instead of the point cloud used in Apple's sample project. Would this be doable with Apple's APIs on a mobile device, or would a third-party program be necessary? I have managed to create a USDZ file using SceneKit's scene.write(to:options:delegate:progressHandler:) method. However, the saved file is a "single object", and it is not possible to use raycasting to do post-processing measurements in the model.
Posted by
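Not an answer to the meshing question, but a minimal sketch of the export step, assuming the scan has already produced an SCNScene (capturedScene, the file name, and the error handling are placeholders). Keeping each reconstructed surface as its own named SCNNode before exporting should also make later hit-testing of individual parts easier than a single merged object would.

    import SceneKit

    // Hypothetical helper: writes a captured SceneKit scene to a .usdz file in
    // the app's Documents directory. write(to:options:delegate:progressHandler:)
    // infers the USDZ format from the file extension.
    func exportScanAsUSDZ(_ capturedScene: SCNScene) throws -> URL {
        let documents = try FileManager.default.url(for: .documentDirectory,
                                                    in: .userDomainMask,
                                                    appropriateFor: nil,
                                                    create: true)
        let destination = documents.appendingPathComponent("scan.usdz")
        let succeeded = capturedScene.write(to: destination,
                                            options: nil,
                                            delegate: nil,
                                            progressHandler: nil)
        guard succeeded else { throw CocoaError(.fileWriteUnknown) }
        return destination
    }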
Post not yet marked as solved
0 Replies
42 Views
It seems like this may have been an issue for a while, based on what I've seen, but I have added a toolbar item to a TextField's keyboard and it doesn't show. The only way I can get it to show is to open the keyboard, type something, close the keyboard, and then reopen it. Does anyone have a workaround for this? It's like Apple purposely wants to make it difficult to close the keyboard.

    TextField("Something here...", text: $text, axis: .vertical)
        .multilineTextAlignment(.leading)
        .toolbar {
            ToolbarItemGroup(placement: .keyboard, content: {
                Button("Close") { }
            })
        }
Posted by
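A minimal sketch of one commonly suggested workaround, not an official fix: host the field in a navigation container so the keyboard toolbar is registered before the field is focused, and dismiss via @FocusState rather than relying on the toolbar appearing. The view and property names are placeholders.

    import SwiftUI

    struct CloseableField: View {
        @State private var text = ""
        @FocusState private var isFocused: Bool

        var body: some View {
            NavigationStack {
                TextField("Something here...", text: $text, axis: .vertical)
                    .multilineTextAlignment(.leading)
                    .focused($isFocused)
                    .toolbar {
                        ToolbarItemGroup(placement: .keyboard) {
                            Spacer()
                            // Resigning focus dismisses the keyboard even if the
                            // toolbar itself fails to appear on first focus.
                            Button("Close") { isFocused = false }
                        }
                    }
            }
        }
    }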
Post not yet marked as solved
0 Replies
40 Views
If my app used the RoomPlan API to create a parametric representation of the room, would it open on iPhones that don't have LiDAR? I'm aware that the LiDAR-equipped iPhone models are the iPhone 12 Pro & Pro Max, iPhone 13 Pro & Pro Max, iPhone 14 Pro & Pro Max, and iPhone 15 Pro & Pro Max.
Posted by
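For what it's worth, a short sketch of the usual runtime gate: RoomPlan needs LiDAR, but the app can still install and launch on other iPhones as long as the capture feature is checked for support before use (and the project doesn't declare a required device capability that excludes those devices).

    import RoomPlan

    func startRoomCaptureIfPossible() {
        guard RoomCaptureSession.isSupported else {
            // No LiDAR (or otherwise unsupported): hide the scan feature or
            // fall back to a non-RoomPlan flow here.
            return
        }
        // Safe to set up RoomCaptureView / RoomCaptureSession here.
    }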
Post not yet marked as solved
0 Replies
48 Views
I was following the SwiftUI tutorial, section 6 (https://developer.apple.com/tutorials/swiftui/creating-and-combining-views), to use a circle image to create an overlapping effect on the map. It turns out that when using a GeometryReader, the bottom padding does not work at all:

    VStack {
        MapView()
            .frame(height: 300)
        CircleImage()
            .offset(y: -130)
            .padding(.bottom, -130)
        VStack(alignment: .leading) {
            Text("Turtle Rock")
                .font(.title)
            HStack {
                Text("Joshua Tree National Park")
                    .font(.subheadline)
                Spacer()
                Text("California")
                    .font(.subheadline)
            }
        }
        .padding(10)
    }

This is the code for the CircleImage view:

    GeometryReader { geometry in
        let _ = print(geometry.size.width)
        AsyncImage(url: URL(string: "https://cms.rationalcdn.com/v3/assets/blteecf9626d9a38b03/bltf5486c52361f2012/6144fafd39dff133fc23de9f/img-ios.png"))
            .frame(width: geometry.size.width)
            .clipShape(Circle())
            .overlay {
                Circle().stroke(.white, lineWidth: 4)
            }
            .shadow(radius: 7)
    }
Posted by
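One likely reading, offered as a guess rather than a confirmed diagnosis: a GeometryReader expands to fill whatever space its parent offers, so the CircleImage no longer has the fixed intrinsic size the tutorial relies on, and the negative offset/padding pair has nothing to pull up against. A sketch under that assumption (the 200x200 footprint is arbitrary, and the image URL is the one from the post):

    struct CircleImage: View {
        var body: some View {
            GeometryReader { geometry in
                AsyncImage(url: URL(string: "https://cms.rationalcdn.com/v3/assets/blteecf9626d9a38b03/bltf5486c52361f2012/6144fafd39dff133fc23de9f/img-ios.png")) { image in
                    image.resizable().scaledToFill()
                } placeholder: {
                    ProgressView()
                }
                .frame(width: geometry.size.width, height: geometry.size.height)
                .clipShape(Circle())
                .overlay { Circle().stroke(.white, lineWidth: 4) }
                .shadow(radius: 7)
            }
            // Constraining the reader itself restores a fixed footprint, so the
            // -130 offset and -130 bottom padding behave as in the tutorial.
            .frame(width: 200, height: 200)
        }
    }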
Post not yet marked as solved
0 Replies
34 Views
My MacBook Air (mid-2013) running macOS Big Sur 11.7.10 has a broken System Preferences. The padlock for making changes does nothing when clicked, and neither does any other tab or setting. I can still get into panes like Sound or Accessibility, but everything I should be able to do within them just... doesn't work. Nothing happens when I click on them. It's like they're just images.
Posted by
Post not yet marked as solved
1 Reply
69 Views
Hi all, I need some help debugging some code I wrote. Just as a preface, I'm an extremely new VR/AR developer and also very new to using ARKit + RealityKit, so please bear with me. :) I'm just trying to make a simple program that will track an image and place an entity on it. The image is tracked correctly, but the moment the program recognizes the image and tries to place an entity on it, the program crashes. Here's my code:

VIEWMODEL CODE:

    @Observable
    class ImageTrackingModel {
        var session = ARKitSession()              // ARKit session used to manage AR content
        var imageAnchors = [UUID: Bool]()         // Tracks whether specific anchors have been processed
        var entityMap = [UUID: ModelEntity]()     // Maps anchors to their corresponding ModelEntity
        var rootEntity = Entity()                 // Root entity to which all other entities are added

        let imageInfo = ImageTrackingProvider(
            referenceImages: ReferenceImage.loadReferenceImages(inGroupNamed: "referancePaper")
        )

        init() {
            setupImageTracking()
        }

        func setupImageTracking() {
            if ImageTrackingProvider.isSupported {
                Task {
                    try await session.run([imageInfo])
                    for await update in imageInfo.anchorUpdates {
                        updateImage(update.anchor)
                    }
                }
            }
        }

        func updateImage(_ anchor: ImageAnchor) {
            let entity = ModelEntity(mesh: .generateSphere(radius: 0.05)) // THIS IS WHERE THE CODE CRASHES

            if imageAnchors[anchor.id] == nil {
                rootEntity.addChild(entity)
                imageAnchors[anchor.id] = true
                print("Added new entity for anchor \(anchor.id)")
            }

            if anchor.isTracked {
                entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
                print("Updated transform for anchor \(anchor.id)")
            }
        }
    }

APP:

    @main
    struct MyApp: App {
        @State var session = ARKitSession()
        @State var immersionState: ImmersionStyle = .mixed
        private var viewModel = ImageTrackingModel()

        var body: some Scene {
            WindowGroup {
                ModeSelectView()
            }
            ImmersiveSpace(id: "appSpace") {
                ModeSelectView()
            }
            .immersionStyle(selection: $immersionState, in: .mixed)
        }
    }

CONTENT VIEW:

    RealityView { content in
        Task {
            viewModel.setupImageTracking()
        }
    }
    // I'm seriously so clueless on how to use this view
Posted by
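A hedged sketch rather than a confirmed diagnosis: two frequent causes of crashes at exactly this point are mutating RealityKit entities from the background task that consumes anchorUpdates, and a root entity that was never added to the RealityView's content. The variant below hops to the main actor before touching entities and reuses one entity per anchor via entityMap; everything else is unchanged from the post.

    func setupImageTracking() {
        guard ImageTrackingProvider.isSupported else { return }
        Task {
            try await session.run([imageInfo])
            for await update in imageInfo.anchorUpdates {
                await updateImage(update.anchor)   // hop to the main actor for entity work
            }
        }
    }

    @MainActor
    func updateImage(_ anchor: ImageAnchor) {
        if entityMap[anchor.id] == nil {
            let entity = ModelEntity(mesh: .generateSphere(radius: 0.05))
            entityMap[anchor.id] = entity
            rootEntity.addChild(entity)
        }
        if anchor.isTracked {
            entityMap[anchor.id]?.transform = Transform(matrix: anchor.originFromAnchorTransform)
        }
    }

Separately, the RealityView closure probably needs content.add(viewModel.rootEntity) (instead of only kicking off setupImageTracking) so the anchored entities actually appear in the immersive space.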
Post not yet marked as solved
0 Replies
53 Views
Hello, I'll keep this brief: when I compile any app in Xcode and transfer it to my iPhone, including the "Hello world!" test app, whether over the network or via USB cable, the app simply doesn't work when I open it and the iPhone crashes. That's all. My setup: Xcode 15.3, iPhone 14 Pro Max, iOS 17.5, latest macOS version.
Posted by
Post not yet marked as solved
1 Reply
42 Views
I am using @AppStorage in a model object (see the code below, which works as expected).

    class ModelObject {
        static let shared = ModelObject()

        @AppStorage("enhanced") var scriptPickers: Bool = true

        var defaultDependentValue: String {
            scriptPickers ? "Enhanced" : "NOT enhanced"
        }
    }

    struct ContentView: View {
        @AppStorage("enhanced") var scriptPickers: Bool = true

        var body: some View {
            VStack {
                Toggle(isOn: $scriptPickers, label: {
                    Text("userDefault val")
                })
                Text("value: \(ModelObject.shared.defaultDependentValue)")
            }
        }
    }

Now I want to test my model object in a way that will allow me to use a mock instance of UserDefaults, but I am having trouble with the syntax. I tried adding a userDefaults property and referring to it in the @AppStorage declaration:

    class ModelObject {
        static let shared = ModelObject()

        let userDefaults: UserDefaults

        init(userDefaults: UserDefaults = .standard) {
            self.userDefaults = userDefaults
        }

        @AppStorage("enhanced", store: userDefaults) var scriptPickers: Bool = true

        var defaultDependentValue: String {
            scriptPickers ? "Enhanced" : "NOT enhanced"
        }
    }

However, I can't find a way to avoid the syntax error this generates:

    Cannot use instance member 'userDefaults' within property initializer; property initializers run before 'self' is available

Any guidance on how I might be able to:
- continue using @AppStorage
- test my class in a way that doesn't force me to use UserDefaults.standard

Thanks in advance, Mike
Posted by
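A minimal sketch of one way around the error, assuming the goal is only to swap the backing store in tests: leave the wrapper uninitialized at the declaration and assign its backing storage in init, where the injected userDefaults parameter is available. AppStorage already has an initializer that takes a store, so no custom wrapper is needed. (Note that @AppStorage only drives SwiftUI view updates when used inside a View; in a plain model object it simply reads and writes the store.)

    import SwiftUI

    final class ModelObject {
        static let shared = ModelObject()

        @AppStorage var scriptPickers: Bool

        init(userDefaults: UserDefaults = .standard) {
            // Initialize the wrapper itself here, where the injected store exists.
            _scriptPickers = AppStorage(wrappedValue: true, "enhanced", store: userDefaults)
        }

        var defaultDependentValue: String {
            scriptPickers ? "Enhanced" : "NOT enhanced"
        }
    }

A test could then inject something like UserDefaults(suiteName: "ModelObjectTests") instead of .standard and remove the suite in tearDown.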
Post not yet marked as solved
0 Replies
44 Views
Can I use Apple's sound recognition in my augmented reality app to trigger content? Or is there another framework I can use for this?
Posted by
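As far as I know, the Sound Recognition accessibility feature isn't exposed as an API, but the SoundAnalysis framework offers a built-in sound classifier that can serve the same purpose. A rough sketch, assuming microphone audio is fed in from an AVAudioEngine input tap (class and property names are placeholders, and sharing the microphone with an AR session may need extra audio-session configuration):

    import AVFAudio
    import SoundAnalysis

    final class SoundTrigger: NSObject, SNResultsObserving {
        private let analyzer: SNAudioStreamAnalyzer
        var onDetect: ((String, Double) -> Void)?   // label, confidence -> trigger AR content

        init(format: AVAudioFormat) throws {
            analyzer = SNAudioStreamAnalyzer(format: format)
            super.init()
            let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
            try analyzer.add(request, withObserver: self)
        }

        // Call from an AVAudioEngine input-node tap.
        func analyze(buffer: AVAudioPCMBuffer, at time: AVAudioTime) {
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }

        func request(_ request: SNRequest, didProduce result: SNResult) {
            guard let result = result as? SNClassificationResult,
                  let top = result.classifications.first else { return }
            onDetect?(top.identifier, top.confidence)
        }

        func request(_ request: SNRequest, didFailWithError error: Error) {
            print("Sound analysis failed: \(error)")
        }

        func requestDidComplete(_ request: SNRequest) { }
    }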
Post not yet marked as solved
0 Replies
47 Views
Since updating to Xcode 15.3, I'm seeing the following whether I initialize Map() in SwiftUI or use Apple's example Overlay project:

    Thread Performance Checker: Thread running at User-interactive quality-of-service class waiting on a thread without a QoS class specified (base priority 0). Investigate ways to avoid priority inversions
    PID: 2148, TID: 42369
    Backtrace
    =================================================================
    3   VectorKit                 0x00007ff81658b145 ___ZN3geo9TaskQueue5applyEmNSt3__18functionIFvmEEE_block_invoke + 38
    4   libdispatch.dylib         0x00000001036465c2 _dispatch_client_callout2 + 8
    5   libdispatch.dylib         0x000000010365d79b _dispatch_apply_invoke3 + 527
    6   libdispatch.dylib         0x000000010364658f _dispatch_client_callout + 8
    7   libdispatch.dylib         0x0000000103647c6d _dispatch_once_callout + 66
    8   libdispatch.dylib         0x000000010365c89b _dispatch_apply_redirect_invoke + 214
    9   libdispatch.dylib         0x000000010364658f _dispatch_client_callout + 8
    10  libdispatch.dylib         0x000000010365a67f _dispatch_root_queue_drain + 1047
    11  libdispatch.dylib         0x000000010365af9d _dispatch_worker_thread2 + 277
    12  libsystem_pthread.dylib   0x00000001036e2b43 _pthread_wqthread + 262
    13  libsystem_pthread.dylib   0x00000001036e1acf start_wqthread + 15
Posted by
Post not yet marked as solved
2 Replies
71 Views
Posting this on behalf of my colleague, who has a project in mind that requires a huge amount of RAM. Is it true that modern Mac Pros can only have up to 192GB of RAM, roughly an eighth of what five-year-old Intel-based Mac Pros supported?
Posted by
Post not yet marked as solved
0 Replies
67 Views
Calling SKAction.follow(...) causes my SKSpriteNode to rotate 90 degrees clockwise instead of staying horizontal as it follows my UIBezierPath. I have this code (within my GameViewController class) which implements following an SKSpriteNode along a UIBezierPath.

===== Please note that a brilliant contributor solved the above challenge by creating a new class, e.g., class NewClass: NSObject. Nevertheless, I need the solution to appear in an extension of my GameViewController. =====

    func createTrainPath() {
        trackRect = CGRect(x: tracksPosX - tracksWidth/2,
                           y: tracksPosY,
                           width: tracksWidth,
                           height: tracksHeight)
        trainPath = UIBezierPath(ovalIn: trackRect)
    } // createTrainPath

    func startFollowTrainPath() {
        var trainAction = SKAction.follow(trainPath.cgPath,
                                          asOffset: false,
                                          orientToPath: true,
                                          speed: theSpeed)
        trainAction = SKAction.repeatForever(trainAction)
        myTrain.run(trainAction, withKey: runTrainKey)
    } // startFollowTrainPath

    func stopFollowTrainPath() {
        guard myTrain != nil else { return }
        myTrain.removeAction(forKey: runTrainKey)
        savedTrainPosition = getPositionFor(myTrain, orPath: trainPath)
    } // stopFollowTrainPath
Posted by
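A hedged sketch as a GameViewController extension, covering two readings of the problem. With orientToPath: true, SKAction.follow keeps the node's +x axis pointing along the path, so artwork drawn facing "up" appears rotated by 90 degrees; if the train should instead stay level, orientToPath: false is enough. The scene parameter and the sign of the pi/2 correction are assumptions to adjust for the actual project.

    import SpriteKit

    extension GameViewController {

        // Reading 1: the train should stay horizontal while moving along the path.
        func startFollowTrainPathStayingHorizontal() {
            var trainAction = SKAction.follow(trainPath.cgPath,
                                              asOffset: false,
                                              orientToPath: false,
                                              speed: theSpeed)
            trainAction = SKAction.repeatForever(trainAction)
            myTrain.run(trainAction, withKey: runTrainKey)
        }

        // Reading 2: the train should turn with the track, but the artwork is
        // offset by 90 degrees. Run the follow action on a carrier node and
        // counter-rotate the sprite once inside it.
        func startFollowTrainPathOriented(in scene: SKScene) {
            let carrier = SKNode()
            carrier.position = myTrain.position
            myTrain.removeFromParent()
            myTrain.position = .zero
            myTrain.zRotation = .pi / 2   // flip the sign if it ends up inverted
            carrier.addChild(myTrain)
            scene.addChild(carrier)

            var trainAction = SKAction.follow(trainPath.cgPath,
                                              asOffset: false,
                                              orientToPath: true,
                                              speed: theSpeed)
            trainAction = SKAction.repeatForever(trainAction)
            carrier.run(trainAction, withKey: runTrainKey)
        }
    }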
Post not yet marked as solved
0 Replies
54 Views
Apple's Transporter.app reports this issue:

    Asset validation failed (90237)
    The product archive package's signature is invalid. Ensure that it is signed with your "3rd Party Mac Developer Installer" certificate.

In the post https://developer.apple.com/forums/thread/680438, Quinn "The Eskimo!" replied:

For the Mac App Store you need:
- Apple Development: TTT (or the older Mac Developer: TTT) for day-to-day development
- 3rd Party Mac Developer Installer: TTT for signing the installer package you submit to App Store Connect
- 3rd Party Mac Developer Application: TTT for signing the code inside that installer package

According to https://stackoverflow.com/questions/29039462/which-certificate-should-i-use-to-sign-my-mac-os-x-application, the Apple code-signing certificate types are:
- Mac App Distribution ("3rd Party Mac Developer Application: Team Name"): used to sign a Mac app before submitting it to the Mac App Store.
- Mac Installer Distribution ("3rd Party Mac Developer Installer: Team Name"): used to sign and submit a Mac installer package, containing your signed app, to the Mac App Store.

Both the Mac App Distribution and Mac Installer Distribution certificates were added to Keychain Access (Picture 1). Xcode -> Preferences -> Accounts -> Manage Certificates shows that "Mac Installer Distribution" is there (Picture 2), and "Mac Installer Distribution" is also shown under Certificates, Identifiers & Profiles -> Certificates for my account on developer.apple.com (Picture 3).

Is "3rd Party Mac Developer Installer" (that is, "Mac Installer Distribution") still missing somewhere? What can I do to fix "Asset validation failed (90237)" in Apple Transporter.app?
Posted by
Post not yet marked as solved
0 Replies
83 Views
Good day folks. We have a workflow where a new Sign in with Apple user registers (the first SIWA login, where the user can pick a name and show/hide their email), and the server-side code obtains a refresh token from the SIWA REST API. That refresh token is stored internally against the user's profile in the DB for future use. Whenever a user account is deleted on the server side, we use that refresh token to revoke Sign in with Apple, so that the user would need to go through the registration flow rather than sign in (where they have the option to specify a name and show/hide their email). That worked beautifully until we added an App Clip to the app. The code that obtains the refresh token respects the correct bundle ID for the main app / App Clip, and everything seems to work. Both of Apple's APIs return OK codes. In fact, we even get the email from Apple when the token is revoked, which reads "APP_NAME has revoked your Sign in with Apple account. Next time you use Sign in with Apple to sign in to your onUgo Access account, you will have to share your name and email again". The problem is: it doesn't. SIWA still offers to "sign in" as if the account were still linked, and the app still shows up as an app using Sign in with Apple in iPhone settings. What's even more mysterious is that you can't delete/revoke/"Stop using Apple ID" on that SIWA link with the app from iPhone settings either! It seems to work, but the app never disappears from the list, as if it fails silently. Could anyone please help shed some light on this?
Posted by
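For reference, a minimal sketch of the revocation call described above, assuming a Swift environment with Foundation's async URLSession APIs (clientID and clientSecret are placeholders, the values are assumed to be URL-safe, and the client_id must match the bundle ID the refresh token was issued for, which is where a main app vs. App Clip mismatch could bite):

    import Foundation

    func revokeSignInWithApple(refreshToken: String,
                               clientID: String,       // bundle ID the token was issued for
                               clientSecret: String) async throws {
        var request = URLRequest(url: URL(string: "https://appleid.apple.com/auth/revoke")!)
        request.httpMethod = "POST"
        request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
        request.httpBody = [
            "client_id=\(clientID)",
            "client_secret=\(clientSecret)",
            "token=\(refreshToken)",
            "token_type_hint=refresh_token"
        ].joined(separator: "&").data(using: .utf8)

        let (_, response) = try await URLSession.shared.data(for: request)
        guard (response as? HTTPURLResponse)?.statusCode == 200 else {
            throw URLError(.badServerResponse)
        }
    }

Note that a 200 here only means Apple accepted the request; as the post shows, it doesn't guarantee the link disappears from the device's settings.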
Post not yet marked as solved
0 Replies
54 Views
I was trying to update my workflow, but Xcode shows me this: However, when I push to my testFlight branch, it goes through and does everything as it's supposed to. I even got an email saying that everything was successful. If I try to go through the process of setting up Xcode Cloud, it just shows the spinning indicator forever and I have to force quit Xcode. Any suggestions on how to fix this? I am running Xcode 15.2.
Posted by
Post not yet marked as solved
0 Replies
67 Views
I created a PWA that requires access to the user's geolocation to perform a certain action in the system. The correct behavior would be: the user opens the application, and the operating system prompts them to allow sharing their exact location with the PWA. However, this is not happening for a few users who have an iPhone 11 or XR. I tested it on an iPhone 14, 13, and 11 Pro, and even an iPhone 6, and it works as expected. I spoke directly with a user who was experiencing the problem and ran some tests:
- I checked that location access was allowed in Settings.
- I verified that Safari's location setting was set to always ask.
- In the settings for my system's website, I checked that location access was set to always ask.
- We changed all the prompt options to allow.
- We opened https://whatpwacando.today/ and found that geolocation was not possible there either.
Everything indicates that the issue lies with these users' phones; however, other geolocation apps work fine on them, which leads me to think it might be a problem with Safari's handling of the HTML Geolocation API. I'm not sure if there are more advanced settings that could help, or if anyone else has encountered this issue.
Posted by
Post marked as solved
1 Reply
66 Views
I have a macOS app (AppKit-based, not Mac Catalyst) and an iOS app serving a very similar purpose and user group. Both currently use non-consumable IAPs to unlock functionality. I'm considering a subscription model and wonder if both apps could share the same IAP products. I'm well aware that both apps need to use the same Bundle ID / App Store Connect entry, so I'm willing to discontinue the existing Bundle IDs (while keeping the apps installable for existing customers), but Apple's info about "Universal Purchase" only mentions Mac Catalyst apps for macOS:

Mac Catalyst. With Xcode 11.4 or later, Mac apps built with Mac Catalyst can share a bundle ID with the iOS version of the app, so universal purchase is supported.

Can an AppKit-based macOS app and an iOS app share the same Bundle ID and offer a Universal Purchase?
Posted by
Post not yet marked as solved
1 Reply
82 Views
I have an iPad mini 4 with iPadOS 15.8.2. I am trying to install iPadOS 17.5 beta 4, but I cannot succeed in installing it. I read the documents related to iPadOS 17.5 beta 4 in the Apple Developer Program. Following the instructions, I repeatedly tried to install the profile on my mini 4 in various ways, but in vain. I made inquiries to AppleCare and Apple Developer Support, and they gave me what advice they could from their resources. They told me my steps seemed correct for installing the profile related to iPadOS 17.5 beta 4 on devices running iPadOS 16.3 and earlier. They also told me they basically didn't know why or how it fails with a beta program, and suggested I post my problem in the Developer Forums. I also have an M2 MacBook Pro, an iPad 9, and an iPhone SE 3; "Beta Updates" appears in Settings on those three devices, but only "Automatic Updates" appears on the mini 4. I am sure I have enabled developer access with my Apple ID. What should I do in order to install the profile for iPadOS 17.5 beta 4? Please tell me how.
Posted by
