Discuss augmented reality and virtual reality app capabilities.

Posts under AR / VR tag

118 Posts
Post not yet marked as solved
1 Replies
378 Views
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.debugOptions = .showStatistics // Error:
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5775: failed assertion `Draw Errors Validation Vertex Function(vsSdfFont): the offset into the buffer viewConstants that is bound at buffer index 4 must be a multiple of 256 but was set to 61840.'
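The vsSdfFont vertex function named in the assertion belongs to the text renderer used by the statistics overlay, so the failure appears to come from the .showStatistics option itself tripping the Metal API validation layer, which is on by default in debug builds. A minimal sketch of a workaround, assuming the overlay is only needed for ad-hoc debugging: gate the option behind a custom compilation flag, or alternatively turn off Metal API Validation under Product > Scheme > Edit Scheme > Run > Diagnostics.

import SwiftUI
import RealityKit

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        #if SHOW_AR_STATS // hypothetical flag, defined in Active Compilation Conditions
        arView.debugOptions = .showStatistics
        #endif
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}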
Posted by shaping. Last updated.
Post not yet marked as solved
1 Replies
700 Views
Hey everyone, I'm running into an issue where my USDZ model does not show up in Reality Composer Pro. It was exported from Blender as a USD and converted in Reality Converter. See the attached image. It's strange, because the USDZ model appears fine in Preview, but once it is brought into RCP I receive this popup and the model does not appear. I'm not sure how to resolve this multiple-root-level issue. If anyone can point me in the right direction or offer any feedback, it's much appreciated! Thank you!
Posted. Last updated.
Post not yet marked as solved
0 Replies
499 Views
I have a RealityView and I want to add an Entity with an Attachment. Assume I have a viewModel managing my entities, and that addEntityGesture() adds a new Entity under the rootEntity.

RealityView { content, attachments in
    // Load initial content
    content.add(viewModel.rootEntity)
} update: { updateContent, updateAttachments in
    //
} attachments: {
    //
}
.gesture(addEntityGesture())

I know that we can create attachments in the attachments closure and add them as entities in the make closure, but what if I want to add an entity with an attachment on the fly?
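One pattern that should cover the on-the-fly case (a sketch, not confirmed against the poster's project): drive the attachments closure from @State, append an ID from the gesture, and parent the corresponding entity in the update closure, which re-runs whenever that state changes. The root entity, tap gesture, and placement offset below are stand-ins for the poster's viewModel and addEntityGesture().

import SwiftUI
import RealityKit

struct OnTheFlyAttachments: View {
    @State private var attachmentIDs: [UUID] = [] // drives the attachments closure
    let rootEntity = Entity() // stand-in for the poster's viewModel.rootEntity

    var body: some View {
        RealityView { content, _ in
            content.add(rootEntity)
        } update: { _, attachments in
            // Re-runs when attachmentIDs changes; look up each declared
            // attachment and parent it under the root entity.
            for id in attachmentIDs {
                if let entity = attachments.entity(for: id) {
                    rootEntity.addChild(entity)
                    entity.position = [0, 0.2, 0] // hypothetical placement
                }
            }
        } attachments: {
            ForEach(attachmentIDs, id: \.self) { id in
                Attachment(id: id) {
                    Text("Tag \(id.uuidString.prefix(4))")
                }
            }
        }
        .onTapGesture {
            // Stand-in for the poster's addEntityGesture(): each appended
            // ID triggers the update and attachments closures above.
            attachmentIDs.append(UUID())
        }
    }
}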
Posted by GerryGGG. Last updated.
Post not yet marked as solved
1 Replies
527 Views
Hey guys, how can I fit RealityView content inside a volumetric window? I have this simple example:

WindowGroup(id: "preview") {
    RealityView { content in
        if let entity = try? await Entity(named: "name") {
            content.add(entity)
            entity.setPosition(.zero, relativeTo: entity.parent)
        }
    }
}
.defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
.windowStyle(.volumetric)

I understand that we can resize a Model3D view automatically using .resizable() and .scaledToFit() after the model has loaded. Can we achieve the same result using a RealityView? Cheers
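RealityView has no .resizable() equivalent, but the same effect can be approximated by measuring the model and scaling it to the window's volume. A minimal sketch of the make closure, assuming the 0.6 m default size from the example above:

RealityView { content in
    if let entity = try? await Entity(named: "name") {
        content.add(entity)
        // Measure the model's visual bounds and scale it uniformly so
        // its largest dimension fits the assumed 0.6 m volume.
        let bounds = entity.visualBounds(relativeTo: nil)
        let maxExtent = max(bounds.extents.x, bounds.extents.y, bounds.extents.z)
        if maxExtent > 0 {
            entity.scale *= SIMD3<Float>(repeating: 0.6 / maxExtent)
        }
        entity.setPosition(.zero, relativeTo: entity.parent)
    }
}

To avoid hard-coding 0.6, the usual refinement is to read the actual window size with GeometryReader3D and content.convert(_:from:to:) instead.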
Posted by GerryGGG. Last updated.
Post not yet marked as solved
1 Replies
425 Views
I successfully set a picture as the background of a full-immersion ImmersiveSpace with the following code.

import SwiftUI
import RealityKit

struct MainBackground: View {
    var body: some View {
        RealityView { content in
            guard let resource = try? await TextureResource(named: "Image_Name") else {
                fatalError("Error.")
            }
            var material = UnlitMaterial()
            material.color = .init(texture: .init(resource))
            let entity = Entity()
            entity.components.set(ModelComponent(
                mesh: .generateSphere(radius: 1000),
                materials: [material]
            ))
            entity.scale *= .init(x: -1, y: 1, z: 1)
            content.add(entity)
        }
    }
}

However, when running it I found that the background is not fixed when the user moves; it follows the user's movement, which feels unrealistic. How can I fix the background in the place where it first appears, so that moving around feels like really walking through the scene instead of the background following the user?
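Content in a full ImmersiveSpace is world-anchored, so the sphere is most likely staying put; with a 1000 m radius, though, walking a few meters changes the viewing angle to the texture by a fraction of a percent, which is exactly the zero-parallax look of a skybox that seems to "follow" you. A sketch of the difference, under the assumption that a room-scale radius is acceptable: bring the geometry close enough that translation produces visible parallax.

entity.components.set(ModelComponent(
    mesh: .generateSphere(radius: 3), // assumed room-scale radius instead of 1000
    materials: [material]
))
entity.scale *= .init(x: -1, y: 1, z: 1) // keep the texture on the inside faces

For a genuinely walkable environment, finite-distance geometry (a modeled room or terrain) is what creates parallax cues; any single equirectangular image rendered at effectively infinite distance will appear to track the viewer.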
Posted by lijiaxu. Last updated.
Post not yet marked as solved
1 Replies
497 Views
Hello, I am very new here in the forum (and to iOS dev as well). I am trying to build an app that uses 3D face filters, and I want to use Reality Composer. I knew Xcode 15 did not include it, so I downloaded the beta 8 version (as suggested in another post). This one actually has Reality Composer Pro (Xcode -> Developer Tools -> Reality Composer Pro), but the Experience.rcproject still does not appear. Is there a way to create one? When I use Reality Composer Pro, it seems only able to create standalone projects, and it does not seem to be bundled in any way with Xcode. Thanks for your time, people!
Posted. Last updated.
Post not yet marked as solved
0 Replies
438 Views
RealityKit doesn't appear to support particles. After exporting particles from Blender 4.0.1 in standard .usdz format, the particle system renders correctly in Finder and Reality Converter, but when loaded into and anchored in RealityKit, nothing happens. This appears to be a bug in RealityKit. I tried one or more particle instances, and nothing renders.
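For what it's worth, if the target platform is visionOS 1+ (or iOS 18+), RealityKit has a native emitter that sidesteps the USD export question entirely. A minimal sketch, with all emitter values invented for illustration:

import RealityKit

// Sketch: a native RealityKit particle emitter instead of Blender-baked
// particles. Attach the entity to any anchored parent as usual.
let emitterEntity = Entity()
var emitter = ParticleEmitterComponent()
emitter.emitterShape = .sphere
emitter.mainEmitter.birthRate = 200 // particles per second (assumed value)
emitter.mainEmitter.lifeSpan = 2    // seconds (assumed value)
emitterEntity.components.set(emitter)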
Posted. Last updated.
Post not yet marked as solved
0 Replies
339 Views
Hello. I am a student who has just started studying Swift. I have succeeded in obtaining body joint information through 'skelton.jointLandmarks' on a normal, unzoomed screen, but whenever I zoom in, the joint positions no longer land on the human body and drift sideways and downward. My guess is that the center of the AR frame no longer coincides with the center of the phone screen. I've been searching for information for three days because of this problem, but I couldn't find a similar case and haven't been able to solve it. If anyone has solved a similar problem, I would appreciate it if you could let me know. The link below shows how I zoomed in on the ARView: https://stackoverflow.com/questions/64896064/can-i-use-zoom-with-arview Thank you. Below is the situation where I'm currently having trouble.
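A guess, since the screenshot did not survive: the zoom in the linked answer scales the ARView with a CGAffineTransform, while the joint landmarks keep coming back in the unzoomed camera/view space, so every projected point has to be pushed through the same scale about the same anchor point. A hypothetical helper, assuming the view is zoomed about its center:

import UIKit
import RealityKit

// Hypothetical helper: remap a projected joint point into the zoomed
// view's coordinate space, assuming the ARView was scaled about its
// center by `zoomFactor`.
func adjustForZoom(_ point: CGPoint, in view: ARView, zoomFactor: CGFloat) -> CGPoint {
    let center = CGPoint(x: view.bounds.midX, y: view.bounds.midY)
    return CGPoint(
        x: center.x + (point.x - center.x) * zoomFactor,
        y: center.y + (point.y - center.y) * zoomFactor
    )
}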
Posted by zlgns. Last updated.
Post not yet marked as solved
1 Replies
344 Views
In visionOS, a lot of SwiftUI code with 3D attributes is adapted from iOS, e.g. .padding(_:) becomes .padding3D(_:). On iOS, .offset(x:_, y:_) only has X and Y, but in visionOS a view sits in a 3D scene, so I want an offset with Z. .offset can't take Z, so I tried offset3D:

import SwiftUI

// Some View
    .offset3D(x: Number, y: Number, z: Number)

Xcode reports an error: Error: No member 'offset3D'. Does anyone know of a modifier like offset that can take Z in visionOS?
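visionOS does not expose an .offset3D, but it does add a z-only companion to the familiar modifier: .offset(z:). Chaining the two covers all three axes; a minimal sketch:

import SwiftUI

struct OffsetZExample: View {
    var body: some View {
        Text("Hello")
            .offset(x: 20, y: 10) // X and Y in points, as on iOS
            .offset(z: 30)        // visionOS-only; positive z moves toward the viewer
    }
}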
Posted by lijiaxu. Last updated.
Post not yet marked as solved
0 Replies
323 Views
Hi! I'm having an issue creating a PortalComponent on visionOS. I'm trying to anchor a portal to a wall or floor anchor, and the portal always appears rotated 90 degrees relative to the anchor: with a vertical (wall) anchor the portal appears horizontal in the scene, and with a horizontal (floor) anchor the portal appears vertical. I've tested on Xcode 15.1.0 beta 3, 15.1.0 beta 2, and 15.0 beta 8. Any ideas? Thank you so much!
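One thing worth ruling out (a sketch under that assumption, not a confirmed diagnosis): RealityKit's two plane generators lie in different planes. generatePlane(width:height:) is vertical (XY) while generatePlane(width:depth:) is horizontal (XZ), so pairing the wrong variant with an anchor produces exactly this 90-degree mismatch.

import RealityKit

// Sketch: a portal meant for a horizontal (floor) anchor, using the
// XZ-plane generator; use generatePlane(width:height:) for a wall.
// `worldEntity` is an assumed entity carrying a WorldComponent.
let portal = Entity()
portal.components.set(ModelComponent(
    mesh: .generatePlane(width: 1.0, depth: 1.0),
    materials: [PortalMaterial()]
))
portal.components.set(PortalComponent(target: worldEntity))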
Posted. Last updated.
Post not yet marked as solved
1 Replies
531 Views
https://developer.apple.com/documentation/realitykit/photogrammetrysession/request/modelfile(url:detail:geometry:) According to the documentation, if the given path has a .usdz extension the model is saved as .usdz; otherwise, if we provide a folder, it is saved as OBJ. I tried it, but it doesn't work. Right before saving, it shows the folder that will be written to, but after I click Done and check the folder, it's always empty.
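An empty output folder is also what you get when the process stops before the request finishes, since the model file is only written once .requestComplete arrives. A minimal end-to-end sketch (imagesURL and the output path are assumptions):

import RealityKit

// Sketch: run a photogrammetry request and keep consuming outputs until
// the model has actually been written to disk.
let session = try PhotogrammetrySession(input: imagesURL) // assumed folder of photos
let outputURL = URL(fileURLWithPath: "/tmp/model.usdz")    // .usdz extension => single-file model

try session.process(requests: [
    .modelFile(url: outputURL, detail: .medium)
])

for try await output in session.outputs {
    switch output {
    case .requestComplete(let request, let result):
        print("Finished \(request): \(result)")
    case .requestError(let request, let error):
        print("Request \(request) failed: \(error)")
    default:
        break // progress and other informational events
    }
}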
Posted. Last updated.
Post not yet marked as solved
0 Replies
304 Views
Hi there. I have a performance issue when updating a ModelEntity's position. There are two models with the same parent:

arView.scene.addAnchor(anchorEntity)
anchorEntity.addChild(firstModel)
anchorEntity.addChild(secondModel)

The firstModel is a very large model. I take the position of the second model and apply it to the first:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // ...
    // Here the FPS drops
    firstModel.position = secondModel.position
    // ...
}

In other 3D engines, changing the transform matrix does not affect performance. You can change it a hundred times in a single frame; only the last value is rendered on the next frame. That means changing the position itself should not cause an FPS drop: there is always a value in the transform matrix, the renderer always renders what is stored there, and if you change the value the next frame is simply rendered with the new one; nothing heavy should happen. But in my case the FPS drops only when the model's position is changed; when it isn't, the FPS is 60. So changing the transform matrix is what causes the FPS drop. Can anyone explain why RealityKit's renderer works this way?
Posted. Last updated.
Post not yet marked as solved
0 Replies
498 Views
Hello, I am new to ARKit. Following various articles, I have tried to write a program that displays a cube using ARKit and RealityKit in Xcode, but none of them worked. How can I display a 3D cube in AR? What I tried, roughly speaking, was to open the Xcode iOS AR project template and run the program that is included by default. If you can answer in Japanese, please do so.
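A minimal sketch of the usual RealityKit approach: anchor a box to the first horizontal plane ARKit finds. The cube size and color are arbitrary.

import UIKit
import RealityKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Fill the screen with an ARView.
        let arView = ARView(frame: view.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)

        // A 10 cm blue cube.
        let cube = ModelEntity(
            mesh: .generateBox(size: 0.1),
            materials: [SimpleMaterial(color: .blue, isMetallic: false)]
        )

        // Anchor the cube to the first detected horizontal plane.
        let anchor = AnchorEntity(.plane(.horizontal, classification: .any,
                                         minimumBounds: SIMD2<Float>(0.2, 0.2)))
        anchor.addChild(cube)
        arView.scene.addAnchor(anchor)
    }
}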
Posted by Kazaki. Last updated.
Post marked as solved
2 Replies
653 Views
I'm still trying to understand how to correctly convert 3D coordinates to 2D screen coordinates using convert(position:to:) and project(_:). Below is the example ContentView.swift from the default Augmented Reality App project, with a few important modifications. Two buttons have been added: one that toggles visibility of red circular markers on the screen, and a second that adds blue spheres to the scene. Additionally, a timer has been added to trigger regular screen updates. When run, the markers should line up with the spheres and follow them on screen as the camera moves around. However, the red circles are all very far from their corresponding spheres. What am I doing wrong in my conversion that causes the circles to not line up with the spheres?

// ContentView.swift
import SwiftUI
import RealityKit

class Coordinator {
    var arView: ARView?
    var anchor: AnchorEntity?
    var objects: [Entity] = []
}

struct ContentView: View {
    let timer = Timer.publish(every: 1.0/30.0, on: .main, in: .common).autoconnect()
    var coord = Coordinator()
    @State var showMarkers = false
    @State var circleColor: Color = .red

    var body: some View {
        ZStack {
            ARViewContainer(coordinator: coord).edgesIgnoringSafeArea(.all)
            if showMarkers {
                // Add circles to the screen
                ForEach(coord.objects) { obj in
                    Circle()
                        .offset(projectedPosition(of: obj))
                        .frame(width: 10.0, height: 10.0)
                        .foregroundColor(circleColor)
                }
            }
            VStack {
                Button(action: {
                    showMarkers = !showMarkers
                }, label: {
                    Text(showMarkers ? "Hide Markers" : "Show Markers")
                })
                Spacer()
                Button(action: {
                    addSphere()
                }, label: {
                    Text("Add Sphere")
                })
            }
        }.onReceive(timer, perform: { _ in
            // silly hack to force circles to redraw
            if circleColor == .red {
                circleColor = Color(#colorLiteral(red: 1, green: 0, blue: 0, alpha: 1))
            } else {
                circleColor = .red
            }
        })
    }

    func addSphere() {
        guard let anchor = coord.anchor else { return }

        // pick random point for new sphere
        let pos = SIMD3<Float>.random(in: 0...0.5)
        print("Adding sphere at \(pos)")

        // Create a sphere
        let mesh = MeshResource.generateSphere(radius: 0.01)
        let material = SimpleMaterial(color: .blue, roughness: 0.15, isMetallic: true)
        let model = ModelEntity(mesh: mesh, materials: [material])
        model.setPosition(pos, relativeTo: anchor)
        anchor.addChild(model)

        // record sphere for later use
        coord.objects.append(model)
    }

    func projectedPosition(of object: Entity) -> CGPoint {
        // convert position of object into "world space"
        // (i.e., "the 3D world coordinate system of the scene")
        // https://developer.apple.com/documentation/realitykit/entity/convert(position:to:)
        let worldCoordinate = object.convert(position: object.position, to: nil)

        // project worldCoordinate into "the 2D pixel coordinate system of the view"
        // https://developer.apple.com/documentation/realitykit/arview/project(_:)
        guard let arView = coord.arView else { return CGPoint(x: -1, y: -1) }
        guard let screenPos = arView.project(worldCoordinate) else { return CGPoint(x: -1, y: -1) }

        // At this point, screenPos should be the screen coordinate of the object's position on the screen.
        print("3D position \(object.position) mapped to \(screenPos) on screen.")
        return screenPos
    }
}

struct ARViewContainer: UIViewRepresentable {
    var coordinator: Coordinator

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        // Create a sphere model
        let mesh = MeshResource.generateSphere(radius: 0.01)
        let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)
        let model = ModelEntity(mesh: mesh, materials: [material])

        // Create horizontal plane anchor for the content
        let anchor = AnchorEntity(.plane(.horizontal, classification: .any,
                                         minimumBounds: SIMD2<Float>(0.2, 0.2)))
        anchor.children.append(model)

        // Record values needed elsewhere
        coordinator.arView = arView
        coordinator.anchor = anchor
        coordinator.objects.append(model)

        // Add the horizontal plane anchor to the scene
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

#Preview {
    ContentView()
}
Posted by fraggle. Last updated.
Post not yet marked as solved
5 Replies
629 Views
How do we author a .reality file like the ones with animations under Examples at https://developer.apple.com/augmented-reality/quick-look/ ? For example, "The Hab": https://developer.apple.com/augmented-reality/quick-look/models/hab/hab_en.reality Tapping the various buttons in this experience triggers complex animations. I don't see any way to accomplish this in Reality Composer, and I don't see any way to export/compile a .reality file from within Xcode. How can I use multiple animations within a single glTF file? How can I set up multiple tap targets on a single object, where each one triggers a different action? How do we author something similar? What tools do we use? Thanks
Posted by d0g. Last updated.
Post not yet marked as solved
1 Replies
417 Views
Hi there: On iOS 17 devices, access to .reality files hosted on my server shows the infamous "Object requires a newer version of iOS." message. The same page works flawlessly when accessing the asset from iOS 16 and below. Please check out a repro by accessing this URL: https://qlar.vortice3d.com/ Any help with this? Thanks for your time.
Posted by paleRider. Last updated.
Post not yet marked as solved
0 Replies
400 Views
I do not know what I am doing wrong. I am trying to add a .reality file to an app, and it is never in scope. How do I get the file that I have created to load? I have followed many different tutorials and none of them work.
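A guess at the usual cause, since the exact error isn't quoted: the Xcode-generated loader symbol for a .reality file only exists if the file is a member of the app target (File Inspector > Target Membership). Independent of the generated code, the file can also be loaded by name at runtime. A minimal sketch, assuming the file is called Experience.reality and an ARView named arView is in scope:

import RealityKit

do {
    // Loads the default scene from Experience.reality in the main bundle.
    let entity = try Entity.load(named: "Experience")
    let anchor = AnchorEntity(world: .zero)
    anchor.addChild(entity)
    arView.scene.addAnchor(anchor)
} catch {
    print("Failed to load the reality file: \(error)")
}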
Posted by pkyglee. Last updated.
Post not yet marked as solved
1 Replies
509 Views
From Apple's visionOS ARKit scene reconstruction, we can get the geometry of the 3D objects as MeshAnchor.Geometry. I have tried calculating its bounding box but had no success. How can we calculate the width, height, and depth from MeshAnchor.Geometry?
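A sketch of one way to do it, assuming the vertex buffer holds tightly packed three-float positions (which is what scene reconstruction currently provides): walk the buffer using the source's own offset and stride and accumulate min/max.

import ARKit
import simd

func boundingBox(of geometry: MeshAnchor.Geometry) -> (min: SIMD3<Float>, max: SIMD3<Float>) {
    let vertices = geometry.vertices // GeometrySource backed by an MTLBuffer
    var minPoint = SIMD3<Float>(repeating: .greatestFiniteMagnitude)
    var maxPoint = SIMD3<Float>(repeating: -.greatestFiniteMagnitude)

    let base = vertices.buffer.contents().advanced(by: vertices.offset)
    for i in 0..<vertices.count {
        // Read three packed floats rather than a SIMD3, whose 16-byte
        // layout would overrun a tightly packed 12-byte vertex.
        let p = base.advanced(by: i * vertices.stride).assumingMemoryBound(to: Float.self)
        let v = SIMD3<Float>(p[0], p[1], p[2])
        minPoint = min(minPoint, v)
        maxPoint = max(maxPoint, v)
    }
    return (minPoint, maxPoint)
}

Width, height, and depth are then the components of max - min, expressed in the anchor's local space; transform through the anchor's originFromAnchorTransform for world-space values.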
Posted. Last updated.
Post not yet marked as solved
1 Replies
626 Views
Hi there. Hosting on my server a no-doubt-well-formed AR file, namely the "CosmonautSuit_en.reality" from Apple's examples (https://developer.apple.com/augmented-reality/quick-look/), the infamous and annoying "Object requires a newer version of iOS." message appears, even though I'm running iOS 17.1 on my iPad, the very latest available version. Everything works flawlessly on iOS 16 and below. Of course, my markup follows the required format, namely:

<a rel="ar" href="https://artest.myhost.com/CosmonautSuit_en.reality">
  <img class="image-model" src="https://artest.myhost.com/cosmonaut.png">
</a>

Accessing this same .reality file from the aforementioned Apple page works fine. Why is it not working from my hosting server? For your information, when I host a USDZ instead, also taken from Apple's examples page (the toy_drummer_idle.usdz file), everything works flawlessly, using the same markup schema:

<a rel="ar" href="https://artest.myhost.com/toy_drummer_idle.usdz">
  <img class="image-model" src="https://artest.myhost.com/toy_drummer.png">
</a>

Also, when I delete the rel="ar" option, the AR experience is launched, but with an extra step through an ugly poster (generated by AR Quick Look on the fly) that ruins the UX/UI of my web app. This is, by the way, the same behavior you get when accessing the .reality file directly by typing its URL into Safari's address bar. Any tip on this? Thanks for your time.
Posted by paleRider. Last updated.