Reality Composer


Prototype and produce content for AR experiences using Reality Composer.

Reality Composer Documentation


Posts under Reality Composer tag

69 Posts
Post not yet marked as solved
1 Reply
721 Views
Hi! While waiting for scene understanding to work in the visionOS simulator, I am trying to bounce a virtual object off a real-world wall or other real-world object on iOS ;). I load my virtual objects from a Reality Composer file where I set them to participate in physics with a dynamic motion type. With this I am able to have them collide with each other nicely, and occlusion also works, but they go right through walls and other real-world objects rather than bouncing off... I've tried a couple of variations of the following code:

func makeUIView(context: Context) -> ARGameView {
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)
    arView.debugOptions.insert(.showSceneUnderstanding)
    arView.automaticallyConfigureSession = false
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]
    config.sceneReconstruction = .meshWithClassification
    arView.session.run(config)
    if let myScene = try? Experience.loadMyScene() {
        ...
        arView.scene.anchors.append(myScene)
    }
    return arView
}

I have found several references that this should "just work", e.g. in https://developer.apple.com/videos/play/tech-talks/609. What am I missing? Testing on an iPhone 13 Pro Max with iOS 16.5.1 🤔
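A quick isolation test might help narrow this down (a sketch only; the box size, position, and the arView reference are assumptions): drop a purely code-created dynamic entity and check whether it comes to rest on the reconstructed mesh. If it does, the problem is specific to how the Reality Composer entities are configured; if it falls through too, the session or scene-understanding setup is the issue.

// Hypothetical isolation test, added alongside the session setup above.
let box = ModelEntity(mesh: .generateBox(size: 0.1),
                      materials: [SimpleMaterial(color: .red, isMetallic: false)])
box.generateCollisionShapes(recursive: true)           // dynamic bodies need a collision shape
box.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                       material: .default,
                                       mode: .dynamic)
let dropAnchor = AnchorEntity(world: [0, 1.0, -0.5])    // about 1 m up, half a meter ahead
dropAnchor.addChild(box)
arView.scene.anchors.append(dropAnchor)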
Post not yet marked as solved
0 Replies
486 Views
Just trying to figure out how to make an occlusion mask in Reality Composer or Reality Converter. I use Blender to make my assets and want to be able to make something similar to a portal which would require an occlusion mask.
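For what it's worth, Reality Composer itself doesn't expose a dedicated occlusion-mask material, but in RealityKit code the usual portal building block is to assign OcclusionMaterial to the mesh that should hide whatever sits behind it. A minimal sketch, assuming the Blender-made asset is exported as "Portal" with a mask mesh named "PortalMask" (both names are placeholders):

import RealityKit

if let portal = try? Entity.load(named: "Portal"),
   let mask = portal.findEntity(named: "PortalMask") as? ModelEntity {
    // OcclusionMaterial renders nothing itself but hides anything drawn behind it.
    mask.model?.materials = [OcclusionMaterial()]
}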
Post not yet marked as solved
0 Replies
499 Views
What would be the best way to go about recognizing a 3D physical object, then anchoring digital 3D assets to it? I would also like to use occlusion shaders and masks on the assets. There's a lot of info out there, but current best practices keep changing and I'd like to start in the right direction! If there is a tutorial or demo file that someone can point me to, that would be great!
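For reference, the established route on iOS (sketched below under a few assumptions: an existing ARView named arView, an AR resource group called "AR Resources", and a scanned reference object named "MyObject") is to scan the physical object into an ARReferenceObject, add it to the asset catalog, and then anchor RealityKit content to it:

import ARKit
import RealityKit

// Ask ARKit to detect the scanned reference object.
let config = ARWorldTrackingConfiguration()
config.detectionObjects = ARReferenceObject.referenceObjects(
    inGroupNamed: "AR Resources", bundle: nil) ?? []
arView.session.run(config)

// Anchor virtual content to the detected object; the occluder uses
// OcclusionMaterial so it can hide virtual geometry behind the real object.
let objectAnchor = AnchorEntity(.object(group: "AR Resources", name: "MyObject"))
let occluder = ModelEntity(mesh: .generateBox(size: 0.1),
                           materials: [OcclusionMaterial()])
objectAnchor.addChild(occluder)
arView.scene.anchors.append(objectAnchor)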
Post marked as solved
2 Replies
536 Views
Hi, I have made a simple textured 3D planet in Reality Composer Pro, and the question is how I can visualize it in the Vision Pro simulator. I already have everything installed and the simulator works fine, but I don't know how to run what I build in Reality Composer Pro in the visionOS simulator, or how to export it there. Any comment will be very much appreciated. Thank you.
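If the app was created from Xcode's visionOS template (which wraps the Reality Composer Pro project in a RealityKitContent Swift package), the usual way to see the planet is to load the scene in a RealityView and run the app on the visionOS simulator destination. A minimal sketch; "Scene" is a placeholder for whatever the scene is named in Reality Composer Pro:

import SwiftUI
import RealityKit
import RealityKitContent   // the package the visionOS template generates

struct PlanetView: View {
    var body: some View {
        RealityView { content in
            // Load the Reality Composer Pro scene from the bundled content package.
            if let planet = try? await Entity(named: "Scene",
                                              in: realityKitContentBundle) {
                content.add(planet)
            }
        }
    }
}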
Post not yet marked as solved
0 Replies
474 Views
I'm trying to make a material that has flecks of glitter in it. The main technique I've found to achieve this effect is to use a Voronoi diagram as a normal map, with various amounts of embellishment on top. The shader graph editor has a Worley noise node, which is related, but it produces the "spider web" version of a Voronoi diagram instead of flat polygons of a consistent color. Is there a trick for converting this Worley texture into a vanilla Voronoi diagram, or am I missing something else obvious? Or is what I want not currently possible?
Post not yet marked as solved
0 Replies
321 Views
I'm interested in evaluating the physics capabilities of RealityKit and visionOS. I assume that I could create entities, add the PhysicsBody component, and "simulate" and tweak settings interactively, but that's not my experience. Is something like this possible with Beta 8?
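For what it's worth, the building blocks can at least be assembled in code; whether interactive tweaking works in Reality Composer Pro's simulate mode is a separate question. A sketch (sizes, colors, and positions are arbitrary, and the entities would still need to be added to a scene, e.g. via content.add inside a RealityView):

import RealityKit

// A dynamic box that should fall onto a static floor once both are in a scene.
let box = ModelEntity(mesh: .generateBox(size: 0.2),
                      materials: [SimpleMaterial(color: .blue, isMetallic: false)])
box.generateCollisionShapes(recursive: true)
box.components.set(PhysicsBodyComponent(massProperties: .default,
                                        material: .default,
                                        mode: .dynamic))
box.position = [0, 1, -1]

let floor = ModelEntity(mesh: .generateBox(width: 2, height: 0.01, depth: 2),
                        materials: [SimpleMaterial(color: .gray, isMetallic: false)])
floor.generateCollisionShapes(recursive: true)
floor.components.set(PhysicsBodyComponent(massProperties: .default,
                                          material: .default,
                                          mode: .static))
floor.position = [0, 0, -1]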
Post not yet marked as solved
2 Replies
865 Views
I've been using Reality Composer on macOS (bundled with Xcode) to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience. That works really well. I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export .reality files anymore. It seems that this is only for building content that ships inside native iOS etc. apps, rather than experiences that can be viewed in Quick Look. Am I missing something, or is it no longer possible to export .reality files? Thanks.
Post not yet marked as solved
8 Replies
2.3k Views
Hi guys, I've just installed the newest Xcode version 15.0 (15A240d) and I can see that Reality Composer is missing. It is not appearing in the Xcode > Open Developer Tool menu. Where/how can I find it? My old Reality Composer project now opens like a text file, and I have no option to open it in Reality Composer as I did in the old Xcode. I'm kinda stuck with my project, so any help would be useful. Thanks, J
Post not yet marked as solved
2 Replies
903 Views
With Xcode 11 through 14, Reality Composer for Mac has been included with Xcode. Apple's own documentation states: "You automatically get Reality Composer for macOS when you install Xcode 11 or later. The app is one of the developer tools bundled with Xcode. From the menu, choose Xcode > Open Developer Tool, and select Reality Composer." This is simply not the case with Xcode 15. In fact, we cannot even open existing .rcproject files such as the classic Experience.rcproject found in Apple's own sample code Creating a Game with Reality Composer. As an AR developer, this makes my current project virtually unworkable. It's my understanding that when Reality Composer Pro is finally released, it will only be compatible with visionOS apps, as opposed to iOS apps built with RealityKit; this was certainly the case with early iterations. Apple's creation tools for spatial apps still mention Reality Composer, but it is only available for iOS and iPadOS. Taking away such a critical tool without notice is crippling, to say the least. Any official announcements on the future state of RealityKit development would be welcome.
Post not yet marked as solved
1 Reply
524 Views
Hello, we're currently working on a project that was finished in Reality Composer, but then we noticed the pink material issue after changing scenes on iOS 17 devices. So we updated to macOS 14 and the Xcode 15 beta to use Reality Composer Pro, and we're currently stuck on how to set up the animations and on-click triggers to be able to play an animation from the USDZ model in the scene. Once the animation is finished, it should trigger the next scene. This was done through behaviors in Reality Composer and it was simple drag and drop. But now it seems we need to do it with components, which I don't mind; I just don't see many resources on how to set this up properly. Is there a way to do behaviors like in Reality Composer? Extra: is there a way to use alpha PNGs, or to drag PNGs into the scene like in Reality Composer?
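A rough component-based equivalent (a sketch under several assumptions: a visionOS RealityView, a Reality Composer Pro scene named "MyScene" in the RealityKitContent package, an entity inside it named "Model", and a generic box collision shape so it can be tapped) might look like this:

import SwiftUI
import RealityKit
import RealityKitContent

struct AnimatedSceneView: View {
    var body: some View {
        RealityView { content in
            // "MyScene" and "Model" are placeholder names.
            guard let scene = try? await Entity(named: "MyScene", in: realityKitContentBundle),
                  let model = scene.findEntity(named: "Model") else { return }
            content.add(scene)

            // The entity needs input-target and collision components to receive taps.
            model.components.set(InputTargetComponent())
            model.components.set(CollisionComponent(shapes: [.generateBox(size: [0.3, 0.3, 0.3])]))

            // Fires when an animation on the model finishes; the next scene
            // could be loaded or revealed from here.
            _ = content.subscribe(to: AnimationEvents.PlaybackCompleted.self, on: model) { _ in
                // ...transition to the next scene...
            }
        }
        .gesture(TapGesture().targetedToAnyEntity().onEnded { value in
            // Play the USDZ's first baked-in animation when the model is tapped.
            if let animation = value.entity.availableAnimations.first {
                value.entity.playAnimation(animation)
            }
        })
    }
}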
Post not yet marked as solved
0 Replies
356 Views
Hi, I'm working on an AR app. With Reality Composer and Xcode 14 I was triggering custom behaviors with just:

myScene.notifications.myBox.post()

where myScene came from:

let myScene = try! Experience.loadBox()

Now in Xcode 15 I don't have an Experience; instead, with the .reality file, I have to use entities, so:

let objectAR = try! Entity.load(named: "myProject.reality")

How can I trigger my previously exported Reality Composer custom behavior from that?
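For background, the Experience.swift that Reality Composer used to generate implemented that notifications property by posting a NotificationCenter notification. One workaround that has been reported (unofficial and undocumented; the notification name and userInfo keys below come from that generated code and could change, and arView is assumed to be an existing ARView) is to post it manually once the loaded entity is part of a scene:

import Foundation
import RealityKit

// "myProject.reality" and "myBox" are the .reality file and the notification
// trigger identifier from the original Reality Composer project.
let objectAR = try! Entity.load(named: "myProject.reality")
let anchor = AnchorEntity()
anchor.addChild(objectAR)
arView.scene.anchors.append(anchor)   // must be in a scene so objectAR.scene is non-nil

// Post the trigger that the old generated notifications.myBox.post() used to send.
NotificationCenter.default.post(
    name: NSNotification.Name("RealityKit.NotificationTrigger"),
    object: nil,
    userInfo: [
        "RealityKit.NotificationTrigger.Scene": objectAR.scene as Any,
        "RealityKit.NotificationTrigger.Identifier": "myBox"
    ]
)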
Post not yet marked as solved
0 Replies
495 Views
Using the face anchor feature in Reality Composer, I'm exploring the potential for generating content movement based on facial expressions and head movement. In my current project, I've positioned a horizontal wood plane on the user's face, and I've added some dynamic physics-enabled balls on the wood surface. While I've successfully anchored the wood plane to the user's head movements, I'm facing a challenge with the balls. I'm aiming to have these balls respond to the user's head tilts, effectively rolling in the direction of the head movement. For instance, a tilt to the right should trigger the balls to roll right, and likewise for leftward tilts. However, my attempts thus far have not yielded the expected results, as the balls seem to be unresponsive to the user's head movements. The wood plane, on the other hand, follows the head's motion seamlessly. I'd greatly appreciate any insights, guidance, or possible solutions you may have regarding this matter. Are there specific settings or techniques I should be implementing to enable the balls to respond to the user's head movement as desired? Thank you in advance for your assistance.
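One pattern that might be worth experimenting with (a sketch, not a confirmed fix; woodPlane, balls, arView, and the force scale are all placeholders) is to drive the balls each frame from the wood plane's current tilt rather than relying on gravity alone:

import RealityKit

// Each frame, read how far the face-anchored plane is tilted and push the
// dynamic balls toward the downhill side of that tilt.
// Keep `updateSubscription` alive for as long as the forces should apply.
let updateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
    // The plane's local up axis expressed in world space; it leans toward
    // the downhill side when the head (and plane) tilts.
    let planeUp = woodPlane.convert(direction: [0, 1, 0], to: nil)
    let rollDirection = SIMD3<Float>(planeUp.x, 0, planeUp.z)
    for ball in balls {   // balls: [ModelEntity] with dynamic physics bodies
        ball.addForce(rollDirection * 0.5, relativeTo: nil)
    }
}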
Post not yet marked as solved
0 Replies
264 Views
When changing the 'Up axis' setting in the Layer Data tab to 'Z', the gizmo does not reflect the change. It continues to display as if the Up axis is 'Y'. This results in the gizmo becoming disconnected from the object itself, making it challenging to perform accurate transformations using the gizmo.
Steps to reproduce:
1. Open Reality Composer Pro in the latest Xcode beta.
2. Click on empty space inside your scene.
3. Navigate to the Layer Data tab.
4. Change the "Up axis" setting to 'Z'.
5. Observe the gizmo's orientation.
Post not yet marked as solved
3 Replies
535 Views
Reality Composer is no longer available in the Xcode 15 release. Is this intended, or will it be available in later releases? Do I need to revert to Xcode 15 Beta 8 to get Reality Composer?
Post not yet marked as solved
1 Reply
626 Views
Hi there. Hosting on my server a no-doubt-well-formed AR file, such as the "CosmonautSuit_en.reality" file from Apple's examples (https://developer.apple.com/augmented-reality/quick-look/), the infamous and annoying "Object requires a newer version of iOS." message appears, even though I'm running iOS 17.1 on my iPad, that is, the very latest available version. Everything works flawlessly on iOS 16 and below. Of course, my markup follows the required format, namely:

<a rel="ar" href="https://artest.myhost.com/CosmonautSuit_en.reality">
  <img class="image-model" src="https://artest.myhost.com/cosmonaut.png">
</a>

Accessing this same .reality file from the aforementioned Apple page works fine. Why is it not working on my hosting server? For your information, when I host a USDZ on my server instead, also from Apple's examples page, such as the toy_drummer_idle.usdz file, everything works flawlessly. Again, I'm using the same markup schema:

<a rel="ar" href="https://artest.myhost.com/toy_drummer_idle.usdz">
  <img class="image-model" src="https://artest.myhost.com/toy_drummer.png">
</a>

Also, when I delete the rel="ar" option, the AR experience is launched, but with an extra step that goes through an ugly poster (generated on the fly by AR Quick Look), which ruins the UX/UI of my web app. This behavior is, by the way, the same one you get when accessing the .reality file directly by typing its URL into the Safari address bar. Any tip on this? Thanks for your time.
Post not yet marked as solved
3 Replies
1.1k Views
I'm using DrawableQueue to create textures that I apply to my ShaderGraphMaterial texture. My Metal renderer is using a range of alpha values as a test. My objects displayed with the DrawableQueue texture are working as expected, but the alpha component is not working. Is this an issue with my DrawableQueue descriptor? My ShaderGraphMaterial? A missing setting on my scene objects? Or some limitation in visionOS?

DrawableQueue descriptor:

let descriptor = await TextureResource.DrawableQueue.Descriptor(
    pixelFormat: .rgba8Unorm,
    width: textureResource!.width,
    height: textureResource!.height,
    usage: [.renderTarget, .shaderRead, .shaderWrite], // Usage should match the requirements for how the texture will be used
    //usage: [.renderTarget], // Usage should match the requirements for how the texture will be used
    mipmapsMode: .none // Assuming no mipmaps are needed for the text texture
)
let queue = try await TextureResource.DrawableQueue(descriptor)
queue.allowsNextDrawableTimeout = true
await textureResource!.replace(withDrawables: queue)

Draw frame:

guard let drawable = try? drawableQueue!.nextDrawable(),
      let commandBuffer = commandQueue?.makeCommandBuffer()//,
      //let renderPipelineState = renderPipelineState
else {
    return
}

let renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = drawable.texture
renderPassDescriptor.colorAttachments[0].loadAction = .clear
renderPassDescriptor.colorAttachments[0].storeAction = .store
renderPassDescriptor.colorAttachments[0].clearColor = clearColor
/*
renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
    red: clearColor.red,
    green: clearColor.green,
    blue: clearColor.blue,
    alpha: 0.5
)
*/
renderPassDescriptor.renderTargetHeight = drawable.texture.height
renderPassDescriptor.renderTargetWidth = drawable.texture.width

guard let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
    return
}
renderEncoder.pushDebugGroup("DrawNextFrameWithColor")
//renderEncoder.setRenderPipelineState(renderPipelineState)
// No need to create a render command encoder with shaders, as we are only clearing the drawable.
// Since we are just clearing the drawable to a solid color, no need to draw primitives.
renderEncoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
drawable.present()
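One way to narrow it down (a debugging sketch only; it assumes the rgba8Unorm texture above, that the texture's storageMode allows CPU reads, and that it runs after waitUntilCompleted) is to read a pixel back from the drawable and confirm the alpha value actually landed in the texture. If it did, the problem is more likely on the ShaderGraphMaterial side, e.g. the texture's alpha not being wired into the material's Opacity input, than in the Metal pass:

// Read back one pixel from the just-cleared drawable to check its alpha.
// Only valid if drawable.texture.storageMode permits CPU access.
var pixel = [UInt8](repeating: 0, count: 4)   // .rgba8Unorm = 4 bytes per pixel
drawable.texture.getBytes(&pixel,
                          bytesPerRow: 4,
                          from: MTLRegionMake2D(0, 0, 1, 1),
                          mipmapLevel: 0)
print("RGBA at (0,0):", pixel)   // pixel[3] is the alpha that was written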