Reality Composer


Prototype and produce content for AR experiences using Reality Composer.

Reality Composer Documentation


Posts under Reality Composer tag

69 Posts
Post not yet marked as solved
1 Reply
650 Views
Hello, I'm setting up an AR view using scene anchors from Reality Composer. The scenes load perfectly the first time I enter the AR view, but when I go back to the previous screen and re-enter, the app crashes before any of the scenes appear on screen. I've tried pausing and resuming the session and am still getting the following error:

validateFunctionArguments:3536: failed assertion `Fragment Function(fsRenderShadowReceiverPlane): incorrect type of texture (MTLTextureTypeCube) bound at texture binding at index 0 (expect MTLTextureType2D) for projectiveShadowMapTexture[0].'

Any help would be very much appreciated. Thanks
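A hedged workaround sketch for this kind of re-entry crash: rather than reusing the previous ARView, fully tear it down on exit and build a fresh view and session on re-entry. The function and configuration names below are assumptions for illustration, not the poster's code.

import ARKit
import RealityKit
import UIKit

// Workaround sketch: discard the old ARView on exit...
func leaveAR(_ arView: ARView) {
    arView.session.pause()
    arView.scene.anchors.removeAll()
    arView.removeFromSuperview()
}

// ...and create a fresh ARView and session on re-entry instead of resuming.
func enterAR(in container: UIView) -> ARView {
    let arView = ARView(frame: container.bounds)
    container.addSubview(arView)
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
    return arView
}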
Post not yet marked as solved
4 Replies
1.5k Views
Are there any good tutorials or suggestions for creating models in Blender and exporting them with their associated materials and nodes? Specifically, I'm looking to see whether translucency associated with an object (e.g. a glass bottle) can be exported. I have created a simple cube with a Principled BSDF shader, but the transmission and IOR settings are not porting over. Any tips or suggestions would be helpful.
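Not a fix for the export pipeline, but as a fallback the glass look can be approximated in code after loading, with RealityKit's PhysicallyBasedMaterial. A minimal sketch; "bottle" is a placeholder asset name, not from the original post:

import RealityKit

// Fallback sketch: approximate glass in code after loading, since the
// Blender transmission/IOR settings don't survive the trip to USDZ.
let bottle = try ModelEntity.loadModel(named: "bottle") // placeholder name

var glass = PhysicallyBasedMaterial()
glass.baseColor = .init(tint: .white)
glass.roughness = 0.05
glass.metallic = 0.0
glass.blending = .transparent(opacity: 0.2)
bottle.model?.materials = [glass]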
Post not yet marked as solved
3 Replies
1.5k Views
Hello, I created a simple cube in Blender with two animations: one moves the cube up and down, and the other rotates the cube in place. I exported the file in .glb format and converted it with Reality Converter, but unfortunately I can only see one animation. Is this a limitation of Reality Converter? Can I include more than one animation? The original .glb file has both animations inside; I checked it in an online glb viewer and there is no problem, both animations are there. The converter unfortunately sees only the last one created. Any reason or explanation? I believe this is a limitation of Reality Converter. Regards
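For what it's worth, a quick way to check what actually survived conversion is to load the converted file in RealityKit and list its animations. A small diagnostic sketch; "cube" is a placeholder asset name:

import RealityKit

// Diagnostic sketch: load the converted file and list which
// animations made it through the conversion.
let cube = try Entity.load(named: "cube")
print("Animations found: \(cube.availableAnimations.count)")
for animation in cube.availableAnimations {
    print(animation.name ?? "unnamed animation")
}
// Play the first surviving animation, if any:
if let first = cube.availableAnimations.first {
    cube.playAnimation(first.repeat())
}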
Post not yet marked as solved
3 Replies
1.3k Views
Hi, I'm not sure what's going on. I've been working on this model for a while in Reality Composer and had no problem testing it there; it always worked perfectly. So I imported the file into a brand-new Xcode project: I created a new AR app using SwiftUI. I actually did it twice, and I also tested Apple's template version with the box. In Apple's version the app appears, but the part where it tries to detect planes never shows up, so I'm confused. I found a question that mentions the error messages I'm getting, but I'm not sure how to get around it: https://developer.apple.com/forums/thread/691882

//
//  ContentView.swift
//  AppToTest-02-14-23
//
//  Created by M on 2/14/23.
//

import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        // Load the "Box" scene from the "Experience" Reality File
        //let boxAnchor = try! Experience.loadBox()
        let anchor = try! MyAppToTest.loadFirstScene()

        // Add the box anchor to the scene
        arView.scene.anchors.append(anchor)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

#if DEBUG
struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
#endif

This is what I get at the bottom:

2023-02-14 17:14:53.630477-0500 AppToTest-02-14-23[21446:1307215] Metal GPU Frame Capture Enabled
2023-02-14 17:14:53.631192-0500 AppToTest-02-14-23[21446:1307215] Metal API Validation Enabled
2023-02-14 17:14:54.531766-0500 AppToTest-02-14-23[21446:1307215] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
2023-02-14 17:14:54.716866-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.743580-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.744961-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/drPostAndComposition.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.745988-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.747245-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute0.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.748750-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute1.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.749140-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute2.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761189-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute3.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761611-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute4.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761983-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute5.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.762604-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute6.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.763575-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute7.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.764859-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
2023-02-14 17:14:54.764902-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
2023-02-14 17:14:55.531748-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534559-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534633-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534680-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534733-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534777-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534825-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534871-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534955-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:56.207438-0500 AppToTest-02-14-23[21446:1307383] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [2]
2023-02-14 17:17:15.741931-0500 AppToTest-02-14-23[21446:1307414] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
2023-02-14 17:22:07.075990-0500 AppToTest-02-14-23[21446:1308137] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
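One thing worth trying, hedged because it depends on the names Reality Composer generated for this project: swap the blocking try! load for the asynchronous loader the generated code usually provides alongside it (as the template does with loadBoxAsync), so a load failure surfaces as an error instead of a crash:

// Sketch: assumes the generated code includes loadFirstSceneAsync(completion:)
// alongside loadFirstScene(), mirroring the template's loadBoxAsync.
func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)
    MyAppToTest.loadFirstSceneAsync { result in
        switch result {
        case .success(let anchor):
            arView.scene.anchors.append(anchor)
        case .failure(let error):
            print("Scene failed to load: \(error)")
        }
    }
    return arView
}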
Post not yet marked as solved
2 Replies
1.4k Views
It seems that something must have changed in Reality Composer's USDZ export feature. Importing any animated .usdz file into Reality Composer and then exporting it reduces the playback frame rate to about 30% of the original. The same file imported and then exported as a .reality file plays back just fine. Is anyone else experiencing this issue? It's happening for every .usdz file I import, and across two different Apple laptops running the software.
Post not yet marked as solved
0 Replies
479 Views
Hi all, I am wondering whether there's a way to visually add or edit colliders for ModelEntities in Reality Composer. It's rather easy to do this kind of thing in engines like Unity and Unreal, but with Reality Composer I have not been able to edit the collision shape in any way. Best regards
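I'm not aware of a visual collider editor in Reality Composer, but collision shapes can be defined in code after loading. A hedged RealityKit sketch; the asset name, sizes, and offsets are placeholders:

import RealityKit

// Fallback sketch: define collision in code after loading the model.
let modelEntity = try ModelEntity.loadModel(named: "model") // placeholder name

// Simple route: convex shapes generated from the visual meshes.
modelEntity.generateCollisionShapes(recursive: true)

// Or an explicit hand-tuned shape, e.g. a box offset from the pivot:
let shape = ShapeResource.generateBox(size: [0.2, 0.1, 0.2])
    .offsetBy(translation: [0, 0.05, 0])
modelEntity.components.set(CollisionComponent(shapes: [shape]))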
Post not yet marked as solved
2 Replies
772 Views
If the anchor picture is out of the camera view, the AR experience disappears. This happens both with USDZ files created in Reality Composer and opened directly on iPhone or iPad (with AR Quick Look), and with Adobe Aero, so I suppose the bug is in ARKit itself.
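That matches how image anchors generally behave: ARKit only trusts an image anchor's pose while the image is trackable, and Quick Look hides content whose anchor is lost. In a custom RealityKit app (not Quick Look or Aero, where this isn't scriptable) one hedged workaround is to re-parent the content onto a world anchor once the image has been found, so it persists when the picture leaves the frame:

import RealityKit

// Workaround sketch: once content has appeared on the image anchor,
// pin it to the world so it survives the image leaving the camera view.
func pinToWorld(_ content: Entity, in arView: ARView) {
    let worldTransform = content.transformMatrix(relativeTo: nil)
    let worldAnchor = AnchorEntity(world: worldTransform)
    arView.scene.addAnchor(worldAnchor)
    worldAnchor.addChild(content, preservingWorldTransform: true)
}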
Post not yet marked as solved
1 Reply
915 Views
I recently spent several days in Reality Composer with the intention of creating a course we could teach to students, to get them started using augmented reality to tell stories and play with digital assets. We would use it in combination with other apps: TinkerCad to teach modeling, Voice Memos so they can record dialogue and interaction sounds, iMovie to edit a demo reel of their application, plus online asset libraries like Sketchfab, which has free .usdz files and even some animations. The focus would be on creating an interactive application that works in AR. Here are some notes I took while trying things out.

UI frustrations:
- The Behaviors tab doesn't go up far enough; I'd like to be able to drag it to take up more of the screen. Several of the actions have sections that run just below the edge of the screen, and it's frustrating to have to constantly scroll to see all the information. I'll select an "Ease Type" on the Move, Rotate, Scale To action and buttons will appear at the very edge of my screen where I can't read them until I scroll down. This happens for so many different actions that it feels like there isn't enough space to see everything.
- Audio importing from the content library is difficult to navigate. First, I wish there were a way to import a single sound instead of having to import an entire category of sounds. Second, it would be nice to see all the sound categories in some kind of sidebar, similar to the existing "Add Object" menu.
- I wish there were a way to copy and paste position and rotation vectors, so we could make sure objects are in the same place, especially when duplicating objects to get a second-tap implementation. Currently you have to keep flipping back and forth between objects to get the numbers right.
- Is there a way to see all the behaviors a selected object is referenced in? Since the "Affected Objects" list lives inside all sorts of behaviors, actions, and triggers, it can be hard to find exactly where a behavior is coming from, especially in a scene with many behaviors. I come from a Unity background, where behaviors are attached directly to the object itself; not knowing which behaviors reference a given object makes it possible for my physics response to be accidentally overwritten by an animation triggered from somewhere else, sending me searching through every behavior in the scene.
- Is there a way to see the result of my object scanning? Right now it's all behind the scenes, and the scan seems to fail unless the object sits in the same position relative to the background as it did before. It's a black box, and it's hard to understand what I'm doing wrong, because when I move the object everything stops working.
- I could use a scene hierarchy or a list of all objects in a scene. Sometimes I don't know where an object is but I know its name, and I'd like to select it to make changes. Sometimes objects start right on top of each other (like a reset button for a physics simulation), which makes it frustrating to select one over the other, especially since the only way to choose "affected objects" seems to be tapping them in the scene rather than picking from a list.

Feature requests:
- One thing other apps have that makes it easy to add personality to a scene is characters with a variety of animations that play depending on context. It would be nice to have some kind of character creator with pre-made animations, or at least a library of animated characters. For example, if we want a non-player character that waves to the player, then moves somewhere else and talks again, we could switch the character's animation at the appropriate parts of the movement to make it feel more real. This is difficult with .usdz files that only play one animation; the movement is cool but typically fits only one setting, so you have to juggle turning objects off and on, even if you do find an importable character with a couple of animations (such as you might find on Mixamo in .fbx format). I believe it may be possible for a .usdz file to contain more than one animation, but I haven't seen any examples of this.
- Any chance we'll get a version of the Reality Converter app that works on iPads and iPhones? We don't want to assume our students have access to a MacBook, and being able to convert .fbx or .obj files would open up a wider variety of downloadable assets.
- Something that would really help with more complex scenes is the ability to give an object a second trigger that depends on a condition being met first. The easiest example is being able to tap a tour guide a second time to move on to the next object. This gets deeper into code blocks, but there could be an "if" block or condition statement that checks proximity before allowing a tap, or checks how many times an object has been tapped by storing the count in an integer variable you can set and compare. The way I first imagined it, you'd add a trigger that enables only after the first action sequence has completed, so you can build longer chains. This also comes into play with physics: say I want to tap a ball to launch it, and when it slows below some speed I want it to reset automatically. (A code-level sketch of this tap-count idea follows these notes.)
- I'd like the ability to make one object follow another, or some system similar to "parenting" objects together as in a 3D engine. That way you could separate an object's visuals from its physics, letting you play animations on launched objects, spin them, and emphasize them while the physics simulation still works.
- For physics simulations, could the direction of force point toward another object in the scene, or better yet away from it using negative values? Specifically, I'd like to launch a projectile in the direction the camera is facing, or give the user some control over the direction at play time.
- It would be nice to edit an object's material or color with an action: give the user a little pulse of color as tap feedback, or let them customize their environment with different textures.
- If you want this app used in education, there must be a way for teachers to share their built experiences with each other, some kind of online repository where you can try out what others have made.
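On the conditional-trigger request: inside Reality Composer itself I don't believe this is possible today, but for comparison, here is a hedged sketch of the tap-count idea done in RealityKit code. The entity name "guide" is made up, and the entity is assumed to have a CollisionComponent so hit-testing can find it:

import RealityKit
import UIKit

// Sketch: count taps on an entity named "guide" and branch on the count.
final class TapCounter: NSObject {
    private var tapCount = 0

    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        guard let arView = sender.view as? ARView else { return }
        let point = sender.location(in: arView)
        guard let tapped = arView.entity(at: point), tapped.name == "guide" else { return }
        tapCount += 1
        if tapCount >= 2 {
            // second tap: advance the tour to the next object
        }
    }
}

// Wiring, once the ARView exists:
// let counter = TapCounter()
// arView.addGestureRecognizer(
//     UITapGestureRecognizer(target: counter, action: #selector(TapCounter.handleTap(_:))))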
Post marked as solved
1 Reply
687 Views
I have been reading through the documentation and cannot find a way to alter the user's environment lighting. Is this not possible? Basically, I would like to darken the room, or change the hue of the environment in the scene the user is seeing. I can think of a few "hacks" to do this, but figured there would be a proper RealityKit way. If it is possible to dim or darken the environment, I could then light up my models with lights while still having the real environment all around.
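For the virtual side of this, RealityKit's ARView exposes an intensity exponent on its image-based lighting. A sketch, with the caveat that this dims lighting on virtual content only; the real camera feed itself can't be darkened this way. It assumes an existing arView: ARView and anchor: AnchorEntity:

import RealityKit

// Sketch: dim the image-based lighting applied to virtual content.
// Negative exponents darken, positive brighten; 0 leaves it unchanged.
arView.environment.lighting.intensityExponent = -1.0

// Models can then be lit explicitly, e.g. with a spotlight:
let light = SpotLight()
light.light.intensity = 5000
light.position = [0, 1, 0]
anchor.addChild(light)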
Post not yet marked as solved
0 Replies
355 Views
Hello, I'm trying to build a compound object out of two cubes. I adjusted the z value of one of them so the pair appears merged together. However, when I run the simulation the two objects separate. Is there something I need to set to make sure the z value doesn't change while it's running? Thanks, BVSdev
(Screenshots: the composed objects grouped together in the editor, and the same objects after running.)
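In RealityKit terms, the equivalent of keeping the cubes merged is giving the pair a single rigid body whose collision shape is compound, so the halves can never drift apart under simulation. A hedged sketch; the sizes and offsets are placeholders:

import RealityKit

// Sketch: one rigid body built from two box shapes.
let parent = Entity()
let shapes: [ShapeResource] = [
    .generateBox(size: [0.1, 0.1, 0.1]),
    .generateBox(size: [0.1, 0.1, 0.1]).offsetBy(translation: [0, 0, 0.09]),
]
parent.components.set(CollisionComponent(shapes: shapes))
parent.components.set(PhysicsBodyComponent(shapes: shapes, mass: 1.0, mode: .dynamic))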
Post not yet marked as solved
0 Replies
751 Views
Just getting familiar with Xcode. I'm using Reality Composer a lot now and am ready to try coding along with it. I saw this demo (see link below), but I don't want to use a web server to retrieve banner information; I would prefer to embed that information directly into the USDZ file to be read with AR Quick Look. Two questions: How can you get a banner like this when you open a USDZ file, and edit the banner information directly (within the file itself) without using a URL? And in place of the call-to-action button (for Apple Pay) in the demo below, I'd like to use that button to either call a phone number, send a text, or go to a web URL. Link to Apple's example with Apple Pay (see the custom examples section, like the kids' slide example on that page): https://developer.apple.com/augmented-reality/quick-look/ Scraps are welcome, hungry to learn.
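As far as I can tell, the banner isn't stored inside the USDZ at all; AR Quick Look reads it from the presenting context, either a web page or the hosting app. From an app, the hook is ARQuickLookPreviewItem. A sketch, with the URL as a placeholder; the call, text, or web actions would then live behind tel:, sms:, or https: links on that page rather than inside the file:

import ARKit
import QuickLook

// Sketch: present a USDZ with AR Quick Look from an app. The banner
// content comes from the canonical web page, not from the file itself.
final class ARPreviewSource: NSObject, QLPreviewControllerDataSource {
    let modelURL: URL
    init(modelURL: URL) { self.modelURL = modelURL }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let item = ARQuickLookPreviewItem(fileAt: modelURL)
        item.canonicalWebPageURL = URL(string: "https://example.com/product") // placeholder
        return item
    }
}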
Post not yet marked as solved
2 Replies
864 Views
I am trying to write the following code:

func withBackground() -> some View {
    #if os(visionOS)
    background { Material.thin }
    #else
    background { Color.offWhite.ignoresSafeArea() }
    #endif
}

But Xcode 15 beta 2 says: "Unknown operating system for build configuration 'os'". How can I change the background ONLY for visionOS, while keeping it as is on the other platforms I support? Thanks!
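For what it's worth, in the early Xcode 15 betas the platform condition reportedly hadn't yet settled on the visionOS name; the interim spelling was os(xrOS). A sketch under that assumption, with Color.white standing in for the poster's custom Color.offWhite:

import SwiftUI

extension View {
    // Sketch: in Xcode 15 beta 2, os(visionOS) was not yet recognized;
    // os(xrOS) was the interim platform name (an assumption based on the betas).
    @ViewBuilder
    func withBackground() -> some View {
        #if os(xrOS)
        self.background { Rectangle().fill(.thinMaterial).ignoresSafeArea() }
        #else
        self.background { Color.white.ignoresSafeArea() } // stand-in for Color.offWhite
        #endif
    }
}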
Post not yet marked as solved
2 Replies
932 Views
Attempting to load an entity via Entity.loadAsync(contentsOf: url) throws an error when passing a .realitycomposerpro file:

"Cannot determine file format for ...Package.realitycomposerpro" AdditionalErrors=( "Error Domain=USDKitErrorDomain Code=3 "Failed to open layer }

I have a visionOS app that references the same Reality Composer Pro file. That project automatically builds a .reality file out of the package, and it includes the Reality Composer Pro package as a library in the Link Binary build phase. I duplicated this setup in my iPhone app using RealityKit, but attempting to include the .realitycomposerpro package in the existing app causes a build error:

RealityAssetsCompile: Error: for --platform, value must be one of [xros, xrsimulator], not 'iphoneos' Usage: realitytool compile --output-reality [--schema-file ] [--derived-data ] --platform --deployment-target [--use-metal ] See 'realitytool compile --help' for more information.

Lastly, I extracted the .reality file generated by a separate, working visionOS app that uses Reality Composer Pro. Attempting to actually load an entity from this file results in another error: Reality File version 9 is not supported. (Latest supported version is 7.)
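For reference, the package itself isn't meant to be loaded by URL at all: building the Swift package that Reality Composer Pro generates produces a content bundle, and scenes are loaded by name from it. A sketch, where RealityKitContent, realityKitContentBundle, and "Scene" are the template's conventional generated names, not guaranteed ones:

import RealityKit
import RealityKitContent // Swift package generated next to the .realitycomposerpro file

// Sketch: load a scene by name from the built content bundle rather
// than pointing Entity.loadAsync at the package on disk.
func loadScene() async throws -> Entity {
    try await Entity(named: "Scene", in: realityKitContentBundle)
}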
Post not yet marked as solved
1 Reply
666 Views
When I create a USDZ file in the original Reality Composer (non-Pro) and view it in the visionOS simulator, the transforms and rotations don't look the same. For example, a simple Tap and Flip behavior does not rotate the same way on visionOS. Should we regard Reality Composer as discontinued software and work only with Reality Composer Pro? Hopefully Apple will combine the features from the original RC into the new RC Pro!
Post not yet marked as solved
1 Reply
661 Views
Hi everyone, I'm having difficulty detecting custom components from Reality Composer Pro in my xrOS app using QueryPredicate(where:). I've got it working with a PhysicsBodyComponent, so I know it's my custom component that is somehow not being recognized. I'd also like to note all the issues my Intel-based iMac is having with Reality Composer Pro, like crashing when I add a particle emitter, and visionOS crashing when I download textures to use, so I'm constantly paranoid that it's a system-level problem. Anyway, thanks in advance for any help. Cheers, Noah
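One thing worth checking, offered as a guess rather than a confirmed fix: a custom component type has to be registered with RealityKit before the scene loads, or queries won't match it. A sketch with a made-up component name:

import RealityKit

// Hypothetical custom component mirroring one authored in Reality Composer Pro.
struct PointOfInterestComponent: Component, Codable {}

// Register it early, e.g. in the App initializer, before loading content:
// PointOfInterestComponent.registerComponent()

// The query can then find entities carrying the component:
let query = EntityQuery(where: .has(PointOfInterestComponent.self))
// for entity in scene.performQuery(query) { ... }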