DrawableQueue to RealityKit ShaderGraphMaterial ignoring alpha channel of texture?

I'm using DrawableQueue to create textures that I apply to a texture parameter on my ShaderGraphMaterial. As a test, my Metal renderer clears the texture with a range of alpha values.

Objects displaying the DrawableQueue texture render as expected, except that the alpha channel is ignored.

Is this an issue with my DrawableQueue descriptor? My ShaderGraphMaterial? A missing setting on my scene objects? Or some limitation of visionOS?

DrawableQueue descriptor

            let descriptor = await TextureResource.DrawableQueue.Descriptor(
                pixelFormat: .rgba8Unorm,
                width: textureResource!.width,
                height: textureResource!.height,
                usage: [.renderTarget, .shaderRead, .shaderWrite], // usage should match how the texture will be used
                mipmapsMode: .none // no mipmaps needed for this texture
            )
            
            let queue = try await TextureResource.DrawableQueue(descriptor)
            queue.allowsNextDrawableTimeout = true
            
            await textureResource!.replace(withDrawables: queue)
            

Draw frame:

        guard
            let drawable = try? drawableQueue!.nextDrawable(),
            let commandBuffer = commandQueue?.makeCommandBuffer()
        else {
            return
        }
        
        let renderPassDescriptor = MTLRenderPassDescriptor()
        renderPassDescriptor.colorAttachments[0].texture = drawable.texture
        renderPassDescriptor.colorAttachments[0].loadAction = .clear
        renderPassDescriptor.colorAttachments[0].storeAction = .store
        renderPassDescriptor.colorAttachments[0].clearColor = clearColor

        renderPassDescriptor.renderTargetHeight = drawable.texture.height
        renderPassDescriptor.renderTargetWidth = drawable.texture.width
        
        guard let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
            return
        }
        
        renderEncoder.pushDebugGroup("DrawNextFrameWithColor")
        // No pipeline state or draw calls needed: this pass only clears the
        // drawable to a solid color via the load action above.
        renderEncoder.popDebugGroup()
        renderEncoder.endEncoding()
        
        commandBuffer.commit()
        commandBuffer.waitUntilCompleted() // note: blocks the calling thread until the GPU finishes
        drawable.present()
    }
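As an aside, if blocking on `waitUntilCompleted()` every frame becomes a problem, one alternative is to present from a command-buffer completion handler instead. This is only a sketch under the assumption that `drawable.present()` may be called from the completion callback, using the same `drawable` and `commandBuffer` as above:

```swift
// Non-blocking variant: present once the GPU has finished this buffer,
// rather than stalling the render thread with waitUntilCompleted().
commandBuffer.addCompletedHandler { _ in
    drawable.present()
}
commandBuffer.commit()
```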

Replies

Where are you calling Draw frame?

I would love to see some sample code showing how you're generating a dynamic texture with DrawableQueue!

That said, I can see from the image that you're sending UsdUVTexture.RGBA, which has four components, to UnlitSurface.Color, which only takes three. You need to split the RGBA output, pass the first three components into Color, and the last one into Opacity.

I had the same issue, and thanks to @deeje's reply I figured out how to proceed. Here is the separate-and-combine part of the shader. Hope this helps!
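On the Swift side, hooking the DrawableQueue texture up to such a material might look like the sketch below. The material path "/Root/DynamicMaterial", the scene file "Scene.usda", and the parameter name "DiffuseTexture" are all placeholders — substitute whatever your Reality Composer Pro project actually uses, and `realityKitContentBundle` assumes the standard RealityKitContent package:

```swift
import RealityKit
import RealityKitContent

// Hypothetical names throughout: "Scene.usda", "/Root/DynamicMaterial", and
// "DiffuseTexture" stand in for your project's actual scene, material, and
// promoted texture-parameter names.
func applyDrawableQueueTexture(to entity: ModelEntity,
                               textureResource: TextureResource) async throws {
    // Load the shader graph material whose graph splits the texture's RGBA:
    // RGB -> UnlitSurface.Color, A -> UnlitSurface.Opacity.
    var material = try await ShaderGraphMaterial(named: "/Root/DynamicMaterial",
                                                 from: "Scene.usda",
                                                 in: realityKitContentBundle)

    // Bind the texture (already backed by the DrawableQueue) to the
    // material's promoted texture parameter.
    try material.setParameter(name: "DiffuseTexture",
                              value: .textureResource(textureResource))

    entity.model?.materials = [material]
}
```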