SceneKit / renderNode override / Metal / Instancing

Hello

I am a bit stuck with a silly challenge I set myself: I want to have a node with a simple geometry (let's say a triangle), and I want to render that triangle N times while passing in some information per instance, let's say an offset to apply at shader time.

I understand that maybe I should create a source geometry and create multiple nodes to reflect that, but here the point is to implement some Metal code in the renderNode override (SCNNode.rendererDelegate / SCNNodeRendererDelegate).
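For context, the hookup looks roughly like this (a minimal sketch; BrushRenderer and the scene setup are just placeholder names for this post):

    import SceneKit

    // A node renderer delegate: SceneKit calls renderNode(_:renderer:arguments:) every frame
    // for the node, and we issue our own Metal draw calls instead of SceneKit drawing a geometry.
    final class BrushRenderer: NSObject, SCNNodeRendererDelegate {
        func renderNode(_ node: SCNNode, renderer: SCNRenderer, arguments: [String: Any]) {
            // encode the instanced draw here (see the rest of the post)
        }
    }

    let scene = SCNScene()
    let brushNode = SCNNode()                     // no geometry attached; we draw it ourselves
    let brushRenderer = BrushRenderer()
    brushNode.rendererDelegate = brushRenderer    // keep a strong reference elsewhere; the delegate is not retained
    scene.rootNode.addChildNode(brushNode)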

So I set up a vertex shader like this:

vertex VertexOut brush_vertex_main(const VertexIn vertexIn            [[stage_in]],
                                   constant BrushNodeBuffer& scn_node [[buffer(BufferNode)]],
                                   constant BrushInstances *instances [[buffer(BufferBrush)]],
                                   uint instanceID                    [[instance_id]])
{
    // Apply the per-instance offset, keeping w = 1 so the projection behaves correctly
    // (adding two float4s that each have w = 1 would give w = 2).
    float4 vertexOffset = float4(vertexIn.position.xyz + instances[instanceID].offset.xyz, 1.0);

    // float4 vertexOffset = float4(vertexIn.position.xyz, 1.0);

    VertexOut out = {
        .position = scn_node.modelViewProjectionTransform * vertexOffset,
        .color    = vertexIn.color
    };

    return out;
}
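For reference, the shader only shows the GPU side; on the CPU side the matching types and buffer indices would look roughly like this (a sketch, since in the real project BufferNode / BufferBrush presumably come from a C header shared with the .metal file, and the exact layout is my assumption):

    import simd

    // Stand-ins for the shared buffer indices used as [[buffer(...)]] slots above.
    enum BrushBufferIndex: Int {
        case node  = 1    // BufferNode  -> BrushNodeBuffer
        case brush = 2    // BufferBrush -> BrushInstances
    }

    // Must match BrushNodeBuffer in the shader.
    struct BrushNodeBuffer {
        var modelViewProjectionTransform: simd_float4x4
    }

    // Must match one element of BrushInstances in the shader (one entry per instance).
    struct BrushInstance {
        var offset: simd_float3    // simd_float3 is 16 bytes in Swift, matching MSL float3
    }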

I did some binding as well to create a render pipeline state, e.g.:

        let defLib = wd.device!.makeDefaultLibrary()
        let vertFunc = defLib?.makeFunction(name: vertexFunctionName)
        let fragFunc = defLib?.makeFunction(name: fragmentFunctionName)

        // vertex descriptor (SceneKit geometries set this up under the hood anyway)
        let vertexDescriptor = MTLVertexDescriptor()
        // pos (SCNVertexSemanticPosition)
        vertexDescriptor.attributes[0].format = .float3
        vertexDescriptor.attributes[0].bufferIndex = 0
        vertexDescriptor.attributes[0].offset = 0
        // color ( SCNVertexSemanticColor)
        vertexDescriptor.attributes[3].format = .float3
        vertexDescriptor.attributes[3].bufferIndex = 0
        vertexDescriptor.attributes[3].offset = MemoryLayout<simd_float3>.stride
        vertexDescriptor.layouts[0].stride = MemoryLayout<simd_float3>.stride * 2
        
        let pipelineDescriptor = MTLRenderPipelineDescriptor()
        pipelineDescriptor.vertexFunction   = vertFunc
        pipelineDescriptor.fragmentFunction = fragFunc
        pipelineDescriptor.vertexDescriptor = vertexDescriptor
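For what it's worth, a vertex laid out to match that vertex descriptor would look like this on the Swift side (a sketch; the actual vertex buffer isn't shown in the post):

    import simd

    // Interleaved layout matching the descriptor above:
    // attribute 0 (position) at offset 0, attribute 3 (color) at offset 16,
    // stride = MemoryLayout<simd_float3>.stride * 2 = 32 bytes.
    struct BrushVertex {
        var position: simd_float3
        var color:    simd_float3
    }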

I created the buffers and set them properly in the render loop:

    rendererCmdEnc.setRenderPipelineState(brushRenderPipelineState!)
    rendererCmdEnc.setVertexBuffer(vertBuff!, offset: 0, index: 0)
    // node info
    rendererCmdEnc.setVertexBuffer(nodeBuff!, offset: 0, index: Int(BufferNode.rawValue))
    // per instance info
    rendererCmdEnc.setVertexBuffer(primBuff!, offset: 0, index: Int(BufferBrush.rawValue))

    rendererCmdEnc.drawIndexedPrimitives(type: .triangle,
                                         indexCount: primitiveIdx.count,
                                         indexType: .uint16,
                                         indexBuffer: indexBuff!,
                                         indexBufferOffset: 0,
                                         instanceCount: 6)
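And for completeness, the per-instance buffer (primBuff above) can be filled with plain value types. A sketch with made-up offsets, using the BrushInstance type sketched earlier, where device is the renderer's MTLDevice:

    // One BrushInstance per drawn instance; there must be at least `instanceCount` entries.
    let instances = (0..<6).map { BrushInstance(offset: simd_float3(Float($0) * 0.2, 0, 0)) }
    let primBuff = device.makeBuffer(bytes: instances,
                                     length: MemoryLayout<BrushInstance>.stride * instances.count,
                                     options: .storageModeShared)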

And I keep banging my head when this executes: I get a mismatch between my render pipeline state and the render pass descriptor. Either the colorAttachment or the sample count / rasterSampleCount is invalid:

-[MTLDebugRenderCommandEncoder setRenderPipelineState:]:1604: failed assertion `Set Render Pipeline State Validation For depth attachment, the texture sample count (1) does not match the renderPipelineState rasterSampleCount (4). The color sample count (1) does not match the renderPipelineState's color sample count (4) The raster sample count (1) does not match the renderPipelineState's raster sample count (4)

I set up the render pipeline state's colorAttachment to be as close as possible to the pass descriptor's, changed the rasterSampleCount, tried without anything specific for the colorAttachment, etc. Alas, either the API validation tells me I have the wrong colorAttachment info when I set the render pipeline state in the render loop, or, if I fix the color attachment info at pipeline state creation, it reports an invalid sample count.

In a nutshell: is there any way to do this kind of geometry instancing in a single node using SceneKit?

Thanks in advance for any support you'd find interesting to provide!

Replies

The short answer is: yes, this is very possible. Years ago I ported an open-source particle emitter system into SceneKit and Metal using an SCNNode rendererDelegate.

Your code snippets above don't show setting the colorAttachments pixel formats or the depthAttachmentPixelFormat on the MTLRenderPipelineDescriptor?

Thank you for your reply!

I did try to conform the pixel formats like this:

    (0..<8).forEach {
        pipelineDescriptor.colorAttachments[$0].pixelFormat =
            wd.currentRenderPassDescriptor.colorAttachments[$0].texture?.pixelFormat ?? MTLPixelFormat.invalid
        pipelineDescriptor.colorAttachments[$0].isBlendingEnabled = true
    }

where wd is the scene renderer (SCNSceneRenderer),

but to no avail

Do you by chance have a link to your particle instancing example?

  • Well well well! currentRenderPassDescriptor is only created for the scene renderer after my render pipeline descriptor was built. So the render pipeline state was set up with improper information (texture type, sample count, etc.). I am a bit in the fog as to why the pass descriptor only becomes valid after the first call to renderNode, but I guess that's for another thread :-) See the sketch after these comments.

  • and now it works :-)
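  • For anyone landing here later: the fix boils down to building the pipeline state lazily, on the first renderNode call, once the renderer's currentRenderPassDescriptor is valid, and matching its pixel formats and sample count. A sketch of such a helper (the function name is mine; pass it the scene renderer, i.e. what I call wd above, and the MTLRenderPipelineDescriptor from earlier):

    import SceneKit
    import Metal

    // Build the pipeline state lazily, inside renderNode, once the renderer's
    // currentRenderPassDescriptor is valid (it isn't before the first frame).
    func makeBrushPipelineState(renderer: SCNSceneRenderer,
                                descriptor: MTLRenderPipelineDescriptor) -> MTLRenderPipelineState? {
        guard let device = renderer.device else { return nil }
        let pass = renderer.currentRenderPassDescriptor
        descriptor.colorAttachments[0].pixelFormat = renderer.colorPixelFormat
        descriptor.depthAttachmentPixelFormat      = renderer.depthPixelFormat
        descriptor.stencilAttachmentPixelFormat    = renderer.stencilPixelFormat
        // Match whatever SceneKit is actually rendering into (antialiasingMode changes this).
        descriptor.sampleCount = pass.colorAttachments[0].texture?.sampleCount ?? 1
        return try? device.makeRenderPipelineState(descriptor: descriptor)
    }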
