Loading an image in the streaming loop always shows the extension in the Dock

I'm trying to load an image in a CoreMediaIO extension. I've successfully loaded the image and drawn it in three different ways: from the App Group container, from the extension bundle, and from a base64 string. The image loads and renders fine each time. However, as soon as I draw the image into a pixel buffer, the extension appears in the Dock. I'd love to be able to draw the image without the extension appearing in the Dock, but I can't figure this out. I can render text in this timer loop with no problem. Any suggestions other than sending the image across the sink stream?

Essentially, I want to render the app logo whenever the host app isn't sending a signal to the extension.
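For reference, the three loading paths look roughly like this (a sketch; the resource name, group identifier, and file names are placeholders rather than my exact code):

import AppKit
import Foundation

// 1. From the extension bundle (placeholder resource name).
let bundleImage = Bundle.main
    .url(forResource: "logo", withExtension: "png")
    .flatMap { NSImage(contentsOf: $0) }

// 2. From the shared App Group container (placeholder group identifier).
let groupImage = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.camera")
    .map { $0.appendingPathComponent("logo.png") }
    .flatMap { NSImage(contentsOf: $0) }

// 3. From a base64 string compiled into the extension
//    (imageBaseString is the same property used in the handler below).
let base64Image = Data(base64Encoded: imageBaseString, options: .ignoreUnknownCharacters)
    .flatMap { NSImage(data: $0) }

All three end up as an NSImage, and all three reproduce the problem described below.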

func startStreaming() {
        guard let _ = _bufferPool else {
            return
        }
        
        _streamingCounter += 1
        _timer = DispatchSource.makeTimerSource(flags: .strict, queue: _timerQueue)
        _timer!.schedule(deadline: .now(), repeating: 1.0/Double(kFrameRate), leeway: .seconds(0))
        
        _timer!.setEventHandler {
            if self.sinkStarted {
                return
            }
            
            var err: OSStatus = 0
            var pixelBuffer: CVPixelBuffer?
            
            err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, self._bufferPool, self._bufferAuxAttributes, &pixelBuffer)
            if err != 0 {
                logger.debug("out of pixel buffers \(err)")
            }
            
            if let pixelBuffer = pixelBuffer {
                CVPixelBufferLockBaseAddress(pixelBuffer, [])
                let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
                let width = CVPixelBufferGetWidth(pixelBuffer)
                let height = CVPixelBufferGetHeight(pixelBuffer)
                let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
                if let context = CGContext(data: pixelData,
                                           width: width,
                                           height: height,
                                           bitsPerComponent: 8,
                                           bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                           space: rgbColorSpace,
                                           bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue) {

                    let graphicsContext = NSGraphicsContext(cgContext: context, flipped: false)
                    NSGraphicsContext.saveGraphicsState()
                    NSGraphicsContext.current = graphicsContext
                    let cgContext = graphicsContext.cgContext
                    let dstRect = CGRect(x: 0, y: 0, width: width, height: height)
                    cgContext.clear(dstRect)
                    cgContext.setFillColor(NSColor.black.cgColor)
                    cgContext.fill(dstRect)

                    let imageWidth = 400 // You can adjust the width as needed
                    let imageHeight = 400 // You can adjust the height as needed
                    let imageOrigin = CGPoint(x: (width - imageWidth) / 2, y: (height - imageHeight) / 2) // Center the image
                    
                    if let decodedData = Data(base64Encoded: imageBaseString, options: .ignoreUnknownCharacters),
                       let image = NSImage(data: decodedData), let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) {
                        cgContext.draw(cgImage, in: CGRect(origin: imageOrigin, size: NSSize(width: imageWidth, height: imageHeight)))
                    }
                    
                    NSGraphicsContext.restoreGraphicsState()
                }
                CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
            }

            
            if let pixelBuffer = pixelBuffer {
                var sbuf: CMSampleBuffer!
                var timingInfo = CMSampleTimingInfo()
                timingInfo.presentationTimeStamp = CMClockGetTime(CMClockGetHostTimeClock())
                err = CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault, imageBuffer: pixelBuffer, dataReady: true, makeDataReadyCallback: nil, refcon: nil, formatDescription: self._videoDescription, sampleTiming: &timingInfo, sampleBufferOut: &sbuf)
                if err == 0 {
                    self._streamSource.stream.send(sbuf, discontinuity: [], hostTimeInNanoseconds: UInt64(timingInfo.presentationTimeStamp.seconds * Double(NSEC_PER_SEC)))
                }
            }
        }
        
        _timer!.setCancelHandler {}
        _timer!.resume()
    }
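For context, _bufferPool, _bufferAuxAttributes, and _videoDescription are set up elsewhere in the usual way, roughly like this (a sketch with placeholder dimensions and allocation threshold; my exact initialisation isn't shown here):

import CoreMedia
import CoreVideo

let dims = CMVideoDimensions(width: 1920, height: 1080)

// Format description for the BGRA frames the timer handler produces.
var videoDescription: CMFormatDescription!
CMVideoFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                               codecType: kCVPixelFormatType_32BGRA,
                               width: dims.width,
                               height: dims.height,
                               extensions: nil,
                               formatDescriptionOut: &videoDescription)

// Pixel buffer pool matching that format, backed by IOSurface.
let pixelBufferAttributes: NSDictionary = [
    kCVPixelBufferWidthKey: dims.width,
    kCVPixelBufferHeightKey: dims.height,
    kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
    kCVPixelBufferIOSurfacePropertiesKey: [String: Any]()
]
var bufferPool: CVPixelBufferPool!
CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, pixelBufferAttributes, &bufferPool)

// Cap how many buffers CVPixelBufferPoolCreatePixelBufferWithAuxAttributes may allocate.
let bufferAuxAttributes: NSDictionary = [kCVPixelBufferPoolAllocationThresholdKey: 5]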

Culprit is here:

if let decodedData = Data(base64Encoded: imageBaseString, options: .ignoreUnknownCharacters),
   let image = NSImage(data: decodedData),
   let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) {
    cgContext.draw(cgImage, in: CGRect(origin: imageOrigin, size: NSSize(width: imageWidth, height: imageHeight)))
}

It doesn't matter how I load the image: the extension appears in the Dock as soon as the draw loop runs, or at extension init if I load the image there instead. Is it possible to draw the image without the extension showing up in the Dock?
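For reference, a variation I could try would keep NSImage out of the handler entirely and decode the CGImage once up front with ImageIO, along these lines (just a sketch with placeholder names; whether creating an NSImage per frame is really what triggers the Dock icon is exactly what I don't know):

import CoreGraphics
import Foundation
import ImageIO

// Decode the base64 logo once, straight to a CGImage, with no NSImage involved.
func decodeLogoOnce(_ base64: String) -> CGImage? {
    guard let data = Data(base64Encoded: base64, options: .ignoreUnknownCharacters),
          let source = CGImageSourceCreateWithData(data as CFData, nil) else {
        return nil
    }
    return CGImageSourceCreateImageAtIndex(source, 0, nil)
}

// Stored once, e.g. at init: let logoImage = decodeLogoOnce(imageBaseString)
// The timer handler would then only do:
//     if let cgImage = logoImage {
//         cgContext.draw(cgImage, in: CGRect(origin: imageOrigin,
//                                            size: CGSize(width: imageWidth, height: imageHeight)))
//     }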

Replies

Err, the title should say the extension shows up in the Dock. 🙈