Core Media

Efficiently process media samples and manage queues of media data using Core Media.

Core Media Documentation

Posts under Core Media tag

25 Posts
Post not yet marked as solved
2 Replies
948 Views
Our app gets rejected for referencing the non-public symbol _CMTimebaseCreateWithMasterClock. We have confirmed that we are using the suggested CMTimebaseCreateWithSourceClock. Looking at our DerivedData, the compiler turns CMTimebaseCreateWithSourceClock into CMTimebaseCreateWithMasterClock, presumably because CMTimebaseCreateWithSourceClock is an inline function. We have verified that if we remove the one call to CMTimebaseCreateWithSourceClock, the offending symbol no longer appears in our DerivedData. This is preventing us from submitting an app update. Please advise ASAP.
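For reference, a minimal sketch of the replacement call in question, assuming the host clock as the source clock; this illustrates the pattern, not the poster's actual code:

```swift
import CoreMedia

// Minimal sketch: create a timebase from the host clock using the
// non-deprecated API. In current SDK headers this function is an inline
// wrapper, which is why the older symbol can surface in the built binary.
var timebase: CMTimebase?
let status = CMTimebaseCreateWithSourceClock(
    allocator: kCFAllocatorDefault,
    sourceClock: CMClockGetHostTimeClock(),
    timebaseOut: &timebase
)
assert(status == noErr, "CMTimebaseCreateWithSourceClock failed: \(status)")
```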
Posted by CaseySF. Last updated.
Post not yet marked as solved
0 Replies
629 Views
How does one get the list of controls which a CMIOObject has to offer? How do the objects in the CMIO hierarchy map to CMIOExtension objects? I expected the hierarchy to be something like this: the system has owned objects of type 'aplg' (`kCMIOPlugInClassID`), which have owned objects of type 'adev' (`kCMIODeviceClassID`), which may have owned objects of type 'actl' (`kCMIOControlClassID`) and have at least one owned object of type 'astr' (`kCMIOStreamClassID`), each of which may have owned objects of type 'actl' (`kCMIOControlClassID`).

Instead, when I recursively traverse the object hierarchy, I find the devices and the plug-ins at the same level (under the system object). Only some of the devices in my system have owned streams, although they all have a kCMIODevicePropertyStreams ('stm#') property. None of the devices or streams appear to have any controls, and none of the streams have any owned objects. I'm not using the qualifier when searching for owned objects, because the documentation implies that it may be nil if I'm not interested in narrowing my search.

Should I expect to find any devices or streams with controls? And if so, how do I get a list of them? CMIOHardwareObject.h says that "Wildcards... are especially useful ...for querying an CMIOObject's list of CMIOControls", but there's no example of how to do this.

My own device (from my camera extension) has no owned objects of type stream. I don't see any API call to convey that the stream I create is owned by the device it belongs to. How does the OS decide that a stream is 'owned' by a device?

I've tried various scopes and elements (kCMIOObjectPropertyScopeGlobal, kCMIOObjectPropertyScopeWildcard, kCMIOControlPropertyScope, and kCMIOObjectPropertyElementMain, kCMIOObjectPropertyElementWildcard, kCMIOControlPropertyElement), but I can't get a list of controls using any of these.

Ultimately, I'm trying to find my provider, my devices, and my streams using the CMIO interface, so that I can set and query properties on them. Is it reasonable to assume that the CMIOObject of type 'aplg' is the one corresponding to a CMIOExtensionProviderSource? This is on Ventura 13.4.1 on M1.
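For reference, a minimal sketch of the owned-object query pattern being described, assuming `kCMIOObjectPropertyOwnedObjects` with a nil (wildcard) qualifier; the helper name is illustrative, not part of any posted code:

```swift
import CoreMediaIO

// Hypothetical helper: list the objects owned by `objectID` via
// kCMIOObjectPropertyOwnedObjects, passing a nil qualifier.
func ownedObjects(of objectID: CMIOObjectID) -> [CMIOObjectID] {
    var address = CMIOObjectPropertyAddress(
        mSelector: CMIOObjectPropertySelector(kCMIOObjectPropertyOwnedObjects),
        mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
        mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMain))

    // First ask for the size of the result, then fetch the object IDs.
    var dataSize: UInt32 = 0
    guard CMIOObjectGetPropertyDataSize(objectID, &address, 0, nil, &dataSize) == 0,
          dataSize > 0 else { return [] }

    var objects = [CMIOObjectID](repeating: 0,
                                 count: Int(dataSize) / MemoryLayout<CMIOObjectID>.size)
    var dataUsed: UInt32 = 0
    guard CMIOObjectGetPropertyData(objectID, &address, 0, nil,
                                    dataSize, &dataUsed, &objects) == 0
    else { return [] }
    return objects
}

// Usage: walk the hierarchy starting at the system object.
let children = ownedObjects(of: CMIOObjectID(kCMIOObjectSystemObject))
```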
Posted by ssmith_c. Last updated.
Post not yet marked as solved
1 Reply
562 Views
I'm trying to load an image in a CoreMediaIO extension. I've successfully loaded the image and drawn it in 3 different ways: once from the App Group, once from the extension bundle, and another time as a base64 string. No problems; each time the image loads and renders. However, when I call for the image to be drawn into a pixel buffer, it loads the extension into the Dock. I'd love to be able to draw an image without the extension appearing in the Dock, but I cannot figure this out. I can render text in this timer loop no problem. Any suggestions other than sending the image across the sink stream? Essentially I want to render the app logo whenever the host app is not sending a signal to the extension.

```swift
func startStreaming() {
    guard let _ = _bufferPool else { return }
    _streamingCounter += 1
    _timer = DispatchSource.makeTimerSource(flags: .strict, queue: _timerQueue)
    _timer!.schedule(deadline: .now(), repeating: 1.0 / Double(kFrameRate), leeway: .seconds(0))
    _timer!.setEventHandler {
        if self.sinkStarted { return }
        var err: OSStatus = 0
        var pixelBuffer: CVPixelBuffer?
        err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, self._bufferPool, self._bufferAuxAttributes, &pixelBuffer)
        if err != 0 {
            logger.debug("out of pixel buffers \(err)")
        }
        if let pixelBuffer = pixelBuffer {
            CVPixelBufferLockBaseAddress(pixelBuffer, [])
            let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
            let width = CVPixelBufferGetWidth(pixelBuffer)
            let height = CVPixelBufferGetHeight(pixelBuffer)
            let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
            if let context = CGContext(data: pixelData,
                                       width: width,
                                       height: height,
                                       bitsPerComponent: 8,
                                       bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                       space: rgbColorSpace,
                                       bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue) {
                let graphicsContext = NSGraphicsContext(cgContext: context, flipped: false)
                NSGraphicsContext.saveGraphicsState()
                NSGraphicsContext.current = graphicsContext
                let cgContext = graphicsContext.cgContext
                let dstRect = CGRect(x: 0, y: 0, width: width, height: height)
                cgContext.clear(dstRect)
                cgContext.setFillColor(NSColor.black.cgColor)
                cgContext.fill(dstRect)
                let imageWidth = 400  // You can adjust the width as needed
                let imageHeight = 400 // You can adjust the height as needed
                let imageOrigin = CGPoint(x: (width - imageWidth) / 2, y: (height - imageHeight) / 2) // Center the image
                if let decodedData = Data(base64Encoded: imageBaseString, options: .ignoreUnknownCharacters),
                   let image = NSImage(data: decodedData),
                   let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) {
                    cgContext.draw(cgImage, in: CGRect(origin: imageOrigin, size: NSSize(width: imageWidth, height: imageHeight)))
                }
                NSGraphicsContext.restoreGraphicsState()
            }
            CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
        }
        if let pixelBuffer = pixelBuffer {
            var sbuf: CMSampleBuffer!
            var timingInfo = CMSampleTimingInfo()
            timingInfo.presentationTimeStamp = CMClockGetTime(CMClockGetHostTimeClock())
            err = CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                     imageBuffer: pixelBuffer,
                                                     dataReady: true,
                                                     makeDataReadyCallback: nil,
                                                     refcon: nil,
                                                     formatDescription: self._videoDescription,
                                                     sampleTiming: &timingInfo,
                                                     sampleBufferOut: &sbuf)
            if err == 0 {
                self._streamSource.stream.send(sbuf,
                                               discontinuity: [],
                                               hostTimeInNanoseconds: UInt64(timingInfo.presentationTimeStamp.seconds * Double(NSEC_PER_SEC)))
            }
        }
    }
    _timer!.setCancelHandler {}
    _timer!.resume()
}
```

The culprit is here:

```swift
if let decodedData = Data(base64Encoded: imageBaseString, options: .ignoreUnknownCharacters),
   let image = NSImage(data: decodedData),
   let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) {
    cgContext.draw(cgImage, in: CGRect(origin: imageOrigin, size: NSSize(width: imageWidth, height: imageHeight)))
}
```

It doesn't matter how I load the image: the extension appears in the Dock as soon as the draw loop is triggered, or on extension init if I load the image then. Is this possible?
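For what it's worth, a hedged alternative sketch: decode the logo with ImageIO instead of NSImage, so the image load never goes through AppKit. Whether this avoids the Dock activation is an assumption worth testing; `imageBaseString` is the same base64 payload as above, and the function name is illustrative:

```swift
import Foundation
import ImageIO

// Hypothetical helper: decode a base64 PNG/JPEG payload straight to a
// CGImage via ImageIO, bypassing NSImage entirely.
func decodeLogoCGImage(from imageBaseString: String) -> CGImage? {
    guard let data = Data(base64Encoded: imageBaseString, options: .ignoreUnknownCharacters),
          let source = CGImageSourceCreateWithData(data as CFData, nil) else {
        return nil
    }
    return CGImageSourceCreateImageAtIndex(source, 0, nil)
}
```

The resulting CGImage can be drawn with the same `cgContext.draw(_:in:)` call as in the snippet above.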
Posted. Last updated.
Post marked as solved
3 Replies
783 Views
I am currently writing a software product which involves a Camera Extension and a Cocoa application. I would like to share some files between the two components, and as I understand it this should be quite straightforward: put both applications into the same App Group and then access that group's container. However, doing so results in the two components accessing different locations for the group container. I am using the following piece of code to create a new folder inside the container:

```swift
let directory = FileManager.default.containerURL(forSecurityApplicationGroupIdentifier: "group.235ADAK9D5.com.creativetoday.camskool")!
let newDirectory = directory.appendingPathComponent("Mydir")
try? FileManager.default.createDirectory(at: newDirectory, withIntermediateDirectories: false)
```

If I run this, I find that the Cocoa application accesses the following location and creates the directory there: /Users//Library/Group Containers//

Whereas the Camera Extension accesses the following location and creates the directory there: /private/var/db/cmiodalassistants/Library/Group Containers//

If I create a file in one directory, it does not appear in the other. I tried having each component access the opposite directory, but it results in a permission-denied error. What am I doing wrong?
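As a minimal diagnostic sketch, assuming nothing beyond the group identifier from the post: log the resolved container URL from each component and compare the two paths in Console (the subsystem and category strings are illustrative):

```swift
import Foundation
import os

// Illustrative: log where this process resolves the shared group container.
let log = Logger(subsystem: "com.creativetoday.camskool", category: "group-container")
if let url = FileManager.default.containerURL(
    forSecurityApplicationGroupIdentifier: "group.235ADAK9D5.com.creativetoday.camskool") {
    log.info("Group container resolves to: \(url.path, privacy: .public)")
} else {
    log.error("No container URL; check the App Group entitlement.")
}
```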
Posted by SimonL90. Last updated.
Post not yet marked as solved
0 Replies
614 Views
My current app implements a custom video player based on an AVSampleBufferRenderSynchronizer synchronizing two renderers: an AVSampleBufferDisplayLayer receiving decoded CVPixelBuffer-based video CMSampleBuffers, and an AVSampleBufferAudioRenderer receiving decoded LPCM audio CMSampleBuffers. The AVSampleBufferRenderSynchronizer is started when the first image (in presentation order) is decoded and enqueued, using avSynchronizer.setRate(_ rate: Float, time: CMTime) with rate = 1 and time set to the presentation timestamp of the first decoded image. Presentation timestamps of video and audio sample buffers are consistent, and on most streams the audio and video are correctly synchronized. However, on some network streams on iOS, the audio and video aren't synchronized, with a time difference that seems to increase over time. On the other hand, with the same player code and network streams on macOS, the synchronization always works fine. This reminds me of something I've read about cases where an AVSampleBufferRenderSynchronizer could not synchronize audio and video, causing them to run on independent and potentially drifting clocks, but I cannot find it again. So, any help / hints on this sync problem will be greatly appreciated! :)
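For context, a minimal sketch of the setup being described, with illustrative names; this is an assumed reconstruction, not the poster's actual code:

```swift
import AVFoundation

// Two renderers driven by one synchronizer, as described above.
let synchronizer = AVSampleBufferRenderSynchronizer()
let videoLayer = AVSampleBufferDisplayLayer()
let audioRenderer = AVSampleBufferAudioRenderer()
synchronizer.addRenderer(videoLayer)
synchronizer.addRenderer(audioRenderer)

// Start playback anchored to the first decoded video frame's timestamp,
// at rate 1 (real time).
func startPlayback(firstVideoPTS: CMTime) {
    synchronizer.setRate(1.0, time: firstVideoPTS)
}
```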
Posted by jean-luc. Last updated.