Post not yet marked as solved
We find that ReplayKit, as of iOS 9.3.2 (not yet tested on iOS 10), is unable to record audio output when the audio unit subtype is kAudioUnitSubType_VoiceProcessingIO and ReplayKit is started via startRecordingWithMicrophoneEnabled(false). When we switch to the kAudioUnitSubType_RemoteIO subtype, ReplayKit works as expected. This is for a live video app in which viewers receive live video and audio. We do not use AVPlayer (AVPlayer is the only incompatibility listed in the docs). Are there any workarounds that allow kAudioUnitSubType_VoiceProcessingIO to be used with ReplayKit?
Thanks,
Robert
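The workaround the post itself reports is to fall back to the RemoteIO subtype while a ReplayKit session is active (at the cost of losing VoiceProcessingIO's echo cancellation). A minimal sketch of creating the I/O unit with that subtype, using the standard AudioToolbox setup; error handling is trimmed for brevity:

```swift
import AudioToolbox

// Sketch: build the I/O audio unit with the RemoteIO subtype instead of
// kAudioUnitSubType_VoiceProcessingIO, which the post reports breaks
// ReplayKit's audio capture.
var desc = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_RemoteIO, // not VoiceProcessingIO
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0
)
if let component = AudioComponentFindNext(nil, &desc) {
    var unit: AudioComponentInstance?
    if AudioComponentInstanceNew(component, &unit) == noErr, let unit = unit {
        // Configure stream formats and render callbacks on `unit` as before,
        // then initialize and start it.
        AudioUnitInitialize(unit)
        AudioOutputUnitStart(unit)
    }
}
```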
Post not yet marked as solved
This error occurs when my app goes to the background after starting a screen recording. Each time it happens, the process is killed and recording cannot continue. How can this error be avoided or handled?
Post not yet marked as solved
I want to develop an application that can share the screen (display) from an iPhone (or from one iPhone to another) for remote support. For this I need a screen-capture application, API, or system call. Is there any app on the iPhone or in the App Store, or any screen-sharing client, that provides screen-capture support?
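For capturing your own app's screen, ReplayKit's RPScreenRecorder.startCapture is the usual entry point. A minimal sketch follows; the transport that actually delivers frames to the remote peer (WebRTC, a custom socket, etc.) is a hypothetical `send` closure, since it depends on your networking stack:

```swift
import ReplayKit

// Sketch: capture the app's own screen with ReplayKit and hand each video
// frame to a (hypothetical) sender for remote-support streaming.
func beginScreenShare(send: @escaping (CMSampleBuffer) -> Void) {
    let recorder = RPScreenRecorder.shared()
    guard recorder.isAvailable else { return }
    recorder.startCapture(handler: { sampleBuffer, bufferType, error in
        guard error == nil, bufferType == .video else { return }
        send(sampleBuffer) // forward to your transport
    }, completionHandler: { error in
        if let error = error { print("Capture failed: \(error)") }
    })
}
```

Note this captures only while your app is in the foreground; capturing the whole device screen from other apps requires a broadcast upload extension instead.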
Post not yet marked as solved
I have in-app screen recording functionality in my app, and while recording I want to exclude a PHPickerViewController/UIDocumentPickerViewController from being recorded if it is opened during recording. How can I achieve this? Is there any configuration in ReplayKit that provides this behaviour?
Post not yet marked as solved
My existing code works properly on devices running iOS versions earlier than 17: it records the iPhone screen and audio simultaneously. On iOS 17 devices, however, the screen recording is captured for only 2 seconds and then stops automatically. As this runs in an extension, I don't have logs to debug the issue.
I have tested the same code on other iPhones running OS versions below 17 and it works fine; the issue appears only on iOS 17 devices.
- (void)initAssetWriter {
    @try {
        NSLog(@"initAssetWriter");
        NSError *error = nil;
        CGRect screenRect = [[UIScreen mainScreen] bounds];
        _videoWriter = [[AVAssetWriter alloc] initWithURL:_filePath
                                                 fileType:AVFileTypeMPEG4
                                                    error:&error];
        NSParameterAssert(_videoWriter);

        // Configure video
        NSDictionary *videoCompressionProps = @{ AVVideoAverageBitRateKey : @(2048 * 1024.0) };
        NSDictionary *videoSettings = @{
            AVVideoCodecKey : AVVideoCodecTypeH264,
            AVVideoWidthKey : @((int)(screenRect.size.width * 4)),
            AVVideoHeightKey : @((int)(screenRect.size.height * 4)),
            AVVideoCompressionPropertiesKey : videoCompressionProps
        };
        _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:videoSettings];
        _writerInput.expectsMediaDataInRealTime = YES;
        NSParameterAssert(_writerInput);
        NSParameterAssert([_videoWriter canAddInput:_writerInput]);
        [_videoWriter addInput:_writerInput];

        // Configure audio
        AudioChannelLayout acl;
        bzero(&acl, sizeof(acl));
        acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
        NSDictionary *audioOutputSettings = @{
            AVFormatIDKey : @(kAudioFormatMPEG4AAC),
            AVNumberOfChannelsKey : @1,
            AVSampleRateKey : @44100.0f,
            AVChannelLayoutKey : [NSData dataWithBytes:&acl length:sizeof(AudioChannelLayout)],
            AVEncoderBitRateKey : @64000
        };
        _audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                               outputSettings:audioOutputSettings];
        _audioWriterInput.expectsMediaDataInRealTime = YES; // seems to work slightly better
        NSParameterAssert(_audioWriterInput);
        NSParameterAssert([_videoWriter canAddInput:_audioWriterInput]);
        [_videoWriter addInput:_audioWriterInput];

        [_videoWriter setMovieFragmentInterval:CMTimeMake(1, 600)];
        [_videoWriter startWriting];
    } @catch (NSException *exception) {
    } @finally {
    }
}
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    @try {
        if (!_isRecordingStarted) {
            [_videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            _isRecordingStarted = YES;
            [self saveFlurryLogs:@"Asset writer start recording" Details:@""];
            NSLog(@"CMSampleBufferGetPresentationTimeStamp");
        }
    } @catch (NSException *exception) {
        [self saveFlurryLogs:@"Recording start exception" Details:exception.description];
    } @finally {
    }

    @try {
        switch (sampleBufferType) {
            case RPSampleBufferTypeVideo:
                // Handle video sample buffer
                if ([_writerInput isReadyForMoreMediaData]) {
                    [_writerInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"writing metadata Video");
                }
                break;
            case RPSampleBufferTypeAudioApp:
                // Handle audio sample buffer for app audio
                break;
            case RPSampleBufferTypeAudioMic:
                // Handle audio sample buffer for mic audio
                if ([_audioWriterInput isReadyForMoreMediaData]) {
                    [_audioWriterInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"writing metadata Audio");
                }
                break;
            default:
                break;
        }
    } @catch (NSException *exception) {
        [self saveFlurryLogs:@"Packet write exception" Details:exception.description];
    } @finally {
    }
}
Post not yet marked as solved
We just get a black screen, with no audio and no video!
Post not yet marked as solved
Hello everyone.
I am trying to make a small utility that, in the context of digital forensics, records the desktop.
The utility is to be started from the shell like this:
"./nemeapp start path_to_file" and terminated with "./nemeapp stop".
The code I wrote is:
import Foundation
import ReplayKit

let arguments = CommandLine.arguments
guard arguments.count == 4 else {
    print("Usage: script_name start|stop file_path include_audio(true|false)")
    exit(0)
}

let command = arguments[1]
let filePath = arguments[2]
let includeAudio = arguments[3] == "true"

switch command {
case "start":
    startScreenRecording(filePath: filePath, includeAudio: includeAudio)
case "stop":
    stopScreenRecording()
default:
    print("Unrecognized command. Usage: script_name start|stop file_path include_audio(true|false)")
}

func startScreenRecording(filePath: String, includeAudio: Bool) {
    if RPScreenRecorder.shared().isAvailable {
        RPScreenRecorder.shared().startRecording(handler: { error in
            if let unwrappedError = error {
                print("Error while starting the recording: \(unwrappedError.localizedDescription)")
            } else {
                print("Screen recording started successfully. The file will be saved to: \(filePath)")
            }
        })
    } else {
        print("Screen recording is not available.")
    }
}

func stopScreenRecording() {
    RPScreenRecorder.shared().stopRecording { previewViewController, error in
        if let unwrappedError = error {
            print("Error while stopping the recording: \(unwrappedError.localizedDescription)")
        } else {
            print("Screen recording stopped successfully.")
        }
    }
}
Unfortunately, the code prints no error message. Only when I give the stop command does it tell me that the recording never started.
I can't even figure out whether it is a permissions issue.
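One likely contributor (an assumption, not confirmed in the post): startRecording(handler:) is asynchronous, and a command-line process that falls off the end of main is terminated before the recording ever begins, which would also explain why no error is printed. A minimal sketch that keeps the process alive until the handler fires and the user interrupts it; `filePath` stands in for the destination argument from the original utility:

```swift
import Foundation
import ReplayKit

// Sketch: keep a command-line tool alive so the asynchronous
// startRecording(handler:) callback can fire and the recording can run.
func startAndWait(filePath: String) {
    RPScreenRecorder.shared().startRecording { error in
        if let error = error {
            print("Error while starting the recording: \(error.localizedDescription)")
            exit(1)
        }
        print("Recording started; press Ctrl-C to stop. Target file: \(filePath)")
    }
    // Without this, the process exits immediately and the recording
    // session is torn down before it ever starts.
    RunLoop.main.run()
}
```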
Post not yet marked as solved
I just used ReplayKit to implement simple screen recording, but sometimes the recording stops automatically, and frequently; I don't know why.
Has anyone seen this bug?
APIs used:
func startCapture(handler captureHandler: ((CMSampleBuffer, RPSampleBufferType, Error?) -> Void)?, completionHandler: ((Error?) -> Void)? = nil)
open func stopCapture(handler: ((Error?) -> Void)? = nil)
The prerequisite for monitoring the automatic stop is that screen recording has already started and stopCapture has not been actively called.
let publisher = recorder.publisher(for: \.isRecording)
let sink = Subscribers.Sink<Bool, Never>(receiveCompletion: { [weak self] completion in
    // handle completion
}, receiveValue: { [weak self] isRecording in
    // isRecording flips to false when the system stops the capture
})
publisher.subscribe(sink)
Post not yet marked as solved
I'm currently working on a live screen broadcasting app that lets users record their screen and save an mp4 video. I write the video file with AVAssetWriter, and it works fine. But when only 1GB–2GB of storage space remains on the device, errors such as "Attempted to start an invalid broadcast session" frequently occur, and the video files cannot be played because assetWriter.finishWriting() is never called.
Occurs on these devices:
iPhone SE 3
iPhone 12 Pro Max
iPhone 13
iPad 19
iPad Air 5
I have tried the movieFragmentInterval of AVAssetWriter to write movie fragments, and set shouldOptimizeForNetworkUse to true/false, but it didn't help: the video cannot be played.
I want to know how to observe or catch this error. Thanks!
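One approach, sketched under assumptions rather than taken from the post: check the volume's available capacity before starting (the 2 GB threshold here is an arbitrary example), and inspect the writer's `status`/`error` whenever an append fails, so `finishWriting()` can still be called while the writer is healthy:

```swift
import Foundation
import AVFoundation

// Sketch: query remaining disk space via the documented resource key.
func availableCapacityBytes() -> Int64? {
    let home = URL(fileURLWithPath: NSHomeDirectory())
    let values = try? home.resourceValues(forKeys: [.volumeAvailableCapacityForImportantUsageKey])
    return values?.volumeAvailableCapacityForImportantUsage
}

// Sketch: append a sample buffer and surface writer failures immediately.
func appendChecked(_ buffer: CMSampleBuffer,
                   to input: AVAssetWriterInput,
                   writer: AVAssetWriter) {
    guard input.isReadyForMoreMediaData, input.append(buffer) else {
        // On a failed append, writer.status becomes .failed and writer.error
        // carries the underlying reason (e.g. out of disk space).
        if writer.status == .failed {
            print("Writer failed: \(String(describing: writer.error))")
        }
        return
    }
}
```

Stopping the recording proactively when `availableCapacityBytes()` drops below a threshold gives the writer a chance to finish the file before the system rejects the broadcast session.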
Post not yet marked as solved
I am using ReplayKit's RPScreenRecorder to record my app. When I use it in a mixed immersive space, nothing is actually recorded. The video is entirely blank.
Is this a feature or a bug? I am trying to record everything the user sees, including passthrough. Is there another way to do this?
Post not yet marked as solved
I have a conflict when I use ReplayKit: another app, or the system itself, is already using screen recording. I want my app to recognize this and let them use ReplayKit preferentially (i.e., yield the recording to them).