Unity audio streamed through AVAudioSession plays at a different volume and isn't captured by screen recordings?

Our app is a game written in Unity, and most of our audio playback is handled by Unity. However, one of our game experiences uses microphone input for speech recognition, so in order to perform echo cancellation while the game is playing audio, we set up an audio stream from Unity to native Swift code that mixes the input/output nodes.
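
For reference, the playback bridge looks roughly like this simplified sketch (the class and function names here are illustrative placeholders, not our exact code, and the 48 kHz stereo format is an assumption):

import AVFoundation

final class UnityAudioBridge {
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()
    // Assumes Unity delivers 48 kHz stereo float PCM; adjust to match the actual stream.
    private let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!

    func start() throws {
        engine.attach(playerNode)
        // The Unity stream feeds the main mixer, which feeds the hardware output.
        engine.connect(playerNode, to: engine.mainMixerNode, format: format)
        try engine.start()
        playerNode.play()
    }

    // Called with each PCM buffer streamed out of Unity.
    func pushBufferFromUnity(_ buffer: AVAudioPCMBuffer) {
        playerNode.scheduleBuffer(buffer, completionHandler: nil)
    }
}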

However, we found that when streaming the audio buffers to our AVAudioSession:

  1. The audio streamed through AVAudioSession plays back at a different volume than expected.
  2. When capturing a screen recording of the app, the audio played through AVAudioSession is not captured at all.

We are looking to figure out what could be causing the discrepancy in playback volume, as well as the capture behaviour during screen recordings.

We set up the AVAudioSession with this configuration:

try AVAudioSession.sharedInstance().setCategory(.playAndRecord, options: .mixWithOthers)

with

try inputNode.setVoiceProcessingEnabled(true)

after connecting our IO and mixer nodes.
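
Putting it together, the full setup sequence looks roughly like this (simplified; the setActive call and the exact node graph are condensed for illustration and are not our literal code):

import AVFoundation

func configureAudio(engine: AVAudioEngine) throws {
    // 1. Configure and activate the session before starting the engine.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, options: .mixWithOthers)
    try session.setActive(true)

    // 2. Connect the IO and mixer nodes (our real graph also includes the Unity player node).
    let inputFormat = engine.inputNode.outputFormat(forBus: 0)
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: inputFormat)

    // 3. Enable voice processing (echo cancellation) after the connections are made,
    //    while the engine is still stopped.
    try engine.inputNode.setVoiceProcessingEnabled(true)

    try engine.start()
}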

Any suggestions or ideas on what to look out for would be appreciated!