Audio / Video sync issue on iOS using AVSampleBufferRenderSynchronizer

My current app implements a custom video player, based on an AVSampleBufferRenderSynchronizer synchronizing two renderers:

  • an AVSampleBufferDisplayLayer receiving decoded CVPixelBuffer-based video CMSampleBuffers,
  • and an AVSampleBufferAudioRenderer receiving decoded LPCM-based audio CMSampleBuffers.
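For context, the setup looks roughly like this (a minimal sketch; the variable names `synchronizer`, `videoLayer`, and `audioRenderer` are illustrative, not my actual code):

```swift
import AVFoundation

let synchronizer = AVSampleBufferRenderSynchronizer()
let videoLayer = AVSampleBufferDisplayLayer()
let audioRenderer = AVSampleBufferAudioRenderer()

// Both renderers conform to AVQueuedSampleBufferRendering and are
// slaved to the synchronizer's timebase once added.
synchronizer.addRenderer(videoLayer)
synchronizer.addRenderer(audioRenderer)
```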

The AVSampleBufferRenderSynchronizer is started when the first image (in presentation order) is decoded and enqueued, using avSynchronizer.setRate(_ rate: Float, time: CMTime), with rate = 1 and time set to the presentation timestamp of that first decoded image.
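Concretely, the start looks like this (sketch; `firstVideoPTS` is a hypothetical CMTime captured from the first enqueued video sample buffer):

```swift
import AVFoundation

// firstVideoPTS: presentation timestamp of the first decoded frame,
// obtained e.g. via CMSampleBufferGetPresentationTimeStamp(_:).
let firstVideoPTS = CMTime(value: 0, timescale: 600) // placeholder value

// Anchor the synchronizer's timebase at that PTS and start playback.
synchronizer.setRate(1.0, time: firstVideoPTS)
```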

Presentation timestamps of video and audio sample buffers are consistent, and on most streams, the audio and video are correctly synchronized.

However, on some network streams on iOS, the audio and video aren't synchronized, with a time difference that seems to increase over time.

On the other hand, with the same player code and the same network streams on macOS, the synchronization always works fine.

This reminds me of something I once read about cases where an AVSampleBufferRenderSynchronizer cannot synchronize audio and video, causing them to run on independent and potentially drifting clocks; unfortunately, I cannot find that reference again.
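To quantify the drift, I can log the offset between the video layer's timebase and the synchronizer's timebase once per second (a sketch, assuming the `synchronizer` and `videoLayer` objects from the player setup; names are illustrative):

```swift
import AVFoundation

// Periodically print how far the video layer's timebase has diverged
// from the synchronizer's timebase. A steadily growing value would
// confirm that the two clocks are drifting apart.
func startDriftMonitor(synchronizer: AVSampleBufferRenderSynchronizer,
                       videoLayer: AVSampleBufferDisplayLayer) -> Timer {
    Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { _ in
        let videoTime = videoLayer.timebase.time
        let syncTime = synchronizer.timebase.time
        let drift = CMTimeSubtract(videoTime, syncTime)
        print("video/sync drift: \(CMTimeGetSeconds(drift)) s")
    }
}
```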

So, any help / hints on this sync problem will be greatly appreciated! :)