HTTP Live Streaming


Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

HTTP Live Streaming Documentation

Posts under HTTP Live Streaming tag

92 Posts
Post not yet marked as solved
3 Replies
337 Views
Hi, the Apple-provided sample program “SideBySideToMVHEVC” removes the audio track from our videos, resulting in unusable content. Could you advise on either the code changes required to keep the audio track, or an Apple tool that can re-insert the audio track after running “SideBySideToMVHEVC”? Sincerely, Olaf
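In the meantime, one workaround worth sketching (an assumption, not an Apple-endorsed fix): copy the MV-HEVC video track from the converted file and the audio track from the original source into an AVMutableComposition, then export with a passthrough preset. The file names below are placeholders, and whether passthrough export preserves the MV-HEVC layout is worth verifying.

```swift
import AVFoundation

// Hypothetical inputs: the silent MV-HEVC output and the original side-by-side source.
let videoURL = URL(fileURLWithPath: "converted_mvhevc.mov")
let audioURL = URL(fileURLWithPath: "original_sbs.mov")

func remuxAudio() async throws {
    let videoAsset = AVURLAsset(url: videoURL)
    let audioAsset = AVURLAsset(url: audioURL)
    let composition = AVMutableComposition()

    // Copy the MV-HEVC video track from the converted file.
    if let srcVideo = try await videoAsset.loadTracks(withMediaType: .video).first,
       let dstVideo = composition.addMutableTrack(withMediaType: .video,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) {
        let duration = try await videoAsset.load(.duration)
        try dstVideo.insertTimeRange(CMTimeRange(start: .zero, duration: duration),
                                     of: srcVideo, at: .zero)
    }

    // Copy the audio track from the original source.
    if let srcAudio = try await audioAsset.loadTracks(withMediaType: .audio).first,
       let dstAudio = composition.addMutableTrack(withMediaType: .audio,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) {
        let duration = try await audioAsset.load(.duration)
        try dstAudio.insertTimeRange(CMTimeRange(start: .zero, duration: duration),
                                     of: srcAudio, at: .zero)
    }

    // Passthrough export so neither track is re-encoded.
    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetPassthrough) else { return }
    export.outputURL = URL(fileURLWithPath: "output_with_audio.mov")
    export.outputFileType = .mov
    await export.export()
}
```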
Post not yet marked as solved
1 Reply
414 Views
I've encountered an issue with the seek bar time display in the video player on iOS 17, specifically affecting live stream videos using HLS manifests with the time displayed in am/pm format. As the video progresses, the displayed start time appears to shift backwards in time with a peculiar pattern: Displayed Start Time = Normal Start Time - Viewed Duration For instance, if a program begins at 9:00 AM, at 9:30 AM, the start time shown will erroneously be 8:30 AM. Similarly, at 9:40 AM, the displayed start time will be 8:20 AM. This issue is observed with both VideoPlayer and AVPlayerViewController on iOS 17. The same implementation of the video player on iOS 16 displays the duration of the viewed program and doesn’t have any issues. Please advise on any known workarounds or solutions to address this issue on iOS 17.
Post not yet marked as solved
2 Replies
620 Views
I have a visionOS app that streams 180VR video to create immersive experiences, similar to the Apple TV+ immersive content. These videos are 8K 60 fps. Looking at "video encoding requirements" (1.25) in the HLS authoring specification, there are only entries for up to 4K 30 fps for MV-HEVC. Using the "power of 0.75 rule", I think 160,000 kbps could be close to an 8K 60 fps recommendation. For my app's launch, I made my best guess and created the following multivariant playlist, with a generous low end and a very generous high end, and 6-second segment durations for all variants.

In practice, however, users seem to only be picking up the lower-quality bandwidth content (80,000 kbps), even when speed tests on the device show 5x that. This leads to a lot of artifacts in the content during playback, since it's lower quality. If I hard-code a higher-bitrate variant (like 240,000 kbps) it does play back, but obviously takes a bit longer to start. Now that I have my Vision Pro, I've been able to watch the Apple TV+ immersive content. I could be wrong, but it doesn't feel like its playback varies: whether watched tethered to my phone or on high-speed Wi-Fi, the content looks the same, just a little slower to load on the phone.

I'm looking for 3 points of guidance:
1. Are there HLS recommendations for 8K 60 fps video, for both bitrates and target segment durations?
2. Any guesses as to why I am not able to pick up the high bitrates? (It could simply be that the higher ends are still too high.)
3. While 180VR is just stereo video on a larger canvas, the viewing experience is quite different due to the immersion. Are there special recommendations for 180VR video? (Such as having only one variant at a specified bitrate, since a changing bitrate/video quality could be jarring to the viewer.)

Example HLS multivariant playlist:
#EXTM3U
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=80632574,BANDWIDTH=82034658,VIDEO-RANGE=SDR,CODECS="mp4a.40.2,hvc1.1.60000000.L183.B0",RESOLUTION=4096x4096,FRAME-RATE=59.940,CLOSED-CAPTIONS=NONE
https://myurl.com/stream/t4096p_080000_kbps_t6/prog_index.m3u8
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=160782750,BANDWIDTH=162523567,VIDEO-RANGE=SDR,CODECS="mp4a.40.2,hvc1.1.60000000.L186.B0",RESOLUTION=4096x4096,FRAME-RATE=59.940,CLOSED-CAPTIONS=NONE
https://myurl.com/stream/t4096p_160000_kbps_t6/prog_index.m3u8
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=240941602,BANDWIDTH=243997537,VIDEO-RANGE=SDR,CODECS="mp4a.40.2,hvc1.1.60000000.L186.B0",RESOLUTION=4096x4096,FRAME-RATE=59.940,CLOSED-CAPTIONS=NONE
https://myurl.com/stream/t4096p_240000_kbps_t6/prog_index.m3u8
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=321061516,BANDWIDTH=325312545,VIDEO-RANGE=SDR,CODECS="mp4a.40.2,hvc1.1.60000000.H255.B0",RESOLUTION=4096x4096,FRAME-RATE=59.940,CLOSED-CAPTIONS=NONE
https://myurl.com/stream/t4096p_320000_kbps_t6/prog_index.m3u8
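For reference, the "power of 0.75 rule" mentioned above can be sketched as follows; the reference bitrate, resolutions, and frame rates below are illustrative assumptions, not spec values.

```swift
import Foundation

// Rule of thumb: bitrate scales with (pixels-per-second ratio)^0.75.
func scaledBitrateKbps(referenceKbps: Double,
                       referencePixelsPerSecond: Double,
                       targetPixelsPerSecond: Double) -> Double {
    referenceKbps * pow(targetPixelsPerSecond / referencePixelsPerSecond, 0.75)
}

// 8K60 has 8x the pixels-per-second of 4K30, and 8^0.75 ≈ 4.76,
// so any 4K30 reference bitrate scales by roughly 4.76x.
let ref4k30 = 3840.0 * 2160.0 * 30.0
let target8k60 = 7680.0 * 4320.0 * 60.0
let estimate = scaledBitrateKbps(referenceKbps: 50_000, // hypothetical 4K30 reference
                                 referencePixelsPerSecond: ref4k30,
                                 targetPixelsPerSecond: target8k60)
print(estimate) // ≈ 237,841 kbps with this made-up reference
```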
Post not yet marked as solved
0 Replies
605 Views
HELP! How can I play a spatial video in my own Vision Pro app the way the official Photos app does? Following the official developer documentation, I've used the AVKit API to play a spatial video in the Xcode Vision Pro simulator. The video plays, but it looks different from what Photos shows: in Photos the edge of the video is fuzzy, while in my own app it has a hard edge. How can I play the spatial video in my own app with the same effect as in Photos?
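For context, a minimal sketch of spatial playback with AVKit, under the assumption (worth verifying against the current docs) that the system applies its spatial presentation only when AVPlayerViewController is presented full screen; the file name is hypothetical.

```swift
import SwiftUI
import AVKit

// Hypothetical MV-HEVC spatial video bundled with the app.
let spatialURL = Bundle.main.url(forResource: "spatial", withExtension: "mov")!

struct SpatialPlayerView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = AVPlayer(url: spatialURL)
        controller.player?.play()
        return controller
    }
    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}
```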
Post not yet marked as solved
0 Replies
383 Views
Hello, I keep getting this kind of error (specifically 16247) when trying to download DRM-protected HLS content. Would anyone have a clue about the reason why?

Error Domain=CoreMediaErrorDomain Code=-16247 "(null)" UserInfo={_NSURLErrorRelatedURLSessionTaskErrorKey=("BackgroundAVAssetDownloadTask

Thanks, Sylvain
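For context, a minimal sketch of the kind of download setup that produces a BackgroundAVAssetDownloadTask; the session identifier, URL, and delegate here are placeholders, and an error like -16247 would surface through the delegate's completion callbacks.

```swift
import AVFoundation

// `delegate` is assumed to be your AVAssetDownloadDelegate implementation.
func startDownload(delegate: AVAssetDownloadDelegate) {
    let config = URLSessionConfiguration.background(withIdentifier: "hls-drm-download")
    let session = AVAssetDownloadURLSession(configuration: config,
                                            assetDownloadDelegate: delegate,
                                            delegateQueue: .main)
    let asset = AVURLAsset(url: URL(string: "https://example.com/master.m3u8")!)
    if let task = session.makeAssetDownloadTask(asset: asset,
                                                assetTitle: "Example",
                                                assetArtworkData: nil,
                                                options: nil) {
        task.resume()
    }
}
```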
Post not yet marked as solved
3 Replies
565 Views
I am testing HLS low-latency streams but could not achieve low playback latency. For comparison I used Apple's HLS-LL test stream (https://ll-hls-test.cdn-apple.com/llhls4/ll-hls-test-04/multi.m3u8), where I get a latency of 6 seconds in Safari and AVPlayer.

The main difference between the non-working and working streams is the CAN-BLOCK-RELOAD attribute. This is set to YES for the Apple HLS-LL test stream, where latency is observed to be 6 seconds, vs. the other stream, where latency ≈ 3 full segment durations (~18 seconds). As a test, I changed CAN-BLOCK-RELOAD=YES to CAN-BLOCK-RELOAD=NO in the variant playlist response of the Apple stream using a proxy tool, after which I get the same result of latency ≈ 3 full segment durations (~18 seconds).

On AVPlayer, I see the following log with CAN-BLOCK-RELOAD=YES, which seems to indicate that AVPlayer enters low-latency mode when setting up blocking reload:

mediaserverd <SEGPUMP> segPumpSetupBlockingReload: 0x1040e7600: Player entering Low Latency mode

Can Apple support, or forum members who have experience with this, please share whether CAN-BLOCK-RELOAD=YES is mandatory for low-latency mode to be enabled in the Safari browser and in AVPlayer on iOS/tvOS?
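For reference, CAN-BLOCK-RELOAD is an attribute of the EXT-X-SERVER-CONTROL tag in the media playlist; a minimal illustrative fragment (the values here are placeholders, not recommendations) looks like:

```
#EXTM3U
#EXT-X-TARGETDURATION:4
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.0
#EXT-X-PART-INF:PART-TARGET=0.333
```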
Post not yet marked as solved
0 Replies
361 Views
Since iOS 17.2, the video player in Safari becomes black if I jump forward in an HLS video stream; I only hear the sound of the video. If I close full screen and reopen it, the video continues normally. I checked whether the source meets all the requirements mentioned here, and it does. Does anybody have the same issue, or maybe a solution for this problem?
Post not yet marked as solved
0 Replies
619 Views
According to the docs, webkitEnterFullScreen() only works on iOS if the element is <video>, but now, after I have updated to iOS 17.1.2, it's not working; I have tested it in both Chrome and Safari. Even the W3Schools test code for fullscreen does not work on iOS 17.1.2.

Test done:
Model: iPhone 15 Pro Max
iOS version: 17.0.2
Browser: Chrome
Test URL: https://www.w3schools.com/howto/howto_js_fullscreen.asp
Post not yet marked as solved
0 Replies
456 Views
Hi Team, we see an issue with this version of CoreMedia (1.0.0.21B101) requesting multiple qualities at all times for a stream. We don't see this issue on 1.0.0.21C62. We are unsure what would be causing this.

[2024-01-05 16:53:51] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1145 2529 1090199 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1146 2396 1013356 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_0.fmp4 HTTP/1.0" 200 1139 24975 1013385 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1145 2603 998670 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_1.fmp4 HTTP/1.0" 200 1138 40534 998739 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1145 2677 835327 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_2.fmp4 HTTP/1.0" 200 1138 57656 835207 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1146 2458 986038 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_1.fmp4 HTTP/1.0" 200 1139 24700 986032 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1145 2751 1013257 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_3.fmp4 HTTP/1.0" 200 1138 55900 1013324 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1146 2520 1016693 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_2.fmp4 HTTP/1.0" 200 1139 25014 1016717 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1145 2825 917753 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_4.fmp4 HTTP/1.0" 200 1138 103745 917903 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1146 2582 958102 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_3.fmp4 HTTP/1.0" 200 1139 24782 958195 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1145 2899 931101 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_5.fmp4 HTTP/1.0" 200 1138 112113 931228 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1146 2644 935550 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_4.fmp4 HTTP/1.0" 200 1139 24824 937720 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1146 2706 895680 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_5.fmp4 HTTP/1.0" 200 1139 24843 895734 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=631&_HLS_part=0 HTTP/1.0" 200 1145 2529 907045 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
Post not yet marked as solved
0 Replies
372 Views
I'm trying to use AVCaptureSession and AVAssetWriter to convert video and audio from an iPhone's camera and microphone into a fragmented video file in Apple HLS format. Below is part of the code. The capture seems to be successful, and I have confirmed that the data received in captureOutput() can be appended to videoWriterInput and audioWriterInput using append(). When executing audioWriterInput!.append(sampleBuffer), sampleBuffer has the following values, so it looks like the audio data has been passed to the asset writer:

sampleBuffer.duration : CMTime(value: 941, timescale: 44100, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0)
sampleBuffer.totalSampleSize : 1882

However, the final output init.mp4 and *.m4s do not contain audio. (The video can be played without any problems.) Could you please point out any problems or hints as to why the audio is not included?

import AVFoundation
import UniformTypeIdentifiers
import VideoToolbox

/// Capture session
let captureSession = AVCaptureSession()

/// Capture inputs
var videoDevice: AVCaptureDevice?
var audioDevice: AVCaptureDevice?

/// Configure and start the capture session.
/// (Marked `throws` so the AVCaptureDeviceInput initializers compile.)
func startCapture() throws {
    captureSession.beginConfiguration()

    // Input: video
    videoDevice = self.defaultCamera(cameraSide: cameraSide)
    videoDevice!.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 30)
    let videoInput = try AVCaptureDeviceInput(device: videoDevice!)
    captureSession.addInput(videoInput)

    // Input: audio
    audioDevice = AVCaptureDevice.default(for: AVMediaType.audio)
    let audioInput = try AVCaptureDeviceInput(device: audioDevice!)
    captureSession.addInput(audioInput)

    // Output: video
    let videoDataOutput = AVCaptureVideoDataOutput()
    videoDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
    videoDataOutput.alwaysDiscardsLateVideoFrames = true
    videoDataOutput.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)] as [String: Any]
    captureSession.addOutput(videoDataOutput)

    // Output: audio
    let audioDataOutput = AVCaptureAudioDataOutput()
    audioDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
    captureSession.addOutput(audioDataOutput)

    captureSession.commitConfiguration()
    captureSession.startRunning()
}

// Writer state (declared `var` because it is assigned lazily in captureOutput()).
private var assetWriter: AVAssetWriter?
private var startTimeOffset: CMTime = .zero
private var startTime: CMTime = .zero
private var audioWriterInput: AVAssetWriterInput?
private var videoWriterInput: AVAssetWriterInput?
private var pixelBuffer: AVAssetWriterInputPixelBufferAdaptor?

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if assetWriter == nil {
        // AssetWriter
        assetWriter = AVAssetWriter(contentType: UTType(AVFileType.mp4.rawValue)!)
        self.startTimeOffset = CMTime(value: 1, timescale: 1)

        // Input: audio
        let audioCompressionSettings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1,
            AVEncoderBitRateKey: 128_000
        ]
        audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioCompressionSettings)
        audioWriterInput!.expectsMediaDataInRealTime = true
        assetWriter!.add(audioWriterInput!)

        // Input: video
        let videoCompressionSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
            AVVideoCompressionPropertiesKey: [
                kVTCompressionPropertyKey_AverageBitRate: 1_024_000,
                kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Baseline_AutoLevel
            ]
        ]
        videoWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoCompressionSettings)
        videoWriterInput!.expectsMediaDataInRealTime = true
        pixelBuffer = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: videoWriterInput!,
            sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
        assetWriter!.add(videoWriterInput!)

        // Configure the asset writer for writing data in fragmented MPEG-4 format.
        assetWriter!.outputFileTypeProfile = AVFileTypeProfile.mpeg4AppleHLS
        assetWriter!.preferredOutputSegmentInterval = CMTime(seconds: 1.0, preferredTimescale: 1)
        assetWriter!.initialSegmentStartTime = startTimeOffset
        assetWriter!.delegate = self

        // Start the asset writer.
        startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        assetWriter!.startWriting()
        assetWriter!.startSession(atSourceTime: startTime)
    }

    let isVideo = output is AVCaptureVideoDataOutput
    if isVideo {
        if videoWriterInput!.isReadyForMoreMediaData {
            videoWriterInput!.append(sampleBuffer)
        }
    } else {
        if audioWriterInput!.isReadyForMoreMediaData {
            audioWriterInput!.append(sampleBuffer)
        }
    }
}

func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType, segmentReport: AVAssetSegmentReport?) {
    // ...
}
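One hedged debugging suggestion (an assumption, not a known fix): verify that the writer hasn't silently moved to .failed after the first few audio appends, since a rejected append or a configuration error would leave the audio track out of the output without an obvious crash.

```swift
import AVFoundation

// Diagnostic wrapper around append(): logs writer state instead of failing silently.
func safeAppend(_ sampleBuffer: CMSampleBuffer,
                to input: AVAssetWriterInput,
                writer: AVAssetWriter) {
    guard writer.status == .writing else {
        print("writer status: \(writer.status.rawValue), error: \(String(describing: writer.error))")
        return
    }
    guard input.isReadyForMoreMediaData else { return }
    if !input.append(sampleBuffer) {
        print("append failed: \(String(describing: writer.error))")
    }
}
```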
Post not yet marked as solved
4 Replies
1k Views
I am trying to set up HLS with MV-HEVC. I have an MV-HEVC MP4, converted with AVAssetWriter, that plays as a "spatial video" in Photos in the simulator. I've used ffmpeg to fragment the video for HLS (sample m3u8 file below). The HLS version of the mp4 plays on a VideoMaterial with an AVPlayer in the simulator, but it is hard to determine whether the streamed video is stereo.

Is there any guidance on confirming that the streamed mp4 video is properly being read as stereo? Additionally, I see that REQ-VIDEO-LAYOUT is required for multivariant HLS; is it still needed if there is ONLY stereo video in the playlist? Are there any other configurations needed to make the device read the stream as stereo?

Sample m3u8 playlist:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:13
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:12.512500,
sample_video0.ts
#EXTINF:8.341667,
sample_video1.ts
#EXTINF:12.512500,
sample_video2.ts
#EXTINF:8.341667,
sample_video3.ts
#EXTINF:8.341667,
sample_video4.ts
#EXTINF:12.433222,
sample_video5.ts
#EXT-X-ENDLIST
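One way to check on-device (a sketch, assuming the recent AVFoundation stereo-multiview media characteristic; verify availability on your SDK): inspect the loaded asset's video tracks.

```swift
import AVFoundation

// Returns true if any video track advertises stereo multiview content
// (e.g., MV-HEVC). `asset` would be the AVPlayerItem's asset.
func isStereo(_ asset: AVAsset) async throws -> Bool {
    let tracks = try await asset.loadTracks(withMediaType: .video)
    return tracks.contains { $0.hasMediaCharacteristic(.containsStereoMultiviewVideo) }
}
```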
Post not yet marked as solved
1 Reply
398 Views
Can we confirm that as of iOS 16.3.1, key frames for MPEG-TS via HLS are now mandatory? I've been trying to figure out why https://chaney-field3.click2stream.com/ shows "Playback Error" across Safari, Chrome, Firefox, etc. I ran the diagnostics against one of the m3u8 files generated via Developer Tools (e.g. mediastreamvalidator "https://e1-na7.angelcam.com/cameras/102610/streams/hls/playlist.m3u8?token=" and then hlsreport validation_data.json) and see this particular error:

Video segments MUST start with an IDR frame
Variant #1, IDR missing on 3 of 3

Do Safari and iOS devices explicitly block playback when they don't find one? From what I understand, AngelCam simply acts as a passthrough for the video/audio packets and does no transcoding, but converts the RTSP packets into HLS for web browsers. IP cameras are constantly streaming their data, though, and a user connecting to the site may be receiving the video between key frames, which would likely violate this expectation. From my investigation it also seems like this problem started happening in iOS 16.3? I'm seeing similar reports for other IP cameras here:
https://ipcamtalk.com/threads/blue-iris-ui3.23528/page-194#post-754082
https://www.reddit.com/r/BlueIris/comments/1255d78/ios_164_breaks_ui3_video_decode/

For what it's worth, when I re-encoded the MPEG-TS files (e.g. ffmpeg -i /tmp/streaming-master-m4-na3.bad/segment-375.ts -c:v h264 /tmp/segment-375.ts), the non-key frames at the beginning were stripped, and playback then works properly if I host the same files on a static site and have the iOS device connect to it. It seems like Chrome, Firefox, VLC, and ffmpeg are much more forgiving about missing key frames. I'm wondering what the reason is for enforcing this requirement, and can I confirm it's been a recent change?
Post not yet marked as solved
0 Replies
352 Views
We're experimenting with a stream that has a large (10-minute) clear portion in front of the protected section w/ FairPlay. We're noticing that AVPlayer/Safari trigger calls to fetch the license key even while playing the clear part, and once we provide the key, playback fails with:

name = AVPlayerItemFailedToPlayToEndTimeNotification, object = Optional(<AVPlayerItem: 0x281ff2800> I/NMU [No ID]), userInfo = Optional([AnyHashable("AVPlayerItemFailedToPlayToEndTimeErrorKey"): Error Domain=CoreMediaErrorDomain Code=-12894 "(null)"])
- name : "AVPlayerItemFailedToPlayToEndTimeNotification"
- object : <AVPlayerItem: 0x281ff2800> I/NMU [No ID]
▿ userInfo : 1 element
  ▿ 0 : 2 elements
    ▿ key : AnyHashable("AVPlayerItemFailedToPlayToEndTimeErrorKey")
      - value : "AVPlayerItemFailedToPlayToEndTimeErrorKey"
    - value : Error Domain=CoreMediaErrorDomain Code=-12894 "(null)"

It seems like AVPlayer is trying to decrypt the clear portion of the stream... and I'm wondering if it's because we've set up our manifest incorrectly. Here it is:

#EXTM3U
#EXT-X-VERSION:8
#EXT-X-TARGETDURATION:20
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-MAP:URI="clear-asset.mp4",BYTERANGE="885@0"
#EXT-X-DEFINE:NAME="path0",VALUE="clear-asset.mp4"
#EXTINF:9.98458,
#EXT-X-BYTERANGE:81088@885
{$path0}
#EXTINF:19.96916,
#EXT-X-BYTERANGE:159892@81973
{$path0}
#EXTINF:19.96916,
#EXT-X-BYTERANGE:160245@241865
{$path0}
#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="secure-asset.mp4",BYTERANGE="788@0"
#EXT-X-DEFINE:NAME="path1",VALUE="secure-asset.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://guid",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
#EXTINF:19.96916,
#EXT-X-BYTERANGE:159928@5196150
{$path1}
#EXT-X-ENDLIST
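One hedged experiment (an assumption, not a confirmed fix): the HLS spec allows marking unencrypted media explicitly with METHOD=NONE, so declaring the clear range up front may stop the player from applying the key to it. Illustratively, before the first clear segment:

```
#EXT-X-KEY:METHOD=NONE
#EXTINF:9.98458,
#EXT-X-BYTERANGE:81088@885
{$path0}
```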
Post not yet marked as solved
0 Replies
449 Views
I am reaching out as I am currently trying to solve an issue involving AVPlayer, and I have encountered a challenge related to handling errors for video segments. In our implementation, we have noticed that AVPlayer tends to make continuous calls to fetch video segments when the segment requests return errors; AVPlayer retries for approximately 30 seconds before throwing an error. We observed this issue when video segments return 404 or 5xx errors (please find the screenshot below). Is there any recommended approach or configuration setting that can be applied to restrict the number of calls AVPlayer makes in such scenarios? We are particularly interested in finding a solution that can help reduce the number of calls made in case of such failures. I look forward to hearing from you soon and appreciate your support in resolving this matter.
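AVPlayer's retry policy is not directly configurable as far as I know, but a hedged sketch of one mitigation is to watch the item's error log and abandon playback after repeated segment failures; the threshold and status codes below are placeholders.

```swift
import AVFoundation

// `player` and `playerItem` are assumed to exist in the surrounding code.
func monitorSegmentErrors(player: AVPlayer, playerItem: AVPlayerItem) {
    // Keep the returned token around in real code so the observer can be removed.
    _ = NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry,
        object: playerItem,
        queue: .main
    ) { _ in
        let failures = playerItem.errorLog()?.events
            .filter { $0.errorStatusCode == 404 || $0.errorStatusCode >= 500 }
            .count ?? 0
        if failures > 3 {                        // arbitrary cutoff
            player.replaceCurrentItem(with: nil) // stop the retry loop
        }
    }
}
```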
Post not yet marked as solved
0 Replies
283 Views
Dear Apple Engineers, First of all, thank you for this wonderful and very necessary native solution. Question: Is it possible to use this API when processing HLS? Thank you.
Post not yet marked as solved
0 Replies
354 Views
Hi Team,

Offline playback with AES-128 encryption: I'm downloading HLS content that is AES-128 encrypted, and I'm using the AVAssetResourceLoaderDelegate method shouldWaitForLoadingOfRequestedResource to parse the manifest and fetch the AES key URL. After fetching the key URL, I download and save the AES key locally, and I use the locally saved key to start offline playback. Since AVContentKeySession has been around for quite some time, is it okay to use the resource loader delegate method to parse and download the AES key? Is there any chance that Apple will deprecate downloading keys through the resource loader delegate?

Thanks, Deepak.N
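For reference, a minimal sketch of the delegate pattern being described, assuming the manifest's key URI has been rewritten to a custom scheme ("ckey://" here is hypothetical, as is the localKeyURL(for:) mapping):

```swift
import AVFoundation

final class OfflineKeyLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let url = loadingRequest.request.url, url.scheme == "ckey" else {
            return false // not a key request; let the system handle it
        }
        do {
            // Answer with the AES-128 key bytes persisted during download.
            let keyData = try Data(contentsOf: localKeyURL(for: url))
            loadingRequest.dataRequest?.respond(with: keyData)
            loadingRequest.finishLoading()
        } catch {
            loadingRequest.finishLoading(with: error)
        }
        return true
    }

    // Hypothetical mapping; in practice, point at wherever the key was saved.
    private func localKeyURL(for remote: URL) -> URL {
        FileManager.default.temporaryDirectory.appendingPathComponent(remote.lastPathComponent)
    }
}
```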
Post not yet marked as solved
2 Replies
513 Views
I'm using mediafilesegmenter with a fragmented MP4 hvc1 file as input and got this error:

Nov 23 2023 17:48:25.948: Fragmented MP4 is the only supported container format for the segmentation of HEVC content
Nov 23 2023 17:48:25.948: Unsupported media type 'hvc1' in track 0
Nov 23 2023 17:48:25.948: Unable to find any valid tracks to segment. Segmenting failed (-12780).
Post not yet marked as solved
0 Replies
441 Views
I'm encountering an issue with live video streaming on iOS 17 using AVPlayer with AVMutableMovie. I'm using a wss URL to stream video by capturing data in chunks (e.g., 5 seconds) and playing it. Upon completion of the 5-second segment, I load another 5 seconds using self.player.replaceCurrentItem(with: nextPlayerItem). Despite listening to events via self.player.currentItem?.observe, the approach works well on iOS 16 but consistently displays a blank video on iOS 17.

private func playNext() {
    let nextSet = self.dataCollector.getNextItem(length: self.configuration.frameDelay)
    if nextSet.count == self.configuration.frameDelay {
        var playerTime: Int = self.player.currentItem != nil ? Int(CMTimeGetSeconds(player.currentTime())) : 0
        var allData = Data()
        allData.appendAll(dataSet: dataCollector.getFileType())
        nextSet.forEach { (data) in
            playerTime += 1
            allData.append(data.getFragmentData())
            self.currentFragmentTimes.updateValue(data.getFragmentTime(), forKey: playerTime)
        }
        if allData.count > 0 {
            self.player.replaceCurrentItem(with: AVPlayerItem(asset: AVMutableMovie(data: allData, options: nil)))
            self.playerInitializedTime = nil
            self.player.play()
        }
    }
}
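As a hedged alternative (untested against this iOS 17 behavior, and not a confirmed workaround), enqueueing each chunk on an AVQueuePlayer instead of calling replaceCurrentItem(with:) lets the framework manage the item handoff:

```swift
import AVFoundation

let queuePlayer = AVQueuePlayer()

// Append a 5-second chunk as a new item at the end of the queue.
func enqueue(chunk: Data) {
    let item = AVPlayerItem(asset: AVMutableMovie(data: chunk, options: nil))
    queuePlayer.insert(item, after: nil) // nil appends after the last item
    if queuePlayer.rate == 0 {
        queuePlayer.play()
    }
}
```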
Post not yet marked as solved
0 Replies
337 Views
I have an m3u8 like this:

#EXTM3U
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=190000,BANDWIDTH=240000,RESOLUTION=240x160,FRAME-RATE=24.000,CODECS="avc1.42c01e,mp4a.40.2",CLOSED-CAPTIONS=NONE
tracks-v1a1/mono.m3u8?thumbnails=10
#EXT-X-IMAGE-STREAM-INF:BANDWIDTH=10000,RESOLUTION=240x160,CODECS="jpeg",URI="images-240x160/tpl-0-60-10.m3u8?thumbnails=10"

and I get no thumbnails in the Safari native player. Could you please tell me why?