AVPlayer streaming progress

I'm successfully using AVPlayer to stream audio from a server, and what I want to do now is display a custom UISlider that shows the buffering progress.

With AVPlayer there doesn't seem to be a way to get the total download size or the current downloaded amount for the audio file, only the current playing time and total play time.

Are there any workarounds for this?


Solution 1:

I am just working on this, and so far have the following:

- (NSTimeInterval)availableDuration
{
  NSArray *loadedTimeRanges = [[self.player currentItem] loadedTimeRanges];
  if (loadedTimeRanges.count == 0) {
    return 0; // nothing buffered yet
  }
  CMTimeRange timeRange = [[loadedTimeRanges objectAtIndex:0] CMTimeRangeValue];
  Float64 startSeconds = CMTimeGetSeconds(timeRange.start);
  Float64 durationSeconds = CMTimeGetSeconds(timeRange.duration);
  NSTimeInterval result = startSeconds + durationSeconds; // end of the first buffered range
  return result;
}
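
If you need the UI to refresh as more data buffers, one option is key-value observation on loadedTimeRanges. Here is a minimal Swift sketch of that idea (the BufferObserver class and its onChange callback are illustrative names, not part of the answer above):

import AVFoundation

final class BufferObserver {
    private var observation: NSKeyValueObservation?

    // loadedTimeRanges is KVO-compliant, so this fires as more media is buffered.
    func observe(_ item: AVPlayerItem, onChange: @escaping (TimeInterval) -> Void) {
        observation = item.observe(\.loadedTimeRanges, options: [.new]) { item, _ in
            guard let range = item.loadedTimeRanges.first?.timeRangeValue else { return }
            onChange(CMTimeGetSeconds(CMTimeRangeGetEnd(range)))
        }
    }
}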

Solution 2:

This should work well:

Objective-C:

- (CMTime)availableDuration
{
    NSValue *range = self.player.currentItem.loadedTimeRanges.firstObject;
    if (range != nil){
        return CMTimeRangeGetEnd(range.CMTimeRangeValue);
    }
    return kCMTimeZero;
}

Swift version:

func availableDuration() -> CMTime
{
    if let range = self.player?.currentItem?.loadedTimeRanges.first {
        return CMTimeRangeGetEnd(range.timeRangeValue)
    }
    return .zero
}

To log the current value you can use CMTimeShow([self availableDuration]); in Objective-C, or CMTimeShow(availableDuration()) in Swift.
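
To actually drive a UISlider with this, one approach is a periodic time observer that divides the buffered end time by the item's duration. A rough Swift sketch under those assumptions (bufferSlider is a hypothetical outlet, not something from the answer):

import AVFoundation
import UIKit

// Returns the observer token; keep it and pass it to removeTimeObserver(_:) when done.
func startTrackingBuffer(player: AVPlayer, bufferSlider: UISlider) -> Any {
    let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
    return player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { _ in
        guard let item = player.currentItem,
              item.duration.isNumeric, item.duration.seconds > 0,
              let range = item.loadedTimeRanges.first?.timeRangeValue else { return }
        let buffered = CMTimeRangeGetEnd(range).seconds
        bufferSlider.value = Float(buffered / item.duration.seconds) // 0...1 fraction
    }
}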

Solution 3:

Personally, I do not agree that the loadedTimeRanges array will always have a count of 1.

According to the documentation:

The array contains NSValue objects containing a CMTimeRange value indicating the time ranges for which the player item has media data readily available. The time ranges returned may be discontinuous.

So this may have values similar to:

[(start1, end1), (start2, end2)]

From my experience with the hls.js framework in the desktop web world, the holes between these time ranges can be very small or large depending on a multitude of factors, e.g. seeking, discontinuities, etc.

So to correctly get the total buffered length you would need to loop through the array and sum the duration of each range.

If you are looking for the buffer value ahead of the current play head, you would need to find the time range whose start is less than or equal to the current time and whose end is greater than the current time.

import AVFoundation

public extension AVPlayerItem {

    /// Sum of the durations of all loaded time ranges.
    func totalBuffer() -> Double {
        return self.loadedTimeRanges
            .map { $0.timeRangeValue }
            .reduce(0) { acc, cur in
                acc + CMTimeGetSeconds(cur.duration)
            }
    }

    /// Seconds buffered ahead of the current play head, or -1 if the
    /// current time is not inside any loaded range.
    func currentBuffer() -> Double {
        let currentTime = self.currentTime()

        guard let timeRange = self.loadedTimeRanges.map({ $0.timeRangeValue })
            .first(where: { $0.containsTime(currentTime) }) else { return -1 }

        return CMTimeGetSeconds(timeRange.end) - currentTime.seconds
    }

}
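
For completeness, a hedged usage sketch of the extension above; player and bufferSlider are assumed to exist already and are not defined in the answer:

import AVFoundation
import UIKit

func updateBufferSlider(player: AVPlayer, bufferSlider: UISlider) {
    guard let item = player.currentItem, item.duration.seconds > 0 else { return }
    // totalBuffer() sums every loaded range; currentBuffer() only measures ahead of the play head.
    bufferSlider.value = Float(item.totalBuffer() / item.duration.seconds)
    print("Buffered ahead of play head: \(item.currentBuffer()) s")
}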