This repository has been archived by the owner on Jan 12, 2019. It is now read-only.

Keep track of approximate total bytes in SourceBuffer / Prevent quota exceeded with 4K streams #1242

Closed

Conversation

tchakabam
Contributor

Description

// With high-bitrate streams (like 4K etc., up to 16-20 Mbit/s)
// we can reach a QuotaExceeded error easily within just 60 seconds
// of buffer. Therefore we need to actively check the exact number
// of bytes occupied in the buffer.
// If we reach the limit, we trim preemptively, enqueuing this action
// to the source-updater before any further loading can happen.
// Even if the back-buffer is already trimmed, we should wait before
// we append more data if the buffer is still too occupied.
//
// We monitor the bytes in buffer (see checkBuffer_())
// and drive the loading / trimming actively
// to make sure we do not reach this limit.
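The gating described above can be sketched roughly as follows. The constant name mirrors the one used in this PR, but the function and its surroundings are illustrative stand-ins, not the actual patch:

```javascript
// Hypothetical sketch of the occupancy check described above.
// Chrome is commonly cited as capping a SourceBuffer around 150 MB.
const MAX_SOURCE_BUFFER_OCCUPATION_BYTES = 150 * 1024 * 1024;

function nextAction(bufferedBytes) {
  if (bufferedBytes >= MAX_SOURCE_BUFFER_OCCUPATION_BYTES) {
    // Appending now risks a QuotaExceededError: trim the
    // back-buffer and defer further loading instead.
    return 'trim-and-defer';
  }
  return 'load';
}
```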

Specific Changes proposed

  • add method totalBytesInBuffer to SourceBuffer
  • slightly modify the SourceBuffer interface to pass duration
  • use the above method to limit loading in SegmentLoader when the limit is exceeded
  • move significant constants into the Config module
  • trimBackBuffer_ improvements
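A minimal sketch of what the totalBytesInBuffer bookkeeping could look like, assuming the approach described above (a byte record per append, with entries dropped as the back-buffer is trimmed). The class and field names are hypothetical, not the PR's actual code:

```javascript
// Illustrative byte bookkeeping for a source buffer.
class BufferByteTracker {
  constructor() {
    this.entries = []; // one { start, end, byteLength } per appended segment
  }

  append(start, end, byteLength) {
    this.entries.push({ start, end, byteLength });
  }

  // Drop bookkeeping for media removed before `removeEnd` seconds,
  // mirroring a SourceBuffer.remove(0, removeEnd) back-buffer trim.
  remove(removeEnd) {
    this.entries = this.entries.filter((e) => e.end > removeEnd);
  }

  totalBytesInBuffer() {
    return this.entries.reduce((sum, e) => sum + e.byteLength, 0);
  }
}
```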

Requirements Checklist

  • Feature implemented / Bug fixed
  • If necessary, more likely in a feature request than a bug fix
  • Reviewed by Two Core Contributors

@tchakabam tchakabam changed the title Feature/prevent quota exceeded Keep track of total bytes in SourceBuffer / Prevent quota exceeded Aug 25, 2017
@tchakabam tchakabam changed the title Keep track of total bytes in SourceBuffer / Prevent quota exceeded Keep track of approximate total bytes in SourceBuffer / Prevent quota exceeded Aug 25, 2017
@tchakabam tchakabam changed the title Keep track of approximate total bytes in SourceBuffer / Prevent quota exceeded Keep track of approximate total bytes in SourceBuffer / Prevent quota exceeded with 4K streams Aug 25, 2017
Contributor

@gesinger gesinger left a comment

Hey @tchakabam, sorry this has gone a while without any review. We definitely need to handle this case, and thank you for submitting the PR!

The main thing I wanted to hear your thoughts on is making use of the segment metadata track (and adding byte info to that) to calculate buffer occupancy on demand instead of maintaining more state in the source updater.

Thanks again!
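The metadata-track approach suggested here could look roughly like this, assuming each cue's value carries a byteLength field (as proposed for the segment metadata track); the cue shape is an assumption for illustration:

```javascript
// Sketch: derive buffer occupancy on demand from segment-metadata
// cues instead of keeping extra state in the source updater.
function bufferedBytesFromMetadataTrack(cues) {
  let total = 0;
  for (let i = 0; i < cues.length; i++) {
    // Tolerate cues without byte info.
    total += (cues[i].value && cues[i].value.byteLength) || 0;
  }
  return total;
}
```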

// we can reach a QuotaExceeded error easily within just 60 seconds
// of buffer. Therefore we need to actively check the exact number
// of bytes occupied in the buffer.
// If we reach the limit, we are triming pre-emptively, enqueuing this action
Contributor

we are triming pre-emptively, enqueueing => we trim preemptively, and enqueue

src/segment-loader.js (resolved)
// and drive the loading / triming actively
// to make sure we do not reach this limit.
if (bufferedBytes >= Config.MAX_SOURCE_BUFFER_OCCUPATION_BYTES) {
// If we reached the limit let's trim the buffer already
Contributor

The "If we...jamming situation" comment I think we can remove, since the comment above states the intention well.

// of jamming situation.
// We keep some back-buffer but reduce the allowed size significantly
// compared to the initial configuration.
this.logger_('SourceBuffer occupation exceeds limit, triming back-buffer and deferring further loading');
Contributor

trimming

// We keep some back-buffer but reduce the allowed size significantly
// compared to the initial configuration.
this.logger_('SourceBuffer occupation exceeds limit, triming back-buffer and deferring further loading');
this.trimBackBuffer_(Config.BACK_BUFFER_LENGTH / 3);
Contributor

Any specific reason for the / 3? Maybe we can calculate how much buffer we need to trim judging by segment starts/ends and their respective sizes?
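The size-aware alternative this question hints at might be sketched like this. The segment shape (start/end/byteLength, oldest first) is assumed for illustration, not taken from the PR:

```javascript
// Sketch: instead of a fixed BACK_BUFFER_LENGTH / 3, walk buffered
// segments from the oldest and trim only enough seconds to free
// the number of bytes we need.
function secondsToTrim(segments, bytesNeeded) {
  let freed = 0;
  let seconds = 0;
  for (const seg of segments) {
    if (freed >= bytesNeeded) {
      break;
    }
    freed += seg.byteLength;
    seconds += seg.end - seg.start;
  }
  return seconds;
}
```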

// on the source buffer during subsequent appends
//
// We also monitor the bytes in buffer (see checkBuffer_())
// and drive the loading / triming actively
Contributor

trimming

@@ -859,7 +885,18 @@ export default class SegmentLoader extends videojs.EventTarget {
loadSegment_(segmentInfo) {
this.state = 'WAITING';
this.pendingSegment_ = segmentInfo;
this.trimBackBuffer_(segmentInfo);

// Chrome has a hard limit of 150MB of
Contributor

Not too big of a deal, but I think this comment still makes sense to be in the trimBackBuffer function, since that is where the logic of how much to clear happens. loadSegment just needs to know to call trimBackBuffer

if (typeof segment.start === 'number' && typeof segment.end === 'number') {
this.mediaSecondsLoaded += segment.end - segment.start;
this.mediaSecondsLoaded += segmentDuration = segment.end - segment.start;
Contributor

Makes sense, but is a bit hard to read. Maybe the conditionals just set the value of segmentDuration (or a ternary is used), then this.mediaSecondsLoaded += segmentDuration exists after?
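The suggested restructuring could read roughly as follows; the helper name and the fallback to segment.duration are illustrative assumptions:

```javascript
// Sketch of the reviewer's suggestion: resolve segmentDuration first
// (here via a ternary), then do a single += afterwards.
function addMediaSecondsLoaded(loader, segment) {
  const segmentDuration =
    typeof segment.start === 'number' && typeof segment.end === 'number'
      ? segment.end - segment.start
      : segment.duration;

  loader.mediaSecondsLoaded += segmentDuration;
  return segmentDuration;
}
```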

if (mediaSource.readyState === 'closed') {
mediaSource.addEventListener('sourceopen', createSourceBuffer);
} else {
createSourceBuffer();
}
}

appendToBufferInfoQueue_(timestampOffset, duration, byteLength) {
Contributor

Instead of putting all of the buffer info in SourceUpdater and keeping track of it, maybe it makes sense for us to add byte info to the segment metadata track https://github.com/videojs/videojs-contrib-hls#segment-metadata and we can make use of that.

Contributor

For reference, the PR to include the required information in the segment metadata is #1210.

@tchakabam
Contributor Author

Hey, thanks for the review!

And yes, @squarebracket is right: if we combine both contributions, there's no need anymore for the rough estimation with the /3.

I hope I can progress on both of these this week! Thanks for the patience ;)

@squarebracket
Contributor

Hey @tchakabam!

We've been talking in the slack channel about how to handle QuotaExceededErrors in the most robust fashion. There's a consensus that we should catch the exception and dynamically change BACK_BUFFER_LENGTH, MAX_GOAL_BUFFER_LENGTH, and MAX_SOURCE_BUFFER_OCCUPATION_BYTES based on the amount of time and data in the buffer when the exception occurs. This way we could handle various buffer sizes without knowing them in advance, and adjust them on the fly if system memory pressure forces them to decrease in size.

That solution would require some additional functionality to what you've got here, and a new PR to contrib-media-sources, since there's a SourceBuffer shim that's used when transmuxing from MPEG-TS. (It calls appendBuffer async, so it can't be caught by contrib-hls.)

Would that be something you're up for? I can walk you through the details if you're interested. No pressure if not; I can code up the PRs and credit you with the basis for the solution :)

Thanks again for taking the time to contribute and moving us forwards on a fix.
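A rough sketch of that consensus, assuming the config names mentioned above and an arbitrary safety factor (both the factor and the function are illustrative, not the agreed design):

```javascript
// Sketch: on QuotaExceededError, scale the buffer targets down based
// on what was actually buffered when the exception fired.
function handleQuotaExceeded(config, secondsBuffered, bytesBuffered) {
  // Keep some headroom below the observed ceiling so the next append fits.
  const SAFETY = 0.75;

  config.MAX_SOURCE_BUFFER_OCCUPATION_BYTES =
    Math.floor(bytesBuffered * SAFETY);
  config.MAX_GOAL_BUFFER_LENGTH = Math.floor(secondsBuffered * SAFETY);
  config.BACK_BUFFER_LENGTH = Math.min(
    config.BACK_BUFFER_LENGTH,
    Math.floor(secondsBuffered * (1 - SAFETY))
  );
  return config;
}
```

As noted above, catching the exception for MPEG-TS sources would have to happen in contrib-media-sources, since its SourceBuffer shim calls appendBuffer asynchronously.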

@tchakabam
Contributor Author

tchakabam commented Jan 30, 2018

@squarebracket Hey, thanks for the update on this.

These days I've got quite a few new things on my plate, and unfortunately won't get to work on this one anytime soon.

So please go on with that proposed solution :) Sounds very good to me 👍

Adapting the buffer limit dynamically is definitely favorable, and it's even more favorable to leverage the contrib-media-sources layer to handle this, as it will be a common concern for any user of the API :)

Feel free to continue or close this here as you see fit!

@squarebracket
Contributor

Sounds good :) thanks again for the initial PR.

I'll close this one once I've got a new one open.

@fbeaudet

fbeaudet commented Feb 9, 2018

Hi :)
I'm having this issue only in Firefox; I think my samples are right at the limit. Looking forward to that robust solution; I'll update as soon as it's out.
Thank you!

@genelamarellis

genelamarellis commented May 17, 2018

I know it's been a while, but any progress on this? I'm running into the exact same issue! I tried implementing the fix but I'm still hitting the error. I'm running this on some smart TVs; the error is thrown on some, but not all. Question: if I wanted to reduce the buffer size even more, which values should I change in the code? Thank you!

@squarebracket
Contributor

Yes, actually! My apologies, I forgot to comment in this ticket.

These features are now being tracked in #1416. It would certainly be a big help to me if everyone interested in these features could go build the project with those changes and test it out with their sources / devices.

Closing this PR in favour of #1416.
