Define the interaction with Remote Playback API #123
Comments
To me it feels that each remoted media element should belong to its own content-like media session (so one could play one video locally and one remotely, or even two remotely, without them depending on each other). I don't know what should happen to the currently active media session if one of its media elements starts playing remotely. Seems like it would behave the same way as if the element was stopped? So if it was the only element, the active media session becomes empty; otherwise, the rest of the elements keep playing.
From an audio focus point of view it makes sense that the remoted media element participates in the audio focus system where remote playback takes place, and not locally. But perhaps this is a given... :) If so, might it make sense to add a remote media session type? So when an element initiates remote playback, it is added to a unique remote media session. As Anton says, this unique media session would inherit metadata from the local session. Since it wouldn't request local audio focus, it would essentially just be the notification and metadata. But what would .session point to?
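For illustration, here is a rough sketch of what that idea could look like from script, assuming the old draft spec's `MediaSession(kind)` constructor and `HTMLMediaElement.session` attribute; the `"remote"` kind and the metadata copy are hypothetical and only illustrate the proposal above, not anything specified.

```js
// Sketch only: "remote" is a hypothetical MediaSessionKind discussed in this
// thread, not part of any spec. MediaSession(kind), MediaMetadata and
// element.session follow the old mediasession.spec.whatwg.org draft.
const localSession = new MediaSession("content");
localSession.metadata = new MediaMetadata({ title: "Episode 1" });

const video = document.querySelector("video");
video.session = localSession;

video.remote.addEventListener("connect", () => {
  // When remote playback starts, move the element into its own remote-kind
  // session that inherits the local metadata but takes no local audio focus.
  const remoteSession = new MediaSession("remote"); // hypothetical kind
  remoteSession.metadata = localSession.metadata;
  video.session = remoteSession;
});

video.remote.addEventListener("disconnect", () => {
  // Remote playback ended: the element rejoins the local content session.
  video.session = localSession;
});
```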
Would it make sense to only allow remote playback if there's at most one playing element in the session, to make it impossible for the members of a session to split into a local and a remote group? Then the original session could be allowed to represent the remote playback, and if necessary this could be web-exposed on the media session itself as some new state. Then, if one tries to play a (previously paused) media element that's in a remote session, it would either replace the existing element, play remotely alongside it, or throw an exception, depending on what seems possible to implement. (Spec would be made to match, of course.) I think in practice this would mean that remote playback doesn't work great with multi-element media sessions, but that doesn't seem terrible, given that things one would expect to play remotely are unlikely to be part of some composed media experience. If the protocols allowed it, I think this could be expanded to let all members start remote playback simultaneously, but not for now.
In terms of spec changes, this would mean that …
Going back to this old discussion, we might want to use Chrome Android behaviour as an example. What we do today is that a remotely played element is removed from the default media session and added back when it is no longer played remotely. In terms of the spec, when the state switches to …

I agree with @avayvod above and I think we should allow multiple remote playbacks. If for some reason a user has devices in multiple rooms in their home, they should be able to control them simultaneously. However, things will be a bit odd with regard to media keys. We don't really mention this in the Media Session API, but it is kind of implicit that being the active session gives you media keys access (or is it mentioned?). I would recommend leaving some leeway for implementations to deal with these cases. For example, on Android, if multiple applications have an active MediaSession (as in the Android one), the latest to activate usually gets the keys, so applications will ask for them back when focused in case they are competing.
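A minimal sketch of the Chrome Android behaviour described above, expressed as script-level bookkeeping; `defaultSession` is just a stand-in for the UA's default media session, since the real logic lives inside the browser.

```js
// Stand-in for the UA's default media session membership; illustrative only.
const defaultSession = new Set();

function trackRemotePlayback(video) {
  defaultSession.add(video);

  // Remote Playback API events: the element leaves the default session while
  // it plays remotely and rejoins once remote playback ends.
  video.remote.addEventListener("connect", () => defaultSession.delete(video));
  video.remote.addEventListener("disconnect", () => defaultSession.add(video));
}
```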
The spec doesn't spell this out in https://mediasession.spec.whatwg.org/#activate but I think that's an oversight; given the notes for kinds other than "content", you can tell it was intended. If responding to media keys is no longer synonymous with being the (only) active session with kind "content", then that could require splitting these concepts entirely. That could be a lot of work, but just to entertain it, what is the ideal behavior? If there is one active local session and a number of remote sessions playing (whether or not to include these in "active" I don't know yet), is it just a matter of which session most recently had some kind of user interaction? Or would a script-triggered pause also bump a session to the top of the stack?
Closing this issue since we are moving audio focus out to a separate API. |
This issue was moved to WICG/audio-focus#11 |
https://w3c.github.io/remote-playback/
This is an API which can cause an individual media element to play remotely. Local and remote playback are not mutually exclusive, as it could make sense to play one video locally and another remotely at the same time, either with different people watching each, or more speculatively, with the two being different parts of the same experience, playing in sync.
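For context, this is roughly how a page drives the Remote Playback API linked above: `watchAvailability()` and `prompt()` are part of that spec, while the button element is made up for the example.

```js
const video = document.querySelector("video");
const castButton = document.querySelector("#cast-button"); // hypothetical control

// Only show the control while at least one remote playback device is available.
video.remote
  .watchAvailability((available) => { castButton.hidden = !available; })
  .catch(() => {
    // Availability monitoring is optional for UAs; fall back to always showing it.
    castButton.hidden = false;
  });

castButton.addEventListener("click", () => {
  // prompt() must be called from a user gesture; it lets the user pick a device.
  video.remote.prompt().catch((err) => console.error("Remote playback failed", err));
});
```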
Key issues: