This issue tracks discussion of protocol changes to allow multi-device synchronization of media playback while streaming. This was discussed at the Berlin F2F [1].
It's assumed that if there's one sender and one receiver the current protocol is sufficient to play out audio and video on the receiver with lip sync.
However, once there are multiple receivers, we'll need some timing metadata to be exchanged between the sender and receivers. Here are a few possible scenarios (not exhaustive):
1. Sending audio to one device and video to another.
2. Sending audio to multiple devices.
3. Sending video to multiple devices.
4. Sending audio and video to multiple devices (possibly with multiple audio and video tracks).
5. Scenarios involving text tracks or metadata cues.
6. Scenarios involving non-1.0-rate playback.
Not all of these may be in scope, however. Items 1 and 4 were pointed out as important in Berlin.
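To make the scenarios above concrete, here is a minimal sketch (purely illustrative, not a proposal from the F2F minutes; all field names are hypothetical) of the kind of timing metadata a sender could share with each receiver: a correlation between a shared wall clock and a position on the media timeline, plus the playback rate, so that every receiver can compute the same target media time for "now".

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TimingMetadata:
    """Hypothetical per-track timing info a sender could push to receivers.

    Maps a shared wall-clock timestamp to a media-timeline position, so a
    device rendering only audio and a device rendering only video can both
    derive the same presentation time.
    """
    stream_id: str          # which audio/video/text track this applies to
    wall_clock_us: int      # shared wall-clock time of the correlation point
    media_time_us: int      # media-timeline position at that wall-clock time
    playback_rate: float    # 1.0 for normal playback, 0.0 while paused

    def media_time_at(self, wall_clock_now_us: int) -> int:
        """Project the media time a receiver should present at a given wall-clock time."""
        elapsed = wall_clock_now_us - self.wall_clock_us
        return self.media_time_us + int(elapsed * self.playback_rate)

# Scenario 1: audio on one device, video on another, driven by the same correlation.
audio = TimingMetadata("audio-1", wall_clock_us=1_000_000, media_time_us=0, playback_rate=1.0)
print(audio.media_time_at(1_500_000))  # 500000 (0.5 s into the media)
print(json.dumps(asdict(audio)))       # what might go on the wire
```

Scenario 6 (non-1.0-rate playback) falls out of the same structure: the sender re-issues the correlation whenever `playback_rate` changes.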
Next steps are to research what's feasible from an implementation point of view, and study the proposals in the following groups:
Multi-Device Timing CG Timing Object: https://webtiming.github.io/timingobject/

Media Timed Events TF: https://github.com/WICG/datacue/blob/master/explainer.md

[1] https://www.w3.org/2019/05/23-webscreens-minutes.html#x29

Here is some information about the synchronisation protocol for companion screens in HbbTV. I hope this is useful input into the sync capability we design into the Open Screen Protocol. The goal of the HbbTV protocol is to enable very close synchronisation, so that a closely related media stream, such as audio description, can be played on the companion device in sync with the program content on the TV.
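Close synchronisation of this kind depends on the companion and the TV agreeing on a common wall clock. As a rough illustration of the underlying idea (an NTP-style round-trip exchange; this is not the actual HbbTV/DVB CSS message format, and the variable names are only conventions), a companion can estimate its offset from the TV's clock like this:

```python
def estimate_offset(t1: float, t2: float, t3: float, t4: float) -> tuple[float, float]:
    """NTP-style clock offset/delay estimate from one request/response exchange.

    t1: request sent (companion clock)   t2: request received (TV clock)
    t3: response sent (TV clock)         t4: response received (companion clock)
    Returns (offset of the TV clock relative to the companion, round-trip delay).
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Example: TV clock runs 100 ms ahead; network adds 10 ms each way,
# and the TV takes 5 ms to respond.
offset, delay = estimate_offset(t1=0.000, t2=0.110, t3=0.115, t4=0.025)
print(offset, delay)  # offset ≈ 0.1 s (TV ahead), delay ≈ 0.02 s
```

Repeating the exchange and keeping the sample with the smallest round-trip delay gives progressively tighter offset bounds, which is what makes frame-accurate companion sync feasible.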