The Remote Playback API is to media content what the Presentation API is to web content. Although the API is different (the Remote Playback API extends the HTMLMediaElement interface), the spec re-uses patterns and design considerations from the Presentation API whenever appropriate.

If for some reason a user has a device in multiple rooms in their home, they should be able to control them simultaneously. I agree with the above and think we should allow multiple remote playbacks. If the protocols allowed it, this could be expanded to let all members of a session start remote playback simultaneously, but not for now.

However, things will be a bit odd with regard to media keys. Usually, on Android, if multiple applications have an active MediaSession (as in the Android one), the latest to activate holds the keys, so applications will usually ask for them back when focused in case there are competing sessions. We don't really mention this in the Media Session API, but it is kind of implicit that being the active session gives you media key access (or is it mentioned?). I would recommend leaving some leeway for implementations to deal with these cases.

Controlling a remote playback route (sample code): in order to use the MediaRouter framework within your app, you must get an instance of the MediaRouter object and attach a MediaRouter.Callback object to listen for routing events.

Going back to this old discussion, we might want to use Chrome for Android's behaviour as an example. What we do today is that a remotely played element is removed from the default media session and added back when it is no longer played remotely. In terms of spec, when the state switches to connected, the remotely played element would be removed from whatever media session it is on, and when it switches back to disconnected, it would be added back. This would apply to the default or a specified media session.
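The connected/disconnected behaviour described above can be sketched as a small model. This is purely illustrative: `MediaSessionModel` and `trackRemoteState` are hypothetical names, not part of the Media Session or Remote Playback specs; the only spec-grounded part is reacting to the `connect`/`disconnect` events that the RemotePlayback interface fires.

```javascript
// Hypothetical model of a media session's element membership. Not a real API;
// it only exists to illustrate the proposed spec behaviour.
class MediaSessionModel {
  constructor() {
    this.elements = new Set();
  }
  add(element) { this.elements.add(element); }
  remove(element) { this.elements.delete(element); }
  has(element) { return this.elements.has(element); }
}

// Remove an element from its session while its remote playback state is
// "connected", and add it back once it disconnects, mirroring what Chrome for
// Android does with the default session today.
function trackRemoteState(session, element) {
  element.remote.addEventListener('connect', () => session.remove(element));
  element.remote.addEventListener('disconnect', () => session.add(element));
}
```

In a browser, `element` would be an HTMLMediaElement whose `remote` attribute is its RemotePlayback object; here any event-target-bearing object works, which keeps the sketch testable outside a browser.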
I think in practice this would mean that remote playback doesn't work great with multi-element media sessions, but that doesn't seem terrible, given that things one would expect to play remotely are unlikely to be part of some composed media experience.

Would it make sense to only allow remote playback if there's at most one playing element in the session, to make it impossible for the members of a session to split into a local and a remote group? Then the original session could be allowed to represent the remote playback, and if necessary this could be web-exposed on the media session itself as some new state. If one then tries to play a (previously paused) media element that's in a remote session, it would either replace the existing element, both would play remotely, or it would throw an exception, depending on what seems possible to implement. (The spec would be made to match, of course.)
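The restriction proposed above can be sketched as a guard. Everything here is hypothetical: the session shape and the function names are assumptions for illustration, and of the three options mentioned for playing a second element (replace, play both, throw), this sketch arbitrarily takes the strictest one and throws.

```javascript
// Allow remote playback only when at most one element in the session is
// currently playing, so the session's members can never split into a local
// group and a remote group.
function canStartRemotePlayback(sessionElements) {
  const playing = sessionElements.filter((el) => !el.paused);
  return playing.length <= 1;
}

// Playing a previously paused element in a remote session: this sketch throws
// if a different element in the session is already playing remotely.
function playInRemoteSession(session, element) {
  const alreadyPlaying = session.elements.filter((el) => !el.paused);
  if (alreadyPlaying.length > 0 && !alreadyPlaying.includes(element)) {
    throw new Error('Another element in this session is already playing remotely');
  }
  element.paused = false;
}
```

Whether an implementation would throw, replace, or mirror both elements remotely depends on what the underlying remoting protocols can support, which is exactly the leeway the thread suggests leaving to implementations.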