Add monoscopic 360° video support #702
Merged
Conversation
This reverts commit 7341953.
It is useful to have a very simple example without PiP support. Other basic examples now support PiP anyway.
The leak could be reproduced as follows:
1. Open the playlist example.
2. Enter PiP.
3. Open the custom or system example WITHOUT PiP support.
4. Restore the playlist from PiP.
5. Close the playlist.
The playlist VM was never correctly released.
waliid approved these changes (Jan 2, 2024)
This reverts commit 2bb8597. Contrary to what that commit claimed, this works.
waliid approved these changes (Jan 2, 2024)
Description
This 🧑‍🎄 PR adds support for monoscopic 360° video content, based on the past work done in SRG Media Player.
360.mov
Unlike SRG Media Player, no support for stereoscopic 360° video content is provided. This could be added with little effort on the video view side, but stereoscopic support usually requires much more than the ability to display content for each eye. You not only need an additional VR viewing device (e.g. a cardboard headset into which you slip your phone, or crossing your eyes 😵‍💫) but usually also a proper onboarding experience. Controls are also a challenge since you don't have foveal tracking with such devices, and cardboard headsets lack buttons, inconveniently forcing users to take their device out of the headset for interactions.
Monoscopic 360° support has never been a Pillarbox requirement (and never will be on other platforms) since 360° content is not a priority anymore at SRG SSR. But it was fun to add, and it proves that the Pillarbox player is flexible enough to handle this kind of use case as well. Moreover, can the Best player in the world™️ really have no 360° support? 😉
During the implementation I also discovered an issue with our Picture in Picture implementation, described in #696. I initially intended to fix the issue on a separate branch, but proper 360° integration required this issue to be fixed as well. This made the PR bigger than initially expected, but that was the price to pay to ensure everything was working as expected. Sorry for that.
Monoscopic 360° video support design considerations
Monoscopic 360° support has been built right into `VideoView`. I initially intended to have a separate view, but having everything handled by the `VideoView` transparently is far better from an integration point of view.

Unlike what was done in SRG Media Player, there are no touch or gyro controls built directly into the `VideoView`. A modifier makes it possible to enable monoscopic support and to control the orientation at which content is seen. The rest is the responsibility of the app, as can be seen in the demo, since different user experiences might be desired when displaying 360° content. A set of quaternion helpers is provided to make it easier to orient the camera or to relate orientation to CoreMotion attitude (see the sketch below).

Note that we cannot determine whether content is monoscopic based on HLS playlist metadata or the video resolution. Monoscopic content is usually delivered with a 2:1 aspect ratio, but this same format is also popular among filmmakers (e.g. the Morning Show examples we have in our demo). The choice therefore has to be made externally by the app, e.g. with the help of some metadata delivered with the content to be played.
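To make this more concrete, here is a minimal sketch of how an app could display monoscopic content and drive the camera with a drag gesture. `VideoView` and `Player` come from Pillarbox; the module name, the `monoscopic(orientation:)` modifier name, the stream URL and the gesture mapping are assumptions for illustration, not the actual API surface.

```swift
import PillarboxPlayer  // assumed module name
import simd
import SwiftUI

// Minimal sketch, assuming a `monoscopic(orientation:)` modifier taking a camera quaternion.
struct MonoscopicPlayerView: View {
    @StateObject private var player = Player(
        item: .simple(url: URL(string: "https://example.com/360.m3u8")!)  // placeholder URL
    )

    // Camera angles driven by a drag gesture.
    @State private var yaw: Float = 0
    @State private var pitch: Float = 0
    @State private var lastTranslation: CGSize = .zero

    var body: some View {
        VideoView(player: player)
            .monoscopic(orientation: orientation)   // assumed modifier enabling 360° rendering
            .gesture(dragGesture)
            .ignoresSafeArea()
            .onAppear(perform: player.play)
    }

    // Compose yaw (around the y-axis) and pitch (around the x-axis) into a single quaternion.
    private var orientation: simd_quatf {
        simd_quatf(angle: yaw, axis: SIMD3(0, 1, 0)) * simd_quatf(angle: pitch, axis: SIMD3(1, 0, 0))
    }

    private var dragGesture: some Gesture {
        DragGesture()
            .onChanged { value in
                // Use the incremental translation so angles do not compound during the gesture.
                let delta = CGSize(
                    width: value.translation.width - lastTranslation.width,
                    height: value.translation.height - lastTranslation.height
                )
                yaw += Float(delta.width) * 0.005
                pitch += Float(delta.height) * 0.005
                lastTranslation = value.translation
            }
            .onEnded { _ in lastTranslation = .zero }
    }
}
```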
Picture in Picture design considerations
I revisited our Picture in Picture implementation to solve #696, making it a lot simpler in the process.
The main issue was that our reference counting strategy performed cleanups even if PiP was never started. Our reference counting could in fact not differentiate between actual parent player view dismissal and simple video view removal / insertion within a non-dismissed parent player view, leading to early deallocation.
To solve this issue while retaining our current formalism (at least with minimal API changes), I realized that basic and advanced PiP are nearly identical, except that advanced PiP might outlive the player view during PiP itself. This is where the associated player view state must be persisted, for which I introduced a dedicated mechanism based on a `PictureInPicturePersistable` protocol. This protocol not only ensures state persistence and restoration automatically, but also provides hooks into the Picture in Picture life cycle so that the state can be adjusted properly if needed.

The other advantage of this approach is that no code update to static properties is required from developers integrating PiP into an existing implementation, which stays almost the same. This has several benefits:

- Developers can keep using `@StateObject`s in their implementation, as they usually do. For proper restoration they must only check whether a suitable `PictureInPicturePersistable.persisted` instance is available first (see the sketch below).
- Advanced Picture in Picture integration (see the `PlaylistView` example) is now trivial and similar to other examples.

I could make the `PictureInPicture` singleton implementation much simpler as well. While this object is now a delegate of other PiP implementations to manage the common persistable state (like before, the in-app PiP modifier is applied at the parent view level, without knowing whether a `VideoView` or a `SystemVideoView` will be used), I could remove the `Mode`, which felt clunky.

I could also remove other infamous tricks, like a dispatch on the next run loop to avoid a crash when PiP controllers were deallocated.
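As an illustration of the persistence mechanism, here is a minimal sketch. The `PictureInPicturePersistable` protocol and its `persisted` accessor are taken from the description above; the accessor shape (a `Self?` static exposed to conforming types), the `PlayerViewModel`, the module name and the URL are assumptions.

```swift
import PillarboxPlayer  // assumed module name
import SwiftUI

// Hypothetical view model persisted during Picture in Picture.
final class PlayerViewModel: ObservableObject, PictureInPicturePersistable {
    let player = Player(item: .simple(url: URL(string: "https://example.com/stream.m3u8")!))  // placeholder URL
}

struct CustomPlayerView: View {
    // Keep using @StateObject as usual, but reuse the persisted instance when the
    // view is restored from Picture in Picture (assumed accessor shape).
    @StateObject private var model = PlayerViewModel.persisted ?? PlayerViewModel()

    var body: some View {
        VideoView(player: model.player)
            .onAppear(perform: model.player.play)
    }
}
```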
Finally, I ensured that layers are properly managed to avoid unexpected pauses in some restoration scenarios. More information has been added to #612.
With all these fixes I could successfully implement advanced Picture in Picture in a multi-instance context, something we omitted in our initial Picture in Picture implementation:
PiP.mov
We also now support advanced restoration scenarios, e.g. playing the same content in Picture in Picture with a custom layout, then restoring into the system player, without any playback interruption:
PiP.mov
Several restoration scenarios that our API should make it possible to implement, involving custom and system views with or without PiP, have been listed in a comment on #612, including videos.
Breaking API changes
- Instead of passing `VideoView` or `SystemVideoView` options to the view itself, these options have been moved to modifiers for better ergonomics.
- The `View/enabledForInAppPictureInPictureWithCleanup(perform:)` modifier has been replaced with `SwiftUI/View/enabledForInAppPictureInPicture(persisting:)`. Implementation of a view model which can be persisted has been made simpler with the introduction of an empty `PictureInPicturePersistable` protocol, which automatically provides persistence abilities to existing objects. Instead of implementing cleanup with a closure, only a persistable object is now required to ensure proper state persistence during Picture in Picture, as sketched below.
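A hedged migration sketch follows, reusing the modifier names quoted above; `ParentPlayerView`, the hypothetical `PlayerViewModel` and the module name are assumptions, and the exact signatures may differ.

```swift
import PillarboxPlayer  // assumed module name
import SwiftUI

// Hypothetical persistable model (same idea as in the earlier sketch).
final class PlayerViewModel: ObservableObject, PictureInPicturePersistable {
    let player = Player(item: .simple(url: URL(string: "https://example.com/stream.m3u8")!))  // placeholder URL
}

// The in-app PiP modifier is still applied at the parent view level, as before.
struct ParentPlayerView: View {
    @StateObject private var model = PlayerViewModel.persisted ?? PlayerViewModel()

    var body: some View {
        VideoView(player: model.player)
            // Previously: .enabledForInAppPictureInPictureWithCleanup { /* cleanup closure */ }
            .enabledForInAppPictureInPicture(persisting: model)
    }
}
```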
Changes made for 360° support
- Mute the player manually (if the `PlayerConfiguration` requires it) since this behavior is incorrectly not applied to playback involving `SKVideoNode`.
- Add 360° content to the `MultiView` player.
- Split the `VideoView` and `SystemVideoView` implementations over several files for better readability.
- Fix the `Modal` to avoid incorrect drag gesture detection outside the safe area boundaries (revealed when implementing 360° navigation gestures).

No explicit documentation was added for 360° support since most of the work has to be done client-side and depends on the desired user experience. The rest is quaternion material, for which I added some documentation pointers; the sketch below gives the general idea.
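As a rough idea of the quaternion side, here is a sketch relating CoreMotion attitude to a camera orientation quaternion; Pillarbox's actual helpers and their names may differ, and this class is illustrative only.

```swift
import CoreMotion
import simd

// Sketch turning CoreMotion attitude updates into a camera orientation quaternion
// suitable for a monoscopic view.
final class MotionOrientation {
    private let motionManager = CMMotionManager()

    /// Latest device attitude, expressed as a quaternion (identity until the first update).
    private(set) var orientation = simd_quatf(angle: 0, axis: SIMD3(0, 0, 1))

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let quaternion = motion?.attitude.quaternion else { return }
            // CMQuaternion stores doubles; convert to the simd type used for rendering.
            self?.orientation = simd_quatf(
                ix: Float(quaternion.x),
                iy: Float(quaternion.y),
                iz: Float(quaternion.z),
                r: Float(quaternion.w)
            )
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```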
Changes made to fix Picture in Picture
- Fix Picture in Picture when a `VideoView` is initially present (#696) by revisiting our implementation, as described above.
- Implement advanced Picture in Picture in the `MultiView` example. Also add a button to swap players between both views.
- Further adjustments concern `PlayerViewModel`, `ObservableObject` (since we know that memory management issues existed prior to 17.2) and `WillAppearView`.
- `Player` is correctly released, but the underlying `AVQueuePlayer` might take a bit more time to be deallocated when URNs are involved (ultimately it will be deallocated; there is no permanent resource leak). I therefore applied a safer fix by setting the volume to 0 when `Player` is deallocated (we cannot use `isMuted` since its observation would lead to different analytics at deallocation time, making a few UTs fail). This fix works for both iOS and tvOS and replaces the old one introduced in "Sound continues for a while after closing the tvOS player" (#520).

Known issues
- An issue remains with the `MultiView` example displaying 360° content.

Checklist