This repository has been archived by the owner on Mar 1, 2024. It is now read-only.

Document frontend library usage and examples #131

Closed
lukehb opened this issue Mar 1, 2023 · 5 comments
Labels
documentation Improvements or additions to documentation

@lukehb
Contributor

lukehb commented Mar 1, 2023

We would like to do the following for 5.2:

  • create some "getting started" documentation for the frontend library
  • create some examples showing common API usage of the library
  • publish our generated API docs using typedoc
@lukehb lukehb added the documentation Improvements or additions to documentation label Mar 1, 2023
@lukehb lukehb added this to the 5.2 milestone Mar 1, 2023
@lukehb
Contributor Author

lukehb commented Mar 1, 2023

@AntiF0 Inviting you to collaborate on this one if you are interested.
@hmuurine We should capture this in our collaboration.

@adrianZahra
Contributor

Most of it seems to be there with the old docs. Some things to consider however:

  • Ensure all up-to-date Config options are explained; some of these will be important for SPS. We will link to this in our docs.
  • Documentation for the Text Overlay and the Action Overlay. These are the bases from which all overlays spawn; SPS users usually want to know about this stuff, so we will link to it in our docs as well.
    Some general knowledge that differs from the old docs:
  • AFKs are a type of action overlay; however, they come with some extra features for their functionality.
  • Freeze frames are no longer classed as overlays. Even though they may seem similar in functionality, they are not made to be customised and are meant to be used as-is.

@lukehb
Contributor Author

lukehb commented Mar 3, 2023

When we get around to this, here are some preliminary notes from the SPS team that may be relevant, with some cleanup.

Overview

Understanding the frontend architecture and event lifecycle

The process of communicating with the Scalable Pixel Streaming backend server components to initiate a streaming session and then establish a WebRTC connection between the frontend and a Pixel Streaming application instance is relatively complex, and follows a series of distinct phases that typically occur one after another in a linear sequence. The frontend library abstracts away the underlying details of this process within the webRtcPlayerController class (hereafter referred to as the player controller) and its internal components, and exposes simple hooks for responding to key events in each phase, which we refer to as lifecycle events.

The delegate implementation is responsible for responding to these lifecycle events and managing the UI state of the webpage to reflect the current state of the streaming session. This includes displaying information or errors to the user, registering event handlers for user input events, and invoking methods of the player controller to trigger parts of the lifecycle that are dependent on user input. Control flow is managed by the player controller during all lifecycle phases except for the initial Setup Phase, during which control flow briefly becomes the responsibility of the delegate.

The diagram below depicts all of the lifecycle events that the delegate can respond to, along with the control flow between the player controller and the delegate:

{% include figure.html
image="customise/frontend/frontend_architecture.svg"
caption="Figure 1: The multi-phase event lifecycle of the frontend and the corresponding control flow between the webRtcPlayerController class and the delegate implementation."
class="xlarge"
%}

A high-level overview of the core logic for each phase, along with the corresponding lifecycle events, is described in the sections below.

Setup Phase

  1. The delegate is created and is supplied with the Config configuration object discussed in the section Configuring library behaviour.

  2. The player controller is created and is supplied with a reference to the delegate, along with a copy of the configuration object.

  3. The player controller then provides the delegate with a reference to itself (defined using the IWebRtcPlayerController interface type to prevent a circular dependency) so the delegate can store the reference and use it to invoke functionality in the player controller as needed.

  4. At this point control shifts to the delegate. The delegate is responsible for triggering the Authentication Phase (discussed below) at its discretion. The logic for triggering this will typically depend on the value of the enableSpsAutoplay field in the configuration object:

    • If autoplay is enabled then the delegate should instruct the player controller to immediately initiate a WebSocket connection to the signalling server.

    • If autoplay is disabled then the delegate should register an event handler that is triggered by a user interaction (e.g. clicking on a "Play" or "Start" UI element) and the event handler should instruct the player controller to initiate a WebSocket connection to the signalling server.

  5. The player controller establishes a WebSocket connection with the signalling server and awaits incoming messages.
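The hand-off in steps 3 and 4 can be sketched as follows. This is a minimal illustration only: IWebRtcPlayerController, Config, and enableSpsAutoplay are named in the text above, but the method names connectToSignallingServer and setIWebRtcPlayerController, and the shapes of the interfaces, are illustrative assumptions rather than the library's exact API.

```typescript
// Minimal stand-ins for the real library types (assumed shapes).
interface IWebRtcPlayerController {
  connectToSignallingServer(): void;
}

interface Config {
  enableSpsAutoplay: boolean;
}

class MinimalDelegate {
  private controller?: IWebRtcPlayerController;

  constructor(private config: Config) {}

  // Step 3: the player controller hands the delegate a reference to itself.
  setIWebRtcPlayerController(controller: IWebRtcPlayerController): void {
    this.controller = controller;

    // Step 4: control shifts to the delegate.
    if (this.config.enableSpsAutoplay) {
      // Autoplay enabled: initiate the WebSocket connection immediately.
      this.controller.connectToSignallingServer();
    } else {
      // Autoplay disabled: wait for a user interaction instead, e.g.
      // document.getElementById("play")?.addEventListener("click", () =>
      //   this.controller?.connectToSignallingServer());
    }
  }
}
```

The key design point is that the delegate never constructs the player controller itself; it only receives and stores the interface reference, which keeps the dependency one-directional.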

WebRTC Connection Phase

{% include alerts/info.html content="For a detailed explanation of the process of establishing a WebRTC connection, see the section Establishing a Peer-to-Peer Connection in the WebRTC chapter of the excellent textbook High Performance Browser Networking by Ilya Grigorik, which is freely available online and is licensed under a Creative Commons license." %}

  1. The player controller transmits a WebRTC offer to the signalling server, which is forwarded to the Pixel Streaming application instance.

  2. The Pixel Streaming application instance responds with a WebRTC answer, which the signalling server forwards to the frontend.

  3. The player controller notifies the delegate that a WebRTC answer has been received using the onWebRtcConnecting() event hook so it can inform the user through the page's UI. Any message displayed to the user should indicate that the Pixel Streaming application instance has started successfully and that a WebRTC connection is now being established.

  4. The player controller and the Pixel Streaming application instance exchange ICE candidates by sending them to the signalling server, which forwards them as appropriate.

  5. The player controller attempts to establish a WebRTC connection with the Pixel Streaming application instance using the information from the ICE candidates. There are two possible outcomes:

    • If the connection fails to be established then the player controller notifies the delegate using the onWebRtcFailed() event hook so it can inform the user through the page's UI.

    • If the connection is established successfully then the player controller notifies the delegate using the onWebRtcConnected() event hook so it can inform the user through the page's UI.

  6. The player controller creates the <video> element for displaying the video stream (and optionally an <audio> element if audio synchronisation is disabled, as discussed in the section Configuring library behaviour) so the media streams can start being received from the Pixel Streaming application instance. The player controller notifies the delegate using the onVideoInitialised() event hook so it can respond as needed to the creation of these DOM elements and configure the page's state to ensure everything is ready to start displaying the video output.

  7. The audio and video streams start playing, and the user can begin interacting with the Pixel Streaming application. This represents the start of the Streaming Phase.

Streaming Phase

  1. Now that the streaming session is up and running, the Pixel Streaming application instance transmits a message to the frontend describing the initial stream settings, such as minimum and maximum video bitrate. The player controller notifies the delegate using the onInitialSettings() event hook so it can update any UI elements used to display the settings. If the Pixel Streaming application is configured to accept settings updates from the user during streaming then these UI elements may also be interactive controls that allow the user to specify new values.

  2. Throughout the course of the streaming session, the Pixel Streaming application instance may transmit a number of messages to the frontend:

    • If the delegate requests a stream latency test by invoking the sendLatencyTest() method of the player controller, then the Pixel Streaming application instance will transmit latency test results to the frontend. When latency test results are received, the player controller notifies the delegate using the onLatencyTestResult() event hook so it can display the results through the page's UI.

    • When multiple users are connected to a single Pixel Streaming application instance and a WebRTC Selective Forwarding Unit (SFU) is not in use, the Pixel Streaming application will select one user as the "quality control owner". The network conditions of the quality control owner are used to automatically adjust the video quality for all connected users. Ownership may change as users connect and disconnect, and the Pixel Streaming application instance will transmit an update to the frontend whenever there is an ownership change. When a quality control ownership update is received, the player controller notifies the delegate using the onQualityControlOwnership() event hook so it can update any UI elements that are used to display the current ownership status. Note that the current version of Scalable Pixel Streaming does not support multiple users connecting to a single Pixel Streaming application instance, so the user will always be the quality control owner for the entire duration of the streaming session.

    • The Pixel Streaming application instance will periodically transmit updates to the frontend containing the average Quantization Parameter (QP) for the video frames during the last 1 second of the stream, which acts as a measure of the current video quality. When these updates are received, the player controller notifies the delegate using the onVideoEncoderAvgQP() event hook so it can update any UI elements that are used to display the latest QP value.

    • The Pixel Streaming application instance will periodically transmit updates to the frontend containing WebRTC statistics for the stream. When new statistics are received, the player controller notifies the delegate using the onVideoStats() event hook so it can update any UI elements that are used to display the latest values.
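The streaming-phase hooks described above could be handled along these lines. Only the hook names come from the text; the payload shapes (InitialSettings with bitrate fields, a numeric QP value) are illustrative assumptions about what the messages carry.

```typescript
// Assumed shape of the initial stream settings message.
interface InitialSettings {
  minBitrateBps: number;
  maxBitrateBps: number;
}

// Sketch of a delegate handling streaming-phase messages.
class StreamingPhaseDelegate {
  settings?: InitialSettings;
  latestQp = -1;

  // Step 1: populate (and optionally make editable) the settings UI.
  onInitialSettings(settings: InitialSettings): void {
    this.settings = settings;
  }

  // Periodic quality updates: lower QP generally means higher video quality.
  onVideoEncoderAvgQP(qp: number): void {
    this.latestQp = qp;
  }

  // Latency test results arrive only after sendLatencyTest() was invoked.
  onLatencyTestResult(_result: unknown): void {
    // Display the results through the page's UI.
  }

  // Under the current version of SPS the local user is always the owner.
  onQualityControlOwnership(_isOwner: boolean): void {
    // Update any ownership-status UI elements.
  }
}
```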

Disconnect Phase

  1. If either the WebSocket connection to the signalling server or the WebRTC connection to the Pixel Streaming application instance is disconnected at any point prior to the user closing the page, the player controller notifies the delegate using the onDisconnect() event hook so it can inform the user through the page's UI. Any message displayed to the user should indicate that the session has disconnected abnormally, due to either a backend issue (such as the Pixel Streaming application instance crashing) or an interruption to the user's network connection. The Pixel Streaming application instance will be cleaned up automatically in the event of an abnormal disconnection event, so if the user refreshes the page then a new Pixel Streaming application instance will be created for the new streaming session.

  2. When the user closes the page, the signalling server will detect the closure of the WebSocket connection and will destroy the Pixel Streaming application instance to reflect the successful completion of the streaming session.
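A minimal onDisconnect handler following the guidance above might look like this. The eventString parameter and the message wording are illustrative assumptions; only the hook name comes from the text.

```typescript
// Sketch of a delegate handling an abnormal disconnection.
class DisconnectPhaseDelegate {
  lastStatus = "";

  onDisconnect(eventString: string): void {
    // The session ended abnormally (backend issue or network interruption).
    // The backend cleans up the application instance automatically, so a
    // page refresh will start a brand-new streaming session.
    this.lastStatus =
      `Disconnected: ${eventString}. Refresh the page to start a new session.`;
  }
}
```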

Understanding overlays

Overlay controllers are the entities in the frontend library which are responsible for creating, displaying and hiding the various types of overlays supported by the frontend. The frontend library ships with default implementations for all of the supported overlay controllers, each of which can be extended or replaced by developers in order to customise overlay appearance and behaviour. Overlay controllers also have their own interfaces, through which both delegates and the player controller can manipulate overlay UI elements. Public properties that reference each type of overlay controller interface form part of the API contract for the delegate interface. The relationship between delegates and their overlay controllers is depicted in the diagram below:

{% include figure.html
image="customise/frontend/overlay_architecture.svg"
caption="Figure 2: The relationship between overlay controllers and delegate implementations."
class="xlarge"
%}

Every delegate must contain an instance of each of the three supported types of overlay controllers. The constructor of the [DelegateBase]({{ site.data.links.frontend.github }}/blob/v{{ site.data.versions.frontend }}/library/src/Delegate/DelegateBase.ts) base class instantiates the default implementations for each of the overlay controllers, and delegate implementations that extend the base class may choose to replace one or more of these with instances of custom overlay controller implementations. Delegate implementations that implement the [IDelegate]({{ site.data.links.frontend.github }}/blob/v{{ site.data.versions.frontend }}/library/src/Delegate/IDelegate.ts) interface directly are responsible for instantiating all three overlay controller objects, irrespective of whether the defaults are used or custom implementations are provided. (Note that it is strongly recommended that delegate implementations extend the base class rather than implementing the interface directly. See the Getting started writing a delegate implementation section below for a discussion of the benefits provided by the base class.)
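The relationship between a delegate, its three overlay controller properties, and a custom replacement can be sketched as below. The interface shape (show/hide) and all class names here are illustrative assumptions; the real controller interfaces have more members.

```typescript
// Assumed minimal overlay controller interface.
interface IOverlayController {
  show(): void;
  hide(): void;
}

// Stand-in for a default overlay controller implementation.
class DefaultOverlayController implements IOverlayController {
  visible = false;
  show(): void { this.visible = true; }
  hide(): void { this.visible = false; }
}

// Public properties referencing each controller form part of the
// delegate's API contract, so the base class instantiates defaults.
class DelegateWithOverlays {
  overlayController: IOverlayController = new DefaultOverlayController();
  afkOverlayController: IOverlayController = new DefaultOverlayController();
  freezeFrameOverlayController: IOverlayController = new DefaultOverlayController();
}

// A subclass may replace one or more defaults with custom implementations.
class CustomOverlayController extends DefaultOverlayController {
  show(): void {
    super.show();
    // Custom styling, animations, etc. would go here.
  }
}

class CustomDelegate extends DelegateWithOverlays {
  constructor() {
    super();
    this.overlayController = new CustomOverlayController();
  }
}
```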

The frontend supports a variety of different overlays that are used throughout the event lifecycle. Rather than representing every individual type of overlay with a corresponding controller, overlays are grouped into three controller classes, each of which is discussed in turn below.

Overlay Controller

The main overlay controller (known simply as "the overlay controller") provides the core functionality for creating, hiding and showing overlays that is used by both itself and the other two overlay controllers. This controller is responsible for managing all of the standard overlays that are used throughout the event lifecycle, including:

  • The connect overlay, which prompts the user to interact with the page to initiate the connection process during the Setup Phase.
  • The play overlay, which prompts the user to interact with a "Start" or "Play" control to begin playback of the video stream during the Setup Phase.
  • Text overlays, which are used to display information or error messages to the user.

(TODO: why do we require two separate user interactions when one should theoretically be sufficient to meet the criteria for playing media streams? If the connect overlay is an optional step in scenarios where the play overlay is used, should we have a configuration option to disable it that is separate from the autoplay option?)

AFK Overlay Controller

The AFK overlay controller is responsible for displaying the AFK overlay, which is displayed when user inactivity detection is enabled and the user remains idle for the configured number of seconds, as discussed in the section Configuring library behaviour.

(TODO: why does the AFK overlay controller contain the actual business logic of the AFK functionality (i.e. the detection timers) in addition to the logic for presenting the overlay UI? These concerns should be separated and the business logic should be moved to the player controller or its internal components. Once the business logic has been removed, the question then becomes whether the UI code could simply be merged into the main overlay controller?)

Freeze Frame Overlay Controller

Pixel Streaming applications can be configured to support an optional feature known as freeze frames, whereby rendering is temporarily paused and the video stream is replaced with a static image. When interacting with a Pixel Streaming application that supports freeze frames, the freeze frame overlay controller is responsible for displaying these static images.

(TODO: the freeze frame overlay controller also combines business logic with UI code. Should this be refactored in the same manner as the AFK overlay controller, or are there unique characteristics of this class that mean the refactor should be handled in a different manner?)

Getting started writing a delegate implementation

When writing a new delegate implementation, it is strongly recommended that you extend the [DelegateBase]({{ site.data.links.frontend.github }}/blob/v{{ site.data.versions.frontend }}/library/src/Delegate/DelegateBase.ts) base class rather than directly implementing the [IDelegate]({{ site.data.links.frontend.github }}/blob/v{{ site.data.versions.frontend }}/library/src/Delegate/IDelegate.ts) interface. The base class provides the following functionality:

  • Implements some of the common functionality required during the Setup Phase, including storing the reference to the player controller when it is first received, and instantiating the default implementations for each of the three overlay controllers discussed in the section above. This reduces the amount of boilerplate code that needs to be included in delegate implementations.

  • Provides default implementations for a number of interface methods, reducing the minimum number of methods that need to be implemented when writing a new delegate implementation.

The easiest way to get started is to take a look at one of the [example delegate implementations]({{ site.data.links.frontend.github }}/blob/v{{ site.data.versions.frontend }}/delegates/examples) that are included with the frontend source code on GitHub and then modify the example to suit your needs. The production-ready nature of the DOM API delegate implementation makes it less suited for use as a learning resource or a minimal starting point, but it is still designed to be easily modifiable by developers who simply wish to tweak one or more aspects of its behaviour.
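The extend-the-base-class pattern recommended above amounts to overriding only the hooks your page cares about. The DelegateBase below is a stand-in with assumed members, not the real class from library/src/Delegate/DelegateBase.ts; only the hook names come from the lifecycle description.

```typescript
// Stand-in base class: stores the config and provides no-op defaults for
// the lifecycle hooks (assumed shape, for illustration only).
interface Config {
  enableSpsAutoplay: boolean;
}

class DelegateBase {
  constructor(public config: Config) {}
  onWebRtcConnected(): void {}
  onVideoInitialised(): void {}
  onDisconnect(_eventString: string): void {}
}

// A custom delegate overrides only the hooks it needs; everything else is
// inherited as a no-op, which is what keeps the boilerplate small.
class MyDelegate extends DelegateBase {
  events: string[] = [];

  onWebRtcConnected(): void {
    this.events.push("connected");
  }

  onDisconnect(eventString: string): void {
    this.events.push(`disconnected: ${eventString}`);
  }
  // onVideoInitialised is deliberately not overridden.
}
```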

(TODO: update the example delegate implementations to reflect the recent refactor and add them to the GitHub repo)

@lukehb
Contributor Author

lukehb commented Mar 6, 2023

This PR #132 has some excellent information that we can roll into our update of this page:

https://github.com/EpicGames/PixelStreamingInfrastructure/blob/master/Frontend/Docs/Communicating%20from%20the%20Player%20Page%20to%20UE5.md

@lukehb
Contributor Author

lukehb commented Jun 6, 2023

We have added frontend docs in a number of PRs. I will close this uber ticket and we can address any missing docs on a case-by-case basis.

@lukehb lukehb closed this as completed Jun 6, 2023