From 06ac5dc1f614ad29ed52226cdf98b7af294dcb3e Mon Sep 17 00:00:00 2001
From: ThibaultBee <37510686+ThibaultBee@users.noreply.github.com>
Date: Fri, 10 Jan 2025 13:44:43 +0100
Subject: [PATCH] docs(*): update documentation with v3 features

---
 README.md               | 215 +++++++++++-------
 .../DEVELOPER_README.md |  83 +++----
 2 files changed, 179 insertions(+), 119 deletions(-)
 rename DEVELOPER_README.md => docs/DEVELOPER_README.md (65%)

diff --git a/README.md b/README.md
index 8814ffe0..25b1495d 100644
--- a/README.md
+++ b/README.md
@@ -59,58 +59,18 @@ android {
 * Ultra low-latency based on [SRT](https://github.com/Haivision/srt)
 * Network adaptive bitrate mechanism for [SRT](https://github.com/Haivision/srt)

-## Samples
-
-### Camera and audio sample
-
-For source code example on how to use camera and audio streamers, check
-the [sample app directory](https://github.com/ThibaultBee/StreamPack/tree/master/demos/camera). On
-first launch, you will have to set RTMP url or SRT server IP in the settings menu.
-
-### Screen recorder
-
-For source code example on how to use screen recorder streamer, check
-the [sample screen recorder directory](https://github.com/ThibaultBee/StreamPack/tree/master/demos/screenrecorder)
-. On first launch, you will have to set RTMP url or SRT server IP in the settings menu.
-
-### Tests with a FFmpeg server
-
-FFmpeg has been used as an SRT server+demuxer+decoder for the tests.
-
-#### RTMP
-
-Tells FFplay to listen on IP `0.0.0.0` and port `1935`.
-
-```
-ffplay -listen 1 -i 'rtmp://0.0.0.0:1935/s/streamKey'
-```
-
-On StreamPack sample app settings, set `Endpoint` -> `Type` to `Stream to a remove RTMP device`,
-then set the server `URL` to `rtmp://serverip:1935/s/streamKey`. At this point, StreamPack sample
-app should successfully sends audio and video frames. On FFplay side, you should be able to watch
-this live stream.
-
-#### SRT
-
-Check how to build FFmpeg with libsrt
-in [SRT CookBook](https://srtlab.github.io/srt-cookbook/apps/ffmpeg/). Tells FFplay to listen on
-IP `0.0.0.0` and port `9998`:
-
-```
-ffplay -fflags nobuffer 'srt://0.0.0.0:9998?mode=listener'
-```
-
-On StreamPack sample app settings, set the server `IP` to your server IP and server `Port` to `9998`
-. At this point, StreamPack sample app should successfully sends audio and video frames. On FFplay
-side, you should be able to watch this live stream.
-
 ## Quick start

 If you want to create a new application, you should use the template
 [StreamPack boilerplate](https://github.com/ThibaultBee/StreamPack-boilerplate). In 5 minutes, you
 will be able to stream live video to your server.

-1. Request the required permissions in your Activity/Fragment.
+## Getting started
+
+### Getting started for a camera stream
+
+1. Request the required permissions in your Activity/Fragment. See the
+   [Permissions](#permissions) section for more information.

 2. Creates a `SurfaceView` to display camera preview in your layout

@@ -138,7 +98,15 @@ There are 2 types of streamers:
 - callback based: streamer APIs use callbacks

 ```kotlin
-// For coroutine based
+/**
+ * For coroutine based.
+ * Suspend functions and flows have to be called from a coroutine scope.
+ * Android comes with coroutine scopes like `lifecycleScope` or `viewModelScope`.
+ * Call suspend functions from a coroutine scope:
+ * viewModelScope.launch {
+ *     streamer.startStream(uri)
+ * }
+ */
 val streamer = CameraSingleStreamer(context = requireContext())
 // For callback based
 // val streamer = CameraCallbackSingleStreamer(context = requireContext())
@@ -212,7 +180,41 @@ streamer.release()
 ```

 For more detailed explanation, check out
-the [API documentation](https://thibaultbee.github.io/StreamPack).
+the [documentation](#documentation).
+
+For a complete example, check out the [demos/camera](demos/camera) directory.
+
+### Getting started for a screen recorder stream
+
+1. Requests the required permissions in your Activity/Fragment. See the
+   [Permissions](#permissions) section for more information.
+2. Creates a `MyService` that extends `DefaultScreenRecorderService` (so you can customize
+   notifications among other things).
+3. Creates a screen record `Intent` and requests the activity result
+
+```kotlin
+ScreenRecorderSingleStreamer.createScreenRecorderIntent(context = requireContext())
+```
+
+4. Starts the service
+
+```kotlin
+DefaultScreenRecorderService.launch(
+    requireContext(),
+    MyService::class.java,
+    { streamer ->
+        // `result` is the `ActivityResult` of the screen capture intent created in step 3
+        streamer.activityResult = result
+        try {
+            // `configure` and `startStream` are your own helpers (see demos/screenrecorder
+            // for a complete implementation)
+            configure(streamer)
+        } catch (t: Throwable) {
+            // Handle exception
+        }
+        startStream(streamer)
+    }
+)
+```
+
+For a complete example, check out the [demos/screenrecorder](demos/screenrecorder) directory.

 ## Permissions

@@ -231,6 +233,8 @@ You need to add the following permissions in your `AndroidManifest.xml`:
 For a record, you also need to request the following dangerous permission:
 `android.permission.WRITE_EXTERNAL_STORAGE`.

+### Permissions for a camera stream
+
 To use the camera, you need to request the following permission:

 ```xml
@@ -254,6 +258,8 @@ For the PlayStore, your application might declare this in its `AndroidManifest.xml`:
 ```

+### Permissions for a screen recorder stream
+
 To use the screen recorder, you need to request the following permission:

 ```xml
@@ -328,53 +334,106 @@ See the `demos/camera` for a complete example.

 You can also create your own `targetRotation` provider.

+## Documentation
+
+[StreamPack API guide](https://thibaultbee.github.io/StreamPack)
+
+- Additional documentation is available in the `docs` directory:
+  - [Live and record simultaneously](docs/LiveAndRecordSimultaneously.md)
+  - [Elements specific configuration](docs/StreamerElementsSettings.md)
+  - [Endpoints](docs/Endpoints.md)
+  - For definitions,... see the [Developer README](docs/DEVELOPER_README.md)
+
+## Demos
+
+### Camera and audio demo
+
+For a source code example on how to use the camera and audio streamers, check
+[demos/camera](demos/camera). On first launch, you will have to set the RTMP URL or SRT server IP
+in the settings menu.
+
+### Screen recorder demo
+
+For a source code example on how to use the screen recorder streamer, check
+the [demos/screenrecorder](demos/screenrecorder) directory. On first launch, you will have to set
+the RTMP URL or SRT server IP in the settings menu.
+
+### Tests with an FFmpeg server
+
+FFmpeg has been used as an SRT server+demuxer+decoder for the tests.
+
+#### RTMP
+
+Tell FFplay to listen on IP `0.0.0.0` and port `1935`:
+
+```
+ffplay -listen 1 -i 'rtmp://0.0.0.0:1935/s/streamKey'
+```
+
+In the StreamPack sample app settings, set `Endpoint` -> `Type` to `Stream to a remote RTMP device`,
+then set the server `URL` to `rtmp://serverip:1935/s/streamKey`. At this point, the StreamPack sample
+app should successfully send audio and video frames. On the FFplay side, you should be able to watch
+this live stream.
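+
+If you also want to keep what the sample app sends, FFmpeg itself can act as the RTMP listener and
+record the stream to a file. A sketch using the same URL as above (the output file name is just an
+example):
+
+```
+ffmpeg -listen 1 -i 'rtmp://0.0.0.0:1935/s/streamKey' -c copy dump.flv
+```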
+
+#### SRT
+
+If your FFmpeg build does not already include libsrt, you have to build FFmpeg with libsrt.
+Check how to build FFmpeg with libsrt
+in [SRT CookBook](https://srtlab.github.io/srt-cookbook/apps/ffmpeg/). Tell FFplay to listen on
+IP `0.0.0.0` and port `9998`:
+
+```
+ffplay -fflags nobuffer 'srt://0.0.0.0:9998?mode=listener'
+```
+
+In the StreamPack sample app settings, set the server `IP` to your server IP and the server `Port`
+to `9998`. At this point, the StreamPack sample app should successfully send audio and video
+frames. On the FFplay side, you should be able to watch this live stream.

 ## Tips

 ### RTMP or SRT

-RTMP and SRT are both live streaming protocols . SRT is a UDP - based modern protocol, it is
-reliable
-and ultra low latency . RTMP is a TCP - based protocol, it is also reliable but it is only low
-latency .
+RTMP and SRT are both live streaming protocols. SRT is a modern UDP-based protocol that is both
+reliable and ultra low latency. RTMP is a TCP-based protocol; it is also reliable but only low
+latency.

 There are already a lot of comparison over the Internet, so here is a summary:

-SRT:
--Ultra low latency(< 1 s)
--HEVC support through MPEG -TS
-RTMP :
--Low latency (2 - 3 s)
--HEVC not officially support (specification has been aban by its creator)
+* SRT:
+  - Ultra low latency (< 1 s)
+* RTMP:
+  - Low latency (2 - 3 s)

 So, the main question is : "which protocol to use?" It is easy: if your server has SRT support, use
 SRT otherwise use RTMP.

 ### Streamers

-Let's start with some definitions! `Streamers` are classes that represent a streaming pipeline:
-capture, encode, mux and send.They comes in multiple flavours: with different audio and video
-source . 3 types of base streamers
-are available :
+Let's start with some definitions!
+`Streamers` are classes that represent a whole streaming pipeline:
+they capture, encode, mux and send.
+They come in multiple flavours, with different audio and video
+sources.
+3 main types of streamers are available:

 -`CameraSingleStreamer`: for streaming from camera
 -`ScreenRecorderSingleStreamer`: for streaming from screen
 -`AudioOnlySingleStreamer`: for streaming audio only

 Since 3.0.0, the endpoint of a `Streamer` is inferred from the `MediaDescriptor` object passed to
-the `open` or `startStream` methods.It is possible to limit the possibility of the endpoint by
+the `open` or `startStream` methods. It is possible to limit the possible endpoints by
 implementing your own `DynamicEndpoint.Factory` or passing a endpoint as the `Streamer` `endpoint`
-parameter.To create a `Streamer` for a new source, you have to create a new `Streamer` class that
-inherits
-from `SingleStreamer` .
+parameter. To create a `Streamer` for a new source, you have to create a new `Streamer` class that
+inherits from `SingleStreamer`.
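+
+Since the endpoint is inferred from the descriptor, the same streamer can switch between targets
+just by changing the `MediaDescriptor`. A minimal sketch (the URLs are placeholders, SRT/RTMP
+support must be in your dependencies, and `startStream` is a suspend function, so call it from a
+coroutine scope):
+
+```kotlin
+// Stream to an SRT server...
+streamer.startStream(MediaDescriptor("srt://serverip:9998"))
+// ...or to an RTMP server, with the very same streamer.
+// streamer.startStream(MediaDescriptor("rtmp://serverip:1935/s/streamKey"))
+```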

 ### Get device capabilities

 Have you ever wonder : "What are the supported resolution of my cameras?" or "What is the supported
 sample rate of my audio codecs ?"? `Info` classes are made for this. All `Streamer` comes with a
-specific `Info` object :
-
- ```kotlin
+specific `Info` object:
+
+```kotlin
 val info = streamer.getInfo(MediaDescriptor("rtmps://serverip:1935/s/streamKey"))
- ```
+```

 For static endpoint or an opened dynamic endpoint, you can directly get the info:

@@ -386,40 +445,40 @@ val info = streamer.info
 ```

 ### Get extended settings

 If you are looking for more settings on streamer, like the exposure compensation of your camera, you
-must have a look on `Settings` class. Each `Streamer` elements (such
-as `VideoSource`, `AudioSource`,...)
+must have a look at the `Settings` class. Each `Streamer` element (such as `VideoSource`,
+`AudioSource`,...)
 comes with a public interface that allows to have access to specific information or configuration.

 ```kotlin
- (streamer.videoSource as IPublicCameraSource).settings
+(streamer.videoSource as ICameraSource).settings
 ```

 For example, if you want to change the exposure compensation of your camera, on a `CameraStreamers`
 you can do it like this:

 ```kotlin
- (streamer.videoSource as IPublicCameraSource).settings.exposure.compensation = value
+(streamer.videoSource as ICameraSource).settings.exposure.compensation = value
 ```

 Moreover you can check exposure range and step with:

 ```kotlin
- (streamer.videoSource as IPublicCameraSource).settings.exposure.availableCompensationRange
-(streamer.videoSource as IPublicCameraSource).settings.exposure.availableCompensationStep
+(streamer.videoSource as ICameraSource).settings.exposure.availableCompensationRange
+(streamer.videoSource as ICameraSource).settings.exposure.availableCompensationStep
 ```

 ### Screen recorder Service

-To record the screen, you have to use the `DefaultScreenRecorderStreamer` inside
+To record the screen, you have to use the `ScreenRecorderSingleStreamer` inside
 an [Android Service](https://developer.android.com/guide/components/services). To simplify this
 integration, StreamPack provides the `DefaultScreenRecorderService` classes. Extends one of these
 class and overrides `onNotification` to customise the notification.

-### Android SDK version
+### Android versions

 Even if StreamPack sdk supports a `minSdkVersion` 21. I strongly recommend to set the
-`minSdkVersion` of your application to a higher version (the highest is the best!) for higher
+`minSdkVersion` of your application to a higher version (the highest is the best!) for better
 performance.

 ## Licence

diff --git a/DEVELOPER_README.md b/docs/DEVELOPER_README.md
similarity index 65%
rename from DEVELOPER_README.md
rename to docs/DEVELOPER_README.md
index d2f5d87a..b8c2417e 100644
--- a/DEVELOPER_README.md
+++ b/docs/DEVELOPER_README.md
@@ -4,41 +4,42 @@

 ### Definitions

-`Source`:
-A class that represents an audio or video source. For example, a camera (`CameraSource`), or a
-microphone (`AudioSource`).
-
-`Encoder`:
-A class that represents an audio or video encoders. Only Android MediaCodec API is used (
-`MediaCodecEncoder`).
-
-`Endpoint`:
-The last element of a live streaming pipeline. It is responsible for handling the frames after the
-encoder.
-The endpoint could be a remote server (RTMP, SRT,...) or a file (FLV, MPEG-TS,...).
-The main endpoint is `CompositeEndpoint` that is composed of a `Muxer` and a `Sink`.
-
-`Muxer`:
-A process that packs audio and video frames to a container (FLV, MPEG-TS, MP4,...).
-The `CompositeEndpoint` is composed of a `IMuxer`.
-
-`Sink`:
-A process that sends the container to a remote server (RTMP, SRT,...) or to a file.
-The `CompositeEndpoint` is composed of a `ISink`.
-
-`Streamer`:
-A class that represent a audio and/or video live streaming pipeline. It manages sources, encoders,
-muxers, endpoints,... and have lot of tools. They are the most important class for users.
-Unless explicitly stated, the `Endpoint` is inferred from the `MediaDescriptor` object thanks to
-the `DynamicEndpoint`.
-
-`Streamer element`:
-Could be a `Source`, `Encoder`, `Muxer`, or `Endpoint`. They implement the `Streamable` and they
-might have a public interface to access specific info.
-
-`Info`:
-A class that provides a set of methods to help to `streamer` configuration such as supported
-resolutions,... It comes with an instantiated `Streamer` object:
+* `Source`:
+  A class that represents an audio or video source. For example, a camera (`CameraSource`) or a
+  microphone (`AudioSource`).
+
+* `Encoder`:
+  A class that represents an audio or video encoder. Only the Android MediaCodec API is
+  used (`MediaCodecEncoder`).
+
+* `Endpoint`:
+  The last element of a live streaming pipeline. It is responsible for handling the frames after the
+  encoder.
+  The endpoint could be a remote server (RTMP, SRT,...) or a file (FLV, MPEG-TS,...).
+  The main endpoint is `CompositeEndpoint` that is composed of a `Muxer` and a `Sink`.
+
+* `Muxer`:
+  A process that packs audio and video frames to a container (FLV, MPEG-TS, MP4,...).
+  The `CompositeEndpoint` is composed of an `IMuxer`.
+
+* `Sink`:
+  A process that sends the container to a remote server (RTMP, SRT,...) or to a file.
+  The `CompositeEndpoint` is composed of an `ISink`.
+
+* `Streamer`:
+  A class that represents an audio and/or video live streaming pipeline. It manages sources,
+  encoders, muxers, endpoints,... and has a lot of tools. Streamers are the most important classes
+  for users.
+  Unless explicitly stated, the `Endpoint` is inferred from the `MediaDescriptor` object thanks to
+  the `DynamicEndpoint`.
+
+* `Streamer element`:
+  Could be a `Source`, `Encoder`, `Muxer`, or `Endpoint`. They implement the `Streamable` interface
+  and they might have a public interface to access specific info.
+
+* `Info`:
+  A class that provides a set of methods to help with `Streamer` configuration, such as supported
+  resolutions,... It comes with an instantiated `Streamer` object:

 ```kotlin
 val info = streamer.getInfo(MediaDescriptor(`media uri`))
 ```

@@ -48,12 +49,12 @@
 They might be different for each `Streamer` object. For example, a `FlvStreamer` won't have the
 same `Info` object as a `TsStreamer` object because FLV does not support a wide range of codecs,
 audio sample rate,...

-`Settings`:
-Each streamer elements have a public interface that allows go have access to specific
-information or configuration.
-For example, the `VideoEncoder` object has a `bitrate` property that allows to get and set the
-current video bitrate.
-Example:
+* Specific configuration:
+  Each streamer element has a public interface that allows you to access specific
+  information or configuration.
+  For example, the `VideoEncoder` object has a `bitrate` property that allows you to get and set
+  the current video bitrate.
+  Example:

 ```kotlin
 // Get video bitrate
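+// A hedged sketch of how this example could continue: `videoEncoder` is an assumed accessor here
+// (mirroring the `videoSource` accessor used in the README); check the API guide for the real name.
+val bitrate = streamer.videoEncoder?.bitrate
+
+// Set the video bitrate on the fly
+streamer.videoEncoder?.bitrate = 2_000_000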