Documentation: glitch app link & video ingestion documentation #104

Merged · 10 commits · Jun 3, 2022
6 changes: 4 additions & 2 deletions content/docs/tutorial/app-with-webrtc/index.md
@@ -21,6 +21,8 @@ This tutorial will show you how to create a streamer client and viewer page.
We will create a web page for the streamer client to capture our webcam directly and send it to the Inlive encoder as a video source input once the user clicks the start button.
We will create a web page that can be used by live stream viewers to watch the live video stream.

The example code for this tutorial is available on our [simple livestream Glitch app](https://glitch.com/~inlive-live-stream-app) demo.
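
As a rough preview of the capture step described above, here is a minimal sketch of requesting the webcam stream. It assumes a hypothetical `<video id="local-video">` element for local preview; the complete streamer client is built step by step below and in the Glitch app.

```js
// Minimal sketch: ask the browser for camera and microphone access and
// preview it locally. The <video id="local-video"> element is assumed to
// exist in the page; the full streamer client is built in the sections below.
async function captureWebcam() {
  const mediaStream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  // Show the captured stream locally before sending it over WebRTC
  document.getElementById('local-video').srcObject = mediaStream;

  return mediaStream;
}
```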

## A. Requirement
Before coding your web app, you need to create an application key as described in our [getting started documentation](/docs/getting-started). Make sure you write down that key after you create it, because it will be used in the web app we are about to build.

@@ -145,7 +147,7 @@ async function startStream(){
```

### 4. Prepare the live stream
For now, we need you to call this `prepare` API endpoint before starting to initiate the WebRTC connection. This is to start your live stream session, and this is where the billing will start counting your live streaming duration. In the future, we will automate the preparation process so the preparation will start automatically once we receive your video ingestion. Let's create a function that will be used to call the `prepare` API endpoint:
For now, you need to call this `prepare` API endpoint before initiating the WebRTC connection. It starts your live stream session, and it is the point from which billing counts your live streaming duration. In the future, we will automate this step so preparation starts automatically once we receive your [video ingestion](/docs/video-ingestion/). Let's create a function to call the `prepare` API endpoint:

```js
async function prepareStream(id){
@@ -409,4 +411,4 @@ There are two options to play the video, and you can [read more detail here](/do
<video autoplay muted controls playsinline id="video"></video>
</body>
</html>
```
37 changes: 37 additions & 0 deletions content/docs/video-ingestion/index.md
@@ -0,0 +1,37 @@
---
date: 2022-05-31
lastmod: 2022-05-31
name: Video Ingestion
title: Video Ingestion
description: Inlive supports WebRTC for video ingestion. RTMP support is in development.
@ingrid011 this will be used as the SEO description meta tag, so consider replacing it with a summary of the content.

@tyohan Revising it to: "Inlive uses WebRTC, which supports video ingestion on all platforms including the web, has better latency than RTMP, and can be used to build your own OBS client."
That is 157 characters, still within the SEO description best practice of up to 160 characters. Kindly advise whether that is okay.

slug: video-ingestion
weight: 4000
menu:
docs_sidebar:
identifier: Video Ingestion
name: Video Ingestion
weight: 4000
---

# WebRTC vs RTMP Video Ingestion
When talking about video ingestion for live streaming, the most commonly used ingestion protocol is RTMP. It has been the default option for video ingestion on most live streaming platforms. The reason is that the common way to do live streaming these days is to use an OBS live streaming client, which can produce professional-quality live streaming video, and by default RTMP is the only way to ingest video with OBS.

## Why WebRTC?
WebRTC is a popular protocol for online communication apps like Google Meet or Zoom. It is a web standard API supported on all platforms: not only on all modern browsers but also on native platforms like Android and iOS. This is the main reason we chose WebRTC as the first video ingestion protocol for our live streaming APIs: it works everywhere, including the web. RTMP cannot be used on the web, and we see the web as really important when we are talking about applications.

## When to use WebRTC or RTMP?
Besides the fact that RTMP does not work on the web, another main difference between WebRTC and RTMP is the underlying network protocol. WebRTC is based on UDP by default, which means it favors latency over quality. WebRTC can use TCP as well, but that needs some extra configuration. Read more about why WebRTC works better over UDP than TCP in [this article](https://bloggeek.me/why-you-should-prefer-udp-over-tcp-for-your-webrtc-sessions/). RTMP is based on TCP, which means it favors quality over latency. With that, choosing between WebRTC and RTMP should depend on the use case you are developing.
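
For example, forcing a WebRTC session onto TCP usually means relaying the media through a TURN server that accepts TCP connections. Here is a minimal sketch of that configuration; the TURN URL and credentials below are placeholders, not Inlive endpoints.

```js
// Minimal sketch: restrict WebRTC media to a TURN relay reached over TCP.
// The TURN URL, username, and credential below are placeholders.
const peerConnection = new RTCPeerConnection({
  iceServers: [
    {
      urls: 'turn:turn.example.com:443?transport=tcp',
      username: 'user',
      credential: 'secret',
    },
  ],
  // 'relay' limits ICE candidates to the TURN relay, so media travels over
  // the TCP connection to the TURN server instead of a direct UDP path.
  iceTransportPolicy: 'relay',
});
```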

If you are building a live stream that requires high quality, for example a live event that needs broadcast-quality video such as a music concert or a conference, then RTMP is the better choice, because it can guarantee better quality than WebRTC. But if you do not need broadcast-quality video, WebRTC should be your choice because it gives you better latency, and latency matters for every live streaming use case.

## WebRTC on the web
As mentioned before, the main reason we chose WebRTC as our first option for live streaming applications is that it works everywhere, including the web. But why is the web important to us? Because, when it comes to developing an application, the web is the simplest way to let users use it: they can just open a URL and go live directly from their browser without installing anything.

You might be thinking that the web is not as capable as OBS as a streaming client that can filter or modify video on the fly before ingesting it to the streaming server. For example, you may want to stack multiple video layers: one is your camera capturing your face, and another is a screen capture of your screen. This is possible to build on the web, and the user does not need to install anything to use it. You only need your JavaScript skills to develop such a streaming client. You can capture each video frame with the [requestVideoFrameCallback()](https://web.dev/requestvideoframecallback-rvfc/) API, draw it onto a canvas, and capture the canvas as a video stream with [HTMLCanvasElement.captureStream()](https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/captureStream), which you can then send through WebRTC with our APIs. With this, you can build your own simple OBS client.
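
A minimal sketch of that compositing idea follows, assuming `cameraVideo` and `screenVideo` are `<video>` elements already playing streams from `getUserMedia()` and `getDisplayMedia()`; the element names and layout are illustrative.

```js
// Minimal sketch: draw a screen capture with a camera overlay onto a canvas,
// then capture the canvas as a MediaStream that can be sent over WebRTC.
// `screenVideo` and `cameraVideo` are assumed to be <video> elements that are
// already playing getDisplayMedia() and getUserMedia() streams.
const canvas = document.createElement('canvas');
canvas.width = 1280;
canvas.height = 720;
const ctx = canvas.getContext('2d');

function drawFrame() {
  // Screen capture as the background layer
  ctx.drawImage(screenVideo, 0, 0, canvas.width, canvas.height);
  // Camera feed as a small overlay in the bottom-right corner
  ctx.drawImage(cameraVideo, canvas.width - 320, canvas.height - 180, 320, 180);
  // Draw again whenever the camera delivers a new frame
  cameraVideo.requestVideoFrameCallback(drawFrame);
}
cameraVideo.requestVideoFrameCallback(drawFrame);

// A 30 fps MediaStream from the canvas; add its video track to an
// RTCPeerConnection to ingest the composited video through WebRTC.
const compositedStream = canvas.captureStream(30);
```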

Check our documentation about developing a broadcast client on the web and using it with our Inlive live streaming APIs.

## Summary
For now, we only support WebRTC for video ingestion. We have RTMP in development, but we cannot say when it will be available, because we are currently focused on live streaming use cases for mobile, which mostly prioritize latency over quality.

If you want to use RTMP for your use case and are willing to use our live streaming APIs for it, [please let us know](mailto:[email protected]) so we can prioritize it and tell you when RTMP becomes available for you.