
Multi encoder #100

Draft: wants to merge 5 commits into main

Conversation
cdiddy77 (Author)

Here is my first cut at the multi-encoder, along with a dual streamer that uses it to create both RTMP and file streams. Much of this code is bespoke to my application and will need subsequent modification.

Some of the code is generally useful and/or addresses bugs I found along the way.

.DS_Store
.vscode
cdiddy77 (Author)

Whoops, will fix this.


fun clone(): Frame {
    val newRawBuffer = rawBuffer.duplicate() // Creates a new buffer that shares the content of 'rawBuffer'.
    // Assuming the MediaFormat object can be shared. If not, you need to deep copy it as well.
cdiddy77 (Author)

Comments courtesy of ChatGPT; will remove.
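For reference, a completed clone() along these lines might look like the sketch below. The Frame field names (pts, dts, isKeyFrame, format) are assumptions for illustration, not taken from this diff:

// Sketch only: field names are assumed, not part of this PR.
fun clone(): Frame {
    // duplicate() shares the backing bytes but gives an independent
    // position/limit, which is enough when each consumer only reads.
    val newRawBuffer = rawBuffer.duplicate()
    return Frame(
        rawBuffer = newRawBuffer,
        pts = pts,
        dts = dts,
        isKeyFrame = isKeyFrame,
        format = format // shared reference; deep-copy it if a consumer mutates it
    )
}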

@@ -150,6 +151,11 @@ abstract class MediaCodecEncoder<T : Config>(
            Handler(handlerThread.looper)
        }
    }
    private fun releaseHandler() {
cdiddy77 (Author)

Currently I believe the MediaCodecEncoder leaks a thread every time the user hits start/stop. This code is probably not optimal; it may be better to reuse the same handler on subsequent starts.
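Concretely, the cleanup could look like this sketch, assuming handlerThread and handler refer to the objects created on start (names are illustrative):

// Sketch: release the looper thread so it can exit instead of leaking.
private fun releaseHandler() {
    // quitSafely() drains already-queued messages, then ends the looper,
    // letting the HandlerThread terminate on every start/stop cycle.
    handlerThread?.quitSafely()
    handlerThread = null
    handler = null
}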

Comment on lines +229 to +239
         if (isStopped) {
             return
         }
         isStopped = true
-        mediaCodec?.setCallback(null)
-        mediaCodec?.signalEndOfInputStream()
+        if (hasInputSurface) {
+            mediaCodec?.signalEndOfInputStream()
+        }
         mediaCodec?.flush()
         mediaCodec?.stop()
+        mediaCodec?.setCallback(null)
+        releaseHandler() // prevent thread leak
cdiddy77 (Author)

Per offline discussion, this code remedies a bug where stopStream always results in mediaCodec throwing an exception, either in setCallback (the codec must be stopped before the call) or in signalEndOfInputStream (not allowed for a non-surface encoder).

import io.github.thibaultbee.streampack.streamers.interfaces.settings.IVideoSettings
import java.util.concurrent.ExecutorService

data class MultiVideoEncoderTargetInfo(
cdiddy77 (Author)

Obviously this code is pretty similar to a lot of VideoMediaCodecEncoder.CodecSurface. Perhaps someday the two could be unified.

@@ -0,0 +1,2 @@
jdk:
cdiddy77 (Author)

Yes, I will remove this again.

Comment on lines +71 to +81
    private val rtmpMuxer: IMuxer = FlvMuxer(writeToFile = false)
    private val fileMuxer: IMuxer = MP4Muxer()

    private var rtmpAudioStreamId: Int? = null
    private var rtmpVideoStreamId: Int? = null
    private var fileAudioStreamId: Int? = null
    private var fileVideoStreamId: Int? = null

    // Keep video configuration separate for rtmp and file
    private var rtmpVideoConfig: VideoConfig? = null
    private var fileVideoConfig: VideoConfig? = null
cdiddy77 (Author)

Instead of having two of everything, all of these fields should be grouped into a single per-destination object, of which there are two instances.

If done right, this would eventually lead to a CameraMultiStreamer that could use any N streaming destinations; a sketch of that grouping follows.
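The names here are hypothetical, not part of this PR:

// Hypothetical per-destination bundle: one instance per output instead of
// duplicated rtmp/file fields.
data class StreamDestination(
    val muxer: IMuxer,
    var audioStreamId: Int? = null,
    var videoStreamId: Int? = null,
    var videoConfig: VideoConfig? = null
)

// A future CameraMultiStreamer could then hold a List<StreamDestination>, e.g.
// listOf(StreamDestination(FlvMuxer(writeToFile = false)),
//        StreamDestination(MP4Muxer()))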

Comment on lines +94 to +116
    private val audioEncoderListener = object : IEncoderListener {
        override fun onInputFrame(buffer: ByteBuffer): Frame {
            return audioSource.getFrame(buffer)
        }

        override fun onOutputFrame(frame: Frame) {
            val fileFrame = frame.clone()
            rtmpAudioStreamId?.let {
                try {
                    this@CameraDualLiveStreamer.rtmpMuxer.encode(frame, it)
                } catch (e: Exception) {
                    throw StreamPackError(e)
                }
            }
            fileAudioStreamId?.let {
                try {
                    this@CameraDualLiveStreamer.fileMuxer.encode(fileFrame, it)
                } catch (e: Exception) {
                    throw StreamPackError(e)
                }
            }
        }
    }
cdiddy77 (Author)

Per offline discussion, this construction implies that the audio encoding (and thus the audio settings) is shared between the different output destinations. That is something I can live with for my needs, since it allows a much simpler implementation.
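Generalized to N destinations, the fan-out above would look roughly like this sketch (StreamDestination is the hypothetical grouping sketched earlier, and the muxer encode signature is assumed from its use above):

override fun onOutputFrame(frame: Frame) {
    destinations.forEach { dest ->
        dest.audioStreamId?.let { streamId ->
            try {
                // Clone per destination so the muxers do not advance a
                // shared buffer position out from under each other.
                dest.muxer.encode(frame.clone(), streamId)
            } catch (e: Exception) {
                throw StreamPackError(e)
            }
        }
    }
}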

@@ -91,8 +91,10 @@ class FullFrameRect(var program: Texture2DProgram) {
     *
     * The appropriate EGL context must be current.
     */
-    fun changeProgram(program: Texture2DProgram) {
-        this.program.release()
+    fun changeProgram(program: Texture2DProgram, releasePrevious: Boolean = true) {
cdiddy77 (Author)

In my implementation, each encoding path has its own FullFrameRect (since the resolutions might differ), but they all share the same program, so I don't want them to release the program out from under each other.
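A usage sketch of the new flag; createProgram() is a hypothetical helper standing in for the real program construction:

// Two encoding paths own separate FullFrameRects sized for their
// respective resolutions but share one Texture2DProgram.
val sharedProgram = createProgram() // hypothetical helper
val rtmpRect = FullFrameRect(sharedProgram)
val fileRect = FullFrameRect(sharedProgram)

// Swapping the program on one rect must not release the instance the
// other rect still uses:
rtmpRect.changeProgram(createProgram(), releasePrevious = false)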

-class EglWindowSurface(private val surface: Surface, useHighBitDepth: Boolean = false) {
-    private var eglDisplay: EGLDisplay = EGL14.EGL_NO_DISPLAY
-    private var eglContext: EGLContext = EGL14.EGL_NO_CONTEXT
+class EglWindowSurface(
cdiddy77 (Author)

I broke EglWindowSurface and EglDisplayContext into two different classes because I need to create the texture outside the scope of any particular surface context.
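The shape of the split, as a sketch (member details are assumptions):

// Holds EGLDisplay + EGLContext independently of any window surface, so
// textures can be created before a surface exists.
class EglDisplayContext(useHighBitDepth: Boolean = false) {
    // eglDisplay / eglContext setup lives here
}

// Owns only the EGLSurface bound to `surface`, borrowing the shared context.
class EglWindowSurface(
    private val surface: Surface,
    private val displayContext: EglDisplayContext
) {
    // makeCurrent(), swapBuffers(), release() operate on displayContext
}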


@@ -25,6 +25,8 @@ import io.github.thibaultbee.streampack.data.AudioConfig
import io.github.thibaultbee.streampack.data.BitrateRegulatorConfig
import io.github.thibaultbee.streampack.data.VideoConfig
import io.github.thibaultbee.streampack.ext.rtmp.streamers.AudioOnlyRtmpLiveStreamer
//import io.github.thibaultbee.streampack.ext.rtmp.streamers.CameraRtmpLiveStreamer
import io.github.thibaultbee.streampack.ext.rtmp.streamers.CameraDualLiveStreamer
cdiddy77 (Author)

Hmm, weird. This would probably break the demo app; I should remove it.

@ThibaultBee (Owner)

As a lot of code has changed for v3, this won't be applicable anymore.

@ThibaultBee (Owner)

Multiple outputs are a feature of the version after v3 (probably v3.1). Work in progress is here, with the DualStreamer: https://github.com/ThibaultBee/StreamPack/tree/dev_v3_1
