VideoEncoder using inputSurfaceView to send to RTMPClient #1702

Open
discovertalha opened this issue Jan 17, 2025 · 7 comments

Comments

@discovertalha

Hello,

I have been trying to send a video feed directly from DeepAR, but DeepAR allows only one mode at a time: renderToSurfaceView, or offScreenRendering, which gives me processed frames that don't work very well.

My primary goal is to feed the surface directly, since that will probably work best for my case, but the setInputSurface method that comes with the VideoEncoder doesn't work in my case. I am working with version 2.2.6 of RootEncoder. Using a frame-by-frame approach I managed to more or less convert the frames and send them through the VideoEncoder and the GetVideoData interface, but the performance is abysmal and the frames get distorted because of DeepAR's odd way of laying out frames. Is there any way we could directly use the Surface that DeepAR is rendering its frames on?

The approach I experimented with looks something like this:

public void frameAvailable(Image frame) {
    if (frame == null) {
        Log.e(TAG, "frameAvailable: Received null frame");
        return;
    }

    // renderer.renderImage(frame);
    ByteBuffer buffer = frame.getPlanes()[0].getBuffer();
    int[] byteArray = getArrayfromBytes(buffer);
    if (byteArray.length > 0) {
        // Convert the RGBA Image to a YUV buffer the encoder can consume.
        ByteBuffer yuvBuffer = YUVBufferExtractor.convertImageToYUV(frame);
        // byte[] yuvByteArr = YUVUtil.ARGBtoYUV420SemiPlanar(byteArray, frame.getWidth() + 48, frame.getHeight());
        byte[] yuvByteArr = yuvBuffer.array();
        // note: a full YUV420 frame is width * height * 3 / 2 bytes, not width * height
        Frame yuvFrame = new Frame(yuvByteArr, 0, frame.getWidth() * frame.getHeight());
        renderer.renderImageThroughFrame(yuvFrame, frame.getWidth(), frame.getHeight());
        videoEncoder.inputYUVData(yuvFrame);
    }
}
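
The "+48" in the commented-out ARGBtoYUV420SemiPlanar call hints at a likely cause of the distortion: an Android Image plane is usually row-padded, so its rowStride is larger than width * 4, and copying the buffer as if it were tightly packed shears every row. A minimal stride-aware copy might look like this (RgbaReader and toPackedRgba are hypothetical names; this assumes DeepAR delivers an RGBA_8888 Image):

import android.media.Image;
import java.nio.ByteBuffer;

public final class RgbaReader {

    // Copies the RGBA pixels into a tightly packed array, dropping per-row padding.
    public static byte[] toPackedRgba(Image image) {
        Image.Plane plane = image.getPlanes()[0];
        ByteBuffer src = plane.getBuffer();
        int width = image.getWidth();
        int height = image.getHeight();
        int rowStride = plane.getRowStride();     // often larger than width * pixelStride
        int pixelStride = plane.getPixelStride(); // 4 for RGBA_8888

        byte[] packed = new byte[width * height * pixelStride];
        byte[] row = new byte[rowStride];
        for (int y = 0; y < height; y++) {
            src.position(y * rowStride);
            // the last row may omit the trailing padding, so clamp the read
            src.get(row, 0, Math.min(rowStride, src.remaining()));
            System.arraycopy(row, 0, packed, y * width * pixelStride, width * pixelStride);
        }
        return packed;
    }
}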

@pedroSG94
Owner

Hello,

I made a demo implementation of the library working with DeepAR here:
https://github.com/pedroSG94/demo-deepAR-streaming
You only need to focus on these classes:
https://github.com/pedroSG94/demo-deepAR-streaming/tree/feature/fixing-streaming/app/src/main/java/com/example/streaming_deepar/streaming
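
The core of the surface route is to hand DeepAR a Surface that is backed by the encoder, so frames go GPU-to-encoder with no CPU-side YUV copies. This is not the demo's exact code, just a rough sketch with plain MediaCodec, assuming DeepAR's setRenderSurface(Surface, width, height) and a 720x1280 portrait size:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;
import ai.deepar.ar.DeepAR; // package name per the DeepAR Android SDK

public final class SurfaceRoute {

    // Creates an H.264 encoder in surface mode and points DeepAR at its input Surface.
    public static MediaCodec startSurfaceEncoding(DeepAR deepAR) throws IOException {
        int width = 720, height = 1280;
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_500_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface(); // valid only between configure() and start()
        encoder.start();

        // DeepAR renders straight into the encoder; no frameAvailable() copies needed.
        deepAR.setRenderSurface(inputSurface, width, height);
        return encoder;
    }
}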

@discovertalha
Author

It was working great, but now I'm facing an aspect-ratio issue during streaming: the stream doesn't display correctly. Specifically, after DeepAR detects the face, the stream seems to switch to a landscape orientation.

@pedroSG94
Owner

Can you share a screenshot?

@discovertalha
Author

mobile side

Image

@discovertalha
Author

Image

@discovertalha
Author

discovertalha commented Jan 21, 2025

When I set

rtmpStream.getGlInterface().setAutoHandleOrientation(true);

the stream-side aspect ratio works:

Image
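
For anyone hitting the same problem, a minimal sketch of where that call fits in the setup (prepareVideo/prepareAudio signatures vary across RootEncoder versions, and the URL and sizes below are placeholders):

// rtmpStream is the existing com.pedro.library stream instance
if (rtmpStream.prepareVideo(720, 1280, 2_500_000)
        && rtmpStream.prepareAudio(128_000, 44_100, true)) {
    // enable before startStream so the GL interface rotates the output
    // to match the device orientation
    rtmpStream.getGlInterface().setAutoHandleOrientation(true);
    rtmpStream.startStream("rtmp://example.com/live/streamKey"); // placeholder URL
}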

@discovertalha
Author

discovertalha commented Jan 21, 2025

rtmpStream.getGlInterface().setAutoHandleOrientation(true);

mobile side

Image

Image
