VideoEncoder using inputSurfaceView to send to RTMPClient #1702
Hello, I did a demo implementation of the library working with DeepAR here:
It was working great, but now I'm facing an issue with the aspect ratio during streaming: the stream doesn't display correctly. Specifically, after DeepAR detects a face, the resulting stream appears to switch to a landscape orientation.
Can you share a screenshot?
mobile side
Hello,
I have been trying to send a video feed directly from DeepAR, but DeepAR allows only one mode at a time: rendering to a SurfaceView (renderToSurfaceView) or offscreen rendering, which gives me processed frames that don't work very well.
My primary goal is to feed the surface directly, as that would probably work best for my case, but the setInputSurface method that comes with the VideoEncoder doesn't work in my setup. I am using version 2.2.6 of RootEncoder. With a frame-by-frame approach I managed to convert the frames and send them through the VideoEncoder and the GetVideoData interface, but the performance is abysmal and the frames get distorted because of the unusual way DeepAR processes frames. Is there any way to directly use the Surface that DeepAR is rendering its frames on?
The current approach I experimented with is something like:
// renderer.renderImage(frame);
ByteBuffer buffer = frame.getPlanes()[0].getBuffer();
int[] byteArray = getArrayfromBytes(buffer);
if (byteArray.length > 0) {
    ByteBuffer yuvBuffer = YUVBufferExtractor.convertImageToYUV(frame);
    // byte[] yuvByteArr = YUVUtil.ARGBtoYUV420SemiPlanar(byteArray, frame.getWidth() + 48, frame.getHeight());
    byte[] yuvByteArr = yuvBuffer.array();
    // Note: a full YUV420 frame is width * height * 3 / 2 bytes, not width * height.
    Frame yuvFrame = new Frame(yuvByteArr, 0, frame.getWidth() * frame.getHeight());
    renderer.renderImageThroughFrame(yuvFrame, frame.getWidth(), frame.getHeight());
    videoEncoder.inputYUVData(yuvFrame);
}
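For reference, the commented-out ARGBtoYUV420SemiPlanar path in the snippet above can be reproduced in plain Java. Below is a minimal sketch of an ARGB-to-NV12 (YUV420 semi-planar) conversion, assuming packed 0xAARRGGBB input and even dimensions; the class and method names (ArgbToNv12.convert) are hypothetical and not RootEncoder APIs.

```java
// Hypothetical helper sketching an ARGB -> NV12 (YUV420 semi-planar) conversion.
// Assumes packed 0xAARRGGBB pixels and even width/height.
public class ArgbToNv12 {
    public static byte[] convert(int[] argb, int width, int height) {
        // NV12 layout: width*height luma bytes, then interleaved U/V at quarter resolution.
        byte[] yuv = new byte[width * height * 3 / 2];
        int uvIndex = width * height;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int p = argb[y * width + x];
                int r = (p >> 16) & 0xFF;
                int g = (p >> 8) & 0xFF;
                int b = p & 0xFF;
                // BT.601 integer approximation, studio-swing luma.
                int luma = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                yuv[y * width + x] = (byte) Math.max(0, Math.min(255, luma));
                // One chroma pair per 2x2 block, sampled at the top-left pixel.
                if ((y & 1) == 0 && (x & 1) == 0) {
                    int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    yuv[uvIndex++] = (byte) Math.max(0, Math.min(255, u));
                    yuv[uvIndex++] = (byte) Math.max(0, Math.min(255, v));
                }
            }
        }
        return yuv;
    }

    public static void main(String[] args) {
        // A 2x2 mid-gray image: expect Y=126, U=128, V=128.
        int[] gray = {0xFF808080, 0xFF808080, 0xFF808080, 0xFF808080};
        byte[] yuv = convert(gray, 2, 2);
        System.out.println("Y=" + (yuv[0] & 0xFF) + " U=" + (yuv[4] & 0xFF) + " V=" + (yuv[5] & 0xFF));
    }
}
```

If the distortion looks like a diagonal skew, row stride is a likely culprit: the `+ 48` in the commented-out line suggests DeepAR's frames have a row stride wider than the visible width, so copying rows without accounting for that padding will shear the image.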