Hi! Is it possible to configure your library to use Android MediaProjection to stream the screen? So the source would not be the cameras, but the surface from MediaProjection? BR,
-
Yes, you can use RtmpDisplay or RtspDisplay, depending on the desired stream protocol.
-
Thank you very much for your fast reply!
-
I am able to stream from the camera with audio, and I am able to stream the screen with audio, but is it possible to stream from the camera and the screen simultaneously? Of course, with audio disabled for one of the sources. When I try it, it crashes.
-
You can tune the video quality using different resolution, bitrate and dpi values in the prepareVideo method. Also, this stream mode has a limitation related to the MediaProjection API: MediaProjection only renders (produces a frame) when there is a new image on the screen.
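As a rough sketch of tuning those parameters (assuming an `RtmpDisplay` instance that has already received the MediaProjection permission result; the exact `prepareVideo` signature may differ between library versions):

```kotlin
// Illustrative values only; raise resolution/bitrate/dpi for better quality.
// Display-mode prepareVideo in this library takes, roughly:
// prepareVideo(width, height, fps, bitrate, rotation, dpi)
val ok = rtmpDisplay.prepareVideo(
    1280, 720,      // resolution of the virtual display
    30,             // fps
    2_500 * 1024,   // bitrate in bits per second (~2.5 Mbps)
    0,              // rotation
    320             // dpi of the virtual display
)
if (ok && rtmpDisplay.prepareAudio()) {
  rtmpDisplay.startStream("rtmp://example.com/live/screenKey") // hypothetical endpoint
}
```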
-
I forgot to mention it. For now, the iOS version is under development and not all features are implemented yet. Currently it only supports RTSP streaming with camera and microphone, and it isn't well tested.
-
Ok, I understand, but maybe you already know whether it is possible on iOS to stream from the camera and stream the screen from a background process, just like in Android (a service)? As far as I know it is not possible from the background, but e.g. the Zoom app does it :)
-
I don't know anything about background jobs in iOS, but I think it is possible. I'm sure that screen capture is possible, but I need to read about background job limitations in iOS.
-
Tell me one more thing about using MediaProjection in Android - I am able to stream from the camera with audio, and I am able to stream the screen with audio, but is it possible to stream from the camera and the screen simultaneously? Of course, with audio disabled for one of the sources. When I try it, it crashes.
-
To fully disable audio you need to skip the prepareAudio method; otherwise a microphone instance will be created.
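Following that advice, a minimal video-only display stream might look like this (a sketch; `rtmpDisplay` and the endpoint are assumed, and parameter values are illustrative):

```kotlin
// Skip prepareAudio() entirely so no microphone instance is created and a
// second stream (e.g. the camera) can own the microphone instead.
if (rtmpDisplay.prepareVideo(1280, 720, 30, 2_500 * 1024, 0, 320)) {
  rtmpDisplay.startStream("rtmp://example.com/live/screenKey") // hypothetical endpoint
}
```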
-
Ok, it seems to work almost fine - I'm able to stream the screen with audio and the camera video-only simultaneously from one device, but the camera stream only works properly for a while and then crashes. I have trace info (not all of it is included); there are internal errors, but maybe you will have some idea of what is going wrong:
-
Can you show me a full logcat of the VideoEncoder class?
-
It's not deterministic where the error appears. After a few tries I've got a different error from the screen stream:
What is consistent is that after some time one of the streams crashes.
-
We can ignore this because it only indicates that the server closed the connection and a packet failed to send. It is not an error.
I also want the logs from before the crash, to know whether the encoder reports any error before that.
-
There is nothing about VideoEncoder before or after the crash, but I have the full log, with no filters. Everything starts with
and here are the logs before and after.
And here are the logs about the crash:
-
@pedroSG94 Have you had time to look at these logs?
-
Yes, I'm actually working on a fix related to reloading the video encoder. I will upload this fix to a branch today and share it with you, to see if it solves your case.
-
Try this branch and let me know if everything is working fine:
-
I've tried with the reloadcodec branch, but the result is the same - after some time the camera stream crashes. I have the full logs. Of course, if you need more logs, I can add logging to the source code to provide more data:
-
I can see in the logcat that camera2 closes itself, but I can't see the preceding error reported by the VideoEncoder class. Also, I can't see any logs from Camera2ApiManager, where this camera error should be notified.
-
I don't use camera2 here - I use RtmpCamera1. I have no logs for Camera2ApiManager. I've tried filtering on only this tag for the whole session (from app start to the crash) and there is nothing about Camera2ApiManager. I am using three different devices, e.g. a Xiaomi Mi Note 10 Lite with Android 11. Yes, of course I will check these scenarios. It would be very valuable if you could check it too - I'm even using your example, displayexample, where in the DisplayService class I edit the startStreamRtp method: after displayBase?.startStream(endpoint) I just start the camera (RtmpCamera1), that's all. Of course, I use different endpoints for the screen and camera streams.
-
Ah, ok. I didn't understand you. I assumed that you were using camera2 because you have a few device logs referring to Camera2Client. Can you share the code of your DisplayService class? If I understand you correctly, the crash is expected, because you need to initialize the RtmpCamera1 class and prepare video/audio, so I need the code to rule that out.
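For reference, a hypothetical sketch of the setup being discussed: a service that streams the screen via RtmpDisplay and the camera via RtmpCamera1 to two different endpoints. Names follow the library's displayexample; constructors and prepare signatures may differ between library versions, and `connectChecker` is an assumed ConnectCheckerRtmp implementation defined elsewhere:

```kotlin
import android.app.Service
import android.content.Intent
import android.os.IBinder
import com.pedro.rtplibrary.rtmp.RtmpCamera1
import com.pedro.rtplibrary.rtmp.RtmpDisplay

class DisplayService : Service() {
  private var displayBase: RtmpDisplay? = null
  private var rtmpCamera1: RtmpCamera1? = null

  override fun onCreate() {
    super.onCreate()
    // connectChecker: assumed ConnectCheckerRtmp implementation (not shown).
    displayBase = RtmpDisplay(baseContext, true, connectChecker)
    // Initialize the camera here so it is ready before startStreamRtp runs.
    rtmpCamera1 = RtmpCamera1(baseContext, connectChecker)
  }

  private fun startStreamRtp(screenEndpoint: String, cameraEndpoint: String) {
    // Screen stream: video + audio (this source owns the microphone).
    if (displayBase?.prepareVideo(1280, 720, 30, 1_200 * 1024, 0, 320) == true &&
        displayBase?.prepareAudio() == true) {
      displayBase?.startStream(screenEndpoint)
    }
    // Camera stream: video only, so prepareAudio is skipped here.
    if (rtmpCamera1?.prepareVideo() == true) {
      rtmpCamera1?.startStream(cameraEndpoint)
    }
  }

  override fun onBind(intent: Intent?): IBinder? = null
}
```

The key point, per the comment above, is that both prepare calls must succeed before startStream, and only one of the two streams prepares audio.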
-
Sure. I've made some very simple changes before sending you the code - just to make my changes more readable - and it seems to have started working properly (without crashing) :) I will test it more later. Changed code:
Before, only the startStreamRtp method was changed, to this:
Does it matter where those init and stop-stream calls are placed?
-
@pedroSG94 have you looked at the above? What could the difference be?
-
I thought that everything was working fine, so I was waiting for your next response. The difference could be that the service takes time to initialize, and maybe when you call startStreamRtp the RtmpCamera1 is still null.
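One defensive way to handle that race would be to retry until the service has finished initializing, instead of crashing on a null camera. A sketch, where `rtmpCamera1`, `handler`, `TAG` and the endpoint handling are assumed fields of the service:

```kotlin
private fun startCameraStream(endpoint: String) {
  val camera = rtmpCamera1
  if (camera == null) {
    // The service has not finished onCreate yet; retry shortly.
    Log.w(TAG, "RtmpCamera1 is still null, retrying in 200 ms")
    handler.postDelayed({ startCameraStream(endpoint) }, 200)
    return
  }
  if (camera.prepareVideo()) camera.startStream(endpoint)
}
```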
-
Ok, I'm testing it intensively today and it seems to work fine. I will continue and let you know the result.
-
The quality of Display mode depends on dpi, resolution and bitrate (I'm using low dpi, resolution and bitrate values in the example code to be sure that all devices support them).
-
Ok, I see now - these params can be passed to the prepareVideo method. Thanks.
-
I can't get the library working. Can you help me? I have a service that records the screen of the device, and I want to send the stream over RTMP to my Ant Media Server. Is this possible?
-
At what point could I use the library?
-
Hello,
-
Very good, it worked!
Yes, you can use RtmpDisplay or RtspDisplay, depending on the desired stream protocol.
You have an example here:
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/tree/master/app/src/main/java/com/pedro/rtpstreamer/displayexample