feat: Android Camera2 rewrite #1674
Conversation
The native platforms will use the best / most accurate colorSpace by default anyways.
Device: Looks pretty good.
Some issues I found while testing:
Crash report from adb logcat (crashed after the phone screen was locked, I modified `console.log`, and unlocked the phone again):
The `console.log` for the device produces a never-ending log (only 3 fields, "videoWidth", "videoHeight" and "photoWidth", have different values). Here is a truncated version:
Something worth noting: the fps button was not visible for me. I modified the files in
@Angelk90 try running
@Angelk90 OpenJDK 20 doesn't support
Regarding this:
In my case it comes from here:

```kotlin
private fun getMaximumVideoSize(cameraId: String): Size? {
  if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
    val profiles = CamcorderProfile.getAll(cameraId, CamcorderProfile.QUALITY_HIGH)
    if (profiles != null) {
      val largestProfile = profiles.videoProfiles.maxBy { it.width * it.height }
      return Size(largestProfile.width, largestProfile.height)
    }
  }
  // ...
}
```
OK, it's a known bug in some Android 13 versions. The Flutter community has already fixed it for affected devices:
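If the crash is indeed `maxBy` throwing `NoSuchElementException` on an empty `videoProfiles` list (as reported on some Android 13 builds), the defensive fix is `maxByOrNull`. A self-contained sketch of that pattern, with plain data classes standing in for the framework's `EncoderProfiles.VideoProfile` and `android.util.Size`:

```kotlin
// Stand-ins for android.media.EncoderProfiles.VideoProfile and android.util.Size
data class VideoProfile(val width: Int, val height: Int)
data class Size(val width: Int, val height: Int)

fun maximumVideoSize(videoProfiles: List<VideoProfile>): Size? {
    // maxByOrNull returns null instead of throwing on an empty list
    val largest = videoProfiles.maxByOrNull { it.width * it.height } ?: return null
    return Size(largest.width, largest.height)
}
```

With this shape, an empty profile list simply falls through to the lower-quality fallback instead of crashing.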
@yegorius: What should I do then?
@ahmedu007: You changed these too, and nothing changed?
Hey good morning guys - yooo thanks so much for helping me test this!!!!
The logs & reports are really helpful, damn. ❤️ Did anyone try to test 60 FPS on Samsung? (by removing those 2 lines in
@Angelk90 maybe try increasing your Gradle heap size and updating Android Studio itself to the latest version.
motorola one (xt1941-4)
Does not work:
If I take a picture, even though I don't see anything in the preview, I see this: How it is saved:
@mrousavy: How do I increase the Gradle heap size? Android Studio Giraffe | 2022.3.1. I can get it to run because I start it from Android Studio.
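For the Gradle heap question: the daemon heap is usually raised via `org.gradle.jvmargs` in `android/gradle.properties` (the sizes below are just example values, not project requirements):

```properties
# Increase the Gradle daemon's JVM heap (example values)
org.gradle.jvmargs=-Xmx4g -XX:MaxMetaspaceSize=1g
```

After changing this, stop any running daemons (`./gradlew --stop`) so the new setting takes effect.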
Hey @Angelk90, remove these 2 lines, then change the arguments passed to the function below those lines. Or you can change the 1st line to
And that should build okay without the fps block. @mrousavy I modified the Kotlin file as you suggested, but it didn't change anything. The fps toggle button wasn't available until I removed the check.
@mrousavy: I tried it on an old Samsung tablet model. The zoom function seems reversed compared to how the tablet's default camera app behaves. @ahmedu007: Can you check whether zoom is also reversed for you on the S23 Ultra?
@Angelk90 zoom works as expected. The only thing I noticed was the photo showing a 90-degree rotated preview which chops off everything. Also, the app doesn't request storage permissions, so it can't save the media to the device.
@ahmedu007: In my case it instead asked me for permission to save the photo. P.S.
Tested on Pixel 7 - Android 13

Device Info
Back
Front

✅ Play around with minimizing the app, switching to other Cameras, etc. to see if the Camera can handle those interruptions and restart again without black screens

Issues found:

Other notes:

Overall this feels really good. The camera preview speed is amazing!! 🙌👏👌
Thank you @thorbenprimke this is really insightful feedback!! ❤️
Good point, will have to fix orientation here.
Interesting, this does use the Flash APIs if I'm not mistaken - will double check what's going on here.
Amazing - great news! 🤩
Yep good point will actually rewrite that & also add orientation support to the Example app. Everybody seems to want that so I'll also test for those things in the example
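The orientation fix mentioned above usually comes down to the standard Camera2 JPEG orientation computation (the formula from the `CaptureRequest.JPEG_ORIENTATION` documentation). A pure-Kotlin sketch, where `sensorOrientation` would come from `CameraCharacteristics.SENSOR_ORIENTATION` and `deviceOrientationDegrees` from an `OrientationEventListener` (function name and signature are mine, not VisionCamera's):

```kotlin
// Compute the JPEG_ORIENTATION value for a still capture.
fun jpegOrientation(sensorOrientation: Int, deviceOrientationDegrees: Int, facingFront: Boolean): Int {
    // Round the device orientation to the nearest multiple of 90°
    var device = (deviceOrientationDegrees + 45) / 90 * 90
    // The front camera is mirrored, so the rotation direction flips
    if (facingFront) device = -device
    return (sensorOrientation + device + 360) % 360
}
```

For example, with the typical back-camera sensor orientation of 90° and the phone held in portrait (0°), the JPEG needs a 90° rotation.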
Hm, that's weird. In
Yea, those are Camera2 Extensions. Maybe the Vendor didn't implement them properly - do you have that in other apps?
This is not expected, and I noticed it too! I think I need to do some special handling for aspect ratios to make sure they all match. The preview size is different from the output photo size, so maybe I need to force an aspect ratio there... Will investigate, thanks!
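The aspect-ratio handling described above could look roughly like this (a sketch with a plain stand-in for `android.util.Size`, not VisionCamera's actual code): among the available preview sizes, pick the largest one whose aspect ratio matches the chosen photo size, so the preview shows the same crop as the captured photo.

```kotlin
import kotlin.math.abs

// Stand-in for android.util.Size
data class Size(val width: Int, val height: Int) {
    val aspectRatio: Double get() = width.toDouble() / height
}

// Largest candidate whose aspect ratio matches the photo size (within a tolerance)
fun matchingPreviewSize(candidates: List<Size>, photoSize: Size, tolerance: Double = 0.01): Size? =
    candidates
        .filter { abs(it.aspectRatio - photoSize.aspectRatio) < tolerance }
        .maxByOrNull { it.width * it.height }
```

A small tolerance is needed because integer size pairs rarely produce exactly equal ratios.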
@Angelk90 this is really weird. Can you share your full
@Angelk90 Could you test 3 things please?
Lol, for Flash you can't simply pass it in. You have to run a completely separate capture request before running your actual photo capture. This pre-capture sequence will enable AE/AF and Flash and allow the Camera time to adapt to the new lighting; then we run the capture. Insane amount of complexity, imo this should be part of the HAL.

Example code for Flash:

```kotlin
private fun lockFocus() {
  val previewRequestBuilder = previewRequestBuilder
  val captureSession = captureSession
  if (previewRequestBuilder != null && captureSession != null) {
    try {
      // Trigger auto-focus, then wait for the lock in captureCallback
      previewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START)
      captureState = STATE_WAITING_LOCK
      waitingFrames = 0
      captureSession.capture(previewRequestBuilder.build(), captureCallback, cameraHandler)
      previewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, null)
    } catch (e: Exception) {
      // ignore; the capture proceeds without a focus lock
    }
  }
}

private fun runPreCaptureSequence() {
  val previewRequestBuilder = previewRequestBuilder
  val captureSession = captureSession
  if (previewRequestBuilder != null && captureSession != null) {
    // Trigger the AE pre-capture metering sequence
    previewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START)
    captureState = STATE_WAITING_PRECAPTURE
    captureSession.capture(previewRequestBuilder.build(), captureCallback, cameraHandler)
    previewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, null)
    // Turn the torch on during pre-capture so AE can adapt to the flash lighting
    previewRequestBuilder.set(CaptureRequest.FLASH_MODE, when (this.flash) {
      CameraFlash.ON -> CaptureRequest.FLASH_MODE_TORCH
      else -> CaptureRequest.FLASH_MODE_OFF
    })
    captureSession.setRepeatingRequest(previewRequestBuilder.build(), captureCallback, cameraHandler)
  }
}

private fun captureStillPicture() {
  val captureSession = captureSession
  val cameraDevice = cameraDevice
  val imageReader = imageReader
  if (captureSession != null && cameraDevice != null && imageReader != null) {
    val captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
    captureBuilder.addTarget(imageReader.surface)
    captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE)
    captureBuilder.set(CaptureRequest.FLASH_MODE, when (flash) {
      CameraFlash.ON -> CaptureRequest.FLASH_MODE_SINGLE
      else -> CaptureRequest.FLASH_MODE_OFF
    })
    // Give the flash a moment to fire before capturing
    val delay = when (flash) {
      CameraFlash.ON -> 75L
      else -> 0L
    }
    cameraHandler.postDelayed({
      captureSession.capture(captureBuilder.build(), object : CameraCaptureSession.CaptureCallback() {
        override fun onCaptureCompleted(session: CameraCaptureSession, request: CaptureRequest, result: TotalCaptureResult) {
          unlockFocus()
        }
      }, cameraHandler)
    }, delay)
  }
}

private fun unlockFocus() {
  val previewRequestBuilder = previewRequestBuilder
  val captureSession = captureSession
  if (previewRequestBuilder != null && captureSession != null) {
    // Cancel the AF trigger and return to the normal preview state
    previewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL)
    captureSession.capture(previewRequestBuilder.build(), captureCallback, cameraHandler)
    captureState = STATE_PREVIEW
    previewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, null)
    previewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF)
    captureSession.setRepeatingRequest(previewRequestBuilder.build(), captureCallback, cameraHandler)
  }
}

private var captureState: Int = STATE_PREVIEW

private val captureCallback = object : CameraCaptureSession.CaptureCallback() {
  private fun process(result: CaptureResult) {
    when (captureState) {
      STATE_PREVIEW -> {
        // Deliver the captured JPEG bytes once an image is available
        val image = imageReader?.acquireLatestImage()
        if (image != null) {
          val buffer = image.planes[0].buffer
          val bytes = ByteArray(buffer.remaining())
          buffer.get(bytes)
          photoCallback?.invoke(bytes)
          photoCallback = null
          image.close()
        }
      }
      STATE_WAITING_LOCK -> {
        val afState = result.get(CaptureResult.CONTROL_AF_STATE)
        if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState || CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
          runPreCaptureSequence()
        } else if (null == afState || CaptureResult.CONTROL_AF_STATE_INACTIVE == afState) {
          captureStillPicture()
        } else if (waitingFrames >= 5) {
          // Don't wait forever for the AF lock
          waitingFrames = 0
          captureStillPicture()
        } else {
          waitingFrames++
        }
      }
      STATE_WAITING_PRECAPTURE -> {
        val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
        if (aeState == null ||
            aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
            aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
          captureState = STATE_WAITING_NON_PRECAPTURE
        }
      }
      STATE_WAITING_NON_PRECAPTURE -> {
        val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
        if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
          captureState = STATE_PICTURE_TAKEN
          captureStillPicture()
        }
      }
    }
  }

  override fun onCaptureCompleted(session: CameraCaptureSession, request: CaptureRequest, result: TotalCaptureResult) {
    if (!previewStarted) {
      onPreviewStarted()
      previewStarted = true
    }
    process(result)
  }

  override fun onCaptureProgressed(session: CameraCaptureSession, request: CaptureRequest, partialResult: CaptureResult) {
    process(partialResult)
  }
}
```
* Nuke CameraX
* fix: Run View Finder on UI Thread
* Open Camera, set up Threads
* fix init
* Mirror if needed
* Try PreviewView
* Use max resolution
* Add `hardwareLevel` property
* Check if output type is supported
* Replace `frameRateRanges` with `minFps` and `maxFps`
* Remove `isHighestPhotoQualitySupported`
* Remove `colorSpace` (the native platforms will use the best / most accurate colorSpace by default anyways)
* HDR
* Check from format
* fix
* Remove `supportsParallelVideoProcessing`
* Correctly return video/photo sizes on Android now. Finally
* Log all Device props
* Log if optimized usecase is used
* Cleanup
* Configure Camera Input only once
* Revert "Configure Camera Input only once" (reverts commit 0fd6c03)
* Extract Camera configuration
* Try to reconfigure all
* Hook based
* Properly set up `CameraSession`
* Delete unused
* fix: Fix recreate when outputs change
* Update NativePreviewView.kt
* Use callback for closing
* Catch CameraAccessException
* Finally got it stable
* Remove isMirrored
* Implement `takePhoto()`
* Add ExifInterface library
* Run findViewById on UI Thread
* Add Photo Output Surface to takePhoto
* Fix Video Stabilization Modes
* Optimize Imports
* More logs
* Update CameraSession.kt
* Close Image
* Use separate Executor in CameraQueue
* Delete hooks
* Use same Thread again
* If opened, call error
* Update CameraSession.kt
* Log HW level
* fix: Don't enable Stream Use Case if it's not 100% supported
* Move some stuff
* Cleanup PhotoOutputSynchronizer
* Try just open in suspend fun
* Some synchronization fixes
* fix logs
* Update CameraDevice+createCaptureSession.kt
* Update CameraDevice+createCaptureSession.kt
* fixes
* fix: Use Snapshot Template for speed capture prio
* Use PREVIEW template for repeating request
* Use `TEMPLATE_RECORD` if video use-case is attached
* Use `isRunning` flag
* Recreate session everytime on active/inactive
* Lazily get values in capture session
* Stability
* Rebuild session if outputs change
* Set `didOutputsChange` back to false
* Capture first in lock
* Try
* kinda fix it? idk
* fix: Keep Outputs
* Refactor into single method
* Update CameraView.kt
* Use Enums for type safety
* Implement Orientation (I think)
* Move RefCount management to Java (Frame)
* Don't crash when dropping a Frame
* Prefer Devices with higher max resolution
* Prefer multi-cams
* Use FastImage for Media Page
* Return orientation in takePhoto()
* Load orientation from EXIF Data
* Add `isMirrored` props and documentation for PhotoFile
* fix: Return `not-determined` on Android
* Update CameraViewModule.kt
* chore: Upgrade packages
* fix: Fix Metro Config
* Cleanup config
* Properly mirror Images on save
* Prepare MediaRecorder
* Start/Stop MediaRecorder
* Remove `takeSnapshot()` (it no longer works on Android and never worked on iOS; users could use useFrameProcessor to take a Snapshot)
* Use `MediaCodec`
* Move to `VideoRecording` class
* Cleanup Snapshot
* Create `SkiaPreviewView` hybrid class
* Create OpenGL context
* Create `SkiaPreviewView`
* Fix texture creation missing context
* Draw red frame
* Somehow get it working
* Add Skia CMake setup
* Start looping
* Init OpenGL
* Refactor into `SkiaRenderer`
* Cleanup PreviewSize
* Set up
* Only re-render UI if there is a new Frame
* Preview
* Fix init
* Try rendering Preview
* Update SkiaPreviewView.kt
* Log version
* Try using Skia (fail)
* Drawwwww!!!!!!!!!! 🎉
* Use Preview Size
* Clear first
* Refactor into SkiaRenderer
* Add `previewType: "none"` on iOS
* Simplify a lot
* Draw Camera? For some reason? I have no idea anymore
* Fix OpenGL errors
* Got it kinda working again?
* Actually draw Frame woah
* Clean up code
* Cleanup
* Update on main
* Synchronize render calls
* holy shit
* Update SkiaRenderer.cpp
* Update SkiaRenderer.cpp
* Refactor
* Update SkiaRenderer.cpp
* Check for `NO_INPUT_TEXTURE`^
* Post & Wait
* Set input size
* Add Video back again
* Allow session without preview
* Convert JPEG to byte[]
* feat: Use `ImageReader` and use YUV Image Buffers in Skia Context (mrousavy#1689)
* Try to pass YUV Buffers as Pixmaps
* Create pixmap!
* Clean up
* Render to preview
* Only render if we have an output surface
* Update SkiaRenderer.cpp
* Fix Y+U+V sampling code
* Cleanup
* Fix Semaphore 0
* Use 4:2:0 YUV again idk
* Update SkiaRenderer.h
* Set minSdk to 26
* Set surface
* Revert "Set minSdk to 26" (reverts commit c4085b7)
* Set previewType
* feat: Video Recording with Camera2 (mrousavy#1691)
* Rename
* Update CameraSession.kt
* Use `SurfaceHolder` instead of `SurfaceView` for output
* Update CameraOutputs.kt
* Update CameraSession.kt
* fix: Fix crash when Preview is null
* Check if snapshot capture is supported
* Update RecordingSession.kt
* S
* Use `MediaRecorder`
* Make audio optional
* Add Torch
* Output duration
* Update RecordingSession.kt
* Start RecordingSession
* logs
* More log
* Base for preparing pass-through Recording
* Use `ImageWriter` to append Images to the Recording Surface
* Stream PRIVATE GPU_SAMPLED_IMAGE Images
* Add flags
* Close session on stop
* Allow customizing `videoCodec` and `fileType`
* Enable Torch
* Fix Torch Mode
* Fix comparing outputs with hashCode
* Update CameraSession.kt
* Correctly pass along Frame Processor
* fix: Use AUDIO_BIT_RATE of 16 * 44,1Khz
* Use CAMCORDER instead of MIC microphone
* Use 1 channel
* fix: Use `Orientation`
* Add `native` PixelFormat
* Update iOS to latest Skia integration
* feat: Add `pixelFormat` property to Camera
* Catch error in configureSession
* Fix JPEG format
* Clean up best match finder
* Update CameraDeviceDetails.kt
* Clamp sizes by maximum CamcorderProfile size
* Remove `getAvailableVideoCodecs`
* chore: release 3.0.0-rc.5
* Use maximum video size of RECORD as default
* Update CameraDeviceDetails.kt
* Add a todo
* Add JSON device to issue report
* Prefer `full` devices and flash
* Lock to 30 FPS on Samsung
* Implement Zoom
* Refactor
* Format -> PixelFormat
* fix: Feat `pixelFormat` -> `pixelFormats`
* Update TROUBLESHOOTING.mdx
* Format
* fix: Implement `zoom` for Photo Capture
* fix: Don't run if `isActive` is `false`
* fix: Call `examplePlugin(frame)`
* fix: Fix Flash
* fix: Use `react-native-worklets-core`!
* fix: Fix import
What
Complete rewrite of the entire Android codebase from CameraX to Camera2, Android's lower-level camera library. This will give us more stability, more control, and more features for VisionCamera. (CameraX also uses Camera2 internally.)
I've spent the past 2 weeks with full-on focus on this PR, and after around 180 hours of pure coding, this is now finally in a testable state.
If you appreciate what I'm doing, please consider 💖 sponsoring me on GitHub 💖 :)
All changes:

* `CameraDevice.formats` API on Android, finally making it accurate. VideoSize and PhotoSize actually reflect the sizes you can record/shoot photos in.
* Removed `supportsParallelVideoProcessing`; we can now always do parallel video processing thanks to the Camera2 rewrite!!! (woohoo 🎉)
* Added the `CameraDevice.hardwareLevel` property
* `flash`, `videoStabilizationMode`, `qualityPrioritization`, `previewType`, ..
* Removed `CameraDeviceFormat.frameRateRanges` in favor of `minFps`/`maxFps`
* Removed `isHighestPhotoQualitySupported` (was inaccurate)
* Removed `colorSpace` (was inaccurate)
* Removed `getAvailableVideoCodecs()` (is now in Device)
* `videoStabilization` on Android 🎉
* `pauseRecording()`/`resumeRecording()` on Android 🎉
* `h264` or `h265` (HEVC)
* `.mp4` or `.mov` file
* `pixelFormat` for `Frame`
* `pixelFormat` as a Camera Prop
* `pixelFormats` on `CameraDevice.Format`
* `not-determined`/`granted` permissions on Android to make it on-par with iOS/documentation
* `orientation` and `isMirrored` in `PhotoFile` returned by `takePhoto()`

This also fixes a ton of bugs, such as:

* `CameraDevice`/`formats` API (videoWidth/videoHeight and FPS)

Test it
Small things like flash, orientation, etc. might still be off, but those should be minor fixes after some testing.
That's why I'd love to get some help here to test this PR thoroughly to see how it works on different devices!
To test it run:
In a separate Terminal, open Android LogCat to see all logs:
Let me know what happens when you open the app, and if everything works as expected.

What to test:
* `CameraSession.kt`: remove lines 371 and 372. This will try to use a higher FPS than 30. Samsung is known for blocking 60 FPS for third-party apps, so this might result in a blackscreen, but I want to find out if this might work on some newer Samsung devices. (You need to run `yarn android` again after this change.) cc @ahmedu007
* The `<Camera ..>` component in `CameraPage.tsx`: for example, try removing the `fps={fps}` prop to see if it's because of the FPS. Same for `hdr`, `format`, `video`, `frameProcessor`, etc. This just helps me nail down where a problem might be originating from.

As next steps I will do some minor code cleanup after this brutal rewrite and then make sure it also builds without RN Worklets and RN Skia. And maybe some polishing if we find some bugs.
If you find some bugs, please post enough reproduction steps either here or as a separate issue, and make sure to also add the device you're using by doing:
In the `CameraPage.tsx` component. I need to know what camera device is used for bugs.

Appreciate any help, thanks!!
Tested on
Related issues
pretty much every Android issue lol
.. maybe also (not tested):
* `SIGSEGV` on Android #1635
* `[capture/inactive-source]` The recording failed because the source becomes inactive and stops sending frames. One case is that if the camera is closed due to the lifecycle being stopped, the active recording will be finalized with this error, and the output will be generated, containing the frames produced before the camera closed. Attempting to start a new recording will be finalized immediately if the source remains inactive, and no output will be generated. #959