react-native-vision-camera/android/src/main/java/com/mrousavy/camera/CameraView+TakePhoto.kt
Marc Rousavy 87e6bb710e
feat: Frame Processors for Android (#196)
* Create android gradle build setup

* Fix `prefab` config

* Add `pickFirst **/*.so` to example build.gradle

* fix REA path

* cache gradle builds

* Update validate-android.yml

* Create Native Proxy

* Copy REA header

* implement ctor

* Rename CameraViewModule -> FrameProcessorRuntimeManager

* init FrameProcessorRuntimeManager

* fix name

* Update FrameProcessorRuntimeManager.h

* format

* Create AndroidErrorHandler.h

* Initialize runtime and install JSI funcs

* Update FrameProcessorRuntimeManager.cpp

* Update CameraViewModule.kt

* Make CameraView hybrid C++ class to find view & set frame processor

* Update FrameProcessorRuntimeManager.cpp

* pass function by rvalue

* pass by const &&

* extract hermes and JSC REA

* pass `FOR_HERMES`

* correctly prepare JSC and Hermes

* Update CMakeLists.txt

* add missing hermes include

* clean up imports

* Create JImageProxy.h

* pass ImageProxy to JNI as `jobject`

* try use `JImageProxy` C++ wrapper type

* Use `local_ref<JImageProxy>`

* Create `JImageProxyHostObject` for JSI interop

* debug call to frame processor

* Unset frame processor

* Fix CameraView native part not being registered

* close image

* use `jobject` instead of `JImageProxy` for now :(

* fix hermes build error

* Set enable FP callback

* fix JNI call

* Update CameraView.cpp

* Get Format

* Create plugin abstract

* Make `FrameProcessorPlugin` a hybrid object

* Register plugin CXX

* Call `registerPlugin`

* Catch

* remove JSI

* Create sample QR code plugin

* register plugins

* Fix missing JNI binding

* Add `mHybridData`

* prefix name with two underscores (`__`)

* Update CameraPage.tsx

* wrap `ImageProxy` in host object

* Use `jobject` for HO box

* Update JImageProxy.h

* reinterpret jobject

* Try using `JImageProxy` instead of `jobject`

* Update JImageProxy.h

* get bytes per row and plane count

* Update CameraView.cpp

* Return base

* add some docs and JNI JSI conversion

* indent

* Convert JSI value to JNI jobject

* using namespace facebook

* Try using class

* Use plain old Object[]

* Try convert JNI -> JSI

* fix decl

* fix bool init

* Correctly link folly

* Update CMakeLists.txt

* Convert Map to Object

* Use folly for Map and Array

* Return `alias_ref<jobject>` instead of raw `jobject`

* fix JNI <-> JSI conversion

* Update JSIJNIConversion.cpp

* Log parameters

* fix params index offset

* add more test cases

* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx

* fix types

* Rename to example plugin

* remove support for hashmap

* Try use HashMap iterable fbjni binding

* try using JReadableArray/JReadableMap

* Fix list return values

* Update JSIJNIConversion.cpp

* Update JSIJNIConversion.cpp

* (iOS) Rename ObjC QR Code Plugin to Example Plugin

* Rename Swift plugin QR -> Example

* Update ExamplePluginSwift.swift

* Fix Map/Dictionary logging format

* Update ExampleFrameProcessorPlugin.m

* Reconfigure session if frame processor changed

* Handle use-cases via `maxUseCasesCount`

* Don't crash app on `configureSession` error

* Document "use-cases"

* Update DEVICES.mdx

* fix merge

* Make `const &`

* iOS: Automatically enable `video` if a `frameProcessor` is set

* Update CameraView.cpp

* fix docs

* Automatically fallback to snapshot capture if `supportsParallelVideoProcessing` is false.

* Fix lookup

* Update CameraView.kt

* Implement `frameProcessorFps`

* Finalize Frame Processor Plugin Hybrid

* Update CameraViewModule.kt

* Support `flash` on `takeSnapshot()`

* Update docs

* Add docs

* Update CameraPage.tsx

* Attribute NonNull

* remove unused imports

* Add Android docs for Frame Processors

* Make JNI HashMap <-> JSI Object conversion faster

directly access `toHashMap` instead of going through java

* add todo

* Always run `prepareJSC` and `prepareHermes`

* switch jsc and hermes

* Specify ndkVersion `21.4.7075529`

* Update gradle.properties

* Update gradle.properties

* Create .aar

* Correctly prepare android package

* Update package.json

* Update package.json

* remove `prefab` build feature

* split

* Add docs for registering the FP plugin

* Add step for dep

* Update CaptureButton.tsx

* Move to `reanimated-headers/`

* Exclude reanimated-headers from cpplint

* disable `build/include_order` rule

* cpplint fixes

* perf: Make `JSIJNIConversion` a `namespace` instead of `class`

* Ignore runtime/references for `convert` funcs

* Build Android .aar in CI

* Run android build script only on `prepack`

* Update package.json

* Update package.json

* Update build-android-npm-package.sh

* Move to `yarn build`

* Also install node_modules in example step

* Update validate-android.yml

* sort imports

* fix torch

* Run ImageAnalysis on `FrameProcessorThread`

* Update Errors.kt

* Add clean android script

* Upgrade reanimated to 2.3.0-alpha.1

* Revert "Upgrade reanimated to 2.3.0-alpha.1"

This reverts commit c1d3bed5e03728d0b5e335a359524ff4f56f5035.

* ⚠️ TEMP FIX: hotfix reanimated build.gradle

* Update CameraView+TakeSnapshot.kt

* ⚠️ TEMP FIX: Disable ktlint action for now

* Update clean.sh

* Set max heap size to 4g

* rebuild lockfiles

* Update Podfile.lock

* rename

* Build lib .aar before example/
2021-06-27 12:37:54 +02:00

115 lines · 4.0 KiB · Kotlin

package com.mrousavy.camera

import android.annotation.SuppressLint
import android.hardware.camera2.*
import android.util.Log
import androidx.camera.camera2.interop.Camera2CameraInfo
import androidx.camera.core.ImageCapture
import androidx.camera.core.ImageProxy
import androidx.exifinterface.media.ExifInterface
import com.facebook.react.bridge.Arguments
import com.facebook.react.bridge.ReadableMap
import com.facebook.react.bridge.WritableMap
import com.mrousavy.camera.utils.*
import kotlinx.coroutines.*
import java.io.File
import kotlin.system.measureTimeMillis

@SuppressLint("UnsafeOptInUsageError")
suspend fun CameraView.takePhoto(options: ReadableMap): WritableMap = coroutineScope {
  if (fallbackToSnapshot) {
    Log.i(CameraView.TAG, "takePhoto() called, but falling back to Snapshot because 1 use-case is already occupied.")
    return@coroutineScope takeSnapshot(options)
  }

  val startFunc = System.nanoTime()
  Log.i(CameraView.TAG, "takePhoto() called")
  if (imageCapture == null) {
    if (photo == true) {
      throw CameraNotReadyError()
    } else {
      throw PhotoNotEnabledError()
    }
  }

  if (options.hasKey("flash")) {
    val flashMode = options.getString("flash")
    imageCapture!!.flashMode = when (flashMode) {
      "on" -> ImageCapture.FLASH_MODE_ON
      "off" -> ImageCapture.FLASH_MODE_OFF
      "auto" -> ImageCapture.FLASH_MODE_AUTO
      else -> throw InvalidTypeScriptUnionError("flash", flashMode ?: "(null)")
    }
  }
  // All those options are not yet implemented - see https://github.com/mrousavy/react-native-vision-camera/issues/75
  if (options.hasKey("photoCodec")) {
    // TODO photoCodec
  }
  if (options.hasKey("qualityPrioritization")) {
    // TODO qualityPrioritization
  }
  if (options.hasKey("enableAutoRedEyeReduction")) {
    // TODO enableAutoRedEyeReduction
  }
  if (options.hasKey("enableDualCameraFusion")) {
    // TODO enableDualCameraFusion
  }
  if (options.hasKey("enableAutoStabilization")) {
    // TODO enableAutoStabilization
  }
  if (options.hasKey("enableAutoDistortionCorrection")) {
    // TODO enableAutoDistortionCorrection
  }
  val skipMetadata = if (options.hasKey("skipMetadata")) options.getBoolean("skipMetadata") else false

  val camera2Info = Camera2CameraInfo.from(camera!!.cameraInfo)
  val lensFacing = camera2Info.getCameraCharacteristic(CameraCharacteristics.LENS_FACING)

  val results = awaitAll(
    async(coroutineContext) {
      Log.d(CameraView.TAG, "Taking picture...")
      val startCapture = System.nanoTime()
      val pic = imageCapture!!.takePicture(takePhotoExecutor)
      val endCapture = System.nanoTime()
      Log.i(CameraView.TAG_PERF, "Finished image capture in ${(endCapture - startCapture) / 1_000_000}ms")
      pic
    },
    async(Dispatchers.IO) {
      Log.d(CameraView.TAG, "Creating temp file...")
      File.createTempFile("mrousavy", ".jpg", context.cacheDir).apply { deleteOnExit() }
    }
  )
  val photo = results.first { it is ImageProxy } as ImageProxy
  val file = results.first { it is File } as File

  val exif: ExifInterface?
  @Suppress("BlockingMethodInNonBlockingContext")
  withContext(Dispatchers.IO) {
    Log.d(CameraView.TAG, "Saving picture to ${file.absolutePath}...")
    val milliseconds = measureTimeMillis {
      val flipHorizontally = lensFacing == CameraCharacteristics.LENS_FACING_FRONT
      photo.save(file, flipHorizontally)
    }
    Log.i(CameraView.TAG_PERF, "Finished image saving in ${milliseconds}ms")
    // TODO: Read Exif from existing in-memory photo buffer instead of file?
    exif = if (skipMetadata) null else ExifInterface(file)
  }

  val map = Arguments.createMap()
  map.putString("path", file.absolutePath)
  map.putInt("width", photo.width)
  map.putInt("height", photo.height)
  map.putBoolean("isRawPhoto", photo.isRaw)

  val metadata = exif?.buildMetadataMap()
  map.putMap("metadata", metadata)

  photo.close()

  Log.d(CameraView.TAG, "Finished taking photo!")

  val endFunc = System.nanoTime()
  Log.i(CameraView.TAG_PERF, "Finished function execution in ${(endFunc - startFunc) / 1_000_000}ms")
  return@coroutineScope map
}
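
For context, the suspend extension above still has to be bridged to JavaScript through a React Native module method that resolves a Promise with the returned WritableMap. The following is only a minimal sketch of what such a bridge could look like; the module class, the coroutine scope, the error code string, and the findCameraView() helper are assumptions for illustration and are not the library's actual CameraViewModule code.

import com.facebook.react.bridge.Promise
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.bridge.ReactContextBaseJavaModule
import com.facebook.react.bridge.ReactMethod
import com.facebook.react.bridge.ReadableMap
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch

// Hypothetical module sketch (not VisionCamera's actual CameraViewModule):
// resolves the CameraView by its React tag, calls the takePhoto() extension
// on a coroutine, and settles the JS Promise with the result or the error.
class CameraModuleSketch(reactContext: ReactApplicationContext) : ReactContextBaseJavaModule(reactContext) {
  private val scope = CoroutineScope(Dispatchers.Default)

  override fun getName() = "CameraModuleSketch"

  @ReactMethod
  fun takePhoto(viewTag: Int, options: ReadableMap, promise: Promise) {
    scope.launch {
      try {
        // findCameraView() is a hypothetical helper that would look up the
        // CameraView instance from its React view tag (e.g. via the UIManager).
        val view = findCameraView(viewTag)
        promise.resolve(view.takePhoto(options))
      } catch (e: Throwable) {
        promise.reject("capture/photo-failed", e)
      }
    }
  }

  private fun findCameraView(viewTag: Int): CameraView {
    TODO("resolve the view from the UIManager / NativeViewHierarchyManager")
  }
}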