feat: Frame Processors for Android (#196)
* Create android gradle build setup
* Fix `prefab` config
* Add `pickFirst **/*.so` to example build.gradle
* fix REA path
* cache gradle builds
* Update validate-android.yml
* Create Native Proxy
* Copy REA header
* implement ctor
* Rename CameraViewModule -> FrameProcessorRuntimeManager
* init FrameProcessorRuntimeManager
* fix name
* Update FrameProcessorRuntimeManager.h
* format
* Create AndroidErrorHandler.h
* Initialize runtime and install JSI funcs
* Update FrameProcessorRuntimeManager.cpp
* Update CameraViewModule.kt
* Make CameraView hybrid C++ class to find view & set frame processor
* Update FrameProcessorRuntimeManager.cpp
* pass function by rvalue
* pass by const &&
* extract hermes and JSC REA
* pass `FOR_HERMES`
* correctly prepare JSC and Hermes
* Update CMakeLists.txt
* add missing hermes include
* clean up imports
* Create JImageProxy.h
* pass ImageProxy to JNI as `jobject`
* try use `JImageProxy` C++ wrapper type
* Use `local_ref<JImageProxy>`
* Create `JImageProxyHostObject` for JSI interop
* debug call to frame processor
* Unset frame processor
* Fix CameraView native part not being registered
* close image
* use `jobject` instead of `JImageProxy` for now :(
* fix hermes build error
* Set enable FP callback
* fix JNI call
* Update CameraView.cpp
* Get Format
* Create plugin abstract
* Make `FrameProcessorPlugin` a hybrid object
* Register plugin CXX
* Call `registerPlugin`
* Catch
* remove JSI
* Create sample QR code plugin
* register plugins
* Fix missing JNI binding
* Add `mHybridData`
* prefix name with two underscores (`__`)
* Update CameraPage.tsx
* wrap `ImageProxy` in host object
* Use `jobject` for HO box
* Update JImageProxy.h
* reinterpret jobject
* Try using `JImageProxy` instead of `jobject`
* Update JImageProxy.h
* get bytes per row and plane count
* Update CameraView.cpp
* Return base
* add some docs and JNI JSI conversion
* indent
* Convert JSI value to JNI jobject
* using namespace facebook
* Try using class
* Use plain old Object[]
* Try convert JNI -> JSI
* fix decl
* fix bool init
* Correctly link folly
* Update CMakeLists.txt
* Convert Map to Object
* Use folly for Map and Array
* Return `alias_ref<jobject>` instead of raw `jobject`
* fix JNI <-> JSI conversion
* Update JSIJNIConversion.cpp
* Log parameters
* fix params index offset
* add more test cases
* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx
* fix types
* Rename to example plugin
* remove support for hashmap
* Try use HashMap iterable fbjni binding
* try using JReadableArray/JReadableMap
* Fix list return values
* Update JSIJNIConversion.cpp
* Update JSIJNIConversion.cpp
* (iOS) Rename ObjC QR Code Plugin to Example Plugin
* Rename Swift plugin QR -> Example
* Update ExamplePluginSwift.swift
* Fix Map/Dictionary logging format
* Update ExampleFrameProcessorPlugin.m
* Reconfigure session if frame processor changed
* Handle use-cases via `maxUseCasesCount`
* Don't crash app on `configureSession` error
* Document "use-cases"
* Update DEVICES.mdx
* fix merge
* Make `const &`
* iOS: Automatically enable `video` if a `frameProcessor` is set
* Update CameraView.cpp
* fix docs
* Automatically fallback to snapshot capture if `supportsParallelVideoProcessing` is false.
* Fix lookup
* Update CameraView.kt
* Implement `frameProcessorFps`
* Finalize Frame Processor Plugin Hybrid
* Update CameraViewModule.kt
* Support `flash` on `takeSnapshot()`
* Update docs
* Add docs
* Update CameraPage.tsx
* Attribute NonNull
* remove unused imports
* Add Android docs for Frame Processors
* Make JNI HashMap <-> JSI Object conversion faster directly access `toHashMap` instead of going through java
* add todo
* Always run `prepareJSC` and `prepareHermes`
* switch jsc and hermes
* Specify ndkVersion `21.4.7075529`
* Update gradle.properties
* Update gradle.properties
* Create .aar
* Correctly prepare android package
* Update package.json
* Update package.json
* remove `prefab` build feature
* split
* Add docs for registering the FP plugin
* Add step for dep
* Update CaptureButton.tsx
* Move to `reanimated-headers/`
* Exclude reanimated-headers from cpplint
* disable `build/include_order` rule
* cpplint fixes
* perf: Make `JSIJNIConversion` a `namespace` instead of `class`
* Ignore runtime/references for `convert` funcs
* Build Android .aar in CI
* Run android build script only on `prepack`
* Update package.json
* Update package.json
* Update build-android-npm-package.sh
* Move to `yarn build`
* Also install node_modules in example step
* Update validate-android.yml
* sort imports
* fix torch
* Run ImageAnalysis on `FrameProcessorThread`
* Update Errors.kt
* Add clean android script
* Upgrade reanimated to 2.3.0-alpha.1
* Revert "Upgrade reanimated to 2.3.0-alpha.1" This reverts commit c1d3bed5e03728d0b5e335a359524ff4f56f5035.
* ⚠️ TEMP FIX: hotfix reanimated build.gradle
* Update CameraView+TakeSnapshot.kt
* ⚠️ TEMP FIX: Disable ktlint action for now
* Update clean.sh
* Set max heap size to 4g
* rebuild lockfiles
* Update Podfile.lock
* rename
* Build lib .aar before example/
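The changes listed above are mostly native (Gradle, fbjni, JNI <-> JSI conversion, the Frame Processor plugin registry). For orientation, here is a rough TypeScript sketch — not part of this commit — of how a registered plugin ends up being called from JS. The plugin name `__examplePlugin` and its extra argument are placeholders; only the two-underscore naming convention and the host-object frame come from the commits above.

```tsx
import { useFrameProcessor } from 'react-native-vision-camera';

// Placeholder for a native Frame Processor Plugin that is exposed to JS as a
// global worklet function whose name is prefixed with two underscores.
declare const __examplePlugin: (frame: unknown, ...args: unknown[]) => unknown;

export function useExampleFrameProcessor() {
  return useFrameProcessor((frame) => {
    'worklet';
    // The frame (an ImageProxy on Android) is passed to native as a JSI host object;
    // extra arguments and the return value go through the JSI <-> JNI conversion layer.
    const result = __examplePlugin(frame, 'hello');
    console.log(`example plugin returned: ${result}`);
  }, []);
}
```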
@@ -358,13 +358,6 @@ export class Camera extends React.PureComponent<CameraProps> {
       this.assertFrameProcessorsEnabled();
       // frameProcessor argument changed. Update native to reflect the change.
       if (this.props.frameProcessor != null) {
-        if (this.props.video !== true) {
-          throw new CameraCaptureError(
-            'capture/video-not-enabled',
-            'Video capture is disabled! Pass `video={true}` to enable frame processors.',
-          );
-        }
-
         // 1. Spawn threaded JSI Runtime (if not already done)
         // 2. Add video data output to Camera stream (if not already done)
         // 3. Workletize the frameProcessor and prepare it for being called with frames
@@ -386,8 +379,19 @@ export class Camera extends React.PureComponent<CameraProps> {
    */
   public render(): React.ReactNode {
     // We remove the big `device` object from the props because we only need to pass `cameraId` to native.
-    const { device, frameProcessor: _, ...props } = this.props;
-    return <NativeCameraView {...props} cameraId={device.id} ref={this.ref} onInitialized={this.onInitialized} onError={this.onError} />;
+    const { device, video: enableVideo, frameProcessor, ...props } = this.props;
+    // on iOS, enabling a frameProcessor requires `video` to be `true`. On Android, it doesn't.
+    const video = Platform.OS === 'ios' ? frameProcessor != null || enableVideo : enableVideo;
+    return (
+      <NativeCameraView
+        {...props}
+        cameraId={device.id}
+        ref={this.ref}
+        onInitialized={this.onInitialized}
+        onError={this.onError}
+        video={video}
+      />
+    );
   }
 }
 //#endregion
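For context, a rough usage sketch of what the `render()` change above means for consumers: on iOS the video pipeline is now enabled automatically when a `frameProcessor` is passed, and on Android it is not required. The component name and the frame processor body are illustrative, not from this commit.

```tsx
import * as React from 'react';
import { StyleSheet } from 'react-native';
import { Camera, useCameraDevices, useFrameProcessor } from 'react-native-vision-camera';

export function FrameProcessingScreen(): React.ReactElement | null {
  const devices = useCameraDevices();
  const device = devices.back;

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    // A call into a native Frame Processor Plugin would go here.
    console.log('Frame received');
  }, []);

  if (device == null) return null;
  // `video` is omitted on purpose: with the change above, iOS enables the video
  // pipeline automatically when a frameProcessor is set; Android does not need it.
  return <Camera style={StyleSheet.absoluteFill} device={device} isActive={true} frameProcessor={frameProcessor} />;
}
```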
@@ -251,27 +251,15 @@ export interface CameraDevice {
    */
   formats: CameraDeviceFormat[];
   /**
-   * Whether this camera device supports enabling photo and video capture at the same time.
+   * Whether this camera device supports using Video Recordings (`video={true}`) and Frame Processors (`frameProcessor={...}`) at the same time. See ["The `supportsParallelVideoProcessing` prop"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/devices#the-supportsparallelvideoprocessing-prop) for more information.
    *
-   * * On **iOS** devices this value is always `true`.
-   * * On newer **Android** devices this value is always `true`.
-   * * On older **Android** devices this value is `true` if the device's hardware level is `LIMITED` or above, `false` otherwise. (`LEGACY`) (See [this table](https://developer.android.com/reference/android/hardware/camera2/CameraDevice#regular-capture))
+   * If this property is `false`, you can only enable `video` or add a `frameProcessor`, but not both.
    *
-   * If the device does not allow enabling `photo` and `video` capture at the same time, you might want to fall back to **snapshot capture** (See [**"Taking Snapshots"**](https://mrousavy.github.io/react-native-vision-camera/docs/guides/capturing#taking-snapshots)) instead:
-   *
-   * @example
-   * ```tsx
-   * const captureMode = device.supportsPhotoAndVideoCapture ? "photo" : "snapshot"
-   * return (
-   *   <Camera
-   *     photo={captureMode === "photo"}
-   *     video={true}
-   *     audio={true}
-   *   />
-   * )
-   * ```
+   * * On iOS this value is always `true`.
+   * * On newer Android devices this value is always `true`.
+   * * On older Android devices this value is `false` if the Camera's hardware level is `LEGACY` or `LIMITED`, `true` otherwise. (See [`INFO_SUPPORTED_HARDWARE_LEVEL`](https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL) or [the tables at "Regular capture"](https://developer.android.com/reference/android/hardware/camera2/CameraDevice#regular-capture))
    */
-  supportsPhotoAndVideoCapture: boolean;
+  supportsParallelVideoProcessing: boolean;
   /**
    * Whether this camera device supports low light boost.
    */
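A hedged sketch of the fallback this flag is meant for (per "Automatically fallback to snapshot capture if `supportsParallelVideoProcessing` is false" in the commit list). The helper name is illustrative, and it assumes `takeSnapshot` is available on the target platform; in a real app the `camera` value would come from a ref to the `<Camera>` component.

```tsx
import type { Camera, CameraDevice, PhotoFile } from 'react-native-vision-camera';

// Prefer full photo capture when the device can run video + frame processing in
// parallel; otherwise fall back to a faster, lower-quality snapshot.
export async function capturePicture(camera: Camera, device: CameraDevice): Promise<PhotoFile> {
  if (device.supportsParallelVideoProcessing) {
    return camera.takePhoto();
  }
  return camera.takeSnapshot();
}
```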
@@ -9,6 +9,7 @@ export type DeviceError =
   | 'device/configuration-error'
   | 'device/no-device'
   | 'device/invalid-device'
+  | 'device/too-many-use-cases'
   | 'device/torch-unavailable'
   | 'device/microphone-unavailable'
   | 'device/low-light-boost-not-supported'
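A possible way to react to the new error code from the component's `onError` callback. The handler shape is illustrative and assumes `CameraRuntimeError` exposes `code` and `message`; the recovery strategy in the comment is only a suggestion.

```tsx
import type { CameraRuntimeError } from 'react-native-vision-camera';

// Illustrative onError handler: 'device/too-many-use-cases' means the device
// cannot serve every requested use-case (photo, video, frame processing) at once.
export function onCameraError(error: CameraRuntimeError): void {
  if (error.code === 'device/too-many-use-cases') {
    // A real app could disable one use-case (e.g. pass photo={false}) and retry.
    console.warn('Too many use-cases for this device - disable photo, video or the frameProcessor.');
  } else {
    console.error(`Camera error ${error.code}: ${error.message}`);
  }
}
```

Passed as `<Camera onError={onCameraError} ... />`.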
@@ -36,16 +36,19 @@ export interface CameraProps extends ViewProps {
 
   //#region Use-cases
   /**
-   * * Enables **photo capture** with the `takePhoto` function (see ["Taking Photos"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/capturing#taking-photos))
+   * Enables **photo capture** with the `takePhoto` function (see ["Taking Photos"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/capturing#taking-photos))
+   *
+   * Note: This occupies a use-case. (See ["Use-cases"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/devices#use-cases))
    */
   photo?: boolean;
   /**
-   * * Enables **video capture** with the `startRecording` function (see ["Recording Videos"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/capturing/#recording-videos))
-   * * Enables **frame processing** (see ["Frame Processors"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/frame-processors))
+   * Enables **video capture** with the `startRecording` function (see ["Recording Videos"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/capturing/#recording-videos))
+   *
+   * Note: This occupies a use-case. (See ["Use-cases"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/devices#use-cases))
    */
   video?: boolean;
   /**
-   * * Enables **audio capture** for video recordings (see ["Recording Videos"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/capturing/#recording-videos))
+   * Enables **audio capture** for video recordings (see ["Recording Videos"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/capturing/#recording-videos))
    */
   audio?: boolean;
   //#endregion
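Since each of these props occupies a use-case, a minimal sketch of enabling all three on a capable device; the component name is illustrative.

```tsx
import * as React from 'react';
import { StyleSheet } from 'react-native';
import { Camera, CameraDevice } from 'react-native-vision-camera';

// Photo, video and audio each occupy one use-case; low-end devices may reject
// this combination with a 'device/too-many-use-cases' error (see above).
export function FullCaptureCamera({ device }: { device: CameraDevice }): React.ReactElement {
  return <Camera style={StyleSheet.absoluteFill} device={device} isActive={true} photo={true} video={true} audio={true} />;
}
```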
@@ -161,6 +164,8 @@ export interface CameraProps extends ViewProps {
    *
    * > See [the Frame Processors documentation](https://mrousavy.github.io/react-native-vision-camera/docs/guides/frame-processors) for more information
    *
+   * Note: This occupies a use-case. (See ["Use-cases"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/devices#use-cases))
+   *
    * @example
    * ```tsx
    * const frameProcessor = useFrameProcessor((frame) => {
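The commit also implements `frameProcessorFps` (see the commit list above). A hedged sketch of throttling a frame processor with it, assuming the prop takes a target frames-per-second value; the component name and worklet body are illustrative.

```tsx
import * as React from 'react';
import { StyleSheet } from 'react-native';
import { Camera, CameraDevice, useFrameProcessor } from 'react-native-vision-camera';

export function ThrottledFrameProcessingCamera({ device }: { device: CameraDevice }): React.ReactElement {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    console.log('analyzing frame');
  }, []);

  // frameProcessorFps caps how often the frame processor runs (here ~1 call per
  // second), independent of the camera preview frame rate.
  return (
    <Camera
      style={StyleSheet.absoluteFill}
      device={device}
      isActive={true}
      frameProcessor={frameProcessor}
      frameProcessorFps={1}
    />
  );
}
```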
@@ -8,6 +8,13 @@ export interface TakeSnapshotOptions {
    */
   quality?: number;
 
+  /**
+   * Whether the Flash should be enabled or disabled
+   *
+   * @default "off"
+   */
+  flash?: 'on' | 'off';
+
   /**
    * When set to `true`, metadata reading and mapping will be skipped. ({@linkcode PhotoFile.metadata} will be `null`)
    *
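A hedged sketch of the new `flash` option on `takeSnapshot()` together with the existing `quality` option; the helper name and ref handling are illustrative.

```tsx
import type { RefObject } from 'react';
import type { Camera, PhotoFile } from 'react-native-vision-camera';

// Take a snapshot with the flash firing; `flash` defaults to 'off'.
export async function takeFlashSnapshot(camera: RefObject<Camera>): Promise<PhotoFile | undefined> {
  return camera.current?.takeSnapshot({
    quality: 85, // JPEG quality (existing option)
    flash: 'on', // new option added in this commit
  });
}
```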