fix: Rename getFrameProcessorPlugin to initFrameProcessorPlugin (#2038)

* fix: Rename `getFrameProcessorPlugin` to `initFrameProcessorPlugin`

* fix: Make nullable, add comments

* Format

* Update FrameProcessorPlugin.java

* Update FrameProcessorPlugin.h

* fix: Fix dead links

* Call super constructor

* Update ExampleFrameProcessorPlugin.java

* fix: Init calls
Marc Rousavy 2023-10-19 11:19:47 +02:00 committed by GitHub
parent a291642c53
commit 07027d8010
GPG Key ID: 4AEE18F83AFDEB23
22 changed files with 97 additions and 63 deletions

View File

@@ -52,7 +52,7 @@ Frame Processors are JS functions that will be _workletized_ using [react-native
Frame Processor Plugins are native functions (written in Objective-C, Swift, C++, Java or Kotlin) that are injected into the VisionCamera JS-Runtime. They can be synchronously called from your JS Frame Processors (using JSI) without ever going over the bridge. Because VisionCamera provides an easy-to-use plugin API, you can easily create a Frame Processor Plugin yourself. Some examples include [Barcode Scanning](https://developers.google.com/ml-kit/vision/barcode-scanning), [Face Detection](https://developers.google.com/ml-kit/vision/face-detection), [Image Labeling](https://developers.google.com/ml-kit/vision/image-labeling), [Text Recognition](https://developers.google.com/ml-kit/vision/text-recognition) and more.
> Learn how to [create Frame Processor Plugins](frame-processors-plugins-overview), or check out the [example Frame Processor Plugin for iOS](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/ios/Frame%20Processor%20Plugins/Example%20Plugin%20(Swift)/ExamplePluginSwift.swift) or [Android](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/android/app/src/main/java/com/mrousavy/camera/example/ExampleFrameProcessorPlugin.java).
> Learn how to [create Frame Processor Plugins](frame-processors-plugins-overview), or check out the [example Frame Processor Plugin for iOS](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/ios/Frame%20Processor%20Plugins/Example%20Swift%20Plugin/ExampleSwiftFrameProcessor.swift) or [Android](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/android/app/src/main/java/com/mrousavy/camera/example/ExampleKotlinFrameProcessorPlugin.kt).
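The rename reflects what the docs below describe: the proxy creates a new plugin instance on each call (and returns `undefined` if no plugin was registered under that name), rather than fetching a cached one. A minimal TypeScript sketch of those registry semantics — illustrative names modeled on the library's API, not the actual JSI implementation:

```ts
// Illustrative registry sketch (assumed shapes, not library code).
type PluginOptions = Record<string, unknown>
type PluginInstance = { call: (frame: unknown) => unknown }
type PluginFactory = (options?: PluginOptions) => PluginInstance

const registry = new Map<string, PluginFactory>()

// Native side registers a factory under a name, ideally at app startup.
function addFrameProcessorPlugin(name: string, factory: PluginFactory): void {
  registry.set(name, factory)
}

// JS side: a fresh instance is created per call, so the name is "init", not "get".
// Returns undefined when nothing was registered under `name`.
function initFrameProcessorPlugin(name: string, options?: PluginOptions): PluginInstance | undefined {
  const factory = registry.get(name)
  return factory?.(options)
}
```

The `undefined` return for unregistered names matches the nullable return type this commit documents on the TypeScript proxy.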
### The `Frame` object

View File

@@ -11,7 +11,7 @@ To make the Frame Processor Plugin available to the Frame Processor Worklet Runt
```ts
import { VisionCameraProxy, Frame } from 'react-native-vision-camera'
const plugin = VisionCameraProxy.getFrameProcessorPlugin('scanFaces')
const plugin = VisionCameraProxy.initFrameProcessorPlugin('scanFaces')
/**
* Scans faces.

View File

@@ -108,7 +108,7 @@ public class FaceDetectorFrameProcessorPluginPackage implements ReactPackage {
```
:::note
The Frame Processor Plugin will be exposed to JS through the `VisionCameraProxy` object. In this case, it would be `VisionCameraProxy.getFrameProcessorPlugin("detectFaces")`.
The Frame Processor Plugin will be exposed to JS through the `VisionCameraProxy` object. In this case, it would be `VisionCameraProxy.initFrameProcessorPlugin("detectFaces")`.
:::
6. Register the package in MainApplication.java
@@ -146,7 +146,7 @@ class FaceDetectorFrameProcessorPlugin(options: Map<String, Any>?): FrameProcess
}
```
4. **Implement your Frame Processing.** See the [Example Plugin (Java)](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/android/app/src/main/java/com/mrousavy/camera/example/ExampleFrameProcessorPlugin.java) for reference.
4. **Implement your Frame Processing.** See the [Example Plugin (Kotlin)](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/android/app/src/main/java/com/mrousavy/camera/example/ExampleKotlinFrameProcessorPlugin.kt) for reference.
5. Create a new Kotlin file which registers the Frame Processor Plugin in a React Package, for the Face Detector plugin this file will be called `FaceDetectorFrameProcessorPluginPackage.kt`:
```kotlin
@@ -176,7 +176,7 @@ class FaceDetectorFrameProcessorPluginPackage : ReactPackage {
```
:::note
The Frame Processor Plugin will be exposed to JS through the `VisionCameraProxy` object. In this case, it would be `VisionCameraProxy.getFrameProcessorPlugin("detectFaces")`.
The Frame Processor Plugin will be exposed to JS through the `VisionCameraProxy` object. In this case, it would be `VisionCameraProxy.initFrameProcessorPlugin("detectFaces")`.
:::
6. Register the package in MainApplication.java

View File

@@ -51,7 +51,7 @@ For reference see the [CLI's docs](https://github.com/mateusz1913/vision-camera-
@implementation FaceDetectorFrameProcessorPlugin
- (instancetype) initWithOptions:(NSDictionary*)options; {
- (instancetype) initWithOptions:(NSDictionary* _Nullable)options; {
self = [super initWithOptions:options];
return self;
}
@@ -71,10 +71,10 @@ VISION_EXPORT_FRAME_PROCESSOR(FaceDetectorFrameProcessorPlugin, detectFaces)
```
:::note
The Frame Processor Plugin will be exposed to JS through the `VisionCameraProxy` object. In this case, it would be `VisionCameraProxy.getFrameProcessorPlugin("detectFaces")`.
The Frame Processor Plugin will be exposed to JS through the `VisionCameraProxy` object. In this case, it would be `VisionCameraProxy.initFrameProcessorPlugin("detectFaces")`.
:::
4. **Implement your Frame Processing.** See the [Example Plugin (Objective-C)](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/ios/Frame%20Processor%20Plugins/Example%20Plugin%20%28Objective%2DC%29) for reference.
4. **Implement your Frame Processing.** See the [Example Plugin (Objective-C)](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/ios/Frame%20Processor%20Plugins/Example%20Plugin/ExampleFrameProcessorPlugin.m) for reference.
</TabItem>
<TabItem value="swift">
@@ -117,7 +117,7 @@ VISION_EXPORT_SWIFT_FRAME_PROCESSOR(FaceDetectorFrameProcessorPlugin, detectFace
// highlight-end
```
5. **Implement your frame processing.** See [Example Plugin (Swift)](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/ios/Frame%20Processor%20Plugins/Example%20Plugin%20%28Swift%29) for reference.
5. **Implement your frame processing.** See [Example Plugin (Swift)](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/ios/Frame%20Processor%20Plugins/Example%20Swift%20Plugin/ExampleSwiftFrameProcessor.swift) for reference.
</TabItem>

View File

@@ -13,7 +13,7 @@ import useBaseUrl from '@docusaurus/useBaseUrl'
## Mocking VisionCamera
These steps allow you to mock VisionCamera for development or testing. Based on
[Detox Mock Guide](https://github.com/wix/Detox/blob/master/docs/Guide.Mocking.md).
[Detox Mock Guide](https://wix.github.io/Detox/docs/guide/mocking).
### Configure the Metro bundler

View File

@@ -82,7 +82,7 @@ If you're experiencing build issues or runtime issues in VisionCamera, make sure
yarn # or `npm i`
```
4. Make sure you have installed the [Android NDK](https://developer.android.com/ndk).
5. Make sure your minimum SDK version is **26 or higher**, and target SDK version is **33 or higher**. See [the example's `build.gradle`](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/android/build.gradle#L5-L10) for reference.
5. Make sure your minimum SDK version is **26 or higher**, and target SDK version is **33 or higher**. See [the example's `build.gradle`](https://github.com/mrousavy/react-native-vision-camera/blob/main/package/example/android/build.gradle#L5-L9) for reference.
1. Open your `build.gradle`
2. Set `buildToolsVersion` to `33.0.0` or higher
3. Set `compileSdkVersion` to `33` or higher

View File

@@ -37,7 +37,7 @@ std::vector<jsi::PropNameID> VisionCameraProxy::getPropertyNames(jsi::Runtime& r
std::vector<jsi::PropNameID> result;
result.push_back(jsi::PropNameID::forUtf8(runtime, std::string("setFrameProcessor")));
result.push_back(jsi::PropNameID::forUtf8(runtime, std::string("removeFrameProcessor")));
result.push_back(jsi::PropNameID::forUtf8(runtime, std::string("getFrameProcessorPlugin")));
result.push_back(jsi::PropNameID::forUtf8(runtime, std::string("initFrameProcessorPlugin")));
return result;
}
@@ -49,10 +49,10 @@ void VisionCameraProxy::removeFrameProcessor(int viewTag) {
_javaProxy->cthis()->removeFrameProcessor(viewTag);
}
jsi::Value VisionCameraProxy::getFrameProcessorPlugin(jsi::Runtime& runtime, const std::string& name, const jsi::Object& jsOptions) {
jsi::Value VisionCameraProxy::initFrameProcessorPlugin(jsi::Runtime& runtime, const std::string& name, const jsi::Object& jsOptions) {
auto options = JSIJNIConversion::convertJSIObjectToJNIMap(runtime, jsOptions);
auto plugin = _javaProxy->cthis()->getFrameProcessorPlugin(name, options);
auto plugin = _javaProxy->cthis()->initFrameProcessorPlugin(name, options);
auto pluginHostObject = std::make_shared<FrameProcessorPluginHostObject>(plugin);
return jsi::Object::createFromHostObject(runtime, pluginHostObject);
@@ -80,9 +80,9 @@ jsi::Value VisionCameraProxy::get(jsi::Runtime& runtime, const jsi::PropNameID&
return jsi::Value::undefined();
});
}
if (name == "getFrameProcessorPlugin") {
if (name == "initFrameProcessorPlugin") {
return jsi::Function::createFromHostFunction(
runtime, jsi::PropNameID::forUtf8(runtime, "getFrameProcessorPlugin"), 1,
runtime, jsi::PropNameID::forUtf8(runtime, "initFrameProcessorPlugin"), 1,
[this](jsi::Runtime& runtime, const jsi::Value& thisValue, const jsi::Value* arguments, size_t count) -> jsi::Value {
if (count < 1 || !arguments[0].isString()) {
throw jsi::JSError(runtime, "First argument needs to be a string (pluginName)!");
@@ -90,7 +90,7 @@ jsi::Value VisionCameraProxy::get(jsi::Runtime& runtime, const jsi::PropNameID&
auto pluginName = arguments[0].asString(runtime).utf8(runtime);
auto options = count > 1 ? arguments[1].asObject(runtime) : jsi::Object(runtime);
return this->getFrameProcessorPlugin(runtime, pluginName, options);
return this->initFrameProcessorPlugin(runtime, pluginName, options);
});
}
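The renamed host function above validates the plugin name, defaults the options object, and delegates to the native initializer. That dispatch logic can be sketched in TypeScript — an illustrative model only; the real code runs as a JSI HostObject in C++:

```ts
// Sketch of the proxy's property dispatch for the renamed function.
// `initPlugin` stands in for the native plugin initializer (assumed shape).
type HostFunction = (...args: unknown[]) => unknown

function makeProxyGetter(
  initPlugin: (name: string, options: Record<string, unknown>) => unknown
): (prop: string) => HostFunction | undefined {
  return (prop) => {
    if (prop === 'initFrameProcessorPlugin') {
      return (...args) => {
        // Same check as the C++ code: the first argument must be the plugin name.
        if (args.length < 1 || typeof args[0] !== 'string')
          throw new Error('First argument needs to be a string (pluginName)!')
        // Second argument is optional; default to an empty options object.
        const options = args.length > 1 ? (args[1] as Record<string, unknown>) : {}
        return initPlugin(args[0], options)
      }
    }
    return undefined // other properties handled elsewhere
  }
}
```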

View File

@@ -28,7 +28,7 @@ public:
private:
void setFrameProcessor(int viewTag, jsi::Runtime& runtime, const jsi::Object& frameProcessor);
void removeFrameProcessor(int viewTag);
jsi::Value getFrameProcessorPlugin(jsi::Runtime& runtime, const std::string& name, const jsi::Object& options);
jsi::Value initFrameProcessorPlugin(jsi::Runtime& runtime, const std::string& name, const jsi::Object& options);
private:
jni::global_ref<JVisionCameraProxy::javaobject> _javaProxy;

View File

@@ -82,10 +82,10 @@ void JVisionCameraProxy::removeFrameProcessor(int viewTag) {
removeFrameProcessorMethod(_javaPart, viewTag);
}
local_ref<JFrameProcessorPlugin::javaobject> JVisionCameraProxy::getFrameProcessorPlugin(const std::string& name, TOptions options) {
auto getFrameProcessorPluginMethod =
javaClassLocal()->getMethod<JFrameProcessorPlugin(local_ref<jstring>, TOptions)>("getFrameProcessorPlugin");
return getFrameProcessorPluginMethod(_javaPart, make_jstring(name), std::move(options));
local_ref<JFrameProcessorPlugin::javaobject> JVisionCameraProxy::initFrameProcessorPlugin(const std::string& name, TOptions options) {
auto initFrameProcessorPluginMethod =
javaClassLocal()->getMethod<JFrameProcessorPlugin(local_ref<jstring>, TOptions)>("initFrameProcessorPlugin");
return initFrameProcessorPluginMethod(_javaPart, make_jstring(name), std::move(options));
}
void JVisionCameraProxy::registerNatives() {

View File

@@ -30,8 +30,8 @@ public:
void setFrameProcessor(int viewTag, jsi::Runtime& runtime, const jsi::Object& frameProcessor);
void removeFrameProcessor(int viewTag);
jni::local_ref<JFrameProcessorPlugin::javaobject> getFrameProcessorPlugin(const std::string& name,
jni::local_ref<JMap<jstring, jobject>> options);
jni::local_ref<JFrameProcessorPlugin::javaobject> initFrameProcessorPlugin(const std::string& name,
jni::local_ref<JMap<jstring, jobject>> options);
jsi::Runtime* getJSRuntime() {
return _runtime;

View File

@@ -7,25 +7,35 @@ import com.facebook.proguard.annotations.DoNotStrip;
import java.util.Map;
/**
* Declares a Frame Processor Plugin.
* The base class of a native Frame Processor Plugin.
*
* Subclass this to create a custom Frame Processor Plugin, which can be called from a JS Frame Processor.
* Once subclassed, it needs to be registered in the VisionCamera Frame Processor
* runtime via `FrameProcessorPluginRegistry.addFrameProcessorPlugin` - ideally at app startup.
* See: <a href="https://react-native-vision-camera.com/docs/guides/frame-processors-plugins-android">Creating Frame Processor Plugins (Android)</a>
* for more information
*/
@DoNotStrip
@Keep
public abstract class FrameProcessorPlugin {
public FrameProcessorPlugin() {}
/**
* The initializer for a Frame Processor Plugin class that takes optional object that consists
* options passed from JS layer
* The initializer of this Frame Processor Plugin.
* This is called every time this Frame Processor Plugin is loaded from the JS side (`initFrameProcessorPlugin(..)`).
* Optionally override this method to implement custom initialization logic.
* @param options An options dictionary passed from the JS side, or null if none.
*/
public FrameProcessorPlugin(@Nullable Map<String, Object> options) {}
/**
* The actual Frame Processor plugin callback. Called for every frame the ImageAnalyzer receives.
* The actual Frame Processor Plugin's implementation that runs when `plugin.call(..)` is called in the JS Frame Processor.
* Implement your Frame Processing here, and keep in mind that this is a hot path, so optimize as much as possible.
* See: <a href="https://react-native-vision-camera.com/docs/guides/frame-processors-tips#fast-frame-processor-plugins">Performance Tips</a>
*
* @param frame The Frame from the Camera. Don't call .close() on this, as VisionCamera handles that.
* @return You can return any primitive, map or array you want. See the
* <a href="https://react-native-vision-camera.com/docs/guides/frame-processors-plugins-overview#types">Types</a>
* table for a list of supported types.
* @return You can return any primitive, map or array you want.
* See the <a href="https://react-native-vision-camera.com/docs/guides/frame-processors-plugins-overview#types">Types</a>
* table for a list of supported types.
*/
@DoNotStrip
@Keep

View File

@@ -70,7 +70,7 @@ class VisionCameraProxy(context: ReactApplicationContext) {
@DoNotStrip
@Keep
fun getFrameProcessorPlugin(name: String, options: Map<String, Any>): FrameProcessorPlugin =
fun initFrameProcessorPlugin(name: String, options: Map<String, Any>): FrameProcessorPlugin =
FrameProcessorPluginRegistry.getPlugin(name, options)
// private C++ funcs

View File

@@ -41,6 +41,7 @@ public class ExampleFrameProcessorPlugin extends FrameProcessorPlugin {
}
ExampleFrameProcessorPlugin(@Nullable Map<String, Object> options) {
Log.d("ExamplePlugin", " - options: " + options.toString());
super(options);
Log.d("ExamplePlugin", "ExampleFrameProcessorPlugin initialized with options: " + options);
}
}
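The fix in this example — run the base-class constructor before anything else, and stop calling `.toString()` on a nullable `options` — has a direct TypeScript analog, since TypeScript, like Java, requires `super(...)` to run before `this` is touched. A standalone sketch with assumed names:

```ts
// Illustrative analog of the fixed Java example; not library code.
abstract class FrameProcessorPlugin {
  constructor(protected readonly options: Record<string, unknown> | null) {}
  abstract callback(frame: unknown, params?: Record<string, unknown>): unknown
}

class ExamplePlugin extends FrameProcessorPlugin {
  constructor(options: Record<string, unknown> | null) {
    super(options) // must come before any use of `this`, as in the Java fix
    // `options` may be null, so interpolate it instead of calling .toString()
    console.log(`ExamplePlugin initialized with options: ${JSON.stringify(options)}`)
  }
  callback(): unknown {
    return this.options?.['foo'] ?? null
  }
}
```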

View File

@@ -6,7 +6,7 @@ import com.mrousavy.camera.frameprocessor.FrameProcessorPlugin
class ExampleKotlinFrameProcessorPlugin(options: Map<String, Any>?): FrameProcessorPlugin(options) {
init {
Log.d("ExampleKotlinPlugin", " - options" + options?.toString())
Log.d("ExampleKotlinPlugin", "ExampleKotlinFrameProcessorPlugin initialized with options: " + options?.toString())
}
override fun callback(frame: Frame, params: Map<String, Any>?): Any? {

View File

@@ -18,10 +18,10 @@
@implementation ExampleFrameProcessorPlugin
- (instancetype)initWithOptions:(NSDictionary * _Nullable)options
- (instancetype)initWithOptions:(NSDictionary* _Nullable)options
{
self = [super initWithOptions:options];
NSLog(@"ExamplePlugin - options: %@", options);
NSLog(@"ExampleFrameProcessorPlugin initialized with options: %@", options);
return self;
}

View File

@@ -14,7 +14,7 @@ public class ExampleSwiftFrameProcessorPlugin: FrameProcessorPlugin {
public override init(options: [AnyHashable: Any]! = [:]) {
super.init(options: options)
print("ExampleSwiftPlugin - options: \(String(describing: options))")
print("ExampleSwiftFrameProcessorPlugin initialized with options: \(String(describing: options))")
}
public override func callback(_ frame: Frame, withArguments arguments: [AnyHashable: Any]?) -> Any? {

View File

@@ -1,6 +1,6 @@
import { VisionCameraProxy, Frame } from 'react-native-vision-camera'
const plugin = VisionCameraProxy.getFrameProcessorPlugin('example_kotlin_swift_plugin', { foo: 'bar' })
const plugin = VisionCameraProxy.initFrameProcessorPlugin('example_kotlin_swift_plugin', { foo: 'bar' })
export function exampleKotlinSwiftPlugin(frame: Frame): string[] {
'worklet'

View File

@@ -1,6 +1,6 @@
import { VisionCameraProxy, Frame } from 'react-native-vision-camera'
const plugin = VisionCameraProxy.getFrameProcessorPlugin('example_plugin')
const plugin = VisionCameraProxy.initFrameProcessorPlugin('example_plugin')
interface Result {
example_array: (string | number | boolean)[]

View File

@@ -11,22 +11,38 @@
#import "Frame.h"
#import <Foundation/Foundation.h>
/// The base class for a Frame Processor Plugin which can be called synchronously from a JS Frame
/// Processor.
///
/// Subclass this class in a Swift or Objective-C class and override the `callback:withArguments:`
/// method, and implement your Frame Processing there.
///
/// Use `[FrameProcessorPluginRegistry addFrameProcessorPlugin:]` to register the Plugin to the
/// VisionCamera Runtime.
/**
* The base class of a native Frame Processor Plugin.
*
* Subclass this to create a custom Frame Processor Plugin, which can be called from a JS Frame Processor.
* Once subclassed, it needs to be registered in the VisionCamera Frame Processor runtime via
* the `VISION_EXPORT_FRAME_PROCESSOR` or `VISION_EXPORT_SWIFT_FRAME_PROCESSOR` macros.
* See: <a href="https://react-native-vision-camera.com/docs/guides/frame-processors-plugins-ios">Creating Frame Processor Plugins (iOS)</a>
* for more information
*/
@interface FrameProcessorPlugin : NSObject
/// The initializer for a Frame Processor Plugin class that takes optional object that consists
/// options passed from JS layer
/**
* The initializer of this Frame Processor Plugin.
* This is called every time this Frame Processor Plugin is loaded from the JS side (`initFrameProcessorPlugin(..)`).
* Optionally override this method to implement custom initialization logic.
* - Parameters:
* - options: An options dictionary passed from the JS side, or `nil` if none.
*/
- (instancetype _Nonnull)initWithOptions:(NSDictionary* _Nullable)options;
/// The actual callback when calling this plugin. Any Frame Processing should be handled there.
/// Make sure your code is optimized, as this is a hot path.
/**
* The actual Frame Processor Plugin's implementation that runs when `plugin.call(..)` is called in the JS Frame Processor.
* Implement your Frame Processing here, and keep in mind that this is a hot path, so optimize as much as possible.
* See: <a href="https://react-native-vision-camera.com/docs/guides/frame-processors-tips#fast-frame-processor-plugins">Performance Tips</a>
*
* - Parameters:
* - frame: The Frame from the Camera. Don't do any ref-counting on this, as VisionCamera handles that.
* - Returns: You can return any primitive, map or array you want.
* See the <a href="https://react-native-vision-camera.com/docs/guides/frame-processors-plugins-overview#types">Types</a>
* table for a list of supported types.
*/
- (id _Nullable)callback:(Frame* _Nonnull)frame withArguments:(NSDictionary* _Nullable)arguments;
@end

View File

@@ -30,7 +30,7 @@ public:
private:
void setFrameProcessor(jsi::Runtime& runtime, int viewTag, const jsi::Object& frameProcessor);
void removeFrameProcessor(jsi::Runtime& runtime, int viewTag);
jsi::Value getFrameProcessorPlugin(jsi::Runtime& runtime, std::string name, const jsi::Object& options);
jsi::Value initFrameProcessorPlugin(jsi::Runtime& runtime, std::string name, const jsi::Object& options);
private:
std::shared_ptr<RNWorklet::JsiWorkletContext> _workletContext;

View File

@@ -65,7 +65,7 @@ std::vector<jsi::PropNameID> VisionCameraProxy::getPropertyNames(jsi::Runtime& r
std::vector<jsi::PropNameID> result;
result.push_back(jsi::PropNameID::forUtf8(runtime, std::string("setFrameProcessor")));
result.push_back(jsi::PropNameID::forUtf8(runtime, std::string("removeFrameProcessor")));
result.push_back(jsi::PropNameID::forUtf8(runtime, std::string("getFrameProcessorPlugin")));
result.push_back(jsi::PropNameID::forUtf8(runtime, std::string("initFrameProcessorPlugin")));
return result;
}
@@ -94,7 +94,7 @@ void VisionCameraProxy::removeFrameProcessor(jsi::Runtime& runtime, int viewTag)
});
}
jsi::Value VisionCameraProxy::getFrameProcessorPlugin(jsi::Runtime& runtime, std::string name, const jsi::Object& options) {
jsi::Value VisionCameraProxy::initFrameProcessorPlugin(jsi::Runtime& runtime, std::string name, const jsi::Object& options) {
NSString* key = [NSString stringWithUTF8String:name.c_str()];
NSDictionary* optionsObjc = JSINSObjectConversion::convertJSIObjectToNSDictionary(runtime, options, _callInvoker);
FrameProcessorPlugin* plugin = [FrameProcessorPluginRegistry getPlugin:key withOptions:optionsObjc];
@@ -128,9 +128,9 @@ jsi::Value VisionCameraProxy::get(jsi::Runtime& runtime, const jsi::PropNameID&
return jsi::Value::undefined();
});
}
if (name == "getFrameProcessorPlugin") {
if (name == "initFrameProcessorPlugin") {
return jsi::Function::createFromHostFunction(
runtime, jsi::PropNameID::forUtf8(runtime, "getFrameProcessorPlugin"), 1,
runtime, jsi::PropNameID::forUtf8(runtime, "initFrameProcessorPlugin"), 1,
[this](jsi::Runtime& runtime, const jsi::Value& thisValue, const jsi::Value* arguments, size_t count) -> jsi::Value {
if (count < 1 || !arguments[0].isString()) {
throw jsi::JSError(runtime, "First argument needs to be a string (pluginName)!");
@@ -138,7 +138,7 @@ jsi::Value VisionCameraProxy::get(jsi::Runtime& runtime, const jsi::PropNameID&
auto pluginName = arguments[0].asString(runtime).utf8(runtime);
auto options = count > 1 ? arguments[1].asObject(runtime) : jsi::Object(runtime);
return this->getFrameProcessorPlugin(runtime, pluginName, options);
return this->initFrameProcessorPlugin(runtime, pluginName, options);
});
}

View File

@@ -24,10 +24,17 @@ interface TVisionCameraProxy {
setFrameProcessor: (viewTag: number, frameProcessor: FrameProcessor) => void
removeFrameProcessor: (viewTag: number) => void
/**
* Creates a new instance of a Frame Processor Plugin.
* The Plugin has to be registered on the native side, otherwise this returns `undefined`
* Creates a new instance of a native Frame Processor Plugin.
* The Plugin has to be registered on the native side, otherwise this returns `undefined`.
* @param name The name of the Frame Processor Plugin. This has to be the same name as on the native side.
* @param options (optional) Options, as a native dictionary, passed to the constructor/init-function of the native plugin.
* @example
* ```ts
* const plugin = VisionCameraProxy.initFrameProcessorPlugin('scanFaces', { model: 'fast' })
* if (plugin == null) throw new Error("Failed to load scanFaces plugin!")
* ```
*/
getFrameProcessorPlugin: (name: string, options?: Record<string, ParameterType>) => FrameProcessorPlugin | undefined
initFrameProcessorPlugin: (name: string, options?: Record<string, ParameterType>) => FrameProcessorPlugin | undefined
}
let hasWorklets = false
@@ -66,7 +73,7 @@ try {
}
let proxy: TVisionCameraProxy = {
getFrameProcessorPlugin: () => {
initFrameProcessorPlugin: () => {
throw new CameraRuntimeError('system/frame-processors-unavailable', 'Frame Processors are not enabled!')
},
removeFrameProcessor: () => {