---
id: frame-processors-plugins-overview
title: Creating Frame Processor Plugins
sidebar_label: Overview
---

import useBaseUrl from '@docusaurus/useBaseUrl';
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

## Overview

Frame Processor Plugins are **native functions** which can be directly called from a JS Frame Processor. (See ["Frame Processors"](frame-processors))

They **receive a frame from the Camera** as an input and can return any kind of output. For example, a `scanQRCodes` function returns an array of detected QR code strings in the frame:

```tsx {4-5}
function App() {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    const qrCodes = scanQRCodes(frame)
    console.log(`QR Codes in Frame: ${qrCodes}`)
  }, [])

  return (
    <Camera frameProcessor={frameProcessor} {...cameraProps} />
  )
}
```

To achieve **maximum performance**, the `scanQRCodes` function is written in a native language (e.g. Objective-C), but it can be directly called from the VisionCamera Frame Processor JavaScript Runtime.

### Types

Similar to a TurboModule, the Frame Processor Plugin Registry API automatically manages type conversion between JS and native. Values are converted into the most efficient data structures, as seen here:

| JS Type              | Objective-C/Swift Type        | Java/Kotlin Type           |
|----------------------|-------------------------------|----------------------------|
| `number`             | `NSNumber*` (double)          | `Double`                   |
| `boolean`            | `NSNumber*` (boolean)         | `Boolean`                  |
| `string`             | `NSString*`                   | `String`                   |
| `[]`                 | `NSArray*`                    | `ReadableNativeArray`      |
| `{}`                 | `NSDictionary*`               | `ReadableNativeMap`        |
| `undefined` / `null` | `nil`                         | `null`                     |
| `(any, any) => void` | [`RCTResponseSenderBlock`][4] | `(Object, Object) -> void` |
| [`Frame`][1]         | [`Frame*`][2]                 | [`ImageProxy`][3]          |

### Return values

Return values are automatically converted to JS values, assuming they are representable in the ["Types" table](#types). So the following Java Frame Processor Plugin:

```java
@Override
public Object callback(ImageProxy image, Object[] params) {
  return "cat";
}
```

returns a `string` in JS:

```js
export function detectObject(frame: Frame): string {
  'worklet'
  const result = __detectObject(frame)
  console.log(result) // <-- "cat"
  return result
}
```

You can also manipulate the buffer and return it (or a copy of it) by returning a [`Frame`][2]/[`ImageProxy`][3] instance:

```java
@Override
public Object callback(ImageProxy image, Object[] params) {
  ImageProxy resizedImage = new ImageProxy(/* ... */);
  return resizedImage;
}
```

which returns a [`Frame`][1] in JS:

```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // creates a new `Frame` that's 720x480
  const resizedFrame = resize(frame, 720, 480)

  // by downscaling the frame, the `detectObjects` function runs faster.
  const objects = detectObjects(resizedFrame)
  console.log(objects)
}, [])
```

### Parameters

Frame Processors can also accept parameters, following the same type convention as [return values](#return-values):

```ts
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  const codes = scanCodes(frame, ['qr', 'barcode'])
}, [])
```

Or with multiple ("variadic") parameters:

```ts
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  const codes = scanCodes(frame, true, 'hello-world', 42)
}, [])
```
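On the native side, these parameters arrive in the plugin's `params` array, already converted according to the ["Types" table](#types). As a rough sketch, a Java plugin could unpack the variadic call above like this (the variable names are hypothetical, chosen only for illustration):

```java
@Override
public Object callback(ImageProxy image, Object[] params) {
  // scanCodes(frame, true, 'hello-world', 42) arrives as [Boolean, String, Double]
  Boolean useFastMode = (Boolean) params[0]; // hypothetical meaning for `true`
  String label = (String) params[1];         // 'hello-world'
  Double maxCodes = (Double) params[2];      // 42

  // a JS array such as ['qr', 'barcode'] would arrive as a ReadableNativeArray
  // ...
  return null;
}
```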
### Exceptions

To let the user know that something went wrong, you can throw an exception:

```java
@Override
public Object callback(ImageProxy image, Object[] params) {
  if (params[0] instanceof String) {
    // ...
  } else {
    throw new IllegalArgumentException("First argument has to be a string!");
  }
}
```

which will be rethrown as a JS error:

```ts
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  try {
    const codes = scanCodes(frame, true)
  } catch (e) {
    console.log(`Error: ${e.message}`)
  }
}, [])
```

## What's possible?

You can run any native code you want in a Frame Processor Plugin. Just like in the native iOS and Android Camera APIs, you will receive a frame (`CMSampleBuffer` on iOS, `ImageProxy` on Android) which you can use however you want. In other words: **everything is possible**.

## Implementations

### Long-running Frame Processors

If your Frame Processor takes longer than a single frame interval to execute, or runs asynchronously, you can create a **copy of the frame** and dispatch the actual frame processing to a **separate thread**.

For example, a realtime video chat application might use WebRTC to send the frames to the server. I/O operations (networking) are asynchronous, and we don't _need_ to wait for the upload to succeed before pushing the next frame, so we copy the frame and perform the upload on another thread:

```java
@Override
public Object callback(ImageProxy image, Object[] params) {
  String serverURL = (String)params[0];
  ImageProxy imageCopy = new ImageProxy(/* ... */);

  uploaderQueue.runAsync(() -> {
    WebRTC.uploadImage(imageCopy, serverURL);
    imageCopy.close();
  });

  return null;
}
```

### Async Frame Processors with Event Emitters

You might also run very complex AI algorithms which are not fast enough to smoothly run at **30 FPS** (**33ms** per frame). To avoid dropping frames, you can create a custom "frame queue" which processes the copied frames and calls back into JS via a React event emitter. For this you'll have to create a Native Module that handles the asynchronous native -> JS communication, see ["Sending events to JavaScript" (Android)](https://reactnative.dev/docs/native-modules-android#sending-events-to-javascript) and ["Sending events to JavaScript" (iOS)](https://reactnative.dev/docs/native-modules-ios#sending-events-to-javascript).

For the user, this might look like:

```tsx
function App() {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    SomeAI.process(frame) // does not block the frame processor, runs async
  }, [])

  useEffect(() => {
    SomeAI.addListener((results) => {
      // gets called asynchronously, goes through the React Event Emitter system
      console.log(`AI results: ${results}`)
    })
  }, [])

  return (
    <Camera frameProcessor={frameProcessor} {...cameraProps} />
  )
}
```

This way you can handle queueing up the frames yourself and asynchronously call back into JS at some later point in time using event emitters.
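On the native side, such a plugin could queue a copy of the frame and emit the results once processing finishes. Here is a minimal Android sketch, assuming a hypothetical `SomeAI.process` call, a hypothetical `processingQueue` background executor, and a `reactContext` held by the accompanying Native Module; the `sendEvent` helper follows the "Sending events to JavaScript" pattern from the React Native docs linked above:

```java
import androidx.camera.core.ImageProxy;
import com.facebook.react.bridge.ReactContext;
import com.facebook.react.modules.core.DeviceEventManagerModule;

// inside the Frame Processor Plugin: copy the frame, queue it, return immediately
@Override
public Object callback(ImageProxy image, Object[] params) {
  ImageProxy imageCopy = new ImageProxy(/* ... */);

  processingQueue.runAsync(() -> {               // hypothetical background executor
    String results = SomeAI.process(imageCopy);  // hypothetical long-running AI call
    imageCopy.close();
    sendEvent(reactContext, "SomeAI-results", results);
  });

  return null; // returns immediately, the frame processor is never blocked
}

// inside the accompanying Native Module: emit results through the React Event Emitter
private void sendEvent(ReactContext reactContext, String eventName, String results) {
  reactContext
      .getJSModule(DeviceEventManagerModule.RCTDeviceEventEmitter.class)
      .emit(eventName, results);
}
```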
### Benchmarking Frame Processor Plugins

Your Frame Processor Plugins have to be fast. VisionCamera automatically detects slow Frame Processors and outputs relevant information in the native console (Xcode: **Debug Area**, Android Studio: **Logcat**).
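If you want to measure your own plugin while developing, a simple hand-rolled timer inside the callback can help. This is a rough sketch using plain Android APIs, not VisionCamera's built-in detection; `runExpensiveDetection` is a hypothetical stand-in for your plugin's actual work:

```java
import android.util.Log;
import androidx.camera.core.ImageProxy;

@Override
public Object callback(ImageProxy image, Object[] params) {
  long start = System.nanoTime();

  Object result = runExpensiveDetection(image); // hypothetical detection call

  long durationMs = (System.nanoTime() - start) / 1_000_000;
  // at 30 FPS, anything above ~33ms per frame will cause dropped frames
  Log.d("MyPlugin", "callback took " + durationMs + "ms");
  return result;
}
```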
#### 🚀 Create your first Frame Processor Plugin for [iOS](frame-processors-plugins-ios) or [Android](frame-processors-plugins-android)!

[1]: https://github.com/mrousavy/react-native-vision-camera/blob/main/src/Frame.ts
[2]: https://github.com/mrousavy/react-native-vision-camera/blob/main/ios/Frame%20Processor/Frame.h
[3]: https://developer.android.com/reference/androidx/camera/core/ImageProxy
[4]: https://github.com/facebook/react-native/blob/9a43eac7a32a6ba3164a048960101022a92fcd5a/React/Base/RCTBridgeModule.h#L20-L24