---
id: frame-processors-plugins-overview
title: Creating Frame Processor Plugins
sidebar_label: Overview
---

import useBaseUrl from '@docusaurus/useBaseUrl'
import Tabs from '@theme/Tabs'
import TabItem from '@theme/TabItem'

## Overview

Frame Processor Plugins are **native functions** which can be directly called from a JS Frame Processor. (See ["Frame Processors"](frame-processors))

They receive a frame from the Camera as an input and can return any kind of output. For example, a `detectFaces` function returns an array of detected faces in the frame:
2021-05-06 14:11:55 +02:00
2023-09-27 12:10:06 +02:00
```tsx
2021-05-06 14:11:55 +02:00
function App() {
const frameProcessor = useFrameProcessor((frame) => {
'worklet'
    // highlight-next-line
const faces = detectFaces(frame)
console.log(`Faces in Frame: ${faces}`)
  }, [])
return (
<Camera frameProcessor={frameProcessor} {...cameraProps} />
)
}
```

For maximum performance, the `detectFaces` function is written in a native language (e.g. Objective-C), but it can be called directly from the VisionCamera Frame Processor JavaScript runtime through JSI.
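
On Android, for example, a plugin is a class overriding a single `callback` method. A minimal sketch (the class name `FaceDetectorFrameProcessorPlugin` is hypothetical, and registering the plugin is covered in the platform guides linked at the bottom of this page):

```java
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import com.mrousavy.camera.frameprocessor.Frame;
import com.mrousavy.camera.frameprocessor.FrameProcessorPlugin;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class FaceDetectorFrameProcessorPlugin extends FrameProcessorPlugin {
  @Nullable
  @Override
  public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) {
    List<Object> faces = new ArrayList<>();
    // ... run your native face detector on `frame` and fill `faces` ...
    return faces; // becomes a JS array
  }
}
```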

### Types

Similar to a TurboModule, the Frame Processor Plugin Registry API automatically manages type conversion from JS to native. Values are converted into the most efficient data structures, as seen here:

| JS Type              | Objective-C/Swift Type        | Java/Kotlin Type           |
|----------------------|-------------------------------|----------------------------|
| `number`             | `NSNumber*` (double)          | `Double`                   |
| `boolean`            | `NSNumber*` (boolean)         | `Boolean`                  |
| `string`             | `NSString*`                   | `String`                   |
| `[]`                 | `NSArray*`                    | `List<Object>`             |
| `{}`                 | `NSDictionary*`               | `Map<String, Object>`      |
| `undefined` / `null` | `nil`                         | `null`                     |
| `(any, any) => void` | [`RCTResponseSenderBlock`][4] | `(Object, Object) -> void` |
| `ArrayBuffer`        | [`SharedArray*`][7]           | [`SharedArray`][8]         |
| [`Frame`][1]         | [`Frame*`][2]                 | [`Frame`][3]               |
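
These conversions work in both directions. For example, a value built on the Android side like this (a hedged sketch; the keys are made up for illustration) arrives in JS as a plain object:

```java
@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) {
  Map<String, Object> result = new HashMap<>();
  result.put("label", "cat");                        // -> JS string
  result.put("confidence", 0.98);                    // -> JS number
  result.put("isMoving", true);                      // -> JS boolean
  result.put("bounds", Arrays.asList(0, 0, 64, 48)); // -> JS number[]
  return result; // -> JS `{ label, confidence, isMoving, bounds }`
}
```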

### Return values

Return values are automatically converted to JS values, assuming they are representable in the ["Types" table](#types). For example, the following Java Frame Processor Plugin:

```java
@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) {
  return "cat";
}
```
Returns a `string` in JS:
```ts
export function detectObject(frame: Frame): string {
'worklet'
const result = FrameProcessorPlugins.detectObject(frame)
  console.log(result) // <-- "cat"
  return result
}
```

You can also manipulate the buffer and return it (or a copy of it) by returning a [`Frame`][2]/[`Frame`][3] instance:

```java
@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) {
Frame resizedFrame = new Frame(/* ... */);
return resizedFrame;
}
```

Which returns a [`Frame`][1] in JS:

```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
// creates a new `Frame` that's 720x480
  const resizedFrame = resize(frame, 720, 480)
  // by downscaling the frame, the `detectObjects` function runs faster.
  const objects = detectObjects(resizedFrame)
  console.log(objects)
}, [])
```

### Parameters

Frame Processors can also accept parameters, following the same type convention as [return values](#return-values):

```ts
const frameProcessor = useFrameProcessor((frame) => {
'worklet'
  const faces = scanFaces(frame, { accuracy: 'fast' })
}, [])
```
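
On the native side, those parameters arrive as the `arguments` map, converted per the ["Types" table](#types). A hedged sketch mirroring the call above (`scanFaces` and its `accuracy` option are just this page's example names):

```java
@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) {
  // JS: scanFaces(frame, { accuracy: 'fast' }) -> arguments = { accuracy: "fast" }
  String accuracy = arguments != null ? (String) arguments.get("accuracy") : null;
  boolean fastMode = "fast".equals(accuracy);
  // ... run the face detector in fast or accurate mode ...
  return null;
}
```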

### Exceptions

To let the user know that something went wrong, you can throw an exception:
```java
@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) {
if (arguments != null && arguments.get("codeType") instanceof String) {
// ...
} else {
    throw new RuntimeException("codeType property has to be a string!");
}
}
```
Which will throw a JS error:
```ts
const frameProcessor = useFrameProcessor((frame) => {
'worklet'
try {
    const codes = scanCodes(frame, { codeType: 1234 })
} catch (e) {
    console.log(`Error: ${e.message}`)
}
}, [])
```

## What's possible?

You can run any native code you want in a Frame Processor Plugin. Just like in the native iOS and Android Camera APIs, you will receive a frame ([`CMSampleBuffer`][5] on iOS, [`ImageProxy`][6] on Android) which you can use however you want. In other words: **everything is possible**.
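
For example, on Android you could inspect the camera image backing the [`Frame`][3] directly. A hedged sketch (assuming the `getImage()`, `getWidth()` and `getHeight()` accessors on the Android `Frame` class):

```java
import android.media.Image;
import android.util.Log;

// inside your FrameProcessorPlugin subclass:
@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) {
  Image image = frame.getImage();
  Log.d("MyPlugin", "Frame: " + frame.getWidth() + "x" + frame.getHeight()
      + ", format: " + image.getFormat());
  return null;
}
```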

## Implementations

### Long-running Frame Processors

If your Frame Processor takes longer than a single frame interval to execute, or runs asynchronously, you can create a **copy of the frame** and dispatch the actual frame processing to a **separate thread**.

For example, a realtime video chat application might use WebRTC to send the frames to the server. I/O operations (networking) are asynchronous, and we don't _need_ to wait for the upload to succeed before pushing the next frame, so we copy the frame and perform the upload on another thread.

```java
@Nullable
@Override
public Object callback(@NonNull Frame frame, @Nullable Map<String, Object> arguments) {
if (arguments == null) {
return null;
}
String serverURL = (String)arguments.get("serverURL");
  Frame frameCopy = new Frame(/* ... */);

  uploaderQueue.runAsync(() -> {
    WebRTC.uploadImage(frameCopy, serverURL);
    frameCopy.close();
});

  return null;
}
```

### Async Frame Processors with Event Emitters

You might also run very complex AI algorithms which are not fast enough to smoothly run at **30 FPS** (**33ms**). To avoid dropping frames, you can create a custom "frame queue" which processes the copied frames and calls back into JS via a React event emitter. For this you'll have to create a Native Module that handles the asynchronous native -> JS communication, see ["Sending events to JavaScript" (Android)](https://reactnative.dev/docs/native-modules-android#sending-events-to-javascript) and ["Sending events to JavaScript" (iOS)](https://reactnative.dev/docs/native-modules-ios#sending-events-to-javascript).

For the user, this might look like this:
```tsx
function App() {
const frameProcessor = useFrameProcessor((frame) => {
'worklet'
SomeAI.process(frame) // does not block frame processor, runs async
}, [])
useEffect(() => {
SomeAI.addListener((results) => {
      // gets called asynchronously, goes through the React Event Emitter system
console.log(`AI results: ${results}`)
})
}, [])
return (
<Camera frameProcessor={frameProcessor} {...cameraProps} />
)
}
```
This way you can handle queueing up the frames yourself and asynchronously call back into JS at some later point in time using event emitters.
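
A hedged sketch of the native half of such a module (the `SomeAI`/`AIResults` names are illustrative; the emitter pattern follows the React Native docs linked above):

```java
import androidx.annotation.NonNull;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReactContextBaseJavaModule;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.modules.core.DeviceEventManagerModule;

public class SomeAIModule extends ReactContextBaseJavaModule {
  public SomeAIModule(ReactApplicationContext context) {
    super(context);
  }

  @NonNull
  @Override
  public String getName() {
    return "SomeAI";
  }

  // Call this from your background "frame queue" whenever a result is ready;
  // it is received by `SomeAI.addListener` on the JS side.
  private void sendResults(WritableMap results) {
    getReactApplicationContext()
        .getJSModule(DeviceEventManagerModule.RCTDeviceEventEmitter.class)
        .emit("AIResults", results);
  }
}
```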

### Benchmarking Frame Processor Plugins

Your Frame Processor Plugins have to be fast. Use the FPS Graph (`enableFpsGraph`) to see how fast your Camera is running; if it is not running at the target FPS, your Frame Processor is too slow.

<br />

#### 🚀 Create your first Frame Processor Plugin for [iOS](frame-processors-plugins-ios) or [Android](frame-processors-plugins-android)!

[1]: https://github.com/mrousavy/react-native-vision-camera/blob/main/package/src/Frame.ts
[2]: https://github.com/mrousavy/react-native-vision-camera/blob/main/package/ios/Frame%20Processor/Frame.h
[3]: https://github.com/mrousavy/react-native-vision-camera/blob/main/package/android/src/main/java/com/mrousavy/camera/frameprocessor/Frame.java
[4]: https://github.com/facebook/react-native/blob/9a43eac7a32a6ba3164a048960101022a92fcd5a/React/Base/RCTBridgeModule.h#L20-L24
[5]: https://developer.apple.com/documentation/coremedia/cmsamplebuffer
[6]: https://developer.android.com/reference/androidx/camera/core/ImageProxy
[7]: https://github.com/mrousavy/react-native-vision-camera/blob/main/package/ios/Frame%20Processor/SharedArray.h
[8]: https://github.com/mrousavy/react-native-vision-camera/blob/main/package/android/src/main/java/com/mrousavy/camera/frameprocessor/SharedArray.java