---
id: frame-processors-plugins-final
title: Creating Frame Processor Plugins
sidebar_label: Finish creating your Frame Processor Plugin
---

## Expose your Frame Processor Plugin to JS

To make the Frame Processor Plugin available to the Frame Processor Worklet Runtime, create the following wrapper function in JS/TS:

```ts
import { FrameProcessorPlugins, Frame } from 'react-native-vision-camera'

/**
 * Scans QR codes.
 */
export function scanQRCodes(frame: Frame): string[] {
  'worklet'
  return FrameProcessorPlugins.scanQRCodes(frame)
}
```
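
The wrapper above assumes the native side is correctly linked. If the native plugin is missing (for example, pods not installed or the Gradle dependency absent), the call can yield `undefined` at runtime even though the TypeScript signature promises `string[]`. A small defensive sketch — `toStringArray` is a hypothetical helper, not part of the VisionCamera API:

```ts
// Hypothetical helper: coerce an untrusted plugin result into a string[].
// Not part of VisionCamera — shown only as a defensive pattern.
export function toStringArray(value: unknown): string[] {
  if (Array.isArray(value) && value.every((v) => typeof v === 'string')) {
    return value
  }
  // A missing or misconfigured native plugin may yield undefined or garbage.
  return []
}
```

In the wrapper, you could pass the raw plugin result through such a guard before returning it, so a broken native install fails soft instead of crashing the Worklet.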

## Test it!

Simply call the wrapper Worklet in your Frame Processor:

```tsx {4}
function App() {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    const qrCodes = scanQRCodes(frame)
    console.log(`QR Codes in Frame: ${qrCodes}`)
  }, [])

  return (
    <Camera frameProcessor={frameProcessor} {...cameraProps} />
  )
}
```

## Next Steps

If you want to distribute your Frame Processor Plugin, simply publish it to npm:

1. Create a blank Native Module using [bob](https://github.com/callstack/react-native-builder-bob) or [create-react-native-module](https://github.com/brodybits/create-react-native-module)
2. Name it `vision-camera-plugin-xxxxx`, where `xxxxx` is the name of your plugin
3. Remove the generated template code from the example Native Module
4. Add VisionCamera to `peerDependencies`: `"react-native-vision-camera": ">= 3"`
5. Implement the Frame Processor Plugin in the iOS, Android and JS/TS codebases using the guides above
6. Publish the plugin to npm. Users then only need to install it with `npm i vision-camera-plugin-xxxxx` and add it to their `babel.config.js` file.
7. [Add the plugin to the **official VisionCamera plugin list**](https://github.com/mrousavy/react-native-vision-camera/edit/main/docs/docs/guides/FRAME_PROCESSOR_PLUGIN_LIST.mdx) for more visibility
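
For step 6, the exact `babel.config.js` entry depends on which Worklets/Reanimated version the app uses. As an illustration only — the plugin path and the `__scanQRCodes` global name are assumptions that must match your plugin's registered native name — a Reanimated-style config could look like this:

```js
// Illustrative sketch — the babel plugin entry and `globals` key depend on
// the Worklets/Reanimated version; `__scanQRCodes` is an assumed name.
module.exports = {
  plugins: [
    [
      'react-native-reanimated/plugin',
      { globals: ['__scanQRCodes'] },
    ],
  ],
}
```

Consult the babel setup guide for your Worklets version to confirm the plugin path and whether a `globals` entry is still required.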

<br />

#### 🚀 Next section: [Browse Community Plugins](frame-processor-plugin-list)