They receive a frame from the Camera as input and can return any kind of output. For example, a `detectFaces` function returns an array of detected faces in the frame.
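On the native side, such a plugin is just a function that receives the frame and returns a value. Below is a minimal Android sketch; the `FrameProcessorPlugin` base class and `callback` signature follow the Android plugin guide, while the class name, the returned fields and the (omitted) face-detection logic are purely illustrative:

```java
import androidx.camera.core.ImageProxy;
import com.facebook.react.bridge.WritableNativeArray;
import com.facebook.react.bridge.WritableNativeMap;
import com.mrousavy.camera.frameprocessor.FrameProcessorPlugin;

public class DetectFacesFrameProcessorPlugin extends FrameProcessorPlugin {
  @Override
  public Object callback(ImageProxy frame, Object[] params) {
    // Run your face detector of choice on `frame` here, then build one map per detected face.
    WritableNativeArray faces = new WritableNativeArray();

    WritableNativeMap face = new WritableNativeMap();
    face.putDouble("x", 0);
    face.putDouble("y", 0);
    face.putDouble("width", frame.getWidth());
    face.putDouble("height", frame.getHeight());
    faces.pushMap(face);

    return faces; // arrives in the JS Frame Processor as an array of objects
  }

  public DetectFacesFrameProcessorPlugin() {
    super("detectFaces"); // the name the function is exposed under
  }
}
```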
For maximum performance, the `detectFaces` function is written in a native language (e.g. Objective-C), but it will be directly called from the VisionCamera Frame Processor JavaScript-Runtime through JSI.
Similar to a TurboModule, the Frame Processor Plugin Registry API automatically manages type conversion from JS to native: arguments are converted into the most efficient native data structures (see the ["Types" table](#types) for the exact mapping).
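Concretely, on Android the extra JS arguments show up in the plugin callback as plain Java objects. A short sketch, assuming the `Object[] params` callback signature from the Android plugin guide (only the `callback` method is shown, and the `scanCodes` call shape is made up):

```java
@Override
public Object callback(ImageProxy frame, Object[] params) {
  // For a JS call like `scanCodes(frame, 'qr', 0.8)`, the extra arguments
  // arrive here already converted to their Java counterparts:
  String codeType = (String) params[0];      // JS string -> String
  double minConfidence = (double) params[1]; // JS number -> double
  // ...run the scanner using those options...
  return null;
}
```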
Return values are automatically converted back to JS values as well, as long as they are representable in the ["Types" table](#types).
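For example, a plugin whose callback returns a plain Java `String` (again only the `callback` method is shown; the return value is made up):

```java
@Override
public Object callback(ImageProxy frame, Object[] params) {
  // A Java String is representable in the "Types" table,
  // so the JS call simply evaluates to the string "banana".
  return "banana";
}
```

The same goes for the other representable types: whatever the callback returns is exactly what the JS Frame Processor receives.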
You can run any native code you want in a Frame Processor Plugin. Just like in the native iOS and Android Camera APIs, you will receive a frame ([`CMSampleBuffer`][5] on iOS, [`ImageProxy`][6] on Android) which you can use however you want. In other words: **everything is possible**.
If your Frame Processor takes longer than a single frame interval to execute, or runs asynchronously, you can create a **copy of the frame** and dispatch the actual frame processing to a **separate thread**.
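A rough Android sketch of this pattern is shown below; the plane-copy helper, the background executor and the luminance computation are illustrative, not part of the VisionCamera API:

```java
import android.util.Log;
import androidx.camera.core.ImageProxy;
import com.mrousavy.camera.frameprocessor.FrameProcessorPlugin;
import java.nio.ByteBuffer;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SlowAnalysisFrameProcessorPlugin extends FrameProcessorPlugin {
  private final ExecutorService backgroundExecutor = Executors.newSingleThreadExecutor();

  // Copy the luminance (Y) plane so the data stays valid after the Camera recycles the frame.
  private static byte[] copyYPlane(ImageProxy frame) {
    ByteBuffer buffer = frame.getPlanes()[0].getBuffer();
    byte[] copy = new byte[buffer.remaining()];
    buffer.get(copy);
    return copy;
  }

  @Override
  public Object callback(ImageProxy frame, Object[] params) {
    byte[] yPlane = copyYPlane(frame);

    backgroundExecutor.submit(() -> {
      // Heavy work runs here, off the Frame Processor thread, and only touches
      // the copied data, never the original ImageProxy.
      long sum = 0;
      for (byte b : yPlane) sum += (b & 0xFF);
      Log.d("SlowAnalysis", "mean luma: " + (sum / (double) yPlane.length));
    });

    return null; // return immediately so the next frame isn't blocked
  }

  public SlowAnalysisFrameProcessorPlugin() {
    super("analyzeSlowly");
  }
}
```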
For example, a realtime video chat application might use WebRTC to send the frames to the server. I/O operations (networking) are asynchronous, and we don't _need_ to wait for the upload to succeed before pushing the next frame, so we copy the frame and perform the upload on another thread.
You might also run some very complex AI algorithms which are not fast enough to run smoothly at **30 FPS** (**33ms** per frame). To avoid dropping frames, you can create a custom "frame queue" which processes the copied frames and calls back into JS via a React event emitter. For this you'll have to create a Native Module that handles the asynchronous native -> JS communication; see ["Sending events to JavaScript" (Android)](https://reactnative.dev/docs/native-modules-android#sending-events-to-javascript) and ["Sending events to JavaScript" (iOS)](https://reactnative.dev/docs/native-modules-ios#sending-events-to-javascript).
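The JS-facing half of such a queue is an ordinary Native Module that emits events. A minimal Android sketch following the linked guide (the module name, event name and payload are illustrative):

```java
import com.facebook.react.bridge.Arguments;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReactContextBaseJavaModule;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.modules.core.DeviceEventManagerModule;

public class FrameQueueModule extends ReactContextBaseJavaModule {
  public FrameQueueModule(ReactApplicationContext reactContext) {
    super(reactContext);
  }

  @Override
  public String getName() {
    return "FrameQueue";
  }

  // Call this from the processing thread once a queued frame has been analyzed.
  public void sendResultToJS(int framesInQueue, String result) {
    WritableMap payload = Arguments.createMap();
    payload.putInt("framesInQueue", framesInQueue);
    payload.putString("result", result);
    getReactApplicationContext()
        .getJSModule(DeviceEventManagerModule.RCTDeviceEventEmitter.class)
        .emit("onFrameProcessed", payload);
  }
}
```

On the JS side you would subscribe to these events with a `NativeEventEmitter` (or `DeviceEventEmitter`) listener for `"onFrameProcessed"`.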
Your Frame Processor Plugins have to be fast. Use the FPS Graph (`enableFpsGraph`) to see how fast your Camera is running; if it is not running at the target FPS, your Frame Processor is too slow.