react-native-vision-camera/example
Marc Rousavy 30b56153db
feat: Sync Frame Processors (plus runAsync and runAtTargetFps) (#1472)
Before, Frame Processors ran on a separate Thread.

After, Frame Processors run fully synchronously and always at the same FPS as the Camera.

Two new functions have been introduced:

* `runAtTargetFps(fps: number, func: () => void)`: Runs the given function at most as often as the given `fps`, effectively throttling its calls.
* `runAsync(frame: Frame, func: () => void)`: Runs the given function on a separate Thread for Frame Processing. A strong reference to the Frame is held for as long as the function takes to execute.

You can use `runAtTargetFps` to throttle calls to a specific API (e.g. if your Camera is running at 60 FPS but you only want to run face detection at ~25 FPS, use `runAtTargetFps(25, ...)`).
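
For instance, a minimal sketch of throttled face detection; `scanFaces` here is a hypothetical Frame Processor Plugin used for illustration, not part of this change:

```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // this part runs for every Camera frame (e.g. 60 FPS)

  runAtTargetFps(25, () => {
    'worklet'
    // this part runs at most ~25 times per second
    // scanFaces is a hypothetical plugin used for illustration
    const faces = scanFaces(frame)
    console.log(`Detected ${faces.length} faces`)
  })
}, [])
```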

You can use `runAsync` to run a heavy algorithm asynchronously, so that the Camera is not blocked while your algorithm runs. This is useful if your main synchronous processor draws something while your async processor performs image analysis on the side.
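
As a sketch of that pattern; `detectObjects` is a hypothetical plugin standing in for any heavy analysis step:

```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // lightweight synchronous work runs for every frame
  console.log(`Frame: ${frame.width}x${frame.height}`)

  runAsync(frame, () => {
    'worklet'
    // heavy analysis runs on a separate Thread and may take longer than one frame
    // detectObjects is a hypothetical plugin used for illustration
    const objects = detectObjects(frame)
    console.log(`Found ${objects.length} objects`)
  })
}, [])
```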

You can also combine both functions.

Examples:

```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")
}, [])
```

```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")

  runAtTargetFps(10, () => {
    'worklet'
    console.log("I'm running at 10 FPS!")
  })
}, [])
```



```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")

  runAsync(frame, () => {
    'worklet'
    console.log("I'm running on another Thread, I can block for longer!")
  })
}, [])
```

```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")

  runAtTargetFps(10, () => {
    'worklet'
    runAsync(frame, () => {
      'worklet'
      console.log("I'm running on another Thread at 10 FPS, I can block for longer!")
    })
  })
}, [])
```
2023-02-15 16:47:09 +01:00
android feat: Replace Reanimated with RN Worklets (#1468) 2023-02-13 15:22:45 +01:00
ios feat: Sync Frame Processors (plus runAsync and runAtTargetFps) (#1472) 2023-02-15 16:47:09 +01:00
src feat: Sync Frame Processors (plus runAsync and runAtTargetFps) (#1472) 2023-02-15 16:47:09 +01:00
.eslintrc.js feat: Add React Native 0.66 support (#490) 2021-10-05 12:22:14 +02:00
app.json Bootstrap 2021-02-19 16:07:53 +01:00
babel.config.js feat: Replace Reanimated with RN Worklets (#1468) 2023-02-13 15:22:45 +01:00
clean.sh feat: Frame Processors for Android (#196) 2021-06-27 12:37:54 +02:00
index.js feat: Add React Native 0.66 support (#490) 2021-10-05 12:22:14 +02:00
metro.config.js feature: Frame Processors (iOS) (#2) 2021-05-06 14:11:55 +02:00
package.json feat: Sync Frame Processors (plus runAsync and runAtTargetFps) (#1472) 2023-02-15 16:47:09 +01:00
README.md chore: fix typo in README.md (#718) 2022-01-05 14:51:44 +01:00
tsconfig.json feat: Add React Native 0.66 support (#490) 2021-10-05 12:22:14 +02:00
yarn.lock feat: Sync Frame Processors (plus runAsync and runAtTargetFps) (#1472) 2023-02-15 16:47:09 +01:00

Vision Camera playground

Overview

This is a demo application featuring some of the many features of the Vision Camera:

  • Photo capture
  • Video capture
  • Flipping device (back camera <-> front camera)
  • Device filtering (ultra-wide-angle, wide-angle, telephoto, or even combined virtual multi-cameras; see the sketch after this list)
  • Format filtering (targeting 60 FPS, best capture size, best matching aspect ratio, etc.)
  • Zooming using react-native-gesture-handler and react-native-reanimated
  • Smoothly switching between constituent camera devices (see demo on my Twitter)
  • HDR mode
  • Night mode
  • Flash for photo capture
  • Flash for video capture
  • Activating/Pausing the Camera but keeping it "warm"
  • Using the Example Frame Processor Plugin
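
As a rough idea of how device selection looks, here is a minimal sketch assuming the `useCameraDevices` hook from react-native-vision-camera v2 (not the exact code used in this example app):

```jsx
import * as React from 'react'
import { Camera, useCameraDevices } from 'react-native-vision-camera'

function CameraScreen() {
  // filter for a wide-angle back camera device
  const devices = useCameraDevices('wide-angle-camera')
  const device = devices.back // or devices.front to flip the camera

  if (device == null) return null
  return <Camera style={{ flex: 1 }} device={device} isActive={true} />
}
```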

Get started

To try the playground out for yourself, run the following commands:

git clone https://github.com/mrousavy/react-native-vision-camera
cd react-native-vision-camera
yarn bootstrap

iOS

  1. Open the example/ios/VisionCameraExample.xcworkspace file with Xcode
  2. Change signing configuration to your developer account
  3. Select your device in the devices drop-down
  4. Hit run

Android

  1. Open the example/android/ folder with Android Studio
  2. Select your device in the devices drop-down
  3. Hit run