docs: Lifecycle (#182)

* Add docs for Lifecycle

* Update CAPTURING.mdx

* move

* Update DEVICES.mdx

* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx

* f

* move FP

* separate focusing

* fix links
Marc Rousavy 2021-06-07 15:55:20 +02:00 committed by GitHub
parent 2915b176b2
commit a02f378a4b
12 changed files with 95 additions and 53 deletions


@ -91,4 +91,4 @@ The above example only demonstrates how to animate the `zoom` property. To actua
<br />
#### 🚀 Next section: [Camera Errors](errors)
#### 🚀 Next section: [Focusing](focusing)
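For readers landing on this hunk without the rest of the Animating guide, here is a rough sketch of the zoom-animation pattern it refers to, using react-native-reanimated v2. The `device` prop wiring and the exact `zoom` target value are assumptions; check the guide for the documented zoom range in your version.

```tsx
import * as React from 'react'
import { useCallback } from 'react'
import { Button, StyleSheet } from 'react-native'
import Reanimated, { useAnimatedProps, useSharedValue, withSpring } from 'react-native-reanimated'
import { Camera, CameraDevice, CameraProps } from 'react-native-vision-camera'

// Make the Camera's `zoom` prop animatable from a Reanimated shared value
const ReanimatedCamera = Reanimated.createAnimatedComponent(Camera)
Reanimated.addWhitelistedNativeProps({ zoom: true })

export function ZoomExample({ device }: { device: CameraDevice }): React.ReactElement {
  const zoom = useSharedValue(0)
  const animatedProps = useAnimatedProps<Partial<CameraProps>>(() => ({ zoom: zoom.value }), [zoom])

  // Animate towards a placeholder target value with a spring curve
  const zoomIn = useCallback(() => {
    zoom.value = withSpring(1)
  }, [zoom])

  return (
    <>
      <ReanimatedCamera style={StyleSheet.absoluteFill} device={device} isActive={true} animatedProps={animatedProps} />
      <Button title="Zoom in" onPress={zoomIn} />
    </>
  )
}
```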


@ -36,7 +36,6 @@ The most important actions are:
* [Taking Photos](#taking-photos)
- [Taking Snapshots](#taking-snapshots)
* [Recording Videos](#recording-videos)
* [Focussing](#focussing)
## Taking Photos
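The photo-capture call described in this section (elided by the hunk) looks roughly like this. A minimal sketch, assuming `photo={true}` is set on the `<Camera>` and `camera` is a ref to it:

```ts
// take a photo; `flash` is optional
const photo = await camera.current.takePhoto({
  flash: 'on',
})
console.log(`Photo saved to ${photo.path}`)
```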
@ -86,7 +85,7 @@ To start a video recording you first have to enable video capture:
{...props}
video={true}
audio={true} // <-- optional
/>
```
Then, simply use the Camera's [`startRecording(...)`](../api/classes/camera.camera-1#startrecording) function:
@ -109,20 +108,6 @@ await camera.current.stopRecording()
Once a recording has been stopped, the `onRecordingFinished` callback passed to the `startRecording` function will be invoked with a [`VideoFile`](../api/interfaces/videofile.videofile-1) which you can then use to display in a [`<Video>`](https://github.com/react-native-video/react-native-video) component.
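Putting the calls from this section together, a minimal sketch (assuming `camera` is a ref to a `<Camera>` with `video={true}` enabled, as shown above):

```ts
const camera = useRef<Camera>(null)
// ...
// start: the callbacks fire when the recording ends or errors
camera.current?.startRecording({
  flash: 'on', // optional
  onRecordingFinished: (video) => console.log(`Video saved to ${video.path}`),
  onRecordingError: (error) => console.error(error),
})

// ...later, stop the recording to trigger `onRecordingFinished`
await camera.current?.stopRecording()
```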
## Focussing
To focus the camera to a specific point, simply use the Camera's [`focus(...)`](../api/classes/camera.camera-1#focus) function:
```ts
await camera.current.focus({ x: tapEvent.x, y: tapEvent.y })
```
The focus function expects a [`Point`](../api/interfaces/point.point-1) parameter which represents the location relative to the Camera View where you want to focus the Camera to. If you use react-native-gesture-handler, this will consist of the [`x`](https://docs.swmansion.com/react-native-gesture-handler/docs/api/gesture-handlers/tap-gh#x) and [`y`](https://docs.swmansion.com/react-native-gesture-handler/docs/api/gesture-handlers/tap-gh#y) properties of the tap event payload.
So for example, `{ x: 0, y: 0 }` will focus to the upper left corner, while `{ x: CAM_WIDTH, y: CAM_HEIGHT }` will focus to the bottom right corner.
Focussing adjusts auto-focus (AF) and auto-exposure (AE).
<br />
#### 🚀 Next section: [Frame Processors](frame-processors)


@ -17,7 +17,7 @@ import useBaseUrl from '@docusaurus/useBaseUrl';
Camera devices are the physical (or "virtual") devices that can be used to record videos or capture photos.
* **Physical**: A physical camera device is a **camera lens on your phone**. Different physical camera devices have different specifications, such as different capture formats, field of views, focal lengths, and more. Some phones have multiple physical camera devices.
* **Physical**: A physical camera device is a **camera lens on your phone**. Different physical camera devices have different specifications, such as different capture formats, field of views, frame rates, focal lengths, and more. Some phones have multiple physical camera devices.
> Examples: _"Backside Wide-Angle Camera"_, _"Frontside Wide-Angle Camera (FaceTime HD)"_, _"Ultra-Wide-Angle back camera"_.
@ -62,9 +62,7 @@ Make sure to be careful when filtering out unneeded camera devices, since not ev
### The `useCameraDevices` hook
The react-native-vision-camera library provides a hook to make camera device selection a lot easier.
You can specify a device type to only find devices with the given type:
VisionCamera provides a hook to make camera device selection a lot easier. You can specify a device type to only find devices with the given type:
```tsx
function App() {
@ -100,31 +98,6 @@ function App() {
}
```
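A short sketch of the device-type variant mentioned above. The `'wide-angle-camera'` type string is an assumption here; see the `useCameraDevices` API reference for the full list of device types:

```tsx
// Only find devices of the given type:
const devices = useCameraDevices('wide-angle-camera')
const device = devices.back // the back wide-angle camera, or undefined if none exists
```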
### The `isActive` prop
The Camera's `isActive` property can be used to _pause_ the session (`isActive={false}`) while still keeping the session "warm". This is more desirable than completely unmounting the camera, since _resuming_ the session (`isActive={true}`) will be **much faster** than re-mounting the camera view.
For example, you want to **pause the camera** when the user **navigates to another page** or **minimizes the app** since otherwise the camera continues to run in the background without the user seeing it, causing **significant battery drain**. Also, on iOS a green dot indicates to the user that the camera is still active, possibly causing the user to raise privacy concerns. (🔗 See ["About the orange and green indicators in your iPhone status bar"](https://support.apple.com/en-us/HT211876))
This example demonstrates how you could pause the camera stream once the app goes into background using a custom `useIsAppForeground` hook:
```tsx
function App() {
const devices = useCameraDevices()
const device = devices.back
const isAppForeground = useIsAppForeground()
if (device == null) return <LoadingView />
return (
<Camera
style={StyleSheet.absoluteFill}
device={device}
isActive={isAppForeground}
/>
)
}
```
### The `supportsPhotoAndVideoCapture` prop
Camera devices provide the [`supportsPhotoAndVideoCapture` property](/docs/api/interfaces/cameradevice.cameradevice-1#supportsphotoandvideocapture) which determines whether the device supports enabling photo- and video-capture at the same time.
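For illustration, a hedged sketch of how that property might be used (the `photo`/`video` props are the ones from the Capturing guide above; the surrounding component is assumed):

```tsx
// Only enable simultaneous photo + video capture if the device supports it
const enableVideo = device.supportsPhotoAndVideoCapture

return (
  <Camera
    {...props}
    device={device}
    photo={true}
    video={enableVideo}
  />
)
```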
@ -136,4 +109,4 @@ If `supportsPhotoAndVideoCapture` is `false` but you still need photo- and video
<br />
#### 🚀 Next section: [Camera Formats](formats)
#### 🚀 Next section: [Camera Lifecycle](lifecycle)


@ -0,0 +1,25 @@
---
id: focusing
title: Focusing
sidebar_label: Focusing
---
To focus the camera on a specific point, use the Camera's [`focus(...)`](../api/classes/camera.camera-1#focus) function:
```ts
await camera.current.focus({ x: tapEvent.x, y: tapEvent.y })
```
The focus function expects a [`Point`](../api/interfaces/point.point-1) parameter which represents the location (relative to the Camera view, in _points_) that you want the Camera to focus on. If you use [react-native-gesture-handler](https://docs.swmansion.com/react-native-gesture-handler/), this will be the [`x`](https://docs.swmansion.com/react-native-gesture-handler/docs/api/gesture-handlers/tap-gh#x) and [`y`](https://docs.swmansion.com/react-native-gesture-handler/docs/api/gesture-handlers/tap-gh#y) properties of the tap event payload.
For example, `{ x: 0, y: 0 }` will focus on the upper left corner, while `{ x: CAM_WIDTH, y: CAM_HEIGHT }` will focus on the bottom right corner.
Focusing adjusts auto-focus (AF) and auto-exposure (AE).
:::note
`focus(...)` will fail if the selected Camera device does not support focusing (see [`CameraDevice.supportsFocus`](/docs/api/interfaces/cameradevice.cameradevice-1#supportsfocus))
:::
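To tie this together, a sketch of tap-to-focus using react-native-gesture-handler's `TapGestureHandler`. Component and variable names are illustrative, not part of VisionCamera:

```tsx
import * as React from 'react'
import { useCallback, useRef } from 'react'
import { StyleSheet } from 'react-native'
import { Camera, useCameraDevices } from 'react-native-vision-camera'
import { State, TapGestureHandler, TapGestureHandlerStateChangeEvent } from 'react-native-gesture-handler'

export function FocusableCamera(): React.ReactElement | null {
  const camera = useRef<Camera>(null)
  const devices = useCameraDevices()
  const device = devices.back

  const onTap = useCallback(async (event: TapGestureHandlerStateChangeEvent) => {
    if (event.nativeEvent.state !== State.ACTIVE) return
    if (device?.supportsFocus !== true) return // focus(...) fails on devices without focus support
    // the tap event's x/y are relative to the Camera view, in points
    await camera.current?.focus({ x: event.nativeEvent.x, y: event.nativeEvent.y })
  }, [device])

  if (device == null) return null
  return (
    <TapGestureHandler onHandlerStateChange={onTap}>
      <Camera ref={camera} style={StyleSheet.absoluteFill} device={device} isActive={true} />
    </TapGestureHandler>
  )
}
```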
<br />
#### 🚀 Next section: [Camera Errors](errors)


@ -153,9 +153,11 @@ You should always verify that the format supports the desired FPS, and fall back
Other props that depend on the `format`:
* `fps`: Specifies the frame rate to use
* `hdr`: Enables HDR photo or video capture and preview
* `lowLightBoost`: Enables a night-mode/low-light-boost for photo or video capture and preview
* `colorSpace`: Uses the specified color-space for photo or video capture and preview (iOS only since Android only uses `YUV`)
* `videoStabilizationMode`: Specifies the video stabilization mode to use for this camera device
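A hedged sketch of how these format-dependent props might be combined. The filter below and the `frameRateRanges`/`supportsVideoHDR` format properties are assumptions based on the `CameraDeviceFormat` API, so adjust them to your device's capabilities:

```tsx
// Pick a format that supports HDR video and at least 60 FPS, then configure
// the props that depend on it
const format = device.formats.find((f) =>
  f.supportsVideoHDR && f.frameRateRanges.some((r) => r.maxFrameRate >= 60)
)

return (
  <Camera
    {...props}
    device={device}
    format={format}
    fps={60}
    hdr={true}
    lowLightBoost={device.supportsLowLightBoost}
  />
)
```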
<br />


@ -150,6 +150,10 @@ I have used [MLKit Vision Image Labeling](https://firebase.google.com/docs/ml-ki
> All measurements are recorded on an iPhone 11 Pro, benchmarked total execution time of the [`captureOutput`](https://developer.apple.com/documentation/avfoundation/avcapturevideodataoutputsamplebufferdelegate/1385775-captureoutput) function by using [`CFAbsoluteTimeGetCurrent`](https://developer.apple.com/documentation/corefoundation/1543542-cfabsolutetimegetcurrent). Running smaller images (lower than 4k resolution) is much quicker and many algorithms can even run at 60 FPS.
### Performance
Frame Processors will be **synchronously** called for each frame the Camera sees and have to finish executing before the next frame arrives, otherwise the next frame(s) will be dropped. For a frame rate of **30 FPS**, you have about **33ms** to finish processing frames. Use [`frameProcessorFps`](../api/interfaces/cameraprops.cameraprops-1#frameprocessorfps) to throttle the frame processor's FPS. For a QR Code Scanner, **5 FPS** might suffice.
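A sketch of such throttling, reusing the hypothetical `scanQRCodes` example plugin that these guides use for illustration:

```tsx
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  const qrCodes = scanQRCodes(frame)
  console.log(`QR Codes in Frame: ${qrCodes}`)
}, [])

// Run the frame processor at most 5 times per second
return (
  <Camera
    {...props}
    frameProcessor={frameProcessor}
    frameProcessorFps={5}
  />
)
```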
### ESLint react-hooks plugin
If you are using the [react-hooks ESLint plugin](https://www.npmjs.com/package/eslint-plugin-react-hooks), make sure to add `useFrameProcessor` to `additionalHooks` inside your ESLint config. (See ["advanced configuration"](https://www.npmjs.com/package/eslint-plugin-react-hooks#advanced-configuration))


@ -28,10 +28,6 @@ function App() {
To achieve **maximum performance**, the `scanQRCodes` function is written in a native language (e.g. Objective-C), but it will be directly called from the VisionCamera Frame Processor JavaScript-Runtime.
### Execution
Frame Processors will be **synchronously** called for each frame the Camera sees and have to finish executing before the next frame arrives, otherwise the next frame(s) will be dropped. For a frame rate of **30 FPS**, you have about **33ms** to finish processing frames. Use [`frameProcessorFps`](../api/interfaces/cameraprops.cameraprops-1#frameprocessorfps) to throttle the frame processor's FPS. For a QR Code Scanner, **5 FPS** might suffice.
### Return Types
Frame Processors can return any primitive value that is representable in JS. So for Objective-C that maps to:


@ -23,6 +23,7 @@ You have to restart metro-bundler for changes in the `babel.config.js` file to t
## Plugin List
* [mrousavy/**vision-camera-image-labeler**](https://github.com/mrousavy/vision-camera-image-labeler): A fast realtime image labeler plugin using **MLKit Vision**.
* [mrousavy/**vision-camera-resize-plugin**](https://github.com/mrousavy/vision-camera-resize-plugin): A plugin for fast frame resizing to optimize execution speed of expensive AI algorithms.


@ -0,0 +1,54 @@
---
id: lifecycle
title: Lifecycle
sidebar_label: Lifecycle
---
import useBaseUrl from '@docusaurus/useBaseUrl';
<div>
<img align="right" width="283" src={useBaseUrl("img/example.png")} />
</div>
### The `isActive` prop
The Camera's `isActive` property can be used to _pause_ the session (`isActive={false}`) while still keeping the session "warm". This is more desirable than completely unmounting the camera, since _resuming_ the session (`isActive={true}`) will be **much faster** than re-mounting the camera view.
For example, you want to **pause the camera** when the user **navigates to another page** or **minimizes the app**, since otherwise the camera continues to run in the background without the user seeing it, causing **significant battery drain**. Also, on iOS a green dot indicates to the user that the camera is still active, which might raise privacy concerns. (🔗 See ["About the orange and green indicators in your iPhone status bar"](https://support.apple.com/en-us/HT211876))
This example demonstrates how you could pause the camera stream once the app goes into the background, using a custom `useIsAppForeground` hook:
```tsx
function App() {
const devices = useCameraDevices()
const device = devices.back
const isAppForeground = useIsAppForeground()
if (device == null) return <LoadingView />
return (
<Camera
style={StyleSheet.absoluteFill}
device={device}
isActive={isAppForeground}
/>
)
}
```
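The custom `useIsAppForeground` hook referenced above is not part of VisionCamera; one way to sketch it is with React Native's `AppState` API (the subscription-style listener shown here is available in newer React Native versions):

```tsx
import { useEffect, useState } from 'react'
import { AppState, AppStateStatus } from 'react-native'

export function useIsAppForeground(): boolean {
  const [isForeground, setIsForeground] = useState(true)

  useEffect(() => {
    // 'active' means the app is running in the foreground
    const onChange = (state: AppStateStatus): void => setIsForeground(state === 'active')
    const subscription = AppState.addEventListener('change', onChange)
    return () => subscription.remove()
  }, [])

  return isForeground
}
```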
#### Usage with `react-navigation`
To automatically pause the Camera when the user navigates to a different page, use the [`useIsFocused`](https://reactnavigation.org/docs/use-is-focused/) function:
```tsx {4}
function App() {
// ...
const isFocused = useIsFocused()
return <Camera {...props} isActive={isFocused} />
}
```
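Both signals can also be combined, so the Camera only runs while the app is in the foreground and the screen is focused. A sketch using the hooks from above:

```tsx
function CameraPage() {
  // ...
  const isAppForeground = useIsAppForeground()
  const isFocused = useIsFocused()

  return <Camera {...props} isActive={isAppForeground && isFocused} />
}
```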
<br />
#### 🚀 Next section: [Camera Formats](formats)


@ -3,6 +3,7 @@ module.exports = {
Guides: [
'guides/setup',
'guides/devices',
'guides/lifecycle',
'guides/formats',
'guides/capturing',
'guides/frame-processors',
@ -18,6 +19,7 @@ module.exports = {
]
},
'guides/animated',
'guides/focusing',
'guides/errors',
'guides/troubleshooting',
],


@ -289,7 +289,7 @@ export interface CameraDevice {
*/
supportsRawCapture: boolean;
/**
* Specifies whether this device supports focussing ({@linkcode Camera.focus | Camera.focus(...)})
* Specifies whether this device supports focusing ({@linkcode Camera.focus | Camera.focus(...)})
*/
supportsFocus: boolean;
}


@ -26,7 +26,7 @@ export interface CameraProps extends ViewProps {
*/
device: CameraDevice;
/**
* Whether the Camera should actively stream video frames, or not. See the [documentation about the `isActive` prop](https://cuvent.github.io/react-native-vision-camera/docs/guides/devices#the-isactive-prop) for more information.
* Whether the Camera should actively stream video frames, or not. See the [documentation about the `isActive` prop](https://cuvent.github.io/react-native-vision-camera/docs/guides/lifecycle#the-isactive-prop) for more information.
*
* This can be compared to a Video component, where `isActive` specifies whether the video is paused or not.
*