docs: Fix links

Marc Rousavy 2023-10-03 11:31:37 +02:00
parent d465c37bea
commit b24b1c808f
5 changed files with 25 additions and 22 deletions

View File

@@ -72,20 +72,20 @@ A [`CameraDevice`](/docs/api/interfaces/CameraDevice) consists of the following
- USB Camera Devices (if they support the [USB Video Class (UVC) Specification](https://en.wikipedia.org/wiki/List_of_USB_video_class_devices))
- [Continuity Camera Devices](https://support.apple.com/en-us/HT213244) (e.g. your iPhone's or Mac's Camera connected through WiFi/Continuity)
- Bluetooth/WiFi Camera Devices (if they are supported in the platform-native Camera APIs)
- [`physicalDevices`](/docs/api/interfaces/CameraDevice#physicalDevices): The physical Camera Devices (lenses) this Camera Device consists of. This can either be one of these values ("physical" device) or any combination of these values ("virtual" device):
- [`physicalDevices`](/docs/api/interfaces/CameraDevice#physicaldevices): The physical Camera Devices (lenses) this Camera Device consists of. This can either be one of these values ("physical" device) or any combination of these values ("virtual" device):
- `ultra-wide-angle-camera`: The "fish-eye" camera for 0.5x zoom
- `wide-angle-camera`: The "default" camera for 1x zoom
- `telephoto-camera`: A zoomed-in camera for 3x zoom
- [`sensorOrientation`](/docs/api/interfaces/CameraDevice#sensorOrientation): The orientation of the Camera sensor/lens relative to the phone. Cameras are usually in `landscapeLeft` orientation, meaning they are rotated by 90°. This includes their resolutions, so a 4k format might be 3840x2160, not 2160x3840
- [`minZoom`](/docs/api/interfaces/CameraDevice#minZoom): The minimum possible zoom factor for this Camera Device. If this is a multi-cam, this is the point where the device with the widest field of view is used (e.g. ultra-wide)
- [`maxZoom`](/docs/api/interfaces/CameraDevice#maxZoom): The maximum possible zoom factor for this Camera Device. If this is a multi-cam, this is the point where the device with the narrowest field of view is used (e.g. telephoto)
- [`neutralZoom`](/docs/api/interfaces/CameraDevice#neutralZoom): A value between `minZoom` and `maxZoom` where the "default" Camera Device is used (e.g. wide-angle). When using multi-cams, make sure to start off at this zoom level, so the user can optionally zoom out to the ultra-wide-angle Camera instead of already starting zoomed out
- [`sensorOrientation`](/docs/api/interfaces/CameraDevice#sensororientation): The orientation of the Camera sensor/lens relative to the phone. Cameras are usually in `landscapeLeft` orientation, meaning they are rotated by 90°. This includes their resolutions, so a 4k format might be 3840x2160, not 2160x3840
- [`minZoom`](/docs/api/interfaces/CameraDevice#minzoom): The minimum possible zoom factor for this Camera Device. If this is a multi-cam, this is the point where the device with the widest field of view is used (e.g. ultra-wide)
- [`maxZoom`](/docs/api/interfaces/CameraDevice#maxzoom): The maximum possible zoom factor for this Camera Device. If this is a multi-cam, this is the point where the device with the narrowest field of view is used (e.g. telephoto)
- [`neutralZoom`](/docs/api/interfaces/CameraDevice#neutralzoom): A value between `minZoom` and `maxZoom` where the "default" Camera Device is used (e.g. wide-angle). When using multi-cams, make sure to start off at this zoom level, so the user can optionally zoom out to the ultra-wide-angle Camera instead of already starting zoomed out
- [`formats`](/docs/api/interfaces/CameraDevice#formats): The list of [`CameraDeviceFormat`s](/docs/api/interfaces/CameraDeviceFormat) (See ["Camera Formats"](/docs/guides/formats)) this Camera Device supports. A format specifies:
- Video Resolution (see ["Formats: Video Resolution"](/docs/guides/formats#video-resolution))
- Photo Resolution (see ["Formats: Photo Resolution"](/docs/guides/formats#photo-resolution))
- FPS (see ["Formats: FPS"](/docs/guides/formats#fps))
- Video Stabilization Mode (see: ["Formats: Video Stabilization Mode"](/docs/guides/formats#videoStabilization))
- Pixel Format (see: ["Formats: Pixel Format"](/docs/guides/formats#pixelFormat))
- Video Stabilization Mode (see: ["Formats: Video Stabilization Mode"](/docs/guides/formats#videostabilization))
- Pixel Format (see: ["Formats: Pixel Format"](/docs/guides/formats#pixelformat))
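As a quick illustration (not part of this commit's diff), the sketch below enumerates all devices and logs the properties listed above. It assumes VisionCamera's `Camera.getAvailableCameraDevices()` API; in older versions this call may be asynchronous:

```ts
import { Camera } from 'react-native-vision-camera'

// Log the lens make-up and zoom range of every available Camera Device.
const devices = Camera.getAvailableCameraDevices()
for (const device of devices) {
  console.log(`Device ${device.id}:`)
  console.log(`  lenses: ${device.physicalDevices.join(' + ')}`) // one lens = "physical", multiple = "virtual" multi-cam
  console.log(`  sensor orientation: ${device.sensorOrientation}`) // usually landscape-left, as noted above
  console.log(`  zoom: ${device.minZoom}x to ${device.maxZoom}x (neutral: ${device.neutralZoom}x)`)
  console.log(`  formats: ${device.formats.length}`)
}
```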
### Examples on an iPhone

View File

@@ -37,14 +37,14 @@ To understand a bit more about camera formats, you first need to understand a fe
To get all available formats, simply use the `CameraDevice`'s [`formats` property](/docs/api/interfaces/CameraDevice#formats). These are a [CameraFormat's](/docs/api/interfaces/CameraDeviceFormat) props:
- [`photoHeight`](/docs/api/interfaces/CameraDeviceFormat#photoHeight)/[`photoWidth`](/docs/api/interfaces/CameraDeviceFormat#photoWidth): The resolution that will be used for taking photos. Choose a format with your desired resolution.
- [`videoHeight`](/docs/api/interfaces/CameraDeviceFormat#videoHeight)/[`videoWidth`](/docs/api/interfaces/CameraDeviceFormat#videoWidth): The resolution that will be used for recording videos. Choose a format with your desired resolution.
- [`minFps`](/docs/api/interfaces/CameraDeviceFormat#minFps)/[`maxFps`](/docs/api/interfaces/CameraDeviceFormat#maxFps): A range of possible values for the `fps` property. For example, if your format has `minFps: 1` and `maxFps: 60`, you can either use `fps={30}`, `fps={60}` or any other value in between for recording videos.
- [`videoStabilizationModes`](/docs/api/interfaces/CameraDeviceFormat#videoStabilizationModes): All supported Video Stabilization Modes, digital and optical. If this specific format contains your desired [`VideoStabilizationMode`](/docs/api/#videostabilizationmode), you can pass it to your `<Camera>` via the [`videoStabilizationMode` property](/docs/api/interfaces/CameraProps#videoStabilizationMode).
- [`pixelFormats`](/docs/api/interfaces/CameraDeviceFormat#pixelFormats): All supported Pixel Formats. If this specific format contains your desired [`PixelFormat`](/docs/api/#PixelFormat), you can pass it to your `<Camera>` via the [`pixelFormat` property](/docs/api/interfaces/CameraProps#pixelFormat).
- [`supportsVideoHDR`](/docs/api/interfaces/CameraDeviceFormat#supportsVideoHDR): Whether this specific format supports true 10-bit HDR for video capture. If this is `true`, you can enable `hdr` on your `<Camera>`.
- [`supportsPhotoHDR`](/docs/api/interfaces/CameraDeviceFormat#supportsPhotoHDR): Whether this specific format supports HDR for photo capture. It will use multiple captures to fuse over-exposed and under-exposed Images together to form one HDR photo. If this is `true`, you can enable `hdr` on your `<Camera>`.
- [`supportsDepthCapture`](/docs/api/interfaces/CameraDeviceFormat#supportsDepthCapture): Whether this specific format supports depth data capture. For devices like the TrueDepth/LiDAR cameras, this will always be true.
- [`photoHeight`](/docs/api/interfaces/CameraDeviceFormat#photoheight)/[`photoWidth`](/docs/api/interfaces/CameraDeviceFormat#photoWidth): The resolution that will be used for taking photos. Choose a format with your desired resolution.
- [`videoHeight`](/docs/api/interfaces/CameraDeviceFormat#videoheight)/[`videoWidth`](/docs/api/interfaces/CameraDeviceFormat#videoWidth): The resolution that will be used for recording videos. Choose a format with your desired resolution.
- [`minFps`](/docs/api/interfaces/CameraDeviceFormat#minfps)/[`maxFps`](/docs/api/interfaces/CameraDeviceFormat#maxfps): A range of possible values for the `fps` property. For example, if your format has `minFps: 1` and `maxFps: 60`, you can either use `fps={30}`, `fps={60}` or any other value in between for recording videos.
- [`videoStabilizationModes`](/docs/api/interfaces/CameraDeviceFormat#videostabilizationmodes): All supported Video Stabilization Modes, digital and optical. If this specific format contains your desired [`VideoStabilizationMode`](/docs/api/#videostabilizationmode), you can pass it to your `<Camera>` via the [`videoStabilizationMode` property](/docs/api/interfaces/CameraProps#videoStabilizationMode).
- [`pixelFormats`](/docs/api/interfaces/CameraDeviceFormat#pixelformats): All supported Pixel Formats. If this specific format contains your desired [`PixelFormat`](/docs/api/#PixelFormat), you can pass it to your `<Camera>` via the [`pixelFormat` property](/docs/api/interfaces/CameraProps#pixelFormat).
- [`supportsVideoHDR`](/docs/api/interfaces/CameraDeviceFormat#supportsvideohdr): Whether this specific format supports true 10-bit HDR for video capture. If this is `true`, you can enable `hdr` on your `<Camera>`.
- [`supportsPhotoHDR`](/docs/api/interfaces/CameraDeviceFormat#supportsphotohdr): Whether this specific format supports HDR for photo capture. It will use multiple captures to fuse over-exposed and under-exposed Images together to form one HDR photo. If this is `true`, you can enable `hdr` on your `<Camera>`.
- [`supportsDepthCapture`](/docs/api/interfaces/CameraDeviceFormat#supportsdepthcapture): Whether this specific format supports depth data capture. For devices like the TrueDepth/LiDAR cameras, this will always be true.
- ...and more. See the [`CameraDeviceFormat` type](/docs/api/interfaces/CameraDeviceFormat) for all supported properties.
You can either find a matching format manually by looping through your `CameraDevice`'s [`formats` property](/docs/api/interfaces/CameraDevice#formats), or by using the helper functions from VisionCamera:
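The doc's own example is cut off by this diff; as an illustrative stand-in, a hand-rolled manual filter might look like the sketch below. Only the property names come from `CameraDeviceFormat`; the 1080p/60 FPS targets and the scoring are arbitrary example choices:

```ts
import type { CameraDevice, CameraDeviceFormat } from 'react-native-vision-camera'

// Pick the format whose video resolution is closest to 1920x1080 while still supporting 60 FPS.
function pickFormat(device: CameraDevice): CameraDeviceFormat | undefined {
  const target = { width: 1920, height: 1080 }
  return device.formats
    .filter((f) => f.maxFps >= 60)
    .sort((a, b) => {
      const diffA = Math.abs(a.videoWidth - target.width) + Math.abs(a.videoHeight - target.height)
      const diffB = Math.abs(b.videoWidth - target.width) + Math.abs(b.videoHeight - target.height)
      return diffA - diffB // smaller difference = closer to the target resolution
    })[0]
}
```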

View File

@@ -170,7 +170,7 @@ const frameProcessor = useFrameProcessor((frame) => {
### Running at a throttled FPS rate
Some Frame Processor Plugins don't need to run on every Frame; for example, a Frame Processor that detects the brightness in a Frame only needs to run twice per second. You can achieve this by using [`runAtTargetFps(..)`](/docs/api/#runAtTargetFps):
Some Frame Processor Plugins don't need to run on every Frame; for example, a Frame Processor that detects the brightness in a Frame only needs to run twice per second. You can achieve this by using [`runAtTargetFps(..)`](/docs/api/#runattargetfps):
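For illustration, a complete throttled Frame Processor might look roughly like the sketch below; only `useFrameProcessor`, `runAtTargetFps` and the `Frame` properties are VisionCamera APIs, and the logging is an arbitrary stand-in for real work such as brightness detection:

```ts
import { useFrameProcessor, runAtTargetFps } from 'react-native-vision-camera'

const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // this part runs for every single Frame (e.g. 30/60 FPS)
  runAtTargetFps(2, () => {
    'worklet'
    // this part runs at most ~2 times per second; put the heavy work here
    console.log(`Analyzing ${frame.width}x${frame.height} Frame...`)
  })
}, [])
```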
```ts
const frameProcessor = useFrameProcessor((frame) => {
@@ -223,7 +223,7 @@ Check out [Frame Processor community plugins](/docs/guides/frame-processor-plugi
When running frame processors, it is often important to choose an appropriate [format](/docs/guides/formats). Here are some general tips to consider:
* If you are running heavy AI/ML calculations in your frame processor, make sure to [select a format](/docs/guides/formats) that has a lower resolution to optimize its performance. You can also resize the Frame on-demand.
* Sometimes a frame processor plugin only works with specific [pixel formats](/docs/api/interfaces/CameraDeviceFormat#pixelformats). Some plugins (like Tensorflow Lite Models) don't work with `yuv`, so use a [`pixelFormat`](/docs/api/interfaces/CameraProps#pixelFormat) of `rgb` instead.
* Sometimes a frame processor plugin only works with specific [pixel formats](/docs/api/interfaces/CameraDeviceFormat#pixelformats). Some plugins (like Tensorflow Lite Models) don't work with `yuv`, so use a [`pixelFormat`](/docs/api/interfaces/CameraProps#pixelformat) of `rgb` instead.
* Some Frame Processor plugins don't work with HDR formats. In this case you need to disable [`hdr`](/docs/api/interfaces/CameraProps#hdr).
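Putting those tips together, a Frame Processor setup tuned for ML workloads might be wired up roughly like the sketch below; the 1280-pixel cut-off and the `useCameraDevice('back')` hook usage are assumptions for illustration, the prop names are from the docs above:

```tsx
import { Camera, useCameraDevice, useFrameProcessor } from 'react-native-vision-camera'

function ScannerScreen() {
  const device = useCameraDevice('back')
  // prefer a smaller video resolution (fewer pixels per Frame) that can also stream rgb
  const format = device?.formats.find((f) => f.videoWidth <= 1280 && f.pixelFormats.includes('rgb'))

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    // run the ML model on the (smaller, rgb) Frame here
  }, [])

  if (device == null) return null
  return (
    <Camera
      style={{ flex: 1 }}
      device={device}
      isActive={true}
      format={format}
      pixelFormat="rgb"
      frameProcessor={frameProcessor}
    />
  )
}
```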
## Benchmarks

View File

@@ -72,11 +72,14 @@ Note: When not using a `frameProcessor`, buffer compression is automatically ena
### Video Stabilization
Video Stabilization requires additional overhead to start the algorithm, so disabling [`videoStabilizationMode`](/docs/api/interfaces/CameraProps#videoStabilizationMode) can significantly speed up the Camera initialization time.
Video Stabilization requires additional overhead to start the algorithm, so disabling [`videoStabilizationMode`](/docs/api/interfaces/CameraProps#videostabilizationmode) can significantly speed up the Camera initialization time.
### Pixel Format
By default, the `native` [`PixelFormat`](/docs/api#PixelFormat) is used, which is much more efficient than `rgb`.
By default, the `native` [`PixelFormat`](/docs/api#pixelformat) is used, which is much more efficient than `rgb`.
- On iOS, `native` is `yuv`
- On Android, `native` is some kind of vendor-specific format, which might be `yuv`
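As a rough sketch only (the prop names are documented above, but the `"off"` stabilization value and the surrounding `device` setup are assumptions), a Camera tuned for fast startup simply sticks to the defaults:

```tsx
<Camera
  style={{ flex: 1 }}
  device={device}
  isActive={true}
  video={true}
  // omit videoStabilizationMode entirely, or disable it explicitly:
  videoStabilizationMode="off"
  // omit pixelFormat to keep the efficient platform-native default:
  pixelFormat="native"
/>
```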
### Disable unneeded pipelines

View File

@@ -54,7 +54,7 @@ camera.current.startRecording({
})
```
You can customize capture options such as [video codec](/docs/api/interfaces/RecordVideoOptions#videoCodec), [video bit-rate](/docs/api/interfaces/RecordVideoOptions#videoBitRate), [file type](/docs/api/interfaces/RecordVideoOptions#fileType), [flash](/docs/api/interfaces/RecordVideoOptions#flash) and more using the [`RecordVideoOptions`](/docs/api/interfaces/RecordVideoOptions) parameter.
You can customize capture options such as [video codec](/docs/api/interfaces/RecordVideoOptions#videocodec), [video bit-rate](/docs/api/interfaces/RecordVideoOptions#videobitrate), [file type](/docs/api/interfaces/RecordVideoOptions#filetype), [flash](/docs/api/interfaces/RecordVideoOptions#flash) and more using the [`RecordVideoOptions`](/docs/api/interfaces/RecordVideoOptions) parameter.
For any error that occurs _while recording the video_, the `onRecordingError` callback will be invoked with a [`CaptureError`](/docs/api/classes/CameraCaptureError) and the recording is therefore cancelled.
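To make that concrete, a customized recording call might look like the sketch below. The option names come from `RecordVideoOptions` as linked above; the concrete values (`'mp4'`, `'h265'`, `'on'`) and the shape of the `onRecordingFinished` callback are illustrative assumptions:

```ts
camera.current.startRecording({
  fileType: 'mp4',      // container format
  videoCodec: 'h265',   // more efficient codec, if the device supports it
  videoBitRate: 'high', // roughly 20% above the default bit-rate (see below)
  flash: 'on',          // keep the flash enabled while recording
  onRecordingFinished: (video) => console.log(`Recorded video at ${video.path}`),
  onRecordingError: (error) => console.error('Recording failed!', error),
})
```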
@@ -94,7 +94,7 @@ camera.current.startRecording({
Videos are recorded with a target bit-rate, which the encoder aims to match as closely as possible. A lower bit-rate means less quality (and a smaller file size), while a higher bit-rate means higher quality (and a larger file size), since it can assign more bits to moving pixels.
To simply record videos with higher quality, use a [`videoBitRate`](/docs/api/interfaces/RecordVideoOptions#videoBitRate) of `'high'`, which effectively increases the bit-rate by 20%:
To simply record videos with higher quality, use a [`videoBitRate`](/docs/api/interfaces/RecordVideoOptions#videobitrate) of `'high'`, which effectively increases the bit-rate by 20%:
```ts
camera.current.startRecording({
@@ -103,7 +103,7 @@ camera.current.startRecording({
})
```
To use a lower bit-rate for lower quality and lower file-size, use a [`videoBitRate`](/docs/api/interfaces/RecordVideoOptions#videoBitRate) of `'low'`, which effectively decreases the bit-rate by 20%:
To use a lower bit-rate for lower quality and lower file-size, use a [`videoBitRate`](/docs/api/interfaces/RecordVideoOptions#videobitrate) of `'low'`, which effectively decreases the bit-rate by 20%:
```ts
camera.current.startRecording({