feat: Split videoHdr and photoHdr into two settings (#2161)

* feat: Split `videoHdr` and `photoHdr` into two settings

* fix: Rename all `hdr`

* fix: Fix HDR on Android

* Update CameraDeviceDetails.kt

* Update CameraDeviceDetails.kt

* fix: Correctly configure `pixelFormat` AFTER `format`

* Update CameraSession+Configuration.swift

* fix: Also after format changed
Marc Rousavy authored on 2023-11-15 18:33:12 +01:00, committed by GitHub
parent 75fd924899 · commit c5dfb6c247
26 changed files with 129 additions and 88 deletions


@@ -42,8 +42,8 @@ To get all available formats, simply use the `CameraDevice`'s [`formats` propert
- [`minFps`](/docs/api/interfaces/CameraDeviceFormat#minfps)/[`maxFps`](/docs/api/interfaces/CameraDeviceFormat#maxfps): A range of possible values for the `fps` property. For example, if your format has `minFps: 1` and `maxFps: 60`, you can either use `fps={30}`, `fps={60}` or any other value in between for recording videos.
- [`videoStabilizationModes`](/docs/api/interfaces/CameraDeviceFormat#videostabilizationmodes): All supported Video Stabilization Modes, digital and optical. If this specific format contains your desired [`VideoStabilizationMode`](/docs/api/#videostabilizationmode), you can pass it to your `<Camera>` via the [`videoStabilizationMode` property](/docs/api/interfaces/CameraProps#videoStabilizationMode).
- [`pixelFormats`](/docs/api/interfaces/CameraDeviceFormat#pixelformats): All supported Pixel Formats. If this specific format contains your desired [`PixelFormat`](/docs/api/#PixelFormat), you can pass it to your `<Camera>` via the [`pixelFormat` property](/docs/api/interfaces/CameraProps#pixelFormat).
- [`supportsVideoHDR`](/docs/api/interfaces/CameraDeviceFormat#supportsvideohdr): Whether this specific format supports true 10-bit HDR for video capture. If this is `true`, you can enable `hdr` on your `<Camera>`.
- [`supportsPhotoHDR`](/docs/api/interfaces/CameraDeviceFormat#supportsphotohdr): Whether this specific format supports HDR for photo capture. It will use multiple captures to fuse over-exposed and under-exposed Images together to form one HDR photo. If this is `true`, you can enable `hdr` on your `<Camera>`.
- [`supportsVideoHdr`](/docs/api/interfaces/CameraDeviceFormat#supportsvideohdr): Whether this specific format supports true 10-bit HDR for video capture. If this is `true`, you can enable `videoHdr` on your `<Camera>`.
- [`supportsPhotoHdr`](/docs/api/interfaces/CameraDeviceFormat#supportsphotohdr): Whether this specific format supports HDR for photo capture. It will use multiple captures to fuse over-exposed and under-exposed Images together to form one HDR photo. If this is `true`, you can enable `photoHdr` on your `<Camera>`.
- [`supportsDepthCapture`](/docs/api/interfaces/CameraDeviceFormat#supportsdepthcapture): Whether this specific format supports depth data capture. For devices like the TrueDepth/LiDAR cameras, this will always be true.
- ...and more. See the [`CameraDeviceFormat` type](/docs/api/interfaces/CameraDeviceFormat) for all supported properties.
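A minimal sketch of how these flags can be used to pick a format, assuming a `device` obtained via `useCameraDevice`; the manual `find` over `device.formats` is purely illustrative, while the `useCameraFormat` filters match the HDR guide further below:

```ts
import { useCameraDevice, useCameraFormat } from 'react-native-vision-camera'

const device = useCameraDevice('back')

// Manual filtering over device.formats, purely illustrative
const hdrFormat = device?.formats.find((f) => f.supportsVideoHdr && f.supportsPhotoHdr)

// The equivalent selection using the built-in filter hook
const format = useCameraFormat(device, [
  { videoHdr: true },
  { photoHdr: true },
])
```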
@@ -201,7 +201,8 @@ function App() {
Other props that depend on the `format`:
* `fps`: Specifies the frame rate to use
* `hdr`: Enables HDR photo or video capture and preview
* `videoHdr`: Enables HDR video capture and preview
* `photoHdr`: Enables HDR photo capture
* `lowLightBoost`: Enables a night-mode/low-light-boost for photo or video capture and preview
* `videoStabilizationMode`: Specifies the video stabilization mode to use for the video pipeline
* `pixelFormat`: Specifies the pixel format to use for the video pipeline
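A hedged sketch of how these format-dependent props could be combined on one `<Camera>`; the `fps` and `pixelFormat` values are illustrative, and each prop should only be set when the chosen `format` or device actually supports it:

```tsx
import { Camera, useCameraDevice, useCameraFormat } from 'react-native-vision-camera'

function HdrCamera() {
  const device = useCameraDevice('back')
  const format = useCameraFormat(device, [{ fps: 60 }, { videoHdr: true }])
  if (device == null || format == null) return null

  return (
    <Camera
      device={device}
      isActive={true}
      style={{ flex: 1 }}
      format={format}
      video={true}
      photo={true}
      fps={60} // must lie within format.minFps ... format.maxFps
      videoHdr={format.supportsVideoHdr}
      photoHdr={format.supportsPhotoHdr}
      lowLightBoost={device.supportsLowLightBoost}
      pixelFormat="yuv"
    />
  )
}
```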


@@ -239,7 +239,7 @@ When running frame processors, it is often important to choose an appropriate [f
* If you are running heavy AI/ML calculations in your frame processor, make sure to [select a format](/docs/guides/formats) that has a lower resolution to optimize its performance. You can also resize the Frame on-demand.
* Sometimes a frame processor plugin only works with specific [pixel formats](/docs/api/interfaces/CameraDeviceFormat#pixelformats). Some plugins (like Tensorflow Lite Models) don't work with `yuv`, so use a [`pixelFormat`](/docs/api/interfaces/CameraProps#pixelformat) of `rgb` instead.
* Some Frame Processor plugins don't work with HDR formats. In this case you need to disable [`hdr`](/docs/api/interfaces/CameraProps#hdr).
* Some Frame Processor plugins don't work with HDR formats. In this case you need to disable [`videoHdr`](/docs/api/interfaces/CameraProps#videohdr).
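A rough sketch of these tips combined; `runMyMlModel` is a hypothetical plugin call (not a VisionCamera API), the 1280x720 target is just an example of a lower-resolution format, and `videoHdr` is intentionally left unset:

```tsx
import { Camera, useCameraDevice, useCameraFormat, useFrameProcessor } from 'react-native-vision-camera'

function MlCamera() {
  const device = useCameraDevice('back')
  // Prefer a lower-resolution format so each frame is cheaper to process
  const format = useCameraFormat(device, [{ videoResolution: { width: 1280, height: 720 } }])

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    // runMyMlModel(frame) // hypothetical plugin call
    console.log(`Processing a ${frame.width}x${frame.height} ${frame.pixelFormat} frame`)
  }, [])

  if (device == null) return null
  return (
    <Camera
      device={device}
      isActive={true}
      style={{ flex: 1 }}
      format={format}
      pixelFormat="rgb" // many TFLite-style plugins expect RGB instead of YUV
      frameProcessor={frameProcessor}
    />
  )
}
```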
## Benchmarks


@@ -49,8 +49,8 @@ To enable HDR capture, you need to select a format (see ["Camera Formats"](forma
```ts
const format = useCameraFormat(device, [
{ photoHDR: true },
{ videoHDR: true },
{ photoHdr: true },
{ videoHdr: true },
])
```
@@ -59,21 +59,27 @@ const format = useCameraFormat(device, [
```ts
const format = getCameraFormat(device, [
{ photoHDR: true },
{ videoHDR: true },
{ photoHdr: true },
{ videoHdr: true },
])
```
</TabItem>
</Tabs>
Then, pass the `format` to the Camera and enable the `hdr` prop if it is supported:
Then, pass the `format` to the Camera and enable the `videoHdr`/`photoHdr` props if they are supported:
```tsx
const format = ...
const supportsHdr = format.supportsPhotoHDR && format.supportsVideoHDR
return <Camera {...props} format={format} hdr={supportsHdr} />
return (
<Camera
{...props}
format={format}
videoHdr={format.supportsVideoHdr}
photoHdr={format.supportsPhotoHdr}
/>
)
```
Now, all captures (`takePhoto(..)` and `startRecording(..)`) will be configured to use HDR.
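As a hedged sketch of what triggering those captures could look like through a Camera ref; the logging is illustrative, and it assumes the ref is attached to the `<Camera videoHdr photoHdr ... />` element from the snippet above:

```ts
import { useRef } from 'react'
import { Camera } from 'react-native-vision-camera'

function useHdrCaptures() {
  const camera = useRef<Camera>(null)

  const takeHdrPhoto = async () => {
    // photoHdr was enabled on the <Camera>, so this capture is HDR-fused
    const photo = await camera.current?.takePhoto()
    console.log('Saved HDR photo to', photo?.path)
  }

  const recordHdrVideo = () => {
    // videoHdr was enabled on the <Camera>, so this records 10-bit HDR video
    camera.current?.startRecording({
      onRecordingFinished: (video) => console.log('Saved HDR video to', video.path),
      onRecordingError: (error) => console.error(error),
    })
  }

  return { camera, takeHdrPhoto, recordHdrVideo }
}
```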


@@ -60,9 +60,9 @@ See ["Camera Devices"](devices) for more information.
Note: By default (when not passing the options object), a simpler device is already chosen.
### No HDR
### No Video HDR
HDR uses 10-bit formats and/or additional processing steps that come with additional computation overhead. Disable HDR (don't pass `hdr` to the `<Camera>`) for higher efficiency.
Video HDR uses 10-bit formats and/or additional processing steps that come with extra computational overhead. Disable Video HDR (don't pass `videoHdr` to the `<Camera>`) for higher efficiency.
### Buffer Compression


@@ -145,7 +145,7 @@ To calculate your target bit-rate, you can use this formula:
```ts
let bitRate = baseBitRate
bitRate = bitRate / 30 * fps // FPS
if (hdr === true) bitRate *= 1.2 // HDR
if (videoHdr === true) bitRate *= 1.2 // 10-Bit Video HDR
if (codec === 'h265') bitRate *= 0.8 // H.265
bitRate *= yourCustomFactor // e.g. 0.5x for half the bit-rate
```
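A quick worked example of that formula; the 10 Mbps base value is purely hypothetical, so substitute the base bit-rate for your target resolution:

```ts
const baseBitRate = 10_000_000 // hypothetical base: 10 Mbps

let bitRate = baseBitRate
bitRate = bitRate / 30 * 60 // 60 FPS           -> 20 Mbps
bitRate *= 1.2              // 10-bit Video HDR -> 24 Mbps
bitRate *= 0.8              // H.265 codec      -> 19.2 Mbps
console.log(`Target bit-rate: ${(bitRate / 1_000_000).toFixed(1)} Mbps`)
```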


@@ -57,7 +57,7 @@ If you're experiencing build issues or runtime issues in VisionCamera, make sure
* For errors without messages, there's often an error code attached. Look up the error code on [osstatus.com](https://www.osstatus.com) to get more information about a specific error.
2. If your Frame Processor is not running, make sure you check the native Xcode logs. There is useful information about the Frame Processor Runtime that will tell you if something goes wrong.
3. If your Frame Processor is not running, make sure you are not using a remote JS debugger such as Google Chrome, since those don't work with JSI.
4. If you are experiencing black-screens, try removing all properties such as `fps`, `hdr` or `format` on the `<Camera>` component except for the required ones:
4. If you are experiencing black-screens, try removing all properties such as `fps`, `videoHdr` or `format` on the `<Camera>` component except for the required ones:
```tsx
<Camera device={device} isActive={true} style={{ width: 500, height: 500 }} />
```
@@ -112,7 +112,7 @@ If you're experiencing build issues or runtime issues in VisionCamera, make sure
2. If a camera device is not being returned by [`Camera.getAvailableCameraDevices()`](/docs/api/classes/Camera#getavailablecameradevices), make sure it is a Camera2 compatible device. See [this section in the Android docs](https://developer.android.com/reference/android/hardware/camera2/CameraDevice#reprocessing) for more information.
3. If your Frame Processor is not running, make sure you check the native Android Studio/Logcat logs. There is useful information about the Frame Processor Runtime that will tell you if something goes wrong.
4. If your Frame Processor is not running, make sure you are not using a remote JS debugger such as Google Chrome, since those don't work with JSI.
5. If you are experiencing black-screens, try removing all properties such as `fps`, `hdr` or `format` on the `<Camera>` component except for the required ones:
5. If you are experiencing black-screens, try removing all properties such as `fps`, `videoHdr` or `format` on the `<Camera>` component except for the required ones:
```tsx
<Camera device={device} isActive={true} style={{ width: 500, height: 500 }} />
```