* feat: Add more Error insights when the Camera Module cannot be found
* Assert JSI is available
* Update error description
* fix
* Update CameraError.ts
Before, Frame Processors ran on a separate Thread.
After, Frame Processors run fully synchronously and always at the same FPS as the Camera.
Two new functions have been introduced:
* `runAtTargetFps(fps: number, func: () => void)`: Runs the given code as often as the given `fps`, effectively throttling its calls.
* `runAsync(frame: Frame, func: () => void)`: Runs the given function on a separate Thread for Frame Processing. A strong reference to the Frame is held for as long as the function takes to execute.
You can use `runAtTargetFps` to throttle calls to a specific API (e.g. if your Camera is running at 60 FPS, but you only want to run face detection at ~25 FPS, use `runAtTargetFps(25, ...)`.)
You can use `runAsync` to run a heavy algorithm asynchronously, so that the Camera is not blocked while your algorithm runs. This is useful if your main sync processor draws something, and your async processor is doing some image analysis on the side.
You can also combine both functions.
Examples:
```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")
}, [])
```
```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")

  runAtTargetFps(10, () => {
    'worklet'
    console.log("I'm running at 10 FPS!")
  })
}, [])
```
```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")

  runAsync(frame, () => {
    'worklet'
    console.log("I'm running on another Thread, I can block for longer!")
  })
}, [])
```
```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")

  runAtTargetFps(10, () => {
    'worklet'
    runAsync(frame, () => {
      'worklet'
      console.log("I'm running on another Thread at 10 FPS, I can block for longer!")
    })
  })
}, [])
```
* Calculate a format's video dimensions based on supported resolutions and photo dimensions
* Add Android fallback strategy for recording quality
* Ensure that session props are not ignored when app is resumed
* Simplify setting Android video dimensions in format
* Modify Android imageAnalysisBuilder to use photoSize
* Update onHostResume function to reference android preview issue
* Add missing Android capture errors
* feat: disableFrameProcessors for android via expo-config-plugin prop
* chore: naming
* feat: fix shared library issues with expo config plugin prop flag
* fix: use a glob pattern instead of listing every single shared lib
* fix: use wildcard since libc++ is not enough (libhermes, libjni, libjsi etc)
* feat: 🎉 disable frame processors for iOS as well
* chore: comments
* chore: make eslint/ts happy
* chore: cleanup
* refactor: no need to pass a param here. We just want to disable it
* chore: remove withDangerouslyHandleAndroidSharedLibrary
* chore: remove danger plugin
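For reference, the `disableFrameProcessors` prop mentioned above is passed through the Expo config plugin. A minimal sketch of how that could look in an `app.config.js` follows; the plugin and prop names are taken from the entries above, but treat the exact shape as an assumption:

```js
// app.config.js (hypothetical sketch, not copied from the docs)
export default {
  expo: {
    plugins: [
      [
        'react-native-vision-camera',
        {
          // assumed prop name, per the "disableFrameProcessors" entry above
          disableFrameProcessors: true,
        },
      ],
    ],
  },
}
```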
* add video codec value
* add types
* use `recommendedVideoSettings` method instead
* lint
* refactor for better readability
* add a method to get available codecs (ios)
* improve tsDoc description of the videoCodec option
Co-authored-by: Marc Rousavy <marcrousavy@hotmail.com>
* ios format
Co-authored-by: Marc Rousavy <marcrousavy@hotmail.com>
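The `videoCodec` entries above add a codec option when recording videos. A rough usage sketch: the codec identifier `'h265'` and the `getAvailableVideoCodecs()` call are assumptions based on the commit messages, so check the TypeScript types for the exact names.

```js
// Hypothetical sketch: query codecs and pick one when starting a recording
const codecs = await camera.current.getAvailableVideoCodecs()
console.log('Available codecs:', codecs)

camera.current.startRecording({
  videoCodec: 'h265', // assumed identifier; see the videoCodec tsDoc for valid values
  onRecordingFinished: (video) => console.log(video.path),
  onRecordingError: (error) => console.error(error),
})
```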
* Add custom `onViewReady` event to get layout
`componentDidMount` is async, so the native view _might_ not exist yet, causing a race condition in the `setFrameProcessor` code.
This PR fixes that by calling `setFrameProcessor` only after the native view has actually mounted; to ensure that, a custom event was added that fires at that point.
* Update CameraView.swift
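The gist of the `onViewReady` change, as an illustrative sketch (names and structure here are assumptions, not the actual `CameraView` implementation): the frame processor is only attached once the native view reports that it has mounted, instead of in `componentDidMount`.

```js
// Illustrative sketch only; hypothetical names, not the library's internals
import React from 'react'

class CameraView extends React.Component {
  onViewReady = () => {
    // The native view is guaranteed to exist here, so attaching the
    // frame processor can no longer race against the native mount.
    if (this.props.frameProcessor != null) {
      this.setFrameProcessor(this.props.frameProcessor)
    }
  }

  render() {
    // NativeCameraView stands in for the native host component
    return <NativeCameraView {...this.props} onViewReady={this.onViewReady} />
  }
}
```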
* fix: switched incorrect property ordering for qualityPrioritization options
fix: added extra step required for creating a frame processing plugin on Android
* fix: adjusted the highlighted line
* chore: added guidelines on how to generate and check docs updates
* chore: change instructions so they aren't so unnecessarily wordy! :P
* Add `onFrameProcessorPerformanceSuggestionAvailable` and make `frameProcessorFps` support `auto`
* Implement performance suggestion and auto-adjusting
* Fix FPS setting, evaluate correctly
* Floor suggested FPS
* Remove `console.log` for frame drop warnings.
* Swift format
* Use `30` magic number
* only call if FPS is different
* Update CameraView.swift
* Implement Android 1/2
* Cleanup
* Update `frameProcessorFps` if available
* Optimize `FrameProcessorPerformanceDataCollector` initialization
* Cache call
* Set frameProcessorFps directly (Kotlin setter)
* Don't suggest if same value
* Call suggestion every second
* reset time on set
* Always store 15 last samples
* reset counter too
* Update FrameProcessorPerformanceDataCollector.swift
* Update CameraView+RecordVideo.swift
* Update CameraView.kt
* iOS: Redesign evaluation
* Update CameraView+RecordVideo.swift
* Android: Redesign evaluation
* Update CameraView.kt
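Putting the two additions above together, a usage sketch (assumes `device` and `frameProcessor` are already set up; the exact shape of the suggestion object is an assumption, so check the TypeScript types for the real field names):

```js
// Hypothetical usage sketch for frameProcessorFps="auto" and the new callback
<Camera
  style={StyleSheet.absoluteFill}
  device={device}
  isActive={true}
  frameProcessor={frameProcessor}
  frameProcessorFps="auto" // let the library pick and adjust the FPS automatically
  onFrameProcessorPerformanceSuggestionAvailable={(suggestion) => {
    // assumed fields: `type` and `suggestedFrameProcessorFps`
    console.log(`Suggestion: ${suggestion.type} -> ${suggestion.suggestedFrameProcessorFps} FPS`)
  }}
/>
```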
* Update REA to latest alpha and install RNScreens
* Fix frameProcessorFps updating
* fix: Fix UI Thread race condition in `setFrameProcessor(...)`
* Revert "fix: Fix UI Thread race condition in `setFrameProcessor(...)`"
This reverts commit 9c524e123cff6843d7d11db602a5027d1bb06b4b.
* Use `setImmediate` to call `setFrameProcessor(...)`
* Fix the order in which the frame processor is applied
* Add `enableFrameProcessor` prop that defines if a FP is added
* rename constant
* Implement `enableFrameProcessor` prop for Android and make `frameProcessorFps` faster
* link to troubleshooting guide
* Update TROUBLESHOOTING.mdx
* Add logs for use-cases
* fix log
* set initial frame processor in `onLayout` instead of `componentDidMount`