Marc Rousavy
f0ea18115e
fix: Fix CI for V3 ( #1475 )
...
* fix: Fix CI for "Build Android"
* update versions
* Update Gemfile.lock
* format swift
* fix: Fix swift lint
* Update .swiftlint.yml
* Use C++17 for lint
* fix: Fix C++ lints
2023-02-15 17:24:33 +01:00
Marc Rousavy
30b56153db
feat: Sync Frame Processors (plus runAsync and runAtTargetFps) ( #1472 )
...
Before, Frame Processors ran on a separate Thread.
After, Frame Processors run fully synchronously and always at the same FPS as the Camera.
Two new functions have been introduced:
* `runAtTargetFps(fps: number, func: () => void)`: Runs the given code as often as the given `fps`, effectively throttling its calls.
* `runAsync(frame: Frame, func: () => void)`: Runs the given function on a separate Thread for Frame Processing. A strong reference to the Frame is held as long as the function takes to execute.
You can use `runAtTargetFps` to throttle calls to a specific API (e.g. if your Camera is running at 60 FPS, but you only want to run face detection at ~25 FPS, use `runAtTargetFps(25, ...)`.)
You can use `runAsync` to run a heavy algorithm asynchronously, so that the Camera is not blocked while your algorithm runs. This is useful if your main sync processor draws something and your async processor does some image analysis on the side.
You can also combine both functions.
Examples:
```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")
}, [])
```
```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")
  runAtTargetFps(10, () => {
    'worklet'
    console.log("I'm running at 10 FPS!")
  })
}, [])
```
```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")
  runAsync(frame, () => {
    'worklet'
    console.log("I'm running on another Thread, I can block for longer!")
  })
}, [])
```
```js
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running at 60 FPS!")
  runAtTargetFps(10, () => {
    'worklet'
    runAsync(frame, () => {
      'worklet'
      console.log("I'm running on another Thread at 10 FPS, I can block for longer!")
    })
  })
}, [])
```
2023-02-15 16:47:09 +01:00
Thomas Coldwell
52a1d50d91
fix: Frame Processor FPS ( #1288 )
...
* fix: Build using XCode 14
* fix: Throttle FP by start time rather than end time
2022-10-20 12:49:22 +02:00
Thomas Coldwell
fb2156ec39
fix: Asset Writer Video-Audio Sync ( #1075 )
...
* fix: Start asset writer session on first frame
* fix: Remove debug log
* fix: Reset dev team
2022-06-11 11:15:24 +02:00
Marc Rousavy
971b824914
fix: Fix RecordingSession nil crash by keeping it local ( #938 )
...
* fix: Fix RecordingSession nil crash by keeping it local
* fix: Fix error init
* Update CameraView+RecordVideo.swift
2022-04-15 09:48:32 +02:00
Marc Rousavy
4b9bcb37e0
feat: Add pauseRecording and resumeRecording 🔥 ( #911 )
...
* feat: Add `pauseRecording` and `resumeRecording` (iOS)
* feat: Add `pauseRecording` and `resumeRecording` (Android)
* feat: Add `pauseRecording` and `resumeRecording` (JS)
* fix: Simplify Swift code for Recording
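A minimal JS usage sketch for the new methods; the ref-based call shape and Promise return values are assumptions drawn from the library's existing `startRecording`/`stopRecording` API, not from this commit:
```js
// Sketch only: `camera` is a ref to the <Camera> component,
// obtained via useRef inside your component.
async function toggleRecordingPause(camera, isPaused) {
  if (isPaused) {
    // both methods are assumed to return Promises, like stopRecording
    await camera.current.resumeRecording()
  } else {
    await camera.current.pauseRecording()
  }
}
```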
2022-03-22 10:44:58 +01:00
Marc Rousavy
d9932f4b7a
fix: Prevent NaN/+Inf crash for auto frameProcessorFps
2022-02-09 18:05:32 +01:00
Paul Rostorp
d96c5863c9
feat: Video Codec Option for recording video ( #645 )
...
* add video codec value
* add types
* use `recommendedVideoSettings` method instead
* lint
* refactor for better readability
* add a method to get available codecs (ios)
* improve tsDoc description of the videoCodec option
Co-authored-by: Marc Rousavy <marcrousavy@hotmail.com>
* ios format
Co-authored-by: Marc Rousavy <marcrousavy@hotmail.com>
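A hedged sketch of how the new codec option could be passed when recording; the option name `videoCodec` and the codec string are assumptions based on this PR's description, and the method for listing available codecs is only referenced here, not shown:
```js
// Sketch only — option name and value are assumptions.
camera.current.startRecording({
  videoCodec: 'h264',
  onRecordingFinished: (video) => console.log(video.path),
  onRecordingError: (error) => console.error(error),
})
```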
2021-12-30 10:47:23 +01:00
Marc Rousavy
a3cfcc2908
Remove conditional downcast
2021-12-30 10:34:46 +01:00
Marc Rousavy
6319e5cc49
fix: Fix torch staying on after recording a video ( #584 )
2021-12-10 09:57:05 +01:00
Marc Rousavy
934de142ee
docs: Update Format.pixelFormat documentation
2021-12-10 09:52:40 +01:00
Marc Rousavy
f9dbb6921c
fix: Fix division by zero in Performance Sample collector ( #416 )
...
* fix: Fix division by zero in Performance Sample collector
* use `isEmpty`
2021-09-08 17:18:12 +02:00
Marc Rousavy
ad5e131f6a
feat: frameProcessorFps="auto"
and automatic performance suggestions (throttle or increase FPS) ( #393 )
...
* Add `onFrameProcessorPerformanceSuggestionAvailable` and make `frameProcessorFps` support `auto`
* Implement performance suggestion and auto-adjusting
* Fix FPS setting, evaluate correctly
* Floor suggested FPS
* Remove `console.log` for frame drop warnings.
* Swift format
* Use `30` magic number
* only call if FPS is different
* Update CameraView.swift
* Implement Android 1/2
* Cleanup
* Update `frameProcessorFps` if available
* Optimize `FrameProcessorPerformanceDataCollector` initialization
* Cache call
* Set frameProcessorFps directly (Kotlin setter)
* Don't suggest if same value
* Call suggestion every second
* reset time on set
* Always store 15 last samples
* reset counter too
* Update FrameProcessorPerformanceDataCollector.swift
* Update CameraView+RecordVideo.swift
* Update CameraView.kt
* iOS: Redesign evaluation
* Update CameraView+RecordVideo.swift
* Android: Redesign evaluation
* Update CameraView.kt
* Update REA to latest alpha and install RNScreens
* Fix frameProcessorFps updating
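A sketch of the resulting component API, assuming a standard `<Camera>` setup; the prop names come from this PR, while the shape of the `suggestion` payload is an assumption:
```jsx
// Inside a component that already has `device` and `frameProcessor` set up (sketch).
<Camera
  style={StyleSheet.absoluteFill}
  device={device}
  isActive={true}
  frameProcessor={frameProcessor}
  frameProcessorFps="auto"
  onFrameProcessorPerformanceSuggestionAvailable={(suggestion) => {
    // payload shape (type, suggestedFrameProcessorFps) is an assumption
    console.log(`Suggestion: ${suggestion.type} -> ${suggestion.suggestedFrameProcessorFps} FPS`)
  }}
/>
```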
2021-09-06 16:27:16 +02:00
Marc Rousavy
7d3b352155
perf: Avoid expensive CMSampleBuffer copy ( #235 )
...
* Don't copy CMSampleBuffer
* Update CameraView+RecordVideo.swift
* Update Podfile.lock
2021-07-06 09:25:11 +02:00
Marc Rousavy
9ea158ad8f
chore: Move to /mrousavy/ ( #224 )
...
* rename 1/n
* 2
* 3
* fix indent
2021-06-21 22:42:46 +02:00
Marc Rousavy
9c579c65aa
try: Improvements from WWDC 2021 1:1 workshop ( #197 )
...
* perf: Automatically determine Pixel Format depending on active format. (More efficient video recording 🚀 )
* perf: Skip `AVAssetWriter` transform by directly correctly orienting the Video Output connection
* feat: Support camera flipping while recording
* feat: Run frame processor on separate queue, avoids stutters in video recordings
* feat: Automatically drop late frame processor frames
2021-06-11 21:06:19 +02:00
Marc Rousavy
5919d46a46
fix: Make recorder less error-prone ( #189 )
...
* Abort recording if failed to start or empty frames
* Activate Audio Session on `cameraQueue`
* Double-check stop recording in callback
* Only call callback once
* Format
* Add description to `.aborted` error
* Update RecordingSession.swift
* Update AVAudioSession+updateCategory.swift
* Rename serial dispatch queues
2021-06-09 14:56:56 +02:00
Marc Rousavy
16f2a7cdec
chore: Cleanup void returns ( #187 )
...
* Place `return` in `return [void]` on separate line
* format
* Update CameraView+RecordVideo.swift
* f
2021-06-09 11:14:49 +02:00
Marc Rousavy
68a716b506
feat: native Frame type to provide Orientation ( #186 )
...
* Use Frame.h
* Add orientation
* Determine buffer orientation
* Replace plugins
* fix calls
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* format
* Update CameraPage.tsx
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* Add links to docs
* Use `.` syntax
* Make properties `readonly`
* Fix `@synthesize` backing store
2021-06-09 10:57:05 +02:00
Marc Rousavy
4038db2e28
feat: Frame Processors: Allow returning Frames (support for resize and other frame manipulations) ( #185 )
...
* batch
* Init Frame as box
* Use ObjC syntax
* Fix access
* Revert "Fix access"
This reverts commit 7de09e52739d4c2b53f485d5ed696f1665fa5737.
* Revert "Use ObjC syntax"
This reverts commit e33f05ae8451cc4ee24af41d14dc76a57c157554.
* Revert "Init Frame as box"
This reverts commit 5adafb6109bfbf7fddb8ddc4af7d306b7b76b476.
* use holder
* convert buffer <-> jsi object
* add docs
* add more docs
* Update JSIUtils.mm
* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx
* Update CameraView+RecordVideo.swift
2021-06-08 14:20:07 +02:00
Marc Rousavy
72a1fad78e
feat: Separate usecases (decouple microphone, video, photo) ( #168 )
...
* Add props
* add props (iOS)
* Add use-cases conditionally
* Update CameraView+RecordVideo.swift
* Update RecordingSession.swift
* reconfigure on change
* Throw correct errors
* Check for audio permission
* Move `#if` outward
* Throw appropriate errors
* Update CameraView+RecordVideo.swift
* fix Splashscreen
* Dynamic filePath
* Fix video extension
* add `avci` and `m4v` file types
* Fix RecordVideo errors
* Fix audio setup
* Enable `photo`, `video` and `audio`
* Check for `video={true}` in frameProcessor
* format
* Remove unused DispatchQueue
* Update docs
* Add `supportsPhotoAndVideoCapture`
* Fix view manager
* Fix error not being propagated
* Catch normal errors too
* Update DEVICES.mdx
* Update CAPTURING.mdx
* Update classdocs
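A sketch of the decoupled use-cases on the component, using the prop names from this PR's bullets (`photo`, `video`, `audio`); treating them as plain booleans is an assumption:
```jsx
// Only the use-cases you enable get configured on the capture session (sketch).
<Camera
  style={StyleSheet.absoluteFill}
  device={device}
  isActive={true}
  photo={true}
  video={true}
  audio={false} // skip microphone permission / audio-session setup entirely
/>
```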
2021-06-07 13:08:40 +02:00
Marc Rousavy
ca0c0e92e0
fix: Synchronize Audio Frames with Video Frames using masterClock ( #161 )
...
* Synchronize Audio Frames with Video Frames using `masterClock`
* Update CameraView+RecordVideo.swift
2021-06-03 14:50:08 +02:00
Marc Rousavy
eeb765f018
fix: Move Audio Input initialization shortly before startRecording ( #159 )
...
* rename
* Update AVAudioSession+updateCategory.swift
* fix bootstrap script
* Update CameraView+AVAudioSession.swift
* move audio input adding lower
* Activate AudioSession only when starting recording
* format
* Deactivate Audio Session
* remove audio input before deactivating audio session
* Update CameraView+AVAudioSession.swift
* log time
* Update CameraView+AVAudioSession.swift
* measure time with `measureElapsedTime`
* Update project.pbxproj
* only log in debug builds
* bootstrap with bridge (RNN new API)
* Mark two funcs as `@inlinable`
* format
* Update ReactLogger.swift
* Make audioWriter optional (allow videos without sound)
* only log frame drop reason in DEBUG
* Make audio writing entirely optional
* format
* Use function name as label for measureElapsedTime
* Update MeasureElapsedTime.swift
* Update MeasureElapsedTime.swift
* Mark AudioWriter as finished
* set `automaticallyConfiguresApplicationAudioSession` once
* Add JS console logging
* log to JS console for a few logs
* Update AVAudioSession+updateCategory.swift
* format
* Update JSConsoleHelper.mm
* catch log errors
* Update ReactLogger.swift
* fix docs
* Update RecordingSession.swift
* Immediately add audio input
* Update CameraView+AVCaptureSession.swift
* Update CameraView+AVCaptureSession.swift
* Update ReactLogger.swift
* immediately set audio session
* extract
* format
* Update TROUBLESHOOTING.mdx
* hmm
* Update AVAudioSession+updateCategory.swift
* Create secondary `AVCaptureSession` for audio
* Configure once, start stop on demand
* format
* fix audio notification interruptions
* docs
2021-06-03 14:16:02 +02:00
Marc Rousavy
71730a73ef
fix: Fix AVAudioSession not allowing background music playback ( #155 )
...
* Set category always if different
* rename org
* Fix video format sorting
* fix format filtering
* Update AVAudioSession+setCategoryIfNotSet.swift
* upgrade all dependencies
* Also run dependabot for JS codebase
* Update MediaPage.tsx
* Use typescript 4.2.4
* Also run TS in check-all
* Downgrade typescript to 4.2.3
* f
* recreate lockfiles
* docs: Revert package.json changes
* revert all package.json changes
* Update Podfile.lock
* bump all dependencies, pin typescript to 4.2.4
* Downgrade react-native-navigation for now
* upgrade to later snapshot
* Update yarn.lock
* remove yeet
2021-06-01 13:07:57 +02:00
Marc Rousavy
b6a67d5ced
feature: Frame Processors (iOS) ( #2 )
...
* Clean up Frame Processor
* Create FrameProcessorHolder
* Create FrameProcessorDelegate in ObjC++
* Move frame processor to FrameProcessorDelegate
* Decorate runtime, check for null
* Update FrameProcessorDelegate.mm
* Cleanup FrameProcessorBindings.mm
* Fix RuntimeDecorator.h import
* Update FrameProcessorDelegate.mm
* "React" -> "React Helper" to avoid confusion
* Rename folders again
* Fix podspec flattening a lot of headers, causing REA nameclash
* Fix header imports to avoid REA naming collision
* Lazily initialize jsi::Runtime on DispatchQueue
* Install frame processor bindings from Swift
* First try to call jsi::Function (frame processor) 👀
* Call viewForReactTag on RCT main thread
* Fix bridge accessing
* Add more logs
* Update CameraViewManager.swift
* Add more TODOs
* Re-indent .cpp files
* Fix RCTTurboModule import podspec
* Remove unnecessary include check for swift umbrella header
* Merge branch 'main' into frame-processors
* Docs: use static width for images (283)
* Create validate-cpp.yml
* Update a lot of packages to latest
* Set SWIFT_VERSION to 5.2 in podspec
* Create clean.sh
* Delete unused C++ files
* podspec: Remove CLANG_CXX_LANGUAGE_STANDARD and OTHER_CFLAGS
* Update pod lockfiles
* Regenerate lockfiles
* Remove IOSLogger
* Use NSLog
* Create FrameProcessorManager (inherits from REA RuntimeManager)
* Create reanimated::RuntimeManager shared_ptr
* Re-integrate pods
* Add react-native-reanimated >=2 peerDependency
* Add metro-config
* blacklist -> exclusionList
* Try to call worklet
* Fix jsi::Value* initializer
* Call ShareableValue::adapt (makeShareable) with React/JS Runtime
* Add null-checks
* Lift runtime manager creation out of delegate, into bindings
* Remove debug statement
* Make RuntimeManager unique_ptr
* Set _FRAME_PROCESSOR
* Extract convertJSIFunctionToFrameProcessorCallback
* Print frame
* Merge branch 'main' into frame-processors
* Reformat Swift code
* Install reanimated from npm again
* Re-integrate Pods
* Dependabot: Also scan example/ and docs/
* Update validate-cpp.yml
* Create FrameProcessorUtils
* Create Frame.h
* Abstract HostObject creation away
* Fix types
* Fix frame processor call
* Add todo
* Update lockfiles
* Add C++ contributing instructions
* Update CONTRIBUTING.md
* Add android/src/main/cpp to cpplint
* Update cpplint.sh
* Fix a few cpplint errors
* Fix globals
* Fix a few more cpplint errors
* Update App.tsx
* Update AndroidLogger.cpp
* Format
* Fix cpplint script (check-cpp)
* Try to simplify frame processor
* y
* Update FrameProcessorUtils.mm
* Update FrameProcessorBindings.mm
* Update CameraView.swift
* Update CameraViewManager.m
* Restructure everything
* fix
* Fix `@objc` export (make public)
* Refactor installFrameProcessorBindings into FrameProcessorRuntimeManager
* Add swift RCTBridge.runOnJS helper
* Fix run(onJS)
* Add pragma once
* Add `&self` to lambda
* Update FrameProcessorRuntimeManager.mm
* reorder imports
* Fix imports
* forward declare
* Rename extension
* Destroy buffer after execution
* Add FrameProcessorPluginRegistry base
* Merge branch 'main' into frame-processors
* Add frameProcessor to types
* Update Camera.tsx
* Fix rebase merge
* Remove movieOutput
* Use `useFrameProcessor`
* Fix bad merge
* Add additional ESLint rules
* Update lockfiles
* Update CameraViewManager.m
* Add support for V8 runtime
* Add frame processor plugins API
* Print plugin invoke
* Fix React Utils in podspec
* Fix runOnJS swift name
* Remove invalid redecl of `captureSession`
* Use REA 2.1.0 which includes all my big PRs 🎉
* Update validate-cpp.yml
* Update Podfile.lock
* Remove Flipper
* Fix dereferencing
* Capture `self` by value. Fucking hell, what a dumb mistake.
* Override a few HostObject functions
* Expose isReady, width, height, bytesPerRow and planesCount
* use hook again
* Expose property names
* FrameProcessor -> Frame
* Update CameraView+RecordVideo.swift
* Add Swift support for Frame Processors Plugins
* Add macros for plugin installation
* Add ObjC frame processor plugin
* Correctly install frame processor plugins
* Don't require custom name for macro
* Check if plugin already exists
* Implement QR Code Frame Processor Plugin in Swift
* Adjust ObjC style frame processor macro
* optimize
* Add `frameProcessorFrameDropRate`
* Fix types
* Only log once
* Log if it executes slowly
* Implement `frameProcessorFps`
* Implement manual encoded video recordings
* Use recommended video settings
* Add fileType types
* Ignore if input is not ready for media data
* Add completion handler
* Add audio buffer sampling
* Init only for video frame
* use AVAssetWriterInputPixelBufferAdaptor
* Remove AVAssetWriterInputPixelBufferAdaptor
* Rotate VideoWriter
* Always assume portrait orientation
* Update RecordingSession.swift
* Use a separate Queue for Audio
* Format Swift
* Update CameraView+RecordVideo.swift
* Use `videoQueue` instead of `cameraQueue`
* Move example plugins to example app
* Fix hardcoded name in plugin macro
* QRFrame... -> QRCodeFrame...
* Update FrameProcessorPlugin.h
* Add example frame processors to JS base
* Update QRCodeFrameProcessorPluginSwift.m
* Add docs to create FP Plugins
* Update FRAME_PROCESSORS_CREATE.mdx
* Update FRAME_PROCESSORS_CREATE.mdx
* Use `AVAssetWriterInputPixelBufferAdaptor` for efficient pixel buffer recycling
* Add customizable `pixelFormat`
* Use native format if available
* Update project.pbxproj
* Set video width and height as source-pixel-buffer attributes
* Catch
* Update App.tsx
* Don't explicitly set video dimensions, let CVPixelBufferPool handle it
* Add a few logs
* Cleanup
* Update CameraView+RecordVideo.swift
* Eagerly initialize asset writer to fix stutter at first frame
* Use `cameraQueue` DispatchQueue to not block CaptureDataOutputDelegate
* Fix duration calculation
* cleanup
* Cleanup
* Swiftformat
* Return available video codecs
* Only show frame drop notification for video output
* Remove photo and video codec functionality
It was too much complexity and probably never used anyways.
* Revert all android related changes for now
* Cleanup
* Remove unused header
* Update AVAssetWriter.Status+descriptor.swift
* Only call Frame Processor for Video Frames
* Fix `if`
* Add support for Frame Processor plugin parameters/arguments
* Fix arg support
* Move to JSIUtils.mm
* Update JSIUtils.h
* Update FRAME_PROCESSORS_CREATE.mdx
* Update FRAME_PROCESSORS_CREATE.mdx
* Upgrade packages for docs/
* fix docs
* Rename
* highlight lines
* docs
* community plugins
* Update FRAME_PROCESSOR_CREATE_FINAL.mdx
* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx
* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx
* Update dependencies (1/2)
* Update dependencies (2/2)
* Update Gemfile.lock
* add FP docs
* Update README.md
* Make `lastFrameProcessor` private
* add `frameProcessor` docs
* fix docs
* adjust docs
* Update DEVICES.mdx
* fix
* s
* Add logs demo
* add metro restart note
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* Mirror video device
* Update AVCaptureVideoDataOutput+mirror.swift
* Create .swift-version
* Enable whole module optimization
* Fix recording mirrored video
* Swift format
* Clean dictionary on `markInvalid`
* Fix cleanup
* Add docs for disabling frame processors
* Update project.pbxproj
* Revert "Update project.pbxproj"
This reverts commit e67861e51b88b4888a6940e2d20388f3044211d0.
* Log frame drop reason
* Format
* add more samples
* Add clang-format
* also check .mm
* Revert "also check .mm"
This reverts commit 8b9d5e2c29866b05909530d104f6633d6c49eadd.
* Revert "Add clang-format"
This reverts commit 7643ac808e0fc34567ea1f814e73d84955381636.
* Use `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange` as default
* Read matching video attributes from videoSettings
* Add TODO
* Swiftformat
* Conditionally disable frame processors
* Assert if trying to use frame processors when disabled
* Add frame-processors demo gif
* Allow disabling frame processors via `VISION_CAMERA_DISABLE_FRAME_PROCESSORS`
* Update FrameProcessorRuntimeManager.mm
* Update FRAME_PROCESSORS.mdx
* Update project.pbxproj
* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx
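A sketch of the JS-side result of this PR, based on the `useFrameProcessor` hook and the example QR code plugin mentioned above; the plugin's registered name (`__scanQRCodes`) and its exposure as a global on the frame processor runtime are assumptions:
```js
// Sketch: calling a native Frame Processor Plugin from a worklet.
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // '__scanQRCodes' stands in for the example QR code plugin; name and return shape are assumptions
  const qrCodes = __scanQRCodes(frame)
  console.log(`QR codes in this frame: ${JSON.stringify(qrCodes)}`)
}, [])
```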
2021-05-06 14:11:55 +02:00
Marc Rousavy
00c8970366
Add iOS
2021-02-19 16:28:05 +01:00