Marc Rousavy
375e894038
feat: Complete iOS Codebase rewrite ( #1647 )
...
* Make Frame Processors an extra subspec
* Update VisionCamera.podspec
* Make optional
* Make VisionCamera compile without Skia
* Fix
* Add skia again
* Update VisionCamera.podspec
* Make VisionCamera build without Frame Processors
* Rename error to `system/frame-processors-unavailable`
* Fix Frame Processor returning early
* Remove `preset`, FP partial rewrite
* Only warn on frame drop
* Fix wrong queue
* fix: Run on CameraQueue again
* Update CameraView.swift
* fix: Activate audio session asynchronously on audio queue
* Update CameraView+RecordVideo.swift
* Update PreviewView.h
* Cleanups
* Cleanup
* fix cast
* feat: Add LiDAR Depth Camera support
* Upgrade Ruby
* Add vector icons type
* Update Gemfile.lock
* fix: Stop queues on deinit
* Also load `builtInTrueDepthCamera`
* Update CameraViewManager.swift
* Update SkImageHelpers.mm
* Extract FrameProcessorCallback to FrameProcessor
Holds more context now :)
* Rename to .m
* fix: Add `RCTLog` import
* Create SkiaFrameProcessor
* Update CameraBridge.h
* Call Frame Processor
* Fix defines
* fix: Allow deleting callback funcs
* fix Skia build
* batch
* Just call `setSkiaFrameProcessor`
* Rewrite in Swift
* Pass `SkiaRenderer`
* Fix Import
* Move `PreviewView` to Swift
* Fix Layer
* Set Skia Canvas to Frame Host Object
* Make `DrawableFrameHostObject` subclass
* Fix TS types
* Use same MTLDevice and apply scale
* Make getter
* Extract `setTorch` and `Preview`
* fix: Fix nil metal device
* Don't wait for session stop in deinit
* Use main pixel ratio
* Use unique_ptr for Render Contexts
* fix: Fix SkiaPreviewDisplayLink broken after deinit
* inline `getTextureCache`
* Update CameraPage.tsx
* chore: Format iOS
* perf: Allow MTLLayer to be optimized for only frame buffers
* Add RN Video types
* fix: Fix Frame Processors if guard
* Find nodeModules recursively
* Create `Frame.isDrawable`
* Add `cocoapods-check` dependency
2023-07-20 15:30:04 +02:00
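The rewrite above adds LiDAR Depth Camera support and also loads `builtInTrueDepthCamera`. A minimal Swift sketch of how such depth-capable devices can be discovered with `AVCaptureDevice.DiscoverySession` (illustrative only; `findDepthCapableDevice` is an assumed helper, not VisionCamera's actual implementation):

```swift
import AVFoundation

// Hypothetical helper: enumerate depth-capable cameras, preferring LiDAR.
// .builtInLiDARDepthCamera requires iOS 15.4+, .builtInTrueDepthCamera is older.
func findDepthCapableDevice(position: AVCaptureDevice.Position) -> AVCaptureDevice? {
  var deviceTypes: [AVCaptureDevice.DeviceType] = [.builtInTrueDepthCamera]
  if #available(iOS 15.4, *) {
    deviceTypes.insert(.builtInLiDARDepthCamera, at: 0)
  }
  let session = AVCaptureDevice.DiscoverySession(deviceTypes: deviceTypes,
                                                 mediaType: .video,
                                                 position: position)
  // DiscoverySession returns devices in the order of `deviceTypes`, so LiDAR wins if present.
  return session.devices.first
}
```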
Marc Rousavy
12f850c8e1
feat: Draw onto Frame as if it was a Skia Canvas ( #1479 )
...
* Create Shaders.ts
* Add `previewType` and `enableFpsGraph`
* Add RN Skia native dependency
* Add Skia Preview View on iOS
* Pass 1
* Update FrameHostObject.mm
* Wrap Canvas
* Lockfiles
* fix: Fix stuff
* chore: Upgrade RNWorklets
* Add `previewType` to set the Preview
* feat: Add Example
* Update project.pbxproj
* `enableFpsGraph`
* Cache the `std::shared_ptr<FrameHostObject>`
* Update CameraView+RecordVideo.swift
* Update SkiaMetalCanvasProvider.mm
* Android: Integrate Skia Dependency
* fix: Use new Prefix
* Add example for rendering shader
* chore: Upgrade CameraX
* Remove KTX
* Enable `viewBinding`
* Revert "Enable `viewBinding`"
This reverts commit f2a603f53b33ea4311a296422ffd1a910ce03f9e.
* Revert "chore: Upgrade CameraX"
This reverts commit 8dc832cf8754490d31a6192e6c1a1f11cdcd94fe.
* Remove unneeded `ProcessCameraProvider.getInstance()` call
* fix: Add REA hotfix patch
* fix: Fix FrameHostObject dead in runAsync
* fix: Make `runAsync` run truly async by dropping new Frames while executing
* chore: Upgrade RN Worklets to latest
* chore: Upgrade RN Skia
* Revert "Remove KTX"
This reverts commit 253f586633f7af2da992d2279fc206dc62597129.
* Make Skia optional in CMake
* Fix import
* Update CMakeLists.txt
* Update build.gradle
* Update CameraView.kt
* Update CameraView.kt
* Update CameraView.kt
* Update Shaders.ts
* Center Blur
* chore: Upgrade RN Worklets
* feat: Add `toByteArray()`, `orientation`, `isMirrored` and `timestamp` to `Frame` (#1487 )
* feat: Implement `orientation` and `isMirrored` on Frame
* feat: Add `toArrayBuffer()` func
* perf: Do faster buffer copy
* feat: Implement `toArrayBuffer()` on Android
* feat: Add `orientation` and `isMirrored` to Android
* feat: Add `timestamp` to Frame
* Update Frame.ts
* Update JImageProxy.h
* Update FrameHostObject.cpp
* Update FrameHostObject.cpp
* Update CameraPage.tsx
* fix: Format Swift
2023-02-21 15:00:48 +01:00
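PR #1487 above exposes `orientation`, `isMirrored` and `timestamp` on `Frame`. A rough Swift sketch of where such metadata can come from on the native side (an assumed helper for illustration, not the library's actual code):

```swift
import AVFoundation

// Hypothetical sketch: derive a Frame's metadata from the sample buffer and
// the capture connection that delivered it.
func frameMetadata(for sampleBuffer: CMSampleBuffer,
                   connection: AVCaptureConnection) -> (timestampMs: Double,
                                                        isMirrored: Bool,
                                                        orientation: AVCaptureVideoOrientation) {
  // Presentation timestamp of the frame, converted to milliseconds.
  let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
  let timestampMs = CMTimeGetSeconds(pts) * 1000.0
  // Mirroring and orientation come from the video connection.
  return (timestampMs, connection.isVideoMirrored, connection.videoOrientation)
}
```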
Marc Rousavy
9301430165
chore: Lint
2021-12-10 09:44:54 +01:00
Marc Rousavy
ad5e131f6a
feat: frameProcessorFps="auto" and automatic performance suggestions (throttle or increase FPS) ( #393 )
...
* Add `onFrameProcessorPerformanceSuggestionAvailable` and make `frameProcessorFps` support `auto`
* Implement performance suggestion and auto-adjusting
* Fix FPS setting, evaluate correctly
* Floor suggested FPS
* Remove `console.log` for frame drop warnings.
* Swift format
* Use `30` magic number
* only call if FPS is different
* Update CameraView.swift
* Implement Android 1/2
* Cleanup
* Update `frameProcessorFps` if available
* Optimize `FrameProcessorPerformanceDataCollector` initialization
* Cache call
* Set frameProcessorFps directly (Kotlin setter)
* Don't suggest if same value
* Call suggestion every second
* reset time on set
* Always store 15 last samples
* reset counter too
* Update FrameProcessorPerformanceDataCollector.swift
* Update CameraView+RecordVideo.swift
* Update CameraView.kt
* iOS: Redesign evaluation
* Update CameraView+RecordVideo.swift
* Android: Redesign evaluation
* Update CameraView.kt
* Update REA to latest alpha and install RNScreens
* Fix frameProcessorFps updating
2021-09-06 16:27:16 +02:00
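The commit above keeps the last 15 samples, floors the suggested FPS, and only emits a suggestion when the value changes. A minimal sketch of that idea in Swift (names such as `PerformanceSampleCollector` and `suggestedFps` are assumptions, not the actual `FrameProcessorPerformanceDataCollector` API):

```swift
import Foundation

// Illustrative sketch of the auto-FPS idea: keep the last 15 frame processor
// execution times and suggest floor(1 / average) as the new frameProcessorFps.
struct PerformanceSampleCollector {
  private var samples: [Double] = []   // execution times in seconds
  private let maxSampleCount = 15

  mutating func record(executionTime: Double) {
    samples.append(executionTime)
    if samples.count > maxSampleCount {
      samples.removeFirst(samples.count - maxSampleCount)
    }
  }

  var hasEnoughSamples: Bool { samples.count >= maxSampleCount }

  // The highest FPS the frame processor can currently sustain, floored.
  var suggestedFps: Int? {
    guard hasEnoughSamples else { return nil }
    let average = samples.reduce(0, +) / Double(samples.count)
    guard average > 0 else { return nil }
    return Int(floor(1.0 / average))
  }
}
```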
Marc Rousavy
3c845ed4b0
fix: Fix app hard-crashing when FPS value is not supported ( #391 )
2021-08-28 14:14:16 +02:00
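The fix above guards against unsupported FPS values. One way to do that, sketched here as an assumed helper (not necessarily how #391 implements it), is to clamp the requested FPS to the active format's supported frame-rate ranges:

```swift
import AVFoundation

// Sketch: clamp the requested FPS to what the active format actually supports
// instead of hard-crashing when an out-of-range value is passed.
func clampedFrameRate(_ requestedFps: Double, for device: AVCaptureDevice) -> Double {
  let ranges = device.activeFormat.videoSupportedFrameRateRanges
  guard let maxSupported = ranges.map(\.maxFrameRate).max(),
        let minSupported = ranges.map(\.minFrameRate).min() else {
    return requestedFps
  }
  return min(max(requestedFps, minSupported), maxSupported)
}
```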
Marc Rousavy
ef455df865
feat: Support rotation ( #301 )
...
* feat: Android: Listen to rotation changes
* Only change rotation on configuration change
* feat: iOS: Support Rotation
* Swift lint
2021-07-26 11:32:58 +02:00
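For the iOS side of the rotation support above, a minimal sketch of listening to device rotation and forwarding the new orientation to the capture connections (illustrative only; `OrientationObserver` is a made-up type):

```swift
import AVFoundation
import UIKit

// Sketch: observe device rotation and report the matching video orientation.
final class OrientationObserver: NSObject {
  var onOrientationChanged: ((AVCaptureVideoOrientation) -> Void)?

  override init() {
    super.init()
    UIDevice.current.beginGeneratingDeviceOrientationNotifications()
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(orientationDidChange),
                                           name: UIDevice.orientationDidChangeNotification,
                                           object: nil)
  }

  @objc private func orientationDidChange() {
    // Map UIDeviceOrientation to AVCaptureVideoOrientation (landscape is flipped).
    let videoOrientation: AVCaptureVideoOrientation
    switch UIDevice.current.orientation {
    case .landscapeLeft: videoOrientation = .landscapeRight
    case .landscapeRight: videoOrientation = .landscapeLeft
    case .portraitUpsideDown: videoOrientation = .portraitUpsideDown
    default: videoOrientation = .portrait
    }
    onOrientationChanged?(videoOrientation)
  }
}
```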
Marc Rousavy
4b4ea0ff33
fix: Fix UI Thread race condition in setFrameProcessor(...) ( #265 )
...
* fix: Fix UI Thread race condition in `setFrameProcessor(...)`
* Revert "fix: Fix UI Thread race condition in `setFrameProcessor(...)`"
This reverts commit 9c524e123cff6843d7d11db602a5027d1bb06b4b.
* Use `setImmediate` to call `setFrameProcessor(...)`
* Fix frame processor order of applying
* Add `enableFrameProcessor` prop that defines if a FP is added
* rename constant
* Implement `enableFrameProcessor` prop for Android and make `frameProcessorFps` faster
* link to troubleshooting guide
* Update TROUBLESHOOTING.mdx
* Add logs for use-cases
* fix log
* set initial frame processor in `onLayout` instead of `componentDidMount`
2021-07-12 15:16:03 +02:00
Marc Rousavy
f52e9fd831
fix: Correctly catch unsupported colorSpace errors
2021-06-12 11:21:26 +02:00
Marc Rousavy
9c579c65aa
try: Improvements from WWDC 2021 1:1 workshop ( #197 )
...
* perf: Automatically determine Pixel Format depending on active format. (More efficient video recording 🚀 )
* perf: Skip `AVAssetWriter` transform by directly correctly orienting the Video Output connection
* feat: Support camera flipping while recording
* feat: Run frame processor on separate queue, avoids stutters in video recordings
* feat: Automatically drop late frame processor frames
2021-06-11 21:06:19 +02:00
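The "skip `AVAssetWriter` transform" improvement above rotates frames at the capture connection instead of attaching a transform to the writer input, so the writer already receives upright buffers. A minimal sketch of that technique (an assumed helper, not the commit's exact code):

```swift
import AVFoundation

// Sketch: orient the video data output's connection directly, so no
// AVAssetWriterInput.transform is needed later.
func orientVideoOutput(_ videoOutput: AVCaptureVideoDataOutput,
                       to orientation: AVCaptureVideoOrientation) {
  guard let connection = videoOutput.connection(with: .video) else { return }
  if connection.isVideoOrientationSupported {
    connection.videoOrientation = orientation
  }
}
```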
Marc Rousavy
0e606affce
feat: High quality mode (enableHighQualityPhotos) ( #194 )
...
* feat: High Quality photo capture
* prepare photo output for re-used settings
* use high quality captures
* Remove `enableVirtualDeviceFusion` as that is enabled by default
* Clean up configuration, remove default
* format
* Update CameraViewManager.kt
* rename
* Update CameraProps.ts
* Fix overriding `photoSettings`
* Update CameraView+TakePhoto.swift
* Update CameraView+TakePhoto.swift
2021-06-10 13:49:34 +02:00
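A hedged sketch of what an `enableHighQualityPhotos` flag could translate to on the photo output and per-capture settings (illustrative; the actual prop wiring in #194 lives in CameraView and may differ):

```swift
import AVFoundation

// Sketch: prepare the photo output for high-quality captures once...
func configurePhotoOutput(_ photoOutput: AVCapturePhotoOutput, highQuality: Bool) {
  photoOutput.isHighResolutionCaptureEnabled = highQuality
  if #available(iOS 13.0, *) {
    photoOutput.maxPhotoQualityPrioritization = highQuality ? .quality : .speed
  }
}

// ...and reuse matching settings for each capture.
func makePhotoSettings(for photoOutput: AVCapturePhotoOutput,
                       highQuality: Bool) -> AVCapturePhotoSettings {
  let settings = AVCapturePhotoSettings()
  settings.isHighResolutionPhotoEnabled = highQuality && photoOutput.isHighResolutionCaptureEnabled
  if #available(iOS 13.0, *) {
    settings.photoQualityPrioritization = highQuality ? .quality : .speed
  }
  return settings
}
```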
Marc Rousavy
16f2a7cdec
chore: Cleanup void returns ( #187 )
...
* Place `return` in `return [void]` on separate line
* format
* Update CameraView+RecordVideo.swift
* f
2021-06-09 11:14:49 +02:00
Marc Rousavy
72a1fad78e
feat: Separate usecases (decouple microphone, video, photo) ( #168 )
...
* Add props
* add props (iOS)
* Add use-cases conditionally
* Update CameraView+RecordVideo.swift
* Update RecordingSession.swift
* reconfigure on change
* Throw correct errors
* Check for audio permission
* Move `#if` outward
* Throw appropriate errors
* Update CameraView+RecordVideo.swift
* fix Splashscreen
* Dynamic filePath
* Fix video extension
* add `avci` and `m4v` file types
* Fix RecordVideo errors
* Fix audio setup
* Enable `photo`, `video` and `audio`
* Check for `video={true}` in frameProcessor
* format
* Remove unused DispatchQueue
* Update docs
* Add `supportsPhotoAndVideoCapture`
* Fix view manager
* Fix error not being propagated
* Catch normal errors too
* Update DEVICES.mdx
* Update CAPTURING.mdx
* Update classdocs
2021-06-07 13:08:40 +02:00
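The decoupling in #168 adds use-cases conditionally based on the `photo`, `video` and `audio` props. A rough sketch of that pattern, assuming pre-built outputs (not the commit's exact configuration code):

```swift
import AVFoundation

// Sketch: only attach the outputs the user asked for, inside a single
// begin/commitConfiguration block.
func configureSession(_ session: AVCaptureSession,
                      photo: Bool, video: Bool, audio: Bool,
                      photoOutput: AVCapturePhotoOutput,
                      videoOutput: AVCaptureVideoDataOutput,
                      audioOutput: AVCaptureAudioDataOutput) {
  session.beginConfiguration()
  defer { session.commitConfiguration() }

  if photo, session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
  }
  if video, session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
  }
  if audio, session.canAddOutput(audioOutput) {
    session.addOutput(audioOutput)
  }
}
```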
Marc Rousavy
555474be7d
fix: Represent neutralZoom in factor instead of percentage ( #179 )
...
* Use factor instead of percent for `neutralZoom`
* fix zoom calculation
* Update CameraPage.tsx
2021-06-07 10:46:53 +02:00
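With #179, `neutralZoom` is a zoom factor rather than a 0–1 percentage, so it maps directly onto `AVCaptureDevice.videoZoomFactor`. A small sketch of applying it, clamped to the device's range (an assumed helper, not the actual zoom code):

```swift
import AVFoundation
import CoreGraphics

// Sketch: neutralZoom is now a plain zoom factor (e.g. 2.0 on a device whose
// wide lens sits at 2x), so it can be applied directly, clamped to the
// device's supported range.
func applyZoom(_ zoomFactor: CGFloat, to device: AVCaptureDevice) throws {
  try device.lockForConfiguration()
  defer { device.unlockForConfiguration() }
  device.videoZoomFactor = min(max(zoomFactor, device.minAvailableVideoZoomFactor),
                               device.maxAvailableVideoZoomFactor)
}
```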
Marc Rousavy
eeb765f018
fix: Move Audio Input initialization shortly before startRecording ( #159 )
...
* rename
* Update AVAudioSession+updateCategory.swift
* fix bootstrap script
* Update CameraView+AVAudioSession.swift
* move audio input adding lower
* Activate AudioSession only when starting recording
* format
* Deactivate Audio Session
* remove audio input before deactivating audio session
* Update CameraView+AVAudioSession.swift
* log time
* Update CameraView+AVAudioSession.swift
* measure time with `measureElapsedTime`
* Update project.pbxproj
* only log in debug builds
* bootstrap with bridge (RNN new API)
* Mark two funcs as `@inlinable`
* format
* Update ReactLogger.swift
* Make audioWriter optional (allow videos without sound)
* only log frame drop reason in DEBUG
* Make audio writing entirely optional
* format
* Use function name as label for measureElapsedTime
* Update MeasureElapsedTime.swift
* Update MeasureElapsedTime.swift
* Mark AudioWriter as finished
* set `automaticallyConfiguresApplicationAudioSession` once
* Add JS console logging
* log to JS console for a few logs
* Update AVAudioSession+updateCategory.swift
* format
* Update JSConsoleHelper.mm
* catch log errors
* Update ReactLogger.swift
* fix docs
* Update RecordingSession.swift
* Immediately add audio input
* Update CameraView+AVCaptureSession.swift
* Update CameraView+AVCaptureSession.swift
* Update ReactLogger.swift
* immediately set audio session
* extract
* format
* Update TROUBLESHOOTING.mdx
* hmm
* Update AVAudioSession+updateCategory.swift
* Create secondary `AVCaptureSession` for audio
* Configure once, start stop on demand
* format
* fix audio notification interruptions
* docs
2021-06-03 14:16:02 +02:00
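The core idea of #159 is to configure the audio path once but only activate the shared `AVAudioSession` around an actual recording, deactivating it afterwards so other apps' audio can resume. A minimal sketch of that activate/deactivate pair (illustrative; option choices are assumptions):

```swift
import AVFoundation

// Sketch: activate the audio session only right before recording starts...
func activateAudioSession() throws {
  let audioSession = AVAudioSession.sharedInstance()
  try audioSession.setCategory(.playAndRecord,
                               options: [.mixWithOthers, .allowBluetooth, .defaultToSpeaker])
  try audioSession.setActive(true)
}

// ...and deactivate it right after recording stops, notifying other apps.
func deactivateAudioSession() throws {
  try AVAudioSession.sharedInstance().setActive(false, options: [.notifyOthersOnDeactivation])
}
```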
Marc Rousavy
71730a73ef
fix: Fix AVAudioSession not allowing background music playback ( #155 )
...
* Set category always if different
* rename org
* Fix video format sorting
* fix format filtering
* Update AVAudioSession+setCategoryIfNotSet.swift
* upgrade all dependencies
* Also run dependabot for JS codebase
* Update MediaPage.tsx
* Use typescript 4.2.4
* Also run TS in check-all
* Downgrade typescript to 4.2.3
* f
* recreate lockfiles
* docs: Revert package.json changes
* revert all package.json changes
* Update Podfile.lock
* bump all dependencies, pin typescript to 4.2.4
* Downgrade react-native-navigation for now
* upgrade to later snapshot
* Update yarn.lock
* remove yeet
2021-06-01 13:07:57 +02:00
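The "set category always if different" fix in #155 boils down to skipping the category switch when nothing changed, since switching the category is what interrupts background music. A hedged sketch of such an extension (the `updateCategory` name is an assumption):

```swift
import AVFoundation

// Sketch: only touch the audio session category when it actually differs,
// so background playback isn't interrupted unnecessarily.
extension AVAudioSession {
  func updateCategory(_ category: AVAudioSession.Category,
                      options: AVAudioSession.CategoryOptions = []) throws {
    guard self.category != category || self.categoryOptions != options else {
      return // already configured
    }
    try setCategory(category, options: options)
  }
}
```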
Marc Rousavy
b6a67d5ced
feature: Frame Processors (iOS) ( #2 )
...
* Clean up Frame Processor
* Create FrameProcessorHolder
* Create FrameProcessorDelegate in ObjC++
* Move frame processor to FrameProcessorDelegate
* Decorate runtime, check for null
* Update FrameProcessorDelegate.mm
* Cleanup FrameProcessorBindings.mm
* Fix RuntimeDecorator.h import
* Update FrameProcessorDelegate.mm
* "React" -> "React Helper" to avoid confusion
* Rename folders again
* Fix podspec flattening a lot of headers, causing REA name clash
* Fix header imports to avoid REA naming collision
* Lazily initialize jsi::Runtime on DispatchQueue
* Install frame processor bindings from Swift
* First try to call jsi::Function (frame processor) 👀
* Call viewForReactTag on RCT main thread
* Fix bridge accessing
* Add more logs
* Update CameraViewManager.swift
* Add more TODOs
* Re-indent .cpp files
* Fix RCTTurboModule import podspec
* Remove unnecessary include check for swift umbrella header
* Merge branch 'main' into frame-processors
* Docs: use static width for images (283)
* Create validate-cpp.yml
* Update a lot of packages to latest
* Set SWIFT_VERSION to 5.2 in podspec
* Create clean.sh
* Delete unused C++ files
* podspec: Remove CLANG_CXX_LANGUAGE_STANDARD and OTHER_CFLAGS
* Update pod lockfiles
* Regenerate lockfiles
* Remove IOSLogger
* Use NSLog
* Create FrameProcessorManager (inherits from REA RuntimeManager)
* Create reanimated::RuntimeManager shared_ptr
* Re-integrate pods
* Add react-native-reanimated >=2 peerDependency
* Add metro-config
* blacklist -> exclusionList
* Try to call worklet
* Fix jsi::Value* initializer
* Call ShareableValue::adapt (makeShareable) with React/JS Runtime
* Add null-checks
* Lift runtime manager creation out of delegate, into bindings
* Remove debug statement
* Make RuntimeManager unique_ptr
* Set _FRAME_PROCESSOR
* Extract convertJSIFunctionToFrameProcessorCallback
* Print frame
* Merge branch 'main' into frame-processors
* Reformat Swift code
* Install reanimated from npm again
* Re-integrate Pods
* Dependabot: Also scan example/ and docs/
* Update validate-cpp.yml
* Create FrameProcessorUtils
* Create Frame.h
* Abstract HostObject creation away
* Fix types
* Fix frame processor call
* Add todo
* Update lockfiles
* Add C++ contributing instructions
* Update CONTRIBUTING.md
* Add android/src/main/cpp to cpplint
* Update cpplint.sh
* Fix a few cpplint errors
* Fix globals
* Fix a few more cpplint errors
* Update App.tsx
* Update AndroidLogger.cpp
* Format
* Fix cpplint script (check-cpp)
* Try to simplify frame processor
* y
* Update FrameProcessorUtils.mm
* Update FrameProcessorBindings.mm
* Update CameraView.swift
* Update CameraViewManager.m
* Restructure everything
* fix
* Fix `@objc` export (make public)
* Refactor installFrameProcessorBindings into FrameProcessorRuntimeManager
* Add swift RCTBridge.runOnJS helper
* Fix run(onJS)
* Add pragma once
* Add `&self` to lambda
* Update FrameProcessorRuntimeManager.mm
* reorder imports
* Fix imports
* forward declare
* Rename extension
* Destroy buffer after execution
* Add FrameProcessorPluginRegistry base
* Merge branch 'main' into frame-processors
* Add frameProcessor to types
* Update Camera.tsx
* Fix rebase merge
* Remove movieOutput
* Use `useFrameProcessor`
* Fix bad merge
* Add additional ESLint rules
* Update lockfiles
* Update CameraViewManager.m
* Add support for V8 runtime
* Add frame processor plugins API
* Print plugin invoke
* Fix React Utils in podspec
* Fix runOnJS swift name
* Remove invalid redecl of `captureSession`
* Use REA 2.1.0 which includes all my big PRs 🎉
* Update validate-cpp.yml
* Update Podfile.lock
* Remove Flipper
* Fix dereferencing
* Capture `self` by value. Fucking hell, what a dumb mistake.
* Override a few HostObject functions
* Expose isReady, width, height, bytesPerRow and planesCount
* use hook again
* Expose property names
* FrameProcessor -> Frame
* Update CameraView+RecordVideo.swift
* Add Swift support for Frame Processors Plugins
* Add macros for plugin installation
* Add ObjC frame processor plugin
* Correctly install frame processor plugins
* Don't require custom name for macro
* Check if plugin already exists
* Implement QR Code Frame Processor Plugin in Swift
* Adjust ObjC style frame processor macro
* optimize
* Add `frameProcessorFrameDropRate`
* Fix types
* Only log once
* Log if it executes slowly
* Implement `frameProcessorFps`
* Implement manual encoded video recordings
* Use recommended video settings
* Add fileType types
* Ignore if input is not ready for media data
* Add completion handler
* Add audio buffer sampling
* Init only for video frame
* use AVAssetWriterInputPixelBufferAdaptor
* Remove AVAssetWriterInputPixelBufferAdaptor
* Rotate VideoWriter
* Always assume portrait orientation
* Update RecordingSession.swift
* Use a separate Queue for Audio
* Format Swift
* Update CameraView+RecordVideo.swift
* Use `videoQueue` instead of `cameraQueue`
* Move example plugins to example app
* Fix hardcoded name in plugin macro
* QRFrame... -> QRCodeFrame...
* Update FrameProcessorPlugin.h
* Add example frame processors to JS base
* Update QRCodeFrameProcessorPluginSwift.m
* Add docs to create FP Plugins
* Update FRAME_PROCESSORS_CREATE.mdx
* Update FRAME_PROCESSORS_CREATE.mdx
* Use `AVAssetWriterInputPixelBufferAdaptor` for efficient pixel buffer recycling
* Add customizable `pixelFormat`
* Use native format if available
* Update project.pbxproj
* Set video width and height as source-pixel-buffer attributes
* Catch
* Update App.tsx
* Don't explicitly set video dimensions, let CVPixelBufferPool handle it
* Add a few logs
* Cleanup
* Update CameraView+RecordVideo.swift
* Eagerly initialize asset writer to fix stutter at first frame
* Use `cameraQueue` DispatchQueue to not block CaptureDataOutputDelegate
* Fix duration calculation
* cleanup
* Cleanup
* Swiftformat
* Return available video codecs
* Only show frame drop notification for video output
* Remove photo and video codec functionality
It was too much complexity and probably never used anyways.
* Revert all android related changes for now
* Cleanup
* Remove unused header
* Update AVAssetWriter.Status+descriptor.swift
* Only call Frame Processor for Video Frames
* Fix `if`
* Add support for Frame Processor plugin parameters/arguments
* Fix arg support
* Move to JSIUtils.mm
* Update JSIUtils.h
* Update FRAME_PROCESSORS_CREATE.mdx
* Update FRAME_PROCESSORS_CREATE.mdx
* Upgrade packages for docs/
* fix docs
* Rename
* highlight lines
* docs
* community plugins
* Update FRAME_PROCESSOR_CREATE_FINAL.mdx
* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx
* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx
* Update dependencies (1/2)
* Update dependencies (2/2)
* Update Gemfile.lock
* add FP docs
* Update README.md
* Make `lastFrameProcessor` private
* add `frameProcessor` docs
* fix docs
* adjust docs
* Update DEVICES.mdx
* fix
* s
* Add logs demo
* add metro restart note
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* Mirror video device
* Update AVCaptureVideoDataOutput+mirror.swift
* Create .swift-version
* Enable whole module optimization
* Fix recording mirrored video
* Swift format
* Clean dictionary on `markInvalid`
* Fix cleanup
* Add docs for disabling frame processors
* Update project.pbxproj
* Revert "Update project.pbxproj"
This reverts commit e67861e51b88b4888a6940e2d20388f3044211d0.
* Log frame drop reason
* Format
* add more samples
* Add clang-format
* also check .mm
* Revert "also check .mm"
This reverts commit 8b9d5e2c29866b05909530d104f6633d6c49eadd.
* Revert "Add clang-format"
This reverts commit 7643ac808e0fc34567ea1f814e73d84955381636.
* Use `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange` as default
* Read matching video attributes from videoSettings
* Add TODO
* Swiftformat
* Conditionally disable frame processors
* Assert if trying to use frame processors when disabled
* Add frame-processors demo gif
* Allow disabling frame processors via `VISION_CAMERA_DISABLE_FRAME_PROCESSORS`
* Update FrameProcessorRuntimeManager.mm
* Update FRAME_PROCESSORS.mdx
* Update project.pbxproj
* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx
2021-05-06 14:11:55 +02:00
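Several bullets above revolve around recording via `AVAssetWriterInputPixelBufferAdaptor` with `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange` as the default pixel format, letting the adaptor's `CVPixelBufferPool` handle dimensions. A minimal sketch of that setup (assumed helper name; not the actual `RecordingSession` code):

```swift
import AVFoundation

// Sketch: the adaptor owns a CVPixelBufferPool, so appending frames recycles
// pixel buffers instead of allocating a new one per frame.
func makeVideoWriterInput(width: Int, height: Int)
    -> (input: AVAssetWriterInput, adaptor: AVAssetWriterInputPixelBufferAdaptor) {
  // Output settings for the encoded video track.
  let settings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: width,
    AVVideoHeightKey: height,
  ]
  let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
  input.expectsMediaDataInRealTime = true

  // Only the pixel format is pinned; the pool picks up dimensions from the buffers.
  let attributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
  ]
  let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                     sourcePixelBufferAttributes: attributes)
  return (input, adaptor)
}
```

Per frame, buffers would then be appended via `adaptor.append(_:withPresentationTime:)`, but only while `input.isReadyForMoreMediaData` is true (the "Ignore if input is not ready for media data" bullet).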
Marc Rousavy
4ea636e0d0
Automatically handle Audio interruptions ( #113 )
...
* Remove audio device when interruption begins
* Remove ReactLogger:alsoLogToJS
* Fix ReactLogger.logJS calls
* Fix `AVCaptureSessionInterruptionReasonKey` cast
2021-03-29 14:12:04 +02:00
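A small sketch of the interruption handling in #113: observe the capture session's interruption notification and read the reason from `userInfo` via its raw integer value, which is the cast the last bullet refers to (illustrative helper, not the commit's exact code):

```swift
import AVFoundation

// Sketch: react to capture-session interruptions, e.g. the audio device
// being grabbed by another app.
func observeInterruptions(of session: AVCaptureSession,
                          onAudioInUse: @escaping () -> Void) -> NSObjectProtocol {
  return NotificationCenter.default.addObserver(forName: .AVCaptureSessionWasInterrupted,
                                                object: session,
                                                queue: nil) { notification in
    guard let rawReason = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
          let reason = AVCaptureSession.InterruptionReason(rawValue: rawReason) else {
      return
    }
    if reason == .audioDeviceInUseByAnotherClient {
      // e.g. remove the audio input until the interruption ends
      onAudioInUse()
    }
  }
}
```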
Marc Rousavy
cd180dc73b
Set automaticallyConfiguresApplicationAudioSession = false
2021-03-29 13:18:02 +02:00
Marc Rousavy
66b93181e1
Remove code scanning ( #112 )
...
* Remove Audio Device if it failed to configure
* Add `audio-in-use-by-other-app` error
* Try removing on interruption
* Format code
* Remove code scanning
* Fix export
2021-03-29 11:34:35 +02:00
Marc Rousavy
1558dd2f15
Error when Audio Input is in use by another app ( #111 )
...
* Remove Audio Device if it failed to configure
* Add `audio-in-use-by-other-app` error
* Try removing on interruption
* Format code
* Make error more clear
2021-03-29 11:32:00 +02:00
Marc Rousavy
b25cf6a04f
Refactor lifecycle vars
2021-03-26 16:28:08 +01:00
Marc Rousavy
9404b93dc3
Extract AVCaptureSession and AVAudioSession setup to extensions
2021-03-26 16:20:57 +01:00