Marc Rousavy | 0f0f1fbb07
fix: React Native 0.65 compatibility (#230)
* Update Hermes Header import for RN 0.65
* Update VisionCamera.podspec
2021-06-28 18:27:03 +02:00

Marc Rousavy | 9ea158ad8f
chore: Move to /mrousavy/ (#224)
* rename 1/n
* 2
* 3
* fix indent
2021-06-21 22:42:46 +02:00

Marc Rousavy | 68a716b506
feat: native Frame type to provide Orientation (#186)
* Use Frame.h
* Add orientation
* Determine buffer orientation
* Replace plugins
* fix calls
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* format
* Update CameraPage.tsx
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* Add links to docs
* Use `.` syntax
* Make properties `readonly`
* Fix `@synthesize` backing store
2021-06-09 10:57:05 +02:00
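With this change, a native frame processor plugin no longer receives a bare pixel buffer but a `Frame` object that also carries the buffer's orientation. On the JS side such a plugin is typically exposed as a global worklet function and wrapped in a small helper so it can be called from `useFrameProcessor`. A minimal TypeScript sketch, assuming a hypothetical plugin exposed as `__examplePlugin` (the name, return type, and the `Frame` type import are illustrative):

```ts
import type { Frame } from 'react-native-vision-camera';

// Hypothetical global installed by a native frame processor plugin.
// Natively, the plugin receives the full Frame (pixel buffer + orientation).
declare const __examplePlugin: (frame: Frame) => string[];

// Worklet wrapper so the plugin can be called from within useFrameProcessor.
export function examplePlugin(frame: Frame): string[] {
  'worklet';
  return __examplePlugin(frame);
}
```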
Marc Rousavy | 4038db2e28
feat: Frame Processors: Allow returning `Frame`s (support for resize and other frame manipulations) (#185)
* batch
* Init Frame as box
* Use ObjC syntax
* Fix access
* Revert "Fix access"
This reverts commit 7de09e52739d4c2b53f485d5ed696f1665fa5737.
* Revert "Use ObjC syntax"
This reverts commit e33f05ae8451cc4ee24af41d14dc76a57c157554.
* Revert "Init Frame as box"
This reverts commit 5adafb6109bfbf7fddb8ddc4af7d306b7b76b476.
* use holder
* convert buffer <-> jsi object
* add docs
* add more docs
* Update JSIUtils.mm
* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx
* Update CameraView+RecordVideo.swift
2021-06-08 14:20:07 +02:00
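Since plugins may now return `Frame`s, one plugin's output (for example a resized frame) can be fed straight into another inside the same worklet. A rough TypeScript sketch; `__resize` and `__detectObjects` are hypothetical placeholder plugins, not part of VisionCamera itself:

```ts
import { useFrameProcessor } from 'react-native-vision-camera';
import type { Frame } from 'react-native-vision-camera';

// Hypothetical native plugins: one returns a new, smaller Frame,
// the other consumes it.
declare const __resize: (frame: Frame, width: number, height: number) => Frame;
declare const __detectObjects: (frame: Frame) => string[];

export function useResizingFrameProcessor() {
  return useFrameProcessor((frame) => {
    'worklet';
    // Downscale first so the detector has fewer pixels to process.
    const resized = __resize(frame, 640, 360);
    const objects = __detectObjects(resized);
    console.log(`Detected ${objects.length} objects`);
  }, []);
}
```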
Marc Rousavy | eeb765f018
fix: Move Audio Input initialization shortly before startRecording (#159)
* rename
* Update AVAudioSession+updateCategory.swift
* fix bootstrap script
* Update CameraView+AVAudioSession.swift
* move audio input adding lower
* Activate AudioSession only when starting recording
* format
* Deactivate Audio Session
* remove audio input before deactivating audio session
* Update CameraView+AVAudioSession.swift
* log time
* Update CameraView+AVAudioSession.swift
* measure time with `measureElapsedTime`
* Update project.pbxproj
* only log in debug builds
* bootstrap with bridge (RNN new API)
* Mark two funcs as `@inlinable`
* format
* Update ReactLogger.swift
* Make audioWriter optional (allow videos without sound)
* only log frame drop reason in DEBUG
* Make audio writing entirely optional
* format
* Use function name as label for measureElapsedTime
* Update MeasureElapsedTime.swift
* Update MeasureElapsedTime.swift
* Mark AudioWriter as finished
* set `automaticallyConfiguresApplicationAudioSession` once
* Add JS console logging
* log to JS console for a few logs
* Update AVAudioSession+updateCategory.swift
* format
* Update JSConsoleHelper.mm
* catch log errors
* Update ReactLogger.swift
* fix docs
* Update RecordingSession.swift
* Immediately add audio input
* Update CameraView+AVCaptureSession.swift
* Update CameraView+AVCaptureSession.swift
* Update ReactLogger.swift
* immediately set audio session
* extract
* format
* Update TROUBLESHOOTING.mdx
* hmm
* Update AVAudioSession+updateCategory.swift
* Create secondary `AVCaptureSession` for audio
* Configure once, start stop on demand
* format
* fix audio notification interruptions
* docs
2021-06-03 14:16:02 +02:00
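The point of this change is recording latency: the audio session is configured once up front (in a secondary capture session) and only activated when a recording actually starts, so `startRecording` no longer pays the audio setup cost. The JS recording API itself is unchanged; a minimal usage sketch (error handling and file handling are illustrative):

```ts
import { useRef } from 'react';
import { Camera } from 'react-native-vision-camera';

export function useRecording() {
  const camera = useRef<Camera>(null);

  const start = () => {
    // The audio input/session is prepared natively ahead of time,
    // so this call starts recording with little extra delay.
    camera.current?.startRecording({
      onRecordingFinished: (video) => console.log('Recorded to', video.path),
      onRecordingError: (error) => console.error('Recording failed', error),
    });
  };

  const stop = () => camera.current?.stopRecording();

  return { camera, start, stop };
}
```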
Marc Rousavy | b6a67d5ced
feature: Frame Processors (iOS) (#2)
* Clean up Frame Processor
* Create FrameProcessorHolder
* Create FrameProcessorDelegate in ObjC++
* Move frame processor to FrameProcessorDelegate
* Decorate runtime, check for null
* Update FrameProcessorDelegate.mm
* Cleanup FrameProcessorBindings.mm
* Fix RuntimeDecorator.h import
* Update FrameProcessorDelegate.mm
* "React" -> "React Helper" to avoid confusion
* Rename folders again
* Fix podspec flattening a lot of headers, causing REA name clash
* Fix header imports to avoid REA naming collision
* Lazily initialize jsi::Runtime on DispatchQueue
* Install frame processor bindings from Swift
* First try to call jsi::Function (frame processor) 👀
* Call viewForReactTag on RCT main thread
* Fix bridge accessing
* Add more logs
* Update CameraViewManager.swift
* Add more TODOs
* Re-indent .cpp files
* Fix RCTTurboModule import podspec
* Remove unnecessary include check for swift umbrella header
* Merge branch 'main' into frame-processors
* Docs: use static width for images (283)
* Create validate-cpp.yml
* Update a lot of packages to latest
* Set SWIFT_VERSION to 5.2 in podspec
* Create clean.sh
* Delete unused C++ files
* podspec: Remove CLANG_CXX_LANGUAGE_STANDARD and OTHER_CFLAGS
* Update pod lockfiles
* Regenerate lockfiles
* Remove IOSLogger
* Use NSLog
* Create FrameProcessorManager (inherits from REA RuntimeManager)
* Create reanimated::RuntimeManager shared_ptr
* Re-integrate pods
* Add react-native-reanimated >=2 peerDependency
* Add metro-config
* blacklist -> exclusionList
* Try to call worklet
* Fix jsi::Value* initializer
* Call ShareableValue::adapt (makeShareable) with React/JS Runtime
* Add null-checks
* Lift runtime manager creation out of delegate, into bindings
* Remove debug statement
* Make RuntimeManager unique_ptr
* Set _FRAME_PROCESSOR
* Extract convertJSIFunctionToFrameProcessorCallback
* Print frame
* Merge branch 'main' into frame-processors
* Reformat Swift code
* Install reanimated from npm again
* Re-integrate Pods
* Dependabot: Also scan example/ and docs/
* Update validate-cpp.yml
* Create FrameProcessorUtils
* Create Frame.h
* Abstract HostObject creation away
* Fix types
* Fix frame processor call
* Add todo
* Update lockfiles
* Add C++ contributing instructions
* Update CONTRIBUTING.md
* Add android/src/main/cpp to cpplint
* Update cpplint.sh
* Fix a few cpplint errors
* Fix globals
* Fix a few more cpplint errors
* Update App.tsx
* Update AndroidLogger.cpp
* Format
* Fix cpplint script (check-cpp)
* Try to simplify frame processor
* y
* Update FrameProcessorUtils.mm
* Update FrameProcessorBindings.mm
* Update CameraView.swift
* Update CameraViewManager.m
* Restructure everything
* fix
* Fix `@objc` export (make public)
* Refactor installFrameProcessorBindings into FrameProcessorRuntimeManager
* Add swift RCTBridge.runOnJS helper
* Fix run(onJS)
* Add pragma once
* Add `&self` to lambda
* Update FrameProcessorRuntimeManager.mm
* reorder imports
* Fix imports
* forward declare
* Rename extension
* Destroy buffer after execution
* Add FrameProcessorPluginRegistry base
* Merge branch 'main' into frame-processors
* Add frameProcessor to types
* Update Camera.tsx
* Fix rebase merge
* Remove movieOutput
* Use `useFrameProcessor`
* Fix bad merge
* Add additional ESLint rules
* Update lockfiles
* Update CameraViewManager.m
* Add support for V8 runtime
* Add frame processor plugins API
* Print plugin invoke
* Fix React Utils in podspec
* Fix runOnJS swift name
* Remove invalid redecl of `captureSession`
* Use REA 2.1.0 which includes all my big PRs 🎉
* Update validate-cpp.yml
* Update Podfile.lock
* Remove Flipper
* Fix dereferencing
* Capture `self` by value. Fucking hell, what a dumb mistake.
* Override a few HostObject functions
* Expose isReady, width, height, bytesPerRow and planesCount
* use hook again
* Expose property names
* FrameProcessor -> Frame
* Update CameraView+RecordVideo.swift
* Add Swift support for Frame Processors Plugins
* Add macros for plugin installation
* Add ObjC frame processor plugin
* Correctly install frame processor plugins
* Don't require custom name for macro
* Check if plugin already exists
* Implement QR Code Frame Processor Plugin in Swift
* Adjust ObjC style frame processor macro
* optimize
* Add `frameProcessorFrameDropRate`
* Fix types
* Only log once
* Log if it executes slowly
* Implement `frameProcessorFps`
* Implement manual encoded video recordings
* Use recommended video settings
* Add fileType types
* Ignore if input is not ready for media data
* Add completion handler
* Add audio buffer sampling
* Init only for video frame
* use AVAssetWriterInputPixelBufferAdaptor
* Remove AVAssetWriterInputPixelBufferAdaptor
* Rotate VideoWriter
* Always assume portrait orientation
* Update RecordingSession.swift
* Use a separate Queue for Audio
* Format Swift
* Update CameraView+RecordVideo.swift
* Use `videoQueue` instead of `cameraQueue`
* Move example plugins to example app
* Fix hardcoded name in plugin macro
* QRFrame... -> QRCodeFrame...
* Update FrameProcessorPlugin.h
* Add example frame processors to JS base
* Update QRCodeFrameProcessorPluginSwift.m
* Add docs to create FP Plugins
* Update FRAME_PROCESSORS_CREATE.mdx
* Update FRAME_PROCESSORS_CREATE.mdx
* Use `AVAssetWriterInputPixelBufferAdaptor` for efficient pixel buffer recycling
* Add customizable `pixelFormat`
* Use native format if available
* Update project.pbxproj
* Set video width and height as source-pixel-buffer attributes
* Catch
* Update App.tsx
* Don't explicitly set video dimensions, let CVPixelBufferPool handle it
* Add a few logs
* Cleanup
* Update CameraView+RecordVideo.swift
* Eagerly initialize asset writer to fix stutter at first frame
* Use `cameraQueue` DispatchQueue to not block CaptureDataOutputDelegate
* Fix duration calculation
* cleanup
* Cleanup
* Swiftformat
* Return available video codecs
* Only show frame drop notification for video output
* Remove photo and video codec functionality
It was too much complexity and probably never used anyways.
* Revert all android related changes for now
* Cleanup
* Remove unused header
* Update AVAssetWriter.Status+descriptor.swift
* Only call Frame Processor for Video Frames
* Fix `if`
* Add support for Frame Processor plugin parameters/arguments
* Fix arg support
* Move to JSIUtils.mm
* Update JSIUtils.h
* Update FRAME_PROCESSORS_CREATE.mdx
* Update FRAME_PROCESSORS_CREATE.mdx
* Upgrade packages for docs/
* fix docs
* Rename
* highlight lines
* docs
* community plugins
* Update FRAME_PROCESSOR_CREATE_FINAL.mdx
* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx
* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx
* Update dependencies (1/2)
* Update dependencies (2/2)
* Update Gemfile.lock
* add FP docs
* Update README.md
* Make `lastFrameProcessor` private
* add `frameProcessor` docs
* fix docs
* adjust docs
* Update DEVICES.mdx
* fix
* s
* Add logs demo
* add metro restart note
* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx
* Mirror video device
* Update AVCaptureVideoDataOutput+mirror.swift
* Create .swift-version
* Enable whole module optimization
* Fix recording mirrored video
* Swift format
* Clean dictionary on `markInvalid`
* Fix cleanup
* Add docs for disabling frame processors
* Update project.pbxproj
* Revert "Update project.pbxproj"
This reverts commit e67861e51b88b4888a6940e2d20388f3044211d0.
* Log frame drop reason
* Format
* add more samples
* Add clang-format
* also check .mm
* Revert "also check .mm"
This reverts commit 8b9d5e2c29866b05909530d104f6633d6c49eadd.
* Revert "Add clang-format"
This reverts commit 7643ac808e0fc34567ea1f814e73d84955381636.
* Use `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange` as default
* Read matching video attributes from videoSettings
* Add TODO
* Swiftformat
* Conditionally disable frame processors
* Assert if trying to use frame processors when disabled
* Add frame-processors demo gif
* Allow disabling frame processors via `VISION_CAMERA_DISABLE_FRAME_PROCESSORS`
* Update FrameProcessorRuntimeManager.mm
* Update FRAME_PROCESSORS.mdx
* Update project.pbxproj
* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx
2021-05-06 14:11:55 +02:00
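The user-facing surface of this PR is the `useFrameProcessor` hook together with the `frameProcessor` and `frameProcessorFps` props on `<Camera>`. A condensed TypeScript sketch that roughly mirrors the documented usage (the logged frame properties are among those exposed on `Frame` in this PR):

```tsx
import * as React from 'react';
import { Camera, useCameraDevices, useFrameProcessor } from 'react-native-vision-camera';

export function ScannerPage() {
  const devices = useCameraDevices();
  const device = devices.back;

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    // Frame exposes width, height, bytesPerRow and planesCount.
    console.log(`New frame: ${frame.width}x${frame.height}`);
  }, []);

  if (device == null) return null;
  return (
    <Camera
      style={{ flex: 1 }}
      device={device}
      isActive={true}
      frameProcessor={frameProcessor}
      // Run the frame processor at most once per second.
      frameProcessorFps={1}
    />
  );
}
```

Per the bullets above, frame processors can also be compiled out entirely via `VISION_CAMERA_DISABLE_FRAME_PROCESSORS`, in which case passing a `frameProcessor` triggers an assertion.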
Marc Rousavy | 501827cb87
Rename pod to VisionCamera
2021-03-26 16:22:24 +01:00