feature: Frame Processors (iOS) (#2)

* Clean up Frame Processor

* Create FrameProcessorHolder

* Create FrameProcessorDelegate in ObjC++

* Move frame processor to FrameProcessorDelegate

* Decorate runtime, check for null

* Update FrameProcessorDelegate.mm

* Cleanup FrameProcessorBindings.mm

* Fix RuntimeDecorator.h import

* Update FrameProcessorDelegate.mm

* "React" -> "React Helper" to avoid confusion

* Rename folders again

* Fix podspec flattening a lot of headers, causing REA nameclash

* Fix header imports to avoid REA naming collision

* Lazily initialize jsi::Runtime on DispatchQueue

* Install frame processor bindings from Swift

* First try to call jsi::Function (frame processor) 👀

* Call viewForReactTag on RCT main thread

* Fix bridge accessing

* Add more logs

* Update CameraViewManager.swift

* Add more TODOs

* Re-indent .cpp files

* Fix RCTTurboModule import podspec

* Remove unnecessary include check for swift umbrella header

* Merge branch 'main' into frame-processors

* Docs: use static width for images (283)

* Create validate-cpp.yml

* Update a lot of packages to latest

* Set SWIFT_VERSION to 5.2 in podspec

* Create clean.sh

* Delete unused C++ files

* podspec: Remove CLANG_CXX_LANGUAGE_STANDARD and OTHER_CFLAGS

* Update pod lockfiles

* Regenerate lockfiles

* Remove IOSLogger

* Use NSLog

* Create FrameProcessorManager (inherits from REA RuntimeManager)

* Create reanimated::RuntimeManager shared_ptr

* Re-integrate pods

* Add react-native-reanimated >=2 peerDependency

* Add metro-config

* blacklist -> exclusionList

* Try to call worklet

* Fix jsi::Value* initializer

* Call ShareableValue::adapt (makeShareable) with React/JS Runtime

* Add null-checks

* Lift runtime manager creation out of delegate, into bindings

* Remove debug statement

* Make RuntimeManager unique_ptr

* Set _FRAME_PROCESSOR

* Extract convertJSIFunctionToFrameProcessorCallback

* Print frame

* Merge branch 'main' into frame-processors

* Reformat Swift code

* Install reanimated from npm again

* Re-integrate Pods

* Dependabot: Also scan example/ and docs/

* Update validate-cpp.yml

* Create FrameProcessorUtils

* Create Frame.h

* Abstract HostObject creation away

* Fix types

* Fix frame processor call

* Add todo

* Update lockfiles

* Add C++ contributing instructions

* Update CONTRIBUTING.md

* Add android/src/main/cpp to cpplint

* Update cpplint.sh

* Fix a few cpplint errors

* Fix globals

* Fix a few more cpplint errors

* Update App.tsx

* Update AndroidLogger.cpp

* Format

* Fix cpplint script (check-cpp)

* Try to simplify frame processor

* y

* Update FrameProcessorUtils.mm

* Update FrameProcessorBindings.mm

* Update CameraView.swift

* Update CameraViewManager.m

* Restructure everything

* fix

* Fix `@objc` export (make public)

* Refactor installFrameProcessorBindings into FrameProcessorRuntimeManager

* Add swift RCTBridge.runOnJS helper

* Fix run(onJS)

* Add pragma once

* Add `&self` to lambda

* Update FrameProcessorRuntimeManager.mm

* reorder imports

* Fix imports

* forward declare

* Rename extension

* Destroy buffer after execution

* Add FrameProcessorPluginRegistry base

* Merge branch 'main' into frame-processors

* Add frameProcessor to types

* Update Camera.tsx

* Fix rebase merge

* Remove movieOutput

* Use `useFrameProcessor`

* Fix bad merge

* Add additional ESLint rules

* Update lockfiles

* Update CameraViewManager.m

* Add support for V8 runtime

* Add frame processor plugins API

* Print plugin invoke

* Fix React Utils in podspec

* Fix runOnJS swift name

* Remove invalid redecl of `captureSession`

* Use REA 2.1.0 which includes all my big PRs 🎉

* Update validate-cpp.yml

* Update Podfile.lock

* Remove Flipper

* Fix dereferencing

* Capture `self` by value. Fucking hell, what a dumb mistake.

* Override a few HostObject functions

* Expose isReady, width, height, bytesPerRow and planesCount

* use hook again

* Expose property names

* FrameProcessor -> Frame

* Update CameraView+RecordVideo.swift

* Add Swift support for Frame Processors Plugins

* Add macros for plugin installation

* Add ObjC frame processor plugin

* Correctly install frame processor plugins

* Don't require custom name for macro

* Check if plugin already exists

* Implement QR Code Frame Processor Plugin in Swift

* Adjust ObjC style frame processor macro

* optimize

* Add `frameProcessorFrameDropRate`

* Fix types

* Only log once

* Log if it executes slowly

* Implement `frameProcessorFps`

* Implement manual encoded video recordings

* Use recommended video settings

* Add fileType types

* Ignore if input is not ready for media data

* Add completion handler

* Add audio buffer sampling

* Init only for video frame

* use AVAssetWriterInputPixelBufferAdaptor

* Remove AVAssetWriterInputPixelBufferAdaptor

* Rotate VideoWriter

* Always assume portrait orientation

* Update RecordingSession.swift

* Use a separate Queue for Audio

* Format Swift

* Update CameraView+RecordVideo.swift

* Use `videoQueue` instead of `cameraQueue`

* Move example plugins to example app

* Fix hardcoded name in plugin macro

* QRFrame... -> QRCodeFrame...

* Update FrameProcessorPlugin.h

* Add example frame processors to JS base

* Update QRCodeFrameProcessorPluginSwift.m

* Add docs to create FP Plugins

* Update FRAME_PROCESSORS_CREATE.mdx

* Update FRAME_PROCESSORS_CREATE.mdx

* Use `AVAssetWriterInputPixelBufferAdaptor` for efficient pixel buffer recycling

* Add customizable `pixelFormat`

* Use native format if available

* Update project.pbxproj

* Set video width and height as source-pixel-buffer attributes

* Catch

* Update App.tsx

* Don't explicitly set video dimensions, let CVPixelBufferPool handle it

* Add a few logs

* Cleanup

* Update CameraView+RecordVideo.swift

* Eagerly initialize asset writer to fix stutter at first frame

* Use `cameraQueue` DispatchQueue to not block CaptureDataOutputDelegate

* Fix duration calculation

* cleanup

* Cleanup

* Swiftformat

* Return available video codecs

* Only show frame drop notification for video output

* Remove photo and video codec functionality

It was too much complexity and probably never used anyways.

* Revert all android related changes for now

* Cleanup

* Remove unused header

* Update AVAssetWriter.Status+descriptor.swift

* Only call Frame Processor for Video Frames

* Fix `if`

* Add support for Frame Processor plugin parameters/arguments

* Fix arg support

* Move to JSIUtils.mm

* Update JSIUtils.h

* Update FRAME_PROCESSORS_CREATE.mdx

* Update FRAME_PROCESSORS_CREATE.mdx

* Upgrade packages for docs/

* fix docs

* Rename

* highlight lines

* docs

* community plugins

* Update FRAME_PROCESSOR_CREATE_FINAL.mdx

* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx

* Update FRAME_PROCESSOR_PLUGIN_LIST.mdx

* Update dependencies (1/2)

* Update dependencies (2/2)

* Update Gemfile.lock

* add FP docs

* Update README.md

* Make `lastFrameProcessor` private

* add `frameProcessor` docs

* fix docs

* adjust docs

* Update DEVICES.mdx

* fix

* s

* Add logs demo

* add metro restart note

* Update FRAME_PROCESSOR_CREATE_PLUGIN_IOS.mdx

* Mirror video device

* Update AVCaptureVideoDataOutput+mirror.swift

* Create .swift-version

* Enable whole module optimization

* Fix recording mirrored video

* Swift format

* Clean dictionary on `markInvalid`

* Fix cleanup

* Add docs for disabling frame processors

* Update project.pbxproj

* Revert "Update project.pbxproj"

This reverts commit e67861e51b88b4888a6940e2d20388f3044211d0.

* Log frame drop reason

* Format

* add more samples

* Add clang-format

* also check .mm

* Revert "also check .mm"

This reverts commit 8b9d5e2c29866b05909530d104f6633d6c49eadd.

* Revert "Add clang-format"

This reverts commit 7643ac808e0fc34567ea1f814e73d84955381636.

* Use `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange` as default

* Read matching video attributes from videoSettings

* Add TODO

* Swiftformat

* Conditionally disable frame processors

* Assert if trying to use frame processors when disabled

* Add frame-processors demo gif

* Allow disabling frame processors via `VISION_CAMERA_DISABLE_FRAME_PROCESSORS`

* Update FrameProcessorRuntimeManager.mm

* Update FRAME_PROCESSORS.mdx

* Update project.pbxproj

* Update FRAME_PROCESSORS_CREATE_OVERVIEW.mdx
Marc Rousavy
2021-05-06 14:11:55 +02:00
committed by GitHub
parent 77b3d78566
commit b6a67d5ced
100 changed files with 4703 additions and 3664 deletions


@@ -9,9 +9,20 @@
#pragma once
#import <Foundation/Foundation.h>
#import <React/RCTViewManager.h>
#import <React/RCTUIManager.h>
#import "FrameProcessorCallback.h"
#import "FrameProcessorRuntimeManager.h"
#import "RCTBridge+runOnJS.h"
#ifdef VISION_CAMERA_DISABLE_FRAME_PROCESSORS
static bool enableFrameProcessors = false;
#else
static bool enableFrameProcessors = true;
#endif
@interface CameraBridge: RCTViewManager
@end


@@ -179,6 +179,7 @@ enum CaptureError {
case noRecordingInProgress
case fileError
case createTempFileError
case createRecorderError(message: String? = nil)
case invalidPhotoCodec
case unknown(message: String? = nil)
@@ -194,6 +195,8 @@ enum CaptureError {
return "file-io-error"
case .createTempFileError:
return "create-temp-file-error"
case .createRecorderError:
return "create-recorder-error"
case .invalidPhotoCodec:
return "invalid-photo-codec"
case .unknown:
@@ -215,6 +218,8 @@ enum CaptureError {
return "An unexpected File IO error occured!"
case .createTempFileError:
return "Failed to create a temporary file!"
case let .createRecorderError(message: message):
return "Failed to create the AVAssetWriter (Recorder)! \(message ?? "(no additional message)")"
case let .unknown(message: message):
return message ?? "An unknown error occured while capturing a video/photo."
}

ios/CameraQueues.swift (new file, 33 lines)

@@ -0,0 +1,33 @@
//
// CameraQueues.swift
// VisionCamera
//
// Created by Marc Rousavy on 22.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
import Foundation
@objc
public class CameraQueues: NSObject {
/// The serial execution queue for the camera preview layer (input stream) as well as output processing of photos.
@objc public static let cameraQueue = DispatchQueue(label: "com.mrousavy.vision.camera-queue",
qos: .userInteractive,
attributes: [],
autoreleaseFrequency: .inherit,
target: nil)
/// The serial execution queue for output processing of videos as well as frame processors.
@objc public static let videoQueue = DispatchQueue(label: "com.mrousavy.vision.video-queue",
qos: .userInteractive,
attributes: [],
autoreleaseFrequency: .inherit,
target: nil)
// TODO: Is it a good idea to use a separate queue for audio output processing?
/// The serial execution queue for output processing of audio buffers.
@objc public static let audioQueue = DispatchQueue(label: "com.mrousavy.vision.audio-queue",
qos: .userInteractive,
attributes: [],
autoreleaseFrequency: .inherit,
target: nil)
}
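The three queues above are consumed by `CameraView` when it wires up its capture outputs (see the `CameraView+RecordVideo` diffs below). A minimal sketch of that wiring, assuming an already-configured session and a delegate conforming to both sample-buffer delegate protocols (in this PR, `CameraView` itself is that delegate):

```swift
import AVFoundation

// Sketch: attach data outputs to the shared queues so video frames and audio
// buffers are delivered off the main thread. `delegate` is an assumption here.
func attachOutputs(to session: AVCaptureSession,
                   delegate: AVCaptureVideoDataOutputSampleBufferDelegate & AVCaptureAudioDataOutputSampleBufferDelegate) {
  let videoOutput = AVCaptureVideoDataOutput()
  videoOutput.setSampleBufferDelegate(delegate, queue: CameraQueues.videoQueue)
  videoOutput.alwaysDiscardsLateVideoFrames = true
  if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }

  let audioOutput = AVCaptureAudioDataOutput()
  audioOutput.setSampleBufferDelegate(delegate, queue: CameraQueues.audioQueue)
  if session.canAddOutput(audioOutput) { session.addOutput(audioOutput) }
}
```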


@@ -98,19 +98,36 @@ extension CameraView {
photoOutput!.mirror()
}
// Video Output
if let movieOutput = self.movieOutput {
captureSession.removeOutput(movieOutput)
// Video Output + Frame Processor
if let videoOutput = self.videoOutput {
captureSession.removeOutput(videoOutput)
self.videoOutput = nil
}
movieOutput = AVCaptureMovieFileOutput()
guard captureSession.canAddOutput(movieOutput!) else {
return invokeOnError(.parameter(.unsupportedOutput(outputDescriptor: "movie-output")))
ReactLogger.log(level: .info, message: "Adding Video Data output...")
videoOutput = AVCaptureVideoDataOutput()
guard captureSession.canAddOutput(videoOutput!) else {
return invokeOnError(.parameter(.unsupportedOutput(outputDescriptor: "video-output")))
}
captureSession.addOutput(movieOutput!)
videoOutput!.setSampleBufferDelegate(self, queue: videoQueue)
videoOutput!.alwaysDiscardsLateVideoFrames = true
captureSession.addOutput(videoOutput!)
if videoDeviceInput!.device.position == .front {
movieOutput!.mirror()
videoOutput!.mirror()
}
// Audio Output
if let audioOutput = self.audioOutput {
captureSession.removeOutput(audioOutput)
self.audioOutput = nil
}
ReactLogger.log(level: .info, message: "Adding Audio Data output...")
audioOutput = AVCaptureAudioDataOutput()
guard captureSession.canAddOutput(audioOutput!) else {
return invokeOnError(.parameter(.unsupportedOutput(outputDescriptor: "audio-output")))
}
audioOutput!.setSampleBufferDelegate(self, queue: audioQueue)
captureSession.addOutput(audioOutput!)
invokeOnInitialized()
isReady = true
ReactLogger.log(level: .info, message: "Session successfully configured!")


@@ -8,48 +8,160 @@
import AVFoundation
extension CameraView {
private var hasLoggedFrameDropWarning = false
// MARK: - CameraView + AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate
extension CameraView: AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {
func startRecording(options: NSDictionary, callback: @escaping RCTResponseSenderBlock) {
queue.async {
guard let movieOutput = self.movieOutput else {
return callback([NSNull(), makeReactError(.session(.cameraNotReady))])
}
if movieOutput.isRecording {
return callback([NSNull(), makeReactError(.capture(.recordingInProgress))])
}
cameraQueue.async {
ReactLogger.log(level: .info, message: "Starting Video recording...")
do {
let errorPointer = ErrorPointer(nilLiteral: ())
guard let tempFilePath = RCTTempFilePath("mov", errorPointer) else {
return callback([NSNull(), makeReactError(.capture(.createTempFileError), cause: errorPointer?.pointee)])
}
let errorPointer = ErrorPointer(nilLiteral: ())
guard let tempFilePath = RCTTempFilePath("mov", errorPointer) else {
return callback([NSNull(), makeReactError(.capture(.createTempFileError), cause: errorPointer?.pointee)])
}
let tempURL = URL(string: "file://\(tempFilePath)")!
if let flashMode = options["flash"] as? String {
// use the torch as the video's flash
self.setTorchMode(flashMode)
}
let tempURL = URL(string: "file://\(tempFilePath)")!
if let flashMode = options["flash"] as? String {
// use the torch as the video's flash
self.setTorchMode(flashMode)
}
movieOutput.startRecording(to: tempURL, recordingDelegate: RecordingDelegateWithCallback(callback: callback, resetTorchMode: {
// reset torch in case it was used as the video's "flash"
self.setTorchMode(self.torch)
}))
// TODO: The startRecording() func cannot be async because RN doesn't allow both a callback and a Promise in a single function. Wait for TurboModules?
// return ["path": tempFilePath]
var fileType = AVFileType.mov
if let fileTypeOption = options["fileType"] as? String {
fileType = AVFileType(withString: fileTypeOption)
}
// TODO: The startRecording() func cannot be async because RN doesn't allow
// both a callback and a Promise in a single function. Wait for TurboModules?
// This means that any errors that occur in this function have to be delegated through
// the callback, but I'd prefer for them to throw for the original function instead.
let onFinish = { (status: AVAssetWriter.Status, error: Error?) -> Void in
defer {
self.recordingSession = nil
}
ReactLogger.log(level: .info, message: "RecordingSession finished with status \(status.descriptor).")
if let error = error {
let description = (error as NSError).description
return callback([NSNull(), CameraError.capture(.unknown(message: "An unknown recording error occured! \(description)"))])
} else {
if status == .completed {
return callback([[
"path": self.recordingSession!.url.absoluteString,
"duration": self.recordingSession!.duration,
], NSNull()])
} else {
return callback([NSNull(), CameraError.unknown(message: "AVAssetWriter completed with status: \(status.descriptor)")])
}
}
}
let videoSettings = self.videoOutput!.recommendedVideoSettingsForAssetWriter(writingTo: fileType)
let audioSettings = self.audioOutput!.recommendedAudioSettingsForAssetWriter(writingTo: fileType) as? [String: Any]
self.recordingSession = try RecordingSession(url: tempURL,
fileType: fileType,
videoSettings: videoSettings ?? [:],
audioSettings: audioSettings ?? [:],
isVideoMirrored: self.videoOutput!.isMirrored,
completion: onFinish)
self.isRecording = true
} catch EnumParserError.invalidValue {
return callback([NSNull(), EnumParserError.invalidValue])
} catch let error as NSError {
return callback([NSNull(), makeReactError(.capture(.createTempFileError), cause: error)])
}
}
}
func stopRecording(promise: Promise) {
queue.async {
withPromise(promise) {
guard let movieOutput = self.movieOutput else {
throw CameraError.session(SessionError.cameraNotReady)
}
if !movieOutput.isRecording {
throw CameraError.capture(CaptureError.noRecordingInProgress)
}
isRecording = false
movieOutput.stopRecording()
cameraQueue.async {
withPromise(promise) {
guard let recordingSession = self.recordingSession else {
throw CameraError.capture(.noRecordingInProgress)
}
recordingSession.finish()
return nil
}
}
}
// TODO: Implement for JS
func pauseRecording(promise: Promise) {
cameraQueue.async {
withPromise(promise) {
if self.isRecording {
self.isRecording = false
return nil
} else {
throw CameraError.capture(.noRecordingInProgress)
}
}
}
}
// TODO: Implement for JS
func resumeRecording(promise: Promise) {
cameraQueue.async {
withPromise(promise) {
if !self.isRecording {
self.isRecording = true
return nil
} else {
throw CameraError.capture(.noRecordingInProgress)
}
}
}
}
public final func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from _: AVCaptureConnection) {
if isRecording {
guard let recordingSession = recordingSession else {
return invokeOnError(.capture(.unknown(message: "isRecording was true but the RecordingSession was null!")))
}
switch captureOutput {
case is AVCaptureVideoDataOutput:
recordingSession.appendBuffer(sampleBuffer, type: .video)
case is AVCaptureAudioDataOutput:
recordingSession.appendBuffer(sampleBuffer, type: .audio)
default:
break
}
}
if let frameProcessor = frameProcessorCallback, captureOutput is AVCaptureVideoDataOutput {
// check if last frame was x nanoseconds ago, effectively throttling FPS
let diff = DispatchTime.now().uptimeNanoseconds - lastFrameProcessorCall.uptimeNanoseconds
let secondsPerFrame = 1.0 / frameProcessorFps.doubleValue
let nanosecondsPerFrame = secondsPerFrame * 1_000_000_000.0
if diff > UInt64(nanosecondsPerFrame) {
frameProcessor(sampleBuffer)
lastFrameProcessorCall = DispatchTime.now()
}
}
}
public final func captureOutput(_ captureOutput: AVCaptureOutput, didDrop buffer: CMSampleBuffer, from _: AVCaptureConnection) {
if frameProcessorCallback != nil && !hasLoggedFrameDropWarning && captureOutput is AVCaptureVideoDataOutput {
let reason = findFrameDropReason(inBuffer: buffer)
// TODO: Show in React console?
ReactLogger.log(level: .warning, message: "Dropped a Frame. This might indicate that your Frame Processor is doing too much work. " +
"Either throttle the frame processor's frame rate, or optimize your frame processor's execution speed. Frame drop reason: \(reason)")
hasLoggedFrameDropWarning = true
}
}
private final func findFrameDropReason(inBuffer buffer: CMSampleBuffer) -> String {
var mode: CMAttachmentMode = 0
guard let reason = CMGetAttachment(buffer,
key: kCMSampleBufferAttachmentKey_DroppedFrameReason,
attachmentModeOut: &mode) else {
return "unknown"
}
return String(describing: reason)
}
}
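For intuition on the FPS throttle in `captureOutput` above, here is the arithmetic worked through with a concrete value (the number is illustrative):

```swift
// Worked example of the frame processor throttle, assuming frameProcessorFps = 3.
let frameProcessorFps = 3.0
let secondsPerFrame = 1.0 / frameProcessorFps                 // ≈ 0.333 s
let nanosecondsPerFrame = secondsPerFrame * 1_000_000_000.0   // ≈ 333_333_333 ns
// A buffer is only forwarded to the frame processor if more than
// nanosecondsPerFrame elapsed since the last call; frames in between are skipped.
```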


@@ -24,7 +24,7 @@ struct TakePhotoOptions {
extension CameraView {
func takePhoto(options: NSDictionary, promise: Promise) {
queue.async {
cameraQueue.async {
guard let photoOutput = self.photoOutput, let videoDeviceInput = self.videoDeviceInput else {
return promise.reject(error: .session(.cameraNotReady))
}


@@ -35,6 +35,7 @@ private let propsThatRequireDeviceReconfiguration = ["fps",
public final class CameraView: UIView {
// pragma MARK: React Properties
// pragma MARK: Exported Properties
// props that require reconfiguring
@objc var cameraId: NSString?
@objc var enableDepthData = false
@@ -44,6 +45,7 @@ public final class CameraView: UIView {
// props that require format reconfiguring
@objc var format: NSDictionary?
@objc var fps: NSNumber?
@objc var frameProcessorFps: NSNumber = 1.0
@objc var hdr: NSNumber? // nullable bool
@objc var lowLightBoost: NSNumber? // nullable bool
@objc var colorSpace: NSString?
@@ -75,18 +77,24 @@ public final class CameraView: UIView {
// Inputs
internal var videoDeviceInput: AVCaptureDeviceInput?
internal var audioDeviceInput: AVCaptureDeviceInput?
// Outputs
internal var photoOutput: AVCapturePhotoOutput?
internal var movieOutput: AVCaptureMovieFileOutput?
internal var videoOutput: AVCaptureVideoDataOutput?
internal var audioOutput: AVCaptureAudioDataOutput?
// CameraView+RecordView (+ FrameProcessorDelegate.mm)
internal var isRecording = false
internal var recordingSession: RecordingSession?
@objc public var frameProcessorCallback: FrameProcessorCallback?
internal var lastFrameProcessorCall = DispatchTime.now()
// CameraView+TakePhoto
internal var photoCaptureDelegates: [PhotoCaptureDelegate] = []
// CameraView+RecordVideo
internal var recordingDelegateResolver: RCTPromiseResolveBlock?
internal var recordingDelegateRejecter: RCTPromiseRejectBlock?
// CameraView+Zoom
internal var pinchGestureRecognizer: UIPinchGestureRecognizer?
internal var pinchScaleOffset: CGFloat = 1.0
internal let cameraQueue = CameraQueues.cameraQueue
internal let videoQueue = CameraQueues.videoQueue
internal let audioQueue = CameraQueues.audioQueue
var isRunning: Bool {
return captureSession.isRunning
}
@@ -126,6 +134,11 @@ public final class CameraView: UIView {
object: AVAudioSession.sharedInstance)
}
@available(*, unavailable)
required init?(coder _: NSCoder) {
fatalError("init(coder:) is not implemented.")
}
deinit {
NotificationCenter.default.removeObserver(self,
name: .AVCaptureSessionRuntimeError,
@@ -141,11 +154,6 @@ public final class CameraView: UIView {
object: AVAudioSession.sharedInstance)
}
@available(*, unavailable)
required init?(coder _: NSCoder) {
fatalError("init(coder:) is not implemented.")
}
override public func removeFromSuperview() {
ReactLogger.log(level: .info, message: "Removing Camera View...")
captureSession.stopRunning()
@@ -166,7 +174,7 @@ public final class CameraView: UIView {
let shouldUpdateZoom = willReconfigure || changedProps.contains("zoom") || shouldCheckActive
if shouldReconfigure || shouldCheckActive || shouldUpdateTorch || shouldUpdateZoom || shouldReconfigureFormat || shouldReconfigureDevice {
queue.async {
cameraQueue.async {
if shouldReconfigure {
self.configureCaptureSession()
}
@@ -198,7 +206,7 @@ public final class CameraView: UIView {
}
// This is a wack workaround, but if I immediately set torch mode after `startRunning()`, the session isn't quite ready yet and will ignore torch.
self.queue.asyncAfter(deadline: .now() + 0.1) {
self.cameraQueue.asyncAfter(deadline: .now() + 0.1) {
if shouldUpdateTorch {
self.setTorchMode(self.torch)
}


@@ -7,7 +7,9 @@
//
#import <Foundation/Foundation.h>
#import <React/RCTViewManager.h>
#import <React/RCTUtils.h>
@interface RCT_EXTERN_REMAP_MODULE(CameraView, CameraViewManager, RCTViewManager)
@@ -28,6 +30,7 @@ RCT_EXPORT_VIEW_PROPERTY(enablePortraitEffectsMatteDelivery, BOOL);
// device format
RCT_EXPORT_VIEW_PROPERTY(format, NSDictionary);
RCT_EXPORT_VIEW_PROPERTY(fps, NSNumber);
RCT_EXPORT_VIEW_PROPERTY(frameProcessorFps, NSNumber);
RCT_EXPORT_VIEW_PROPERTY(hdr, NSNumber); // nullable bool
RCT_EXPORT_VIEW_PROPERTY(lowLightBoost, NSNumber); // nullable bool
RCT_EXPORT_VIEW_PROPERTY(colorSpace, NSString);
@@ -46,7 +49,4 @@ RCT_EXTERN_METHOD(stopRecording:(nonnull NSNumber *)node resolve:(RCTPromiseReso
RCT_EXTERN_METHOD(takePhoto:(nonnull NSNumber *)node options:(NSDictionary *)options resolve:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject);
RCT_EXTERN_METHOD(focus:(nonnull NSNumber *)node point:(NSDictionary *)point resolve:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject);
RCT_EXTERN_METHOD(getAvailableVideoCodecs:(nonnull NSNumber *)node resolve:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject);
RCT_EXTERN_METHOD(getAvailablePhotoCodecs:(nonnull NSNumber *)node resolve:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject);
@end


@@ -11,6 +11,23 @@ import Foundation
@objc(CameraViewManager)
final class CameraViewManager: RCTViewManager {
// pragma MARK: Properties
private var runtimeManager: FrameProcessorRuntimeManager?
override var bridge: RCTBridge! {
didSet {
if !enableFrameProcessors { return }
CameraQueues.videoQueue.async {
self.runtimeManager = FrameProcessorRuntimeManager(bridge: self.bridge)
self.bridge.runOnJS {
self.runtimeManager!.installFrameProcessorBindings()
}
}
}
}
override var methodQueue: DispatchQueue! {
return DispatchQueue.main
}
@@ -19,12 +36,12 @@ final class CameraViewManager: RCTViewManager {
return true
}
// pragma MARK: Setup
override final func view() -> UIView! {
return CameraView()
}
// pragma MARK: Exported Functions
// pragma MARK: React Functions
@objc
final func startRecording(_ node: NSNumber, options: NSDictionary, onRecordCallback: @escaping RCTResponseSenderBlock) {
let component = getCameraView(withTag: node)
@@ -53,29 +70,6 @@ final class CameraViewManager: RCTViewManager {
component.focus(point: CGPoint(x: x.doubleValue, y: y.doubleValue), promise: promise)
}
@objc
final func getAvailableVideoCodecs(_ node: NSNumber, resolve: @escaping RCTPromiseResolveBlock, reject: @escaping RCTPromiseRejectBlock) {
withPromise(resolve: resolve, reject: reject) {
let component = getCameraView(withTag: node)
guard let movieOutput = component.movieOutput else {
throw CameraError.session(SessionError.cameraNotReady)
}
return movieOutput.availableVideoCodecTypes.map(\.descriptor)
}
}
@objc
final func getAvailablePhotoCodecs(_ node: NSNumber, resolve: @escaping RCTPromiseResolveBlock, reject: @escaping RCTPromiseRejectBlock) {
withPromise(resolve: resolve, reject: reject) {
let component = getCameraView(withTag: node)
guard let photoOutput = component.photoOutput else {
throw CameraError.session(SessionError.cameraNotReady)
}
return photoOutput.availablePhotoCodecTypes.map(\.descriptor)
}
}
// pragma MARK: View Manager funcs
@objc
final func getAvailableCameraDevices(_ resolve: @escaping RCTPromiseResolveBlock, reject: @escaping RCTPromiseRejectBlock) {
withPromise(resolve: resolve, reject: reject) {
@@ -103,7 +97,7 @@ final class CameraViewManager: RCTViewManager {
"supportsRawCapture": false, // TODO: supportsRawCapture
"supportsLowLightBoost": $0.isLowLightBoostSupported,
"supportsFocus": $0.isFocusPointOfInterestSupported,
"formats": $0.formats.map { (format) -> [String: Any] in
"formats": $0.formats.map { format -> [String: Any] in
format.toDictionary()
},
]


@@ -0,0 +1,30 @@
//
// AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift
// VisionCamera
//
// Created by Marc Rousavy on 05.05.21.
// Copyright © 2021 Facebook. All rights reserved.
//
import AVFoundation
import Foundation
extension AVAssetWriterInputPixelBufferAdaptor {
/**
Convenience initializer to extract correct attributes from the given videoSettings.
*/
convenience init(assetWriterInput: AVAssetWriterInput, withVideoSettings videoSettings: [String: Any]) {
var attributes: [String: Any] = [:]
if let width = videoSettings[AVVideoWidthKey] as? NSNumber,
let height = videoSettings[AVVideoHeightKey] as? NSNumber {
attributes[kCVPixelBufferWidthKey as String] = width as CFNumber
attributes[kCVPixelBufferHeightKey as String] = height as CFNumber
}
// TODO: Is "Bi-Planar Y'CbCr 8-bit 4:2:0 full-range" the best CVPixelFormatType? How can I find natively supported ones?
attributes[kCVPixelBufferPixelFormatTypeKey as String] = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
self.init(assetWriterInput: assetWriterInput, sourcePixelBufferAttributes: attributes)
}
}
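A hedged usage sketch of this convenience initializer, pairing it with the recommended writer settings the `RecordingSession` below also uses (`videoOutput` is an assumed, already-configured `AVCaptureVideoDataOutput`):

```swift
import AVFoundation

// Sketch: derive video settings from the data output, then build the writer
// input and the pixel buffer adaptor from the same settings dictionary.
let videoSettings = videoOutput.recommendedVideoSettingsForAssetWriter(writingTo: .mov) ?? [:]
let input = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                   withVideoSettings: videoSettings)
```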


@@ -70,8 +70,8 @@ extension AVCaptureDevice.Format {
}
}
if let frameRateRanges = filter.value(forKey: "frameRateRanges") as? [NSDictionary] {
let allFrameRateRangesIncluded = videoSupportedFrameRateRanges.allSatisfy { (range) -> Bool in
frameRateRanges.contains { (dict) -> Bool in
let allFrameRateRangesIncluded = videoSupportedFrameRateRanges.allSatisfy { range -> Bool in
frameRateRanges.contains { dict -> Bool in
guard let max = dict.value(forKey: "maxFrameRate") as? NSNumber,
let min = dict.value(forKey: "minFrameRate") as? NSNumber
else {


@@ -1,5 +1,5 @@
//
// AVCaptureMovieFileOutput+mirror.swift
// AVCaptureVideoDataOutput+mirror.swift
// Cuvent
//
// Created by Marc Rousavy on 18.01.21.
@@ -8,7 +8,7 @@
import AVFoundation
extension AVCaptureMovieFileOutput {
extension AVCaptureVideoDataOutput {
func mirror() {
connections.forEach { connection in
if connection.isVideoMirroringSupported {
@@ -16,4 +16,10 @@ extension AVCaptureMovieFileOutput {
}
}
}
var isMirrored: Bool {
return connections.contains { connection in
connection.isVideoMirrored
}
}
}


@@ -0,0 +1,20 @@
//
// Frame.h
// VisionCamera
//
// Created by Marc Rousavy on 15.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#pragma once
#import <CoreMedia/CMSampleBuffer.h>
// TODO: Make this Objective-C so it can be imported in Swift?
class Frame {
public:
explicit Frame(CMSampleBufferRef buffer): buffer(buffer) {}
public:
CMSampleBufferRef buffer;
};


@@ -0,0 +1,26 @@
//
// FrameHostObject.h
// VisionCamera
//
// Created by Marc Rousavy on 22.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#pragma once
#import "Frame.h"
#import <jsi/jsi.h>
#import <CoreMedia/CMSampleBuffer.h>
using namespace facebook;
class JSI_EXPORT FrameHostObject: public Frame, public jsi::HostObject {
public:
explicit FrameHostObject(CMSampleBufferRef buffer): Frame(buffer) {}
~FrameHostObject();
public:
jsi::Value get(jsi::Runtime&, const jsi::PropNameID& name) override;
std::vector<jsi::PropNameID> getPropertyNames(jsi::Runtime& rt) override;
void destroyBuffer();
};


@@ -0,0 +1,93 @@
//
// FrameHostObject.m
// VisionCamera
//
// Created by Marc Rousavy on 22.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#import "FrameHostObject.h"
#import <Foundation/Foundation.h>
#import <jsi/jsi.h>
std::vector<jsi::PropNameID> FrameHostObject::getPropertyNames(jsi::Runtime& rt) {
std::vector<jsi::PropNameID> result;
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("toString")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("isValid")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("isReady")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("width")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("height")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("bytesPerRow")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("planesCount")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("buffer")));
return result;
}
jsi::Value FrameHostObject::get(jsi::Runtime& runtime, const jsi::PropNameID& propName) {
auto name = propName.utf8(runtime);
if (name == "Symbol.toPrimitive") {
// not implemented
return jsi::Value::undefined();
}
if (name == "valueOf") {
// not implemented
return jsi::Value::undefined();
}
if (name == "toString") {
auto toString = [this] (jsi::Runtime& runtime, const jsi::Value& thisValue, const jsi::Value* arguments, size_t count) -> jsi::Value {
auto imageBuffer = CMSampleBufferGetImageBuffer(buffer);
auto width = CVPixelBufferGetWidth(imageBuffer);
auto height = CVPixelBufferGetHeight(imageBuffer);
NSMutableString* string = [NSMutableString stringWithFormat:@"%lu x %lu Frame", width, height];
return jsi::String::createFromUtf8(runtime, string.UTF8String);
};
return jsi::Function::createFromHostFunction(runtime, jsi::PropNameID::forUtf8(runtime, "toString"), 0, toString);
}
if (name == "isValid") {
auto isValid = buffer != nil && CMSampleBufferIsValid(buffer);
return jsi::Value(isValid);
}
if (name == "isReady") {
auto isReady = buffer != nil && CMSampleBufferDataIsReady(buffer);
return jsi::Value(isReady);
}
if (name == "width") {
auto imageBuffer = CMSampleBufferGetImageBuffer(buffer);
auto width = CVPixelBufferGetWidth(imageBuffer);
return jsi::Value((double) width);
}
if (name == "height") {
auto imageBuffer = CMSampleBufferGetImageBuffer(buffer);
auto height = CVPixelBufferGetHeight(imageBuffer);
return jsi::Value((double) height);
}
if (name == "bytesPerRow") {
auto imageBuffer = CMSampleBufferGetImageBuffer(buffer);
auto bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
return jsi::Value((double) bytesPerRow);
}
if (name == "planesCount") {
auto imageBuffer = CMSampleBufferGetImageBuffer(buffer);
auto planesCount = CVPixelBufferGetPlaneCount(imageBuffer);
return jsi::Value((double) planesCount);
}
if (name == "buffer") {
// TODO: Actually return the pixels of the buffer. Not sure if this will be a huge performance hit or not
return jsi::Array(runtime, 0);
}
return jsi::Value::undefined();
}
FrameHostObject::~FrameHostObject() {
destroyBuffer();
}
void FrameHostObject::destroyBuffer() {
// ARC will hopefully delete it lol
this->buffer = nil;
}


@@ -0,0 +1,14 @@
//
// FrameProcessorCallback.h
// VisionCamera
//
// Created by Marc Rousavy on 11.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#pragma once
#import <Foundation/Foundation.h>
#import <CoreMedia/CMSampleBuffer.h>
typedef void (^FrameProcessorCallback) (CMSampleBufferRef buffer);


@@ -0,0 +1,63 @@
//
// FrameProcessorPlugin.h
// VisionCamera
//
// Created by Marc Rousavy on 01.05.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#ifndef FrameProcessorPlugin_h
#define FrameProcessorPlugin_h
#import <Foundation/Foundation.h>
#import "FrameProcessorPluginRegistry.h"
#import <CoreMedia/CMSampleBuffer.h>
@protocol FrameProcessorPluginBase
+ (id) callback:(CMSampleBufferRef)buffer withArgs:(NSArray<id>*)args;
@end
#define VISION_CONCAT2(A, B) A##B
#define VISION_CONCAT(A, B) VISION_CONCAT2(A, B)
/**
* Use this Macro to register the given function as a Frame Processor.
* Make sure the given function is a C-style function with the following signature: static inline id callback(CMSampleBufferRef buffer, NSArray<id>* args)
* * Make sure the given function's name is unique across other frame processor plugins
* * Make sure your frame processor returns a Value that can be converted to JS
* * Make sure to use this Macro in an @implementation, not @interface
*
* The JS function will have the same name as the given Objective-C function, but with a "__" prefix.
* Make sure to add that function to the babel.config.js under reanimated's "globals" option, and add TypeScript type declarations.
*/
#define VISION_EXPORT_FRAME_PROCESSOR(frame_processor) \
\
+(void)load \
{ \
[FrameProcessorPluginRegistry addFrameProcessorPlugin:@"__" @ #frame_processor callback:^id(CMSampleBufferRef buffer, NSArray<id>* args) { \
return frame_processor(buffer, args); \
}]; \
}
/**
* Same as VISION_EXPORT_FRAME_PROCESSOR, but uses __attribute__((constructor)) for
* registration. Useful for registering Swift classes, which forbid overriding +(void)load.
*/
#define VISION_EXPORT_SWIFT_FRAME_PROCESSOR(name, objc_name) \
objc_name : NSObject<FrameProcessorPluginBase> \
@end \
\
@interface objc_name (FrameProcessorPlugin) \
@end \
@implementation objc_name (FrameProcessorPlugin) \
\
__attribute__((constructor)) static void VISION_CONCAT(initialize_, objc_name)() \
{ \
[FrameProcessorPluginRegistry addFrameProcessorPlugin:@"__" @ #name callback:^id(CMSampleBufferRef buffer, NSArray<id>* args) { \
return [objc_name callback:buffer withArgs:args]; \
}]; \
}
#endif /* FrameProcessorPlugin_h */
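To make the Swift macro concrete: the Swift class supplies a static `callback`, and a small Objective-C file registers it, roughly via `@interface VISION_EXPORT_SWIFT_FRAME_PROCESSOR(examplePlugin, ExamplePluginSwift)` (both names here are hypothetical). A sketch of the Swift side:

```swift
import AVFoundation

// Hypothetical plugin class; conforms to FrameProcessorPluginBase from
// FrameProcessorPlugin.h (exposed to Swift via the bridging header).
@objc(ExamplePluginSwift)
public class ExamplePluginSwift: NSObject, FrameProcessorPluginBase {
  @objc
  public static func callback(_ buffer: CMSampleBuffer!, withArgs _: [Any]!) -> Any! {
    // Return any value that converts to JS: numbers, strings, arrays,
    // dictionaries, or nil.
    return CMSampleBufferIsValid(buffer)
  }
}
```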


@@ -0,0 +1,23 @@
//
// FrameProcessorPluginRegistry.h
// VisionCamera
//
// Created by Marc Rousavy on 24.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#pragma once
#import <Foundation/Foundation.h>
#import <CoreMedia/CMSampleBuffer.h>
typedef id (^FrameProcessorPlugin) (CMSampleBufferRef buffer, NSArray<id>* arguments);
@interface FrameProcessorPluginRegistry : NSObject
+ (NSMutableDictionary<NSString*, FrameProcessorPlugin>*)frameProcessorPlugins;
+ (void) addFrameProcessorPlugin:(NSString*)name callback:(FrameProcessorPlugin)callback;
+ (void) markInvalid;
@end


@@ -0,0 +1,37 @@
//
// FrameProcessorPluginRegistry.mm
// VisionCamera
//
// Created by Marc Rousavy on 24.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#import "FrameProcessorPluginRegistry.h"
#import <Foundation/Foundation.h>
@implementation FrameProcessorPluginRegistry
+ (NSMutableDictionary<NSString*, FrameProcessorPlugin>*)frameProcessorPlugins {
static NSMutableDictionary<NSString*, FrameProcessorPlugin>* plugins = nil;
if (plugins == nil) {
plugins = [[NSMutableDictionary alloc] init];
}
return plugins;
}
static BOOL _isValid = YES;
+ (void) markInvalid {
_isValid = NO;
[[FrameProcessorPluginRegistry frameProcessorPlugins] removeAllObjects];
}
+ (void) addFrameProcessorPlugin:(NSString*)name callback:(FrameProcessorPlugin)callback {
NSAssert(_isValid, @"Tried to add Frame Processor Plugin but Frame Processor Registry has already registered all plugins!");
BOOL alreadyExists = [[FrameProcessorPluginRegistry frameProcessorPlugins] valueForKey:name] != nil;
NSAssert(!alreadyExists, @"Tried to two Frame Processor Plugins with the same name! Either choose unique names, or remove the unused plugin.");
[[FrameProcessorPluginRegistry frameProcessorPlugins] setValue:callback forKey:name];
}
@end
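Plugins normally register through the macros in `FrameProcessorPlugin.h`, but the registry can also be called directly from Swift before the runtime manager reads it. A hedged sketch (the plugin name is hypothetical and carries the `__` prefix the JS bindings expect):

```swift
// Direct registration sketch; must run before FrameProcessorRuntimeManager
// installs the plugins and calls markInvalid().
FrameProcessorPluginRegistry.addFrameProcessorPlugin("__examplePlugin") { _, args in
  return args?.count ?? 0
}
```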


@@ -0,0 +1,26 @@
//
// FrameProcessorRuntimeManager.h
// VisionCamera
//
// Created by Marc Rousavy on 23.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#pragma once
#import <Foundation/Foundation.h>
#import <React/RCTBridge.h>
@interface FrameProcessorRuntimeManager : NSObject
- (instancetype)init NS_UNAVAILABLE;
/**
Initializes the Frame Processor Runtime Manager with the given bridge.
This initializer is not thread-safe, so only call it on the thread you want the runtime to run on.
*/
- (instancetype) initWithBridge:(RCTBridge*)bridge;
- (void) installFrameProcessorBindings;
@end
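This contract is exactly what the `CameraViewManager.swift` diff above follows: create the manager on the queue the runtime should run on, then hop onto the JS thread to install the bindings. Condensed as a sketch (`bridge` is an assumed `RCTBridge`; the real code retains the manager in a property):

```swift
// Sketch of the header's thread-safety contract, mirroring CameraViewManager:
// init on the video queue, then install bindings on the JS thread.
CameraQueues.videoQueue.async {
  let manager = FrameProcessorRuntimeManager(bridge: bridge)
  bridge.runOnJS {
    manager.installFrameProcessorBindings()
  }
}
```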


@@ -0,0 +1,184 @@
//
// FrameProcessorRuntimeManager.m
// VisionCamera
//
// Created by Marc Rousavy on 23.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#import <Foundation/Foundation.h>
#import "FrameProcessorRuntimeManager.h"
#import "FrameProcessorPluginRegistry.h"
#import "FrameHostObject.h"
#import <memory>
#import <React/RCTBridge.h>
#import <ReactCommon/RCTTurboModule.h>
#import <React/RCTBridge+Private.h>
#import <React/RCTUIManager.h>
#import <ReactCommon/RCTTurboModuleManager.h>
#if __has_include(<RNReanimated/NativeReanimatedModule.h>)
#if __has_include(<RNReanimated/RuntimeManager.h>)
#ifndef VISION_CAMERA_DISABLE_FRAME_PROCESSORS
#import <RNReanimated/RuntimeManager.h>
#import <RNReanimated/RuntimeDecorator.h>
#import <RNReanimated/REAIOSScheduler.h>
#import <RNReanimated/REAIOSErrorHandler.h>
#define ENABLE_FRAME_PROCESSORS
#endif
#else
#warning Your react-native-reanimated version is not compatible with VisionCamera, Frame Processors are disabled. Make sure you're using reanimated 2.1.0 or above!
#endif
#else
#warning The NativeReanimatedModule.h header could not be found, Frame Processors are disabled. If you want to use Frame Processors, make sure you install react-native-reanimated!
#endif
#import "../../cpp/MakeJSIRuntime.h"
#import "FrameProcessorUtils.h"
#import "FrameProcessorCallback.h"
#import "../React Utils/JSIUtils.h"
// Forward declarations for the Swift classes
__attribute__((objc_runtime_name("_TtC12VisionCamera12CameraQueues")))
@interface CameraQueues : NSObject
@property (nonatomic, class, readonly, strong) dispatch_queue_t _Nonnull videoQueue;
@end
__attribute__((objc_runtime_name("_TtC12VisionCamera10CameraView")))
@interface CameraView : UIView
@property (nonatomic, copy) FrameProcessorCallback _Nullable frameProcessorCallback;
@end
@implementation FrameProcessorRuntimeManager {
#ifdef ENABLE_FRAME_PROCESSORS
std::unique_ptr<reanimated::RuntimeManager> runtimeManager;
#endif
__weak RCTBridge* weakBridge;
}
- (instancetype) initWithBridge:(RCTBridge*)bridge {
self = [super init];
if (self) {
#ifdef ENABLE_FRAME_PROCESSORS
NSLog(@"FrameProcessorBindings: Creating Runtime Manager...");
weakBridge = bridge;
auto runtime = vision::makeJSIRuntime();
reanimated::RuntimeDecorator::decorateRuntime(*runtime, "FRAME_PROCESSOR");
runtime->global().setProperty(*runtime, "_FRAME_PROCESSOR", jsi::Value(true));
auto callInvoker = bridge.jsCallInvoker;
auto scheduler = std::make_shared<reanimated::REAIOSScheduler>(callInvoker);
runtimeManager = std::make_unique<reanimated::RuntimeManager>(std::move(runtime),
std::make_shared<reanimated::REAIOSErrorHandler>(scheduler),
scheduler);
NSLog(@"FrameProcessorBindings: Runtime Manager created!");
NSLog(@"FrameProcessorBindings: Installing Frame Processor plugins...");
auto& visionRuntime = *runtimeManager->runtime;
auto visionGlobal = visionRuntime.global();
for (NSString* pluginKey in [FrameProcessorPluginRegistry frameProcessorPlugins]) {
auto pluginName = [pluginKey UTF8String];
NSLog(@"FrameProcessorBindings: Installing Frame Processor plugin \"%s\"...", pluginName);
FrameProcessorPlugin callback = [[FrameProcessorPluginRegistry frameProcessorPlugins] valueForKey:pluginKey];
auto function = [callback, callInvoker](jsi::Runtime& runtime, const jsi::Value& thisValue, const jsi::Value* arguments, size_t count) -> jsi::Value {
auto frameHostObject = arguments[0].asObject(runtime).asHostObject(runtime);
auto frame = static_cast<FrameHostObject*>(frameHostObject.get());
auto args = convertJSICStyleArrayToNSArray(runtime,
arguments + 1, // start at index 1 since first arg = Frame
count - 1, // use smaller count
callInvoker);
id result = callback(frame->buffer, args);
return convertObjCObjectToJSIValue(runtime, result);
};
visionGlobal.setProperty(visionRuntime, pluginName, jsi::Function::createFromHostFunction(visionRuntime,
jsi::PropNameID::forAscii(visionRuntime, pluginName),
1, // frame
function));
}
[FrameProcessorPluginRegistry markInvalid];
NSLog(@"FrameProcessorBindings: Frame Processor plugins installed!");
#else
NSLog(@"Reanimated not found, Frame Processors are disabled.");
#endif
}
return self;
}
- (void) installFrameProcessorBindings {
#ifdef ENABLE_FRAME_PROCESSORS
if (!weakBridge) {
NSLog(@"FrameProcessorBindings: Failed to install Frame Processor Bindings - bridge was null!");
return;
}
NSLog(@"FrameProcessorBindings: Installing Frame Processor Bindings for Bridge...");
RCTCxxBridge *cxxBridge = (RCTCxxBridge *)weakBridge;
if (!cxxBridge.runtime) {
return;
}
jsi::Runtime& jsiRuntime = *(jsi::Runtime*)cxxBridge.runtime;
NSLog(@"FrameProcessorBindings: Installing global functions...");
// setFrameProcessor(viewTag: number, frameProcessor: (frame: Frame) => void)
auto setFrameProcessor = [self](jsi::Runtime& runtime, const jsi::Value& thisValue, const jsi::Value* arguments, size_t count) -> jsi::Value {
NSLog(@"FrameProcessorBindings: Setting new frame processor...");
if (!arguments[0].isNumber()) throw jsi::JSError(runtime, "Camera::setFrameProcessor: First argument ('viewTag') must be a number!");
if (!arguments[1].isObject()) throw jsi::JSError(runtime, "Camera::setFrameProcessor: Second argument ('frameProcessor') must be a function!");
if (!runtimeManager || !runtimeManager->runtime) throw jsi::JSError(runtime, "Camera::setFrameProcessor: The RuntimeManager is not yet initialized!");
auto viewTag = arguments[0].asNumber();
NSLog(@"FrameProcessorBindings: Adapting Shareable value from function (conversion to worklet)...");
auto worklet = reanimated::ShareableValue::adapt(runtime, arguments[1], runtimeManager.get());
NSLog(@"FrameProcessorBindings: Successfully created worklet!");
RCTExecuteOnMainQueue([worklet, viewTag, self]() -> void {
auto currentBridge = [RCTBridge currentBridge];
auto anonymousView = [currentBridge.uiManager viewForReactTag:[NSNumber numberWithDouble:viewTag]];
auto view = static_cast<CameraView*>(anonymousView);
dispatch_async(CameraQueues.videoQueue, [worklet, view, self]() -> void {
NSLog(@"FrameProcessorBindings: Converting worklet to Objective-C callback...");
auto& rt = *runtimeManager->runtime;
auto function = worklet->getValue(rt).asObject(rt).asFunction(rt);
view.frameProcessorCallback = convertJSIFunctionToFrameProcessorCallback(rt, function);
NSLog(@"FrameProcessorBindings: Frame processor set!");
});
});
return jsi::Value::undefined();
};
jsiRuntime.global().setProperty(jsiRuntime, "setFrameProcessor", jsi::Function::createFromHostFunction(jsiRuntime,
jsi::PropNameID::forAscii(jsiRuntime, "setFrameProcessor"),
2, // viewTag, frameProcessor
setFrameProcessor));
// unsetFrameProcessor(viewTag: number)
auto unsetFrameProcessor = [](jsi::Runtime& runtime, const jsi::Value& thisValue, const jsi::Value* arguments, size_t count) -> jsi::Value {
NSLog(@"FrameProcessorBindings: Removing frame processor...");
if (!arguments[0].isNumber()) throw jsi::JSError(runtime, "Camera::unsetFrameProcessor: First argument ('viewTag') must be a number!");
auto viewTag = arguments[0].asNumber();
RCTExecuteOnMainQueue(^{
auto currentBridge = [RCTBridge currentBridge];
if (!currentBridge) return;
auto anonymousView = [currentBridge.uiManager viewForReactTag:[NSNumber numberWithDouble:viewTag]];
if (!anonymousView) return;
auto view = static_cast<CameraView*>(anonymousView);
view.frameProcessorCallback = nil;
NSLog(@"FrameProcessorBindings: Frame processor removed!");
});
return jsi::Value::undefined();
};
jsiRuntime.global().setProperty(jsiRuntime, "unsetFrameProcessor", jsi::Function::createFromHostFunction(jsiRuntime,
jsi::PropNameID::forAscii(jsiRuntime, "unsetFrameProcessor"),
1, // viewTag
unsetFrameProcessor));
NSLog(@"FrameProcessorBindings: Finished installing bindings.");
#endif
}
@end


@@ -0,0 +1,23 @@
//
// FrameProcessorUtils.h
// VisionCamera
//
// Created by Marc Rousavy on 15.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#pragma once
#import <Foundation/Foundation.h>
#import <React/RCTBridgeModule.h>
#import "FrameProcessorCallback.h"
#ifndef __cplusplus
#error FrameProcessorUtils.h has to be compiled with C++!
#endif
#import <jsi/jsi.h>
using namespace facebook;
FrameProcessorCallback convertJSIFunctionToFrameProcessorCallback(jsi::Runtime &runtime, const jsi::Function &value);


@@ -0,0 +1,44 @@
//
// FrameProcessorUtils.m
// VisionCamera
//
// Created by Marc Rousavy on 15.03.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#import "FrameProcessorUtils.h"
#import <CoreMedia/CMSampleBuffer.h>
#import <chrono>
#import <memory>
#import "FrameHostObject.h"
FrameProcessorCallback convertJSIFunctionToFrameProcessorCallback(jsi::Runtime &runtime, const jsi::Function &value) {
__block auto cb = value.getFunction(runtime);
return ^(CMSampleBufferRef buffer) {
#if DEBUG
std::chrono::steady_clock::time_point begin = std::chrono::steady_clock::now();
#endif
auto frame = std::make_shared<FrameHostObject>(buffer);
try {
cb.call(runtime, jsi::Object::createFromHostObject(runtime, frame));
} catch (jsi::JSError& jsError) {
NSLog(@"Frame Processor threw an error: %s", jsError.getMessage().c_str());
}
#if DEBUG
std::chrono::steady_clock::time_point end = std::chrono::steady_clock::now();
auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(end - begin).count();
if (duration > 100) {
NSLog(@"Warning: Frame Processor function took %lld ms to execute. This blocks the video queue from recording, optimize your frame processor!", duration);
}
#endif
// Manually free the buffer because:
// 1. we are sure we don't need it anymore, the frame processor worklet has finished executing.
// 2. we don't know when the JS runtime garbage collects this object, it might be holding it for a few more frames
// which then blocks the camera queue from pushing new frames (memory limit)
frame->destroyBuffer();
};
}


@@ -0,0 +1,28 @@
//
// AVAssetWriter.Status+descriptor.swift
// VisionCamera
//
// Created by Marc Rousavy on 01.05.21.
// Copyright © 2021 Facebook. All rights reserved.
//
import AVFoundation
extension AVAssetWriter.Status {
var descriptor: String {
switch self {
case .cancelled:
return "cancelled"
case .completed:
return "completed"
case .failed:
return "failed"
case .unknown:
return "unknown"
case .writing:
return "writing"
@unknown default:
fatalError("Unknown AVAssetWriter.Status value! \(rawValue)")
}
}
}


@@ -0,0 +1,20 @@
//
// AVFileType+descriptor.swift
// VisionCamera
//
// Created by Marc Rousavy on 01.05.21.
// Copyright © 2021 Facebook. All rights reserved.
//
import AVFoundation
import Foundation
extension AVFileType {
init(withString string: String) {
self.init(rawValue: string)
}
var descriptor: String {
return rawValue
}
}


@@ -0,0 +1,55 @@
//
// JSIUtils.h
// VisionCamera
//
// Created by Marc Rousavy on 30.04.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#pragma once
#import <jsi/jsi.h>
#import <ReactCommon/CallInvoker.h>
#import <React/RCTBridgeModule.h>
using namespace facebook;
using namespace facebook::react;
// NSNumber -> boolean
jsi::Value convertNSNumberToJSIBoolean(jsi::Runtime& runtime, NSNumber* value);
// NSNumber -> number
jsi::Value convertNSNumberToJSINumber(jsi::Runtime& runtime, NSNumber* value);
// NSNumber -> string
jsi::String convertNSStringToJSIString(jsi::Runtime& runtime, NSString* value);
// NSDictionary -> {}
jsi::Object convertNSDictionaryToJSIObject(jsi::Runtime& runtime, NSDictionary* value);
// NSArray -> []
jsi::Array convertNSArrayToJSIArray(jsi::Runtime& runtime, NSArray* value);
// id -> ???
jsi::Value convertObjCObjectToJSIValue(jsi::Runtime& runtime, id value);
// string -> NSString
NSString* convertJSIStringToNSString(jsi::Runtime& runtime, const jsi::String& value);
// any... -> NSArray
NSArray* convertJSICStyleArrayToNSArray(jsi::Runtime& runtime, const jsi::Value* array, size_t length, std::shared_ptr<CallInvoker> jsInvoker);
// NSArray -> any...
jsi::Value* convertNSArrayToJSICStyleArray(jsi::Runtime& runtime, NSArray* array);
// [] -> NSArray
NSArray* convertJSIArrayToNSArray(jsi::Runtime& runtime, const jsi::Array& value, std::shared_ptr<CallInvoker> jsInvoker);
// {} -> NSDictionary
NSDictionary* convertJSIObjectToNSDictionary(jsi::Runtime& runtime, const jsi::Object& value, std::shared_ptr<CallInvoker> jsInvoker);
// any -> id
id convertJSIValueToObjCObject(jsi::Runtime& runtime, const jsi::Value& value, std::shared_ptr<CallInvoker> jsInvoker);
// (any...) => any -> (void)(id, id)
RCTResponseSenderBlock convertJSIFunctionToCallback(jsi::Runtime& runtime, const jsi::Function& value, std::shared_ptr<CallInvoker> jsInvoker);

ios/React Utils/JSIUtils.mm (new file, 187 lines)

@@ -0,0 +1,187 @@
//
// JSIUtils.mm
// VisionCamera
//
// Created by Marc Rousavy on 02.05.21.
// Copyright © 2021 Facebook. All rights reserved.
//
#import "JSIUtils.h"
#import <Foundation/Foundation.h>
#import <jsi/jsi.h>
#import <ReactCommon/CallInvoker.h>
#import <React/RCTBridge.h>
#import <ReactCommon/TurboModuleUtils.h>
using namespace facebook;
using namespace facebook::react;
jsi::Value convertNSNumberToJSIBoolean(jsi::Runtime &runtime, NSNumber *value)
{
return jsi::Value((bool)[value boolValue]);
}
jsi::Value convertNSNumberToJSINumber(jsi::Runtime &runtime, NSNumber *value)
{
return jsi::Value([value doubleValue]);
}
jsi::String convertNSStringToJSIString(jsi::Runtime &runtime, NSString *value)
{
return jsi::String::createFromUtf8(runtime, [value UTF8String] ?: "");
}
jsi::Object convertNSDictionaryToJSIObject(jsi::Runtime &runtime, NSDictionary *value)
{
jsi::Object result = jsi::Object(runtime);
for (NSString *k in value) {
result.setProperty(runtime, [k UTF8String], convertObjCObjectToJSIValue(runtime, value[k]));
}
return result;
}
jsi::Array convertNSArrayToJSIArray(jsi::Runtime &runtime, NSArray *value)
{
jsi::Array result = jsi::Array(runtime, value.count);
for (size_t i = 0; i < value.count; i++) {
result.setValueAtIndex(runtime, i, convertObjCObjectToJSIValue(runtime, value[i]));
}
return result;
}
jsi::Value convertObjCObjectToJSIValue(jsi::Runtime &runtime, id value)
{
if (value == nil) {
return jsi::Value::undefined();
} else if ([value isKindOfClass:[NSString class]]) {
return convertNSStringToJSIString(runtime, (NSString *)value);
} else if ([value isKindOfClass:[NSNumber class]]) {
if ([value isKindOfClass:[@YES class]]) {
return convertNSNumberToJSIBoolean(runtime, (NSNumber *)value);
}
return convertNSNumberToJSINumber(runtime, (NSNumber *)value);
} else if ([value isKindOfClass:[NSDictionary class]]) {
return convertNSDictionaryToJSIObject(runtime, (NSDictionary *)value);
} else if ([value isKindOfClass:[NSArray class]]) {
return convertNSArrayToJSIArray(runtime, (NSArray *)value);
} else if (value == (id)kCFNull) {
return jsi::Value::null();
}
return jsi::Value::undefined();
}
NSString *convertJSIStringToNSString(jsi::Runtime &runtime, const jsi::String &value)
{
return [NSString stringWithUTF8String:value.utf8(runtime).c_str()];
}
NSArray* convertJSICStyleArrayToNSArray(jsi::Runtime &runtime, const jsi::Value* array, size_t length, std::shared_ptr<CallInvoker> jsInvoker) {
if (length == 0) return @[];
NSMutableArray *result = [NSMutableArray new];
for (size_t i = 0; i < length; i++) {
// Insert kCFNull when it's `undefined` value to preserve the indices.
[result
addObject:convertJSIValueToObjCObject(runtime, array[i], jsInvoker) ?: (id)kCFNull];
}
return [result copy];
}
jsi::Value* convertNSArrayToJSICStyleArray(jsi::Runtime &runtime, NSArray* array) {
auto result = new jsi::Value[array.count];
for (size_t i = 0; i < array.count; i++) {
result[i] = convertObjCObjectToJSIValue(runtime, array[i]);
}
return result;
}
NSArray* convertJSIArrayToNSArray(jsi::Runtime &runtime, const jsi::Array &value, std::shared_ptr<CallInvoker> jsInvoker)
{
size_t size = value.size(runtime);
NSMutableArray *result = [NSMutableArray new];
for (size_t i = 0; i < size; i++) {
// Insert kCFNull when it's `undefined` value to preserve the indices.
[result
addObject:convertJSIValueToObjCObject(runtime, value.getValueAtIndex(runtime, i), jsInvoker) ?: (id)kCFNull];
}
return [result copy];
}
NSDictionary* convertJSIObjectToNSDictionary(jsi::Runtime &runtime, const jsi::Object &value, std::shared_ptr<CallInvoker> jsInvoker)
{
jsi::Array propertyNames = value.getPropertyNames(runtime);
size_t size = propertyNames.size(runtime);
NSMutableDictionary *result = [NSMutableDictionary new];
for (size_t i = 0; i < size; i++) {
jsi::String name = propertyNames.getValueAtIndex(runtime, i).getString(runtime);
NSString *k = convertJSIStringToNSString(runtime, name);
id v = convertJSIValueToObjCObject(runtime, value.getProperty(runtime, name), jsInvoker);
if (v) {
result[k] = v;
}
}
return [result copy];
}
id convertJSIValueToObjCObject(jsi::Runtime &runtime, const jsi::Value &value, std::shared_ptr<CallInvoker> jsInvoker)
{
if (value.isUndefined() || value.isNull()) {
return nil;
}
if (value.isBool()) {
return @(value.getBool());
}
if (value.isNumber()) {
return @(value.getNumber());
}
if (value.isString()) {
return convertJSIStringToNSString(runtime, value.getString(runtime));
}
if (value.isObject()) {
jsi::Object o = value.getObject(runtime);
if (o.isArray(runtime)) {
return convertJSIArrayToNSArray(runtime, o.getArray(runtime), jsInvoker);
}
if (o.isFunction(runtime)) {
return convertJSIFunctionToCallback(runtime, std::move(o.getFunction(runtime)), jsInvoker);
}
return convertJSIObjectToNSDictionary(runtime, o, jsInvoker);
}
throw std::runtime_error("Unsupported jsi::jsi::Value kind");
}
RCTResponseSenderBlock convertJSIFunctionToCallback(jsi::Runtime &runtime, const jsi::Function &value, std::shared_ptr<CallInvoker> jsInvoker)
{
auto weakWrapper = CallbackWrapper::createWeak(value.getFunction(runtime), runtime, jsInvoker);
BOOL __block wrapperWasCalled = NO;
RCTResponseSenderBlock callback = ^(NSArray *responses) {
if (wrapperWasCalled) {
throw std::runtime_error("callback arg cannot be called more than once");
}
auto strongWrapper = weakWrapper.lock();
if (!strongWrapper) {
return;
}
strongWrapper->jsInvoker().invokeAsync([weakWrapper, responses]() {
auto strongWrapper2 = weakWrapper.lock();
if (!strongWrapper2) {
return;
}
const jsi::Value* args = convertNSArrayToJSICStyleArray(strongWrapper2->runtime(), responses);
strongWrapper2->callback().call(strongWrapper2->runtime(), args, static_cast<size_t>(responses.count));
strongWrapper2->destroy();
delete[] args;
});
wrapperWasCalled = YES;
};
if (RCTTurboModuleBlockCopyEnabled()) {
return [callback copy];
}
return callback;
}


@@ -0,0 +1,18 @@
//
//  RCTBridge+runOnJS.h
//  VisionCamera
//
//  Created by Marc Rousavy on 23.03.21.
//  Copyright © 2021 Facebook. All rights reserved.
//

#pragma once

#import <Foundation/Foundation.h>
#import <React/RCTBridge.h>

@interface RCTBridge (RunOnJS)

- (void)runOnJS:(void (^)(void))block NS_SWIFT_NAME(runOnJS(_:));

@end


@@ -0,0 +1,23 @@
//
//  RCTBridge+runOnJS.mm
//  VisionCamera
//
//  Created by Marc Rousavy on 23.03.21.
//  Copyright © 2021 Facebook. All rights reserved.
//

#import "RCTBridge+runOnJS.h"
#import <Foundation/Foundation.h>
#import <React/RCTBridge.h>
#import <ReactCommon/RCTTurboModule.h>

@implementation RCTBridge (RunOnJS)

- (void)runOnJS:(void (^)(void))block {
  auto callInvoker = [self jsCallInvoker];
  callInvoker->invokeAsync([block]() {
    block();
  });
}

@end
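
Because of the `NS_SWIFT_NAME` annotation in the header, the helper reads naturally from Swift. A minimal sketch (the call site below is hypothetical):

```swift
// Hypothetical call site: hop from the camera queue onto the React-JS
// thread (via the bridge's CallInvoker) before touching the jsi::Runtime.
func dispatchToJS(bridge: RCTBridge) {
  bridge.runOnJS {
    ReactLogger.log(level: .info, message: "Now running on the JS thread!")
  }
}
```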

ios/RecordingSession.swift

@@ -0,0 +1,135 @@
//
//  RecordingSession.swift
//  VisionCamera
//
//  Created by Marc Rousavy on 01.05.21.
//  Copyright © 2021 Facebook. All rights reserved.
//

import AVFoundation
import Foundation

// MARK: - BufferType

enum BufferType {
  case audio
  case video
}

// MARK: - RecordingSession

class RecordingSession {
  private let assetWriter: AVAssetWriter
  private let audioWriter: AVAssetWriterInput
  private let videoWriter: AVAssetWriterInput
  private let bufferAdaptor: AVAssetWriterInputPixelBufferAdaptor
  private let completionHandler: (AVAssetWriter.Status, Error?) -> Void
  private let initialTimestamp: CMTime
  private var latestTimestamp: CMTime?
  private var hasWrittenFirstVideoFrame = false

  var url: URL {
    return assetWriter.outputURL
  }

  var duration: Double {
    guard let latestTimestamp = latestTimestamp else {
      return 0.0
    }
    return (latestTimestamp - initialTimestamp).seconds
  }

  init(url: URL,
       fileType: AVFileType,
       videoSettings: [String: Any],
       audioSettings: [String: Any],
       isVideoMirrored: Bool,
       completion: @escaping (AVAssetWriter.Status, Error?) -> Void) throws {
    do {
      assetWriter = try AVAssetWriter(outputURL: url, fileType: fileType)
      audioWriter = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
      videoWriter = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
      completionHandler = completion
    } catch let error as NSError {
      throw CameraError.capture(.createRecorderError(message: error.description))
    }

    audioWriter.expectsMediaDataInRealTime = true
    videoWriter.expectsMediaDataInRealTime = true
    if isVideoMirrored {
      videoWriter.transform = CGAffineTransform(rotationAngle: -(.pi / 2))
    } else {
      videoWriter.transform = CGAffineTransform(rotationAngle: .pi / 2)
    }

    bufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: videoWriter, withVideoSettings: videoSettings)

    assetWriter.add(videoWriter)
    assetWriter.add(audioWriter)

    assetWriter.startWriting()
    initialTimestamp = CMTime(seconds: CACurrentMediaTime(), preferredTimescale: 1_000_000_000)
    assetWriter.startSession(atSourceTime: initialTimestamp)
    ReactLogger.log(level: .info, message: "Initialized Video and Audio AssetWriter.")
  }

  deinit {
    if assetWriter.status == .writing {
      ReactLogger.log(level: .info, message: "Cancelling AssetWriter...")
      assetWriter.cancelWriting()
    }
  }

  func appendBuffer(_ buffer: CMSampleBuffer, type bufferType: BufferType) {
    if !CMSampleBufferDataIsReady(buffer) {
      return
    }
    let timestamp = CMSampleBufferGetPresentationTimeStamp(buffer)
    latestTimestamp = timestamp

    switch bufferType {
    case .video:
      if !videoWriter.isReadyForMoreMediaData {
        ReactLogger.log(level: .warning, message: "The Video AVAssetWriterInput was not ready for more data! Is your frame rate too high?")
        return
      }
      guard let imageBuffer = CMSampleBufferGetImageBuffer(buffer) else {
        ReactLogger.log(level: .error, message: "Failed to get the CVImageBuffer!")
        return
      }
      bufferAdaptor.append(imageBuffer, withPresentationTime: timestamp)
      if !hasWrittenFirstVideoFrame {
        hasWrittenFirstVideoFrame = true
        ReactLogger.log(level: .warning, message: "VideoWriter: First frame arrived \((timestamp - initialTimestamp).seconds) seconds late.")
      }
    case .audio:
      if !audioWriter.isReadyForMoreMediaData {
        return
      }
      if !hasWrittenFirstVideoFrame {
        // The first video frame has not been written yet, so skip this audio frame.
        return
      }
      audioWriter.append(buffer)
    }

    if assetWriter.status == .failed {
      // TODO: Should I call the completion handler or is this instance still valid?
      ReactLogger.log(level: .error, message: "AssetWriter failed to write buffer! Error: \(assetWriter.error?.localizedDescription ?? "none")")
    }
  }

  func finish() {
    ReactLogger.log(level: .info, message: "Finishing Recording with AssetWriter status \"\(assetWriter.status.descriptor)\"...")
    if assetWriter.status == .writing {
      videoWriter.markAsFinished()
      assetWriter.finishWriting {
        self.completionHandler(self.assetWriter.status, self.assetWriter.error)
      }
    } else {
      completionHandler(assetWriter.status, assetWriter.error)
    }
  }
}
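
For orientation, here is a sketch of how record-video code might drive this class; the file URL, settings dictionaries, and buffer forwarding shown are illustrative placeholders, not code from this PR:

```swift
import AVFoundation

// Hypothetical wiring (e.g. inside CameraView+RecordVideo): one
// RecordingSession receives both audio and video sample buffers.
func makeRecordingSession(fileURL: URL) throws -> RecordingSession {
  return try RecordingSession(url: fileURL,
                              fileType: .mov,
                              videoSettings: [AVVideoCodecKey: AVVideoCodecType.h264,
                                              AVVideoWidthKey: 1920,
                                              AVVideoHeightKey: 1080],
                              audioSettings: [AVFormatIDKey: kAudioFormatMPEG4AAC,
                                              AVSampleRateKey: 44_100,
                                              AVNumberOfChannelsKey: 1],
                              isVideoMirrored: false) { status, error in
    print("Recording finished (status: \(status.rawValue), error: \(String(describing: error)))")
  }
}

// In captureOutput(_:didOutput:from:), forward each CMSampleBuffer:
//   session.appendBuffer(sampleBuffer, type: output is AVCaptureAudioDataOutput ? .audio : .video)
// and call session.finish() when recording stops.
```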


@@ -1,44 +0,0 @@
//
//  VideoCaptureDelegate.swift
//  Cuvent
//
//  Created by Marc Rousavy on 14.01.21.
//  Copyright © 2021 Facebook. All rights reserved.
//

import AVFoundation

// Functions like `startRecording(delegate: ...)` only maintain a weak reference to their delegates to prevent memory leaks.
// In our use case, we exit from the function, which would deinit the recording delegate since no other references are held.
// That's why we keep a strong reference to the delegate by appending it to the `delegateReferences` list and removing it
// once the delegate has been triggered.
private var delegateReferences: [NSObject] = []

// MARK: - RecordingDelegateWithCallback

class RecordingDelegateWithCallback: NSObject, AVCaptureFileOutputRecordingDelegate {
  init(callback: @escaping RCTResponseSenderBlock, resetTorchMode: @escaping () -> Void) {
    self.callback = callback
    self.resetTorchMode = resetTorchMode
    super.init()
    delegateReferences.append(self)
  }

  func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from _: [AVCaptureConnection], error: Error?) {
    defer {
      self.resetTorchMode()
      delegateReferences.removeAll(where: { $0 == self })
    }
    if let error = error as NSError? {
      return callback([NSNull(), makeReactError(.capture(.unknown(message: error.description)), cause: error)])
    }

    let seconds = CMTimeGetSeconds(output.recordedDuration)
    return callback([["path": outputFileURL.absoluteString, "duration": seconds, "size": output.recordedFileSize], NSNull()])
  }

  // MARK: Private

  private let callback: RCTResponseSenderBlock // (video?, error?) => void
  private let resetTorchMode: () -> Void
}


@@ -7,13 +7,17 @@
objects = {
/* Begin PBXBuildFile section */
B80C0E00260BDDF7001699AB /* FrameProcessorPluginRegistry.mm in Sources */ = {isa = PBXBuildFile; fileRef = B80C0DFF260BDDF7001699AB /* FrameProcessorPluginRegistry.mm */; };
B8103E1C25FF553B007A1684 /* FrameProcessorUtils.mm in Sources */ = {isa = PBXBuildFile; fileRef = B8103E1B25FF553B007A1684 /* FrameProcessorUtils.mm */; };
B82FBA962614B69D00909718 /* RCTBridge+runOnJS.mm in Sources */ = {isa = PBXBuildFile; fileRef = B82FBA952614B69D00909718 /* RCTBridge+runOnJS.mm */; };
B84760A62608EE7C004C3180 /* FrameHostObject.mm in Sources */ = {isa = PBXBuildFile; fileRef = B84760A52608EE7C004C3180 /* FrameHostObject.mm */; };
B84760DF2608F57D004C3180 /* CameraQueues.swift in Sources */ = {isa = PBXBuildFile; fileRef = B84760DE2608F57D004C3180 /* CameraQueues.swift */; };
B86DC971260E2D5200FB17B2 /* AVAudioSession+trySetAllowHaptics.swift in Sources */ = {isa = PBXBuildFile; fileRef = B86DC970260E2D5200FB17B2 /* AVAudioSession+trySetAllowHaptics.swift */; };
B86DC974260E310600FB17B2 /* CameraView+AVAudioSession.swift in Sources */ = {isa = PBXBuildFile; fileRef = B86DC973260E310600FB17B2 /* CameraView+AVAudioSession.swift */; };
B86DC977260E315100FB17B2 /* CameraView+AVCaptureSession.swift in Sources */ = {isa = PBXBuildFile; fileRef = B86DC976260E315100FB17B2 /* CameraView+AVCaptureSession.swift */; };
B887518525E0102000DB86D6 /* PhotoCaptureDelegate.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887515C25E0102000DB86D6 /* PhotoCaptureDelegate.swift */; };
B887518625E0102000DB86D6 /* CameraView+RecordVideo.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887515D25E0102000DB86D6 /* CameraView+RecordVideo.swift */; };
B887518725E0102000DB86D6 /* CameraViewManager.m in Sources */ = {isa = PBXBuildFile; fileRef = B887515F25E0102000DB86D6 /* CameraViewManager.m */; };
B887518825E0102000DB86D6 /* VideoCaptureDelegate.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516025E0102000DB86D6 /* VideoCaptureDelegate.swift */; };
B887518925E0102000DB86D6 /* Collection+safe.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516225E0102000DB86D6 /* Collection+safe.swift */; };
B887518A25E0102000DB86D6 /* AVCaptureDevice+neutralZoom.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516325E0102000DB86D6 /* AVCaptureDevice+neutralZoom.swift */; };
B887518B25E0102000DB86D6 /* AVCaptureDevice.Format+isBetterThan.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516425E0102000DB86D6 /* AVCaptureDevice.Format+isBetterThan.swift */; };
@@ -23,7 +27,7 @@
B887518F25E0102000DB86D6 /* AVCapturePhotoOutput+mirror.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516825E0102000DB86D6 /* AVCapturePhotoOutput+mirror.swift */; };
B887519025E0102000DB86D6 /* AVCaptureDevice.Format+matchesFilter.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516925E0102000DB86D6 /* AVCaptureDevice.Format+matchesFilter.swift */; };
B887519125E0102000DB86D6 /* AVCaptureDevice.Format+toDictionary.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516A25E0102000DB86D6 /* AVCaptureDevice.Format+toDictionary.swift */; };
B887519225E0102000DB86D6 /* AVCaptureMovieFileOutput+mirror.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516B25E0102000DB86D6 /* AVCaptureMovieFileOutput+mirror.swift */; };
B887519225E0102000DB86D6 /* AVCaptureVideoDataOutput+mirror.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516B25E0102000DB86D6 /* AVCaptureVideoDataOutput+mirror.swift */; };
B887519425E0102000DB86D6 /* MakeReactError.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516E25E0102000DB86D6 /* MakeReactError.swift */; };
B887519525E0102000DB86D6 /* ReactLogger.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887516F25E0102000DB86D6 /* ReactLogger.swift */; };
B887519625E0102000DB86D6 /* Promise.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887517025E0102000DB86D6 /* Promise.swift */; };
@@ -45,6 +49,12 @@
B88751A725E0102000DB86D6 /* CameraView+Zoom.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887518225E0102000DB86D6 /* CameraView+Zoom.swift */; };
B88751A825E0102000DB86D6 /* CameraError.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887518325E0102000DB86D6 /* CameraError.swift */; };
B88751A925E0102000DB86D6 /* CameraView.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887518425E0102000DB86D6 /* CameraView.swift */; };
B8994E6C263F03E100069589 /* JSIUtils.mm in Sources */ = {isa = PBXBuildFile; fileRef = B8994E6B263F03E100069589 /* JSIUtils.mm */; };
B8A751D82609E4B30011C623 /* FrameProcessorRuntimeManager.mm in Sources */ = {isa = PBXBuildFile; fileRef = B8A751D72609E4B30011C623 /* FrameProcessorRuntimeManager.mm */; };
B8D22CDC2642DB4D00234472 /* AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8D22CDB2642DB4D00234472 /* AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift */; };
B8DB3BC8263DC28C004C18D7 /* AVAssetWriter.Status+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8DB3BC7263DC28C004C18D7 /* AVAssetWriter.Status+descriptor.swift */; };
B8DB3BCA263DC4D8004C18D7 /* RecordingSession.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8DB3BC9263DC4D8004C18D7 /* RecordingSession.swift */; };
B8DB3BCC263DC97E004C18D7 /* AVFileType+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8DB3BCB263DC97E004C18D7 /* AVFileType+descriptor.swift */; };
/* End PBXBuildFile section */
/* Begin PBXCopyFilesBuildPhase section */
@@ -61,6 +71,18 @@
/* Begin PBXFileReference section */
134814201AA4EA6300B7C361 /* libVisionCamera.a */ = {isa = PBXFileReference; explicitFileType = archive.ar; includeInIndex = 0; path = libVisionCamera.a; sourceTree = BUILT_PRODUCTS_DIR; };
B80C0DFE260BDD97001699AB /* FrameProcessorPluginRegistry.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FrameProcessorPluginRegistry.h; sourceTree = "<group>"; };
B80C0DFF260BDDF7001699AB /* FrameProcessorPluginRegistry.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = FrameProcessorPluginRegistry.mm; sourceTree = "<group>"; };
B80D67A825FA25380008FE8D /* FrameProcessorCallback.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FrameProcessorCallback.h; sourceTree = "<group>"; };
B8103E1B25FF553B007A1684 /* FrameProcessorUtils.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = FrameProcessorUtils.mm; sourceTree = "<group>"; };
B8103E1E25FF5550007A1684 /* FrameProcessorUtils.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FrameProcessorUtils.h; sourceTree = "<group>"; };
B8103E5725FF56F0007A1684 /* Frame.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = Frame.h; sourceTree = "<group>"; };
B81D41EF263C86F900B041FD /* JSIUtils.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = JSIUtils.h; sourceTree = "<group>"; };
B82FBA942614B69D00909718 /* RCTBridge+runOnJS.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = "RCTBridge+runOnJS.h"; sourceTree = "<group>"; };
B82FBA952614B69D00909718 /* RCTBridge+runOnJS.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = "RCTBridge+runOnJS.mm"; sourceTree = "<group>"; };
B84760A22608EE38004C3180 /* FrameHostObject.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FrameHostObject.h; sourceTree = "<group>"; };
B84760A52608EE7C004C3180 /* FrameHostObject.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = FrameHostObject.mm; sourceTree = "<group>"; };
B84760DE2608F57D004C3180 /* CameraQueues.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = CameraQueues.swift; sourceTree = "<group>"; };
B86DC970260E2D5200FB17B2 /* AVAudioSession+trySetAllowHaptics.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "AVAudioSession+trySetAllowHaptics.swift"; sourceTree = "<group>"; };
B86DC973260E310600FB17B2 /* CameraView+AVAudioSession.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "CameraView+AVAudioSession.swift"; sourceTree = "<group>"; };
B86DC976260E315100FB17B2 /* CameraView+AVCaptureSession.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "CameraView+AVCaptureSession.swift"; sourceTree = "<group>"; };
@@ -68,7 +90,6 @@
B887515D25E0102000DB86D6 /* CameraView+RecordVideo.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "CameraView+RecordVideo.swift"; sourceTree = "<group>"; };
B887515E25E0102000DB86D6 /* CameraBridge.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CameraBridge.h; sourceTree = "<group>"; };
B887515F25E0102000DB86D6 /* CameraViewManager.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = CameraViewManager.m; sourceTree = "<group>"; };
B887516025E0102000DB86D6 /* VideoCaptureDelegate.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = VideoCaptureDelegate.swift; sourceTree = "<group>"; };
B887516225E0102000DB86D6 /* Collection+safe.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "Collection+safe.swift"; sourceTree = "<group>"; };
B887516325E0102000DB86D6 /* AVCaptureDevice+neutralZoom.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureDevice+neutralZoom.swift"; sourceTree = "<group>"; };
B887516425E0102000DB86D6 /* AVCaptureDevice.Format+isBetterThan.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureDevice.Format+isBetterThan.swift"; sourceTree = "<group>"; };
@@ -78,7 +99,7 @@
B887516825E0102000DB86D6 /* AVCapturePhotoOutput+mirror.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCapturePhotoOutput+mirror.swift"; sourceTree = "<group>"; };
B887516925E0102000DB86D6 /* AVCaptureDevice.Format+matchesFilter.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureDevice.Format+matchesFilter.swift"; sourceTree = "<group>"; };
B887516A25E0102000DB86D6 /* AVCaptureDevice.Format+toDictionary.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureDevice.Format+toDictionary.swift"; sourceTree = "<group>"; };
B887516B25E0102000DB86D6 /* AVCaptureMovieFileOutput+mirror.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureMovieFileOutput+mirror.swift"; sourceTree = "<group>"; };
B887516B25E0102000DB86D6 /* AVCaptureVideoDataOutput+mirror.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureVideoDataOutput+mirror.swift"; sourceTree = "<group>"; };
B887516E25E0102000DB86D6 /* MakeReactError.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = MakeReactError.swift; sourceTree = "<group>"; };
B887516F25E0102000DB86D6 /* ReactLogger.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = ReactLogger.swift; sourceTree = "<group>"; };
B887517025E0102000DB86D6 /* Promise.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = Promise.swift; sourceTree = "<group>"; };
@@ -100,6 +121,16 @@
B887518225E0102000DB86D6 /* CameraView+Zoom.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "CameraView+Zoom.swift"; sourceTree = "<group>"; };
B887518325E0102000DB86D6 /* CameraError.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = CameraError.swift; sourceTree = "<group>"; };
B887518425E0102000DB86D6 /* CameraView.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = CameraView.swift; sourceTree = "<group>"; };
B88873E5263D46C7008B1D0E /* FrameProcessorPlugin.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FrameProcessorPlugin.h; sourceTree = "<group>"; };
B8994E6B263F03E100069589 /* JSIUtils.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = JSIUtils.mm; sourceTree = "<group>"; };
B8A751D62609E4980011C623 /* FrameProcessorRuntimeManager.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FrameProcessorRuntimeManager.h; sourceTree = "<group>"; };
B8A751D72609E4B30011C623 /* FrameProcessorRuntimeManager.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = FrameProcessorRuntimeManager.mm; sourceTree = "<group>"; };
B8D22CDB2642DB4D00234472 /* AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift"; sourceTree = "<group>"; };
B8DB3BC7263DC28C004C18D7 /* AVAssetWriter.Status+descriptor.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "AVAssetWriter.Status+descriptor.swift"; sourceTree = "<group>"; };
B8DB3BC9263DC4D8004C18D7 /* RecordingSession.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = RecordingSession.swift; sourceTree = "<group>"; };
B8DB3BCB263DC97E004C18D7 /* AVFileType+descriptor.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "AVFileType+descriptor.swift"; sourceTree = "<group>"; };
B8DCF09125EA7BEE00EA5C72 /* SpeedChecker.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = SpeedChecker.h; sourceTree = "<group>"; };
B8DCF14425EA817D00EA5C72 /* MakeJSIRuntime.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = MakeJSIRuntime.h; sourceTree = "<group>"; };
/* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */
@@ -124,7 +155,9 @@
58B511D21A9E6C8500147676 = {
isa = PBXGroup;
children = (
B8DCF2D725EA940700EA5C72 /* Frame Processor */,
B887515E25E0102000DB86D6 /* CameraBridge.h */,
B84760DE2608F57D004C3180 /* CameraQueues.swift */,
B887518325E0102000DB86D6 /* CameraError.swift */,
B887518425E0102000DB86D6 /* CameraView.swift */,
B86DC976260E315100FB17B2 /* CameraView+AVCaptureSession.swift */,
@@ -135,11 +168,12 @@
B887518225E0102000DB86D6 /* CameraView+Zoom.swift */,
B887515F25E0102000DB86D6 /* CameraViewManager.m */,
B887518125E0102000DB86D6 /* CameraViewManager.swift */,
B8DB3BC9263DC4D8004C18D7 /* RecordingSession.swift */,
B8DCF08F25EA7BEE00EA5C72 /* cpp */,
B887516125E0102000DB86D6 /* Extensions */,
B887517225E0102000DB86D6 /* Parsers */,
B887515C25E0102000DB86D6 /* PhotoCaptureDelegate.swift */,
B887516D25E0102000DB86D6 /* React Utils */,
B887516025E0102000DB86D6 /* VideoCaptureDelegate.swift */,
134814211AA4EA7D00B7C361 /* Products */,
);
sourceTree = "<group>";
@@ -147,6 +181,7 @@
B887516125E0102000DB86D6 /* Extensions */ = {
isa = PBXGroup;
children = (
B8D22CDB2642DB4D00234472 /* AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift */,
B86DC970260E2D5200FB17B2 /* AVAudioSession+trySetAllowHaptics.swift */,
B887516325E0102000DB86D6 /* AVCaptureDevice+neutralZoom.swift */,
B887516425E0102000DB86D6 /* AVCaptureDevice.Format+isBetterThan.swift */,
@@ -156,7 +191,7 @@
B887516825E0102000DB86D6 /* AVCapturePhotoOutput+mirror.swift */,
B887516925E0102000DB86D6 /* AVCaptureDevice.Format+matchesFilter.swift */,
B887516A25E0102000DB86D6 /* AVCaptureDevice.Format+toDictionary.swift */,
B887516B25E0102000DB86D6 /* AVCaptureMovieFileOutput+mirror.swift */,
B887516B25E0102000DB86D6 /* AVCaptureVideoDataOutput+mirror.swift */,
B887516225E0102000DB86D6 /* Collection+safe.swift */,
);
path = Extensions;
@@ -168,6 +203,10 @@
B887516E25E0102000DB86D6 /* MakeReactError.swift */,
B887516F25E0102000DB86D6 /* ReactLogger.swift */,
B887517025E0102000DB86D6 /* Promise.swift */,
B82FBA942614B69D00909718 /* RCTBridge+runOnJS.h */,
B82FBA952614B69D00909718 /* RCTBridge+runOnJS.mm */,
B81D41EF263C86F900B041FD /* JSIUtils.h */,
B8994E6B263F03E100069589 /* JSIUtils.mm */,
);
path = "React Utils";
sourceTree = "<group>";
@@ -176,6 +215,7 @@
isa = PBXGroup;
children = (
B887517325E0102000DB86D6 /* EnumParserError.swift */,
B8DB3BC7263DC28C004C18D7 /* AVAssetWriter.Status+descriptor.swift */,
B887517425E0102000DB86D6 /* AVCaptureVideoStabilizationMode+descriptor.swift */,
B887517525E0102000DB86D6 /* AVVideoCodecType+descriptor.swift */,
B887517625E0102000DB86D6 /* AVCaptureSession.Preset+descriptor.swift */,
@@ -187,10 +227,39 @@
B887517D25E0102000DB86D6 /* AVCaptureColorSpace+descriptor.swift */,
B887517E25E0102000DB86D6 /* AVCaptureDevice.FlashMode+descriptor.swift */,
B887517F25E0102000DB86D6 /* AVCaptureDevice.Format.AutoFocusSystem+descriptor.swift */,
B8DB3BCB263DC97E004C18D7 /* AVFileType+descriptor.swift */,
);
path = Parsers;
sourceTree = "<group>";
};
B8DCF08F25EA7BEE00EA5C72 /* cpp */ = {
isa = PBXGroup;
children = (
B8DCF14425EA817D00EA5C72 /* MakeJSIRuntime.h */,
B8DCF09125EA7BEE00EA5C72 /* SpeedChecker.h */,
);
name = cpp;
path = ../cpp;
sourceTree = "<group>";
};
B8DCF2D725EA940700EA5C72 /* Frame Processor */ = {
isa = PBXGroup;
children = (
B80D67A825FA25380008FE8D /* FrameProcessorCallback.h */,
B8103E1E25FF5550007A1684 /* FrameProcessorUtils.h */,
B8103E1B25FF553B007A1684 /* FrameProcessorUtils.mm */,
B8103E5725FF56F0007A1684 /* Frame.h */,
B84760A22608EE38004C3180 /* FrameHostObject.h */,
B84760A52608EE7C004C3180 /* FrameHostObject.mm */,
B8A751D62609E4980011C623 /* FrameProcessorRuntimeManager.h */,
B8A751D72609E4B30011C623 /* FrameProcessorRuntimeManager.mm */,
B80C0DFE260BDD97001699AB /* FrameProcessorPluginRegistry.h */,
B80C0DFF260BDDF7001699AB /* FrameProcessorPluginRegistry.mm */,
B88873E5263D46C7008B1D0E /* FrameProcessorPlugin.h */,
);
path = "Frame Processor";
sourceTree = "<group>";
};
/* End PBXGroup section */
/* Begin PBXNativeTarget section */
@@ -198,8 +267,8 @@
isa = PBXNativeTarget;
buildConfigurationList = 58B511EF1A9E6C8500147676 /* Build configuration list for PBXNativeTarget "VisionCamera" */;
buildPhases = (
B81F6C7625E515810008974A /* Run SwiftLint */,
B80D6CAB25F770FE006F2CB7 /* Run SwiftFormat */,
B81F6C7625E515810008974A /* Run SwiftLint */,
58B511D71A9E6C8500147676 /* Sources */,
58B511D81A9E6C8500147676 /* Frameworks */,
58B511D91A9E6C8500147676 /* CopyFiles */,
@@ -291,6 +360,7 @@
files = (
B86DC974260E310600FB17B2 /* CameraView+AVAudioSession.swift in Sources */,
B887518625E0102000DB86D6 /* CameraView+RecordVideo.swift in Sources */,
B8DB3BCA263DC4D8004C18D7 /* RecordingSession.swift in Sources */,
B88751A225E0102000DB86D6 /* AVCaptureColorSpace+descriptor.swift in Sources */,
B887518925E0102000DB86D6 /* Collection+safe.swift in Sources */,
B887519125E0102000DB86D6 /* AVCaptureDevice.Format+toDictionary.swift in Sources */,
@@ -299,6 +369,7 @@
B887518C25E0102000DB86D6 /* AVCaptureDevice+isMultiCam.swift in Sources */,
B887518D25E0102000DB86D6 /* AVCaptureDevice+physicalDevices.swift in Sources */,
B887519625E0102000DB86D6 /* Promise.swift in Sources */,
B8DB3BC8263DC28C004C18D7 /* AVAssetWriter.Status+descriptor.swift in Sources */,
B887518725E0102000DB86D6 /* CameraViewManager.m in Sources */,
B88751A925E0102000DB86D6 /* CameraView.swift in Sources */,
B887519925E0102000DB86D6 /* AVCaptureVideoStabilizationMode+descriptor.swift in Sources */,
@@ -308,25 +379,33 @@
B88751A725E0102000DB86D6 /* CameraView+Zoom.swift in Sources */,
B887518525E0102000DB86D6 /* PhotoCaptureDelegate.swift in Sources */,
B887518B25E0102000DB86D6 /* AVCaptureDevice.Format+isBetterThan.swift in Sources */,
B84760A62608EE7C004C3180 /* FrameHostObject.mm in Sources */,
B8103E1C25FF553B007A1684 /* FrameProcessorUtils.mm in Sources */,
B887518E25E0102000DB86D6 /* AVFrameRateRange+includes.swift in Sources */,
B88751A125E0102000DB86D6 /* AVCaptureDevice.Position+descriptor.swift in Sources */,
B86DC977260E315100FB17B2 /* CameraView+AVCaptureSession.swift in Sources */,
B887518A25E0102000DB86D6 /* AVCaptureDevice+neutralZoom.swift in Sources */,
B88751A325E0102000DB86D6 /* AVCaptureDevice.FlashMode+descriptor.swift in Sources */,
B8A751D82609E4B30011C623 /* FrameProcessorRuntimeManager.mm in Sources */,
B887519A25E0102000DB86D6 /* AVVideoCodecType+descriptor.swift in Sources */,
B88751A825E0102000DB86D6 /* CameraError.swift in Sources */,
B887519225E0102000DB86D6 /* AVCaptureMovieFileOutput+mirror.swift in Sources */,
B887519225E0102000DB86D6 /* AVCaptureVideoDataOutput+mirror.swift in Sources */,
B88751A625E0102000DB86D6 /* CameraViewManager.swift in Sources */,
B887519F25E0102000DB86D6 /* AVCaptureDevice.DeviceType+descriptor.swift in Sources */,
B8D22CDC2642DB4D00234472 /* AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift in Sources */,
B82FBA962614B69D00909718 /* RCTBridge+runOnJS.mm in Sources */,
B84760DF2608F57D004C3180 /* CameraQueues.swift in Sources */,
B887519025E0102000DB86D6 /* AVCaptureDevice.Format+matchesFilter.swift in Sources */,
B887518F25E0102000DB86D6 /* AVCapturePhotoOutput+mirror.swift in Sources */,
B88751A425E0102000DB86D6 /* AVCaptureDevice.Format.AutoFocusSystem+descriptor.swift in Sources */,
B8DB3BCC263DC97E004C18D7 /* AVFileType+descriptor.swift in Sources */,
B88751A025E0102000DB86D6 /* AVAuthorizationStatus+descriptor.swift in Sources */,
B80C0E00260BDDF7001699AB /* FrameProcessorPluginRegistry.mm in Sources */,
B887519C25E0102000DB86D6 /* AVCaptureDevice.TorchMode+descriptor.swift in Sources */,
B8994E6C263F03E100069589 /* JSIUtils.mm in Sources */,
B88751A525E0102000DB86D6 /* CameraView+Focus.swift in Sources */,
B86DC971260E2D5200FB17B2 /* AVAudioSession+trySetAllowHaptics.swift in Sources */,
B887519E25E0102000DB86D6 /* AVCapturePhotoOutput.QualityPrioritization+descriptor.swift in Sources */,
B887518825E0102000DB86D6 /* VideoCaptureDelegate.swift in Sources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
@@ -436,6 +515,7 @@
58B511F01A9E6C8500147676 /* Debug */ = {
isa = XCBuildConfiguration;
buildSettings = {
DEFINES_MODULE = YES;
HEADER_SEARCH_PATHS = (
"$(inherited)",
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include,
@@ -449,12 +529,14 @@
SWIFT_OBJC_BRIDGING_HEADER = CameraBridge.h;
SWIFT_OPTIMIZATION_LEVEL = "-Onone";
SWIFT_VERSION = 5.2;
USER_HEADER_SEARCH_PATHS = "\"$(SRCROOT)/../cpp\"/**";
};
name = Debug;
};
58B511F11A9E6C8500147676 /* Release */ = {
isa = XCBuildConfiguration;
buildSettings = {
DEFINES_MODULE = YES;
HEADER_SEARCH_PATHS = (
"$(inherited)",
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include,
@@ -467,6 +549,7 @@
SKIP_INSTALL = YES;
SWIFT_OBJC_BRIDGING_HEADER = CameraBridge.h;
SWIFT_VERSION = 5.2;
USER_HEADER_SEARCH_PATHS = "\"$(SRCROOT)/../cpp\"/**";
};
name = Release;
};