feat: Full Android rewrite (CameraX -> Camera2) (#1674)
* Nuke CameraX * fix: Run View Finder on UI Thread * Open Camera, set up Threads * fix init * Mirror if needed * Try PreviewView * Use max resolution * Add `hardwareLevel` property * Check if output type is supported * Replace `frameRateRanges` with `minFps` and `maxFps` * Remove `isHighestPhotoQualitySupported` * Remove `colorSpace` The native platforms will use the best / most accurate colorSpace by default anyways. * HDR * Check from format * fix * Remove `supportsParallelVideoProcessing` * Correctly return video/photo sizes on Android now. Finally * Log all Device props * Log if optimized usecase is used * Cleanup * Configure Camera Input only once * Revert "Configure Camera Input only once" This reverts commit 0fd6c03f54c7566cb5592053720c4a8743aba92e. * Extract Camera configuration * Try to reconfigure all * Hook based * Properly set up `CameraSession` * Delete unused * fix: Fix recreate when outputs change * Update NativePreviewView.kt * Use callback for closing * Catch CameraAccessException * Finally got it stable * Remove isMirrored * Implement `takePhoto()` * Add ExifInterface library * Run findViewById on UI Thread * Add Photo Output Surface to takePhoto * Fix Video Stabilization Modes * Optimize Imports * More logs * Update CameraSession.kt * Close Image * Use separate Executor in CameraQueue * Delete hooks * Use same Thread again * If opened, call error * Update CameraSession.kt * Log HW level * fix: Don't enable Stream Use Case if it's not 100% supported * Move some stuff * Cleanup PhotoOutputSynchronizer * Try just open in suspend fun * Some synchronization fixes * fix logs * Update CameraDevice+createCaptureSession.kt * Update CameraDevice+createCaptureSession.kt * fixes * fix: Use Snapshot Template for speed capture prio * Use PREVIEW template for repeating request * Use `TEMPLATE_RECORD` if video use-case is attached * Use `isRunning` flag * Recreate session everytime on active/inactive * Lazily get values in capture session * Stability * Rebuild session if outputs change * Set `didOutputsChange` back to false * Capture first in lock * Try * kinda fix it? idk * fix: Keep Outputs * Refactor into single method * Update CameraView.kt * Use Enums for type safety * Implement Orientation (I think) * Move RefCount management to Java (Frame) * Don't crash when dropping a Frame * Prefer Devices with higher max resolution * Prefer multi-cams * Use FastImage for Media Page * Return orientation in takePhoto() * Load orientation from EXIF Data * Add `isMirrored` props and documentation for PhotoFile * fix: Return `not-determined` on Android * Update CameraViewModule.kt * chore: Upgrade packages * fix: Fix Metro Config * Cleanup config * Properly mirror Images on save * Prepare MediaRecorder * Start/Stop MediaRecorder * Remove `takeSnapshot()` It no longer works on Android and never worked on iOS. Users could use useFrameProcessor to take a Snapshot * Use `MediaCodec` * Move to `VideoRecording` class * Cleanup Snapshot * Create `SkiaPreviewView` hybrid class * Create OpenGL context * Create `SkiaPreviewView` * Fix texture creation missing context * Draw red frame * Somehow get it working * Add Skia CMake setup * Start looping * Init OpenGL * Refactor into `SkiaRenderer` * Cleanup PreviewSize * Set up * Only re-render UI if there is a new Frame * Preview * Fix init * Try rendering Preview * Update SkiaPreviewView.kt * Log version * Try using Skia (fail) * Drawwwww!!!!!!!!!! 
🎉 * Use Preview Size * Clear first * Refactor into SkiaRenderer * Add `previewType: "none"` on iOS * Simplify a lot * Draw Camera? For some reason? I have no idea anymore * Fix OpenGL errors * Got it kinda working again? * Actually draw Frame woah * Clean up code * Cleanup * Update on main * Synchronize render calls * holy shit * Update SkiaRenderer.cpp * Update SkiaRenderer.cpp * Refactor * Update SkiaRenderer.cpp * Check for `NO_INPUT_TEXTURE`^ * Post & Wait * Set input size * Add Video back again * Allow session without preview * Convert JPEG to byte[] * feat: Use `ImageReader` and use YUV Image Buffers in Skia Context (#1689) * Try to pass YUV Buffers as Pixmaps * Create pixmap! * Clean up * Render to preview * Only render if we have an output surface * Update SkiaRenderer.cpp * Fix Y+U+V sampling code * Cleanup * Fix Semaphore 0 * Use 4:2:0 YUV again idk * Update SkiaRenderer.h * Set minSdk to 26 * Set surface * Revert "Set minSdk to 26" This reverts commit c4085b7c16c628532e5c2d68cf7ed11c751d0b48. * Set previewType * feat: Video Recording with Camera2 (#1691) * Rename * Update CameraSession.kt * Use `SurfaceHolder` instead of `SurfaceView` for output * Update CameraOutputs.kt * Update CameraSession.kt * fix: Fix crash when Preview is null * Check if snapshot capture is supported * Update RecordingSession.kt * S * Use `MediaRecorder` * Make audio optional * Add Torch * Output duration * Update RecordingSession.kt * Start RecordingSession * logs * More log * Base for preparing pass-through Recording * Use `ImageWriter` to append Images to the Recording Surface * Stream PRIVATE GPU_SAMPLED_IMAGE Images * Add flags * Close session on stop * Allow customizing `videoCodec` and `fileType` * Enable Torch * Fix Torch Mode * Fix comparing outputs with hashCode * Update CameraSession.kt * Correctly pass along Frame Processor * fix: Use AUDIO_BIT_RATE of 16 * 44,1Khz * Use CAMCORDER instead of MIC microphone * Use 1 channel * fix: Use `Orientation` * Add `native` PixelFormat * Update iOS to latest Skia integration * feat: Add `pixelFormat` property to Camera * Catch error in configureSession * Fix JPEG format * Clean up best match finder * Update CameraDeviceDetails.kt * Clamp sizes by maximum CamcorderProfile size * Remove `getAvailableVideoCodecs` * chore: release 3.0.0-rc.5 * Use maximum video size of RECORD as default * Update CameraDeviceDetails.kt * Add a todo * Add JSON device to issue report * Prefer `full` devices and flash * Lock to 30 FPS on Samsung * Implement Zoom * Refactor * Format -> PixelFormat * fix: Feat `pixelFormat` -> `pixelFormats` * Update TROUBLESHOOTING.mdx * Format * fix: Implement `zoom` for Photo Capture * fix: Don't run if `isActive` is `false` * fix: Call `examplePlugin(frame)` * fix: Fix Flash * fix: Use `react-native-worklets-core`! * fix: Fix import
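Several items above flatten the JS-facing format API: the old `frameRateRanges` array is replaced by plain `minFps`/`maxFps` numbers (see the `FormatError` and `toDictionary()` hunks below). A minimal sketch of the validation this enables, using a hypothetical `CameraFormat` struct that mirrors the dictionary the native side now emits:

```swift
// Hypothetical CameraFormat struct mirroring the "minFps"/"maxFps" fields the
// native side now reports instead of a "frameRateRanges" array.
struct CameraFormat {
  let minFps: Double
  let maxFps: Double
}

// The reworded FormatError message ("lower than `format.maxFps` but higher than
// `format.minFps`") boils down to this range check.
func isValid(fps: Double, for format: CameraFormat) -> Bool {
  return fps >= format.minFps && fps <= format.maxFps
}

let format = CameraFormat(minFps: 1, maxFps: 60)
print(isValid(fps: 30, for: format))  // true
print(isValid(fps: 120, for: format)) // false
```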
@@ -74,7 +74,7 @@ enum DeviceError: String {
case configureError = "configuration-error"
case noDevice = "no-device"
case invalid = "invalid-device"
case torchUnavailable = "torch-unavailable"
case flashUnavailable = "flash-unavailable"
case microphoneUnavailable = "microphone-unavailable"
case lowLightBoostNotSupported = "low-light-boost-not-supported"
case focusNotSupported = "focus-not-supported"
@@ -92,8 +92,8 @@ enum DeviceError: String {
return "No device was set! Use `getAvailableCameraDevices()` to select a suitable Camera device."
case .invalid:
return "The given Camera device was invalid. Use `getAvailableCameraDevices()` to select a suitable Camera device."
case .torchUnavailable:
return "The current camera device does not have a torch."
case .flashUnavailable:
return "The Camera Device does not have a flash unit! Make sure you select a device where `hasFlash`/`hasTorch` is true!"
case .lowLightBoostNotSupported:
return "The currently selected camera device does not support low-light boost! Make sure you select a device where `supportsLowLightBoost` is true!"
case .focusNotSupported:
@@ -112,7 +112,6 @@ enum FormatError {
case invalidFps(fps: Int)
case invalidHdr
case invalidFormat
case invalidColorSpace(colorSpace: String)

var code: String {
switch self {
@@ -122,8 +121,6 @@ enum FormatError {
return "invalid-fps"
case .invalidHdr:
return "invalid-hdr"
case .invalidColorSpace:
return "invalid-color-space"
}
}

@@ -132,12 +129,9 @@ enum FormatError {
case .invalidFormat:
return "The given format was invalid. Did you check if the current device supports the given format by using `getAvailableCameraDevices(...)`?"
case let .invalidFps(fps):
return "The given FPS were not valid for the currently selected format. Make sure you select a format which `frameRateRanges` includes \(fps) FPS!"
return "The given format cannot run at \(fps) FPS! Make sure your FPS is lower than `format.maxFps` but higher than `format.minFps`."
case .invalidHdr:
return "The currently selected format does not support HDR capture! Make sure you select a format which `frameRateRanges` includes `supportsPhotoHDR`!"
case let .invalidColorSpace(colorSpace):
return "The currently selected format does not support the colorSpace \"\(colorSpace)\"! " +
"Make sure you select a format which `colorSpaces` includes \"\(colorSpace)\"!"
return "The currently selected format does not support HDR capture! Make sure you select a format which includes `supportsPhotoHDR`!"
}
}
}
@@ -265,7 +259,7 @@ enum SystemError: String {
case .skiaUnavailable:
return "Skia Integration is unavailable - is @shopify/react-native-skia installed?"
case .frameProcessorsUnavailable:
return "Frame Processors are unavailable - is react-native-worklets installed?"
return "Frame Processors are unavailable - is react-native-worklets-core installed?"
}
}
}
@@ -74,8 +74,10 @@ extension CameraView {
photoOutput = AVCapturePhotoOutput()

if enableHighQualityPhotos?.boolValue == true {
// TODO: In iOS 16 this will be removed in favor of maxPhotoDimensions.
photoOutput!.isHighResolutionCaptureEnabled = true
if #available(iOS 13.0, *) {
// TODO: Test if this actually does any fusion or if this just calls the captureOutput twice. If the latter, remove it.
photoOutput!.isVirtualDeviceConstituentPhotoDeliveryEnabled = photoOutput!.isVirtualDeviceConstituentPhotoDeliverySupported
photoOutput!.maxPhotoQualityPrioritization = .quality
} else {
@@ -113,12 +115,21 @@ extension CameraView {
videoOutput!.setSampleBufferDelegate(self, queue: CameraQueues.videoQueue)
videoOutput!.alwaysDiscardsLateVideoFrames = false

if previewType == "skia" {
// If the PreviewView is a Skia view, we need to use the RGB format since Skia works in the RGB colorspace instead of YUV.
// This does introduce a performance overhead, but it's inevitable since Skia would internally convert
// YUV frames to RGB anyways since all Shaders and draw operations operate in the RGB space.
if let pixelFormat = pixelFormat as? String {
let defaultFormat = CMFormatDescriptionGetMediaSubType(videoDeviceInput!.device.activeFormat.formatDescription)
var pixelFormatType: OSType = defaultFormat
switch pixelFormat {
case "yuv":
pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
case "rgb":
pixelFormatType = kCVPixelFormatType_32BGRA
case "native":
pixelFormatType = defaultFormat
default:
invokeOnError(.parameter(.invalid(unionName: "pixelFormat", receivedValue: pixelFormat)))
}
videoOutput!.videoSettings = [
String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_32BGRA, // default: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
String(kCVPixelBufferPixelFormatTypeKey): pixelFormatType,
]
}
captureSession.addOutput(videoOutput!)
@@ -134,7 +145,7 @@ extension CameraView {
// pragma MARK: Configure Device

/**
Configures the Video Device with the given FPS, HDR and ColorSpace.
Configures the Video Device with the given FPS and HDR modes.
*/
final func configureDevice() {
ReactLogger.log(level: .info, message: "Configuring Device...")
@@ -182,14 +193,6 @@ extension CameraView {
device.automaticallyEnablesLowLightBoostWhenAvailable = lowLightBoost!.boolValue
}
}
if let colorSpace = colorSpace as String? {
guard let avColorSpace = try? AVCaptureColorSpace(string: colorSpace),
device.activeFormat.supportedColorSpaces.contains(avColorSpace) else {
invokeOnError(.format(.invalidColorSpace(colorSpace: colorSpace)))
return
}
device.activeColorSpace = avColorSpace
}

device.unlockForConfiguration()
ReactLogger.log(level: .info, message: "Device successfully configured!")
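The hunk above maps the `pixelFormat` union string to a CoreVideo FourCC before assigning `videoSettings`. As a hedged sketch (not part of this diff), the chosen FourCC could additionally be checked against the output's supported formats so an unsupported value falls back to the sensor default; the `apply` helper below is hypothetical:

```swift
import AVFoundation

// Hypothetical helper (not part of this PR): only apply a pixel format the output
// actually supports, otherwise report failure so the caller can fall back.
func apply(pixelFormatType: OSType, to videoOutput: AVCaptureVideoDataOutput) -> Bool {
  guard videoOutput.availableVideoPixelFormatTypes.contains(pixelFormatType) else {
    return false
  }
  videoOutput.videoSettings = [
    String(kCVPixelBufferPixelFormatTypeKey): pixelFormatType,
  ]
  return true
}
```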
@@ -21,24 +21,30 @@ extension CameraView {
#endif

public func setupPreviewView() {
if previewType == "skia" {
switch previewType {
case "none":
previewView?.removeFromSuperview()
previewView = nil
case "native":
// Normal iOS PreviewView is lighter and more performant (YUV Format, GPU only)
if previewView is NativePreviewView { return }
previewView?.removeFromSuperview()
previewView = NativePreviewView(frame: frame, session: captureSession)
addSubview(previewView!)
case "skia":
// Skia Preview View allows user to draw onto a Frame in a Frame Processor
#if VISION_CAMERA_ENABLE_SKIA
if previewView is SkiaPreviewView { return }
previewView?.removeFromSuperview()
previewView = SkiaPreviewView(frame: frame, skiaRenderer: getSkiaRenderer())
addSubview(previewView!)
#else
invokeOnError(.system(.skiaUnavailable))
return
#endif
} else {
// Normal iOS PreviewView is lighter and more performant (YUV Format, GPU only)
if previewView is NativePreviewView { return }
previewView?.removeFromSuperview()
previewView = NativePreviewView(frame: frame, session: captureSession)
default:
invokeOnError(.parameter(.invalid(unionName: "previewType", receivedValue: previewType as String)))
}

addSubview(previewView!)
}

internal func setupFpsGraph() {
@@ -24,26 +24,12 @@ extension CameraView {

ReactLogger.log(level: .info, message: "Capturing photo...")

var format: [String: Any]?
// photo codec
if let photoCodecString = options["photoCodec"] as? String {
guard let photoCodec = AVVideoCodecType(withString: photoCodecString) else {
promise.reject(error: .parameter(.invalid(unionName: "PhotoCodec", receivedValue: photoCodecString)))
return
}
if photoOutput.availablePhotoCodecTypes.contains(photoCodec) {
format = [AVVideoCodecKey: photoCodec]
} else {
promise.reject(error: .capture(.invalidPhotoCodec))
return
}
}

// Create photo settings
let photoSettings = AVCapturePhotoSettings(format: format)
let photoSettings = AVCapturePhotoSettings()

// default, overridable settings if high quality capture was enabled
if self.enableHighQualityPhotos?.boolValue == true {
// TODO: On iOS 16+ this will be removed in favor of maxPhotoDimensions.
photoSettings.isHighResolutionPhotoEnabled = true
if #available(iOS 13.0, *) {
photoSettings.photoQualityPrioritization = .quality
@@ -32,7 +32,7 @@ extension CameraView {
return
} else {
// torch mode is .auto or .on, but no torch is available.
invokeOnError(.device(.torchUnavailable))
invokeOnError(.device(.flashUnavailable))
return
}
}
@@ -26,11 +26,11 @@ private let propsThatRequireReconfiguration = ["cameraId",
"photo",
"video",
"enableFrameProcessor",
"pixelFormat",
"previewType"]
private let propsThatRequireDeviceReconfiguration = ["fps",
"hdr",
"lowLightBoost",
"colorSpace"]
"lowLightBoost"]

// MARK: - CameraView

@@ -46,12 +46,12 @@ public final class CameraView: UIView {
@objc var video: NSNumber? // nullable bool
@objc var audio: NSNumber? // nullable bool
@objc var enableFrameProcessor = false
@objc var pixelFormat: NSString?
// props that require format reconfiguring
@objc var format: NSDictionary?
@objc var fps: NSNumber?
@objc var hdr: NSNumber? // nullable bool
@objc var lowLightBoost: NSNumber? // nullable bool
@objc var colorSpace: NSString?
@objc var orientation: NSString?
// other props
@objc var isActive = false
@@ -59,7 +59,7 @@ public final class CameraView: UIView {
@objc var zoom: NSNumber = 1.0 // in "factor"
@objc var enableFpsGraph = false
@objc var videoStabilizationMode: NSString?
@objc var previewType: NSString?
@objc var previewType: NSString = "none"
// events
@objc var onInitialized: RCTDirectEventBlock?
@objc var onError: RCTDirectEventBlock?
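The two prop lists above decide whether a changed prop rebuilds the whole capture session or only reconfigures the device. A minimal sketch of how such lists are typically consumed on prop updates (the helper below is hypothetical; in VisionCamera the equivalent check lives in `CameraView`'s `didSetProps`, and the session list shown in the hunk is truncated):

```swift
// Hypothetical helper illustrating how the two arrays are consumed.
func reconfigurationNeeded(for changedProps: [String]) -> (session: Bool, device: Bool) {
  let sessionProps = ["cameraId", "photo", "video", "enableFrameProcessor", "pixelFormat", "previewType"]
  let deviceProps = ["fps", "hdr", "lowLightBoost"]

  let session = changedProps.contains(where: sessionProps.contains)
  // Rebuilding the session implies the device gets configured again as well.
  let device = session || changedProps.contains(where: deviceProps.contains)
  return (session, device)
}

print(reconfigurationNeeded(for: ["fps"]))         // (session: false, device: true)
print(reconfigurationNeeded(for: ["pixelFormat"])) // (session: true, device: true)
```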
@@ -37,7 +37,6 @@ RCT_EXPORT_VIEW_PROPERTY(format, NSDictionary);
RCT_EXPORT_VIEW_PROPERTY(fps, NSNumber);
RCT_EXPORT_VIEW_PROPERTY(hdr, NSNumber); // nullable bool
RCT_EXPORT_VIEW_PROPERTY(lowLightBoost, NSNumber); // nullable bool
RCT_EXPORT_VIEW_PROPERTY(colorSpace, NSString);
RCT_EXPORT_VIEW_PROPERTY(videoStabilizationMode, NSString);
// other props
RCT_EXPORT_VIEW_PROPERTY(torch, NSString);
@@ -61,6 +60,5 @@ RCT_EXTERN_METHOD(focus:(nonnull NSNumber *)node point:(NSDictionary *)point res

// Static Methods
RCT_EXTERN__BLOCKING_SYNCHRONOUS_METHOD(installFrameProcessorBindings);
RCT_EXTERN_METHOD(getAvailableVideoCodecs:(nonnull NSNumber *)node fileType:(NSString *)fileType resolve:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject);

@end
@@ -79,26 +79,6 @@ final class CameraViewManager: RCTViewManager {
component.focus(point: CGPoint(x: x.doubleValue, y: y.doubleValue), promise: promise)
}

@objc
final func getAvailableVideoCodecs(_ node: NSNumber, fileType: String?, resolve: @escaping RCTPromiseResolveBlock, reject: @escaping RCTPromiseRejectBlock) {
withPromise(resolve: resolve, reject: reject) {
let component = getCameraView(withTag: node)
guard let videoOutput = component.videoOutput else {
throw CameraError.session(SessionError.cameraNotReady)
}

var parsedFileType = AVFileType.mov
if fileType != nil {
guard let parsed = try? AVFileType(withString: fileType!) else {
throw CameraError.parameter(ParameterError.invalid(unionName: "fileType", receivedValue: fileType!))
}
parsedFileType = parsed
}

return videoOutput.availableVideoCodecTypesForAssetWriter(writingTo: parsedFileType).map(\.descriptor)
}
}

@objc
final func getAvailableCameraDevices(_ resolve: @escaping RCTPromiseResolveBlock, reject: @escaping RCTPromiseRejectBlock) {
withPromise(resolve: resolve, reject: reject) {
@@ -117,11 +97,11 @@ final class CameraViewManager: RCTViewManager {
"neutralZoom": $0.neutralZoomFactor,
"maxZoom": $0.maxAvailableVideoZoomFactor,
"isMultiCam": $0.isMultiCam,
"supportsParallelVideoProcessing": true,
"supportsDepthCapture": false, // TODO: supportsDepthCapture
"supportsRawCapture": false, // TODO: supportsRawCapture
"supportsLowLightBoost": $0.isLowLightBoostSupported,
"supportsFocus": $0.isFocusPointOfInterestSupported,
"hardwareLevel": "full",
"formats": $0.formats.map { format -> [String: Any] in
format.toDictionary()
},
@@ -30,11 +30,8 @@ extension AVCaptureDevice.Format {
}

// compare max fps
if let leftMaxFps = videoSupportedFrameRateRanges.max(by: { $0.maxFrameRate > $1.maxFrameRate }),
let rightMaxFps = other.videoSupportedFrameRateRanges.max(by: { $0.maxFrameRate > $1.maxFrameRate }) {
if leftMaxFps.maxFrameRate > rightMaxFps.maxFrameRate {
return true
}
if maxFrameRate > other.maxFrameRate {
return true
}

return false
@@ -50,29 +50,17 @@ extension AVCaptureDevice.Format {
}
}
if let maxZoom = filter.value(forKey: "maxZoom") as? NSNumber {
if videoMaxZoomFactor != CGFloat(maxZoom.floatValue) {
if videoMaxZoomFactor != CGFloat(maxZoom.doubleValue) {
return false
}
}
if let colorSpaces = filter.value(forKey: "colorSpaces") as? [String] {
let avColorSpaces = colorSpaces.map { try? AVCaptureColorSpace(string: $0) }
let allColorSpacesIncluded = supportedColorSpaces.allSatisfy { avColorSpaces.contains($0) }
if !allColorSpacesIncluded {
if let minFps = filter.value(forKey: "minFps") as? NSNumber {
if minFrameRate != Float64(minFps.doubleValue) {
return false
}
}
if let frameRateRanges = filter.value(forKey: "frameRateRanges") as? [NSDictionary] {
let allFrameRateRangesIncluded = videoSupportedFrameRateRanges.allSatisfy { range -> Bool in
frameRateRanges.contains { dict -> Bool in
guard let max = dict.value(forKey: "maxFrameRate") as? NSNumber,
let min = dict.value(forKey: "minFrameRate") as? NSNumber
else {
return false
}
return range.maxFrameRate == max.doubleValue && range.minFrameRate == min.doubleValue
}
}
if !allFrameRateRangesIncluded {
if let maxFps = filter.value(forKey: "maxFps") as? NSNumber {
if maxFrameRate != Float64(maxFps.doubleValue) {
return false
}
}
@@ -90,14 +78,6 @@ extension AVCaptureDevice.Format {
}
}

if #available(iOS 13.0, *) {
if let isHighestPhotoQualitySupported = filter.value(forKey: "isHighestPhotoQualitySupported") as? Bool {
if self.isHighestPhotoQualitySupported != isHighestPhotoQualitySupported {
return false
}
}
}

return true
}
}
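For reference, the matcher above compares with exact equality, so a filter passed from JS has to carry the same numbers a format previously reported via `getAvailableCameraDevices()`. A small sketch of the filter shape, restricted to keys visible in this hunk:

```swift
import Foundation

// Filter dictionary the matcher reads via `filter.value(forKey:)`; the values
// must equal what the format itself reports (exact comparison, no ranges).
let filter: NSDictionary = [
  "minFps": 1,
  "maxFps": 60,
  "maxZoom": 16,
]

if let maxFps = filter.value(forKey: "maxFps") as? NSNumber {
  print("format must report maxFps == \(maxFps.doubleValue)")
}
```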
@@ -21,7 +21,24 @@ extension AVCaptureDevice.Format {
return getAllVideoStabilizationModes().filter { self.isVideoStabilizationModeSupported($0) }
}

var minFrameRate: Float64 {
let maxRange = videoSupportedFrameRateRanges.max { l, r in
return l.maxFrameRate < r.maxFrameRate
}
return maxRange?.minFrameRate ?? 0
}

var maxFrameRate: Float64 {
let maxRange = videoSupportedFrameRateRanges.max { l, r in
return l.maxFrameRate < r.maxFrameRate
}
return maxRange?.maxFrameRate ?? 0
}

func toDictionary() -> [String: Any] {
let mediaSubType = CMFormatDescriptionGetMediaSubType(formatDescription)
let pixelFormat = PixelFormat(mediaSubType: mediaSubType)

var dict: [String: Any] = [
"videoStabilizationModes": videoStabilizationModes.map(\.descriptor),
"autoFocusSystem": autoFocusSystem.descriptor,
@@ -33,22 +50,13 @@ extension AVCaptureDevice.Format {
"minISO": minISO,
"fieldOfView": videoFieldOfView,
"maxZoom": videoMaxZoomFactor,
"colorSpaces": supportedColorSpaces.map(\.descriptor),
"supportsVideoHDR": isVideoHDRSupported,
"supportsPhotoHDR": false,
"frameRateRanges": videoSupportedFrameRateRanges.map {
[
"minFrameRate": $0.minFrameRate,
"maxFrameRate": $0.maxFrameRate,
]
},
"pixelFormat": CMFormatDescriptionGetMediaSubType(formatDescription).toString(),
"minFps": minFrameRate,
"maxFps": maxFrameRate,
"pixelFormats": [pixelFormat.unionValue],
]

if #available(iOS 13.0, *) {
dict["isHighestPhotoQualitySupported"] = self.isHighestPhotoQualitySupported
}

return dict
}
}
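A self-contained sketch of the reduction the new `minFrameRate`/`maxFrameRate` properties perform, using plain tuples in place of `AVFrameRateRange`:

```swift
// Plain-Swift stand-in for AVFrameRateRange to show the reduction.
let ranges: [(minFrameRate: Float64, maxFrameRate: Float64)] = [(1, 30), (15, 60)]

// Pick the range with the highest upper bound, then report its two bounds,
// exactly like the computed properties above.
let best = ranges.max { $0.maxFrameRate < $1.maxFrameRate }
print(best?.minFrameRate ?? 0, best?.maxFrameRate ?? 0) // 15.0 60.0
```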
@@ -23,6 +23,7 @@ std::vector<jsi::PropNameID> FrameHostObject::getPropertyNames(jsi::Runtime& rt)
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("isMirrored")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("timestamp")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("isDrawable")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("pixelFormat")));
// Conversion
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("toString")));
result.push_back(jsi::PropNameID::forUtf8(rt, std::string("toArrayBuffer")));
@@ -126,13 +127,13 @@ jsi::Value FrameHostObject::get(jsi::Runtime& runtime, const jsi::PropNameID& pr
return jsi::String::createFromUtf8(runtime, "portrait");
case UIImageOrientationDown:
case UIImageOrientationDownMirrored:
return jsi::String::createFromUtf8(runtime, "portraitUpsideDown");
return jsi::String::createFromUtf8(runtime, "portrait-upside-down");
case UIImageOrientationLeft:
case UIImageOrientationLeftMirrored:
return jsi::String::createFromUtf8(runtime, "landscapeLeft");
return jsi::String::createFromUtf8(runtime, "landscape-left");
case UIImageOrientationRight:
case UIImageOrientationRightMirrored:
return jsi::String::createFromUtf8(runtime, "landscapeRight");
return jsi::String::createFromUtf8(runtime, "landscape-right");
}
}
if (name == "isMirrored") {
@@ -154,6 +155,19 @@ jsi::Value FrameHostObject::get(jsi::Runtime& runtime, const jsi::PropNameID& pr
auto seconds = static_cast<double>(CMTimeGetSeconds(timestamp));
return jsi::Value(seconds * 1000.0);
}
if (name == "pixelFormat") {
auto format = CMSampleBufferGetFormatDescription(frame.buffer);
auto mediaType = CMFormatDescriptionGetMediaSubType(format);
switch (mediaType) {
case kCVPixelFormatType_32BGRA:
return jsi::String::createFromUtf8(runtime, "rgb");
case kCVPixelFormatType_420YpCbCr8BiPlanarFullRange:
case kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange:
return jsi::String::createFromUtf8(runtime, "yuv");
default:
return jsi::String::createFromUtf8(runtime, "unknown");
}
}
if (name == "bytesPerRow") {
auto imageBuffer = CMSampleBufferGetImageBuffer(frame.buffer);
auto bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
@@ -12,7 +12,7 @@ extension AVAuthorizationStatus {
var descriptor: String {
switch self {
case .authorized:
return "authorized"
return "granted"
case .denied:
return "denied"
case .notDetermined:
@@ -1,44 +0,0 @@
//
// AVCaptureColorSpace+descriptor.swift
// mrousavy
//
// Created by Marc Rousavy on 19.12.20.
// Copyright © 2020 mrousavy. All rights reserved.
//

import AVFoundation

extension AVCaptureColorSpace {
init(string: String) throws {
switch string {
case "hlg-bt2020":
if #available(iOS 14.1, *) {
self = .HLG_BT2020
} else {
throw EnumParserError.unsupportedOS(supportedOnOS: "14.1")
}
return
case "p3-d65":
self = .P3_D65
return
case "srgb":
self = .sRGB
return
default:
throw EnumParserError.invalidValue
}
}

var descriptor: String {
switch self {
case .HLG_BT2020:
return "hlg-bt2020"
case .P3_D65:
return "p3-d65"
case .sRGB:
return "srgb"
default:
fatalError("AVCaptureDevice.Position has unknown state.")
}
}
}
@@ -15,79 +15,11 @@ extension AVVideoCodecType {
case "h264":
self = .h264
return
case "hevc":
case "h265":
self = .hevc
return
case "hevc-alpha":
if #available(iOS 13.0, *) {
self = .hevcWithAlpha
return
} else {
return nil
}
case "jpeg":
self = .jpeg
return
case "pro-res-422":
self = .proRes422
return
case "pro-res-422-hq":
if #available(iOS 13.0, *) {
self = .proRes422HQ
return
} else {
return nil
}
case "pro-res-422-lt":
if #available(iOS 13.0, *) {
self = .proRes422LT
return
} else {
return nil
}
case "pro-res-422-proxy":
if #available(iOS 13.0, *) {
self = .proRes422Proxy
return
} else {
return nil
}
case "pro-res-4444":
self = .proRes4444
return
default:
return nil
}
}

var descriptor: String {
if #available(iOS 13.0, *) {
switch self {
case .hevcWithAlpha:
return "hevc-alpha"
case .proRes422HQ:
return "pro-res-422-hq"
case .proRes422LT:
return "pro-res-422-lt"
case .proRes422Proxy:
return "pro-res-422-proxy"
default:
break
}
}
switch self {
case .h264:
return "h264"
case .hevc:
return "hevc"
case .jpeg:
return "jpeg"
case .proRes422:
return "pro-res-422"
case .proRes4444:
return "pro-res-4444"
default:
fatalError("AVVideoCodecType has unknown state.")
}
}
}
ios/Parsers/PixelFormat.swift (new file, 63 lines)
@@ -0,0 +1,63 @@
//
// PixelFormat.swift
// VisionCamera
//
// Created by Marc Rousavy on 17.08.23.
// Copyright © 2023 mrousavy. All rights reserved.
//

import AVFoundation
import Foundation

enum PixelFormat {
case yuv
case rgb
case dng
case native
case unknown

var unionValue: String {
switch self {
case .yuv:
return "yuv"
case .rgb:
return "rgb"
case .dng:
return "dng"
case .native:
return "native"
case .unknown:
return "unknown"
}
}

init(unionValue: String) throws {
switch unionValue {
case "yuv":
self = .yuv
case "rgb":
self = .rgb
case "dng":
self = .dng
case "native":
self = .native
case "unknown":
self = .unknown
default:
throw CameraError.parameter(.invalid(unionName: "pixelFormat", receivedValue: unionValue))
}
}

init(mediaSubType: OSType) {
switch mediaSubType {
case kCVPixelFormatType_420YpCbCr8BiPlanarFullRange:
self = .yuv
case kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange:
self = .yuv
case kCVPixelFormatType_32BGRA:
self = .rgb
default:
self = .unknown
}
}
}
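A short usage sketch for the new `PixelFormat` parser, deriving the union value from a sample buffer's format description (the `unionValue(of:)` helper is hypothetical and assumes the buffer comes from the video data output delegate):

```swift
import AVFoundation
import CoreMedia

// Hypothetical helper: map a CMSampleBuffer's media subtype to the "yuv"/"rgb"/...
// union value exposed to JS, using the PixelFormat enum introduced above.
func unionValue(of sampleBuffer: CMSampleBuffer) -> String {
  guard let description = CMSampleBufferGetFormatDescription(sampleBuffer) else {
    return PixelFormat.unknown.unionValue
  }
  let mediaSubType = CMFormatDescriptionGetMediaSubType(description)
  return PixelFormat(mediaSubType: mediaSubType).unionValue
}
```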
@@ -15,13 +15,13 @@ extension UIInterfaceOrientation {
case "portrait":
self = .portrait
return
case "portraitUpsideDown":
case "portrait-upside-down":
self = .portraitUpsideDown
return
case "landscapeLeft":
case "landscape-left":
self = .landscapeLeft
return
case "landscapeRight":
case "landscape-right":
self = .landscapeRight
return
default:
@@ -48,11 +48,16 @@ class PhotoCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
let exif = photo.metadata["{Exif}"] as? [String: Any]
let width = exif?["PixelXDimension"]
let height = exif?["PixelYDimension"]
let exifOrientation = photo.metadata[kCGImagePropertyOrientation as String] as? Int ?? 0
let orientation = getOrientation(forExifOrientation: exifOrientation)
let isMirrored = getIsMirrored(forExifOrientation: exifOrientation)

promise.resolve([
"path": tempFilePath,
"width": width as Any,
"height": height as Any,
"orientation": orientation,
"isMirrored": isMirrored,
"isRawPhoto": photo.isRawPhoto,
"metadata": photo.metadata,
"thumbnail": photo.embeddedThumbnailPhotoFormat as Any,
@@ -71,4 +76,28 @@ class PhotoCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
return
}
}

private func getOrientation(forExifOrientation exifOrientation: Int) -> String {
switch exifOrientation {
case 1, 2:
return "portrait"
case 3, 4:
return "portrait-upside-down"
case 5, 6:
return "landscape-left"
case 7, 8:
return "landscape-right"
default:
return "portrait"
}
}

private func getIsMirrored(forExifOrientation exifOrientation: Int) -> Bool {
switch exifOrientation {
case 2, 4, 5, 7:
return true
default:
return false
}
}
}
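A test-style sketch of the EXIF mapping above, with standalone copies of the two private helpers so the expected values can be checked directly:

```swift
// Standalone copies of the delegate's private helpers, for illustration only.
func orientation(forExif value: Int) -> String {
  switch value {
  case 1, 2: return "portrait"
  case 3, 4: return "portrait-upside-down"
  case 5, 6: return "landscape-left"
  case 7, 8: return "landscape-right"
  default: return "portrait"
  }
}

func isMirrored(forExif value: Int) -> Bool {
  // EXIF orientations 2, 4, 5 and 7 are the mirrored variants.
  return [2, 4, 5, 7].contains(value)
}

assert(orientation(forExif: 6) == "landscape-left" && isMirrored(forExif: 6) == false)
assert(orientation(forExif: 5) == "landscape-left" && isMirrored(forExif: 5) == true)
```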
@@ -16,7 +16,7 @@
#import <include/core/SkImage.h>
#import <include/gpu/ganesh/SkImageGanesh.h>
#import <include/gpu/mtl/GrMtlTypes.h>
#import <include/gpu/GrRecordingContext.h>
#import <include/gpu/GrBackendSurface.h>

#include <TargetConditionals.h>
#if TARGET_RT_BIG_ENDIAN
@@ -16,8 +16,12 @@
#import <include/core/SkSurface.h>
#import <include/core/SkCanvas.h>
#import <include/core/SkColorSpace.h>
#import <include/gpu/ganesh/SkImageGanesh.h>
#import <include/gpu/GrDirectContext.h>

#import <include/gpu/mtl/GrMtlTypes.h>
#import <include/gpu/GrBackendSurface.h>
#import <include/gpu/ganesh/SkSurfaceGanesh.h>
#import <include/gpu/ganesh/mtl/SkSurfaceMetal.h>

#import "SkImageHelpers.h"

#import <system_error>
@@ -83,6 +87,7 @@
height:CVPixelBufferGetHeight(pixelBuffer)];

// Get & Lock the writeable Texture from the Metal Drawable

GrMtlTextureInfo textureInfo;
textureInfo.fTexture.retain((__bridge void*)texture);
GrBackendRenderTarget backendRenderTarget((int)texture.width,
@@ -93,12 +98,12 @@
auto context = _offscreenContext->skiaContext.get();

// Create a Skia Surface from the writable Texture
auto surface = SkSurface::MakeFromBackendRenderTarget(context,
backendRenderTarget,
kTopLeft_GrSurfaceOrigin,
kBGRA_8888_SkColorType,
SkColorSpace::MakeSRGB(),
nullptr);
auto surface = SkSurfaces::WrapBackendRenderTarget(context,
backendRenderTarget,
kTopLeft_GrSurfaceOrigin,
kBGRA_8888_SkColorType,
SkColorSpace::MakeSRGB(),
nullptr);

if (surface == nullptr || surface->getCanvas() == nullptr) {
throw std::runtime_error("Skia surface could not be created from parameters.");
@@ -143,14 +148,14 @@

// Create a Skia Surface from the CAMetalLayer (use to draw to the View)
GrMTLHandle drawableHandle;
auto surface = SkSurface::MakeFromCAMetalLayer(context,
(__bridge GrMTLHandle)layer,
kTopLeft_GrSurfaceOrigin,
1,
kBGRA_8888_SkColorType,
nullptr,
nullptr,
&drawableHandle);
auto surface = SkSurfaces::WrapCAMetalLayer(context,
(__bridge GrMTLHandle)layer,
kTopLeft_GrSurfaceOrigin,
1,
kBGRA_8888_SkColorType,
nullptr,
nullptr,
&drawableHandle);
if (surface == nullptr || surface->getCanvas() == nullptr) {
throw std::runtime_error("Skia surface could not be created from parameters.");
}
@@ -21,6 +21,7 @@
B86DC971260E2D5200FB17B2 /* AVAudioSession+trySetAllowHaptics.swift in Sources */ = {isa = PBXBuildFile; fileRef = B86DC970260E2D5200FB17B2 /* AVAudioSession+trySetAllowHaptics.swift */; };
B86DC974260E310600FB17B2 /* CameraView+AVAudioSession.swift in Sources */ = {isa = PBXBuildFile; fileRef = B86DC973260E310600FB17B2 /* CameraView+AVAudioSession.swift */; };
B86DC977260E315100FB17B2 /* CameraView+AVCaptureSession.swift in Sources */ = {isa = PBXBuildFile; fileRef = B86DC976260E315100FB17B2 /* CameraView+AVCaptureSession.swift */; };
B87B11BF2A8E63B700732EBF /* PixelFormat.swift in Sources */ = {isa = PBXBuildFile; fileRef = B87B11BE2A8E63B700732EBF /* PixelFormat.swift */; };
B882721026AEB1A100B14107 /* AVCaptureConnection+setInterfaceOrientation.swift in Sources */ = {isa = PBXBuildFile; fileRef = B882720F26AEB1A100B14107 /* AVCaptureConnection+setInterfaceOrientation.swift */; };
B887518525E0102000DB86D6 /* PhotoCaptureDelegate.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887515C25E0102000DB86D6 /* PhotoCaptureDelegate.swift */; };
B887518625E0102000DB86D6 /* CameraView+RecordVideo.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887515D25E0102000DB86D6 /* CameraView+RecordVideo.swift */; };
@@ -46,7 +47,6 @@
B887519F25E0102000DB86D6 /* AVCaptureDevice.DeviceType+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887517A25E0102000DB86D6 /* AVCaptureDevice.DeviceType+descriptor.swift */; };
B88751A025E0102000DB86D6 /* AVAuthorizationStatus+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887517B25E0102000DB86D6 /* AVAuthorizationStatus+descriptor.swift */; };
B88751A125E0102000DB86D6 /* AVCaptureDevice.Position+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887517C25E0102000DB86D6 /* AVCaptureDevice.Position+descriptor.swift */; };
B88751A225E0102000DB86D6 /* AVCaptureColorSpace+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887517D25E0102000DB86D6 /* AVCaptureColorSpace+descriptor.swift */; };
B88751A325E0102000DB86D6 /* AVCaptureDevice.FlashMode+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887517E25E0102000DB86D6 /* AVCaptureDevice.FlashMode+descriptor.swift */; };
B88751A425E0102000DB86D6 /* AVCaptureDevice.Format.AutoFocusSystem+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887517F25E0102000DB86D6 /* AVCaptureDevice.Format.AutoFocusSystem+descriptor.swift */; };
B88751A525E0102000DB86D6 /* CameraView+Focus.swift in Sources */ = {isa = PBXBuildFile; fileRef = B887518025E0102000DB86D6 /* CameraView+Focus.swift */; };
@@ -103,6 +103,7 @@
B86DC970260E2D5200FB17B2 /* AVAudioSession+trySetAllowHaptics.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "AVAudioSession+trySetAllowHaptics.swift"; sourceTree = "<group>"; };
B86DC973260E310600FB17B2 /* CameraView+AVAudioSession.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "CameraView+AVAudioSession.swift"; sourceTree = "<group>"; };
B86DC976260E315100FB17B2 /* CameraView+AVCaptureSession.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "CameraView+AVCaptureSession.swift"; sourceTree = "<group>"; };
B87B11BE2A8E63B700732EBF /* PixelFormat.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = PixelFormat.swift; sourceTree = "<group>"; };
B882720F26AEB1A100B14107 /* AVCaptureConnection+setInterfaceOrientation.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "AVCaptureConnection+setInterfaceOrientation.swift"; sourceTree = "<group>"; };
B887515C25E0102000DB86D6 /* PhotoCaptureDelegate.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = PhotoCaptureDelegate.swift; sourceTree = "<group>"; };
B887515D25E0102000DB86D6 /* CameraView+RecordVideo.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "CameraView+RecordVideo.swift"; sourceTree = "<group>"; };
@@ -129,7 +130,6 @@
B887517A25E0102000DB86D6 /* AVCaptureDevice.DeviceType+descriptor.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureDevice.DeviceType+descriptor.swift"; sourceTree = "<group>"; };
B887517B25E0102000DB86D6 /* AVAuthorizationStatus+descriptor.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVAuthorizationStatus+descriptor.swift"; sourceTree = "<group>"; };
B887517C25E0102000DB86D6 /* AVCaptureDevice.Position+descriptor.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureDevice.Position+descriptor.swift"; sourceTree = "<group>"; };
B887517D25E0102000DB86D6 /* AVCaptureColorSpace+descriptor.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureColorSpace+descriptor.swift"; sourceTree = "<group>"; };
B887517E25E0102000DB86D6 /* AVCaptureDevice.FlashMode+descriptor.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureDevice.FlashMode+descriptor.swift"; sourceTree = "<group>"; };
B887517F25E0102000DB86D6 /* AVCaptureDevice.Format.AutoFocusSystem+descriptor.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "AVCaptureDevice.Format.AutoFocusSystem+descriptor.swift"; sourceTree = "<group>"; };
B887518025E0102000DB86D6 /* CameraView+Focus.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = "CameraView+Focus.swift"; sourceTree = "<group>"; };
@@ -255,11 +255,11 @@
B887517A25E0102000DB86D6 /* AVCaptureDevice.DeviceType+descriptor.swift */,
B887517B25E0102000DB86D6 /* AVAuthorizationStatus+descriptor.swift */,
B887517C25E0102000DB86D6 /* AVCaptureDevice.Position+descriptor.swift */,
B887517D25E0102000DB86D6 /* AVCaptureColorSpace+descriptor.swift */,
B887517E25E0102000DB86D6 /* AVCaptureDevice.FlashMode+descriptor.swift */,
B887517F25E0102000DB86D6 /* AVCaptureDevice.Format.AutoFocusSystem+descriptor.swift */,
B8DB3BCB263DC97E004C18D7 /* AVFileType+descriptor.swift */,
B864004F27849A2400E9D2CA /* UIInterfaceOrientation+descriptor.swift */,
B87B11BE2A8E63B700732EBF /* PixelFormat.swift */,
);
path = Parsers;
sourceTree = "<group>";
@@ -407,7 +407,6 @@
B887518625E0102000DB86D6 /* CameraView+RecordVideo.swift in Sources */,
B81BE1BF26B936FF002696CC /* AVCaptureDevice.Format+videoDimensions.swift in Sources */,
B8DB3BCA263DC4D8004C18D7 /* RecordingSession.swift in Sources */,
B88751A225E0102000DB86D6 /* AVCaptureColorSpace+descriptor.swift in Sources */,
B83D5EE729377117000AFD2F /* NativePreviewView.swift in Sources */,
B887518925E0102000DB86D6 /* Collection+safe.swift in Sources */,
B887519125E0102000DB86D6 /* AVCaptureDevice.Format+toDictionary.swift in Sources */,
@@ -443,6 +442,7 @@
B887519A25E0102000DB86D6 /* AVVideoCodecType+descriptor.swift in Sources */,
B88751A825E0102000DB86D6 /* CameraError.swift in Sources */,
B85F7AE92A77BB680089C539 /* FrameProcessorPlugin.m in Sources */,
B87B11BF2A8E63B700732EBF /* PixelFormat.swift in Sources */,
B88751A625E0102000DB86D6 /* CameraViewManager.swift in Sources */,
B887519F25E0102000DB86D6 /* AVCaptureDevice.DeviceType+descriptor.swift in Sources */,
B8D22CDC2642DB4D00234472 /* AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift in Sources */,