37a3548a81
* Nuke CameraX
* fix: Run View Finder on UI Thread
* Open Camera, set up Threads
* fix init
* Mirror if needed
* Try PreviewView
* Use max resolution
* Add `hardwareLevel` property
* Check if output type is supported
* Replace `frameRateRanges` with `minFps` and `maxFps`
* Remove `isHighestPhotoQualitySupported`
* Remove `colorSpace`
  The native platforms will use the best / most accurate colorSpace by default anyways.
* HDR
* Check from format
* fix
* Remove `supportsParallelVideoProcessing`
* Correctly return video/photo sizes on Android now. Finally
* Log all Device props
* Log if optimized usecase is used
* Cleanup
* Configure Camera Input only once
* Revert "Configure Camera Input only once"
  This reverts commit 0fd6c03f54c7566cb5592053720c4a8743aba92e.
* Extract Camera configuration
* Try to reconfigure all
* Hook based
* Properly set up `CameraSession`
* Delete unused
* fix: Fix recreate when outputs change
* Update NativePreviewView.kt
* Use callback for closing
* Catch CameraAccessException
* Finally got it stable
* Remove isMirrored
* Implement `takePhoto()`
* Add ExifInterface library
* Run findViewById on UI Thread
* Add Photo Output Surface to takePhoto
* Fix Video Stabilization Modes
* Optimize Imports
* More logs
* Update CameraSession.kt
* Close Image
* Use separate Executor in CameraQueue
* Delete hooks
* Use same Thread again
* If opened, call error
* Update CameraSession.kt
* Log HW level
* fix: Don't enable Stream Use Case if it's not 100% supported
* Move some stuff
* Cleanup PhotoOutputSynchronizer
* Try just open in suspend fun
* Some synchronization fixes
* fix logs
* Update CameraDevice+createCaptureSession.kt
* Update CameraDevice+createCaptureSession.kt
* fixes
* fix: Use Snapshot Template for speed capture prio
* Use PREVIEW template for repeating request
* Use `TEMPLATE_RECORD` if video use-case is attached
* Use `isRunning` flag
* Recreate session everytime on active/inactive
* Lazily get values in capture session
* Stability
* Rebuild session if outputs change
* Set `didOutputsChange` back to false
* Capture first in lock
* Try
* kinda fix it? idk
* fix: Keep Outputs
* Refactor into single method
* Update CameraView.kt
* Use Enums for type safety
* Implement Orientation (I think)
* Move RefCount management to Java (Frame)
* Don't crash when dropping a Frame
* Prefer Devices with higher max resolution
* Prefer multi-cams
* Use FastImage for Media Page
* Return orientation in takePhoto()
* Load orientation from EXIF Data
* Add `isMirrored` props and documentation for PhotoFile
* fix: Return `not-determined` on Android
* Update CameraViewModule.kt
* chore: Upgrade packages
* fix: Fix Metro Config
* Cleanup config
* Properly mirror Images on save
* Prepare MediaRecorder
* Start/Stop MediaRecorder
* Remove `takeSnapshot()`
  It no longer works on Android and never worked on iOS. Users could use useFrameProcessor to take a Snapshot
* Use `MediaCodec`
* Move to `VideoRecording` class
* Cleanup Snapshot
* Create `SkiaPreviewView` hybrid class
* Create OpenGL context
* Create `SkiaPreviewView`
* Fix texture creation missing context
* Draw red frame
* Somehow get it working
* Add Skia CMake setup
* Start looping
* Init OpenGL
* Refactor into `SkiaRenderer`
* Cleanup PreviewSize
* Set up
* Only re-render UI if there is a new Frame
* Preview
* Fix init
* Try rendering Preview
* Update SkiaPreviewView.kt
* Log version
* Try using Skia (fail)
* Drawwwww!!!!!!!!!! 🎉
* Use Preview Size
* Clear first
* Refactor into SkiaRenderer
* Add `previewType: "none"` on iOS
* Simplify a lot
* Draw Camera? For some reason? I have no idea anymore
* Fix OpenGL errors
* Got it kinda working again?
* Actually draw Frame woah
* Clean up code
* Cleanup
* Update on main
* Synchronize render calls
* holy shit
* Update SkiaRenderer.cpp
* Update SkiaRenderer.cpp
* Refactor
* Update SkiaRenderer.cpp
* Check for `NO_INPUT_TEXTURE`^
* Post & Wait
* Set input size
* Add Video back again
* Allow session without preview
* Convert JPEG to byte[]
* feat: Use `ImageReader` and use YUV Image Buffers in Skia Context (#1689)
* Try to pass YUV Buffers as Pixmaps
* Create pixmap!
* Clean up
* Render to preview
* Only render if we have an output surface
* Update SkiaRenderer.cpp
* Fix Y+U+V sampling code
* Cleanup
* Fix Semaphore 0
* Use 4:2:0 YUV again idk
* Update SkiaRenderer.h
* Set minSdk to 26
* Set surface
* Revert "Set minSdk to 26"
  This reverts commit c4085b7c16c628532e5c2d68cf7ed11c751d0b48.
* Set previewType
* feat: Video Recording with Camera2 (#1691)
* Rename
* Update CameraSession.kt
* Use `SurfaceHolder` instead of `SurfaceView` for output
* Update CameraOutputs.kt
* Update CameraSession.kt
* fix: Fix crash when Preview is null
* Check if snapshot capture is supported
* Update RecordingSession.kt
* S
* Use `MediaRecorder`
* Make audio optional
* Add Torch
* Output duration
* Update RecordingSession.kt
* Start RecordingSession
* logs
* More log
* Base for preparing pass-through Recording
* Use `ImageWriter` to append Images to the Recording Surface
* Stream PRIVATE GPU_SAMPLED_IMAGE Images
* Add flags
* Close session on stop
* Allow customizing `videoCodec` and `fileType`
* Enable Torch
* Fix Torch Mode
* Fix comparing outputs with hashCode
* Update CameraSession.kt
* Correctly pass along Frame Processor
* fix: Use AUDIO_BIT_RATE of 16 * 44,1Khz
* Use CAMCORDER instead of MIC microphone
* Use 1 channel
* fix: Use `Orientation`
* Add `native` PixelFormat
* Update iOS to latest Skia integration
* feat: Add `pixelFormat` property to Camera
* Catch error in configureSession
* Fix JPEG format
* Clean up best match finder
* Update CameraDeviceDetails.kt
* Clamp sizes by maximum CamcorderProfile size
* Remove `getAvailableVideoCodecs`
* chore: release 3.0.0-rc.5
* Use maximum video size of RECORD as default
* Update CameraDeviceDetails.kt
* Add a todo
* Add JSON device to issue report
* Prefer `full` devices and flash
* Lock to 30 FPS on Samsung
* Implement Zoom
* Refactor
* Format -> PixelFormat
* fix: Feat `pixelFormat` -> `pixelFormats`
* Update TROUBLESHOOTING.mdx
* Format
* fix: Implement `zoom` for Photo Capture
* fix: Don't run if `isActive` is `false`
* fix: Call `examplePlugin(frame)`
* fix: Fix Flash
* fix: Use `react-native-worklets-core`!
* fix: Fix import
185 lines
7.8 KiB
Plaintext
//
// FrameHostObject.mm
// VisionCamera
//
// Created by Marc Rousavy on 22.03.21.
// Copyright © 2021 mrousavy. All rights reserved.
//

#import "FrameHostObject.h"
#import <Foundation/Foundation.h>
#import <jsi/jsi.h>
#import "WKTJsiHostObject.h"

#import "../../cpp/JSITypedArray.h"

std::vector<jsi::PropNameID> FrameHostObject::getPropertyNames(jsi::Runtime& rt) {
  std::vector<jsi::PropNameID> result;
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("width")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("height")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("bytesPerRow")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("planesCount")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("orientation")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("isMirrored")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("timestamp")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("isDrawable")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("pixelFormat")));
  // Conversion
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("toString")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("toArrayBuffer")));
  // Ref Management
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("isValid")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("incrementRefCount")));
  result.push_back(jsi::PropNameID::forUtf8(rt, std::string("decrementRefCount")));

  return result;
}

jsi::Value FrameHostObject::get(jsi::Runtime& runtime, const jsi::PropNameID& propName) {
  auto name = propName.utf8(runtime);

  if (name == "toString") {
    auto toString = JSI_HOST_FUNCTION_LAMBDA {
      if (this->frame == nil) {
        return jsi::String::createFromUtf8(runtime, "[closed frame]");
      }
      auto imageBuffer = CMSampleBufferGetImageBuffer(frame.buffer);
      auto width = CVPixelBufferGetWidth(imageBuffer);
      auto height = CVPixelBufferGetHeight(imageBuffer);

      NSMutableString* string = [NSMutableString stringWithFormat:@"%lu x %lu Frame", width, height];
      return jsi::String::createFromUtf8(runtime, string.UTF8String);
    };
    return jsi::Function::createFromHostFunction(runtime, jsi::PropNameID::forUtf8(runtime, "toString"), 0, toString);
  }
  if (name == "incrementRefCount") {
    auto incrementRefCount = JSI_HOST_FUNCTION_LAMBDA {
      // Increment retain count by one so ARC doesn't destroy the Frame Buffer.
      CFRetain(frame.buffer);
      return jsi::Value::undefined();
    };
    return jsi::Function::createFromHostFunction(runtime,
                                                 jsi::PropNameID::forUtf8(runtime, "incrementRefCount"),
                                                 0,
                                                 incrementRefCount);
  }
  if (name == "decrementRefCount") {
    auto decrementRefCount = JSI_HOST_FUNCTION_LAMBDA {
      // Decrement retain count by one. If the retain count is zero, ARC will destroy the Frame Buffer.
      CFRelease(frame.buffer);
      return jsi::Value::undefined();
    };
    return jsi::Function::createFromHostFunction(runtime,
                                                 jsi::PropNameID::forUtf8(runtime, "decrementRefCount"),
                                                 0,
                                                 decrementRefCount);
  }
  if (name == "toArrayBuffer") {
    auto toArrayBuffer = JSI_HOST_FUNCTION_LAMBDA {
      auto pixelBuffer = CMSampleBufferGetImageBuffer(frame.buffer);
      auto bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
      auto height = CVPixelBufferGetHeight(pixelBuffer);
      // The base address is only valid for CPU access while the buffer is locked.
      CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
      auto buffer = (uint8_t*) CVPixelBufferGetBaseAddress(pixelBuffer);
      auto arraySize = bytesPerRow * height;

      static constexpr auto ARRAYBUFFER_CACHE_PROP_NAME = "__frameArrayBufferCache";
      if (!runtime.global().hasProperty(runtime, ARRAYBUFFER_CACHE_PROP_NAME)) {
        vision::TypedArray<vision::TypedArrayKind::Uint8ClampedArray> arrayBuffer(runtime, arraySize);
        runtime.global().setProperty(runtime, ARRAYBUFFER_CACHE_PROP_NAME, arrayBuffer);
      }

      auto arrayBufferCache = runtime.global().getPropertyAsObject(runtime, ARRAYBUFFER_CACHE_PROP_NAME);
      auto arrayBuffer = vision::getTypedArray(runtime, arrayBufferCache).get<vision::TypedArrayKind::Uint8ClampedArray>(runtime);

      if (arrayBuffer.size(runtime) != arraySize) {
        arrayBuffer = vision::TypedArray<vision::TypedArrayKind::Uint8ClampedArray>(runtime, arraySize);
        runtime.global().setProperty(runtime, ARRAYBUFFER_CACHE_PROP_NAME, arrayBuffer);
      }

      arrayBuffer.updateUnsafe(runtime, buffer, arraySize);
      CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

      return arrayBuffer;
    };
    return jsi::Function::createFromHostFunction(runtime, jsi::PropNameID::forUtf8(runtime, "toArrayBuffer"), 0, toArrayBuffer);
  }

  if (name == "isDrawable") {
    return jsi::Value(false);
  }
  if (name == "isValid") {
    auto isValid = frame != nil && frame.buffer != nil && CFGetRetainCount(frame.buffer) > 0 && CMSampleBufferIsValid(frame.buffer);
    return jsi::Value(isValid);
  }
  if (name == "width") {
    auto imageBuffer = CMSampleBufferGetImageBuffer(frame.buffer);
    auto width = CVPixelBufferGetWidth(imageBuffer);
    return jsi::Value((double) width);
  }
  if (name == "height") {
    auto imageBuffer = CMSampleBufferGetImageBuffer(frame.buffer);
    auto height = CVPixelBufferGetHeight(imageBuffer);
    return jsi::Value((double) height);
  }
  if (name == "orientation") {
    switch (frame.orientation) {
      case UIImageOrientationUp:
      case UIImageOrientationUpMirrored:
        return jsi::String::createFromUtf8(runtime, "portrait");
      case UIImageOrientationDown:
      case UIImageOrientationDownMirrored:
        return jsi::String::createFromUtf8(runtime, "portrait-upside-down");
      case UIImageOrientationLeft:
      case UIImageOrientationLeftMirrored:
        return jsi::String::createFromUtf8(runtime, "landscape-left");
      case UIImageOrientationRight:
      case UIImageOrientationRightMirrored:
        return jsi::String::createFromUtf8(runtime, "landscape-right");
    }
  }
  if (name == "isMirrored") {
    switch (frame.orientation) {
      case UIImageOrientationUp:
      case UIImageOrientationDown:
      case UIImageOrientationLeft:
      case UIImageOrientationRight:
        return jsi::Value(false);
      case UIImageOrientationDownMirrored:
      case UIImageOrientationUpMirrored:
      case UIImageOrientationLeftMirrored:
      case UIImageOrientationRightMirrored:
        return jsi::Value(true);
    }
  }
  if (name == "timestamp") {
    auto timestamp = CMSampleBufferGetPresentationTimeStamp(frame.buffer);
    auto seconds = static_cast<double>(CMTimeGetSeconds(timestamp));
    return jsi::Value(seconds * 1000.0);
  }
  if (name == "pixelFormat") {
    auto format = CMSampleBufferGetFormatDescription(frame.buffer);
    auto mediaType = CMFormatDescriptionGetMediaSubType(format);
    switch (mediaType) {
      case kCVPixelFormatType_32BGRA:
        return jsi::String::createFromUtf8(runtime, "rgb");
      case kCVPixelFormatType_420YpCbCr8BiPlanarFullRange:
      case kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange:
        return jsi::String::createFromUtf8(runtime, "yuv");
      default:
        return jsi::String::createFromUtf8(runtime, "unknown");
    }
  }

  // fallback to base implementation
  return HostObject::get(runtime, propName);
}
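The `toArrayBuffer` host function keeps one cached ArrayBuffer on the JS global and only reallocates it when the frame size changes, so per-frame conversions avoid repeated allocations. A minimal sketch of that reuse pattern in plain C++, stripped of JSI (the `FrameBufferCache` name is illustrative, not part of VisionCamera):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Sketch of the buffer-reuse strategy used by toArrayBuffer: keep one
// cached buffer and reallocate only when the required size changes.
class FrameBufferCache {
public:
  // Copies `size` bytes from `src` into the cached buffer, resizing the
  // cache only when the incoming frame size differs from its current size.
  const std::vector<uint8_t>& update(const uint8_t* src, size_t size) {
    if (cache_.size() != size) {
      cache_.resize(size); // reallocate only on size change
    }
    std::memcpy(cache_.data(), src, size);
    return cache_;
  }

private:
  std::vector<uint8_t> cache_;
};
```

For same-size frames the backing allocation is reused, so the copy in `update` is the only per-frame cost, mirroring how `updateUnsafe` overwrites the cached TypedArray in place.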