feat: Code Scanner API (#1912)

* feat: CodeScanner JS API

* feat: iOS

* Use guard

* Format

* feat: Android base

* fix: Attach Surfaces

* Use isBusy var

* fix: Use separate Queue

* feat: Finish iOS types

* feat: Implement all other code types on Android

* fix: Call JS event

* fix: Pass codetypes on Android

* fix: iOS use Preview coordinate system

* docs: Add comments

* chore: Format code

* Update CameraView+AVCaptureSession.swift

* docs: Add Code Scanner docs

* docs: Update

* feat: Use lazily downloaded model on Android

* Revert changes in CameraPage

* Format

* fix: Fix empty QR codes

* Update README.md
Marc Rousavy 2023-10-04 12:53:52 +02:00 committed by GitHub
parent 2c08e5ae78
commit 6640b72a00
36 changed files with 763 additions and 29 deletions

View File

@ -17,6 +17,7 @@
VisionCamera is a powerful, high-performance Camera library for React Native. It features: VisionCamera is a powerful, high-performance Camera library for React Native. It features:
* 📸 Photo and Video capture * 📸 Photo and Video capture
* 👁️ QR/Barcode scanner
* 📱 Customizable devices and multi-cameras ("fish-eye" zoom) * 📱 Customizable devices and multi-cameras ("fish-eye" zoom)
* 🎞️ Customizable resolutions and aspect-ratios (4k/8k images) * 🎞️ Customizable resolutions and aspect-ratios (4k/8k images)
* ⏱️ Customizable FPS (30..240 FPS) * ⏱️ Customizable FPS (30..240 FPS)

View File

@ -0,0 +1,104 @@
---
id: code-scanning
title: QR/Barcode Scanning
sidebar_label: QR/Barcode Scanning
---
import Tabs from '@theme/Tabs'
import TabItem from '@theme/TabItem'
import useBaseUrl from '@docusaurus/useBaseUrl'
<div class="image-container">
<svg xmlns="http://www.w3.org/2000/svg" width="283" height="535">
<image href={useBaseUrl("img/demo.gif")} x="18" y="33" width="247" height="469" />
<image href={useBaseUrl("img/frame.png")} width="283" height="535" />
</svg>
</div>
## What is a Code Scanner?
A Code Scanner is a separate Camera output (just like photo or video) that can detect a variety of machine-readable codes, such as:
- **QR**: Square QR codes
- **Aztec**: Square Aztec codes
- **Data Matrix**: Square Data Matrix codes
- **Barcode (EAN)**: EAN-13 or EAN-8 Barcodes
- **Barcode (Code)**: Code-128, Code-39 or Code-93 Barcodes
- **Barcode (other)**: Codabar, ITF-14, UPC-E or PDF-417 Barcodes
## Setup
On iOS, the Code Scanner uses the platform-native APIs and can be used out of the box.
On Android, the [MLKit Vision Barcode Scanning](https://developers.google.com/ml-kit/vision/barcode-scanning) API will be used, and the model (2.2MB) needs to be downloaded first. To download the model when the user installs your app, add this to your `AndroidManifest.xml` file:
```xml
<application ...>
  ...
  <meta-data
    android:name="com.google.mlkit.vision.DEPENDENCIES"
    android:value="barcode" />
</application>
```
## Usage
To use a Code Scanner, simply create a [`CodeScanner`](/docs/api/interfaces/CodeScanner) and pass it to the `<Camera>`:
<Tabs
groupId="component-style"
defaultValue="hooks"
values={[
{label: 'Hooks API', value: 'hooks'},
{label: 'Imperative API', value: 'imperative'}
]}>
<TabItem value="hooks">
```tsx
const codeScanner = useCodeScanner({
  codeTypes: ['qr', 'ean-13'],
  onCodeScanned: (codes) => {
    console.log(`Scanned ${codes.length} codes!`)
  }
})

return <Camera {...props} codeScanner={codeScanner} />
```
The result of `useCodeScanner` is automatically memoized.
</TabItem>
<TabItem value="imperative">
```ts
const codeScanner: CodeScanner = {
  codeTypes: ['qr', 'ean-13'],
  onCodeScanned: (codes) => {
    console.log(`Scanned ${codes.length} codes!`)
  }
}
```
Make sure to memoize this `CodeScanner` object, as every change to it will trigger a Camera session re-build.
```tsx
render() {
  return <Camera {...props} codeScanner={this.codeScanner} />
}
```
</TabItem>
</Tabs>
## Separate Output
Since the Code Scanner is a separate Camera output (just like photo or video), it cannot be attached while both photo and video are enabled.
You need to disable either `photo`, `video`, or the `codeScanner`.
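For example, a minimal sketch (reusing the `codeScanner` and `props` from the examples above; the exact prop combination depends on your use case) that keeps photo capture enabled but disables video so the Code Scanner output can be attached:

```tsx
return (
  <Camera
    {...props}
    photo={true}
    video={false} // video is disabled so the Code Scanner output can be attached
    codeScanner={codeScanner}
  />
)
```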
## Code result
The Code Scanner calls your `onCodeScanned` callback with all detected codes ([`Code`](/docs/api/interfaces/Code)), including their decoded string value and their coordinates on screen relative to the Preview.
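For example (using the Hooks API from above), a small sketch that reads each scanned code's `type`, `value` and `frame`:

```tsx
const codeScanner = useCodeScanner({
  codeTypes: ['qr', 'ean-13'],
  onCodeScanned: (codes) => {
    for (const code of codes) {
      // `value` is undefined if the code could not be decoded,
      // `frame` is the code's bounding box relative to the Preview (in dp).
      console.log(`${code.type}: ${code.value ?? '(no value)'}`, code.frame)
    }
  },
})
```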
<br />
#### 🚀 Next section: [Camera Lifecycle](lifecycle)

View File

@ -134,17 +134,17 @@ const onDraw = useDrawCallback((canvas) => {
And you can also call back to the React-JS thread by using `createRunInJsFn(...)`: And you can also call back to the React-JS thread by using `createRunInJsFn(...)`:
```tsx ```tsx
const onQRCodeDetected = Worklets.createRunInJsFn((qrCode: string) => { const onFaceDetected = Worklets.createRunInJsFn((face: Face) => {
navigation.push("ProductPage", { productId: qrCode }) navigation.push("FiltersPage", { face: face })
}) })
const frameProcessor = useFrameProcessor((frame) => { const frameProcessor = useFrameProcessor((frame) => {
'worklet' 'worklet'
const qrCodes = scanQRCodes(frame) const faces = scanFaces(frame)
if (qrCodes.length > 0) { if (faces.length > 0) {
onQRCodeDetected(qrCodes[0]) onFaceDetected(faces[0])
} }
}, [onQRCodeDetected]) }, [onFaceDetected])
``` ```
## Threading ## Threading

View File

@ -7,6 +7,7 @@ module.exports = {
'guides/formats', 'guides/formats',
'guides/taking-photos', 'guides/taking-photos',
'guides/recording-videos', 'guides/recording-videos',
'guides/code-scanning',
{ {
type: 'category', type: 'category',
label: 'Realtime Frame Processing', label: 'Realtime Frame Processing',

View File

@ -142,8 +142,9 @@ android {
dependencies { dependencies {
//noinspection GradleDynamicVersion //noinspection GradleDynamicVersion
implementation 'com.facebook.react:react-android:+' implementation "com.facebook.react:react-android:+"
implementation "org.jetbrains.kotlinx:kotlinx-coroutines-android:1.5.2" implementation "org.jetbrains.kotlinx:kotlinx-coroutines-android:1.5.2"
implementation "com.google.android.gms:play-services-mlkit-barcode-scanning:18.3.0"
if (hasWorklets) { if (hasWorklets) {
// Frame Processor integration (optional) // Frame Processor integration (optional)

View File

@ -79,6 +79,13 @@ class RecordingInProgressError :
"There is already an active video recording in progress! Did you call startRecording() twice?" "There is already an active video recording in progress! Did you call startRecording() twice?"
) )
class CodeTypeNotSupportedError(codeType: String) :
CameraError(
"code-scanner",
"code-type-not-supported",
"The codeType \"$codeType\" is not supported by the Code Scanner!"
)
class ViewNotFoundError(viewId: Int) : class ViewNotFoundError(viewId: Int) :
CameraError("system", "view-not-found", "The given view (ID $viewId) was not found in the view manager.") CameraError("system", "view-not-found", "The given view (ID $viewId) was not found in the view manager.")
class FrameProcessorsUnavailableError(reason: String) : class FrameProcessorsUnavailableError(reason: String) :

View File

@ -11,6 +11,7 @@ class CameraQueues {
companion object { companion object {
val cameraQueue = CameraQueue("mrousavy/VisionCamera.main") val cameraQueue = CameraQueue("mrousavy/VisionCamera.main")
val videoQueue = CameraQueue("mrousavy/VisionCamera.video") val videoQueue = CameraQueue("mrousavy/VisionCamera.video")
val codeScannerQueue = CameraQueue("mrousavy/VisionCamera.codeScanner")
} }
class CameraQueue(name: String) { class CameraQueue(name: String) {

View File

@ -5,6 +5,8 @@ import com.facebook.react.bridge.Arguments
import com.facebook.react.bridge.ReactContext import com.facebook.react.bridge.ReactContext
import com.facebook.react.bridge.WritableMap import com.facebook.react.bridge.WritableMap
import com.facebook.react.uimanager.events.RCTEventEmitter import com.facebook.react.uimanager.events.RCTEventEmitter
import com.google.mlkit.vision.barcode.common.Barcode
import com.mrousavy.camera.parsers.CodeType
fun CameraView.invokeOnInitialized() { fun CameraView.invokeOnInitialized() {
Log.i(CameraView.TAG, "invokeOnInitialized()") Log.i(CameraView.TAG, "invokeOnInitialized()")
@ -37,6 +39,31 @@ fun CameraView.invokeOnViewReady() {
reactContext.getJSModule(RCTEventEmitter::class.java).receiveEvent(id, "cameraViewReady", event) reactContext.getJSModule(RCTEventEmitter::class.java).receiveEvent(id, "cameraViewReady", event)
} }
fun CameraView.invokeOnCodeScanned(barcodes: List<Barcode>) {
val codes = Arguments.createArray()
barcodes.forEach { barcode ->
val code = Arguments.createMap()
val type = CodeType.fromBarcodeType(barcode.format)
code.putString("type", type.unionValue)
code.putString("value", barcode.rawValue)
barcode.boundingBox?.let { rect ->
val frame = Arguments.createMap()
frame.putInt("x", rect.left)
frame.putInt("y", rect.top)
frame.putInt("width", rect.right - rect.left)
frame.putInt("height", rect.bottom - rect.top)
code.putMap("frame", frame)
}
codes.pushMap(code)
}
val event = Arguments.createMap()
event.putArray("codes", codes)
val reactContext = context as ReactContext
reactContext.getJSModule(RCTEventEmitter::class.java).receiveEvent(id, "cameraCodeScanned", event)
}
private fun errorToMap(error: Throwable): WritableMap { private fun errorToMap(error: Throwable): WritableMap {
val map = Arguments.createMap() val map = Arguments.createMap()
map.putString("message", error.message) map.putString("message", error.message)

View File

@ -22,6 +22,7 @@ import com.mrousavy.camera.extensions.getPreviewTargetSize
import com.mrousavy.camera.extensions.installHierarchyFitter import com.mrousavy.camera.extensions.installHierarchyFitter
import com.mrousavy.camera.extensions.smaller import com.mrousavy.camera.extensions.smaller
import com.mrousavy.camera.frameprocessor.FrameProcessor import com.mrousavy.camera.frameprocessor.FrameProcessor
import com.mrousavy.camera.parsers.CodeScanner
import com.mrousavy.camera.parsers.Orientation import com.mrousavy.camera.parsers.Orientation
import com.mrousavy.camera.parsers.PixelFormat import com.mrousavy.camera.parsers.PixelFormat
import com.mrousavy.camera.parsers.ResizeMode import com.mrousavy.camera.parsers.ResizeMode
@ -47,7 +48,7 @@ class CameraView(context: Context) : FrameLayout(context) {
private val propsThatRequirePreviewReconfiguration = arrayListOf("cameraId", "format", "resizeMode") private val propsThatRequirePreviewReconfiguration = arrayListOf("cameraId", "format", "resizeMode")
private val propsThatRequireSessionReconfiguration = private val propsThatRequireSessionReconfiguration =
arrayListOf("cameraId", "format", "photo", "video", "enableFrameProcessor", "pixelFormat") arrayListOf("cameraId", "format", "photo", "video", "enableFrameProcessor", "codeScannerOptions", "pixelFormat")
private val propsThatRequireFormatReconfiguration = arrayListOf("fps", "hdr", "videoStabilizationMode", "lowLightBoost") private val propsThatRequireFormatReconfiguration = arrayListOf("fps", "hdr", "videoStabilizationMode", "lowLightBoost")
} }
@ -80,6 +81,9 @@ class CameraView(context: Context) : FrameLayout(context) {
var orientation: Orientation? = null var orientation: Orientation? = null
var enableZoomGesture: Boolean = false var enableZoomGesture: Boolean = false
// code scanner
var codeScannerOptions: CodeScanner? = null
// private properties // private properties
private var isMounted = false private var isMounted = false
internal val cameraManager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager internal val cameraManager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
@ -202,6 +206,7 @@ class CameraView(context: Context) : FrameLayout(context) {
val targetPhotoSize = if (format != null) Size(format.getInt("photoWidth"), format.getInt("photoHeight")) else null val targetPhotoSize = if (format != null) Size(format.getInt("photoWidth"), format.getInt("photoHeight")) else null
// TODO: Allow previewSurface to be null/none // TODO: Allow previewSurface to be null/none
val previewSurface = previewSurface ?: return val previewSurface = previewSurface ?: return
val codeScannerOptions = codeScannerOptions
val previewOutput = CameraOutputs.PreviewOutput(previewSurface, previewView?.targetSize) val previewOutput = CameraOutputs.PreviewOutput(previewSurface, previewView?.targetSize)
val photoOutput = if (photo == true) { val photoOutput = if (photo == true) {
@ -214,8 +219,17 @@ class CameraView(context: Context) : FrameLayout(context) {
} else { } else {
null null
} }
val codeScanner = if (codeScannerOptions != null) {
CameraOutputs.CodeScannerOutput(
codeScannerOptions,
{ codes -> invokeOnCodeScanned(codes) },
{ error -> invokeOnError(error) }
)
} else {
null
}
cameraSession.configureSession(cameraId, previewOutput, photoOutput, videoOutput) cameraSession.configureSession(cameraId, previewOutput, photoOutput, videoOutput, codeScanner)
} catch (e: Throwable) { } catch (e: Throwable) {
Log.e(TAG, "Failed to configure session: ${e.message}", e) Log.e(TAG, "Failed to configure session: ${e.message}", e)
invokeOnError(e) invokeOnError(e)

View File

@ -5,6 +5,7 @@ import com.facebook.react.common.MapBuilder
import com.facebook.react.uimanager.ThemedReactContext import com.facebook.react.uimanager.ThemedReactContext
import com.facebook.react.uimanager.ViewGroupManager import com.facebook.react.uimanager.ViewGroupManager
import com.facebook.react.uimanager.annotations.ReactProp import com.facebook.react.uimanager.annotations.ReactProp
import com.mrousavy.camera.parsers.CodeScanner
import com.mrousavy.camera.parsers.Orientation import com.mrousavy.camera.parsers.Orientation
import com.mrousavy.camera.parsers.PixelFormat import com.mrousavy.camera.parsers.PixelFormat
import com.mrousavy.camera.parsers.ResizeMode import com.mrousavy.camera.parsers.ResizeMode
@ -27,6 +28,7 @@ class CameraViewManager : ViewGroupManager<CameraView>() {
.put("cameraViewReady", MapBuilder.of("registrationName", "onViewReady")) .put("cameraViewReady", MapBuilder.of("registrationName", "onViewReady"))
.put("cameraInitialized", MapBuilder.of("registrationName", "onInitialized")) .put("cameraInitialized", MapBuilder.of("registrationName", "onInitialized"))
.put("cameraError", MapBuilder.of("registrationName", "onError")) .put("cameraError", MapBuilder.of("registrationName", "onError"))
.put("cameraCodeScanned", MapBuilder.of("registrationName", "onCodeScanned"))
.build() .build()
override fun getName(): String = TAG override fun getName(): String = TAG
@ -200,6 +202,15 @@ class CameraViewManager : ViewGroupManager<CameraView>() {
view.orientation = newMode view.orientation = newMode
} }
@ReactProp(name = "codeScannerOptions")
fun setCodeScanner(view: CameraView, codeScannerOptions: ReadableMap) {
val newCodeScannerOptions = CodeScanner(codeScannerOptions)
if (view.codeScannerOptions != newCodeScannerOptions) {
addChangedPropToTransaction(view, "codeScannerOptions")
}
view.codeScannerOptions = newCodeScannerOptions
}
companion object { companion object {
const val TAG = "CameraView" const val TAG = "CameraView"

View File

@ -139,7 +139,8 @@ class CameraSession(
cameraId: String, cameraId: String,
preview: CameraOutputs.PreviewOutput? = null, preview: CameraOutputs.PreviewOutput? = null,
photo: CameraOutputs.PhotoOutput? = null, photo: CameraOutputs.PhotoOutput? = null,
video: CameraOutputs.VideoOutput? = null video: CameraOutputs.VideoOutput? = null,
codeScanner: CameraOutputs.CodeScannerOutput? = null
) { ) {
Log.i(TAG, "Configuring Session for Camera $cameraId...") Log.i(TAG, "Configuring Session for Camera $cameraId...")
val outputs = CameraOutputs( val outputs = CameraOutputs(
@ -148,6 +149,7 @@ class CameraSession(
preview, preview,
photo, photo,
video, video,
codeScanner,
hdr == true, hdr == true,
this this
) )
@ -190,6 +192,7 @@ class CameraSession(
currentOutputs.preview, currentOutputs.preview,
currentOutputs.photo, currentOutputs.photo,
currentOutputs.video, currentOutputs.video,
currentOutputs.codeScanner,
hdr, hdr,
this this
) )
@ -534,11 +537,15 @@ class CameraSession(
val template = if (outputs.videoOutput != null) CameraDevice.TEMPLATE_RECORD else CameraDevice.TEMPLATE_PREVIEW val template = if (outputs.videoOutput != null) CameraDevice.TEMPLATE_RECORD else CameraDevice.TEMPLATE_PREVIEW
val captureRequest = camera.createCaptureRequest(template) val captureRequest = camera.createCaptureRequest(template)
outputs.previewOutput?.let { output -> outputs.previewOutput?.let { output ->
Log.i(TAG, "Adding output surface ${output.outputType}..") Log.i(TAG, "Adding preview output surface ${output.outputType}..")
captureRequest.addTarget(output.surface) captureRequest.addTarget(output.surface)
} }
outputs.videoOutput?.let { output -> outputs.videoOutput?.let { output ->
Log.i(TAG, "Adding output surface ${output.outputType}..") Log.i(TAG, "Adding video output surface ${output.outputType}..")
captureRequest.addTarget(output.surface)
}
outputs.codeScannerOutput?.let { output ->
Log.i(TAG, "Adding code scanner output surface ${output.outputType}")
captureRequest.addTarget(output.surface) captureRequest.addTarget(output.surface)
} }

View File

@ -0,0 +1,19 @@
package com.mrousavy.camera.core.outputs
import android.media.ImageReader
import android.util.Log
import com.google.mlkit.vision.barcode.BarcodeScanner
import java.io.Closeable
class BarcodeScannerOutput(private val imageReader: ImageReader, private val barcodeScanner: BarcodeScanner) :
ImageReaderOutput(imageReader, OutputType.VIDEO),
Closeable {
override fun close() {
Log.i(TAG, "Closing BarcodeScanner..")
barcodeScanner.close()
super.close()
}
override fun toString(): String =
"$outputType (${imageReader.width} x ${imageReader.height} ${barcodeScanner.detectorType} BarcodeScanner)"
}

View File

@ -8,6 +8,10 @@ import android.media.ImageReader
import android.util.Log import android.util.Log
import android.util.Size import android.util.Size
import android.view.Surface import android.view.Surface
import com.google.mlkit.vision.barcode.BarcodeScannerOptions
import com.google.mlkit.vision.barcode.BarcodeScanning
import com.google.mlkit.vision.barcode.common.Barcode
import com.google.mlkit.vision.common.InputImage
import com.mrousavy.camera.CameraQueues import com.mrousavy.camera.CameraQueues
import com.mrousavy.camera.core.VideoPipeline import com.mrousavy.camera.core.VideoPipeline
import com.mrousavy.camera.extensions.bigger import com.mrousavy.camera.extensions.bigger
@ -16,6 +20,8 @@ import com.mrousavy.camera.extensions.getPhotoSizes
import com.mrousavy.camera.extensions.getPreviewTargetSize import com.mrousavy.camera.extensions.getPreviewTargetSize
import com.mrousavy.camera.extensions.getVideoSizes import com.mrousavy.camera.extensions.getVideoSizes
import com.mrousavy.camera.extensions.smaller import com.mrousavy.camera.extensions.smaller
import com.mrousavy.camera.parsers.CodeScanner
import com.mrousavy.camera.parsers.Orientation
import com.mrousavy.camera.parsers.PixelFormat import com.mrousavy.camera.parsers.PixelFormat
import java.io.Closeable import java.io.Closeable
@ -25,6 +31,7 @@ class CameraOutputs(
val preview: PreviewOutput? = null, val preview: PreviewOutput? = null,
val photo: PhotoOutput? = null, val photo: PhotoOutput? = null,
val video: VideoOutput? = null, val video: VideoOutput? = null,
val codeScanner: CodeScannerOutput? = null,
val enableHdr: Boolean? = false, val enableHdr: Boolean? = false,
val callback: Callback val callback: Callback
) : Closeable { ) : Closeable {
@ -41,6 +48,11 @@ class CameraOutputs(
val enableFrameProcessor: Boolean? = false, val enableFrameProcessor: Boolean? = false,
val format: PixelFormat = PixelFormat.NATIVE val format: PixelFormat = PixelFormat.NATIVE
) )
data class CodeScannerOutput(
val codeScanner: CodeScanner,
val onCodeScanned: (codes: List<Barcode>) -> Unit,
val onError: (error: Throwable) -> Unit
)
interface Callback { interface Callback {
fun onPhotoCaptured(image: Image) fun onPhotoCaptured(image: Image)
@ -52,6 +64,8 @@ class CameraOutputs(
private set private set
var videoOutput: VideoPipelineOutput? = null var videoOutput: VideoPipelineOutput? = null
private set private set
var codeScannerOutput: BarcodeScannerOutput? = null
private set
val size: Int val size: Int
get() { get() {
@ -59,6 +73,7 @@ class CameraOutputs(
if (previewOutput != null) size++ if (previewOutput != null) size++
if (photoOutput != null) size++ if (photoOutput != null) size++
if (videoOutput != null) size++ if (videoOutput != null) size++
if (codeScannerOutput != null) size++
return size return size
} }
@ -72,6 +87,7 @@ class CameraOutputs(
this.video?.enableRecording == other.video?.enableRecording && this.video?.enableRecording == other.video?.enableRecording &&
this.video?.targetSize == other.video?.targetSize && this.video?.targetSize == other.video?.targetSize &&
this.video?.format == other.video?.format && this.video?.format == other.video?.format &&
this.codeScanner?.codeScanner == other.codeScanner?.codeScanner &&
this.enableHdr == other.enableHdr this.enableHdr == other.enableHdr
} }
@ -80,12 +96,14 @@ class CameraOutputs(
result += (preview?.hashCode() ?: 0) result += (preview?.hashCode() ?: 0)
result += (photo?.hashCode() ?: 0) result += (photo?.hashCode() ?: 0)
result += (video?.hashCode() ?: 0) result += (video?.hashCode() ?: 0)
result += (codeScanner?.hashCode() ?: 0)
return result return result
} }
override fun close() { override fun close() {
photoOutput?.close() photoOutput?.close()
videoOutput?.close() videoOutput?.close()
codeScannerOutput?.close()
} }
override fun toString(): String { override fun toString(): String {
@ -93,6 +111,7 @@ class CameraOutputs(
previewOutput?.let { strings.add(it.toString()) } previewOutput?.let { strings.add(it.toString()) }
photoOutput?.let { strings.add(it.toString()) } photoOutput?.let { strings.add(it.toString()) }
videoOutput?.let { strings.add(it.toString()) } videoOutput?.let { strings.add(it.toString()) }
codeScannerOutput?.let { strings.add(it.toString()) }
return strings.joinToString(", ", "[", "]") return strings.joinToString(", ", "[", "]")
} }
@ -144,6 +163,47 @@ class CameraOutputs(
videoOutput = VideoPipelineOutput(videoPipeline, SurfaceOutput.OutputType.VIDEO) videoOutput = VideoPipelineOutput(videoPipeline, SurfaceOutput.OutputType.VIDEO)
} }
// Code Scanner
if (codeScanner != null) {
val format = ImageFormat.YUV_420_888
val targetSize = Size(1280, 720)
val size = characteristics.getVideoSizes(cameraId, format).closestToOrMax(targetSize)
val types = codeScanner.codeScanner.codeTypes.map { it.toBarcodeType() }
val barcodeScannerOptions = BarcodeScannerOptions.Builder()
.setBarcodeFormats(types[0], *types.toIntArray())
.setExecutor(CameraQueues.codeScannerQueue.executor)
.build()
val scanner = BarcodeScanning.getClient(barcodeScannerOptions)
var isBusy = false
val imageReader = ImageReader.newInstance(size.width, size.height, format, 1)
imageReader.setOnImageAvailableListener({ reader ->
if (isBusy) return@setOnImageAvailableListener
val image = reader.acquireNextImage() ?: return@setOnImageAvailableListener
isBusy = true
// TODO: Get correct orientation
val inputImage = InputImage.fromMediaImage(image, Orientation.PORTRAIT.toDegrees())
scanner.process(inputImage)
.addOnSuccessListener { barcodes ->
image.close()
isBusy = false
if (barcodes.isNotEmpty()) {
codeScanner.onCodeScanned(barcodes)
}
}
.addOnFailureListener { error ->
image.close()
isBusy = false
codeScanner.onError(error)
}
}, CameraQueues.videoQueue.handler)
Log.i(TAG, "Adding ${size.width}x${size.height} code scanner output. (Code Types: $types)")
codeScannerOutput = BarcodeScannerOutput(imageReader, scanner)
}
Log.i(TAG, "Prepared $size Outputs for Camera $cameraId!") Log.i(TAG, "Prepared $size Outputs for Camera $cameraId!")
} }
} }

View File

@ -5,7 +5,7 @@ import android.util.Log
import android.util.Size import android.util.Size
import java.io.Closeable import java.io.Closeable
class ImageReaderOutput(private val imageReader: ImageReader, outputType: OutputType, dynamicRangeProfile: Long? = null) : open class ImageReaderOutput(private val imageReader: ImageReader, outputType: OutputType, dynamicRangeProfile: Long? = null) :
SurfaceOutput( SurfaceOutput(
imageReader.surface, imageReader.surface,
Size(imageReader.width, imageReader.height), Size(imageReader.width, imageReader.height),
@ -16,6 +16,7 @@ class ImageReaderOutput(private val imageReader: ImageReader, outputType: Output
override fun close() { override fun close() {
Log.i(TAG, "Closing ${imageReader.width}x${imageReader.height} $outputType ImageReader..") Log.i(TAG, "Closing ${imageReader.width}x${imageReader.height} $outputType ImageReader..")
imageReader.close() imageReader.close()
super.close()
} }
override fun toString(): String = "$outputType (${imageReader.width} x ${imageReader.height} in format #${imageReader.imageFormat})" override fun toString(): String = "$outputType (${imageReader.width} x ${imageReader.height} in format #${imageReader.imageFormat})"

View File

@ -16,6 +16,7 @@ class VideoPipelineOutput(val videoPipeline: VideoPipeline, outputType: OutputTy
override fun close() { override fun close() {
Log.i(TAG, "Closing ${videoPipeline.width}x${videoPipeline.height} Video Pipeline..") Log.i(TAG, "Closing ${videoPipeline.width}x${videoPipeline.height} Video Pipeline..")
videoPipeline.close() videoPipeline.close()
super.close()
} }
override fun toString(): String = "$outputType (${videoPipeline.width} x ${videoPipeline.height} in format #${videoPipeline.format})" override fun toString(): String = "$outputType (${videoPipeline.width} x ${videoPipeline.height} in format #${videoPipeline.format})"

View File

@ -62,6 +62,9 @@ suspend fun CameraDevice.createCaptureSession(
outputs.videoOutput?.let { output -> outputs.videoOutput?.let { output ->
outputConfigurations.add(output.toOutputConfiguration(characteristics)) outputConfigurations.add(output.toOutputConfiguration(characteristics))
} }
outputs.codeScannerOutput?.let { output ->
outputConfigurations.add(output.toOutputConfiguration(characteristics))
}
if (outputs.enableHdr == true && Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) { if (outputs.enableHdr == true && Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
val supportedProfiles = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES) val supportedProfiles = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES)
val hdrProfile = supportedProfiles?.bestProfile ?: supportedProfiles?.supportedProfiles?.firstOrNull() val hdrProfile = supportedProfiles?.bestProfile ?: supportedProfiles?.supportedProfiles?.firstOrNull()

View File

@ -0,0 +1,22 @@
package com.mrousavy.camera.parsers
import com.facebook.react.bridge.ReadableMap
import com.mrousavy.camera.InvalidTypeScriptUnionError
class CodeScanner(map: ReadableMap) {
val codeTypes: List<CodeType>
init {
val codeTypes = map.getArray("codeTypes")?.toArrayList() ?: throw InvalidTypeScriptUnionError("codeScanner", map.toString())
this.codeTypes = codeTypes.map {
return@map CodeType.fromUnionValue(it as String)
}
}
override fun equals(other: Any?): Boolean {
if (other !is CodeScanner) return false
return codeTypes.size == other.codeTypes.size && codeTypes.containsAll(other.codeTypes)
}
override fun hashCode(): Int = codeTypes.hashCode()
}

View File

@ -0,0 +1,74 @@
package com.mrousavy.camera.parsers
import com.google.mlkit.vision.barcode.common.Barcode
import com.mrousavy.camera.CodeTypeNotSupportedError
import com.mrousavy.camera.InvalidTypeScriptUnionError
enum class CodeType(override val unionValue: String) : JSUnionValue {
CODE_128("code-128"),
CODE_39("code-39"),
CODE_93("code-93"),
CODABAR("codabar"),
EAN_13("ean-13"),
EAN_8("ean-8"),
ITF("itf"),
UPC_E("upc-e"),
QR("qr"),
PDF_417("pdf-417"),
AZTEC("aztec"),
DATA_MATRIX("data-matrix"),
UNKNOWN("unknown");
fun toBarcodeType(): Int =
when (this) {
CODE_128 -> Barcode.FORMAT_CODE_128
CODE_39 -> Barcode.FORMAT_CODE_39
CODE_93 -> Barcode.FORMAT_CODE_93
CODABAR -> Barcode.FORMAT_CODABAR
EAN_13 -> Barcode.FORMAT_EAN_13
EAN_8 -> Barcode.FORMAT_EAN_8
ITF -> Barcode.FORMAT_ITF
UPC_E -> Barcode.FORMAT_UPC_E
QR -> Barcode.FORMAT_QR_CODE
PDF_417 -> Barcode.FORMAT_PDF417
AZTEC -> Barcode.FORMAT_AZTEC
DATA_MATRIX -> Barcode.FORMAT_DATA_MATRIX
UNKNOWN -> throw CodeTypeNotSupportedError(this.unionValue)
}
companion object : JSUnionValue.Companion<CodeType> {
fun fromBarcodeType(barcodeType: Int): CodeType =
when (barcodeType) {
Barcode.FORMAT_CODE_128 -> CODE_128
Barcode.FORMAT_CODE_39 -> CODE_39
Barcode.FORMAT_CODE_93 -> CODE_93
Barcode.FORMAT_CODABAR -> CODABAR
Barcode.FORMAT_EAN_13 -> EAN_13
Barcode.FORMAT_EAN_8 -> EAN_8
Barcode.FORMAT_ITF -> ITF
Barcode.FORMAT_UPC_E -> UPC_E
Barcode.FORMAT_QR_CODE -> QR
Barcode.FORMAT_PDF417 -> PDF_417
Barcode.FORMAT_AZTEC -> AZTEC
Barcode.FORMAT_DATA_MATRIX -> DATA_MATRIX
else -> UNKNOWN
}
override fun fromUnionValue(unionValue: String?): CodeType =
when (unionValue) {
"code-128" -> CODE_128
"code-39" -> CODE_39
"code-93" -> CODE_93
"codabar" -> CODABAR
"ean-13" -> EAN_13
"ean-8" -> EAN_8
"itf" -> ITF
"upc-e" -> UPC_E
"qr" -> QR
"pdf-417" -> PDF_417
"aztec" -> AZTEC
"data-matrix" -> DATA_MATRIX
else -> throw InvalidTypeScriptUnionError("codeType", unionValue ?: "(null)")
}
}
}

View File

@ -9,25 +9,23 @@ enum class PixelFormat(override val unionValue: String) : JSUnionValue {
NATIVE("native"), NATIVE("native"),
UNKNOWN("unknown"); UNKNOWN("unknown");
fun toImageFormat(): Int { fun toImageFormat(): Int =
return when (this) { when (this) {
YUV -> ImageFormat.YUV_420_888 YUV -> ImageFormat.YUV_420_888
NATIVE -> ImageFormat.PRIVATE NATIVE -> ImageFormat.PRIVATE
else -> throw PixelFormatNotSupportedError(this.unionValue) else -> throw PixelFormatNotSupportedError(this.unionValue)
} }
}
companion object : JSUnionValue.Companion<PixelFormat> { companion object : JSUnionValue.Companion<PixelFormat> {
fun fromImageFormat(imageFormat: Int): PixelFormat { fun fromImageFormat(imageFormat: Int): PixelFormat =
return when (imageFormat) { when (imageFormat) {
ImageFormat.YUV_420_888 -> YUV ImageFormat.YUV_420_888 -> YUV
ImageFormat.PRIVATE -> NATIVE ImageFormat.PRIVATE -> NATIVE
else -> UNKNOWN else -> UNKNOWN
} }
}
override fun fromUnionValue(unionValue: String?): PixelFormat? { override fun fromUnionValue(unionValue: String?): PixelFormat? =
return when (unionValue) { when (unionValue) {
"yuv" -> YUV "yuv" -> YUV
"rgb" -> RGB "rgb" -> RGB
"native" -> NATIVE "native" -> NATIVE
@ -36,4 +34,3 @@ enum class PixelFormat(override val unionValue: String) : JSUnionValue {
} }
} }
} }
}

View File

@ -229,6 +229,31 @@ enum CaptureError {
} }
} }
// MARK: - CodeScannerError
enum CodeScannerError {
case notCompatibleWithOutputs
case codeTypeNotSupported(codeType: String)
var code: String {
switch self {
case .notCompatibleWithOutputs:
return "not-compatible-with-outputs"
case .codeTypeNotSupported:
return "code-type-not-supported"
}
}
var message: String {
switch self {
case .notCompatibleWithOutputs:
return "The Code Scanner is not supported in combination with the current outputs! Either disable video or photo outputs."
case let .codeTypeNotSupported(codeType: codeType):
return "The codeType \"\(codeType)\" is not supported by the Code Scanner!"
}
}
}
// MARK: - CameraError // MARK: - CameraError
enum CameraError: Error { enum CameraError: Error {
@ -238,6 +263,7 @@ enum CameraError: Error {
case format(_ id: FormatError) case format(_ id: FormatError)
case session(_ id: SessionError) case session(_ id: SessionError)
case capture(_ id: CaptureError) case capture(_ id: CaptureError)
case codeScanner(_ id: CodeScannerError)
case unknown(message: String? = nil) case unknown(message: String? = nil)
var code: String { var code: String {
@ -254,6 +280,8 @@ enum CameraError: Error {
return "session/\(id.code)" return "session/\(id.code)"
case let .capture(id: id): case let .capture(id: id):
return "capture/\(id.code)" return "capture/\(id.code)"
case let .codeScanner(id: id):
return "code-scanner/\(id.code)"
case .unknown: case .unknown:
return "unknown/unknown" return "unknown/unknown"
} }
@ -273,6 +301,8 @@ enum CameraError: Error {
return id.message return id.message
case let .capture(id: id): case let .capture(id: id):
return id.message return id.message
case let .codeScanner(id: id):
return id.message
case let .unknown(message: message): case let .unknown(message: message):
return message ?? "An unexpected error occured." return message ?? "An unexpected error occured."
} }

View File

@ -24,6 +24,13 @@ public class CameraQueues: NSObject {
autoreleaseFrequency: .inherit, autoreleaseFrequency: .inherit,
target: nil) target: nil)
/// The serial execution queue for output processing of QR/barcodes.
@objc public static let codeScannerQueue = DispatchQueue(label: "mrousavy/VisionCamera.codeScanner",
qos: .userInteractive,
attributes: [],
autoreleaseFrequency: .inherit,
target: nil)
/// The serial execution queue for output processing of audio buffers. /// The serial execution queue for output processing of audio buffers.
@objc public static let audioQueue = DispatchQueue(label: "mrousavy/VisionCamera.audio", @objc public static let audioQueue = DispatchQueue(label: "mrousavy/VisionCamera.audio",
qos: .userInteractive, qos: .userInteractive,

View File

@ -124,6 +124,34 @@ extension CameraView {
captureSession.addOutput(videoOutput!) captureSession.addOutput(videoOutput!)
} }
// Code Scanner
if let codeScannerOptions = codeScannerOptions {
guard let codeScanner = try? CodeScanner(fromJsValue: codeScannerOptions) else {
invokeOnError(.parameter(.invalid(unionName: "codeScanner", receivedValue: codeScannerOptions.description)))
return
}
let metadataOutput = AVCaptureMetadataOutput()
guard captureSession.canAddOutput(metadataOutput) else {
invokeOnError(.codeScanner(.notCompatibleWithOutputs))
return
}
captureSession.addOutput(metadataOutput)
for codeType in codeScanner.codeTypes {
// swiftlint:disable:next for_where
if !metadataOutput.availableMetadataObjectTypes.contains(codeType) {
invokeOnError(.codeScanner(.codeTypeNotSupported(codeType: codeType.descriptor)))
return
}
}
metadataOutput.setMetadataObjectsDelegate(self, queue: CameraQueues.codeScannerQueue)
metadataOutput.metadataObjectTypes = codeScanner.codeTypes
if let rectOfInterest = codeScanner.regionOfInterest {
metadataOutput.rectOfInterest = rectOfInterest
}
}
if outputOrientation != .portrait { if outputOrientation != .portrait {
updateOrientation() updateOrientation()
} }

View File

@ -0,0 +1,45 @@
//
// CameraView+CodeScanner.swift
// VisionCamera
//
// Created by Marc Rousavy on 03.10.23.
// Copyright © 2023 mrousavy. All rights reserved.
//
import AVFoundation
import Foundation
extension CameraView: AVCaptureMetadataOutputObjectsDelegate {
public func metadataOutput(_: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from _: AVCaptureConnection) {
guard let onCodeScanned = onCodeScanned else {
return
}
guard !metadataObjects.isEmpty else {
return
}
// Map codes to JS values
let codes = metadataObjects.map { object in
var value: String?
if let code = object as? AVMetadataMachineReadableCodeObject {
value = code.stringValue
}
let frame = previewView.layerRectConverted(fromMetadataOutputRect: object.bounds)
return [
"type": object.type.descriptor,
"value": value as Any,
"frame": [
"x": frame.origin.x,
"y": frame.origin.y,
"width": frame.size.width,
"height": frame.size.height,
],
]
}
// Call JS event
onCodeScanned([
"codes": codes,
])
}
}

View File

@ -27,7 +27,8 @@ private let propsThatRequireReconfiguration = ["cameraId",
"video", "video",
"enableFrameProcessor", "enableFrameProcessor",
"hdr", "hdr",
"pixelFormat"] "pixelFormat",
"codeScannerOptions"]
private let propsThatRequireDeviceReconfiguration = ["fps", private let propsThatRequireDeviceReconfiguration = ["fps",
"lowLightBoost"] "lowLightBoost"]
@ -46,6 +47,7 @@ public final class CameraView: UIView {
@objc var video: NSNumber? // nullable bool @objc var video: NSNumber? // nullable bool
@objc var audio: NSNumber? // nullable bool @objc var audio: NSNumber? // nullable bool
@objc var enableFrameProcessor = false @objc var enableFrameProcessor = false
@objc var codeScannerOptions: NSDictionary?
@objc var pixelFormat: NSString? @objc var pixelFormat: NSString?
// props that require format reconfiguring // props that require format reconfiguring
@objc var format: NSDictionary? @objc var format: NSDictionary?
@ -69,6 +71,7 @@ public final class CameraView: UIView {
@objc var onInitialized: RCTDirectEventBlock? @objc var onInitialized: RCTDirectEventBlock?
@objc var onError: RCTDirectEventBlock? @objc var onError: RCTDirectEventBlock?
@objc var onViewReady: RCTDirectEventBlock? @objc var onViewReady: RCTDirectEventBlock?
@objc var onCodeScanned: RCTDirectEventBlock?
// zoom // zoom
@objc var enableZoomGesture = false { @objc var enableZoomGesture = false {
didSet { didSet {

View File

@ -51,6 +51,9 @@ RCT_EXPORT_VIEW_PROPERTY(resizeMode, NSString);
RCT_EXPORT_VIEW_PROPERTY(onError, RCTDirectEventBlock); RCT_EXPORT_VIEW_PROPERTY(onError, RCTDirectEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onInitialized, RCTDirectEventBlock); RCT_EXPORT_VIEW_PROPERTY(onInitialized, RCTDirectEventBlock);
RCT_EXPORT_VIEW_PROPERTY(onViewReady, RCTDirectEventBlock); RCT_EXPORT_VIEW_PROPERTY(onViewReady, RCTDirectEventBlock);
// Code Scanner
RCT_EXPORT_VIEW_PROPERTY(codeScannerOptions, NSDictionary);
RCT_EXPORT_VIEW_PROPERTY(onCodeScanned, RCTDirectEventBlock);
// Camera View Functions // Camera View Functions
RCT_EXTERN_METHOD(startRecording RCT_EXTERN_METHOD(startRecording

View File

@ -0,0 +1,94 @@
//
// AVMetadataObject.ObjectType+descriptor.swift
// VisionCamera
//
// Created by Marc Rousavy on 03.10.23.
// Copyright © 2023 mrousavy. All rights reserved.
//
import AVFoundation
import Foundation
extension AVMetadataObject.ObjectType {
init(withString string: String) throws {
switch string {
case "code-128":
self = .code128
return
case "code-39":
self = .code39
return
case "code-93":
self = .code93
return
case "codabar":
if #available(iOS 15.4, *) {
self = .codabar
} else {
throw CameraError.codeScanner(.codeTypeNotSupported(codeType: string))
}
return
case "ean-13":
self = .ean13
return
case "ean-8":
self = .ean8
return
case "itf":
self = .itf14
return
case "upc-e":
self = .upce
return
case "qr":
self = .qr
return
case "pdf-417":
self = .pdf417
return
case "aztec":
self = .aztec
return
case "data-matrix":
self = .dataMatrix
return
default:
throw EnumParserError.invalidValue
}
}
var descriptor: String {
if #available(iOS 15.4, *) {
if self == .codabar {
return "codabar"
}
}
switch self {
case .code128:
return "code-128"
case .code39:
return "code-39"
case .code93:
return "code-93"
case .ean13:
return "ean-13"
case .ean8:
return "ean-8"
case .itf14:
return "itf"
case .upce:
return "upce"
case .qr:
return "qr"
case .pdf417:
return "pdf-417"
case .aztec:
return "aztec"
case .dataMatrix:
return "data-matrix"
default:
return "unknown"
}
}
}

View File

@ -38,6 +38,10 @@ class PreviewView: UIView {
return AVCaptureVideoPreviewLayer.self return AVCaptureVideoPreviewLayer.self
} }
func layerRectConverted(fromMetadataOutputRect rect: CGRect) -> CGRect {
return videoPreviewLayer.layerRectConverted(fromMetadataOutputRect: rect)
}
init(frame: CGRect, session: AVCaptureSession) { init(frame: CGRect, session: AVCaptureSession) {
super.init(frame: frame) super.init(frame: frame)
videoPreviewLayer.session = session videoPreviewLayer.session = session

View File

@ -0,0 +1,45 @@
//
// CodeScanner.swift
// VisionCamera
//
// Created by Marc Rousavy on 03.10.23.
// Copyright © 2023 mrousavy. All rights reserved.
//
import AVFoundation
import Foundation
class CodeScanner {
let codeTypes: [AVMetadataObject.ObjectType]
let interval: Int
let regionOfInterest: CGRect?
init(fromJsValue dictionary: NSDictionary) throws {
if let codeTypes = dictionary["codeTypes"] as? [String] {
self.codeTypes = try codeTypes.map { value in
return try AVMetadataObject.ObjectType(withString: value)
}
} else {
throw CameraError.parameter(.invalidCombination(provided: "codeScanner", missing: "codeTypes"))
}
if let interval = dictionary["interval"] as? Double {
self.interval = Int(interval)
} else {
interval = 300
}
if let regionOfInterest = dictionary["regionOfInterest"] as? NSDictionary {
guard let x = regionOfInterest["x"] as? Double,
let y = regionOfInterest["y"] as? Double,
let width = regionOfInterest["width"] as? Double,
let height = regionOfInterest["height"] as? Double else {
throw CameraError.parameter(.invalid(unionName: "regionOfInterest", receivedValue: regionOfInterest.description))
}
self.regionOfInterest = CGRect(x: x, y: y, width: width, height: height)
} else {
regionOfInterest = nil
}
}
}

View File

@ -65,6 +65,9 @@
B8DB3BCA263DC4D8004C18D7 /* RecordingSession.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8DB3BC9263DC4D8004C18D7 /* RecordingSession.swift */; }; B8DB3BCA263DC4D8004C18D7 /* RecordingSession.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8DB3BC9263DC4D8004C18D7 /* RecordingSession.swift */; };
B8DB3BCC263DC97E004C18D7 /* AVFileType+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8DB3BCB263DC97E004C18D7 /* AVFileType+descriptor.swift */; }; B8DB3BCC263DC97E004C18D7 /* AVFileType+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8DB3BCB263DC97E004C18D7 /* AVFileType+descriptor.swift */; };
B8E957D02A693AD2008F5480 /* CameraView+Torch.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8E957CF2A693AD2008F5480 /* CameraView+Torch.swift */; }; B8E957D02A693AD2008F5480 /* CameraView+Torch.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8E957CF2A693AD2008F5480 /* CameraView+Torch.swift */; };
B8FF60AC2ACC93EF009D612F /* CameraView+CodeScanner.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8FF60AB2ACC93EF009D612F /* CameraView+CodeScanner.swift */; };
B8FF60AE2ACC9731009D612F /* CodeScanner.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8FF60AD2ACC9731009D612F /* CodeScanner.swift */; };
B8FF60B12ACC981B009D612F /* AVMetadataObject.ObjectType+descriptor.swift in Sources */ = {isa = PBXBuildFile; fileRef = B8FF60B02ACC981B009D612F /* AVMetadataObject.ObjectType+descriptor.swift */; };
/* End PBXBuildFile section */ /* End PBXBuildFile section */
/* Begin PBXCopyFilesBuildPhase section */ /* Begin PBXCopyFilesBuildPhase section */
@ -152,6 +155,9 @@
B8F0825E2A6046FC00C17EB6 /* FrameProcessor.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FrameProcessor.h; sourceTree = "<group>"; }; B8F0825E2A6046FC00C17EB6 /* FrameProcessor.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = FrameProcessor.h; sourceTree = "<group>"; };
B8F0825F2A60491900C17EB6 /* FrameProcessor.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = FrameProcessor.mm; sourceTree = "<group>"; }; B8F0825F2A60491900C17EB6 /* FrameProcessor.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = FrameProcessor.mm; sourceTree = "<group>"; };
B8F7DDD1266F715D00120533 /* Frame.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = Frame.m; sourceTree = "<group>"; }; B8F7DDD1266F715D00120533 /* Frame.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = Frame.m; sourceTree = "<group>"; };
B8FF60AB2ACC93EF009D612F /* CameraView+CodeScanner.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "CameraView+CodeScanner.swift"; sourceTree = "<group>"; };
B8FF60AD2ACC9731009D612F /* CodeScanner.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = CodeScanner.swift; sourceTree = "<group>"; };
B8FF60B02ACC981B009D612F /* AVMetadataObject.ObjectType+descriptor.swift */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.swift; path = "AVMetadataObject.ObjectType+descriptor.swift"; sourceTree = "<group>"; };
/* End PBXFileReference section */ /* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */ /* Begin PBXFrameworksBuildPhase section */
@ -188,6 +194,7 @@
B887518025E0102000DB86D6 /* CameraView+Focus.swift */, B887518025E0102000DB86D6 /* CameraView+Focus.swift */,
B887515D25E0102000DB86D6 /* CameraView+RecordVideo.swift */, B887515D25E0102000DB86D6 /* CameraView+RecordVideo.swift */,
B887517125E0102000DB86D6 /* CameraView+TakePhoto.swift */, B887517125E0102000DB86D6 /* CameraView+TakePhoto.swift */,
B8FF60AB2ACC93EF009D612F /* CameraView+CodeScanner.swift */,
B887518225E0102000DB86D6 /* CameraView+Zoom.swift */, B887518225E0102000DB86D6 /* CameraView+Zoom.swift */,
B86400512784A23400E9D2CA /* CameraView+Orientation.swift */, B86400512784A23400E9D2CA /* CameraView+Orientation.swift */,
B887515F25E0102000DB86D6 /* CameraViewManager.m */, B887515F25E0102000DB86D6 /* CameraViewManager.m */,
@ -208,6 +215,7 @@
isa = PBXGroup; isa = PBXGroup;
children = ( children = (
B80175EB2ABDEBD000E7DE90 /* ResizeMode.swift */, B80175EB2ABDEBD000E7DE90 /* ResizeMode.swift */,
B8FF60AD2ACC9731009D612F /* CodeScanner.swift */,
); );
path = Types; path = Types;
sourceTree = "<group>"; sourceTree = "<group>";
@ -264,6 +272,7 @@
B8DB3BCB263DC97E004C18D7 /* AVFileType+descriptor.swift */, B8DB3BCB263DC97E004C18D7 /* AVFileType+descriptor.swift */,
B864004F27849A2400E9D2CA /* UIInterfaceOrientation+descriptor.swift */, B864004F27849A2400E9D2CA /* UIInterfaceOrientation+descriptor.swift */,
B87B11BE2A8E63B700732EBF /* PixelFormat.swift */, B87B11BE2A8E63B700732EBF /* PixelFormat.swift */,
B8FF60B02ACC981B009D612F /* AVMetadataObject.ObjectType+descriptor.swift */,
); );
path = Parsers; path = Parsers;
sourceTree = "<group>"; sourceTree = "<group>";
@ -413,6 +422,7 @@
B887518525E0102000DB86D6 /* PhotoCaptureDelegate.swift in Sources */, B887518525E0102000DB86D6 /* PhotoCaptureDelegate.swift in Sources */,
B887518B25E0102000DB86D6 /* AVCaptureDevice.Format+isBetterThan.swift in Sources */, B887518B25E0102000DB86D6 /* AVCaptureDevice.Format+isBetterThan.swift in Sources */,
B8BD3BA2266E22D2006C80A2 /* Callback.swift in Sources */, B8BD3BA2266E22D2006C80A2 /* Callback.swift in Sources */,
B8FF60B12ACC981B009D612F /* AVMetadataObject.ObjectType+descriptor.swift in Sources */,
B84760A62608EE7C004C3180 /* FrameHostObject.mm in Sources */, B84760A62608EE7C004C3180 /* FrameHostObject.mm in Sources */,
B864005027849A2400E9D2CA /* UIInterfaceOrientation+descriptor.swift in Sources */, B864005027849A2400E9D2CA /* UIInterfaceOrientation+descriptor.swift in Sources */,
B887518E25E0102000DB86D6 /* AVFrameRateRange+includes.swift in Sources */, B887518E25E0102000DB86D6 /* AVFrameRateRange+includes.swift in Sources */,
@ -429,10 +439,12 @@
B881D35E2ABC775E009B21C8 /* AVCaptureDevice+toDictionary.swift in Sources */, B881D35E2ABC775E009B21C8 /* AVCaptureDevice+toDictionary.swift in Sources */,
B87B11BF2A8E63B700732EBF /* PixelFormat.swift in Sources */, B87B11BF2A8E63B700732EBF /* PixelFormat.swift in Sources */,
B88751A625E0102000DB86D6 /* CameraViewManager.swift in Sources */, B88751A625E0102000DB86D6 /* CameraViewManager.swift in Sources */,
B8FF60AC2ACC93EF009D612F /* CameraView+CodeScanner.swift in Sources */,
B80175EC2ABDEBD000E7DE90 /* ResizeMode.swift in Sources */, B80175EC2ABDEBD000E7DE90 /* ResizeMode.swift in Sources */,
B887519F25E0102000DB86D6 /* AVCaptureDevice.DeviceType+physicalDeviceDescriptor.swift in Sources */, B887519F25E0102000DB86D6 /* AVCaptureDevice.DeviceType+physicalDeviceDescriptor.swift in Sources */,
B8D22CDC2642DB4D00234472 /* AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift in Sources */, B8D22CDC2642DB4D00234472 /* AVAssetWriterInputPixelBufferAdaptor+initWithVideoSettings.swift in Sources */,
B84760DF2608F57D004C3180 /* CameraQueues.swift in Sources */, B84760DF2608F57D004C3180 /* CameraQueues.swift in Sources */,
B8FF60AE2ACC9731009D612F /* CodeScanner.swift in Sources */,
B8446E502ABA14C900E56077 /* CameraDevicesManager.m in Sources */, B8446E502ABA14C900E56077 /* CameraDevicesManager.m in Sources */,
B887519025E0102000DB86D6 /* AVCaptureDevice.Format+matchesFilter.swift in Sources */, B887519025E0102000DB86D6 /* AVCaptureDevice.Format+matchesFilter.swift in Sources */,
B887518F25E0102000DB86D6 /* AVCapturePhotoOutput+mirror.swift in Sources */, B887518F25E0102000DB86D6 /* AVCapturePhotoOutput+mirror.swift in Sources */,

View File

@ -11,21 +11,27 @@ import type { RecordVideoOptions, VideoFile } from './VideoFile'
import { VisionCameraProxy } from './FrameProcessorPlugins' import { VisionCameraProxy } from './FrameProcessorPlugins'
import { CameraDevices } from './CameraDevices' import { CameraDevices } from './CameraDevices'
import type { EmitterSubscription } from 'react-native' import type { EmitterSubscription } from 'react-native'
import { Code, CodeScanner } from './CodeScanner'
//#region Types //#region Types
export type CameraPermissionStatus = 'granted' | 'not-determined' | 'denied' | 'restricted' export type CameraPermissionStatus = 'granted' | 'not-determined' | 'denied' | 'restricted'
export type CameraPermissionRequestResult = 'granted' | 'denied' export type CameraPermissionRequestResult = 'granted' | 'denied'
interface OnCodeScannedEvent {
codes: Code[]
}
interface OnErrorEvent { interface OnErrorEvent {
code: string code: string
message: string message: string
cause?: ErrorWithCause cause?: ErrorWithCause
} }
type NativeCameraViewProps = Omit<CameraProps, 'device' | 'onInitialized' | 'onError' | 'frameProcessor'> & { type NativeCameraViewProps = Omit<CameraProps, 'device' | 'onInitialized' | 'onError' | 'frameProcessor' | 'codeScanner'> & {
cameraId: string cameraId: string
enableFrameProcessor: boolean enableFrameProcessor: boolean
codeScannerOptions?: Omit<CodeScanner, 'onCodeScanned'>
onInitialized?: (event: NativeSyntheticEvent<void>) => void onInitialized?: (event: NativeSyntheticEvent<void>) => void
onError?: (event: NativeSyntheticEvent<OnErrorEvent>) => void onError?: (event: NativeSyntheticEvent<OnErrorEvent>) => void
onCodeScanned?: (event: NativeSyntheticEvent<OnCodeScannedEvent>) => void
onViewReady: () => void onViewReady: () => void
} }
type RefType = React.Component<NativeCameraViewProps> & Readonly<NativeMethods> type RefType = React.Component<NativeCameraViewProps> & Readonly<NativeMethods>
@ -76,6 +82,7 @@ export class Camera extends React.PureComponent<CameraProps> {
this.onViewReady = this.onViewReady.bind(this) this.onViewReady = this.onViewReady.bind(this)
this.onInitialized = this.onInitialized.bind(this) this.onInitialized = this.onInitialized.bind(this)
this.onError = this.onError.bind(this) this.onError = this.onError.bind(this)
this.onCodeScanned = this.onCodeScanned.bind(this)
this.ref = React.createRef<RefType>() this.ref = React.createRef<RefType>()
this.lastFrameProcessor = undefined this.lastFrameProcessor = undefined
} }
@ -387,6 +394,13 @@ export class Camera extends React.PureComponent<CameraProps> {
} }
//#endregion //#endregion
private onCodeScanned(event: NativeSyntheticEvent<OnCodeScannedEvent>): void {
const codeScanner = this.props.codeScanner
if (codeScanner == null) return
codeScanner.onCodeScanned(event.nativeEvent.codes)
}
//#region Lifecycle //#region Lifecycle
private setFrameProcessor(frameProcessor: FrameProcessor): void { private setFrameProcessor(frameProcessor: FrameProcessor): void {
VisionCameraProxy.setFrameProcessor(this.handle, frameProcessor) VisionCameraProxy.setFrameProcessor(this.handle, frameProcessor)
@ -422,7 +436,7 @@ export class Camera extends React.PureComponent<CameraProps> {
/** @internal */ /** @internal */
public render(): React.ReactNode { public render(): React.ReactNode {
// We remove the big `device` object from the props because we only need to pass `cameraId` to native. // We remove the big `device` object from the props because we only need to pass `cameraId` to native.
const { device, frameProcessor, ...props } = this.props const { device, frameProcessor, codeScanner, ...props } = this.props
// eslint-disable-next-line @typescript-eslint/no-unnecessary-condition // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition
if (device == null) { if (device == null) {
@ -440,7 +454,9 @@ export class Camera extends React.PureComponent<CameraProps> {
ref={this.ref} ref={this.ref}
onViewReady={this.onViewReady} onViewReady={this.onViewReady}
onInitialized={this.onInitialized} onInitialized={this.onInitialized}
onCodeScanned={this.onCodeScanned}
onError={this.onError} onError={this.onError}
codeScannerOptions={codeScanner}
enableFrameProcessor={frameProcessor != null} enableFrameProcessor={frameProcessor != null}
enableBufferCompression={props.enableBufferCompression ?? shouldEnableBufferCompression} enableBufferCompression={props.enableBufferCompression ?? shouldEnableBufferCompression}
/> />

View File

@ -24,6 +24,10 @@ export type SessionError =
| 'session/camera-has-been-disconnected' | 'session/camera-has-been-disconnected'
| 'session/audio-in-use-by-other-app' | 'session/audio-in-use-by-other-app'
| 'session/audio-session-failed-to-activate' | 'session/audio-session-failed-to-activate'
export type CodeScannerError =
| 'code-scanner/not-compatible-with-outputs'
| 'code-scanner/code-type-not-supported'
| 'code-scanner/cannot-load-model'
export type CaptureError = export type CaptureError =
| 'capture/recording-in-progress' | 'capture/recording-in-progress'
| 'capture/no-recording-in-progress' | 'capture/no-recording-in-progress'

View File

@ -1,6 +1,7 @@
import type { ViewProps } from 'react-native' import type { ViewProps } from 'react-native'
import type { CameraDevice, CameraDeviceFormat, VideoStabilizationMode } from './CameraDevice' import type { CameraDevice, CameraDeviceFormat, VideoStabilizationMode } from './CameraDevice'
import type { CameraRuntimeError } from './CameraError' import type { CameraRuntimeError } from './CameraError'
import { CodeScanner } from './CodeScanner'
import type { Frame } from './Frame' import type { Frame } from './Frame'
import type { Orientation } from './Orientation' import type { Orientation } from './Orientation'
@ -223,13 +224,17 @@ export interface CameraProps extends ViewProps {
* ```tsx * ```tsx
* const frameProcessor = useFrameProcessor((frame) => { * const frameProcessor = useFrameProcessor((frame) => {
* 'worklet' * 'worklet'
* const qrCodes = scanQRCodes(frame) * const faces = scanFaces(frame)
* console.log(`Detected QR Codes: ${qrCodes}`) * console.log(`Faces: ${faces}`)
* }, []) * }, [])
* *
* return <Camera {...cameraProps} frameProcessor={frameProcessor} /> * return <Camera {...cameraProps} frameProcessor={frameProcessor} />
* ``` * ```
*/ */
frameProcessor?: FrameProcessor frameProcessor?: FrameProcessor
/**
* A CodeScanner that can detect QR codes, barcodes and other machine-readable code types in the Camera's video stream (see the Code Scanning guide).
*/
codeScanner?: CodeScanner
//#endregion //#endregion
} }

View File

@ -0,0 +1,62 @@
/**
* The type of the code to scan.
*/
export type CodeType =
| 'code-128'
| 'code-39'
| 'code-93'
| 'codabar'
| 'ean-13'
| 'ean-8'
| 'itf'
| 'upc-e'
| 'qr'
| 'pdf-417'
| 'aztec'
| 'data-matrix'
/**
* A scanned code.
*/
export interface Code {
/**
* The type of the code that was scanned.
*/
type: CodeType | 'unknown'
/**
* The string value, or null if it cannot be decoded.
*/
value?: string
/**
* The location of the code relative to the Camera Preview (in dp).
*/
frame?: {
x: number
y: number
width: number
height: number
}
}
/**
* A scanner for detecting codes in a Camera Stream.
*/
export interface CodeScanner {
/**
* The types of codes to configure the code scanner for.
*/
codeTypes: CodeType[]
/**
* A callback to call whenever the scanned codes change.
*/
onCodeScanned: (codes: Code[]) => void
/**
* Crops the scanner's view area to the specific region of interest.
*/
regionOfInterest?: {
x: number
y: number
width: number
height: number
}
}

View File

@ -0,0 +1,23 @@
import { useCallback, useMemo, useRef } from 'react'
import { Code, CodeScanner } from '../CodeScanner'
export function useCodeScanner(codeScanner: CodeScanner): CodeScanner {
const { onCodeScanned, ...codeScannerOptions } = codeScanner
// Memoize the function once and use a ref on any identity changes
const ref = useRef(onCodeScanned)
ref.current = onCodeScanned
const callback = useCallback((codes: Code[]) => {
ref.current(codes)
}, [])
// CodeScanner needs to be memoized so it doesn't trigger a Camera Session re-build
return useMemo(
() => ({
...codeScannerOptions,
onCodeScanned: callback,
}),
// eslint-disable-next-line react-hooks/exhaustive-deps
[JSON.stringify(codeScannerOptions), callback],
)
}

View File

@ -42,8 +42,8 @@ export function createFrameProcessor(frameProcessor: FrameProcessor['frameProces
* ```ts * ```ts
* const frameProcessor = useFrameProcessor((frame) => { * const frameProcessor = useFrameProcessor((frame) => {
* 'worklet' * 'worklet'
* const qrCodes = scanQRCodes(frame) * const faces = scanFaces(frame)
* console.log(`QR Codes: ${qrCodes}`) * console.log(`Faces: ${faces}`)
* }, []) * }, [])
* ``` * ```
*/ */

View File

@ -9,6 +9,7 @@ export * from './PhotoFile'
export * from './PixelFormat' export * from './PixelFormat'
export * from './Point' export * from './Point'
export * from './VideoFile' export * from './VideoFile'
export * from './CodeScanner'
export * from './devices/getCameraFormat' export * from './devices/getCameraFormat'
export * from './devices/getCameraDevice' export * from './devices/getCameraDevice'
@ -19,3 +20,4 @@ export * from './hooks/useCameraDevices'
export * from './hooks/useCameraFormat' export * from './hooks/useCameraFormat'
export * from './hooks/useCameraPermission' export * from './hooks/useCameraPermission'
export * from './hooks/useFrameProcessor' export * from './hooks/useFrameProcessor'
export * from './hooks/useCodeScanner'