restructure

This commit is contained in:
Marc Rousavy
2021-02-19 21:28:41 +01:00
parent 3927ca1b1e
commit 7bd14ff308
4 changed files with 0 additions and 25253 deletions

docs/TODO.md Normal file

@@ -0,0 +1,11 @@
# TODO
This is an internal TODO list I use to keep track of some of the features that are still missing.
* [ ] focus(x, y)
* [ ] Mirror images from selfie cameras (iOS Done, Android WIP)
* [ ] Allow camera switching (front <-> back) while recording and stitch the videos together
* [ ] Make `startRecording()` async. Due to NativeModules limitations, a native function can only have either one callback or one promise, but for `startRecording()` we need both: you probably also want to catch any errors that occurred during the `startRecording()` call (or wait until the recording has actually started, since that can also take some time)
* [ ] Return a `jsi::Value` reference for images (`UIImage`/`Bitmap`) on `takePhoto()` and `takeSnapshot()`. This way, we skip the entire file writing and reading, making image capture _a lot_ faster.
* [ ] Implement frame processors. The idea here is that the user passes a small JS function (a Reanimated worklet) to the `Camera::frameProcessor` prop, which then gets called on every frame the camera previews. (I'd say we cap it to 30 calls per second, even if the camera fps is higher.) This can then be used to scan QR codes, detect faces, detect depth, or render something on top of the camera such as color filters, QR code boundaries or even dog filters, possibly even AR - all from a single, small, and highly flexible JS function!
* [ ] Create a custom MPEG4 encoder to allow for more customizability in `recordVideo()` (`bitRate`, `priority`, `minQuantizationParameter`, `allowFrameReordering`, `expectedFrameRate`, `realTime`, `minimizeMemoryUsage`)
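The frame-processor item above mentions capping processor calls to 30 per second even when the camera runs at a higher fps. A minimal sketch of that throttling idea, in plain TypeScript with a hypothetical `Frame` shape (the real native frame type is not final yet):

```typescript
// Hypothetical frame shape for illustration; the real native type is TBD.
interface Frame {
  timestampMs: number;
}

type FrameProcessor = (frame: Frame) => void;

// Wrap a frame processor so it runs at most `maxFps` times per second,
// dropping frames that arrive before the minimum interval has elapsed.
function throttleFrameProcessor(
  processor: FrameProcessor,
  maxFps: number
): FrameProcessor {
  const minIntervalMs = 1000 / maxFps;
  let lastRunMs = -Infinity;
  return (frame) => {
    if (frame.timestampMs - lastRunMs >= minIntervalMs) {
      lastRunMs = frame.timestampMs;
      processor(frame);
    }
  };
}
```

With a 60 fps camera, a wrapper like this would invoke the user's worklet on roughly every other frame; the actual implementation would live on the native side, this only illustrates the cap.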

docs/TROUBLESHOOTING.md Normal file

@@ -0,0 +1,23 @@
# Troubleshooting
Before opening an issue, make sure you try the following:
## iOS
1. Try cleaning and rebuilding **everything**:
```sh
rm -rf package-lock.json && rm -rf yarn.lock && rm -rf node_modules && rm -rf ios/Podfile.lock && rm -rf ios/Pods
npm i # or "yarn"
cd ios && pod repo update && pod update && pod install
```
2. Check your minimum iOS version. react-native-vision-camera requires a minimum iOS version of **11.0**.
3. Check your Swift version. react-native-vision-camera requires a minimum Swift version of **5.2**.
4. Make sure you have created a Swift bridging header in your project.
1. Open your project with Xcode (`Example.xcworkspace`)
2. In the menu-bar, press **File** > **New** > **File** (⌘N)
3. Use whatever name you prefer, e.g. `File.swift`, and press **Create**
   4. Press **Create Bridging Header** when prompted.
## Android
1. Since the Android implementation uses the not-yet fully stable **CameraX** API, make sure you've browsed the [CameraX issue tracker](https://issuetracker.google.com/issues?q=componentid:618491%20status:open) to find out if your issue is a limitation of the **CameraX** library itself that even I cannot work around.