* feat: Route images through `ImageWriter` into OpenGL pipeline
* fix: Use RGB format
* fix: Every device supports YUV, RGB and NATIVE
* Update VideoPipeline.kt
* log format
* Plug ImageReader between OpenGL pipeline
* Call Frame Processor
* Format
* Remove logs
* feat: Use `HardwareBuffer` for `toArrayBuffer()`
* Format
1. Reverts 4e96eb77e0 (PR #1789) to bring the C++ OpenGL GPU Pipeline back.
2. Fixes the "initHybrid JNI not found" error by loading the native JNI/C++ library in `VideoPipeline.kt`.
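For reference, a minimal sketch of that fix, assuming the usual JNI pattern of loading the library in a `companion object` before any `external` binding is touched; the library name and method signature here are assumptions, not the exact code in this PR:

```kotlin
// Hypothetical sketch: load the native JNI/C++ library before
// the external initHybrid() binding is first called.
class VideoPipeline {
  companion object {
    init {
      // Assumed library name; it must match the native build target.
      System.loadLibrary("VisionCamera")
    }
  }

  // Bound on the C++ side. If the library above is never loaded,
  // calling this throws the "initHybrid JNI not found" error.
  private external fun initHybrid(width: Int, height: Int): Any
}
```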
This PR has two downsides:
1. `pixelFormat="yuv"` does not work on Android, since the OpenGL pipeline only renders in RGB.
2. OpenGL rendering is fast, but it still adds overhead. For a plain Camera -> Video Recording path, we probably shouldn't be running an entire OpenGL rendering pipeline.
The original plan was to mirror the iOS approach and simply pass GPU buffers around, but the android.media APIs just aren't advanced enough yet: `ImageReader`/`ImageWriter` is far too buggy and doesn't reliably work with `MediaRecorder`/`MediaCodec`.
This sucks; I hope that in the future we can use something like `AHardwareBuffer`s instead.