-
Hello. Could you please explain how to draw an image on a Texture using a YUV buffer? I'm receiving camera images as YUV buffers from an external module. I have implemented it as below, but only a black screen is displayed.

```kotlin
filamentTexture = Texture.Builder()
    ...
val sampler = TextureSampler(
    ...
val bitmap = getRGBBitmap(yuvBuffer, context)
val byteBuffer: ByteBuffer = ByteBuffer.allocate(bitmap.byteCount)
filamentTexture!!.setImage(
    ...
```
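For reference, a complete version of this upload path might look like the sketch below; it assumes an RGBA8 texture and an `ARGB_8888` bitmap from `getRGBBitmap()`, and `uploadBitmap` is an illustrative name rather than code from this thread. One common cause of a black screen here is allocating the `ByteBuffer` but never copying the bitmap's pixels into it (or not rewinding it) before calling `setImage()`.

```kotlin
import android.graphics.Bitmap
import com.google.android.filament.Engine
import com.google.android.filament.Texture
import java.nio.ByteBuffer

// Hypothetical helper: uploads an ARGB_8888 bitmap (e.g. the result of
// getRGBBitmap(yuvBuffer, context)) into a new RGBA8 Filament texture.
fun uploadBitmap(engine: Engine, bitmap: Bitmap): Texture {
    val texture = Texture.Builder()
        .width(bitmap.width)
        .height(bitmap.height)
        .levels(1)
        .sampler(Texture.Sampler.SAMPLER_2D)
        .format(Texture.InternalFormat.RGBA8)
        .build(engine)

    val byteBuffer = ByteBuffer.allocate(bitmap.byteCount)
    bitmap.copyPixelsToBuffer(byteBuffer)  // without this the buffer stays zeroed (black)
    byteBuffer.rewind()                    // setImage reads from the buffer's current position

    texture.setImage(engine, 0,
        Texture.PixelBufferDescriptor(byteBuffer,
            Texture.Format.RGBA, Texture.Type.UBYTE))
    return texture
}
```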
Replies: 2 comments
-
First of all, Android's way of handling camera images is through a `Surface` or `SurfaceTexture`, so I think you need to refactor the external module to provide the camera's output as a `Surface` or `SurfaceTexture` instead of a raw image buffer; otherwise there is no high-speed way to process these images. You already know the `Stream` API in Filament, so I guess you already know the sample-hello-camera project. I will refer to that project, and to the Stream.java and Texture.java APIs of Filament:

filament/android/samples/sample-hello-camera/src/main/java/com/google/android/filament/hellocam/CameraHelper.kt
Lines 45 to 49 in 2763931

Or at least use the ...
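A minimal sketch of that route, loosely following sample-hello-camera and assuming a Filament version where `Stream.Builder.stream(SurfaceTexture)` and `Texture.setExternalStream()` are still available (newer releases deprecate them in favor of `Stream.setAcquiredImage()`); `createCameraTexture` and its parameters are illustrative names:

```kotlin
import android.graphics.SurfaceTexture
import android.view.Surface
import com.google.android.filament.Engine
import com.google.android.filament.Stream
import com.google.android.filament.Texture

// Hypothetical helper: creates an external texture fed by a SurfaceTexture,
// and returns the Surface the camera / external module should render into.
fun createCameraTexture(engine: Engine, width: Int, height: Int): Pair<Texture, Surface> {
    // The SurfaceTexture must be detached from any GL context so Filament can use it.
    val surfaceTexture = SurfaceTexture(0).apply {
        setDefaultBufferSize(width, height)
        detachFromGLContext()
    }

    // External textures are fed by a Stream rather than by setImage().
    val texture = Texture.Builder()
        .sampler(Texture.Sampler.SAMPLER_EXTERNAL)
        .format(Texture.InternalFormat.RGB8)
        .build(engine)

    val stream = Stream.Builder()
        .stream(surfaceTexture)
        .build(engine)
    texture.setExternalStream(engine, stream)

    return texture to Surface(surfaceTexture)
}
```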
-
While searching for a solution, I found that I could solve it by uploading the Y, U, and V buffers to three separate textures, without using SurfaceTexture, and then converting them to RGB in the fragment shader.

```kotlin
private fun setTexture() {
    ...
```

```kotlin
private fun setBufferOnTexture(buffers: List<ByteBuffer>, width: Int, height: Int) {
    ...
```

```
fragment {
    ...
}
```
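A minimal sketch of the plane-upload side, assuming I420 data (a full-resolution Y plane plus half-resolution U and V planes); the function names are illustrative, not the author's actual `setTexture`/`setBufferOnTexture` code. In the material's `fragment` block the three samplers are then combined with a BT.601-style conversion, e.g. r = y + 1.402*(v-0.5), g = y - 0.344*(u-0.5) - 0.714*(v-0.5), b = y + 1.772*(u-0.5).

```kotlin
import com.google.android.filament.Engine
import com.google.android.filament.Texture
import java.nio.ByteBuffer

// Hypothetical helper: one single-channel (R8) texture per YUV plane.
fun makePlaneTexture(engine: Engine, width: Int, height: Int): Texture =
    Texture.Builder()
        .width(width)
        .height(height)
        .levels(1)
        .sampler(Texture.Sampler.SAMPLER_2D)
        .format(Texture.InternalFormat.R8)  // one 8-bit channel per plane
        .build(engine)

// Hypothetical helper: uploads one plane into its texture.
fun uploadPlane(engine: Engine, texture: Texture, plane: ByteBuffer) {
    plane.rewind()  // setImage reads from the buffer's current position
    texture.setImage(engine, 0,
        Texture.PixelBufferDescriptor(plane,
            Texture.Format.R, Texture.Type.UBYTE))
}
```

For I420 the U and V textures would be created at width/2 x height/2, and all three textures bound to the material as separate sampler parameters.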