
Directly using the pointer of source texture #435

Closed
grayhong opened this issue Jan 26, 2022 · 4 comments
Labels
platform:android · sect:plugin · type:question

Comments

@grayhong

Description

Hi Homuler! Thank you so much for the amazing work!

I'm using your UnityPlugin in my Android project and want to improve the performance to support low-end devices.

While I was profiling my Unity project with the Android target, I found that the function ReadFromImageSource takes quite a long time (~7 ms). (FYI, I am using the HolisticTracking model.)

Since I am using an AR Camera as well, I had to use a Texture as the source of the TextureFrame.

I found that a TextureFrame copies the source texture into its own texture and then hands the pointer over to the GPU through GlTextureBuffer:

public void ReadTextureFromOnCPU(Texture src)
{
  var textureBuffer = LoadToTextureBuffer(src);
  SetPixels32(textureBuffer.GetPixels32());
}

private Texture2D LoadToTextureBuffer(Texture texture)
{
  var textureFormat = GetTextureFormat(texture);
  if (_textureBuffer == null || _textureBuffer.format != textureFormat)
  {
    _textureBuffer = new Texture2D(width, height, textureFormat, false);
  }

  // Blit the source into a temporary RenderTexture, then read the pixels
  // back into the CPU-side buffer.
  var tmpRenderTexture = new RenderTexture(texture.width, texture.height, 32);
  var currentRenderTexture = RenderTexture.active;
  RenderTexture.active = tmpRenderTexture;
  Graphics.Blit(texture, tmpRenderTexture);

  var rect = new UnityEngine.Rect(0, 0, Mathf.Min(tmpRenderTexture.width, _textureBuffer.width), Mathf.Min(tmpRenderTexture.height, _textureBuffer.height));
  _textureBuffer.ReadPixels(rect, 0, 0);
  _textureBuffer.Apply();

  RenderTexture.active = currentRenderTexture;
  tmpRenderTexture.Release();
  return _textureBuffer;
}

However, since I already had the Texture, copying it again seemed unnecessary.
So I tried to use texture.GetNativeTexturePtr() directly as the target argument of GlTextureBuffer (roughly like the sketch below), but it didn't work.
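What I tried looks roughly like this (a minimal sketch: the constructor arguments mirror the GlTextureBuffer call quoted later in this thread, and the (uint) cast and the variable names are just for illustration):

// Sketch of the direct-pointer attempt: hand the source texture's native
// OpenGL name straight to GlTextureBuffer instead of the TextureFrame copy.
var textureName = (uint)sourceTexture.GetNativeTexturePtr();
var glTextureBuffer = new GlTextureBuffer(
    textureName, sourceTexture.width, sourceTexture.height,
    gpuBufferFormat, OnReleaseTextureFrame, glContext);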

So my question is: is it possible to directly use the Texture's pointer as an input to GlTextureBuffer, without copying it to the TextureFrame's texture?

I suspect there must be a reason you call GetPixels32 and SetPixels32; could you explain it?

Thank you so much for your time!

@homuler (Owner) commented Jan 27, 2022

is it possible to directly use the Texture's pointer as an input to GlTextureBuffer, without copying it to the TextureFrame's texture?

It's a bit difficult to answer without knowing exactly what you tried and how it failed.
As you've probably noticed, when the graphics API is OpenGL ES, the Texture's pointer is already passed to GlTextureBuffer directly (TextureFrame#GetTextureName returns it).

if (configType == ConfigType.OpenGLES)
{
  var gpuBuffer = textureFrame.BuildGpuBuffer(GpuManager.GlCalculatorHelper.GetGlContext());
  return calculatorGraph.AddPacketToInputStream(streamName, new GpuBufferPacket(gpuBuffer, currentTimestamp));
}

var glTextureBuffer = new GlTextureBuffer(GetTextureName(), width, height, gpuBufferformat, OnReleaseTextureFrame, glContext);

I think what you really want to know is why the Texture is copied on the CPU (in the sample app), and whether that can be avoided in your case. And it's probably related to the question below.

I suspect there must be a reason you call GetPixels32 and SetPixels32; could you explain it?

Why is the Texture copied?

It's because the sample app does not manage the source texture (e.g. the OpenGL texture that WebCamTexture uses internally).
As a result, when MediaPipe tries to read the (OpenGL) texture, which we pass by its name, the corresponding (GPU) memory may already have been freed.
So if you can control when the (OpenGL) texture is freed, you don't need to copy the source Texture.
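In other words (a minimal sketch, not the plugin's API; the names and the callback shape are assumptions), you would own the texture yourself and only release it after MediaPipe signals that it is done reading:

// Sketch: control the GL texture's lifetime instead of copying it.
// MediaPipe reads the texture asynchronously, so it must stay alive until
// the release callback fires. All names here are illustrative.
private RenderTexture _ownedTexture;  // kept alive while MediaPipe may read it

private void OnTextureReleased()
{
  // MediaPipe has finished reading; only now is it safe to release or
  // reuse the underlying GL texture.
  _ownedTexture.Release();
}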

Why is the Texture copied on the CPU?

Actually, it's not strictly necessary: you can copy it on the GPU as long as the graphics API is OpenGL ES, but in that case the input (image) and the output can get out of sync.
See #302 (comment) for more details (the 1st comment is correct).

You can test this by calling ReadTextureFromOnGPU in place of ReadTextureFromOnCPU (though it's not fully implemented or well tested). Please note that it only works on Android.

/// <summary>
/// Copy texture data from <paramref name="src" />.
/// If <paramref name="src" /> format is different from <see cref="format" />, it converts the format.
/// </summary>
/// <remarks>
/// After calling it, pixel data cannot be read from the CPU safely.
/// </remarks>
public bool ReadTextureFromOnGPU(Texture src)

protected static void ReadFromImageSource(ImageSource imageSource, TextureFrame textureFrame)
{
  var sourceTexture = imageSource.GetCurrentTexture();
  // For some reason, when the image is copied on GPU, latency tends to be high.
  // So even when OpenGL ES is available, use CPU to copy images.
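To try the GPU path instead, the swap would look roughly like this (a sketch; the method names come from the snippets above, and the body is simplified):

protected static void ReadFromImageSource(ImageSource imageSource, TextureFrame textureFrame)
{
  var sourceTexture = imageSource.GetCurrentTexture();
  // GPU copy instead of the CPU path; OpenGL ES (Android) only, and the
  // output may drift out of sync with the input, as noted above.
  textureFrame.ReadTextureFromOnGPU(sourceTexture);
}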

@homuler added the platform:android, sect:plugin, type:support, and stat:awaiting response labels on Jan 27, 2022
@grayhong (Author) commented

Thank you so much for the quick reply! I really appreciate it.

So if you can control when the (OpenGL) texture is freed, you don't need to copy the source Texture.

I guess this would be very hard. Now I understand why you copy the texture.

I tried to use CopyTexture, but I got an error saying that the input source's (AR Camera's) texture and the TextureFrame's texture have different dimensions:

Graphics.CopyTexture called with mismatching texture sizes (src 1440x1080x1 dst 1080x1440x1)

To work around this, I can call Graphics.CopyTexture after a Graphics.Blit(), roughly like the sketch below. I have no idea why Graphics.Blit() solves the dimension mismatch, but it worked.
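In code, the workaround looks roughly like this (a sketch; src is the AR camera texture, dst is the TextureFrame's texture, and the temporary's parameters are placeholders):

// Sketch of the Blit-then-CopyTexture workaround. Blit presumably rescales
// the source into a destination-sized RenderTexture on the GPU, so the
// subsequent CopyTexture sees matching dimensions.
var tmp = RenderTexture.GetTemporary(dst.width, dst.height, 0);
Graphics.Blit(src, tmp);
Graphics.CopyTexture(tmp, dst);
RenderTexture.ReleaseTemporary(tmp);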

However, the Blit() seems unnecessary to me, and as you expected, the MediaPipe output was not in sync with the camera input.

Anyway, thank you so much for the advice, and I will let you know if I find an efficient way to bypass the SetPixels32() call!

@homuler (Owner) commented Jan 31, 2022

Graphics.CopyTexture called with mismatching texture sizes (src 1440x1080x1 dst 1080x1440x1)

Aren't textureWidth and textureHeight reversed?

_textureFramePool.ResizeTexture(imageSource.textureWidth, imageSource.textureHeight, TextureFormat.RGBA32);
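If they are, swapping the two arguments should make the pool's textures match the source (a one-line sketch of the call quoted above):

// Sketch: size the pool's textures to the source's actual orientation.
_textureFramePool.ResizeTexture(imageSource.textureHeight, imageSource.textureWidth, TextureFormat.RGBA32);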

@grayhong (Author) commented Feb 7, 2022

Yes! But I think it is an issue with the AR camera (the camera source I'm currently using); its width and height are flipped for some reason. Thank you for checking!

@homuler added the type:question label and removed the type:support label on Feb 21, 2022
@homuler closed this as completed on Feb 21, 2022
@homuler removed the stat:awaiting response label on Oct 16, 2022