Docker for building project #34

Closed
Voveka98 opened this issue Jun 19, 2024 · 6 comments
Comments

@Voveka98

Hi! Thanks for your work.
I'm having some trouble with the installation inside the nvidia/cuda:12.1.0-devel-ubuntu22.04 Docker image.
I run Docker with the following command:

docker run --gpus all -it  -v /Unique3D/:/workspace/ --net=host --shm-size 5g  --name unique3d  nvidia/cuda:12.1.0-devel-ubuntu22.04

Then I install all the requirements as in README.md, and when I try to launch the Gradio demo I get the following error:

[F glutil.cpp:338] eglInitialize() failed
Aborted (core dumped)

So I'd like to know whether you have a solution for this problem or are familiar with it?
Thanks in advance!

@Lektro9

Lektro9 commented Jun 19, 2024

I had the same issue yesterday. From some googling, it seems OpenGL is not supported by the NVIDIA Docker images (NVIDIA/nvidia-docker#328).

What worked for me was replacing dr.RasterizeGLContext with dr.RasterizeCudaContext in the codebase. Just make sure to also remove output_db=False, which was being passed as the first argument.

I'm not smart enough to know whether the output quality suffers from this change, but it works for me and I get excellent models (compared to other 3D mesh generators).
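
For reference, the swap looks roughly like this (a sketch only; the variable name glctx and the exact call sites in the Unique3D code may differ):

import nvdiffrast.torch as dr

# Before: the OpenGL/EGL-backed rasterizer, which is what fails with
# "eglInitialize() failed" inside the plain nvidia/cuda image:
# glctx = dr.RasterizeGLContext(output_db=False)

# After: the pure CUDA rasterizer; note it does not accept output_db:
glctx = dr.RasterizeCudaContext()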

@Voveka98
Author

@Lektro9 Thanks for the advice. I had seen this solution but couldn't get it working because I hadn't removed output_db=False.
I'll go try it now!

@jtydhr88
Contributor

I committed a Dockerfile: https://github.com/AiuniAI/Unique3D/tree/main/docker

@Voveka98
Author

@jtydhr88 Thanks a lot! I will try this.
FYI: I also ran into a problem with GPU inference in Docker, using the setup from the opening message of this issue:

2024-06-24 10:58:35.423958997 [E:onnxruntime:Default, provider_bridge_ort.cc:1730 TryGetProviderInfo_TensorRT] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1426 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_tensorrt.so with error: libnvinfer.so.10: cannot open shared object file: No such file or directory

but it was due to an incorrect TensorRT installation. I fixed it with apt-get install tensorrt and it works nicely now!
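
In case it helps others, a quick way to check whether TensorRT is picked up after the fix is to request the provider explicitly when creating an ONNX Runtime session (a sketch only; "model.onnx" is just a placeholder path, not a file from this project):

import onnxruntime as ort

# If TensorRT is still missing, requesting its provider here reproduces the
# libnvinfer loading error above and onnxruntime falls back to CUDA/CPU.
sess = ort.InferenceSession(
    "model.onnx",
    providers=["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"],
)
print(sess.get_providers())  # providers that were actually enabled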

@jtydhr88
Contributor

Yeah, I do the same thing in my Dockerfile.

@jtydhr88
Contributor

I think this issue can be closed.
