Problems Running WGPU with Vulkan on Discrete GPU #2907
Replies: 1 comment
-
OK, I have found a temporary workaround for now. I was unable to get WGPU working properly with PRIME, so I have opted to disable it on my system by uninstalling the corresponding package. I found the workaround by attempting to follow the Arch wiki's guide on how to Use Nvidia graphics only. I currently have no PRIME offloading set up.
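For anyone following along, the wiki's approach amounts to telling Xorg to use the Nvidia card directly, with a snippet along these lines (a sketch rather than my exact file; the BusID must match what lspci reports on your machine):

```
Section "Device"
    Identifier "NVIDIA Card"
    Driver "nvidia"
    # Placeholder bus ID; substitute the one lspci reports for your card.
    BusID "PCI:1:0:0"
EndSection
```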
Paradoxically, this does not seem to have disabled my Intel GPU, since the same query as before still returns the same output.
Of course, I can now no longer choose the Nvidia GPU with prime-run, since that package is gone. Be that as it may, the Rust adapter-enumeration snippet from my original post now returns different output: the Nvidia card shows up, including under the Vulkan backend.
Perhaps as expected, the call that previously selected the Intel card now crashes with an error.
However, explicitly choosing the Nvidia card (with the wgpu examples this can be done through an environment variable, roughly as below):
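```sh
# Assuming the examples' WGPU_ADAPTER_NAME mechanism; the value is
# matched against the adapter name, so any unique substring works.
WGPU_ADAPTER_NAME=GeForce cargo run --example cube
```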
works as expected and produces a cube with the Mandelbrot set. I find the failure to select the Intel card a bit perplexing, as I can still run the two diagnostic commands from before, and both still produce the same output as before. I'm not sure whether this is an issue with my configuration or a bug in WGPU. I have verified that, with the configuration described here, the Bevy examples are also working. I am therefore opting to stick with my current configuration for now and forgo my use of PRIME. That said, I would be very happy if anyone with experience in this area could chime in. Thanks!
-
Hi,
I'm having trouble running WGPU with Vulkan on my discrete Nvidia graphics card. As best I can tell, it doesn't seem to be a driver issue.
I am running an Arch Linux installation and have both an integrated Intel graphics card and a discrete Nvidia graphics card, as evidenced by listing the PCI display devices on my system, which shows both cards.
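A typical way to check this (a sketch; the exact flags may differ from what I ran):

```sh
# List VGA/3D PCI devices along with the kernel drivers in use.
lspci -k | grep -EA3 'VGA|3D'
```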
I use PRIME GPU offloading on my system. It is verified to work: querying the OpenGL renderer string directly reports the integrated Intel card, while the same query under prime-run reports the discrete Nvidia card.
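The check looks roughly like this, with the renderer strings paraphrased in the comments:

```sh
# Without offloading: reports the Mesa/Intel renderer.
glxinfo | grep "OpenGL renderer"
# With PRIME offloading: reports the Nvidia renderer.
prime-run glxinfo | grep "OpenGL renderer"
```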
Vulkan is installed and seems to be working. Running the standard Vulkan cube demo produces a spinning cube and prints the GPU it selected, while running it under prime-run produces a much faster spinning cube and reports the Nvidia GPU instead.
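Presumably this was via the stock demo from vulkan-tools, along these lines:

```sh
# Spins a cube on the default GPU (the integrated Intel card here).
vkcube
# Same demo offloaded to the discrete Nvidia card via PRIME.
prime-run vkcube
```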
The above would lead me to believe that all necessary dependencies are in order. I am able to run the WGPU examples on the integrated GPU. For instance, running the cube example produces the expected cube with a fractal pattern and prints adapter information for the Intel card.
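From a checkout of the wgpu repository, that invocation would be something like:

```sh
# Build and run wgpu's textured-cube example (it renders a Mandelbrot texture).
cargo run --example cube
```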
I am able to explicitly choose this GPU as expected. However, if I attempt to run the example on the discrete GPU, I obtain an error message instead.
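My best reconstruction of the two invocations, assuming the examples' WGPU_ADAPTER_NAME selection mechanism and the prime-run wrapper from above:

```sh
# Explicitly selecting the integrated card: works.
WGPU_ADAPTER_NAME=Intel cargo run --example cube
# Attempting to offload to the discrete card: fails with the error.
prime-run cargo run --example cube
```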
I've attempted to enumerate the graphics cards that WGPU finds (though I'm not very proficient with this library, so I'm not fully certain the snippet is correct).
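It amounts to something like the following, reconstructed against the wgpu 0.13-era API (the exact version is an assumption):

```rust
// Cargo.toml: wgpu = "0.13" (version assumed).
// Enumerates every adapter wgpu can find across all backends and
// prints its AdapterInfo (name, backend, device type, ...).
fn main() {
    let instance = wgpu::Instance::new(wgpu::Backends::all());
    for adapter in instance.enumerate_adapters(wgpu::Backends::all()) {
        println!("{:?}", adapter.get_info());
    }
}
```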
This prints out a list of adapters, but I would expect it to also include my discrete GPU, as well as Vulkan-backend entries; both are absent here.
I found this issue by trying to run the examples from the Bevy game engine, which uses WGPU for its rendering. These failed due to Bevy's inability to find an appropriate GPU, and debugging that led to tracing the issue upstream to WGPU. I am, however, certain that this is not an issue with my GPUs being unsupported, as I have previously run the same Bevy examples successfully on the same machine from an Ubuntu partition.
At this point I don't know how to further debug this issue, since prior issues on WGPU or Bevy don't seem to solve my problem, and I haven't spotted anything of use in the Arch wiki. Any help would be greatly appreciated.