LiF MMO: rendering issues #591
Greetings good sir. I recommend updating your Nvidia drivers to the latest version. The recommended version is currently 396.54. Refer to https://github.com/doitsujin/dxvk/wiki/Driver-support |
Thanks. I'll try that driver version when it is available in my distro; it isn't yet. When running the game on Windows, I noticed it looked like the DXVK screenshot initially, until the game fully loaded. For some reason, in Wine it never appears to load the high quality resources. I'm not sure if it is failing to load textures or shaders or what. Is there a way to determine this? Here is a pic from Windows of how it should look. |
It is very unlikely that a shader would cause issues like that. For some reason the game is not streaming in higher resolution textures. |
Maybe the game's log contains something useful; would you mind uploading that? The issue is baked into the apitrace you provided. Can you make another one, but this time on Windows with normal Direct3D? |
@K0bin Nice thinking. The game log under Wine has out-of-texture-memory errors that are not present in the Windows log.
And the texture memory budget is 0 under Wine, while it has reasonable numbers under Windows, e.g.
Any thoughts why the budget might be 0 under wine? Game logs
New API traces: On Windows, it takes a couple of minutes before the high quality resources are switched in, so I redid the Wine apitrace, waiting idly for what should have been enough time for the texture error to occur. The Windows apitrace waits in game long enough for the high quality resources to switch in before exiting the game.
Also, just curious: can you briefly explain what tool(s) you use to interpret the apitrace files? What sort of things do you look for in them to help identify issues? Perhaps using these tools myself will help me identify issues on my own. Thanks |
The logs were indeed useful because it seems like the game can't get the amount of VRAM your graphics card has. Apitrace allows you to look into all D3D calls the game makes and replay them without owning the game. I think I'm gonna look into the calls with which the game reads the amount of VRAM, look for differences between the DXVK one and the working D3D11 one and hope something sticks out. |
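For context, the usual way a D3D11 game reads the VRAM size is IDXGIAdapter::GetDesc. A minimal sketch of such a query (not the game's actual code; error handling trimmed):

```cpp
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

// Minimal sketch: read the adapter memory sizes the way a typical
// D3D11 game would. This is the value being compared between the
// DXVK and native D3D11 apitraces.
int main() {
  IDXGIFactory* factory = nullptr;
  if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
    return 1;

  IDXGIAdapter* adapter = nullptr;
  if (SUCCEEDED(factory->EnumAdapters(0, &adapter))) {
    DXGI_ADAPTER_DESC desc = {};
    adapter->GetDesc(&desc);
    // DedicatedVideoMemory and SharedSystemMemory are SIZE_T byte counts.
    printf("%ls: %zu MB dedicated, %zu MB shared\n",
           desc.Description,
           desc.DedicatedVideoMemory >> 20,
           desc.SharedSystemMemory  >> 20);
    adapter->Release();
  }
  factory->Release();
  return 0;
}
```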
Thanks for looking into that. At the risk of being off-topic, is https://apitrace.github.io/ the tool you're using, or are there others? I tried it briefly, and it seemed to provide API calls only for the first part of the recorded session. It didn't have API calls past the loading screen (though it did replay the full recorded session). Maybe the files I uploaded were simply too large. |
I assume LiF MMO is 64-bit; if so, please test: |
That fixed it.
Seems like it also causes the process to become CPU bound. Which API was it that was reporting invalid VRAM? |
This one: This isn't a proper fix, I just hardcoded all the values to what native D3D returned for your card to see if that works. The game apparently uses another API to get the VRAM amount and tries to match it against something that D3D11 returns (maybe the device name).
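Roughly, the hack described above would look like this on the DXVK side; a sketch with placeholder values standing in for what native D3D11 returned, not the actual patch:

```cpp
#include <dxgi.h>
#include <cstring>
#include <cwchar>

// Illustrative stub only, not the real DXVK change: fill a
// DXGI_ADAPTER_DESC with fixed values instead of the Vulkan-derived
// ones. The numbers below are placeholders; the real values came
// from the user's D3D11 apitrace.
HRESULT FillHardcodedAdapterDesc(DXGI_ADAPTER_DESC* pDesc) {
  if (pDesc == nullptr)
    return DXGI_ERROR_INVALID_CALL;

  std::memset(pDesc, 0, sizeof(*pDesc));
  wcscpy(pDesc->Description, L"NVIDIA GeForce GTX 1070");
  pDesc->VendorId             = 0x10DE;        // NVIDIA's PCI vendor ID
  pDesc->DeviceId             = 0x1B81;        // GTX 1070
  pDesc->DedicatedVideoMemory = 8064ull << 20; // placeholder byte count
  pDesc->SharedSystemMemory   = 8192ull << 20; // placeholder byte count
  return S_OK;
}
```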
I don't really know why that happens; I didn't really change anything performance-related. It's streaming in textures now, which is obviously extra work, but it probably shouldn't be that bad. What's causing the CPU load, the game process or the Wine server? Can you test if this one also does the trick: |
dxgi_nv_adapter_name.zip didn't fix the issue (high quality textures not streaming in, 0 texture memory budget reported in logs), and it also caused the process to become CPU bound like the first zip. So, worst of both worlds. I've been using DXVK version 0.70. Are your changes based on top of a different version?
It is a thread in the game process, cm_client.exe. Here is some
|
Try this one please: I've tried this build in Witcher 3 and it runs exactly the same as v0.70. It's based off the latest master, but that shouldn't impact performance either. Please also retest v0.70 to make sure that it's really running faster, because there shouldn't be any reason for that. |
dxgi_nv_adapter_name_luid.zip behaves similarly to dxgi_nv_adapter_name.zip. The performance difference between your libs and the 0.70 release under the releases tab on GitHub is less severe than I originally posted, but it does appear measurable and consistent. I tried harder to reduce the number of variables when testing the two sets of dlls this time, e.g. made sure I was rendering the same scenery, waited a few minutes until things stabilized, re-ran each test 4 times, and switched back and forth between the two sets of libs for each test. dxgi_nv_adapter_name_luid.zip
0.70 from the GitHub releases tab
I tested dxvk-0.65-state-cache-x64-simplified.zip briefly again, and it was practically the same as the dxgi_nv_adapter_name_luid.zip results above. Maybe the difference is from compiler flags/optimizations present in the official release but not in yours? |
I'm using MSVC, while doitsujin's builds are done with MinGW. I've never seen such a big performance gap, and if anything, builds done on Windows have been slightly faster for some reason. Let's concentrate on fixing the streaming issue first: |
The issue remains with dxgi_nv_adapter_subsys.zip. 0 texture memory budget reported in logs. |
Could the game be using nvapi? If so, try setting these env variables: DXVK_CUSTOM_VENDOR_ID=1002 DXVK_CUSTOM_DEVICE_ID=67DF (those IDs spoof an AMD card, so any Nvidia-specific code paths should be skipped). Also, disable both nvapi and nvapi64 in winecfg. |
I disabled nvapi64 (there was no nvapi, perhaps because it is a 64-bit prefix?), and did
and used the following config
It didn't appear to have any effect. |
So if the first dlls I provided worked and the ones after didn't, then the game is probably using
Yes, the first dlls. I see the comment that contained the link to the working dlls I downloaded has been edited. |
I just corrected the file name that I messed up. |
@K0bin By this, do you mean that you suspect the game is using these values to look up how much memory to reserve for textures based on PCI IDs, rather than a value like |
It seems like that's the case, as the values DXVK reports are certainly not 0. I've also read somewhere that their engine is based on Torque 3D.
|
I tried it. It didn't work. Sorry if that wasn't clear in #591 (comment). |
Searching around, I've been reading that this game uses a heavily modified Torque3D game engine, and Torque3D used to query the video device memory through WMI using the Win32_VideoController class. I'm not sure if this engine still does that. |
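For reference, that old Torque3D-style WMI query looks roughly like this (a sketch with error handling omitted; whether LiF still does this is exactly the open question). Notably, the AdapterRAM property is declared as a 32-bit value in WMI, so it can't even represent 4 GiB or more:

```cpp
#include <windows.h>
#include <wbemidl.h>
#include <comdef.h>
#include <cstdio>
#pragma comment(lib, "wbemuuid.lib")

// Sketch: query video memory via WMI's Win32_VideoController class,
// the way older Torque3D code reportedly did. Error handling omitted.
int main() {
  CoInitializeEx(nullptr, COINIT_MULTITHREADED);
  CoInitializeSecurity(nullptr, -1, nullptr, nullptr,
                       RPC_C_AUTHN_LEVEL_DEFAULT, RPC_C_IMP_LEVEL_IMPERSONATE,
                       nullptr, EOAC_NONE, nullptr);

  IWbemLocator* locator = nullptr;
  CoCreateInstance(CLSID_WbemLocator, nullptr, CLSCTX_INPROC_SERVER,
                   IID_IWbemLocator, (void**)&locator);

  IWbemServices* services = nullptr;
  locator->ConnectServer(_bstr_t(L"ROOT\\CIMV2"), nullptr, nullptr,
                         nullptr, 0, nullptr, nullptr, &services);

  IEnumWbemClassObject* results = nullptr;
  services->ExecQuery(
      _bstr_t(L"WQL"),
      _bstr_t(L"SELECT AdapterRAM FROM Win32_VideoController"),
      WBEM_FLAG_FORWARD_ONLY | WBEM_FLAG_RETURN_IMMEDIATELY,
      nullptr, &results);

  IWbemClassObject* obj = nullptr;
  ULONG count = 0;
  while (results->Next(WBEM_INFINITE, 1, &obj, &count) == S_OK && count) {
    VARIANT ram;
    VariantInit(&ram);
    if (SUCCEEDED(obj->Get(L"AdapterRAM", 0, &ram, nullptr, nullptr)) &&
        ram.vt == VT_UI4)
      printf("AdapterRAM: %u bytes\n", ram.uintVal); // 32-bit byte count
    VariantClear(&ram);
    obj->Release();
  }

  results->Release();
  services->Release();
  locator->Release();
  CoUninitialize();
  return 0;
}
```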
@ZeroFault, I think @K0bin said he only stubbed the results from IDXGIAdapter::GetDesc in order to fix the issue. So it would only be a matter of figuring out which of those stubbed values the game specifically requires (and then properly setting those values, instead of hard-coding them). |
That's exactly what I was trying to do with all these builds. I've found a forum post that said they deviated from Torque 3D in 2011. The earliest commit in the Torque3D GitHub repo is from 2012. They do the following:
If they were still using that code, it would probably work. I'll just assume that the DXGI path was implemented after they forked. So it's possible that DXVK has a different ordering for adapters than Wine, but that doesn't explain why Wine doesn't work, unless it was using an inconsistent ordering between WineD3D and WMI. That also wouldn't explain why hardcoding everything fixed it. I'm running out of ideas, so let's try something a little bit weird: I'm going to just hardcode the amount of memory. It's slightly less than what DXVK reports and shouldn't work, but let's just see what happens: |
@K0bin dxgi_memory_hardcode.zip works.
|
I'm confused.
The engine divides that by 1048576 (bytes => MB), which gets us 8077 and 8192. Why does it decide that the latter one only gets 0 MB for textures? |
@K0bin Can you throw the code for |
That's everything I changed in dxgi_memory_hardcode compared to master. Master uses the value that Vulkan gives us. I got these numbers from your D3D11 apitrace. The newer open source version of Torque only uses |
And just for reference, from the DXVK apitrace, I see:
Hopefully, other people familiar with graphics APIs will have some ideas. The only thing I can think of is that the engine falls back to 0 if it sees an unexpected value that doesn't match what it has hardcoded for the given PCI ID, though why it would do that, I don't know. |
This makes no sense to me so let's just see if it works if I subtract 8 bytes from the VRAM for no reason at all. |
That works. Please send a PR that arbitrarily subtracts 8 bytes from the VRAM. :)
|
fwiw, that doesn't really explain why the game engine reserves 0 bytes for texture memory, though. Perhaps the game engine uses another API to get the total memory, detects that the dedicated memory is equal to the total memory, and falls back to 0 because they shouldn't be equal? |
No, it's not. DXVK just reports the value it gets from Vulkan, and this game is particularly stupid in that it seems to cut off the upper 32 bits from the reported value. I'll add a config option for this, and limit this game to 4095 MB by default if that works. |
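For reference, the Vulkan-side number comes from the device-local memory heaps; something like this sketch, rather than DXVK's exact code:

```cpp
#include <vulkan/vulkan.h>

// Sketch of where DXVK's number comes from: sum the sizes of all
// DEVICE_LOCAL memory heaps the Vulkan driver reports. Not DXVK's
// exact code, just the general idea.
VkDeviceSize queryDeviceLocalMemory(VkPhysicalDevice device) {
  VkPhysicalDeviceMemoryProperties memProps;
  vkGetPhysicalDeviceMemoryProperties(device, &memProps);

  VkDeviceSize total = 0;
  for (uint32_t i = 0; i < memProps.memoryHeapCount; i++) {
    if (memProps.memoryHeaps[i].flags & VK_MEMORY_HEAP_DEVICE_LOCAL_BIT)
      total += memProps.memoryHeaps[i].size;
  }
  return total; // on an 8 GiB card this is typically 8589934592 bytes
}
```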
Odd. It would be the high 31 bits, since it doesn't overflow/fail at 4 GB; it works up to but not including 8 GB. @doitsujin, maybe it should be limited to min(actual_value, 0x1ffffffff)? Otherwise, a 4 GB default might be an issue if you don't in fact have at least 4 GB. Plus, that approach wouldn't limit normal users who have a modern 8 GB+ GPU and aren't going to mess with config files. |
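In code, the suggested clamp would be something along these lines (hypothetical, operating on the byte count DXVK reports):

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical clamp as suggested above: cap the reported byte count
// just below 8 GiB so the engine's 32-bit math can't wrap to 0, while
// leaving smaller cards untouched.
uint64_t clampReportedVram(uint64_t actualBytes) {
  return std::min<uint64_t>(actualBytes, 0x1FFFFFFFFull); // 8 GiB - 1 byte
}
```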
@doitsujin Implemented the option to limit memory, please verify that this works: |
Three issues with the patch:
Checking the dlls here in a minute… Update: yes, those dlls worked. |
If they even have the same filename, then I'd assume that they have the same problem.
4095 MB is the highest value you can get out of a 32-bit unsigned byte count (2^32 bytes is 4096 MB). The 1070 has 8192 MB of VRAM; store that byte count in a 32-bit uint and it wraps to 0, while subtracting those 8 bytes first lands it just under the wrap point, at 4095 MB. |
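The arithmetic, as a runnable sketch assuming the engine stores the byte count in a 32-bit unsigned integer:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
  uint64_t vram = 8ull * 1024 * 1024 * 1024;    // 8 GiB, as DXVK reports it
  uint32_t asReported = (uint32_t)vram;         // truncates to 0
  uint32_t minusEight = (uint32_t)(vram - 8);   // wraps to 4294967288
  printf("8 GiB truncated:        %u MB\n", asReported / 1048576); // 0 MB
  printf("8 GiB - 8 B, truncated: %u MB\n", minusEight / 1048576); // 4095 MB
  return 0;
}
```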
I see what you mean. Thanks |
It does work? |
Yes (sorry, I updated the last comment with the results) |
Hello! I am running LiF:MMO from Steam using experimental Proton. Please find attached the log file. |
This isn't a DXVK log so it isn't useful to us. |
If in fact it is the same issue, you should be able to fix it with the same workaround, i.e. setting the config overrides to a value below 4 GB that doesn't overflow 32 bits. See 73cbf5b (lines 43 to 50 in 2133049).
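For reference, the override would look roughly like this in dxvk.conf (assuming the dxgi.maxDeviceMemory and dxgi.maxSharedMemory options from that commit; values in megabytes):

```
# Cap the memory sizes reported to the game so the byte count stays
# below 2^32 (values in MiB; illustrative, check dxvk.conf for the
# exact option names)
dxgi.maxDeviceMemory = 4095
dxgi.maxSharedMemory = 4095
```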
However, from a brief search online, I don't think your Radeon HD 5600 video card has 4 GB+ of memory, which was part of this original issue. Wikipedia lists that card as having 1 or 2 GB.
|
The HD 5600 also doesn't support Vulkan 🐸 |
Oh, thanks. |
Hello. What is the final fix for the problem?
I don't speak English and Google Translate doesn't translate very well, so please write a simple answer that is easy for me to understand.
There are rendering issues in LiF MMO, as seen below. The DXVK HUD shows up as expected. I'm not sure what to make of it; it seems like the shaders for clothing/skin/vegetation aren't applying correctly?
I tried all DXVK_CONFIG_FILE settings, but none helped.
Software information
System information
Apitrace file(s)
Log files
wined3d
If it helps identify whether the issue is in DXVK: the game renders similarly with wined3d (d3d11.dll, d3d10.dll, dxgi.dll, etc. set to builtin).
With wined3d, there are numerous errors related to blitting that are not present in dxvk, e.g.