10-bit color support in opengl client #1309
With r15015, running
So we can detect support for 30-bit color and 10-bit per channel. We could probably handle R210 the same way (as "GL_UNSIGNED_INT_2_10_10_10"), but since I don't have hardware to test with, this is not supported. @afarr: FYI, we can handle high color depth displays (only tested on Linux).
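For illustration, here is a minimal PyOpenGL sketch of how packed 10-bit-per-channel pixel data could be uploaded into a deep-color texture. The function name, the GL_RGB10_A2 internal format and the use of the _REV type with BGRA ordering are assumptions for this sketch rather than xpra's actual code; which unpacking variant applies depends on the channel order of the source data.

```python
import numpy as np
from OpenGL.GL import (
    glGenTextures, glBindTexture, glTexImage2D, glTexParameteri,
    GL_TEXTURE_2D, GL_RGB10_A2, GL_BGRA,
    GL_UNSIGNED_INT_2_10_10_10_REV,
    GL_TEXTURE_MIN_FILTER, GL_TEXTURE_MAG_FILTER, GL_NEAREST,
)

def upload_r210_texture(width, height, pixels):
    """Upload packed 10-bit-per-channel pixel data into a GL_RGB10_A2 texture.

    `pixels` is a numpy uint32 array of length width*height, each element
    packing 2 alpha bits and three 10-bit colour components.
    Requires a current OpenGL context.
    """
    assert pixels.dtype == np.uint32 and pixels.size == width * height
    tex = glGenTextures(1)
    glBindTexture(GL_TEXTURE_2D, tex)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
    # the pixel type tells GL how to unpack each 32-bit word into
    # 10-bit R, G, B components plus 2 alpha bits:
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_INT_2_10_10_10_REV, pixels)
    return tex
```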
2017-02-09 08:30:19: antoine uploaded file
See new wiki page: ImageDepth
2017-06-20 21:00:35: maxmylyn commented
You may already have all you need:
More NVIDIA info here: 10-bit per color support on NVIDIA Geforce GPUs. Actually verifying that you are rendering at 10-bit per colour is a bit harder:
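One way to check from inside the client is to query the per-channel bit depth of the current drawable; a minimal sketch, assuming a current legacy/compatibility OpenGL context where the GL_*_BITS queries are still available:

```python
from OpenGL.GL import (
    glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS, GL_ALPHA_BITS,
)

def framebuffer_bit_depths():
    """Return the per-channel bit depth of the current drawable.
    On a 30-bit visual this should report 10 for red, green and blue."""
    return {
        "red":   int(glGetIntegerv(GL_RED_BITS)),
        "green": int(glGetIntegerv(GL_GREEN_BITS)),
        "blue":  int(glGetIntegerv(GL_BLUE_BITS)),
        "alpha": int(glGetIntegerv(GL_ALPHA_BITS)),
    }
```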
Edit: AMD's 10-bit Video Output Technology seems to indicate that 10-bit color requires a "FirePro" workstation card.
Updates and fixes:
The test application is ready in #1553, but it's not really easy to use because it requires opengl... virtualgl can't handle the "r210" pixel format, and the software gl renderer doesn't support it either.
r16303: the "pixel-depth" option can now be used to force the opengl client to use deep color (use any value higher than 30) - even if the display doesn't claim to render deep color.
Note the "r210" rgb format. Same result if the client is running on a 30-bit display with
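As a rough illustration of what an "r210" pixel could look like, here is a small numpy sketch of packing three 10-bit components into one 32-bit word; the exact bit layout (channel order, alpha position) is an assumption for this example, not a statement of xpra's wire format:

```python
import numpy as np

def pack_r210(r, g, b):
    """Pack 10-bit colour components into 32-bit words: 2 alpha bits on top,
    then 10 bits each of R, G and B.  The layout is illustrative only;
    adjust the shifts/order if xpra's actual r210 format differs."""
    r = np.asarray(r, dtype=np.uint32) & 0x3FF
    g = np.asarray(g, dtype=np.uint32) & 0x3FF
    b = np.asarray(b, dtype=np.uint32) & 0x3FF
    alpha = np.uint32(0xC0000000)     # both alpha bits set: fully opaque
    return (alpha | (r << 20) | (g << 10) | b).astype(np.uint32)
```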
Remaining issues:
Updates:
With these changes, it is now much easier to:
For macOS, see also #1443.
OpenGL applications running through virtualgl currently require this patch: #1577#comment:2
NVIDIA @ SIGGRAPH 2019: NV to Enable 30-bit OpenGL Support on GeForce/Titan Cards: At long last, NVIDIA is dropping the requirement to use a Quadro card to get 30-bit (10bpc) color support on OpenGL applications; the company will finally be extending that feature to GeForce and Titan cards as well.
Split from #909.
The best explanation of the changes required can be found in https://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf; see "30-Bit Visual on Linux".
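For the GLX side, the gist is that the application has to explicitly ask for a visual with 10 bits per colour channel. A sketch of the attribute list that could be passed to glXChooseVisual, using PyOpenGL's GLX constants (display and context setup omitted):

```python
from OpenGL import GLX

# attribute list for glXChooseVisual requesting 10 bits per colour channel;
# GLX_RGBA and GLX_DOUBLEBUFFER are boolean attributes and take no value:
DEEP_COLOR_VISUAL_ATTRIBS = [
    GLX.GLX_RGBA,
    GLX.GLX_RED_SIZE,   10,
    GLX.GLX_GREEN_SIZE, 10,
    GLX.GLX_BLUE_SIZE,  10,
    GLX.GLX_DOUBLEBUFFER,
    0,  # list terminator
]
```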
We'll need to tell the server we want 10-bit colour, maybe advertise a new YUV or RGB upload mode.
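A possible shape for that, sketched with made-up capability names (not xpra's actual wire protocol), would be to add the deep-colour format to the RGB upload modes advertised in the client's hello packet:

```python
def make_display_capabilities(bits_per_channel):
    """Hypothetical hello-capability sketch: tell the server the client can
    accept a packed 10-bit RGB upload mode when the display supports it."""
    rgb_formats = ["BGRX", "RGBX", "RGBA", "BGRA"]
    if bits_per_channel >= 10:
        rgb_formats.append("r210")          # packed 10-bit-per-channel mode
    return {
        "encodings.rgb_formats": rgb_formats,
        "display.bits-per-channel": bits_per_channel,
    }
```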