
10-bit color support in opengl client #1309

Closed

totaam opened this issue Sep 16, 2016 · 12 comments

totaam commented Sep 16, 2016

Split from #909.

The best explanation of the changes required can be found in https://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf (see the "30-Bit Visual on Linux" section).

We'll need to tell the server we want 10-bit colour, maybe advertise a new YUV or RGB upload mode.
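
For illustration, a hypothetical sketch of what advertising such a mode could look like in the client's hello capabilities ("r210" is the format name adopted later in this ticket; the capability key shown here is an illustrative assumption, not the actual protocol code):

```python
# Hypothetical hello capabilities: the client advertises a packed 10-bit
# RGB upload format alongside the existing 8-bit ones, letting the server
# pick it for plain rgb paints ("r210" is the name used later in this
# ticket; the key name is an assumption for illustration).
hello_caps = {
    "encodings.rgb_formats": ["RGB", "BGRA", "r210"],
}
```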

totaam commented Feb 8, 2017

With r15015, running xpra/client/gl/gl_check.py against a 30-bit display I get [/attachment/ticket/1309/gl_check.txt], which shows:

* blue-size                       : 10
* red-size                        : 10
* green-size                      : 10
* depth                           : 30
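
These are the standard OpenGL bit-size attributes; a minimal sketch of querying them directly, assuming a current OpenGL context and PyOpenGL (this is not gl_check.py itself, which goes through the GTK GL bindings):

```python
# Query the per-channel bit sizes of the current GL context/visual.
# Requires an active OpenGL context (compatibility profile).
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

def channel_bits():
    # returns (10, 10, 10) on a 30-bit visual, (8, 8, 8) on a 24-bit one
    return tuple(int(glGetIntegerv(attr))
                 for attr in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS))
```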

So we can detect support for 30-bit color, ie: 10 bits per channel.
And r15018 handles 30-bit modes with native 30-bit upload: "r210" == "GL_UNSIGNED_INT_2_10_10_10_REV".
r15019 fixes swapped red and blue channels (oops), and r15026 allows us to prefer the high bit depth "r210" plain rgb encoding when the client is using 10-bit rendering (jpeg and video encodings will still be used for lossy packets).
r15027 shows the bit depth on session info (the normal bit depth is 24):
(screenshot: session-info-bit-depth.png)

We could probably handle "R210" the same way (as "GL_UNSIGNED_INT_2_10_10_10"), but since I don't have hardware to test it with, this is not supported.
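
For reference, a minimal PyOpenGL sketch of the native "r210" upload path described above, assuming the pixel data arrives as one packed 32-bit word per pixel (a simplified illustration, not xpra's actual paint code):

```python
# Upload a packed 10-bit-per-channel "r210" buffer as a texture using the
# GL_BGRA format with the GL_UNSIGNED_INT_2_10_10_10_REV data type.
from OpenGL.GL import (
    glGenTextures, glBindTexture, glTexImage2D,
    GL_TEXTURE_2D, GL_RGB10_A2, GL_BGRA,
    GL_UNSIGNED_INT_2_10_10_10_REV,
)

def upload_r210(pixels, width, height):
    # each pixel is one 32-bit word: 2 alpha bits + 3x10 colour bits
    tex = glGenTextures(1)
    glBindTexture(GL_TEXTURE_2D, tex)
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height,
                 0, GL_BGRA, GL_UNSIGNED_INT_2_10_10_10_REV, pixels)
    return tex
```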

@afarr: FYI, we can handle high color depth displays (only tested on Linux).

totaam commented Feb 9, 2017

2017-02-09 08:30:19: antoine uploaded file session-info-bit-depth.png (50.0 KiB): shows the bit depth on session info

totaam commented Feb 16, 2017

PS: r15094 fixes opengl rendering, which broke because our hacked pygtkglext library is missing the "get_depth" method. OSX clients will not support high bit depths until this is fixed: #1443

totaam commented Feb 20, 2017

See new wiki page: ImageDepth

totaam commented Jun 20, 2017

2017-06-20 21:00:35: maxmylyn commented


Realistically, we won't be able to test this until we get proper hardware for it. And even then, I have no idea what said proper hardware will be.

@antoine - some input as to what we should be testing with would be nice, but I wouldn't hold my breath on us actually getting said equipment if it involves asking for new hardware.

totaam commented Jun 20, 2017

> @antoine - some input as to what we should be testing with would be nice, but I wouldn't hold my breath on us actually getting said equipment if it involves asking for new hardware.

You may already have all you need:

  • a good monitor: most 4K monitors can do 10-bit colour nowadays
  • a recent enough graphics card
  • probably best to use a DisplayPort cable, not HDMI
  • support also varies from OS to OS (more limited on MS Windows, AFAICT)

More nvidia info here: 10-bit per color support on NVIDIA Geforce GPUs

Actually verifying that you are rendering at 10-bit per colour is a bit harder:

  • see comment:1 to ensure 10-bit rendering is available
  • enable 30-bit as per ImageDepth
  • force RGB-only encoding with --encodings=rgb and verify that paint packets come through with the "r210" rgb pixel format
  • use an application that renders in 30-bit colour and verify that we forward the colours accurately (we should probably write a test opengl app to make this easier: high bit depth test application #1553); a pack/unpack sketch follows this list
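
For that last accuracy check, a hypothetical helper for packing and unpacking 10-bit channel values, assuming the GL_BGRA + GL_UNSIGNED_INT_2_10_10_10_REV layout used by "r210" (blue in the low 10 bits, then green, then red, with 2 alpha bits on top); this is not part of xpra:

```python
# Pack / unpack "r210" pixels so that forwarded 10-bit colour values can
# be compared exactly against what the test application rendered.
def pack_r210(r, g, b, a=3):
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (r << 20) | (g << 10) | b

def unpack_r210(word):
    # returns (r, g, b), each in the 0..1023 range
    return (word >> 20) & 0x3ff, (word >> 10) & 0x3ff, word & 0x3ff

assert unpack_r210(pack_r210(1023, 512, 0)) == (1023, 512, 0)
```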

Edit: AMD's 10-bit Video Output Technology seems to indicate that 10-bit color requires a "FirePro" workstation card.

totaam commented Jul 12, 2017

Updates and fixes:

  • r16297 fixes mixed mode ("r210" vs "BGRA" uploads must use different upload data types), and fixes red and blue being swapped when downsampling r210 to RGB (ie: with all non-plain-rgb picture codecs); a sketch of that downsampling follows this list
  • r16298: make it possible to force the high bit depth code path using the XPRA_FORCE_HIGH_BIT_DEPTH=1 env var
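
For clarity, a minimal sketch of the r210-to-8-bit downsampling mentioned above: keep the top 8 bits of each 10-bit channel, and read the channels from the right bit positions so red and blue don't get swapped (xpra's actual CSC code is not this Python function):

```python
# Downsample one packed "r210" pixel (2+10+10+10 bits) to 8-bit RGB by
# dropping the 2 low-order bits of each channel.
def r210_to_rgb24(word):
    r = (word >> 20) & 0x3ff   # red lives in bits 20-29...
    g = (word >> 10) & 0x3ff
    b = word & 0x3ff           # ...and blue in bits 0-9: easy to swap!
    return r >> 2, g >> 2, b >> 2
```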

The test application is ready in #1553, but it's not really easy to use because it requires opengl: virtualgl can't handle the "r210" pixel format, and the software gl renderer doesn't support it either.
So in order to test, I had to run the xpra server against my main desktop with the nvidia driver configured at 10 bpc.
Then connect a client... and the only client I had available for testing was a windows 7 system, and ms windows doesn't do 10 bpc with consumer cards, so I had to swap cards. Then the monitor it was connected to didn't handle 10 bpc, so I had to swap that too. Then the cables were too short. Then I had to make fixes (see this ticket and many other fixes from yesterday - bugs you only hit with --use-display, for example...).
TL;DR: hard to test!

totaam commented Jul 12, 2017

r16303: the "pixel-depth" option can now be used to force the opengl client to use deep color (use any value higher than 30) - even if the display doesn't claim to render deep color.
ie: running the server with --pixel-depth=30 -d compress, and a linux opengl client with --pixel-depth=30 --opengl=yes, I see:

compress:   0.1ms for  499x316  pixels at    0,0    for wid=1     using  rgb24 with ratio   1.6% \
    (  615KB to     9KB), sequence     5, client_options={'lz4': 1, 'rgb_format': 'r210'}

Note the "r210" rgb format. The same result occurs if the client is running on a 30-bit display with --pixel-depth=0 (the default).
Whereas if the client runs on a 24-bit display, or if we force-disable deep color with --pixel-depth=24, then we see:

compress:   1.4ms for  499x316  pixels at    0,0    for wid=1     using  rgb24 with ratio   1.3% \
    (  615KB to     7KB), sequence     3, client_options={'lz4': 1, 'rgb_format': 'RGB'}
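
The selection behaviour described above could be summarised with a sketch like this (hypothetical pseudo-logic, not xpra's actual option handling):

```python
# Decide whether the opengl client should take the deep color ("r210")
# rendering path, based on the --pixel-depth option and the display.
def use_deep_color(pixel_depth, display_bits_per_channel):
    if pixel_depth > 30:          # values above 30 force deep color,
        return True               # even if the display doesn't claim it
    if pixel_depth == 0:          # 0 = auto: follow the display
        return display_bits_per_channel >= 10
    return pixel_depth >= 30      # 30 opts in, 24 (or lower) disables
```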

Remaining issues:

totaam commented Jul 13, 2017

Updates:

  • r16307 adds some of the test code to the xpra source under xpra/client/gtk_base/example and makes it python3 compatible - those examples have a helper shortcut on macos, and EXE binaries on win32
  • r16326 (+r16327 fixup) adds 16-bit opengl rendering support - very useful for forcing a low bit depth and seeing more color banding
  • r16328 (+r16329 fixup) better transparency support for 16-bit mode
  • r16331 improves the color gradient app (see high bit depth test application #1553#comment:4); a minimal sketch of such a gradient test follows this list
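
For illustration, a minimal sketch of a gradient test window along the lines of the #1553 app, assuming PyGObject with GTK 3 and cairo (a smooth ramp makes colour banding easy to spot at lower bit depths; this is not the actual example code added in r16307):

```python
# Draw a smooth black-to-white horizontal ramp: at 8 bpc (or the 16-bit
# mode added above) visible banding appears, at 10 bpc it looks smooth.
import gi
gi.require_version("Gtk", "3.0")
from gi.repository import Gtk
import cairo

def draw(widget, cr):
    w = widget.get_allocated_width()
    h = widget.get_allocated_height()
    gradient = cairo.LinearGradient(0, 0, w, 0)
    gradient.add_color_stop_rgb(0.0, 0.0, 0.0, 0.0)   # black on the left
    gradient.add_color_stop_rgb(1.0, 1.0, 1.0, 1.0)   # white on the right
    cr.set_source(gradient)
    cr.rectangle(0, 0, w, h)
    cr.fill()

window = Gtk.Window(title="gradient test")
area = Gtk.DrawingArea()
area.connect("draw", draw)
window.add(area)
window.set_default_size(1024, 128)
window.connect("destroy", Gtk.main_quit)
window.show_all()
Gtk.main()
```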

With these changes, it is now much easier to test.

For macos, see also #1443

totaam commented Jul 20, 2017

Tested on win32 (no luck) and Linux (OK) as part of #1553; for macos testing, see #1443. Closing.

totaam closed this as completed on Jul 20, 2017
totaam commented Jul 28, 2017

opengl applications running through virtualgl currently require this patch: #1577#comment:2

totaam commented Jul 29, 2019

NVIDIA @ SIGGRAPH 2019: NV to Enable 30-bit OpenGL Support on GeForce/Titan Cards: At long last, NVIDIA is dropping the requirement to use a Quadro card to get 30-bit (10bpc) color support on OpenGL applications; the company will finally be extending that feature to GeForce and Titan cards as well.
