Yesterday I got my new Dell U3417W monitor, replacing the U2413. With the old monitor, 10-bit color was no problem; the option was available and selected by default in the Radeon settings (Radeon Vega Frontier Edition w/ Gaming Drivers 18.2.2).
Now with the new monitor, only the 8-bit and 6-bit options are available. The newest drivers for the Radeon Vega card and the monitor are installed. What am I doing wrong?! Do I need to switch some option in the Dell OSD for 10-bit to become available? The Radeon card can't be the problem, since it works flawlessly with the old U2413 monitor.
Thanks a lot for any suggestions & ideas,
The AMD driver with the 2012 U2413 was only driving 10-bit (8-bit + A-FRC) at 1920x1200, so that is not a good comparison to the U3417W's 10-bit (8-bit + A-FRC) at 3440x1440. The video card driver has to offer full 10-bit support at that high resolution. You need to post the issue on the AMD Forums and ask which AMD driver offers 10-bit support at that resolution. By the way, we have never tested this hybrid AMD video card (not a gaming card, not a developer card). Read these reviews, here and here.
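To get a rough feel for how much more the 3440x1440 panel asks of the link than the old 1920x1200 one, here is a back-of-the-envelope sketch in Python. The flat 20% blanking overhead and the 60 Hz refresh are assumed round figures for illustration, not exact CVT timings:

```python
# Rough link bandwidth needed per mode. The 20% blanking overhead is an
# assumed round figure, not an exact CVT-R timing calculation.

def link_gbps(width, height, refresh_hz, bits_per_channel, blanking=1.20):
    bits_per_pixel = 3 * bits_per_channel      # RGB, no chroma subsampling
    pixel_clock = width * height * refresh_hz * blanking
    return pixel_clock * bits_per_pixel / 1e9  # Gbit/s

for w, h, bpc in [(1920, 1200, 10), (3440, 1440, 8), (3440, 1440, 10)]:
    print(f"{w}x{h} @ 60 Hz, {bpc} bpc: {link_gbps(w, h, 60, bpc):.1f} Gbit/s")
```

This prints roughly 5.0, 8.6, and 10.7 Gbit/s: the new panel at 10-bit needs about twice the data rate of the old one. Whether the driver actually exposes that mode is a separate question, which is why the AMD Forums are the right place to ask.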
Thanks a lot for your feedback and for the information that the missing 10-bit option might depend on the resolution.
I will try the AMD Forums then.
Thanks & greetings from Germany,
Since I wasn't able to solve the problem with the U3417W (no 10-bit even at lower resolutions, and no help on the AMD Forums), I tried the other monitor option, the UP2516D, which arrived today.
Unplugged the U3417W, plugged in the UP2516D - and there it is again - 10-bit color.
Since the U3417W doesn't even report itself as a 10-bit display in its EDID information (the same problem user turantelle has with the U3415W here in the forums) and the UP2516D does, I think that is and was the problem.
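For anyone who wants to check what their monitor actually advertises: in an EDID 1.4 block the supported bit depth is encoded in bits 6:4 of byte 20. Below is a minimal Python sketch; the sysfs path is just an example from a Linux machine (connector names differ per system), and on Windows a tool like Monitor Asset Manager shows the same bytes.

```python
# Minimal sketch: read the color bit depth a monitor advertises in its EDID.
# The sysfs path below is an example; your connector name will differ.

def edid_bit_depth(edid: bytes) -> str:
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID block")
    version, revision = edid[18], edid[19]
    video_input = edid[20]                 # video input parameters bitmap
    if not video_input & 0x80:
        return "analog input (no digital bit depth field)"
    if (version, revision) < (1, 4):
        return "EDID older than 1.4: bit depth field not defined"
    depth_bits = (video_input >> 4) & 0x07  # bits 6:4 encode bit depth
    table = {0: "undefined", 1: "6-bit", 2: "8-bit", 3: "10-bit",
             4: "12-bit", 5: "14-bit", 6: "16-bit", 7: "reserved"}
    return table[depth_bits]

with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
    print(edid_bit_depth(f.read()))
```

If this prints "8-bit" (or "undefined") for the U3417W, the driver is only offering what the monitor told it to, which matches what I'm seeing.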