Unsolved


September 30th, 2020 13:00

UP3216Q, Radeon RX 580, not showing 10-bit color space

Hello,

I have a Dell UP3216Q monitor, a 2017 AMD Radeon RX 580 video card, and I'm rocking the latest Windows 10 updates. I have installed all the codecs recommended by Microsoft:

HEVC
AV1
VP9

My AMD video card has the latest drivers. When I look at the settings I am told that I am using 10-bit and that I am connected to the UP3216Q with a DisplayPort cable.

The advanced display settings state 10-bit. However, I cannot find a setting that would let my UP3216Q show that it can display HDR. (Attachments: Advanced display settings.jpg, RX580 screen grab.jpg, Windows display settings.jpg)

 

I would appreciate any help or advice.

3 Apprentice • 739 Posts

October 8th, 2020 04:00

Because your display is not HDR, and it also does not accept an HDR signal.
There are so-called "HDR" monitors that are not actually HDR but do accept an HDR signal and translate it down to the panel's SDR capabilities ("fake HDR"). The UP3216Q is not even one of those.

If you want to watch HDR movie content on that display, WITHIN ITS LIMITATIONS in brightness and contrast, use madVR with a compatible video player. You can still see P3 colors in some TV-brand demo clips available on the internet for download.
Then make a LUT3D from the HDR content's colorspace to your display profile. The best way to do it is with the DisplayCAL software.
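To see why tone mapping is unavoidable here: HDR10 video is encoded with the SMPTE ST 2084 "PQ" curve, which represents luminance up to 10,000 nits, far beyond what this panel can output. A minimal sketch of the PQ decode plus the crudest possible mapping, a clip to the display's peak (the 300-nit peak is an assumption about the panel, and madVR uses far smarter roll-off curves than this):

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a PQ-encoded signal value in 0..1 to absolute luminance in nits."""
    ep = e ** (1 / M2)
    return 10000.0 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)

def tone_map(nits, display_peak=300.0):
    """Crudest possible tone mapping: clip to the display's peak luminance.
    display_peak=300.0 is an assumed panel limit, not a measured value."""
    return min(nits, display_peak)
```

A signal value of 1.0 decodes to the full 10,000 nits, which the clip collapses to the panel's peak; everything brighter than the display can show is lost, which is exactly the limitation the paragraph above warns about.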

Note: the monitor OSD preset should be the same one that was used to make that ICC profile. If you rely only on factory calibration, you should not have bought a wide-gamut monitor in the first place... but you can switch to the "Standard" OSD preset and use the Dell ICC profile installed with the monitor driver as the "colorspace destination" when making the LUT3D.
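For what it's worth, the LUT3D mentioned above is just a lattice of corrected RGB values; the player looks up each pixel with trilinear interpolation between the eight nearest lattice points. A rough sketch of that lookup (not DisplayCAL's or madVR's actual code; the lattice size here is made up, and the identity LUT is only a stand-in for a real measured one):

```python
import numpy as np

def apply_lut3d(rgb, lut):
    """Look up an RGB triplet (components in 0..1) in a size^3 3D LUT
    using trilinear interpolation between the 8 surrounding lattice points."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo  # fractional position inside the cell
    out = np.zeros(3)
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                # Weight of this corner = product of per-axis distances.
                w = ((f[0] if dr else 1 - f[0])
                     * (f[1] if dg else 1 - f[1])
                     * (f[2] if db else 1 - f[2]))
                idx = (hi[0] if dr else lo[0],
                       hi[1] if dg else lo[1],
                       hi[2] if db else lo[2])
                out += w * lut[idx]
    return out

# Identity LUT on a 5x5x5 lattice: output equals input, no correction applied.
n = 5
grid = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
```

With the identity lattice, `apply_lut3d((0.3, 0.6, 0.9), identity)` returns the input unchanged; a real LUT from DisplayCAL fills the lattice with the measured HDR-to-display-profile mapping instead.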
