
January 22nd, 2015 00:00

P2715Q, support 1.07 billion colors?

Hi guys.

I have owned this P2715Q for just a few days. At first I was happy with the "1.07 billion colors" claim in its tech specs, but after some investigation I really doubt that description.

Here is the thing. On my PC, under AMD Catalyst Control Center - My Digital Flat-Panels - Properties (Digital Flat-Panel) - Color Depth, I only see two options, "6 bpc" and "8 bpc", after properly connecting the monitor to an R9 290 through a Mini DisplayPort cable. The tech specs say the P2715Q supports 1.07 billion colors, which is 10-bit color depth (I know the P2715Q uses 8-bit + A-FRC to reach 10 bits). So a "10 bpc" option should be expected in the AMD CCC color-depth setting.

Furthermore, I pulled the EDID information with an AMD EDID utility. According to the EDID 1.4 data format, the byte at offset 14h is the Video Input Definition. If the monitor really supported 10-bit color depth, bits 7 through 4 of byte 14h should read "1 0 1 1". However, my monitor's byte at 14h is "A5", i.e. "1 0 1 0 0 1 0 1", so bits 7 through 4 are "1 0 1 0". Per the "VESA Enhanced Extended Display Identification Data Standard", that means this monitor declares only 8 bits per primary color.
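For anyone who wants to check their own EDID dump, the interpretation above can be sketched in a few lines of Python (the function name is mine; the bit layout follows the EDID 1.4 table for digital inputs):

```python
# Minimal sketch: decode the color-depth field of EDID 1.4 byte 0x14
# (Video Input Definition). Bit 7 set = digital input; bits 6:4 encode
# the supported bits per primary color.

def decode_color_depth(byte_14: int) -> str:
    """Interpret bits 6:4 of EDID byte 0x14 (digital inputs only)."""
    if not (byte_14 & 0x80):
        return "analog input (bit-depth field not applicable)"
    depth_bits = (byte_14 >> 4) & 0x07   # extract bits 6:4
    depths = {0b001: "6 bpc", 0b010: "8 bpc", 0b011: "10 bpc",
              0b100: "12 bpc", 0b101: "14 bpc", 0b110: "16 bpc",
              0b000: "undefined"}
    return depths.get(depth_bits, "reserved")

print(decode_color_depth(0xA5))  # P2715Q as shipped -> 8 bpc
print(decode_color_depth(0xB5))  # 10 bpc would look like this
```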

So I want to know whether this is an isolated case or whether it applies to all P2715Q monitors. Could other P2715Q owners check their AMD CCC setting (if you are using an AMD graphics card) or pull the EDID information?

 

1 Message

June 1st, 2015 06:00

I had the same doubt, and Google brought me here.

After reading almost all the posts and trying for hours, I found that it does support 10-bit.

My P2715Q's EDID has the same byte at 14h as yours: A5.

The solution turns out to be quite simple: override the monitor's EDID with a file.

My card is a Quadro K620.

First I tried using an .inf file and failed; maybe I did not do it right.

Then I reached the working solution...

1. Use

NVIDIA Control Panel/Workstation/View system topology/System/Quadro XXX/DELL P2715Q/EDID source/Export

to pull out the EDID,

2. Change the a5 to b5, then use

NVIDIA Control Panel/.../Quadro XXX/DisplayPort/EDID/Load

to load the modified EDID. You may need to restart the OS, and then it is done.
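A note on step 2: every 128-byte EDID block must checksum to zero, so if you patch byte 14h in the raw binary you also need to update the checksum byte at offset 127, or the driver may reject the block. A minimal sketch, assuming you are editing the exported EDID as a raw 128-byte binary file (the function name is mine):

```python
# Sketch: patch byte 0x14 (Video Input Definition) of a 128-byte base
# EDID block and repair the checksum. All 128 bytes must sum to 0 mod
# 256; changing 0xA5 to 0xB5 without fixing byte 127 yields an invalid
# block.

def patch_edid(block: bytes, new_byte_14: int = 0xB5) -> bytes:
    assert len(block) == 128, "expected a 128-byte base EDID block"
    data = bytearray(block)
    data[0x14] = new_byte_14          # e.g. 0xA5 (8 bpc) -> 0xB5 (10 bpc)
    data[127] = 0                     # clear the old checksum...
    data[127] = (-sum(data)) % 256    # ...and recompute so total % 256 == 0
    return bytes(data)
```

Some EDID override tools recompute the checksum for you; if yours does, editing just the one byte is enough.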

The file yumichan mentioned, 10 bit test ramp.zip, can be used to test.

Many thanks to beamformer, DELL-Chris M, and yumichan for the information.

2 Posts

October 30th, 2015 11:00

I came here through Google looking for EDID hacks to enable 10-bit, since I own a W1070 projector that supports 10-bit but I have an Nvidia card, which disables 10-bit in the driver (though the hardware can certainly do it) in order to sell their higher-end cards.

1) One can most definitely use a 10-bit color desktop setting with a TV or monitor that supports 10 bpc (30-bit total) - this has been possible since Windows XP - and get the benefit of reduced banding.

AMD has OFFICIAL support for 10-bit color in its drivers on ALL consumer-level graphics cards. As long as the EDID advertises it, you're good to go, using either DP, VGA, DVI, or HDMI Deep Color (30-bit).

I've spoken to AMD engineers about it, and it's definitely possible for games to send 10 bits per channel over HDMI to most TVs, which have supported 30/36/48-bit Deep Color since HDMI 1.3. VGA, being analog, only depends on your app or Windows desktop setting being 10-bit, plus the RAMDAC and the SNR of your cable; in theory an analog connection could carry an arbitrary bit depth, up to the limit of the SNR, and short VGA cables can handle more than 16 bpc.

2) Many 2015 TVs have 10-bit support, either through native panel bit depth or through 8-bit + FRC dithering, and it's certainly a good idea to pick an AMD card to exploit that, since banding is a big issue and native 10-bit video content is coming in the form of UHD Blu-rays.

10-bit color has been supported natively in Windows since Windows XP, and on Windows 7/10 it's really easy to enable on AMD cards and worth doing, even if you have to override your EDID. It's also easy to verify: display a 16-bit grayscale gradient PNG first at 8 bpc, then at 10 bpc. If banding is not reduced by a factor of 4, then 10 bpc isn't actually working. But yes, it works. I trust the AMD engineers a lot more.
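The gradient test described above is easy to generate yourself. A sketch that writes a 16-bit grayscale horizontal ramp as a PGM file using only the Python standard library (the file name and dimensions are arbitrary; PGM with maxval 65535 stores each sample as two big-endian bytes):

```python
# Sketch of the banding test: a 16-bit grayscale horizontal ramp.
# Viewed at 8 bpc the ramp shows roughly 256 visible bands; at a
# working 10 bpc output it shows roughly 1024, i.e. 4x less banding.

WIDTH, HEIGHT = 1024, 256

with open("ramp16.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n65535\n" % (WIDTH, HEIGHT))  # binary PGM header
    row = bytearray()
    for x in range(WIDTH):
        v = x * 65535 // (WIDTH - 1)   # 0..65535 across the width
        row += v.to_bytes(2, "big")    # PGM: most significant byte first
    f.write(bytes(row) * HEIGHT)       # every row is identical
```

Open the resulting file in any viewer that supports 16-bit PGM and compare the banding at 8 bpc versus 10 bpc output.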

My BenQ W1070 projector supports 10-bit / 1.07 billion colors through dithering, and as soon as UHD Blu-rays come out I intend to take advantage of the reduced banding plus the support for the DCI-P3 color space (through a color-conversion filter). So for 600 bucks and an AMD card, I can get a lot of video-quality "bang for the buck" by enabling 10 bpc.

Community Manager • 54.2K Posts

January 22nd, 2015 08:00

Email sent to the team.

Community Manager • 54.2K Posts

January 25th, 2015 11:00

Confirmed: it is 8-bit plus A-FRC (Advanced Frame Rate Control) dithering to attain 1.07 billion colors. But we do not know why AMD shows only 8-bit. Under investigation.

1 Message

January 25th, 2015 16:00

I have the same problem: AMD CCC only shows 8-bit color support for the P2715Q, and I'm using the latest "Omega" driver with a Radeon R9 280X. Thanks for looking into this, beamformer!

11 Posts

January 25th, 2015 18:00

The AMD driver reads the EDID information stored in EEPROM on the monitor's logic board and uses it to determine which color-depth video format the graphics card will send. Since the EDID says this monitor only supports 8-bit color depth, the AMD driver simply sends an 8-bit signal and reports 8 bits as the maximum. My suspicion is that Dell engineers forgot to correct the EDID data when they adapted it from some template.

1 Rookie • 719 Posts

January 26th, 2015 10:00

You won't get 10-bit support on a monitor unless you use ALL of these at the same time, on the same computer:

-a monitor with 10-bit input (not necessarily a 10-bit panel, just input)

-Windows

-an AMD FirePro or Nvidia Quadro

-an app with 10-bit OpenGL output, like Photoshop.

Since you have an R9, you won't get 10-bit; you can't, whether the monitor is from Dell, Eizo, or NEC. But since you own an AMD GPU, you can calibrate your monitor in the GPU without banding. Gamer Nvidia cards and Intel integrated graphics cannot do that, so you have "some" advantage.

1 Rookie • 719 Posts

January 26th, 2015 11:00

That would be the HDMI Deep Color feature. Anyway, there is no ACTUAL 10-bit output unless all four requirements are met. Your system fails to meet ONE of them, so no ACTUAL 10-bit support (aka a 30-bit pipeline).

This is not an issue with the Dell monitor, just inappropriate hardware in your system for that feature. No new FirePro or Quadro, no 10-bit. It's a simple and final statement.

1 Rookie • 719 Posts

January 26th, 2015 11:00

For example:

http://www.amd.com/en-us/products/graphics/workstation/firepro-3d

search for "30-bit pipeline" and educate yourself on this matter before answering.

11 Posts

January 26th, 2015 11:00

The Samsung S27D850T advertises 1.07 billion colors, and my AMD CCC shows it supports 10-bit color depth. The Dell P2715Q advertises 1.07 billion colors, but CCC shows a maximum color depth of 8 bits. I cannot comment on your answer unless you post some official references from AMD, Nvidia, or Adobe.

11 Posts

January 26th, 2015 11:00

I also own a Samsung S27D850T, and my R9 does show that monitor supporting 10-bit color.

11 Posts

January 26th, 2015 11:00

I do not care about the 30-bit pipeline or pro graphics cards. Just tell me why the R9 driver shows the S27D850T supporting 10-bit color depth while the P2715Q only supports a maximum of 8 bits. And why does the S27D850T have "C15" at location "14h" while the P2715Q has "A15" at the same location in its EDID?

11 Posts

January 26th, 2015 11:00

I never expected to actually see 1.07 billion colors; I can barely tell the difference between 6-bit and 8-bit color. But I paid the money for this monitor, and I must make sure all the actual parameters match its advertising.

1 Rookie • 719 Posts

January 26th, 2015 11:00

But you do not have, and will not have, 10-bit support on that Samsung monitor with your current GPU. The same applies to your Dell monitor. There is no 30-bit pipeline without an AMD FirePro or Nvidia Quadro video card. You do not have an issue with your Dell monitor.

11 Posts

January 26th, 2015 11:00

I never connect my monitor through HDMI; I only use a DisplayPort 1.2 connection.
