
April 3rd, 2014 15:00

U2413, Verify calibration in DCCS 1.5.3?

I had a lot of trouble getting a consistent calibration, and now I would like to confirm that the calibration is good.

My kit:
2 x U2413 Dec 2013 in DisplayPort 1.2 Chain
i1 DisplayPro
Dell Ultrasharp Calibration Solution DCCS 1.5.3
dispcalGUI 1.5.7.7 with Argyll_V1.6.3
AMD A10-7850 APU graphics - Catalyst Driver Brighter Whites OFF, Dynamic Range Full(0-255), Enable dynamic contrast OFF
CAL1: sRGB
CAL2: Adobe RGB


My Calibration method for CAL1 sRGB:
Warm up the display for 30 minutes, tilt back a bit and turn off bright lights.
  DCCS Advanced mode > Profile > RB Primaries Custom xy... > Set to Wikipedia values
  White Point CIE Illuminant D65 > Luminance 120 > Gamma Tone Response Curve sRGB
  Chromatic Adaptation Bradford - ICC Profile Version 2 - Type Table
  Large Patch Set > Start Measurement
  Check the brightness and contrast values when it is finished and save these values in the profile filename. U2413 L CAL1 sRGB B23 C49 120.icm

My Calibration method for CAL2 aRGB:
  Same as sRGB but choose Adobe RGB in the primaries drop down

Doubts:
I have doubts because the Whitepoint and Luminance shown in the DCCS Achieved window are always offset by up to 100 K and 8 cd/m2. I tried calibrating the U2413 from a notebook with Intel graphics, but this did not improve the results screen.

Does anybody hit the target here? I end up 114 K above target.


Investigation:
I suspected that my graphics card settings on the computer were causing the poor calibration results in the screenshot above. So I did a "Measurement Report" with 235 tiles in dispcalGUI against the ICC profile produced with DCCS.

The result showed that the ICC profile from DCCS is nearly a 100% match to the measured values. [screenshot]



I took the ICC profile, swapped the DisplayPort cables and measured using my laptop. [screenshot] Comparing the two shows they are virtually identical, so I can safely say my graphics driver settings on the computer and the laptop are fine and are not causing any problems for calibration.

These measurements do not say anything about the calibration and how it conforms to sRGB or aRGB. They just say how well the calibration matches the DCCS profiles. So I tried to compare the calibration with the Windows standard sRGB ICC profile: sRGB Color Space Profile.icm.
The results are quite disappointing. [Measurement Report screenshot]

I also tried to compare the factory-calibrated sRGB mode to a standard sRGB ICC profile, and the results were downright terrible. [screenshot] The deltas are much higher than in the paper calibration report supplied with the screen.

Maybe I am doing it wrong and comparing apples to oranges in dispcalGUI.

On a final note, what kind of screen uniformity is normal in the DCCS 5 x 5 test? Both my screens are -100 K on the left and +100 K on the right. This is quite noticeable where the screens meet in the middle of my desk, as it amounts to a 200 K difference.

I hope somebody reads this far (maybe yumichan) and can give me some tips or share calibration experiences or dispcalGUI measurement reports.

14 Posts

April 3rd, 2014 16:00

What external reference are you using to verify the calibration?

I.e. you need something that measures the output from your screen when you have it 'set' certain ways -- and eyeballing isn't sufficient for "verifying calibration".

There may be others, but one such device is the Spyder3.  I think the Spyder4 is the current model....

Here's an Amazon link with multiple options... figure out which suits your needs... looks like the Spyder4 Pro is the top choice right now (don't know how often ratings change)...

BTW, sRGB is a pretty poor calibration standard.  It's like a "lowest common denominator".  Aim for something as good as or better than the Adobe or NTSC (North American broadcast TV) standards if you need good color reproduction.  FWIW, NTSC and Adobe are about equivalent in rigour -- and both specify a wider color range than you can get with sRGB.

April 3rd, 2014 22:00

Thanks, dude, but you missed the bit where I said Dell UltraSharp Calibration Solution in the title and i1DisplayPro in my list of kit. No eyeballing there.

The i1DisplayPro allows proper measurement of the blue in the GB-LED phosphor backlit U2413 screen, and it is also the only colorimeter which allows you to save the calibration as a 3D LUT inside the display when used with Dell UltraSharp Calibration Solution. The Spyder is a poor choice as it cannot do the latter, and the Spyder3 cannot properly measure the blue primary of my display's backlight type.

The crux of my problem is: how do I verify my Dell UltraSharp Calibration Solution calibration with dispcalGUI, and am I doing it right, given that the screenshot with the CIE trace on it is never on target?

BTW: sRGB is CAL1 and aRGB is CAL2, also mentioned in my post. As the screen can only do 99% aRGB, I am calibrating sRGB first and looking for a near-perfect verification. Only after this will I play with aRGB.

3 Apprentice • 725 Posts

April 4th, 2014 01:00

Hello. For color-managed apps, the way you verify how good the calibration is, is to set the following:
http://en.community.dell.com/resized-image.ashx/__size/550x0/__key/communityserver-discussions-components-files/3529/08-_2D00_-dispcalGUI.jpg

In Simulation profile, set the standard profile of the content you'll be displaying -- for example, sRGB for an sRGB image in Photoshop.
dispcalGUI will then try to verify whether a color-managed app, via RGB manipulation, will properly render an image tagged with that simulation profile.

That answers how good my profile is for Photoshop. For the desktop or non-color-managed apps, the way to verify calibration is close to what you did, but not exactly (I think; you should ask Florian, the dispcalGUI developer, for details I do not know). Do the same as I wrote above, but check "use simulation profile as target", and always use the DCCS or dispcalGUI "actual measurement" profile.


@Astara, ALL Christems is right; there are ONLY 3 sub-$2000 devices that can properly measure an LED monitor (WLED or GB-LED): i1DisplayPro, ColorMunki Display and Spyder4, the last not very accurate compared to the first two. The X-Rite ones have the same accuracy, but X-Rite wants people to buy the expensive one, so the Munki was made slower and is blocked in certain software (the BenQ DCCS clone, for example).


@ALL Christems, if you are going to do sRGB or Adobe RGB, use the presets; do not choose the xy coordinates yourself. Try this and see how it performs.
By the way, as has been posted in several threads here, we have some problems with DCCS, and Dell and X-Rite do not care...
If results are as bad as your reports say, maybe it is caused by how DCCS sets luminance. As said in previous threads, it first tries to approximate the white point (and contrast), then measures native behavior, then computes the LUT3D and writes it, and only after that does it try to fix luminance... that's wrong. It should be done while adjusting the white point in the first step.
Maybe that's the cause, so try this:
- open the dispcalGUI calibration popup window, the one with the 3 RGB bars for balance
- with the OSD menu (not DDM), try to lower/raise contrast or brightness by a few steps (-5/+5) and see how it performs.

If that is the cause, please post it so we (more people than me) can argue DCCS's faults and maybe we get an update. Our monitors (U2413, U2713H) are "entry level", but the 3000€ UP3214Q suffers from the same faulty software. Remember that we got DCCS 1.5 (i1Profiler, and not the faulty ColorMunki software of DCCS 1.0) because these high-end 4K monitors appeared on the market. IMHO that is the way we can get an update.


If none of these solutions work, please do a 3xCurves+matrix, 150-patch calibration (or a big XYZ LUT+matrix calibration & profile; try the simpler one first) in dispcalGUI/ArgyllCMS for the Custom, Adobe RGB and sRGB OSD presets.
First do the Custom OSD mode for native gamut and whatever gamma you want. Then, without touching brightness or contrast, do the other two. For Adobe RGB or sRGB you will lose a few RGB levels to white point correction, but I think you'll keep >91% of the values; most of those lost are in the sRGB blue WP correction.
Not an optimal solution, but a solution of sorts.

A final note: do not trust very dark measurements in the calibration report. Even the report says that. The not-very-accurate measurements will come from <1% luminance patches. To translate this to RGB values, remember the gamma formula g = log(output)/log(input), with output and input going from 0 to 1 as real numbers. Remember that in the 2D plots, RGB values are scaled to the 0-100 range, not 0-255. It's easy.
Maybe the dark red 4 dE measured patches are not so trustworthy (I've not done the calculations for your case).
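For anyone wanting to apply that gamma formula to a report, a minimal sketch; the patch values here are hypothetical, just to show the arithmetic:

```python
import math

def measured_gamma(rgb_in, luminance_ratio, max_code=255):
    """Gamma implied by one patch: g = log(output) / log(input), with
    input = RGB code normalized to 0..1 and output = patch luminance
    as a fraction of white luminance (both in 0..1)."""
    return math.log(luminance_ratio) / math.log(rgb_in / max_code)

# Hypothetical patch: RGB (128,128,128) measuring 21.8% of white luminance.
print(round(measured_gamma(128, 0.218), 2))  # → 2.21

# The <1% caveat above: an RGB-10 patch on a gamma-2.2 display sits near
# 0.08% luminance, well inside the untrustworthy range.
print((10 / 255) ** 2.2 < 0.01)  # → True
```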

Hope this solves your problem and that Dell gives us a patched, working solution at last.

April 4th, 2014 11:00

Hi Yumichan,

thanks for your suggestions, I will do some more testing and reading and get in touch with Florian.

DELL-Chris actually passed on a bug report of mine to the developers

http://en.community.dell.com/support-forums/peripherals/f/3529/p/19566605/20586175.aspx#20586175


and this bug will be fixed in a few weeks.

It might be an idea to collaborate and make a high-quality bug report post with lots of eye candy for the developers. We would need to detail a procedure for users here to follow to confirm that the software is buggy. It might also be an idea to message everyone on the forum with wide-gamut LED UltraSharps and encourage them to participate in the bug report: follow the procedure, confirm the bug, and ask for a fix. Eventually DELL will have to fix it.

14 Posts

April 4th, 2014 23:00

I'm sorry to hear that a Spyder4 (the current generation) can't measure your unusual blue LEDs.  That is a problem with some LEDs, in that they emit in a non-linear or narrower spectrum than you'd get with other lighting solutions.  I'm surprised to hear that Dell used such LEDs in a monitor. 


I use the Spyder 3 to measure a more conventional Dell screen as well as an LED-backlit Samsung 55" flat-panel.  It is capable of measuring LED-backlit displays -- though I take your word for it that the display you are using has special requirements.  I wouldn't think that was a good thing. 

When the 3008 was new it outdid both aRGB and NTSC.  Now, 3 years later, it's still a bit ahead along the purples, and a bit behind in the yellow-greens, but since our eyes are tuned more for yellow-green anyway, and purples usually look dimmer to our eyes, that's probably not a bad thing.

I must say that I've been impressed with the Dell 3008, though I hear the newer ones are better.  I'm sure LED-backlit displays will be exceeding aRGB very soon... maybe the next generation. 

It's funny: I *used* to think my ViewSonic was so good, back before I could do color measurement.  Besides barely passing for sRGB, it's probably been fading non-linearly since I've had it (it's still in use on a secondary computer -- good enough for YouTube videos... ;-) 

I'm really hoping a better-than-8-bit solution comes out that is affordable for computer monitors.  I can often tell the difference between 8 and 10-bit color in video, and I often have banding problems in still art on the flat panel...  Oh well.  I'll have to check out Dell's calibration solution... sounds like it might have benefits over the Spyder4.  Don't know.  Still, having a monitor that can only have its color corrected with the vendor's color-calibration solution or with an over-$3000 solution sounds a bit sketchy, but I'm generally distrustful... oh well. 

Does my avatar really look like a dude?  ;^)

3 Apprentice • 725 Posts

April 5th, 2014 04:00

You are wrong; it's not a matter of "your abnormal" LED lighting... it's a deficiency of your Spyder4. This device, although it is spectral-corrections ready, has very low-resolution and MONOCHROME corrections, and it does not ship any spectral correction for the GB-LED type (U2413, NEC SpectraView PA242W, Eizo ColorEdge CG276... and so on: all the new high-end wide-gamut monitors).

An example of how bad they are for a WLED with the Spyder4Elite (monochrome = almost useless for non-white measurements):

This is an X-Rite one (the file with WLED corrections contains 2 more types of spectra; this one is an "Eizo EV2736W"-like WLED spectrum, similar to the Dell U2713HM):

Even if you "hack" a Spyder4 to work with high-resolution 3-stimulus corrections (ArgyllCMS), the device is not very accurate on dark patches.
It's a deficiency of Spyder devices. Buying a Spyder/Datacolor device NOW (2012-2014) is not a sensible choice.

The old colorimeters like the i1Display2 or Spyder3 were never ready to measure any LED backlight properly, because they do not know how to deal with spectral corrections:


- They do not know how to deal with the differences between their own spectral sensitivities and the actual CIE XYZ standard ones.

- They do not know how "to guess" the spectral power they have missed, because they do not use the kind of spectral corrections we see in the 2D plots above.

This is the reason devices like the ColorMunki Display, i1DisplayPro and Spyder4 were made.

LED-backlit displays exceed aRGB NOW; all GB-LEDs have an extended red, giving you 86% eciRGBv2 (a very strongly saturated red in native gamut), like the old wide-gamut CCFL backlights of the U2410, SpectraView PA241W and so on...


Your banding problem seems to be caused by:
- wrong calibration or profiling software
- wrong measurement device for that backlight
- you are using a non-pro NVIDIA or Intel iGPU (their 1D LUTs have very low resolution)... try AMD or an NVIDIA Quadro
- your display's native gamma is too far from your calibration target gamma

Try to remove at least two of these causes and there should be no banding, even calibrating in the GPU.
I calibrate in GPU "Custom" OSD mode on my U2413 with a "normal" AMD (7000 series) and I get no banding at all in grays (ArgyllCMS or i1Profiler 1.5.0).


14 Posts

April 6th, 2014 21:00

I know very little about the differences between high-end calibration tools.  I do know the Spyder3 does a good job with some LED-backlit displays, like the Samsung one I mentioned.  I don't have a Spyder4, so I can't say; please don't refer to the Spyder4 as "my Spyder4"...

AFAIK, the Spyder3 has 3-5 color sensors and the S4 has 7 color sensors -- that's the extent of my knowledge there.  I do know a bit more about LEDs having the same problems as many fluorescent bulbs, in that they don't emit black-body curves for the colors they emit -- which means color calibration units designed to measure sunlight or full-spectrum lighting won't do as well with non-black-body light sources. 

The banding problem isn't THAT much of a problem, but it isn't in the software -- it's a delta when I'm working close to monochrome and you vary, say, blue by 1 point (at 255 each): you can see the line.  If you do a gradient from 1-10 stretching across a 2560-wide display, you will see divisions if you don't turn on dithering.

As for your comment, it is wrong; it's a standard backlight.

I have tried an NV Quadro -- I was disappointed.  I generally go with GeForce now -- current is a 590 -- getting a bit old, but my current Dell workstation (a T7500) needs an extra power supply to run it due to a wiring flaw, so a more powerful graphics card is not likely soon.

My display's native gamma is between 2 and 2.2 -- I usually go for a standard 2.2 gamma on the PC as I want the larger range -- but that is why I said I wanted more 'bits'...  The human eye needs about 14-15 bits to be accommodated.  With about a 25,000x sensitivity range, that puts it above 14 and below 15 (though I doubt my eyes are that good now).

You can't prevent banding in 8 bits.  It's only 256 levels of each color -- you can't cover a 25,000x range in 256 levels/color (8-bit color).  Realistically, 12 bits would be pretty good for me, but I know people who see differences down in the low bits far more than I do; they might need more.

With 8 bits, banding is fairly common when working with gradients.
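The arithmetic behind the bit-depth claim above can be checked directly; a minimal sketch using the 25,000x sensitivity figure quoted in this post:

```python
import math

# 8-bit color gives 256 levels per channel; the eye's usable range quoted
# above is roughly 25,000:1. Coding that range linearly would need:
contrast_ratio = 25_000
bits_needed = math.log2(contrast_ratio)
print(f"{bits_needed:.1f} bits vs. {2 ** 8} levels at 8 bits")  # → 14.6 bits vs. 256 levels at 8 bits
```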

3 Apprentice • 725 Posts

April 7th, 2014 01:00

That's why spectral corrections are used. The Spyder3 cannot measure, in a generic way, a backlight it was not designed for. It could be tuned with a high-resolution spectrophotometer for a specific display via a CCMX in ArgyllCMS, but not "out of the box".
This does not happen with the 3 new models (ColorMunki Display, i1DisplayPro and Spyder4). The maths are easy for the X-Rite ones: take the native measurement, find the RGB gain of the display relative to the reference spectrum, and then add or subtract the weighted spectrum (the sensitivity difference between the standard and actual responses, multiplied by the computed gain). The Spyder4 Datacolor software does not have this info (no RGB info of the spectrum in its software spectral corrections... the image I linked above is monochrome only), so it can only compute a limited approximation.
And of course it measures black levels wrong, about 30-50% higher than its X-Rite counterparts (bad contrast measurements, bad gamma fine-tuning at the low end of RGB).
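The correction math described above can be sketched loosely. This is a toy illustration only, not the real i1Profiler/ArgyllCMS math; the 3-sample "spectra" and every number in it are invented:

```python
# Loose conceptual sketch of a 3-stimulus spectral correction: adjust the
# raw reading by the gain-weighted sensitivity error of the filters.
def correct(reading, gains, sensitivity_error):
    """reading: raw XYZ-like triple from the colorimeter filters.
    gains: per-channel scale of this display vs. the reference spectrum.
    sensitivity_error: response the filters missed (positive) or
    over-counted (negative) relative to the CIE standard observer."""
    return [r + g * e for r, g, e in zip(reading, gains, sensitivity_error)]

# Hypothetical case: the filters under-read blue on a GB-LED backlight.
corrected = correct([0.40, 0.50, 0.30], [1.0, 1.0, 1.2], [0.0, 0.0, 0.05])
print([round(v, 3) for v in corrected])  # → [0.4, 0.5, 0.36]
```

A monochrome-only correction, as described for the Spyder4, would have to use one shared error term instead of per-channel ones, which is why it can only approximate.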

It is not a sensible choice right now, by any means, to buy a brand new Spyder, no matter which version.


Talking about banding, I was speaking of dithered gradients in the monitor's colorspace.

With this NV 590 you will get banding even in the display's colorspace. If you calibrate in the GPU, NVIDIA gamer cards lack high-bitdepth 1D LUTs, so if the gray ramp needs to be neutralized to a desired whitepoint, it is done in 1/256 steps. Quadro or AMD cards do this at 12/16 bit and then temporally dither down to whatever the GPU output allows (8 bit on gamer AMD). That means this temporal dithering gives the eye (and measurement devices) "more bitdepth" visually. And of course this can be objectively measured:
Run an "Uncalibrated display report" in dispcalGUI and you will see that you have only 8-bit undithered output, while a modern AMD (6000 series or newer; perhaps older series have this feature too) will give you 9-10-12 bitdepth dithered over an 8-bit output.

So your banding problems seem to be caused by your device and the NV 590. The greater the gamut of your display, the more noticeable it will be.
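A rough sketch of why the 1/256-step LUT bands while a high-precision dithered LUT does not; the 0.92 blue-channel gain is a made-up example of a whitepoint correction:

```python
# Sketch of why an 8-bit 1D LUT bands on a gray ramp. The 0.92 gain is a
# hypothetical whitepoint-correction value; real gains vary per display.
gain = 0.92

# 8-bit LUT: outputs snap to whole codes, so neighboring inputs collide.
lut8 = [round(i * gain) for i in range(256)]
collisions = sum(1 for i in range(255) if lut8[i] == lut8[i + 1])
print(collisions)  # → 20 input codes share an output with their neighbor

# High-precision LUT (12-16 bit internal) plus temporal dithering keeps
# the fractional part as a time average, so every input step stays distinct.
lut_hi = [i * gain for i in range(256)]
assert all(lut_hi[i] < lut_hi[i + 1] for i in range(255))
```

Each collision is a flat step in the gray ramp, which the eye picks up as a band on a smooth gradient.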

April 10th, 2014 05:00

Hi Yumichan,


I have not tried your suggestions yet, but easter is on the way and I will have more time to play and post results.


I have been playing with the XML files in DCCS, trying to make my own Display QA Reference Patch Set (mxf) and also Calibration Patch Sets (dptf). It is going quite well. The dptf files have primary and gray ramps and other targets. The ramps, I assume, are mandatory, as the others are marked as optional. Also, the ramps are measured multiple times during a calibration. It seems possible to insert some ramps for the secondaries; the XML is easily changed to do this.

I hacked together an mxf for QA: I saved the page data of the QA patch set from an image, then hacked this into 462 patches using the data contained in the large profiling patch set. I still want to compare apples to apples, so I also made this into a patch set for use in dispcalGUI. DCCS accepts my patch set and does QA with it. Unfortunately, when it reaches the QA report screen, instead of showing deltas and measured/expected Lab values, it shows "No Lab data". That is as far as I am at the moment with this.
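For anyone wanting to try the same XML hacking, a minimal sketch of generating ramp patches programmatically; note that the element and attribute names here are invented for illustration, and the real dptf schema will differ:

```python
# Sketch of generating extra ramp patches for a patch-set XML file.
# The names "PatchSet", "Patch", "R", "G", "B" are hypothetical.
import xml.etree.ElementTree as ET

def add_ramp(parent, channels, steps=16):
    """Append an evenly spaced ramp for a primary ("B") or a
    secondary ("GB" = cyan), driving only the named channels."""
    for k in range(steps):
        v = round(k * 255 / (steps - 1))
        patch = ET.SubElement(parent, "Patch")
        for ch in "RGB":
            patch.set(ch, str(v if ch in channels else 0))

root = ET.Element("PatchSet")
add_ramp(root, "GB")  # a cyan secondary ramp
print(len(root), root[-1].attrib)  # → 16 {'R': '0', 'G': '255', 'B': '255'}
```

The same generator could emit the other secondaries (RG, RB) before serializing with `ET.tostring(root)` and splicing the patches into the real file.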


I am getting some really poor deltas, ~6, on the cyan (second-to-last) patch in the X-Rite ColorChecker Classic QA chart.

Same as here - expensive 4K display with Delta 6 on the cyan:

http://www.tftcentral.co.uk/images/dell_up3214q/calibration_software_11.jpg

I am playing with the X-Rite i1Profiler software which came with the i1DisplayPro and comparing it with DCCS using QA and the X-Rite ColorChecker Classic QA chart. It seems to have good deltas on V4 default profiles and terrible deltas on V2 matrix.


Too much data and too many variables - I am getting a headache.

I think the key to getting Dell to fix the poor calibration is screenshots and the saved QA reports of DCCS and i1Profiler calibrations.  More to follow.

3 Apprentice • 725 Posts

April 11th, 2014 01:00

AFAIK there is no need to "hack" profiling patches; i1Profiler (DCCS 1.5 or newer) makes very good CLUT profiles for Bradford or CIECAT02 chromatic adaptations.

But the info you report about calibration patches seems quite interesting. I'll go with its original patches unmodified and add more W, R, G, B patches to the native ramp measurement, specifically in the lower half.


What CIE XYZ or CIE L*a*b* coordinates does this cyan patch have (sorry, I don't use X-Rite validation)?
Keep in mind that your display is 99% AdobeRGB in native gamut, perhaps between 95-98% after LUT3D calibration. A "TRUE" AdobeRGB cyan, I mean the (0,255,255) RGB value in an 8-bit/channel AdobeRGB-tagged image, will be out of gamut on our displays (or the U2410, or the new SpectraViews, or older ones).
This may be the cause of your cyan deviations.
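This can be sanity-checked numerically: the published AdobeRGB (1998) RGB-to-XYZ matrix (values rounded to 5 decimals here) puts full cyan right on the green-blue edge of the gamut triangle, which is exactly where a few percent of coverage loss bites:

```python
# Chromaticity of "true" AdobeRGB cyan, RGB (0,255,255), using the
# standard AdobeRGB (1998) RGB-to-XYZ matrix for D65 white.
M = [
    [0.57673, 0.18556, 0.18819],  # X row
    [0.29738, 0.62735, 0.07527],  # Y row
    [0.02703, 0.07069, 0.99111],  # Z row
]
rgb_linear = (0.0, 1.0, 1.0)  # cyan; the transfer curve leaves 0 and 1 alone
X, Y, Z = (sum(m * c for m, c in zip(row, rgb_linear)) for row in M)
x, y = X / (X + Y + Z), Y / (X + Y + Z)
print(round(x, 3), round(y, 3))  # → 0.175 0.329, on the G-B gamut edge
```

Since that point lies on the boundary of the full AdobeRGB triangle, any post-calibration gamut that covers only 95-98% of AdobeRGB cannot reproduce it, and a dE of several units on that patch is expected rather than a calibration fault.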

July 17th, 2014 04:00

Hi yumichan, thanks for your reply.

I have a 2-week holiday coming up, so I can finally answer your questions and also have time to hack some Calibration Patch Sets (dptf) to see if they really get DCCS to use large ramps and also secondary ramps for a better calibration.
