I have a question about the XPS 9550's ability to display HDR content.
My XPS 9550 has an Nvidia GTX 960M graphics card. I understand from Nvidia's website that "All NVIDIA GPUs from the 900 and 1000 series support HDR display output," which seems to include my card. However, I also understand that mobile versions of cards often differ from their desktop counterparts. I haven't been able to get a response from Nvidia support confirming whether the GTX 960M supports HDR output, so I thought I'd come here! Do you know the answer?
I'm a mediamaker, and have a camera capable of shooting HDR content (specifically HLG/10-bit). I'd like to start experimenting with editing and delivering HDR, but I'd like to know if my GPU can support it before I sink the money into an HDR-compliant monitor! I know that the HDR market is still developing, but I'm sure there are folks out there who know the answer.
Thanks in advance for your help!
No HDR support, unfortunately. I'm not aware of any laptops on the market that support HDR right now except the latest Precision 7000 Series models. There are two limitations involved here. First, the GeForce GPU in the XPS 15, like the discrete GPU in almost every other laptop, is a render-only device: rather than driving the display outputs directly, it passes rendered frames to the Intel GPU, which then sends them to the display — and no Intel GPUs today support HDR, which they would have to, since they control the outputs. Second, transmitting HDR requires either HDMI 2.0 or DisplayPort 1.4, and that system only has HDMI 1.4 and DisplayPort 1.2. And the built-in panel definitely doesn't support HDR.
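If you want a feel for why the older links fall short, here's a rough back-of-the-envelope check. This is just a sketch: the link capacities below are approximate usable-payload figures from the specs, and I'm ignoring blanking intervals, so real requirements are somewhat higher.

```python
# Rough check: uncompressed active-video data rate for a 4K60 10-bit
# HDR signal versus approximate usable link capacities. Blanking
# intervals are ignored, so the real requirement is a bit higher.

def data_rate_gbps(h, v, fps, bits_per_channel, channels=3):
    """Active-pixel data rate in Gbit/s for an RGB signal."""
    return h * v * fps * channels * bits_per_channel / 1e9

rate_4k60_10bit = data_rate_gbps(3840, 2160, 60, 10)
print(f"4K60 10-bit RGB needs ~{rate_4k60_10bit:.1f} Gbit/s of active video")

# Approximate max usable throughput after encoding overhead:
links = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DP 1.2": 17.28, "DP 1.4": 25.92}
for name, cap in links.items():
    verdict = "enough" if cap >= rate_4k60_10bit else "too slow"
    print(f"{name}: {cap:.2f} Gbit/s -> {verdict}")
```

Two caveats on reading the output: HDMI 2.0 can't quite carry full 4K60 10-bit RGB, which is why HDMI 2.0 HDR typically uses 4:2:2/4:2:0 chroma subsampling; and although DP 1.2 has the raw bandwidth, it lacks the HDR metadata signaling that was added in DP 1.4, so bandwidth alone isn't the whole story.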
The Precision 7000 Series models are special because they have a DisplayPort 1.4 output AND a BIOS option called "Graphics special mode", which allows the discrete GPU to take direct control of the outputs. This is possible because those systems use a more complex and expensive motherboard design with DisplayPort multiplexers, which also lets them support other technologies the Intel GPU can't, such as 5K displays, VR, G-Sync, etc.
Wow, thanks for such an informative answer! I knew about the output issues, but was hoping I might be able to skirt around them with the USB-C port; I didn't realize the GPU worked that way and was a limiting factor too. Bummer!
Thanks again for such a thorough answer!
You’re welcome! Unfortunately the USB-C port only supports DisplayPort 1.2, and even in Thunderbolt mode you only get dual DisplayPort 1.2. Supporting a higher standard will probably require an upgrade to Thunderbolt itself, since two DP 1.2 links already consume roughly 32 Gbps of TB3's 40 Gbps maximum, and that's before considering any PCIe traffic. Two DP 1.3+ links would require about 65 Gbps.
Also, it turns out that Intel GPUs do support HDR from 7th-generation Core onward, although on 7th gen it requires the OEM to have implemented a special controller on the motherboard. You still need the correct outputs, though, and even then running HDR in Windows is still kind of dicey: practically everything still uses SDR, so unless your HDR application is running full screen, things can look weird. Here's Intel's white paper: www.intel.com/.../graphics-drivers.html
HDR displays are still in their infancy because panel manufacturers are having trouble making them, which delayed the ASUS and Acer gaming displays originally planned for this month. I know BenQ has a 32” HDR display aimed specifically at photographers, and Dell has the UP2718Q, but Dell's earlier display that claimed HDR (the S2418HN) didn't actually support the HDR10 standard, nor did it have enough brightness to do HDR properly. Dell said it supported the “Dell HDR (non-)standard”, so definitely avoid that one. The upside is that by the time you get an HDR-capable laptop, there will be more (and better) HDR displays available.