
1 Rookie • 9 Posts • July 18th, 2019 20:00

HDMI 2.0b

Hi, can anyone review this chat and tell me whether any Nvidia GPU, laptop or desktop, actually has HDMI 2.0b in real life? If not, are they firmware- or driver-updateable? I would like to have Dolby Vision, HDR10, HDR10+, and HLG, all of these formats, though I think HLG content isn't available yet. If I have to, I'd settle for 2.0a, but I really want HDR10+. An Nvidia chat agent told me all desktop GPUs are 2.0b, as listed on their specs website. Laptops are subject to the manufacturer's HDMI version specs, so they could be 2.0, 2.0a, 2.0b, or 2.1. Even more confusing, 2.1 features can be thrown into 2.0b outputs. I am looking at the new G3 laptops, the new XPS, and the new Inspiron desktops. Thank you for any help.

The chat has been edited.

Arun_Kumar30
Welcome to Dell US Chat! My name is Arun and I will be your Dell.com Sales Chat Expert. I can be reached at arun_kumar30@dell.com or 1 (800) 456-3355 Ext: 4161517.
‎4‎:‎08‎:‎50‎ ‎PM

John
Hi, I've been looking mostly at gaming laptops, and today desktops. How would I guarantee an HDMI 2.0b port for Dolby Vision, HDR10, and HLG, and hopefully even HDR10+?
‎4‎:‎12‎:‎20‎ ‎PM

John
I've only seen the AMD 570 and 580 series have that option listed under the desktop section.
‎4‎:‎14‎:‎07‎ ‎PM

John
Sadly, there's an Alienware desktop with a 570X that, I believe, doesn't list an HDMI port!
‎4‎:‎14‎:‎55‎ ‎PM

Arun_Kumar30
It has an HDMI port on the back of the video card.
‎4‎:‎17‎:‎45‎ ‎PM

Arun_Kumar30
HDMI 2.0b supports Dolby Vision, HDR10, HLG, and HDR10+.
‎4‎:‎17‎:‎50‎ ‎PM

John
The Alienware with the 570X has HDMI 2.0b?
‎4‎:‎19‎:‎20‎ ‎PM

Arun_Kumar30
Yes, it has HDMI 2.0b.
‎4‎:‎21‎:‎24‎ ‎PM

John
Do all the graphics cards on Dell's website, from the GTX 1050 up, have 2.0b? All Nvidia cards as well, including laptops?
‎4‎:‎22‎:‎43‎ ‎PM

John
Or a 1050 Ti; that would probably be the lowest card I would go with in a laptop. I would prefer a laptop.
‎4‎:‎22‎:‎50‎ ‎PM

Arun_Kumar30
These are the graphics cards with HDMI 2.0b:
AMD Radeon™ RX 570 4GB
AMD Radeon RX 580 8GB GDDR5
‎4‎:‎24‎:‎26‎ ‎PM

John
The 570X has 8 GB of RAM, I believe. You said that also has 2.0b.
‎4‎:‎25‎:‎34‎ ‎PM

John
Wow, that's interesting; an NVIDIA agent and their website say every desktop model has 2.0b. For laptops, it's up to the specific manufacturer what version is on the GPU.
‎4‎:‎28‎:‎25‎ ‎PM

Arun_Kumar30
Here's the list for you
‎4‎:‎28‎:‎30‎ ‎PM

Arun_Kumar30
NVIDIA® GeForce® GT 1030 2GB GDDR5: HDMI 2.0, Single Link DVI-D
AMD Radeon RX 560 2GB GDDR5: DisplayPort, HDMI, Dual Link DVI-D
NVIDIA® GeForce® GTX 1050 Ti 4GB GDDR5: DisplayPort 1.3 (1.4 Ready), HDMI 2.0, Dual Link DVI-D
AMD Radeon™ RX 570 4GB GDDR5: 3x DisplayPort 1.2 (DP1.4 HDR Ready), HDMI 2.0b
NVIDIA® GeForce® GTX 1060 6GB GDDR5: 3x DisplayPort 1.3 (1.4 Ready), HDMI 2.0, Dual Link DVI-D
AMD Radeon RX 580 8GB GDDR5: 3x DisplayPort 1.2 (DP1.4 HDR Ready), HDMI 2.0b
NVIDIA® GeForce® GTX 1070 8GB GDDR5: 3x DisplayPort 1.3 (1.4 Ready), HDMI 2.0, Dual Link DVI-D
NVIDIA® GeForce® GTX 1080 8GB GDDR5X: 3x DisplayPort 1.3 (1.4 Ready), HDMI 2.0, Dual Link DVI-D
NVIDIA® GeForce® GTX 1660 Ti: 3x DisplayPort 1.4, 1x HDMI 2.0, 1x USB-C
NVIDIA® GeForce® RTX 2060: 3x DisplayPort 1.4, 1x HDMI 2.0, 1x USB-C
NVIDIA® GeForce® RTX 2070: 1x DisplayPort 1.4, 1x HDMI 2.0, 1x DVI-D
NVIDIA® GeForce® RTX 2080: 1x DisplayPort 1.4, 1x HDMI 2.0, 1x DVI-D
‎4‎:‎31‎:‎08‎ ‎PM

John
Is it possible to get 2.0b capabilities through DP 1.3 or 1.4 using an adapter cable? Or Thunderbolt Type-C? I already bought a Vizio M65-G1 TV.
‎4‎:‎34‎:‎47‎ ‎PM

Arun_Kumar30
I'm sorry, John, it is not possible to get those capabilities through DP 1.3 or 1.4 or any other port.
‎4‎:‎36‎:‎09‎ ‎PM

John
One last point: even the RTX line only has 2.0? That seems impossible. Are any of these cards firmware- or driver-updateable to 2.0b?
‎4‎:‎37‎:‎15‎ ‎PM

Arun_Kumar30
I'm sorry, John, I would request that you check with Nvidia.
‎4‎:‎37‎:‎15‎ ‎PM

9 Legend • 14K Posts • July 23rd, 2019 22:00

@Fox Mulder1  I've never tried a DisplayPort 1.4 to HDMI 2.0 adapter/cable setup.  DP to HDMI cables rely on the source DisplayPort output supporting "Dual Mode DisplayPort", aka DP++, which allows a DisplayPort output to fall back to native HDMI signaling.  But I don't know if the DP++ spec includes allowing a DP 1.4 port to run HDMI 2.0b signaling as opposed to some older HDMI standard.  I've just never tried it.
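
I haven't tried it, but one way to see what HDR capability actually survives a given cable/adapter chain is to read the EDID the source receives and look for the HDR Static Metadata Data Block in the CTA-861 extension. Here's a rough Python sketch of that check; it assumes you've already dumped the raw EDID to a file (on Linux you can copy it from /sys/class/drm, and on Windows tools like Monitor Asset Manager can export one), and the "edid.bin" file name is just a placeholder:

```python
# Minimal EDID sketch: does the display chain advertise HDR support?
# Assumes the raw EDID bytes were already dumped to a file, e.g. from
# /sys/class/drm/card0-HDMI-A-1/edid on Linux (path is illustrative).

def supports_hdr(edid: bytes) -> bool:
    """Scan CTA-861 extension blocks for an HDR Static Metadata Data Block."""
    # EDID is made of 128-byte blocks; byte 126 of block 0 = extension count.
    for n in range(1, edid[126] + 1):
        block = edid[n * 128:(n + 1) * 128]
        if len(block) < 128 or block[0] != 0x02:  # 0x02 = CTA-861 extension tag
            continue
        dtd_start = block[2]                      # offset where data blocks end
        i = 4
        while i < dtd_start:
            tag = block[i] >> 5                   # top 3 bits: block type
            length = block[i] & 0x1F              # bottom 5 bits: payload length
            # Type 7 = "use extended tag"; extended tag 0x06 = HDR Static Metadata
            if tag == 7 and length >= 2 and block[i + 1] == 0x06:
                return True
            i += length + 1
    return False

if __name__ == "__main__":
    with open("edid.bin", "rb") as f:
        print("HDR static metadata block found:", supports_hdr(f.read()))
```

If the HDR block disappears when you insert the adapter, the adapter chain is the limiting factor rather than the GPU.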

I'm not sure who makes Dell's GPU boards, but I would expect there might be more than one vendor.  I recently bought an Alienware Aurora R8 for a friend that came with an RTX 2070, and it had the same set of ports as found on the reference boards, which if memory serves was 3 DisplayPort outputs, 1 HDMI output, and 1 USB-C VirtualLink output.  It seems that the vast majority of GPUs available on the market just stick to reference design output ports these days.

HDR by itself wouldn't necessarily qualify as graphics-intensive, because HDR is just a wider color space and brightness range.  And even if you're thinking in terms of playing back 4K HDR video that might be encoded in an advanced codec like H.265 (which is what 4K Blu-ray discs use), Intel CPU/GPU silicon has had hardware acceleration for that since, I believe, 7th Gen Intel Core, so even that wouldn't be particularly intensive.

In any case, the vast majority of laptops do not have a way to disable the Intel GPU because, as I said above, the Intel GPU is physically wired to the display outputs, so if you were to disable it, you'd lose your display signal, just as if you connected a display to a desktop GPU and then disabled that GPU.  The reason I say "vast majority of laptops" is that the more recent Precision 7000 Series models have a more advanced motherboard design that offers a BIOS option where the user can choose whether the Intel GPU or the discrete GPU controls the built-in display and the display outputs.  That's achieved by using DisplayPort multiplexers on the motherboard: the built-in display and all of the outputs are wired to these multiplexers, the multiplexers are wired back to both GPUs, and the BIOS option determines which GPU path is active.  But those are the only systems I know of that have that capability.
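
As a quick sanity check of the hybrid design, you can list the video controllers Windows sees; on an Optimus-style laptop both GPUs show up even though only one of them owns the display outputs. A small Python sketch using the built-in Win32_VideoController WMI class via PowerShell, with no third-party modules (the example GPU names in the comment are just illustrative):

```python
# Quick check of which GPUs Windows sees on a hybrid-graphics laptop.
# Uses the built-in WMI class Win32_VideoController via PowerShell,
# so there are no third-party dependencies.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | "
     "Select-Object -ExpandProperty Name"],
    capture_output=True, text=True, check=True,
)
# On an Optimus-style laptop this typically prints both GPUs, e.g.
#   Intel(R) UHD Graphics 630
#   NVIDIA GeForce GTX 1660 Ti
# Seeing both confirms the render-offload design described above.
print(result.stdout.strip())
```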

As for why everything isn't HDMI 2.0b in 2019, I guess you'd have to ask Intel.  In fairness, though, there isn't a ton of demand to play HDR content from PCs anyway, for a variety of reasons.  One is that even getting it to display properly is still a huge chore, even for tech-savvy people.  Enabling HDR in Windows causes regular SDR content to no longer look quite right, some applications don't respond properly to that Windows setting (I think HDR video in Chrome only works if you keep the Windows setting OFF, or something like that), and almost no PC displays on the market can deliver an HDR experience anywhere close to what TVs can manage; the few displays that can cost far more than most people will spend on a display.

Of course you can hook a PC up to a TV, but you can also hook a lot of other HDR-capable devices up to a TV that won't be nearly as much of a hassle to set up.  I suppose there's a chicken-and-egg problem here, but that's where we are at the moment.

9 Legend • 14K Posts • July 19th, 2019 10:00

In most laptops that include discrete GPUs, most or all display outputs are still wired to the Intel GPU, and the discrete GPU, when needed, acts as a render-only device that passes completed video frames to the Intel GPU for output to displays.  The main reason for this is battery life: it allows the discrete GPU to be completely powered off.  If an output were wired directly to the discrete GPU, it would need to stay active whenever a display was connected to that output, even if nothing graphics-intensive was going on.  The downside to that setup is that certain technologies either can't be passed through the Intel GPU or require the discrete GPU to have direct control of the display outputs, like VR, G-Sync, stereoscopic 3D, and others.

But the issue more relevant to your specific question is that, at the moment, Intel GPUs do not natively support HDMI 2.0.  For most systems that have HDMI 2.0 outputs, the typical implementation uses an "LSPCON" chip that takes a DisplayPort 1.2 output from the Intel GPU and converts it to HDMI 2.0.  The problem is that DisplayPort 1.2 doesn't natively support HDR formats, so converting to HDMI 2.0 gets you things like 4K 60 Hz rather than just 4K 30 Hz as on older HDMI versions, but it can't magically add HDR support to a source signal that never had it.  The DisplayPort spec didn't officially gain HDR support until 1.4, and current Intel GPUs don't support anything newer than 1.2.  That will change with Intel's "Ice Lake" CPUs, which will come with a new Gen 11 GPU supporting DisplayPort 1.4, but those aren't slated to arrive until the end of the year, and Ice Lake CPUs are the ones designed for lower-power applications in smaller systems like the XPS 13.  It's not clear when the new Gen 11 GPU will find its way into the higher-end CPUs used in more gaming-oriented systems.

(Note: You might have noticed that some laptops have built-in displays that support HDR10 and Dolby Vision.  That works because built-in displays use eDP, and Intel GPUs have supported eDP 1.4 for a while now.)
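
To put rough numbers on the 4K 30 Hz vs. 4K 60 Hz point: HDMI's TMDS link uses 8b/10b encoding, so HDMI 1.4's 10.2 Gbps link carries about 8.16 Gbps of actual video data, while HDMI 2.0's 18 Gbps carries 14.4 Gbps. A back-of-the-envelope sketch in Python, using the published link rates and the standard CTA-861 4K raster (4400 x 2250 total including blanking):

```python
# Back-of-the-envelope: why 4K 60 Hz needs HDMI 2.0.
# Payload rates after 8b/10b encoding, from the published link rates
# (HDMI 1.4 = 10.2 Gbps raw, HDMI 2.0 = 18 Gbps raw).
HDMI_1_4_PAYLOAD = 10.2e9 * 8 / 10   # ~8.16 Gbps
HDMI_2_0_PAYLOAD = 18.0e9 * 8 / 10   # ~14.4 Gbps

def data_rate(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Raw video data rate for a given total timing (blanking included)."""
    return h_total * v_total * refresh_hz * bits_per_pixel

# Standard CTA-861 timing: 3840x2160 active inside a 4400x2250 total raster.
for name, hz in [("4K30", 30), ("4K60", 60)]:
    rate = data_rate(4400, 2250, hz)
    print(f"{name}: {rate / 1e9:.2f} Gbps "
          f"(fits HDMI 1.4: {rate <= HDMI_1_4_PAYLOAD}, "
          f"fits HDMI 2.0: {rate <= HDMI_2_0_PAYLOAD})")
```

That works out to roughly 7.1 Gbps for 4K 30 Hz (fine on HDMI 1.4) and 14.3 Gbps for 4K 60 Hz (HDMI 2.0 only). And bumping to 10-bit color for HDR pushes 4K 60 Hz past even HDMI 2.0's budget at full RGB, which is why 4K 60 Hz HDR typically drops to 4:2:2 or 4:2:0 chroma subsampling.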

So you should absolutely NOT assume that an HDMI 2.0 port will support HDR formats unless you can confirm that the HDMI output is directly wired to the discrete GPU.  And there are some systems that have their HDMI ports wired to the discrete GPU even if all other outputs are wired to the Intel GPU, particularly some Inspiron Gaming (now G Series) systems and some Alienware systems.  That was done primarily so that those laptops can support VR, since that can't be passed through an Intel GPU.  Ironically, newer VR headsets are switching to DisplayPort or USB-C rather than HDMI, and since those systems usually have THOSE outputs still wired to the Intel GPU, they won't be able to use those headsets.
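
If you want to check that wiring yourself on Windows, you can enumerate the display adapters and see which GPU each active display hangs off of: if your HDMI-connected TV shows up under the NVIDIA adapter rather than the Intel one, that output is wired to the discrete GPU. A rough Python/ctypes sketch against the Win32 EnumDisplayDevices API:

```python
# Sketch: list which GPU each active display output is attached to
# (Windows only). If the HDMI-connected TV appears under the NVIDIA
# adapter rather than the Intel one, that output is wired to the dGPU.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
user32 = ctypes.WinDLL("user32", use_last_error=True)

adapter = DISPLAY_DEVICE()
adapter.cb = ctypes.sizeof(DISPLAY_DEVICE)
i = 0
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(adapter), 0):
    if adapter.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        # Second-level enumeration: the monitor hanging off this adapter.
        monitor = DISPLAY_DEVICE()
        monitor.cb = ctypes.sizeof(DISPLAY_DEVICE)
        if user32.EnumDisplayDevicesW(adapter.DeviceName, 0,
                                      ctypes.byref(monitor), 0):
            print(f"{monitor.DeviceString}  ->  {adapter.DeviceString}")
    i += 1
```

One caveat: on a render-offload laptop, everything will report as attached to the Intel adapter even while the discrete GPU is doing the rendering, which is exactly the behavior described above.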

1 Rookie • 9 Posts • July 23rd, 2019 21:00

jphughan, thanks for the reply. I see the 1660 Ti has 3 DP 1.4 ports. Can I use a DP-to-HDMI cable to achieve HDR glory? Would a desktop version of any Dell Nvidia card actually have 2.0b? Nvidia's website says even the 1050 has it. Do we know who makes Dell's cards? Is it Gigabyte? I've already bought the Vizio M65-G1 with Dolby Vision, HDR, and HLG. I don't know about HDR10+; I probably never will get that. Why in the world, in 2019, would anyone not make everything 2.0b? Now I know the stupid Intel GPU runs the show unless something intensive happens on screen, and I'm really frustrated. So HDR doesn't qualify as intensive enough for the dedicated GPU to run the show? There is no setting in Nvidia's control panel or a LAPTOP BIOS to shut down the Intel GPU all the time?
