I have a new Inspiron 7567 gaming laptop; it has two graphics cards, the integrated Intel HD Graphics 630 and the 'semi-discrete' NVIDIA GeForce GTX 1050 Ti.
Yesterday, I tried plugging it into my HDTV with an HDMI-to-HDMI cable. I did this many times with my previous laptop, a Dell Inspiron N7110, and never had a problem. With the 7567, however, the TV just flickers for a moment and says "No signal". The computer reacts in a few different ways when the TV is plugged in:
In the Windows 10 "Display Settings", multiple-display options such as "Extend Display" and "Duplicate Display" appear, and I can choose between them with Windows + P; however, only one screen is shown (its label changes from "1" to "1/2" when the HDMI cable is plugged in), and clicking "Detect" reports that no other display was detected. Additionally, an extra monitor (Generic PnP Monitor) appears in Device Manager when the TV is plugged in (see the EDID sketch further down).
In the Intel Graphics Settings control panel, there is no change when the HDMI display is plugged in. There are no multiple-display options, and "Detect" does not detect any other displays.
Finally, in the NVIDIA Control Panel: when no external display is plugged in, the only section in the left-hand tree menu is "3D Settings". One of the items under it, "Set PhysX Configuration", shows "Laptop Display" connected by a line to "Intel HD Graphics 630", inside a bubble connected to "GeForce GTX 1050 Ti". When the TV is plugged in via HDMI, however, "Laptop Display" no longer appears; only "Samsung" is shown, connected to "GeForce GTX 1050 Ti", and "Intel HD Graphics 630" has disappeared as well. Additionally, more sections become available in the left-hand tree menu, letting me change the resolution, for instance, but they show only one device, "Samsung", and do not detect any others.
In short: the Intel video card does not detect the HDMI-connected device; the Windows settings react to the new device but claim it is not detected; and the NVIDIA video card detects the laptop's integrated display when no external display is plugged in, but ONLY the external display when one is present, never both at once.
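To show exactly what Windows itself is enumerating behind those dialogs, here is a minimal sketch (assuming Python 3 on Windows, standard library only) that calls the Win32 EnumDisplayDevices API; the structure layout and the flag constant come straight from the Win32 headers:

```python
import ctypes
from ctypes import wintypes

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001  # Win32 StateFlags bit

class DISPLAY_DEVICEW(ctypes.Structure):
    """Mirror of the Win32 DISPLAY_DEVICEW structure."""
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.WinDLL("user32", use_last_error=True)

def devices(parent=None):
    """Yield adapters (parent=None) or monitors attached to one adapter."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dev), 0):
            break
        yield dev
        i += 1

for adapter in devices():
    on_desktop = bool(adapter.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    print(f"{adapter.DeviceName}  {adapter.DeviceString}  desktop={on_desktop}")
    for monitor in devices(adapter.DeviceName):
        print(f"    {monitor.DeviceString}")
```

If the behavior above is accurate, this should list monitors under the GeForce's adapter entry but nothing new under the Intel one when the TV is plugged in.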
I have updated drivers, disabled/enabled the devices, checked BIOS, and tweaked settings, and nothing has allowed me to display anything on the external display.
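In case it helps with diagnosis: since Device Manager lists the TV only as a Generic PnP Monitor, one way to check whether the TV's EDID actually made it across the cable is to read Windows's cached copy out of the registry. A rough sketch (again Python 3, standard library; note that Windows keeps stale entries under this key, so previously connected monitors appear too):

```python
import winreg

# Windows caches each monitor's EDID under this key.
DISPLAY_KEY = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

def vendor(edid):
    # EDID bytes 8-9 pack three 5-bit letters ('A' = 1); Samsung is "SAM".
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> s) & 0x1F) + 64) for s in (10, 5, 0))

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_KEY) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    params = winreg.OpenKey(
                        model_key, instance + r"\Device Parameters")
                    edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # no cached EDID for this instance
                print(model, instance, "vendor:", vendor(edid))
```

A Samsung entry showing up here would mean the handshake at least delivered the TV's EDID, which would point away from the cable itself.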
Please help! Thank you!
What specific HDTV model is it? Which HDMI version does this HDTV use? The six-year-old N7110 used the older HDMI 1.4 to connect to the HDTV; the brand-new 7567 uses HDMI 2.0.
Hi Chris, thanks for the reply.
The TV is a Samsung PN50A450 which, without my having looked it up, is old enough that it probably does not support HDMI 2.0. Is this likely the culprit here? I was under the impression that HDMI versions were fully backwards compatible (that newer versions simply fall back to the data rates/resolutions of older versions), and I would have expected at least some indicator of a version incompatibility.
In any case, I talked to Dell support about the problem, giving essentially the same information as here, and was told that it is likely a hardware problem and to send the computer in for servicing. As of yesterday it's on a FedEx truck to Houston. Hopefully it gets figured out there. I'll post an update here when (or if) I get any answers.