My Intel HD 630 integrated graphics card cannot detect the 2nd monitor.
The 2nd monitor is detected by the NVIDIA GTX 1060 and driven by the GTX 1060.
If I disable the GTX 1060 in Device Manager, there is no way I get a signal on the 2nd monitor.
Intel suggests it is a motherboard problem, where the HDMI output is connected directly to the NVIDIA GTX card. Is that true, Dell?
I don't want to have to enable BOTH of my cards in order to use 2 monitors; it gives me awful battery performance.
Windows 10 is freshly installed (Creators Update) and the drivers are up to date.
Ironically, there are a lot of users complaining here who WISH their systems allowed their NVIDIA GPU to directly control the display output, when in fact the Intel GPU controls everything on their systems. The basic answer is that you bought a gaming laptop, and having the NVIDIA GPU control the display output is better for a gaming laptop because it enables features like G-Sync (only on DisplayPort), VR, and stereoscopic 3D, none of which the Intel GPU supports. That means systems where the Intel GPU controls the display outputs can't use those features even if they have an NVIDIA GPU doing the rendering. So in your system's case, Dell prioritized gaming functionality over battery life, which again seems perfectly reasonable for a gaming laptop -- especially since most times people will have an external display attached, they'll also be plugged into AC power. In fact, why isn't that true in your case?
But if you want a system that has the Intel GPU controlling all outputs, with the trade-offs that entails in terms of functionality, then almost every other system on the market works that way. I’m actually surprised yours is designed the way it is, but I consider that a bonus feature for a gaming system, not a design flaw.
In any case, there’s no way to “fix” this “problem” because this system was designed to work this way, so a motherboard replacement won’t change anything, nor will messing with drivers.
Also, disabling a device in Device Manager does not necessarily power it off inside the system.
Unfortunately, there are a lot of points you missed.
1) A notebook with a discrete graphics card is not necessarily only for "gaming". I use the discrete card to train my deep learning networks as well as for gaming. For deep learning I specifically need the desktop to run on the iGPU while training runs on the NVIDIA card.
2) Optimus exists for fast switching from Intel to NVIDIA when I want to play a game. That is why it is called "switchable graphics": sometimes you need battery life, sometimes you need performance.
3) I use none of the technologies you mention. Again, it could be switchable.
4) Intel advertises the HD 630 as capable of running multiple monitors. In this case the item is not as advertised.
5) Yes, it is a HUGE design flaw *IF* the system was designed that way. Yet we don't know that it was, unless you worked on the R&D team.
6) You said "especially since most times people will have an external display attached, they'll also be plugged into AC power." Why exactly? If that is so, Dell needs to inform users about it. Why should I be forced to do something just because most people do it? Guess what: I should not. Always working on AC stresses the battery so much that it shortens its lifespan considerably.
7) Prioritizing gaming by making the **DESKTOP** use the high-performance card? lol.
8) I'm not disabling it in order to power it off. I was trying to force the system to use the Intel graphics card on the HDMI output.
9) It is a serious design flaw.
Lastly, are you an authorized person or just another user on the forum? Should I take your "it is a design issue" answer as a serious answer?
"Switchable graphics" refers to the ability to choose which GPU handles rendering; it does not change the GPU that actually controls the final output to the display. In most systems, the discrete GPU exists as a render-only device and is not directly attached to ANY display outputs. Instead, when switchable graphics specifies the discrete GPU, that GPU does the rendering work and then passes the completed frames to the iGPU for output to the displays.
There are only a few very high-end systems that allow you to switch the GPU that actually controls the display outputs. The Precision 7000 Series models are examples, but even there, this switch is a BIOS-level change (called "Graphics special mode"), not something that can be done within the driver and certainly not on a per-application basis. That switching functionality is achieved through the use of DisplayPort multiplexers on the motherboard, which increases the cost and complexity of the motherboard, and the use cases for giving the discrete GPU direct control of the outputs are also fairly limited -- all of which is why this flexibility is only found on rather high-end systems where that functionality is more likely to be required. Instead, the overwhelming majority of systems only have one GPU connected to a display output. In almost all cases, that's the iGPU; in your case, it's the discrete GPU, which again I think makes sense for a system targeted at gaming.
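To make the render-offload arrangement concrete, here is a toy model of the data flow. The class and function names are invented purely for illustration; this is not a real graphics API:

```python
# Toy model of Optimus-style render offload: the dGPU does the rendering,
# but the completed frame is handed to whichever GPU owns the display
# output, and that GPU performs the actual scan-out.

class Gpu:
    def __init__(self, name, owns_display_output):
        self.name = name
        self.owns_display_output = owns_display_output
        self.framebuffer = None  # last frame this GPU scanned out


def present_frame(render_gpu, output_gpu, scene):
    """Render on one GPU, scan out on the GPU that owns the output."""
    frame = f"{scene} (rendered by {render_gpu.name})"
    # Optimus copy step: the finished frame crosses over to the GPU
    # that is physically wired to the display.
    output_gpu.framebuffer = frame
    return f"{frame}, displayed by {output_gpu.name}"


igpu = Gpu("Intel HD 630", owns_display_output=True)
dgpu = Gpu("NVIDIA GTX 1060", owns_display_output=False)

# Typical Optimus laptop: dGPU renders, iGPU drives the panel.
print(present_frame(dgpu, igpu, "game scene"))
```

In a mux-equipped system like the Precision models mentioned above, the BIOS switch effectively changes which GPU plays the `output_gpu` role; in most laptops that role is fixed at design time.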
Of course not everyone uses laptops with discrete graphics for gaming, but your system is specifically called an "Inspiron Gaming" system -- and if you buy a system expressly billed as a gaming system, you should expect gaming-oriented design decisions. The fact that you see it as a design flaw for your use case does not make it a bad design decision for the larger target market overall. But if you don't need or want your discrete GPU to have direct control of the outputs, then the good news is that the overwhelming majority of the PC market meets your needs in that regard. This just isn't one of them, and no amount of ranting will change that. In terms of whether I'm an "authorized person", I'm not an official Dell rep if that's what you mean, but I also happen to be correct. Whether that's enough for you to take my answer as a serious one is entirely up to you, but you won't get a different correct one.
Keeping a laptop on AC power puts much LESS stress on the battery than actively depleting it on a regular basis. It's true that batteries should not be kept at their maximum charge level all the time (which is why Tesla cars by default only charge to 80% of their capacity unless you explicitly request a Max Range charge), but if that's what you're worried about, then go into your BIOS and limit your battery charge to something like 80%. You can even further configure your battery not to START charging again until it drops to a certain level, which will be even better for longevity because the battery will be allowed to self-discharge -- although the risk of course is that you may have much less than a full charge whenever you need to disconnect it. But actively running on battery power when you could be running on AC because you think that preserves the longevity of your battery makes no sense whatsoever. The PRIMARY drain on a battery's longevity is how many charge/discharge cycles it has been through, even if they're not full charge/discharge cycles.
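The cycle-count point can be made concrete with some simple arithmetic. A common rule of thumb (an approximation, not a vendor formula) counts each partial discharge by its depth, so shallow discharges add up to equivalent full cycles:

```python
# Rough equivalent-full-cycle arithmetic (rule of thumb, not a vendor
# spec): each partial discharge contributes its depth-of-discharge
# as a fraction of one full cycle.

def equivalent_full_cycles(depths_of_discharge):
    """Sum partial discharges (each 0.0-1.0) into equivalent full cycles."""
    return sum(depths_of_discharge)

# Running on battery daily, draining ~60% each time:
on_battery = equivalent_full_cycles([0.6] * 30)   # ~18 cycles/month
# Staying on AC, with only occasional ~10% self-discharge dips:
on_ac = equivalent_full_cycles([0.1] * 4)         # ~0.4 cycles/month

print(f"on battery: {on_battery:.1f} cycles, on AC: {on_ac:.1f} cycles")
```

Under this rough model, regularly draining the battery accumulates cycles far faster than staying on AC with a charge cap, which is exactly the longevity point above.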
Lastly, Intel's advertising that the HD 630 GPU supports multiple displays is correct. In many systems that use that GPU, it does. It just wasn't implemented by Dell in YOUR SYSTEM in a way that allows that. If you look at the specs of an NVIDIA GPU, you'll find claims that it supports G-Sync, stereo 3D, etc, and most laptops with that GPU will not support it, again because the system builder didn't implement the GPU in a way that would allow it. Similarly, if you bought a desktop video card that contained a GPU that claimed support for HDR, but the company that built the card itself didn't implement DisplayPort 1.4 or HDMI 2.0, then that card wouldn't support HDR. That's why relying on specs from a chip manufacturer rather than a system builder can be a problem; the final capabilities can be limited by implementation decisions, not just chip capabilities.
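One way to picture the spec-versus-implementation point: the features you actually get are the intersection of what the chip supports and what the system builder wired up. The feature names below are examples only:

```python
# A GPU's effective feature set in a finished system is the intersection
# of the chip's capabilities and what the system builder actually
# implemented (ports, muxes, firmware). Example values for illustration.

chip_supports = {"multi-display", "G-Sync", "stereo 3D", "HDR"}
builder_implemented = {"multi-display"}  # e.g. no DisplayPort routed to this GPU

effective_features = chip_supports & builder_implemented
print(sorted(effective_features))
```

The chip vendor's spec sheet describes `chip_supports`; only the system builder's documentation can tell you `builder_implemented`.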
You can say "We don't know the system was designed that way." But if the NVIDIA drivers show your display directly attached to the NVIDIA GPU, and you don't see a BIOS-level option to change that, then your system is designed that way. On systems such as my XPS 15, which has a discrete GPU but where the iGPU controls all display outputs, the NVIDIA drivers show no displays directly attached to the NVIDIA GPU.
Thank you for the wall-of-text explanations, yet there is not a single answer to my concerns.
You are just saying it is designed that way and defending it with irrelevant topics.
To make it simple for you, I will give you 2 examples, and you can honestly tell me whether or not this is a design issue.
- You cannot force the dGPU to render/drive the main monitor on the desktop. If you disable the iGPU in order to force the system to use the dGPU to drive the main monitor, the Windows default driver kicks in and the NVIDIA Control Panel says there is no NVIDIA graphics driver attached to any monitor. Yet the main monitor is rendered by NVIDIA but driven by the Windows default driver; we can confirm that by looking at how much RAM it reports, which is 6 GB.
- You rely on Optimus to switch GPUs for different programs, yet it is bugged right now, causing heavy stuttering/freezing on the desktop. The system becomes completely unusable. The issue is being discussed here:
Yet you blame others for that right now.
- You cannot force the iGPU to render the 2nd monitor over the HDMI output. If you disable the dGPU to force the system to use the iGPU, the 2nd monitor gets no signal.
As you can see, this system behaves completely differently in different setups. As for the naming, that is a completely irrelevant issue, done for marketing.
If this is a design issue, I really don't know why you are defending it. A complex motherboard? We paid $1280 for the system. If that is not enough, Dell should raise the price but deliver a more complete system.
I will repeat this once again.
When you connect your laptop to a 2nd monitor, your iGPU and dGPU are BOTH active no matter what you are doing. This is not a game-only system; it is also a workstation.
My old TOSHIBA laptop with an Intel HD 4600 + NVIDIA 740M already solved this issue years ago. It works exactly as expected. And guess what: there is no BIOS option. It automatically switches to the 740M when I'm gaming. And consider that it is a much, much cheaper system compared to the new 7577.
I ran into this same issue. The only solution is to buy a USB-C to HDMI cable and use the Thunderbolt 3 port for the external monitor. Of course, you will run into the stuttering issue with that setup. You can work around the stuttering with this app, which forces the dedicated GPU to always stay on: github.com/.../TrayPwrD3.
Ok, let’s suppose your system was designed the way you want, like your Toshiba and most other systems. That would create this use case:
A gamer buys this system, which is marketed as a gaming system, after checking the NVIDIA site to verify that the system’s dGPU supports the gaming technologies I listed above. They connect an external display to the system and find that they can’t use any of those features because the system design set the iGPU to handle all display outputs. THAT person would consider the design you want to be a design flaw for a gaming system.
The bottom line is that there isn’t always a design that’s ideal for everyone. Instead, different designs have their own sets of benefits and drawbacks. For this system, Dell implemented a design that delivered extra gaming features but came with the drawback of reduced battery life when external displays are attached, and you don’t like that. But if they had done the opposite, a gamer would say, “Dell why would you prioritize battery life when an external display is attached over gaming features on a gaming laptop, especially considering that when external displays are involved, people will practically always have an AC adapter attached anyway and therefore battery life isn’t even a factor!!” Those people would have a point. And in fact there are already people on this very forum making this exact complaint about NON-gaming systems that ARE designed the way you want. They’re essentially being told they should have bought a gaming system if they wanted gaming features — but you’re saying that even the gaming systems shouldn’t be designed to optimize for gaming features. If Dell had done what you wanted, then “gaming system” really WOULD be just marketing as you claim, and gamers would rightfully be annoyed about that.
I understand that this system’s design isn’t ideal for your use case because you personally don’t care about any of the extra features this design enables. But hopefully you can at least understand that its design is appropriate and actually superior for its INTENDED primary use case, which is gaming. You simply bought a system designed with a different set of priorities than yours. That doesn’t make this system’s design objectively worse or flawed; it’s just worse for you.
But at the end of the day, this design is only a problem for people using an external display while running on battery, which is rare, and it sounds like the main reason you’ve been doing it is because of a misconception about what factors affect battery longevity. For everyone else, it’s not unreasonable to assume that someone who uses an external display for an extended period of time will have an AC adapter connected, and those people get extra gaming features for their gaming laptop and don’t even notice the battery life drawback of this design.
The answer to the question, as jphughan has stated, is that the system was designed that way, and no motherboard or BIOS upgrade will change that. There is no flaw, and my 7567 works the same way. The Intel GPU probably has its hands full running the laptop's 4K display.
There are no "cards" in the system, the Nvidia device is a chip soldered to the motherboard.
Even on my Gaming Desktop, I can not change the GPU for my monitors without changing a physical connection.
The attachments show my system when watching a Netflix movie with an external HDMI monitor; the second image is without the external monitor. These should confirm the operation is as suggested.
XPS 2720, Inspiron 17 7779, Inspiron 15 7567, XPS 13 9365, Inspiron 1545, TB16 Dock
There is no way to force the dGPU to drive the built-in panel directly. In your Case #1, the iGPU is still involved, but it’s just running with a basic driver. The iGPU is the only GPU that is physically wired to the built-in display, so if you disable that GPU in Device Manager and still see an image on the display, then it’s still working, even if it’s not working at full functionality. If the iGPU were completely disabled, the built-in display would go blank.
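As a side note on checking this yourself: on Windows you can list the adapters the OS currently sees, including one that has fallen back to the basic driver, by parsing the output of `wmic path win32_VideoController get Name`. A minimal sketch, run here against fabricated sample text rather than live wmic output:

```python
# Sketch: spot a GPU running on Windows' fallback driver by parsing
# adapter names from `wmic path win32_VideoController get Name`.
# The sample text below is fabricated for illustration.

sample_output = """Name
Microsoft Basic Display Adapter
NVIDIA GeForce GTX 1060
"""

def adapter_names(wmic_text):
    """Return non-empty adapter names, skipping the header row."""
    lines = [line.strip() for line in wmic_text.splitlines() if line.strip()]
    return lines[1:]

names = adapter_names(sample_output)
# If a GPU shows up as the basic adapter, it is still active and driving
# a display, just without its full-featured driver.
fallback_active = any("Basic Display Adapter" in n for n in names)
print(names, fallback_active)
```

Seeing "Microsoft Basic Display Adapter" in that list matches the behavior described above: the disabled GPU's full driver is gone, but the hardware is still scanning out an image.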