Dell G3 3590 deliberately misleading marketing regarding DisplayPort capability

Hey folks,

I got a Dell G3 3590 late last year after thoroughly researching a laptop that would be right for all my needs, one of those being VR gaming. The specs are listed here. As you can clearly see, it states that it has DisplayPort over USB-C capability, and only on the models with a dedicated graphics card. Well, I came to find out after getting a Rift S this weekend that the DisplayPort on this model is driven by the integrated Intel GPU for some bloody reason, meaning that I've got an $1800 machine that misled me into believing it was more capable than it is, and a $600 paperweight. I felt you all should know to look out for this kind of thing, as I'm sure this is not the only case where they've <Profanity removed> people like this.

Replies (15)
7 Plutonium

This is unfortunately catching people out. Even on laptops explicitly marketed as "VR Ready" (which I don't think that one is), that moniker is sometimes granted when only one display output is wired directly to the discrete GPU. On some systems, that port is the HDMI output, because many VR headsets use HDMI, including the original Rift. But of course the Rift S switched to DisplayPort, which means that even those "VR Ready" laptops won't work with it. (Yes, there are DisplayPort to HDMI adapters, but they only convert a DisplayPort source signal to HDMI, not the other way around.)

As for why this is done, it's mostly related to battery life.  When the system is designed such that no display outputs are wired directly to the discrete GPU, then the discrete GPU can be completely powered off when its additional performance isn't needed.  Otherwise, the discrete GPU would need to remain active whenever a display was connected to one of those outputs, even if nothing graphics-intensive was going on.  Some customers connect to external displays/projectors while running on battery power, so battery life matters to them even when using external displays.
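To put rough numbers on that tradeoff, here's a quick back-of-the-envelope sketch in Python. Every figure in it is an assumption I've picked purely for illustration, not a measured spec for this or any other model, but it shows why the designers care:

```python
# Back-of-the-envelope idle runtime estimate. Every number here is an
# illustrative assumption, not a measured spec for any particular laptop.
battery_wh = 51.0      # assumed battery capacity in watt-hours
idle_draw_w = 8.0      # assumed system idle draw with the dGPU powered off
dgpu_overhead_w = 6.0  # assumed extra idle draw if the dGPU must stay on
                       # because a display output is wired to it

print(f"dGPU powered off: {battery_wh / idle_draw_w:.1f} h idle runtime")
print(f"dGPU kept on:     {battery_wh / (idle_draw_w + dgpu_overhead_w):.1f} h idle runtime")
```

With those made-up numbers, simply keeping the dGPU powered cuts idle runtime from about 6.4 hours to about 3.6 hours, which is why vendors are so keen to let it power off completely.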

That said, the argument could certainly be made that on a system specifically marketed as a gaming system, the design should prioritize gaming functionality like VR over battery life. I personally would support that viewpoint. However, I distinctly remember somebody a while ago who purchased an Inspiron Gaming system where the HDMI output was wired to the discrete GPU, and he complained on this very forum about poor battery life as a result. I explained the gaming-related reasons for that system's design, but he was adamant that wiring display outputs to the discrete GPU rather than to the Intel GPU, and thereby giving up the battery life savings, amounted to "bad design", and he rejected my suggestion that if he prized battery life over gaming functionality, he might have chosen the wrong system model for his purposes.

The only other option is to implement a design that includes a BIOS setting whereby the user can choose which GPU controls which display outputs. Dell has implemented this in their more recent Precision 7000 Series models, and some other vendors have it as well, but that adds cost, which of course some would gripe about (or just use as a reason to buy a competitor's product instead).

It just goes to show that you can't please everyone.

Lastly, while I wouldn't call it "deliberately misleading", I do agree that it is incredibly confusing to have a system where the USB-C port is only included if you order a configuration that includes a discrete GPU (otherwise you don't get that port at all), but then still have that port wired to the Intel GPU.


2 Bronze

I agree that this is deliberately misleading at best and outright lying at worst.

The spec page for the G3 3590 clearly states that discrete graphics can be used over either USB-C or HDMI with many of the available graphics card options. I have the NVIDIA GeForce GTX 1660 Ti, and an external display connected over USB-C with DisplayPort always uses the integrated graphics. Fortunately it does seem to be using the discrete graphics over HDMI, but that spec page is still flat-out wrong.

@Macatari  "Deliberately misleading" implies intent.  I've personally found that the old saying, "Never ascribe to malice what can plausibly be ascribed to stupidity/incompetence" accounts for the vast majority of cases like this, including this one in my opinion.  I personally don't consider it plausible that Dell wrote specs that they INTENDED to be misleading.  It seems to me that the intent of the table is to say, "When your system includes this graphics controller, you can connect external displays through these outputs", because my understanding is that on this system, when you order it with a GTX 1050, you don't get a video-capable USB-C port at all.  In that case, it is absolutely true that external displays can be connected to the indicated outputs when the system is configured with the indicated graphics controller.  I don't believe that table was meant to specify which display outputs were actually wired to which GPU.  If that's the case, then the specs aren't "flat-out wrong".  They were instead just meant to convey different information than you figured.  That could arguably make them confusing and possibly even misleading -- though I still wouldn't say "deliberately" misleading.

Again, I understand that most people won't appreciate that distinction, because they won't even know what NVIDIA Optimus is, never mind what its ramifications are, but trying to explain all of that in a specs table is arguably not practical. And frankly, given how many people I see rushing here to ask questions about capabilities/problems without even bothering to read the system documentation first, which would have answered their question if they'd actually looked, I wonder how many people would even notice if Dell DID try to explain Optimus in their documentation.

And then there's the risk of creating the opposite problem, which I've ALSO seen here. Some people have developed the understanding that if the discrete GPU is NOT directly wired to a display output, then it can't be used at all for displays attached to that output. Their impression is that discrete GPUs in Optimus-based systems can only be used to accelerate content on the built-in display, which isn't correct either (and ironically they don't even realize that the built-in display is ALSO using Optimus rather than being directly controlled by the dGPU). The fact that your display is connected to an output wired to the Intel GPU does NOT prevent the NVIDIA GPU from being used to accelerate content shown on that display. It only prevents certain "specialized" technologies that the Intel GPU can't pass through and/or that require the NVIDIA GPU to have direct control of the display output, such as VR, G-Sync, Adaptive V-Sync, stereoscopic 3D, and possibly some others.

The bottom line is that the devil is in the details on things like this, and the vast majority of users will not understand those details, or even be aware that they exist in the first place. Even saying something like "VR can only be used through these outputs" wouldn't solve the problem, because some VR headsets use HDMI and others use DisplayPort -- and meanwhile there's Oculus Link for the Oculus Quest, which sends video as regular USB data rather than using a native GPU output, and therefore works even on systems where the dGPU isn't wired to any display output at all.

I completely understand the frustration of people who bought a laptop intending to use it for VR only to find out that it can't be used that way, for reasons that are somewhat nuanced, opaque, and not explicitly found in any published specs even if the user had bothered to read them before purchasing.  But I do not think that it follows that this is "deliberate" on Dell's part.



@Macatari  following up on my earlier post above: when you say that your external display is always using integrated graphics when attached via USB-C/DisplayPort, how are you determining that? The port is always wired to the Intel GPU, so that will indeed never change, but that doesn't mean the NVIDIA GPU can't be used. If you're basing your statement on seeing drastically lower frame rates when the display is connected via USB-C/DP compared to HDMI, then something else is going on, because when you're using a regular external display (as opposed to a VR headset), the dGPU's performance should absolutely be available even through outputs that are wired to the Intel GPU. The vast majority of dual-GPU laptops have NONE of their display outputs wired to the discrete GPU, and yet the discrete GPU's performance can still be used for games and such. I have several such laptops myself and can use the discrete GPU with games and similar content without any issues.
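One quick way to check while a game is running: the NVIDIA driver installs a command-line tool called nvidia-smi that shows the dGPU's utilization and a list of processes using it. A minimal sketch that just captures its output; this assumes nvidia-smi is on your PATH, and note that some older Windows drivers don't list graphics-only processes in the table:

```python
import subprocess

# Dump the NVIDIA GPU's status while a game is running. If the game shows
# up in the process table (or GPU utilization is clearly nonzero), the
# discrete GPU is being used. Assumes nvidia-smi, which is installed with
# the NVIDIA driver, is on PATH.
result = subprocess.run(["nvidia-smi"], capture_output=True, text=True)
print(result.stdout or result.stderr)
```

Task Manager's GPU columns (Windows 10 1709 and later) will show much the same thing per process.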



Thanks for the feedback. You're probably right that this is more of a mistake, but I'm sticking with deliberately misleading since they specifically included two tables, one to show what works with discrete and one to show what works with integrated. By including the separate tables they are strongly implying a level of performance that is not readily available. The simple fix here would be for the discrete table to not indicate that any of them work with USB-C. Just leave USB-C in the integrated table where it already is. Or not even have two separate tables. Since this is a spec sheet, not marketing materials, it should be precise.

I don't care about VR, this is just for regular games. So far in testing, games are reporting that the discrete GPU is not available to them when using USB-C, but report that the discrete GPU is available when using HDMI. I was hoping there would be a BIOS setting to swap this given the table, but there's none that I've seen. Like I said, this is not a huge deal for me as using HDMI does work, even if I'd rather be using USB-C/DisplayPort for the smaller cable and connectors.

I'll dig around more to see if individual games can be set to use the discrete GPU in their deeper settings.
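For anyone else reading along: Windows 10 (1803 and later) also keeps a per-application GPU preference under Settings > System > Display > Graphics settings, which as far as I can tell is stored in the registry. Here's a minimal sketch of setting it programmatically; the registry location is my assumption about how that setting is stored, and the game path is a hypothetical placeholder:

```python
import winreg

# Hypothetical example path -- replace with the game's real executable.
exe_path = r"C:\Games\Example\game.exe"

# "GpuPreference=2;" requests the high-performance GPU for this app;
# "GpuPreference=1;" requests the power-saving (Intel) GPU instead.
key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)
```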

 


@Macatari  are you perhaps using a USB-C to DP adapter that relies on DisplayLink "indirect display" technology rather than tapping into native DisplayPort Alt Mode, as the vast majority of such adapters do? If you have a DisplayLink-based adapter, then yes, you would be unable to use the discrete GPU, due to a Windows limitation that causes DisplayLink to only be able to use the primary GPU, which in Intel+NVIDIA systems will always be the Intel GPU. But otherwise, you shouldn't be having that problem. Again, I have multiple systems with Intel and NVIDIA GPUs where displays are attached via a USB-C to DP cable, and I can use the discrete GPU just fine.

You can try using NVIDIA Control Panel to force the discrete GPU to be active for the applications you're having an issue with, although that isn't typically necessary, since the drivers tend to be good about detecting applications that would benefit from dGPU acceleration on their own. Or you can try forcing it for a particular launch by right-clicking the application or a shortcut and using the "Run with graphics processor" menu to choose which GPU should be used for that launch. Unfortunately, this menu isn't available on Start menu shortcuts, so you'd need a shortcut on the desktop or similar.

As for the two tables, is it possible that the G3 3590 can be ordered in a configuration that includes ONLY the Intel GPU?  I realize that would be an odd setup for a system specifically marketed at gamers, but it might account for the reason two tables exist.

The only Dell systems I'm aware of that have a BIOS setting to allow you to choose which GPU controls outputs are the Precision 7000 Series systems, but they achieve that because they use a more complex motherboard design.  Basically, the display outputs and even the built-in panel are wired to DisplayPort multiplexers, and those are then wired back to BOTH of the GPUs.  The BIOS option determines which video path is actually used.  But that's a relatively uncommon design.



After more testing, it appears that games are so far able to recognize the discrete graphics in their settings, even if they don't always choose it by default. It's possible that during my initial testing both the HDMI and USB-C cables were connected, which confused some settings.

I appreciate the detailed explanations. Thanks!


@Macatari  happy to help. FYI though: if the HDMI output on that system is in fact wired directly to the dGPU, then everything on that display will always use the dGPU. NVIDIA Optimus allows the dGPU to be used on displays controlled by the Intel GPU, but you can't use the Intel GPU on displays controlled by the dGPU even if you wanted to. So if the USB-C output is controlled by the Intel GPU, it would definitely be possible for an application to use the Intel GPU on a display connected that way while always using the NVIDIA GPU on a display connected via HDMI.

If you want to find out how your system is wired, open NVIDIA Control Panel and go to the PhysX Configuration section.  In there is a diagram that will show which GPU has direct control of each active display.  Connect a display to each output and see which GPU controls it.
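If you'd rather check the same thing programmatically, Windows exposes that adapter-to-display mapping through the EnumDisplayDevices API. Here's a rough Python/ctypes sketch (Windows only) that prints which GPU owns each active display source:

```python
import ctypes
from ctypes import wintypes

# Mirrors the Win32 DISPLAY_DEVICEW structure used by EnumDisplayDevicesW.
class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1

i = 0
dd = DISPLAY_DEVICE()
dd.cb = ctypes.sizeof(dd)
# Each display source (\\.\DISPLAY1, \\.\DISPLAY2, ...) reports the name
# of the GPU that has direct control of it in DeviceString.
while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
    if dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        print(f"{dd.DeviceName} -> {dd.DeviceString}")
    i += 1
    dd = DISPLAY_DEVICE()
    dd.cb = ctypes.sizeof(dd)
```

It should agree with what the PhysX configuration diagram shows.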



@Macatari @jphughan 

Hi guys,

I'm glad I stumbled across this discussion, because I have been looking for answers directly related to this topic, and I hope you can help. I recently purchased the G3 3590 (i7, GTX 1660 Ti, 144 Hz display variant), and I would like to buy a monitor to connect to the laptop for gaming purposes. Link to product: https://deals.dell.com/en-uk/productdetail/4ke1

Unfortunately, my technical knowledge is somewhat limited when it comes to this type of thing, and I want to make sure I can get the most out of the monitor before purchasing it. I'm not too bothered about how I connect the monitor to the laptop, but the USB-C port would be my preference. My ultimate question is: would it be possible to connect an external monitor to the laptop and take full advantage of the discrete 1660 Ti graphics while maintaining a 144 Hz refresh rate? Additionally, would I be able to use the laptop display for Discord while using the monitor as my gaming display, or perhaps use the monitor with the laptop lid closed if I wanted to?

According to the specifications, this laptop supports USB-C DisplayPort Alt Mode on the GTX 1650 and above. Does this mean that only the Intel graphics can be used through USB-C?

Any advice, guidance, or recommendations would be greatly appreciated, and I hope you guys can help. I have checked my NVIDIA PhysX configuration, and it appears that the 1660 Ti / PhysX is connected to the HDMI port, while the Intel UHD graphics is connected to the USB-C port and the laptop display.


Thanks a mil!
