1 Message • March 9th, 2020 14:00

Dell G3 3590: deliberately misleading marketing regarding DisplayPort capability

Hey folks,

I got a Dell G3 3590 late last year after thoroughly researching a laptop that would be right for all my needs, one of those being VR gaming. The specs are listed here. As you can clearly see, it states that it has DisplayPort over USB-C capability, and only on the models with a dedicated graphics card. Well, I came to find out after getting a Rift S this weekend that the DisplayPort output on this model is driven by the integrated Intel GPU for some bloody reason, meaning that I've got an $1800 machine that misled me into believing it was more capable than it is, and a $600 paperweight. I felt you all should know to look out for this kind of thing, as I'm sure this is not the only case where they've misled people like this.

4 Operator • 14K Posts • March 9th, 2020 14:00

This unfortunately catches people out.  Even on some laptops explicitly marketed as "VR Ready" (which I don't think this one is), that moniker is sometimes granted when only one display output is wired directly to the discrete GPU.  On some systems that port is the HDMI output, because many VR headsets use HDMI, including the original Rift.  But of course the Rift S switched to DisplayPort, which means that even those "VR Ready" laptops won't work with the Rift S.  (Yes, there are DisplayPort to HDMI adapters, but they only convert a DisplayPort source signal to HDMI, not the other way around.)

As for why this is done, it's mostly related to battery life.  When the system is designed such that no display outputs are wired directly to the discrete GPU, then the discrete GPU can be completely powered off when its additional performance isn't needed.  Otherwise, the discrete GPU would need to remain active whenever a display was connected to one of those outputs, even if nothing graphics-intensive was going on.  Some customers connect to external displays/projectors while running on battery power, so battery life matters to them even when using external displays.

That said, the argument could certainly be made that on a system specifically marketed as a gaming system, the design should prioritize gaming functionality like VR over battery life.  I personally would support that viewpoint.  However, I distinctly remember somebody a while ago who purchased an Inspiron Gaming system where the HDMI output was wired to the discrete GPU, and he complained on this very forum about poor battery life as a result of this setup.  I explained the gaming-related reasons for that system's design, but he was adamant that not wiring the display outputs to the Intel GPU to achieve battery life savings amounted to "bad design", and rejected my suggestion that if he prized battery life over gaming functionality, then he might have chosen the wrong system model for his purposes.

The only other option is to implement a design that includes a BIOS setting whereby the user can choose which GPU controls which display outputs.  Dell has implemented this in their more recent Precision 7000 Series models, and some other vendors have it as well, but that adds cost, which of course some would gripe about (or just use as a reason to buy a competitor's product instead.)

It just goes to show that you can't please everyone.

Lastly, while I wouldn't call it "deliberately misleading", I do agree that it is incredibly confusing to have a system where the USB-C port is only included if you order a configuration that includes a discrete GPU (otherwise you don't get that port at all), but then still have that port wired to the Intel GPU.

4 Operator • 14K Posts • May 4th, 2020 10:00

@Macatari  "Deliberately misleading" implies intent.  I've personally found that the old saying, "Never ascribe to malice what can plausibly be ascribed to stupidity/incompetence" accounts for the vast majority of cases like this, including this one in my opinion.  I personally don't consider it plausible that Dell wrote specs that they INTENDED to be misleading.  It seems to me that the intent of the table is to say, "When your system includes this graphics controller, you can connect external displays through these outputs", because my understanding is that on this system, when you order it with a GTX 1050, you don't get a video-capable USB-C port at all.  In that case, it is absolutely true that external displays can be connected to the indicated outputs when the system is configured with the indicated graphics controller.  I don't believe that table was meant to specify which display outputs were actually wired to which GPU.  If that's the case, then the specs aren't "flat-out wrong".  They were instead just meant to convey different information than you figured.  That could arguably make them confusing and possibly even misleading -- though I still wouldn't say "deliberately" misleading.

Again, I understand that most people won't understand that distinction because they won't even know what NVIDIA Optimus is, never mind what its ramifications are, but trying to explain all of that in a specs table is arguably not practical.  And frankly, given how many people I see rushing here to ask questions about capabilities/problems without even bothering to read the system documentation that would have answered their question, I wonder how many people would even notice if Dell DID try to explain Optimus in their documentation.

And then there's the risk of creating the opposite problem, which I've ALSO seen here.  Some people have developed the understanding that if the discrete GPU is NOT directly wired to a display output, then it can't be used at all for displays attached to that output.  Their impression is that discrete GPUs in Optimus-based systems can only be used to accelerate content on the built-in display, which isn't correct either (and ironically they don't even realize that the built-in display is ALSO using Optimus rather than being directly controlled by the dGPU).  The fact that your display is connected to an output wired to the Intel GPU does NOT prevent the NVIDIA GPU from being used to accelerate content shown on that display.  It only prevents certain "specialized" technologies that the Intel GPU doesn't support passing through and/or that require the NVIDIA GPU to have direct control of the display output, such as VR, G-Sync, Adaptive V-Sync, stereoscopic 3D, and possibly some others.
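As an aside for anyone curious how that works under the hood: an application never talks to a display output directly; it just asks for a GPU, and on an Optimus system the driver picks one based on its profiles or on a hint the developer builds into the executable, then copies the finished frames over to whichever GPU actually owns the display.  A minimal sketch of that developer-side hint -- the two export names are the documented NVIDIA/AMD flags, and the rest is just a bare skeleton for illustration:

```cpp
// Sketch of how a game "opts in" to the discrete GPU on an Optimus laptop.
// At launch, the graphics driver looks for these exported symbols in the
// executable and, if present, runs the process on the dGPU even though the
// connected display is scanned out by the integrated GPU.
extern "C" {
    // Documented NVIDIA hint ("Optimus rendering policies"): non-zero requests the NVIDIA GPU.
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    // AMD's equivalent hint for switchable-graphics systems.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

int main() {
    // ...create the Direct3D/OpenGL device as usual. With the exports above the
    // driver picks the discrete GPU, and Optimus copies each finished frame over
    // to the GPU that actually drives the display.
    return 0;
}
```

That's really all "using the dGPU on an Intel-wired output" amounts to: the rendering happens on the NVIDIA GPU, and scan-out stays on the Intel GPU.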

The bottom line is that the devil is in the details on things like this, and the vast majority of users will not understand those details, or even be aware that they exist to be understood in the first place.  Even saying something like "VR can only be used through these outputs" wouldn't solve the problem, because some VR headsets use HDMI and others use DisplayPort -- and meanwhile there's Oculus Link for the Oculus Quest, which sends video as regular USB data rather than using a native GPU output, and therefore works even on systems where the dGPU isn't wired to any display output at all.

I completely understand the frustration of people who bought a laptop intending to use it for VR only to find out that it can't be used that way, for reasons that are somewhat nuanced, opaque, and not explicitly found in any published specs even if the user had bothered to read them before purchasing.  But I do not think that it follows that this is "deliberate" on Dell's part.

3 Posts • May 4th, 2020 10:00

I agree that this is deliberately misleading at best and outright lying at worst.

The spec page for the G3 3590 clearly states that the discrete graphics can be used over either USB-C or HDMI with many of the available graphics card options.  I have the NVIDIA GeForce GTX 1660 Ti, and it is always using the integrated graphics when an external display is connected via USB-C with DisplayPort.  Fortunately it does seem to be using the discrete graphics over HDMI, but that spec page is still flat-out wrong.

 

 

4 Operator • 14K Posts • May 4th, 2020 11:00

@Macatari  Following up on my earlier post: when you say that your external display is always using integrated graphics when attached via USB-C/DisplayPort, how are you determining that?  The port is always wired to the Intel GPU, so that will indeed never change, but that doesn't mean the NVIDIA GPU can't be used.  If you're basing your statement on seeing drastically lower frame rates when the display is connected via USB-C/DP compared to HDMI, then something else is going on, because when you're driving a regular external display (as opposed to a VR headset), the dGPU's performance should absolutely be available even through outputs that are wired to the Intel GPU.  Most dual-GPU laptops have NONE of their display outputs wired to the discrete GPU, and yet the discrete GPU's performance can still be used for games and such.  I have several such laptops myself and can use the discrete GPU for games and similar content without any issues.

3 Posts • May 4th, 2020 12:00

Thanks for the feedback. You're probably right that this is more of a mistake, but I'm sticking with deliberately misleading since they specifically included two tables, one to show what works with discrete and one to show what works with integrated. By including the separate tables they are strongly implying a level of performance that is not readily available. The simple fix here would be for the discrete table to not indicate that any of them work with USB-C. Just leave USB-C in the integrated table where it already is. Or not even have two separate tables. Since this is a spec sheet, not marketing materials, it should be precise.

I don't care about VR, this is just for regular games. So far in testing, games are reporting that the discrete GPU is not available to them when using USB-C, but report that the discrete GPU is available when using HDMI. I was hoping there would be a BIOS setting to swap this given the table, but there's none that I've seen. Like I said, this is not a huge deal for me as using HDMI does work, even if I'd rather be using USB-C/DisplayPort for the smaller cable and connectors.

I'll dig around more to see if individual games can be set to use the discrete GPU in their deeper settings.

 

4 Operator • 14K Posts • May 4th, 2020 12:00

@Macatari  are you perhaps using a USB-C to DP adapter that relies on DisplayLink "indirect display" technology rather than tapping into native DisplayPort Alt Mode as the vast majority of such adapters would?  If you have a DisplayLink-based adapter, then yes you would be unable to use the discrete GPU, due to a Windows limitation that causes DisplayLink to only be able to use the primary GPU, which in Intel+NVIDIA GPU systems will always be the Intel GPU.  But otherwise, you shouldn't be having that problem.  Again, I have multiple systems that have Intel and NVIDIA GPUs where I'm using displays attached via USB-C through a USB-C to DP cable, and I can use the discrete GPU just fine.  You can try using NVIDIA Control Panel to force the discrete GPU to be active for the applications you're having an issue with, although that isn't typically necessary since the drivers tend to be good about detecting applications that would benefit from dGPU acceleration on their own.  Or you can try forcing it just for a particular launch of that application by right-clicking the application or a shortcut and using the "Run on graphics processor" menu to choose which GPU should be used for that particular launch.  Unfortunately this menu isn't available on Start menu shortcuts, so you'd need a shortcut on the desktop or something.
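One more option if you're on a reasonably recent Windows 10 build: the Settings > System > Display > Graphics settings page lets you pin a specific .exe to the "High performance" GPU, and as far as I know that choice is just stored as a string value under HKCU\Software\Microsoft\DirectX\UserGpuPreferences.  A rough sketch of setting it programmatically, with a made-up game path purely as an example:

```cpp
// Alternative to the NVIDIA Control Panel profile: recent Windows 10 builds keep a
// per-application GPU preference under HKCU\Software\Microsoft\DirectX\UserGpuPreferences,
// the same setting exposed in Settings > System > Display > Graphics settings.
// The value name is the full path to the .exe; "GpuPreference=2;" means "high performance"
// (the discrete GPU).  The game path below is made up purely for illustration.
#include <windows.h>
#include <cstdio>
#include <cwchar>
#pragma comment(lib, "advapi32.lib")

int main() {
    const wchar_t* exePath = L"C:\\Games\\SomeGame\\SomeGame.exe";  // hypothetical example path
    const wchar_t* pref    = L"GpuPreference=2;";                   // 2 = high-performance GPU

    HKEY key = nullptr;
    LONG rc = RegCreateKeyExW(HKEY_CURRENT_USER,
                              L"Software\\Microsoft\\DirectX\\UserGpuPreferences",
                              0, nullptr, 0, KEY_SET_VALUE, nullptr, &key, nullptr);
    if (rc != ERROR_SUCCESS) { printf("RegCreateKeyExW failed: %ld\n", rc); return 1; }

    rc = RegSetValueExW(key, exePath, 0, REG_SZ,
                        reinterpret_cast<const BYTE*>(pref),
                        static_cast<DWORD>((wcslen(pref) + 1) * sizeof(wchar_t)));
    RegCloseKey(key);

    if (rc == ERROR_SUCCESS) wprintf(L"GPU preference set for %ls\n", exePath);
    else                     printf("RegSetValueExW failed: %ld\n", rc);
    return 0;
}
```

Setting it through the Settings app does exactly the same thing, so the code is only worth it if you want to script the change.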

As for the two tables, is it possible that the G3 3590 can be ordered in a configuration that includes ONLY the Intel GPU?  I realize that would be an odd setup for a system specifically marketed at gamers, but it might account for the reason two tables exist.

The only Dell systems I'm aware of that have a BIOS setting to allow you to choose which GPU controls outputs are the Precision 7000 Series systems, but they achieve that because they use a more complex motherboard design.  Basically, the display outputs and even the built-in panel are wired to DisplayPort multiplexers, and those are then wired back to BOTH of the GPUs.  The BIOS option determines which video path is actually used.  But that's a relatively uncommon design.

3 Posts • May 4th, 2020 13:00

After more testing, it appears that games are able to recognize the discrete graphics in their settings, even if they don't always choose it by default.  It's possible that in my initial testing both the HDMI and USB-C cables were connected, and that confused some settings.

I appreciate the detailed explanations. Thanks!

4 Operator • 14K Posts • May 4th, 2020 14:00

@Macatari  Happy to help.  FYI, though: if the HDMI output on that system is in fact wired directly to the dGPU, then everything on that display will always use the dGPU.  NVIDIA Optimus allows the dGPU to be used on displays controlled by the Intel GPU, but you can't use the Intel GPU on displays controlled by the dGPU even if you wanted to.  So if the USB-C output is controlled by the Intel GPU, it would definitely be possible for an application to use the Intel GPU on a display connected that way while it always used the NVIDIA GPU on a display connected via HDMI.

If you want to find out how your system is wired, open NVIDIA Control Panel and go to the PhysX Configuration section.  In there is a diagram that will show which GPU has direct control of each active display.  Connect a display to each output and see which GPU controls it.
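And if you'd rather check that from code than from the Control Panel diagram, DXGI reports the same wiring: each active display is enumerated underneath the adapter that has direct control of it.  A quick sketch using standard DXGI calls (nothing G3-specific, and only active displays will show up):

```cpp
// Lists each GPU and the displays it directly controls.  On an Optimus laptop,
// outputs wired to the Intel GPU appear under the Intel adapter, and an output
// wired straight to the NVIDIA GPU appears under the NVIDIA adapter.
#include <windows.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 ad{};
        adapter->GetDesc1(&ad);
        wprintf(L"GPU %u: %ls\n", i, ad.Description);

        ComPtr<IDXGIOutput> output;
        for (UINT j = 0; adapter->EnumOutputs(j, &output) != DXGI_ERROR_NOT_FOUND; ++j) {
            DXGI_OUTPUT_DESC od{};
            output->GetDesc(&od);
            wprintf(L"  display %ls is driven directly by this GPU\n", od.DeviceName);
        }
    }
    return 0;
}
```

On a typical Optimus laptop the built-in panel and any Intel-wired outputs will all appear under the Intel adapter, and only an output wired straight to the NVIDIA GPU will appear under the NVIDIA adapter.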

4 Operator • 14K Posts • May 26th, 2020 14:00

@JayZmon  I'll try to help as much as I can here.

Dell's product planning created a confusing situation here, which some would call misleading.  Apparently you only get a USB-C port capable of video output when you order a configuration that includes a discrete GPU -- but even then, the USB-C port is still wired to the Intel GPU.  I have no idea why Dell did it this way rather than standardizing on a video-capable USB-C port for all configurations if it was always going to be wired to the Intel GPU, or else wiring the USB-C port to the NVIDIA GPU, but for whatever reason, that's not how Dell did it.

In any case, in terms of getting 144 Hz, this combination doesn't make for an ideal solution.  If your USB-C output were wired to the dGPU, then you'd be able to use G-Sync (assuming you bought a display that supported it), but G-Sync doesn't work when the NVIDIA GPU has to pass through an Intel GPU.  It also doesn't work over HDMI at all; it requires DisplayPort.  And running high refresh rates without G-Sync, especially on a laptop, creates a dilemma of its own, which I'll get to in a moment.

But in terms of HDMI capabilities, the product page for the G3 3590 doesn't specify which version of HDMI the system supports, which means it isn't clear how much bandwidth is available and therefore isn't clear what resolution you might be able to run at 144 Hz -- but you didn't even specify what resolution you'd LIKE to run at 144 Hz.  That's a key bit of information you omitted.

In terms of USB-C, you'd have a full DisplayPort 1.2 interface available there if you used something like a USB-C to DisplayPort cable connected directly to that port.  I think that's enough for QHD/1440p at 144 Hz, but I'm not certain.  If the display you buy supports DisplayPort 1.2 and its documentation says it can run at its native resolution and 144 Hz from that input, then it's possible over DisplayPort 1.2.  The problem is that I don't know whether Intel GPUs support 144 Hz refresh rates at all, regardless of resolution.  I've seen that they can definitely handle 120 Hz, but I'm not sure about 144 Hz.
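If you want to sanity-check the bandwidth side yourself, the arithmetic is straightforward: DisplayPort 1.2 (HBR2, four lanes) carries roughly 17.28 Gbit/s of video data after encoding overhead, HDMI 2.0 roughly 14.4 Gbit/s, and an uncompressed 8-bit RGB signal needs width x height x refresh x 24 bits plus some extra for blanking intervals.  Here's a rough estimate along those lines -- the ~12% blanking overhead is an assumption, not an exact timing, so treat the output as ballpark only:

```cpp
// Back-of-envelope check: does a given resolution/refresh fit in DisplayPort 1.2
// or HDMI 2.0?  Numbers are approximate; real timings (CVT-RB blanking) and link
// overhead vary, so treat the results as a rough estimate only.
#include <cstdio>

int main() {
    const double dp12_gbps   = 17.28;  // HBR2 x 4 lanes, after 8b/10b encoding
    const double hdmi20_gbps = 14.4;   // 18 Gbit/s raw, after 8b/10b encoding
    const double bpp         = 24.0;   // 8-bit RGB, no compression
    const double blanking    = 1.12;   // assumed ~12% overhead for blanking intervals

    struct Mode { const char* name; double w, h, hz; };
    const Mode modes[] = {
        {"1920x1080 @ 144 Hz", 1920, 1080, 144},
        {"2560x1440 @ 144 Hz", 2560, 1440, 144},
        {"1920x1080 @ 240 Hz", 1920, 1080, 240},
    };

    for (const Mode& m : modes) {
        double gbps = m.w * m.h * m.hz * bpp * blanking / 1e9;
        printf("%s needs ~%.1f Gbit/s  (DP 1.2: %s, HDMI 2.0: %s)\n",
               m.name, gbps,
               gbps <= dp12_gbps   ? "fits" : "too much",
               gbps <= hdmi20_gbps ? "fits" : "too much");
    }
    return 0;
}
```

By that estimate, 1440p at 144 Hz fits comfortably within DisplayPort 1.2 and only just within HDMI 2.0 -- but whether the Intel GPU itself will go above 120 Hz is the part I can't confirm.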

But the problem I alluded to earlier in this post is that even if the Intel GPU supports high refresh rates, you can't use G-Sync through it.  You also can't use Adaptive V-Sync.  As a result, running high refresh rates can become a bit of an issue, because your only options at that point are V-Sync Off or V-Sync On.  The former allows frame tearing to occur, since that's exactly what V-Sync was designed to eliminate.  The latter removes frame tearing, but it's difficult to use at a high refresh rate because when V-Sync is on, the GPU needs to be able to maintain a frame rate that matches the display's refresh rate.  If the GPU ever does NOT have a frame completely rendered by the time the display wants to refresh, then the GPU will repeat the previous frame in order to buy more time -- which you will perceive as motion judder.

Running at 144 Hz means you'd need to sustain at least 144 fps, which will be tough on a laptop unless you reduce detail settings a lot or limit yourself to older games.  The only way to avoid this is to turn off V-Sync altogether, but then you're back to frame tearing.  This is the exact dilemma that G-Sync is meant to solve by allowing the best of both worlds -- no tearing AND no judder, because the display's refresh rate adjusts to what the GPU can deliver -- but that's not possible with this system.

2 Posts • May 26th, 2020 14:00

@Macatari @jphughan 

Hi guys,

I'm glad I stumbled across this discussion, because I have been looking for answers directly related to this topic and I hope you can help. I recently purchased the G3 3590 (i7, GTX 1660 Ti, 144 Hz display variant) and I would like to buy a monitor to connect to the laptop for gaming purposes. Link to product: https://deals.dell.com/en-uk/productdetail/4ke1

Unfortunately, my technical knowledge is somewhat limited when it comes to this type of thing, and I want to make sure I will be able to get the most out of the monitor before purchasing it. I am not too bothered about how I connect the monitor to the laptop, but the USB-C port would be preferred. My ultimate question is: would it be possible for me to connect an external monitor to the laptop and take full advantage of the discrete 1660 Ti graphics while maintaining a 144 Hz refresh rate? Additionally, would I be able to use the laptop display for Discord while using the monitor as my gaming display, or perhaps use the monitor and keep the laptop closed if I wanted to?

According to the specifications, this laptop supports USB-C DisplayPort Alt Mode for the 1650 and above. Does this mean that only the Intel graphics can be used through USB-C?

Any advice, guidance or recommendations would be greatly appreciated and I hope you guys can help. I have checked my NVIDIA PhysX configuration and it appears that the 1660 Ti / PhysX is connected to the HDMI port and the Intel UHD graphics is connected to the USB-C port and laptop display.

Thanks a mil!

2 Posts • May 27th, 2020 01:00

@jphughan 

Firstly, thank you for your detailed response; I appreciate the amount of effort you have put into helping me find the answer to my question.

 

Secondly, it seems I have opened Pandora's box, because as I mentioned, my technical knowledge is limited and I became aware of how little I knew after reading your response.

Nevertheless, my curiosity has led me down this rabbit hole and I have taken the opportunity to learn as much as I can to bridge the gaps in my knowledge.

 

Now that I fully understand what you have said, my dilemma is becoming more apparent, and I'm going to try to take advantage of your knowledge and expertise, if you don't mind.

 

I'll start by saying that I am by no means a professional gamer or serious content creator, so my desire to have an external monitor is just a luxury I would like to indulge, because of my poor eyesight and the advantages of having a larger display.

I do, however, spend a lot of my time on the laptop playing games, and I want to achieve the best possible experience given my desire for a larger display and the laptop's limitations.

 

To answer your question about resolution and the laptop's HDMI version: I would be happy with 1080p or higher, and I believe the HDMI is 2.0, if that's what you were looking for.

I'm glad I have not yet purchased a monitor, because the salesman would have taken complete advantage of my ignorance and I would have been none the wiser.

 

Now that I know that G-Sync and Adaptive V-Sync are out the window, because the USB-C port is wired to the Intel GPU and the dGPU is wired to the HDMI port, I accept that compromises need to be made.

 

By my assessment, I have 2 options at my disposal.

 

Option 1: Connect the monitor via USB-C and lose the image quality that the dGPU would provide, because that port is connected to the Intel GPU, though I might get a better refresh rate?

 

Option 2: Connect the monitor to the HDMI port and take full advantage of the image quality supplied by the dGPU, but this would mean that I could suffer screen tearing / motion judder if the FPS and refresh rate were not consistently similar?

 

I am confident that you know exactly what you are talking about and I value your opinion, so my next question is this: if you were in my position and going to purchase a larger monitor, what factors would you take into consideration, and what would your ultimate decision be to get the best experience out of the monitor?

 

Theoretically speaking, would it be best to just get a monitor that supports a refresh rate most consistent with the dGPU's performance, perhaps 100 Hz or 120 Hz, and connect it via HDMI to take advantage of the image quality?

 

Any advice, guidance or recommendations would be greatly appreciated and once again, I must thank you for all your help.

4 Operator • 14K Posts • May 27th, 2020 08:00

@JayZmon  I apologize if I doled out far more information than you wanted, but hopefully this next reply will help.

If you are correct that the laptop's HDMI output is HDMI 2.0, then that is the current standard (except for HDMI 2.1, which has barely started to arrive on TVs), so if you can find a display whose documentation indicates that it can be driven at its native resolution and refresh rate from its HDMI 2.0 input, then all you'll need is an HDMI cable specifically rated for HDMI 2.0 and you're set.

In terms of your options, some clarifications are in order.

First, using USB-C won't technically reduce your image quality; image quality isn't really the same thing as refresh rate.  That said, if your HDMI port is in fact the HDMI 2.0 variety, then there's no benefit to using USB-C either.  If your HDMI port were only 1.4, then the USB-C output would give you more overall bandwidth and might therefore allow higher refresh rates at a given resolution than you could achieve over HDMI -- at least unless you hit some maximum refresh rate that stemmed from an Intel GPU limitation rather than bandwidth.  And again, I don't know whether such a limit exists.  I've seen confirmation of 120 Hz from Intel GPUs; I haven't seen anything about 144 Hz one way or the other.

As for screen tearing and judder, that dilemma will exist to some extent regardless of your chosen display output, simply because you can't run G-Sync.  If you have V-Sync Off, you'll get some tearing.  If you turn it on, then the higher your refresh rate, the greater the risk of judder, because higher refresh rates mean your GPU needs to sustain higher frame rates to avoid judder.  But that will be the case regardless of which output you use.  The only factor that might tip the scales is that while G-Sync cannot be used over HDMI, I believe (but am not certain) that Adaptive V-Sync can be.  That mechanism involves the GPU keeping V-Sync enabled when it can sustain the necessary frame rate, thereby avoiding tearing, but dynamically turning it off when the frame rate drops, figuring that between tearing and judder, the former is the lesser of two evils.  So here again, if that HDMI output is in fact HDMI 2.0 -- and you get a display that supports HDMI 2.0 -- then that's how I would connect the display, because there won't be a downside compared to using USB-C (aside from things like the display daisy chaining that USB-C allows, which won't matter for a single display).
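To put rough numbers on that trade-off: with V-Sync On, the GPU has to finish every frame within one refresh interval, and that budget shrinks quickly as the refresh rate climbs.  For example:

```cpp
// Frame-time budget with V-Sync On: the GPU must finish each frame within one
// refresh interval, otherwise the previous frame is repeated and you see judder.
#include <cstdio>

int main() {
    const double refresh_rates[] = {60, 100, 120, 144, 240};
    for (double hz : refresh_rates) {
        printf("%3.0f Hz -> need to sustain %3.0f fps, i.e. finish each frame in %5.2f ms\n",
               hz, hz, 1000.0 / hz);
    }
    return 0;
}
```

So a 144 Hz display only really pays off with V-Sync On if the GPU can hold roughly 144 fps in the games you actually play; otherwise you'll either see repeated frames (judder) or have to run the display at a lower refresh rate.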

In terms of display advice, this is where my knowledge runs thin.  I don't keep up with gaming displays in sufficient detail to have up-to-date product recommendations, simply because I'm not a gamer.  I will say that for general use, 27" 1080p displays don't provide a great experience due to their lower pixel density, but on the other hand 27" 1080p is a reasonably popular setup for gaming displays compared to the more common 27" 1440p.  The reason is that it gives you the same physical size, while the lower resolution reduces your GPU performance requirements without you needing to run the display below its native resolution, which can introduce image quality issues of its own.  So if you're willing to optimize for gaming at the cost of some experience in the "general productivity work" category, that might be a good choice, especially on a laptop, where 1440p gaming is harder than on a desktop.  And if you drop to 1080p, there are actually some 240 Hz displays on the market, since less bandwidth spent on resolution means more bandwidth can be allocated to higher refresh rates.

But one key choice you'll have to make if you're shopping for gaming displays is TN vs. IPS panels.  You'll find a lot more information by just Googling those acronyms, but essentially TN tends to give faster response times while trading away color gamut and accuracy, and colors and contrast tend to wash out when the display is viewed off-angle.  IPS panels have better color gamut, accuracy, and viewing angle in exchange for lower contrast and higher response time.  Of course in each category there are varying degrees of quality, so you might find a TN panel that maintains decent color or an IPS panel that maintains great contrast.  IPS in general is more expensive.  But here again, even ignoring price, it depends on what you want to optimize for.  Some gamers are fine with TN panels because all they care about is response time.  Other gamers still want their panel to have nice color gamut for when they're not playing games.

Hopefully all this information did more good than harm, and best of luck with your shopping!

1 Message • April 10th, 2021 01:00

Wow, interesting stuff. So if I understand you correctly, the Intel integrated graphics kind of sub-tasks to the NVIDIA GPU as needed. Therefore, if my external monitor is not showing in the NVIDIA Control Panel under "multiple displays", then it doesn't really matter, I guess... it isn't showing in the PhysX tab either, by the way... Am I right?

The other thing I've noticed is an "NVIDIA USB-C Port Policy Controller" entry in the USB section of Device Manager in Windows 10, which adds to my confusion... What is that item for?

April 20th, 2022 10:00

Hey guys,

In my case (also a Dell G3, but a bit newer model), the PhysX section looks like this:

[Screenshot: NVIDIA Control Panel PhysX configuration]

Does that mean that my miniDP port is definitely wired directly to the dedicated graphics card and not to the integrated graphics chip?

And if yes, would this then mean that any (mini)DP compatible VR headset would be compatible?
Thanks a lot in advance!




1 Message • June 4th, 2022 06:00

It's also been almost a year for me, but I noticed today that the graphics card of my Dell G3 15 shows as integrated, even though the purchase page listed an RTX 1650 Ti. Why is that? This is really bad from Dell.
