
Hybrid vs Optimus Graphics

January 21st, 2018 11:00

Good day,

I have a Dell XPS 15 9550 laptop with integrated Intel graphics and a dedicated NVIDIA GeForce GTX 960 card. Does anyone know which technology it uses, Hybrid or Optimus? And what is the difference?

Thank you in advance

4 Operator • 14K Posts

January 21st, 2018 11:00

I don't think "hybrid" is an official name that refers to a specific hardware architecture, but I can tell you how the XPS 15 works and how it compares to other "switchable graphics" architectures that exist in other systems.

Every XPS 15 has an Intel GPU (the integrated GPU, or iGPU), and they can also be ordered with an NVIDIA GPU (the discrete GPU, or dGPU).  In the latter case, the iGPU is still the only GPU that is physically connected to the display outputs, both the built-in panel and the external display outputs, while the dGPU exists as a render-only device.  When the dGPU is activated (the drivers are supposed to detect automatically when it would deliver a benefit, or users can manually force an application to use a specific GPU), it does all of the rendering work and then sends the completed video frames to the iGPU, which passes them through to the display(s).  This is what NVIDIA Optimus is, and it is the most common design these days for systems with a dGPU.

Having the iGPU connected to all display outputs means that regardless of which display(s) you're using with the system, it's always possible to have the dGPU completely powered off, which is good for battery life.  The downside of having the iGPU always in the video path is that you're constrained to what the iGPU supports.  There are certain technologies that it doesn't support and/or that require the NVIDIA GPU to have direct control of the display output.  As of this writing, these technologies include VR, G-Sync, Adaptive V-Sync, stereoscopic 3D, and 5K displays.  As a result, even if your NVIDIA GPU supports those technologies, you won't be able to use them on any display/device connected to an output controlled by the Intel GPU.
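On that "manually force an application to use a specific GPU" point: besides the right-click option and NVIDIA Control Panel profiles, newer Windows 10 builds (1803 and later) keep a per-application GPU preference that the Settings app writes to the registry for you. If you'd rather script it, here's a rough Python sketch; the exe path is just a made-up example, and the key location is my understanding of where Settings stores these preferences, so verify it on your build:

import winreg

# Hypothetical example path; point this at the program you want pinned to the dGPU.
exe_path = r"C:\Program Files\MyGame\game.exe"

# Where Windows 10 (1803+) appears to store per-app GPU preferences.
key_path = r"SOFTWARE\Microsoft\DirectX\UserGpuPreferences"

# GpuPreference=1 means power saving (iGPU); 2 means high performance (dGPU).
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
    winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "GpuPreference=2;")

print("Set high-performance GPU preference for", exe_path)

Deleting that value (or changing it back through Settings) restores the normal auto-detect behavior.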

In some other systems, such as some Inspiron Gaming models, the iGPU is physically connected to the built-in display (and the dGPU can still act as a render-only device for it as needed), but the dGPU is directly connected to the external display outputs.  This means that the technologies I mentioned above CAN be used on external displays (assuming the dGPU supports them), but the downside is that the dGPU is ALWAYS on whenever an external display is connected, even if the dGPU's extra horsepower isn't needed, which means lower battery life.

Someone with a system like this was recently on here complaining about this design because he frequently used an external display while on battery power and he didn't care about those extra features I mentioned, so he was annoyed that this battery life-reducing design was implemented on his system.  I argued that for a gaming system, it made more sense to optimize for supporting gaming technologies than for battery life, especially given that most people will be plugged into AC while running external displays.

And the last way to do multi-GPU setups is what's found in systems like the Precision 7000 Series models.  In those systems, the iGPU and dGPU are each connected to switchers on the motherboard (called DisplayPort multiplexers), and those switchers are then connected to the built-in display and the external display outputs.  Using those switchers means that users can go into the BIOS and CHOOSE which GPU they want to have direct control of the various displays, and they can even choose to disable the iGPU completely so that the system works as a single-GPU system, with only the dGPU.  This gives them a variety of options.  For example:

- If they want to optimize for battery life, they can have their system work like the XPS where the dGPU works as a render-only device.
- If they need support for those extra technologies I mentioned above on their external displays but still want better battery life when on the go, there's a mode where they can have the dGPU directly control all external display outputs while the iGPU remains in control of the built-in display (which the dGPU can also still accelerate on an as-needed basis).
- If they want all of their displays, including the built-in panel, to always run on the dGPU in order to avoid having to deal with potential issues related to switchable graphics and/or having different displays driven by different GPUs, they can disable the iGPU completely.

The downside to the above design, as you may have already guessed, is that it's more expensive because of the extra complexity of the display output switchers.  That, plus the fact that there are relatively few cases where you actually need the dGPU to directly control the display outputs, is why that last design is only found in very high-end laptops.

4 Operator • 14K Posts

January 21st, 2018 12:00

"Hybrid graphics" to my knowledge doesn't have a formal definition in terms of how the technology operates.  As I already said in my post above, I suspect any system that has more than one GPU could be considered a "hybrid graphics" PC, but as I've already explained, systems that have multiple GPUs can be set up in different ways to achieve different results, so just knowing that it has "hybrid graphics" wouldn't tell you very much.

In your system, the dGPU would be disabled until the drivers detect that it should be activated (or you manually choose to run an application with the dGPU).  If you don't want that to ever happen, I suppose you could disable the dGPU in Device Manager, but I don't believe there's a way to disable it at the BIOS level.  I have an XPS 15 9530, which is the generation just before yours and has the same GPU design, and I don't have any option to disable it in the BIOS.
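If you did want to disable it but prefer a command over clicking through Device Manager, something along these lines should work. It's just a sketch that shells out to the PnpDevice cmdlets (Windows 10); it has to run from an elevated prompt, and the '*NVIDIA*' name match is an assumption about how the adapter is listed on your system:

import subprocess

# Find the NVIDIA display adapter and disable it, the same thing Device Manager does.
# Swap Disable-PnpDevice for Enable-PnpDevice to turn it back on.
ps = (
    "Get-PnpDevice -Class Display | "
    "Where-Object { $_.FriendlyName -like '*NVIDIA*' } | "
    "Disable-PnpDevice -Confirm:$false"
)
subprocess.run(["powershell", "-NoProfile", "-Command", ps], check=True)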

4 Operator • 14K Posts

January 21st, 2018 12:00

Optimus is the technology where a dGPU can act as a render-only device and pass frames through to an iGPU for final output.  So the XPS system uses Optimus across the board, the Inspiron Gaming system I described uses Optimus for the built-in display but acts as a regular GPU for the external outputs, and on the Precision 7000 system, whether Optimus is used at all (and if so, on which outputs) depends on how the user has configured their BIOS settings.

"Hybrid" I guess would apply to any system that had more than one GPU, so I guess all of the systems above would qualify as hybrid systems -- but that doesn't tell you exactly how they work.

15 Posts

January 21st, 2018 12:00

Thanks a lot for your comprehensive answer.

But which is which? Is it Optimus technology or Hybrid that my XPS 9550 laptop uses?

15 Posts

January 21st, 2018 12:00

Thank you.

Do you know by any chance what Hybrid graphics are about?

I don't have any options available in the BIOS for setting this behaviour; the dedicated card isn't shown at all and there is no option to turn it off. The Dell XPS BIOS is not very extensive...

4 Posts

September 15th, 2018 00:00

Thanks for the great posts jp,

On my new 9570 with a GTX 1050 Ti …

I have recently installed the 399.24 driver [NVIDIA]…

[Which would not install on Win10 Home, so I installed Pro]

It, as you say, will not allow the GTX to be set as the 'default' PhysX processor...

In the menu bar of the NVIDIA Control Panel, however, there is an option to display a GPU activity icon in the notification area...

The small square will glow with a color spectrum when the GTX is active...

I've had it running for 15 minutes, and the only program to activate it so far is video playback via the Windows 'Movies & TV' app, which I have set to high performance in:

Settings / Display / Graphics settings / Universal app / Add / High performance...

Thanks again for all the help in understanding how the xps graphics work,

DayPay

4 Posts

September 18th, 2018 07:00

jp,

I thought it odd also...

The actual message was approximately: "Can't install this driver on this version of the OS"…

Perhaps it has something to do with it being a 'core version'? [as purchased]…

---------------

Also...

Could not install, update, or change with a volume license key...

Had to use a retail key from MS...???

4 Operator • 14K Posts

September 18th, 2018 07:00


@cruzlite wrote:

Thanks for the great posts jp … I have recently installed the 399.24 driver [NVIDIA]… [Which would not install on Win10 Home, so I installed Pro] …


Glad I could help!  Although not being able to install a driver on Win10 Home sounds very odd.  There's absolutely no reason that a driver should fail on Win10 Home and work on Win10 Pro....

3 Posts

December 26th, 2018 06:00

You seem knowledgeable asf. I have a Precision 5530 with the Intel GPU and an NVIDIA Quadro P1000. I want to use an Oculus headset on it. Will that be possible? Reddit said to turn off Optimus if it's on.

4 Operator • 14K Posts

December 26th, 2018 07:00


@YankDoll wrote:
You seem knowledgeable asf. I have a Precision 5530 with the Intel GPU and an NVIDIA Quadro P1000. I want to use an Oculus headset on it. Will that be possible? Reddit said to turn off Optimus if it's on.

Your system doesn’t have an option to turn off Optimus and keep the dGPU active on its own. That’s a very rare capability as I described above. But check NVIDIA Control Panel’s multi-display configuration area. That should show which outputs are wired to which GPU. The Precision 5530 is the sister system of the XPS 15 9570. I believe that on that system, the HDMI output is wired directly to the NVIDIA GPU when the system has one, even though previous Precision 55x0 / XPS 15 generations had the HDMI output wired to the Intel GPU, but I’m not certain. You’d want it wired to the NVIDIA GPU, because to my knowledge Intel GPUs still don’t support VR passthrough, so you wouldn’t be able to use a Rift through an HDMI output wired to the Intel GPU.
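One way to sanity-check the wiring besides NVIDIA Control Panel: the NVIDIA driver ships nvidia-smi, which can report whether any display is attached directly to the NVIDIA GPU. A rough sketch (how consistently laptop drivers report these fields is an assumption on my part, so treat it as a hint rather than proof, and plug a monitor into the HDMI port before running it):

import subprocess

# "Display Active" should read Enabled only if a display is driven directly by
# the NVIDIA GPU; on a pure Optimus path it normally stays Disabled.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,display_mode,display_active",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())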

4 Posts

January 6th, 2019 11:00

What an awesome post - thank you! 

I'm having multiple issues (none of them uncommon, judging by this community) with my Dell XPS 9570 and honestly regretting ever buying this machine, but I'm past my 30-day return window, so I'll have to deal with it for the next few years, I guess.

What I wanted to ask: if I use Dell SupportAssist and stress test my hardware, it shows the different rendering software with FPS, but it only ever refers to the Intel UHD 630. My 1050 Ti is showing in Device Manager and tagged as functioning properly, though. Does this mean that even when stress testing the hardware on the internal display, the rendering software still only sees the Intel UHD? How can I be sure that my GTX 1050 Ti is actually being used when running any software/games?

4 Operator • 14K Posts

January 6th, 2019 17:00


@TijnF wrote:

… Does this mean that even when stress testing the hardware on the internal display, the rendering software still only sees the Intel UHD? How can I be sure that my GTX 1050 Ti is actually being used when running any software/games?


If you right-click a shortcut or an actual EXE file, you should see an option that says "Run on graphics processor" with options to choose which GPU to use for that application, overriding the normal auto-detect functionality and any profiles you may have configured in NVIDIA Control Panel for that particular application launch.  Unfortunately this doesn't seem to be available when right-clicking Start menu items.  But see if that works.  However, when you're playing games, it should be easy to tell whether the NVIDIA GPU is being used because it's massively more powerful than the Intel GPU, so you should have pretty awful framerates if the NVIDIA GPU isn't being used.
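If you want something more concrete than eyeballing framerates, the NVIDIA driver installs nvidia-smi, and you can poll the dGPU's utilization while the game or stress test runs. A minimal sketch; if the numbers stay near 0%, the application is almost certainly running on the Intel GPU (the per-GPU graphs in Task Manager on recent Windows 10 builds show the same thing):

import subprocess
import time

# Sample NVIDIA GPU utilization and memory use once a second for ~10 seconds
# while the game/benchmark is running in the foreground.
for _ in range(10):
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout.strip())
    time.sleep(1)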

4 Posts

January 7th, 2019 01:00

Clear! Thanks a ton for the prompt reply. 
