
May 30th, 2014 22:00

Disabling Integrated Graphics card?

Hi,

I have a Dell XPS 15 L502X. I recently installed a game called Wolfenstein: The New Order, and every time I run it I get an error known on the web as "wglCreateContextAttribsARB failed".

I know the cause of the error is that the game keeps identifying my integrated graphics card (Intel HD 3000) as the main graphics card, even though I also have a dedicated graphics card on board (Nvidia GeForce GT 540M, 2 GB dedicated memory).

The issue is that my integrated card only supports OpenGL 3.1 and below, while my GPU supports up to OpenGL 4.0. I know for a fact this GPU is able to run the game, and YES, I have already selected my GPU as the primary graphics card under Manage 3D settings, in both the global settings AND the program-specific settings, AND I even tried setting the PhysX configuration to the GPU itself, all to no avail.
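For reference, the failing call presumably looks roughly like the minimal sketch below (a guess at what the game does internally, assuming the usual window setup, a temporary legacy OpenGL context already current, and the Khronos GL/wglext.h header for the ARB tokens): the game requests an OpenGL 4.0 core profile, which the Intel HD 3000 cannot provide, so the call returns NULL.

    #include <windows.h>
    #include <GL/gl.h>
    #include <GL/wglext.h>   // WGL_CONTEXT_* tokens and PFNWGLCREATECONTEXTATTRIBSARBPROC
    #include <cstdio>

    // Sketch only: hdc is the window's device context, and a temporary legacy
    // OpenGL context is assumed to be current so wglGetProcAddress works.
    HGLRC createGl40Context(HDC hdc)
    {
        PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
            (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");
        if (!wglCreateContextAttribsARB)
            return nullptr;                              // extension not exposed by the active driver

        const int attribs[] = {
            WGL_CONTEXT_MAJOR_VERSION_ARB, 4,            // ask for OpenGL 4.0 ...
            WGL_CONTEXT_MINOR_VERSION_ARB, 0,
            WGL_CONTEXT_PROFILE_MASK_ARB, WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
            0
        };

        HGLRC ctx = wglCreateContextAttribsARB(hdc, nullptr, attribs);
        if (!ctx)
            std::printf("wglCreateContextAttribsARB failed\n");  // the error the game reports
        return ctx;  // NULL when the context lands on the Intel HD 3000 (OpenGL 3.1 max)
    }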

I have also attempted to disable the Intel HD 3000 through Device Manager (I couldn't disable it through the BIOS), but whenever I do so, my laptop's display turns off. However, my laptop is connected to a TV via HDMI, so my laptop's screen is transferred to the TV, and I cannot turn my laptop's display back on unless I re-enable the integrated graphics card. Strange, in my opinion!

On the other hand, I ran the game without the integrated graphics card (while using my TV as the display), and the game runs just fine! So CLEARLY the GPU (GT 540M) is able to run this game, but the game doesn't identify it; instead it only identifies the integrated Intel HD 3000, or treats it as the primary card, for some reason!

Help please, I want to play the game on my laptop's screen, not on my TV... Also, as a side question, is it possible to completely disable my integrated graphics card (Intel HD 3000) without losing my display? If so, how?

Thank you for your time.

EDIT: I have found that a similar problem I had with another game (Watch Dogs), which refused to run because it requires a video card that supports DirectX 11, also goes away when I disable my integrated graphics card and use my TV as my main display. The game now runs smoothly, EVEN THOUGH my GT 540M is below its minimum requirements!

October 29th, 2014 01:00

You can switch between the graphics cards - using the low-performance Intel graphics for browsing and less graphics-intensive applications, and the high-performance nVidia graphics for graphics-intensive applications like games.

You can have a look at the following link to switch between Intel graphics and discrete graphics (nVidia, in your case):

http://www.youtube.com/watch?v=DCh9Eob0mNY

 


May 31st, 2014 03:00

You can set the graphics card preference in the nVidia Control Panel. You can set the option globally (always use the nVidia processor) or configure it for an individual application (use the nVidia graphics processor for Wolfenstein).

Screenshots (not shown here): nVidia Control Panel Global settings and Application-specific (program) settings.
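As additional background (not a setting the end user can change), NVIDIA's Optimus documentation also describes an exported symbol that a game's developers can compile into their executable to ask the driver for the high-performance GPU; titles that neither export it nor have a driver profile can fall back to the Intel GPU, which is why the per-application Control Panel entry matters. A rough sketch of that export, assuming the documented Optimus rendering policy applies here:

    #include <windows.h>

    // Exporting this symbol with the value 1 signals the Optimus driver to
    // prefer the high-performance nVidia GPU for this executable.
    extern "C" {
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    }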

1 Rookie • 87.5K Posts

May 31st, 2014 04:00

The system will not function with the Intel GPU disabled - the nVidia GPU is not a true discrete video chip - it is a co-processor.  Only the Intel GPU has a connection to the display screen - all video data passes through it on its way to the display, so there is no way to disable it and have a functional system.

Check with the publisher of the game for its recommendations on settings for hybrid video -- not all games will work with a hybrid setup.

May 31st, 2014 11:00

@bajjibala, thanks for the reply. However, I have tried both of these suggestions, and they don't seem to solve the issue.

The system will not function with the Intel GPU disabled - the nVidia GPU is not a true discrete video chip - it is a co-processor.  Only the Intel GPU has a connection to the display screen - all video data passes through it on its way to the display, so there is no way to disable it and have a functional system.

Unfortunately, that is what I had heard about the Intel GPU as well. Is that why both games identify the integrated graphics card as the primary card, though?

What do you mean by co-processor? I am pretty sure a PC can run on the Nvidia GPU alone; in fact, with the Intel GPU disabled and the TV used as the display, my laptop runs perfectly fine. The only problem I see is that the Intel GPU is directly connected to the display, while the Nvidia GPU is connected to the Intel GPU instead of to the display as well. In that case I feel cheated buying this product from Dell.

Check with the publisher of the game for its recommendations on settings for hybrid video -- not all games will work with a hybrid setup.

I am pretty confident my laptop is supposed to meet the requirements of Wolfenstein: The New Order. The Nvidia GPU does, but the Intel GPU clearly doesn't meet anything. So I don't see why a weak Intel GPU is connected directly to the display instead of the powerful Nvidia GPU, or why both aren't connected to the display.

1 Rookie • 87.5K Posts

May 31st, 2014 12:00

There are two ways to implement nVidia's Optimus technology (and AMD's switchable graphics as well). One is with a true hardware multiplexer that allows hardware control of which GPU is active. The other is a mux-less, software-switched design where only the Intel GPU has a physical connection to the display panel. The vast majority of systems are designed this way - as is yours. It's a much more power-efficient way to run the system, but it is also more demanding from a software (driver) standpoint.

Yes, this is why your Intel GPU always shows as primary - it channels all video data to the screen.
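If you want to confirm which GPU actually ended up rendering for a given application, one minimal check (a sketch, assuming an OpenGL context has already been created and made current, for example in a small test program) is to query the vendor and renderer strings; even on a mux-less Optimus system they report the GPU that produced the frames, not the GPU that drives the panel:

    #include <windows.h>
    #include <GL/gl.h>
    #include <cstdio>

    // Call with an OpenGL context current; prints which GPU is rendering it.
    void printActiveRenderer()
    {
        const GLubyte* vendor   = glGetString(GL_VENDOR);    // e.g. "NVIDIA Corporation" or "Intel"
        const GLubyte* renderer = glGetString(GL_RENDERER);  // e.g. "GeForce GT 540M/PCIe/SSE2"
        std::printf("Vendor:   %s\nRenderer: %s\n",
                    vendor   ? (const char*)vendor   : "unknown",
                    renderer ? (const char*)renderer : "unknown");
    }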

1 Rookie • 87.5K Posts

May 31st, 2014 15:00

The nVidia GPU should work for the internal display as well as the external - as mentioned before, check with the game publisher on recommended settings for hybrid video.

Bottom line:  no, the nVidia GPU cannot be used standalone with the internal screen.

May 31st, 2014 15:00

There are two ways to implement nVidia's Optimus technology (and AMD's switchable graphics as well). One is with a true hardware multiplexer that allows hardware control of which GPU is active. The other is a mux-less, software-switched design where only the Intel GPU has a physical connection to the display panel. The vast majority of systems are designed this way - as is yours. It's a much more power-efficient way to run the system, but it is also more demanding from a software (driver) standpoint.

Yes, this is why your Intel GPU always shows as primary - it channels all video data to the screen.

Thanks for the clarification.

So basically I cannot play those two games without a second monitor, because technically that is the only "physical" connection that goes directly to the Nvidia GPU, and disabling the Intel GPU therefore means losing my main display?

May 31st, 2014 16:00

The nVidia GPU should work for the internal display as well as the external - as mentioned before, check with the game publisher on recommended settings for hybrid video.

Games don't publish special requirements or different settings for hybrid setups as you suggest, so I don't see your point here. I was promised that having two GPUs was for the better: one to save power and the other to be used while gaming.

That does not seem to be the case. The Intel GPU limits the capability of the Nvidia GPU while I am using my laptop's display. This has nothing to do with the game's specs or recommended settings, since my laptop is fully capable of running the game (according to its requirements) WITHOUT the integrated Intel GPU, and since that isn't an option, I am not satisfied with this product.

Bottom line:  no, the nVidia GPU cannot be used standalone with the internal screen.

Thanks, this confirms my doubts and clarifies any misconceptions.

1 Rookie • 87.5K Posts

May 31st, 2014 18:00

There are plenty of games that either don't work at all with hybrid video or that require updates or patches to work with it. There are others that require specific settings. The Intel GPU has nothing to do with the game if you've selected the nVidia GPU in the control panel - AND the game supports the hybrid video setup.

If you need a true, dedicated GPU these days, you need a high-end gaming notebook.  The vast majority of systems priced under $2,000 have software-based hybrid video -- doesn't matter if it's a Dell, Toshiba, Lenovo, etc. - they are all similar in design.  There are some business-class hybrid setups (Latitude, Thinkpad, etc.) that DO have hardware switching -  but those are not designed for gaming so much as for operating demanding applications like Adobe's creative suite.

If you haven't checked with the game's publisher, you should do that - there are often settings that need to be changed to make certain games run with a software-based hybrid video setup. The technology is only a few years old, and support for it is far from universal in the gaming world.

May 31st, 2014 22:00

As mentioned in my previous post, did you try configuring the 3D settings per application and globally?

This lets you use the high-performance nVidia graphics just for the game, or set it as the default graphics card.

June 1st, 2014 00:00

As mentioned in my previous post, did you try configuring the 3D settings per application and globally?

This lets you use the high-performance nVidia graphics just for the game, or set it as the default graphics card.

Yes, I have, as I already replied above. Thanks.

2 Posts

October 29th, 2014 00:00

Hi,

I also have a Dell XPS 15 laptop and am curious about this dual graphics card dilemma. What is the conclusion? I have an Nvidia GeForce 630M and an Intel integrated graphics card. Are the games I play (like League of Legends) only running on my inferior integrated card? How do I know whether they are switching correctly and I am getting everything out of my nicer Nvidia card?

Thank you 

January 13th, 2015 04:00

Hi, and thanks for reading if you've got this far!!!

Grateful to the OP - this could yield a feasible workaround for a showstopper: WebGL not being available for Dell users with discrete Nvidia graphics...

When I run chrome://gpu I find lots of problems reported - because, like the game mentioned by the OP in this thread, Chrome is detecting the integrated Intel HD Graphics card.

Driver Bug Workarounds

  • clear_uniforms_before_first_program_use
  • disable_d3d11
  • exit_on_context_lost
  • scalarize_vec_and_mat_constructor_args
  • texsubimage2d_faster_than_teximage2d

Problems Detected

  • Accelerated video decode interferes with GPU sandbox on older Intel drivers: 180695
    Disabled Features: accelerated_video_decode
  • GPU rasterization is blacklisted on non-Android: 362779
    Disabled Features: gpu_rasterization
  • Some drivers are unable to reset the D3D device in the GPU process sandbox
    Applied Workarounds: exit_on_context_lost
  • TexSubImage2D() is faster for full uploads on ANGLE
    Applied Workarounds: texsubimage2d_faster_than_teximage2d
  • Clear uniforms before first program use on all platforms: 124764, 349137
    Applied Workarounds: clear_uniforms_before_first_program_use
  • Using D3D11 causes browser crashes on certain Intel GPUs: 310808
    Applied Workarounds: disable_d3d11
  • Always rewrite vec/mat constructors to be consistent: 398694
    Applied Workarounds: scalarize_vec_and_mat_constructor_args
  • Raster is using a single thread.
    Disabled Features: multiple_raster_threads

Does this mean we can use WebGL with NVIDIA's 2 GB of dedicated memory and graphics technology when running Chrome on an extended display?

I really just want stability. I'm trying to make it so my PC never crashes - I hate losing time, thoughts, and data.

My tests -

 - running aquarium and stagecubes (as discussed on this great page)

 - checking the unmasked renderer at the bottom of the WebGL report (as discussed here)

Tried lots of chrome flags and switches... nothing works. 

Even when the NVIDIA GPU activity indicator shows Chrome running on the high-performance NVIDIA GPU, chrome://gpu confirms it is seeing the integrated Intel graphics (like the OP mentioned when using the laptop screen)...

Is this just me, or is it the same for everyone with the Dell L502X?

***

Failed Optimus example of unmasked renderer:

Unmasked Renderer: ANGLE (Intel(R) HD Graphics Family Direct3D9Ex vs_3_0 ps_3_0)

 "Chrome: newer versions should be always show unmasked renderer (Chrome 36 already works)."

***

Note: If you have a notebook with Nvidia’s Optimus graphics-switching technology, Chrome 10 may not be able to trigger discrete graphics mode due to a driver issue. Nvidia assures us that this problem will be fixed in the next release of their Verde driver, which is due in the next few weeks. A test version we received fixed our problem on the ASUS U36Jc.

(from Avram Piltch, March 2011)

 

Questions:

1) Is this normal?

(i.e. a "design flaw", as opposed to a bug that only some folks are having, where Chrome sees Intel's GPU rather than Nvidia's despite Optimus running on Nvidia)

2) Are there any workarounds?

(including running on another monitor, changing browser version, or getting a dedicated graphics card?)

1 Rookie • 87.5K Posts

January 13th, 2015 06:00

1. Yes, it's normal. All video data passes through the Intel GPU on its way to the display, even when it has been rendered by the nVidia GPU.

2. No, there is no way around this. It's a hardware design feature - there are two ways to design hybrid video.

One has a hardware switch and the other uses software. Your system uses software-controlled hybrid video - there is no hardware switch to change it.
