
May 30th, 2014 22:00

Disabling Integrated Graphics card?

Hi,

I have a Dell XPS 15 L502X. I recently installed a game called Wolfenstein: The New Order, and every time I run it I get an error known on the web as "wglCreateContextAttribsARB failed".

I know the cause of the error: the game keeps identifying my integrated graphics card (Intel HD 3000) as the main graphics card, even though I have a dedicated graphics card on board (Nvidia GeForce GT 540M with 2GB of dedicated memory).

The issue is that my integrated card only supports OpenGL 3.1 and below, while my dedicated GPU supports up to OpenGL 4.0. I know for a fact this GPU is able to run the game, and YES, I have already selected my GPU as the primary graphics processor under Manage 3D settings, in both the global settings AND the program-specific settings, and I have even tried setting the PhysX configuration to the GPU itself, all to no avail.
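
To see which GPU and OpenGL version a plain, default context actually lands on (which is what the game's wglCreateContextAttribsARB call depends on), here is a minimal diagnostic sketch. This is my addition rather than anything from the thread, and it assumes Python 3 with the glfw and PyOpenGL packages installed:

```python
# Minimal OpenGL context probe (assumption: Python 3 with the `glfw` and
# `PyOpenGL` packages installed). It creates a hidden window, requests a
# default OpenGL context, and prints which GPU and OpenGL version that
# context reports -- on an Optimus laptop this shows whether a game-style
# context lands on the Intel HD 3000 or the GeForce GT 540M.
import glfw
from OpenGL.GL import glGetString, GL_RENDERER, GL_VENDOR, GL_VERSION

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

glfw.window_hint(glfw.VISIBLE, glfw.FALSE)          # keep the window off-screen
window = glfw.create_window(64, 64, "gl-probe", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("could not create an OpenGL context")

glfw.make_context_current(window)
print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Renderer:", glGetString(GL_RENDERER).decode())
print("Version: ", glGetString(GL_VERSION).decode())

glfw.terminate()
```

If the renderer string still reports the Intel HD 3000 even with the Nvidia Control Panel set to the dedicated GPU, that is the same mis-detection the game is running into.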

I have also attempted to disable the Intel HD 3000 through Device Manager (I couldn't disable it through the BIOS), but whenever I do, my laptop's display turns off. However, my laptop is connected to a TV via HDMI, so the laptop's screen is transferred to the TV, and I cannot turn the laptop's display back on unless I re-enable the integrated graphics card. Strange, in my opinion!

On the other hand, I ran the game with the integrated graphics card disabled (using my TV as the display), and the game runs just fine!!!!! So CLEARLY the GPU (GT 540M) is able to run this game, but the game doesn't identify it; instead it only sees the integrated Intel HD 3000, or identifies the integrated card as the primary for some reason!

Help please, I want to play the game on my laptop's screen, not on my TV... Also, a side question: is it possible to completely disable my integrated graphics card (Intel HD 3000) without losing my display? If so, how?

Thank you for your time.

EDIT: I have found that a similar problem I used to have with another game (Watch Dogs), which required my video card to support DirectX 11 to run (so it never ran), is also fixed: it now runs smoothly when I disable my integrated graphics card and use my TV as my main display, EVEN THOUGH my GT 540M is below the minimum requirements for that game!

1 Rookie · 19 Posts · January 16th, 2015 10:00

Thanks for the clarification. This still seems like a software issue - OpenCL isn't being detected properly (it seems).

I'm not a gamer/developer, so I'm not totally clued up on what's using what technology, but I've read that:

Through CUDA, OpenCL and DirectCompute 2.1 support, the GeForce GT 540M can be of help in general calculations. For example, the stream processor can encode videos considerably faster than a modern CPU can. Furthermore, physics calculations can be done by the GPU using PhysX (supported by Mafia 2 and Metro 2033). However, the GPU is not fast enough to calculate PhysX high detail game settings.

Features
Optimus Support, PureVideo HD VP4, 3D Vision, 3DTV Play, Bitstream HD Audio, CUDA, DirectCompute, OpenCL, OpenGL 4.0, DirectX 11

(Benchmark: Cinebench R11.5 - OpenGL 64Bit)

 

I downloaded Cinebench (11.5 x64 AND the newer R15), but it crashes (the same way that all my Nvidia apps tend to crash).

 

My GPU control is lost - my underclocking (.nsu) profile (safe: 550/850) reverts back to the stock GPU core/memory clock speeds (672/900 MHz).

 

A couple of ongoing issues here. Can any other L502X users confirm OpenCL is active/working on the high-performance Nvidia GT 540M? (And of course, if so, how?!)
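
For anyone who wants to check, here is a minimal detection sketch (my addition, not a tool mentioned in this thread), assuming Python 3 with the pyopencl package installed on top of the Nvidia driver's OpenCL runtime. It simply lists whatever OpenCL platforms and devices the installed drivers expose; a working GT 540M should appear as a GPU device under an "NVIDIA CUDA" platform:

```python
# Minimal OpenCL detection sketch (assumption: Python 3 + the `pyopencl`
# package; the Nvidia driver provides the OpenCL runtime being probed).
import pyopencl as cl

try:
    platforms = cl.get_platforms()
except cl.Error:
    platforms = []  # no OpenCL ICDs registered at all

if not platforms:
    print("No OpenCL platforms found - OpenCL is not being detected.")

for platform in platforms:
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}")
        print(f"  Type:   {cl.device_type.to_string(device.type)}")
        print(f"  Compute units: {device.max_compute_units}")
```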

Test request (for L502X / other Nvidia Optimus users):

Try a Cinebench (OpenGL) test (and report back!)

stupid dell/nvidia/computer

 

1 Rookie · 19 Posts · February 23rd, 2015 05:00

this is misleading.

I had problems with Chrome on my XPS L502X - the unmasked renderer was identified as Intel, even when running with Nvidia, and even when running on display 2 via HDMI, which goes through the Nvidia card (but also, sadly, seemingly through Intel first)...

Thanks to wolv', and many others, I've just tested (while extending the display to a 2nd monitor) and found that Chrome can be tricked into identifying the correct 'unmasked renderer'...

1) In devmgmt.msc, disable Intel HD Graphics 3000.

2) Run the app (e.g. chrome.exe).

3) Even if I (re)enable Intel afterwards, Chrome still correctly uses the Nvidia GPU...

Re #3, this could mean that users could use a fakemon + script to achieve the above (see the sketch below)...
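
As a rough illustration of just the script part (my addition, not something posted in this thread), here is a sketch of the disable/launch/re-enable cycle. It assumes Windows with devcon.exe from the Windows Driver Kit on the PATH, an elevated (administrator) prompt, and that INTEL_HWID and APP_PATH are placeholders you would replace with your own values (find the hardware ID with `devcon find "PCI\VEN_8086*"`):

```python
# Sketch only: automate the manual Device Manager steps above.
# Assumptions: Windows, devcon.exe (from the Windows Driver Kit) on PATH,
# run as administrator. INTEL_HWID and APP_PATH are illustrative placeholders.
import subprocess
import time

INTEL_HWID = r"PCI\VEN_8086&DEV_0116*"  # placeholder hardware-ID pattern for the Intel HD 3000
APP_PATH = r"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe"  # app to start on the dGPU

def devcon(action, hwid):
    """Run `devcon <action> <hwid>` and raise if it fails."""
    subprocess.run(["devcon", action, hwid], check=True)

# 1) disable the integrated GPU (same effect as Device Manager -> Disable)
devcon("disable", INTEL_HWID)
try:
    # 2) launch the app while only the Nvidia GPU is active, and give it a
    #    few seconds to create its rendering context on the dGPU
    subprocess.Popen([APP_PATH])
    time.sleep(10)
finally:
    # 3) re-enable the iGPU so the laptop panel comes back; per the post,
    #    the already-running app keeps using the Nvidia GPU
    devcon("enable", INTEL_HWID)
```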

June 30th, 2015 20:00

I have the same problem.

I play a game on the 630M card, then use software to monitor temperatures and find that the iGPU spikes to 100 °C.

I tried to set the display to connect through the 630M via the Nvidia control panel, but everything still goes through the iGPU.

After reading this thread, it seems this is a hardware design flaw and there is no way around it.

Hybrid graphics, what the *** is that? Just a fancy sales pitch to confuse customers and mislead unskilled users.

This Dell notebook is stupid. I'll never buy a Dell product ever again.

1 Message · August 10th, 2016 21:00

On a Dell Precision I found a solution to use the dedicated Nvidia GPU instead of the Intel integrated graphics:
In the BIOS, under Video settings, disable "Enable Switchable Graphics". After a restart, the system used only the dedicated GPU, and it has been working fine ever since.
