As mentioned in my previous post, did you try adjusting the 3D settings per application or globally?
That lets you use the high-performance NVIDIA graphics just for the game, or set it as the default graphics card.
Yes, I have, as I already replied before. Thanks
I also have a Dell XPS 15 laptop and am curious about this dual-graphics-card dilemma. What is the conclusion? I have an NVIDIA GeForce 630M and an Intel integrated graphics card. Are the games I play (like League of Legends) only running on my inferior integrated card? How do I know whether the two are switching correctly and I am getting everything out of my better NVIDIA card?
You can switch between the graphics cards: use the low-power Intel graphics for browsing and less graphics-intensive applications, and the high-performance NVIDIA graphics for demanding applications like games.
You can have a look at the following link to switch between Intel graphics and discrete graphics (NVIDIA, in your case):
Hi, and thanks for reading if you've got this far!!!
Grateful to the OP - this could yield a feasible workaround for a showstopper: WebGL not being available to Dell users with discrete NVIDIA graphics...
When I run chrome://gpu I find lots of problems reported - because, like the game mentioned by the OP in this thread, Chrome is detecting the embedded Intel HD Graphics card.
Does this mean that we can use WebGL with the NVIDIA card's 2 GB of memory and dedicated graphics hardware if running Chrome on an extended display?
I really just want stability. I'm trying to make it so my PC never crashes - I hate losing time, thoughts, and data.
My tests -
- checking the unmasked renderer at the bottom of the WebGL report (as discussed here)
- trying lots of Chrome flags and switches... nothing works.
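The unmasked-renderer check above can be scripted from the browser console instead of reading the WebGL report by hand. A minimal sketch (the `WEBGL_debug_renderer_info` extension is the standard way Chrome exposes this; the `powerPreference` context hint is a newer option than this thread and may or may not influence Optimus):

```javascript
// Query which GPU the browser actually hands to WebGL.
// Takes a WebGL context; returns the unmasked renderer string,
// or null if the debug extension is blocked or unsupported.
function getUnmaskedRenderer(gl) {
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  if (!ext) return null; // extension unavailable: cannot unmask
  return gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);
}

// In a browser you would obtain a context like this; the powerPreference
// hint explicitly asks for the high-performance (discrete) GPU:
//   const gl = document.createElement("canvas")
//     .getContext("webgl", { powerPreference: "high-performance" });
//   console.log(getUnmaskedRenderer(gl));
```

If the returned string still names Intel even while the NVIDIA activity icon is lit, you are seeing the same symptom discussed in this thread.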
Even when NVIDIA GPU activity indicates Chrome is running on the high-performance NVIDIA card, chrome://gpu confirms it is seeing the embedded Intel graphics (as the OP mentioned when using the laptop screen)...
Is this just me, or does this affect everyone with the Dell L502x?
Example of the unmasked renderer when Optimus fails:
Unmasked Renderer: ANGLE (Intel(R) HD Graphics Family Direct3D9Ex vs_3_0 ps_3_0)
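A renderer string like the one above can be classified programmatically. A small sketch (the helper name and the substring heuristic are my own assumptions, based only on the strings quoted in this thread, not an exhaustive rule):

```javascript
// Heuristic check: did WebGL land on the discrete NVIDIA GPU, or did it
// fall back to the Intel iGPU? Matches substrings seen in ANGLE
// unmasked-renderer strings.
function isDiscreteRenderer(renderer) {
  if (!renderer) return false; // no string at all: treat as fallback
  const r = renderer.toLowerCase();
  return r.includes("nvidia") || r.includes("geforce");
}
```

For the failed-Optimus example above, `isDiscreteRenderer("ANGLE (Intel(R) HD Graphics Family Direct3D9Ex vs_3_0 ps_3_0)")` returns `false`.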
"Chrome: newer versions should always show the unmasked renderer (Chrome 36 already works)."
Note: If you have a notebook with Nvidia’s Optimus graphics-switching technology, Chrome 10 may not be able to trigger discrete graphics mode due to a driver issue. Nvidia assures us that this problem will be fixed in the next release of their Verde driver, which is due in the next few weeks. A test version we received fixed our problem on the ASUS U36Jc.
1) Is this normal?
(i.e. a "design flaw", as opposed to a bug that only some people have, which makes Chrome see Intel's GPU rather than NVIDIA's despite Optimus running on NVIDIA)
2) are there any workarounds?
(including running on another monitor, changing the browser version, or getting a dedicated graphics card?)
1. Yes, it's normal. All video data passes through the Intel GPU on its way to the display, even if it has been processed by the NVIDIA GPU.
2. No, there is no way around this. It's a hardware design decision - there are two ways to build hybrid video.
One uses a hardware switch and the other uses software. Your system uses software-controlled hybrid video, and that cannot be changed in hardware.
Thanks for the clarification. This still seems like a software issue - OpenCL isn't being detected properly (it seems).
I'm not a gamer/developer, so I'm not totally clued up on which technology is used where, but I've read that:
Through CUDA, OpenCL and DirectCompute 2.1 support, the GeForce GT 540M can be of help in general calculations. For example, the stream processor can encode videos considerably faster than can a modern CPU. Furthermore, physics calculations can be done by the GPU using PhysX (supported by Mafia 2 and Metro 2033). However, the GPU is not fast enough to calculate PhysX high detail game settings.
Optimus Support, PureVideo HD VP4, 3D Vision, 3DTV Play, Bitstream HD Audio, CUDA, DirectCompute, OpenCL, OpenGL 4.0, DirectX 11
(Benchmark: Cinebench R11.5 - OpenGL 64Bit)
I downloaded Cinebench (11.5 x64 AND the newer R15), but it crashes (the same way all my NVIDIA apps tend to crash).
My GPU control is lost - my underclocking (.nsu) profile (safe: 550/850) reverts to the stock GPU core/memory clock speeds (672/900 MHz).
A couple of ongoing issues here. Can any other L502x users confirm OpenCL is active/working on the high-performance NVIDIA GT 540M? (And of course, if so - how?!)
Test request (for L502x / other NVIDIA Optimus users):
try a Cinebench (OpenGL) test (and report back!)
this is misleading.
I had problems with Chrome on the XPS L502x - the unmasked renderer was identified as Intel, even when running with NVIDIA, and even when running on display 2 via HDMI, which goes through NVIDIA (but sadly, seemingly via Intel first)...
Thanks to wolv' and many others, I've just tested (while extending the display to a 2nd monitor) and found that Chrome can be tricked into identifying the correct 'unmasked renderer'...
1) In devmgmt.msc, disable Intel HD Graphics 3000
2) Run the app (e.g. chrome.exe)
3) Even if I (re)enable Intel, Chrome still correctly uses the NVIDIA GPU...
Re step 3, this could mean that users could use a fakemon + script to automate the above...
I have the same problem.
I play games on the 630M card, then use software to monitor temperatures, and find that the iGPU module spikes to 100 °C.
I tried to set the display to connect to the 630M via the NVIDIA control panel, but everything still goes through the iGPU.
After reading this thread, it seems this is a hardware design flaw and there is no way around it.
"Hybrid graphics" - what the *** is that? Just a fancy sales pitch to confuse customers and mislead unskilled users.
Dell notebooks are stupid. I'll never buy a Dell product again.
On a Dell Precision I found a solution to turn on the dedicated NVIDIA GPU instead of the Intel integrated one:
In the BIOS, under Video settings, disable "Enable Switchable Graphics". After a restart, it used only the dedicated GPU, and it has been working fine ever since.