JohnAnchor · 1 Rookie · 9 Posts · May 7th, 2020 04:00
Update:
The Witcher 3 does not use the Nvidia GPU at all (not a single %).
Heaven Benchmark uses 90-100% Nvidia GPU.
Team Fortress uses 40-50% Nvidia GPU.
(Numbers from Task Manager)
Can someone please explain why this is and how this is happening?
Personally I don't want the Intel GPU to run at all. I need 100% performance, and I need my computer to always use the Nvidia GPU no matter what. I don't care if my laptop sounds like a jet plane or if it starts smoking; I need 100% performance from my dedicated Nvidia GPU 100% of the time, every single day. I chose this computer because it was supposed to be a beast that could handle GPU-heavy duties. As things are today, I could just as well have chosen a computer that cost €1500 less.
ejn63 · 10 Elder · 30.7K Posts · May 7th, 2020 06:00
If there is a way to do what you want, it'll be in the BIOS setup. On the previous 95xx series, there wasn't - the system is software-controlled, with the hardware being set up such that the Intel GPU is hard-wired to the display output, and the nVidia GPU is simply a co-processor. All you can do is set Optimus to run with the nVidia GPU for either best performance or on a per-application basis.
I don't think the 9570 is designed any differently - meaning the nVidia GPU is not and cannot be made the primary GPU for the internal display panel.
jphughan · 9 Legend · 14K Posts · May 7th, 2020 08:00
@JohnAnchor try creating a shortcut to Witcher 3 somewhere other than the Start menu, right-clicking it, and selecting "Run on graphics processor > NVIDIA".
You cannot completely disable the Intel GPU, because none of the display outputs in that system are connected to the NVIDIA GPU, and the XPS 15 models do not offer a BIOS option to change that -- because the XPS 15 motherboard doesn't have the design necessary to allow that type of flexibility. The NVIDIA GPU in this system operates purely via NVIDIA Optimus. Most dual GPU laptops on the market are set up this way, fyi. I wrote a detailed post here that explains how this works and also describes alternate, less commonly implemented designs if you're curious.
That said, I'm not sure why the NVIDIA GPU isn't stepping in automatically for Witcher 3. It definitely should be, without you having to change anything in NVIDIA Control Panel or having to use that "Run on graphics processor" mechanism I just described. But if the latter works, try manually creating a profile in NVIDIA Control Panel for Witcher 3. Just setting the NVIDIA GPU to be active for "all applications" does not make it active all the time, for the reasons I just described and elaborated on in the thread I linked. It might just make it active for "all applications that currently have profiles configured in NVIDIA Control Panel". Again, I would expect Witcher 3 to have a profile out of the box on NVIDIA drivers, but maybe not. But that doesn't mean it isn't fixable.
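If you want to confirm from outside Task Manager whether the NVIDIA GPU actually engages while a game is running, the driver's own `nvidia-smi` tool reports per-GPU utilization. A minimal sketch in Python (the helper names are mine; it assumes `nvidia-smi` is reachable on your PATH, which it usually is once the NVIDIA driver is installed):

```python
import subprocess

def parse_utilization(csv_text: str) -> list[int]:
    """Parse the output of `nvidia-smi --query-gpu=utilization.gpu
    --format=csv,noheader`, one line per GPU such as "37 %", into ints."""
    values = []
    for line in csv_text.strip().splitlines():
        # Each line looks like "37 %"; drop the unit and whitespace.
        values.append(int(line.replace("%", "").strip()))
    return values

def nvidia_utilization() -> list[int]:
    """Query current NVIDIA GPU utilization via nvidia-smi
    (on Windows usually found in C:\\Windows\\System32)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_utilization(out)
```

Run `nvidia_utilization()` in a loop while the game is in the foreground; a reading stuck at 0% means the game is rendering on the Intel GPU.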
jphughan · 9 Legend · 14K Posts · May 7th, 2020 09:00
@JohnAnchor just adding one small note to the post I wrote above: if you picked the XPS 15 because it was a beast that could handle GPU-heavy work, you may have chosen the wrong system. The XPS 15, like many systems these days, does not have a cooling system robust enough to handle sustained heavy load, especially when that load includes both the CPU and GPU. It is designed primarily to be thin and light and to provide high performance in shorter bursts. If you attempt to run a sustained heavy load on that system, you will find that it throttles performance significantly once temperatures reach a certain level -- which they will, because again the cooling system can't keep up with a sustained heavy load. You'll notice this in games as a sudden and sharp drop in frame rate, and your system will continue running at that lower frame rate until temperatures DROP to a certain level, at which point things will return to normal until they rise again, and the cycle continues.
If you want a beast system that can handle GPU-heavy work and can run heavy loads indefinitely without throttling, you'd need to look at bulkier systems with more robust cooling solutions, such as the Precision 7000 Series models or Alienware systems.
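The throttle-and-recover cycle described above is simple hysteresis: performance drops once temperature crosses an upper threshold and only recovers after it falls below a lower one. A toy Python model of that behavior (the threshold values are made-up illustrative numbers, not Dell's actual trip points):

```python
def throttled_states(temps, hot=95, cool=80):
    """Given a sequence of temperature readings (°C), return whether the
    system is throttling at each step. Throttling begins when temperature
    reaches `hot` and only ends once it drops back below `cool` -- the
    hysteresis that produces the sharp frame-rate drop and delayed
    recovery pattern seen in games. Thresholds here are illustrative.
    """
    states = []
    throttling = False
    for t in temps:
        if not throttling and t >= hot:
            throttling = True
        elif throttling and t < cool:
            throttling = False
        states.append(throttling)
    return states
```

Note that a reading of 85 °C means different things depending on history: still throttled on the way down, full speed on the way up. That asymmetry is why frame rate doesn't bounce straight back the moment temperatures dip.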
DellUserinJapan · 1 Message · September 13th, 2021 19:00
With my 9570 I have found one use case where I seem to be able to control the NVIDIA GPU: when I use Logitech's CAPTURE software, I can choose NVIDIA's NVENC video encoder. Only when I make this choice does 4K become an option for me. This might mean that I can stream 4K video when I teach using Zoom (at the very least I can choose 4K video within the Logitech CAPTURE software). However, I have to use my Logitech BRIO 4K camera, as CAPTURE only recognizes Logitech webcams.
I wish that Dell Peripheral Manager - the software I have to use to manage my Dell UltraSharp 4K Webcam WB7022 - offered me a similar way to choose NVIDIA NVENC. Currently, when I connect the Dell 4K webcam to my 9570, the NVIDIA GPU remains asleep.
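NVENC isn't limited to vendor capture tools, by the way: FFmpeg exposes the same hardware encoder as `h264_nvenc`, and invoking it is another way to wake the NVIDIA GPU on an Optimus laptop. A hedged sketch that builds such a command, with a software fallback (file names are placeholders; it assumes your FFmpeg build was compiled with NVENC support, which the common Windows builds are):

```python
def encode_command(src: str, dst: str, use_nvenc: bool = True) -> list[str]:
    """Build an ffmpeg command line that transcodes `src` to `dst` using
    the NVIDIA NVENC hardware encoder (h264_nvenc) when requested,
    falling back to the software libx264 encoder otherwise."""
    codec = "h264_nvenc" if use_nvenc else "libx264"
    return ["ffmpeg", "-y", "-i", src, "-c:v", codec, dst]
```

Passing the resulting list to `subprocess.run` while watching GPU usage in Task Manager should show the NVIDIA GPU's "Video Encode" engine light up, confirming the dGPU can be engaged on demand even when your webcam software won't do it for you.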