Unsolved
Why does my Thunderbolt port default to the Intel graphics instead of the Nvidia card?
I have been using my LG 55" OLED TV as a monitor. The image over Thunderbolt looked good, but performance was lacking. I tried to set it to use the Nvidia card, but that option could never be enabled!
When I switch to an HDMI 2.0 cable, the Nvidia card activates, but there was serious lag with mouse movement. (Well, there was until I disabled HDR in the laptop's display settings.)
My TV specifically handles HDR input, and I'm pretty sure the Nvidia card does too. Bandwidth may be the bottleneck, though I thought Thunderbolt had enough capacity to handle it; either way, Thunderbolt just won't connect to the Nvidia card.
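The bandwidth hunch is plausible and can be sanity-checked with arithmetic. A rough sketch, assuming the commonly cited figures that HDMI 2.0 carries 18 Gbit/s raw TMDS (about 14.4 Gbit/s of usable video data after 8b/10b encoding) and that HDR needs 10 bits per color channel instead of 8; the blanking factor is an approximation:

```python
# Back-of-envelope check: does 4K60 fit in HDMI 2.0's data rate?
# Assumed figures: 18 Gbit/s raw TMDS, 8b/10b encoding overhead,
# ~7% extra for blanking intervals (approximation).

HDMI20_DATA_RATE = 18e9 * 8 / 10          # ~14.4 Gbit/s usable

def video_rate(h, v, fps, bits_per_channel, channels=3, blanking=1.07):
    """Approximate video data rate in bit/s for full RGB output."""
    return h * v * fps * bits_per_channel * channels * blanking

sdr = video_rate(3840, 2160, 60, 8)       # 8-bit RGB (HDR off)
hdr = video_rate(3840, 2160, 60, 10)      # 10-bit RGB (HDR on)

print(f"4K60 SDR: {sdr/1e9:.1f} Gbit/s -> fits: {sdr <= HDMI20_DATA_RATE}")
print(f"4K60 HDR: {hdr/1e9:.1f} Gbit/s -> fits: {hdr <= HDMI20_DATA_RATE}")
```

Under these assumptions, 4K60 at 8-bit RGB (~12.8 Gbit/s) fits within HDMI 2.0, while 4K60 at 10-bit RGB (~16 Gbit/s) does not, which would force chroma subsampling or other compromises and is consistent with the lag disappearing once HDR was disabled.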
I also tried HDMI through a Mini DisplayPort adapter, but that performed worst of all.
Alienware 13 R3, with an Nvidia GeForce GTX 1060, driver 381.83
Alienware-Eimy
3 Apprentice • 4.4K Posts
May 15th, 2017 08:00
Hi omdadom,
The Thunderbolt 3 (USB Type-C) port is wired to the Intel graphics and not to the Nvidia card; that is simply how the system was built.
Tesla1856
8 Wizard • 17K Posts
May 15th, 2017 09:00
That is likely because you are trying to game at 4K on a small laptop with an Nvidia GTX 1060. 4K has four times the pixels of 1080p, so it demands far more GPU processing power than a more typical resolution.
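To put a number on the 4K-vs-1080p comparison, the pixel counts work out like this:

```python
# Pixel counts behind the 4K-vs-1080p comparison.
res_4k = 3840 * 2160      # UHD "4K": 8,294,400 pixels
res_1080 = 1920 * 1080    # Full HD: 2,073,600 pixels

ratio = res_4k / res_1080
print(f"4K renders {ratio:.0f}x the pixels of 1080p per frame")  # exactly 4x
```

So every frame at 4K pushes four times the fragment work of the same frame at 1080p, which is why lowering the resolution is usually the single biggest performance lever.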
You can try:
- Disable HDR (standard color should be fine for gaming).
- Make sure the game is set to use the Nvidia card, not the Intel one.
- Try both V-Sync settings, in the game or in the Nvidia Control Panel.
- Lower the resolution of the full-screen game (usually in the game itself).
- Lower the game's visual settings (e.g., from High to Medium).
- Check whether the 4K HDTV itself has a "gaming" or low-lag mode.