Unsolved

May 14th, 2017 00:00

Why does my Thunderbolt port default to the Intel graphics instead of NVIDIA?

I have been using my LG 55" OLED TV as a monitor. Over Thunderbolt the picture looked good, but performance was poor, and when I tried to set it to use the NVIDIA card, that option could never be enabled.
When I switch to an HDMI 2.0 cable, the NVIDIA card activates, but there was serious lag with mouse movement (well, there was until I disabled HDR in the laptop's display settings).

My TV specifically handles HDR input, and I'm pretty sure the NVIDIA card does too, so bandwidth may be the bottleneck. I thought Thunderbolt had a high enough capacity to handle it, but Thunderbolt just won't connect to the NVIDIA card.

I also tried HDMI through a Mini DisplayPort adapter, but that was the worst of all.

Alienware 13 R3, with an NVIDIA GeForce GTX 1060 and driver 381.83.

3 Apprentice • 4.4K Posts • May 15th, 2017 08:00

Hi omdadom,

The Thunderbolt 3 (USB Type-C) port runs off the Intel graphics and not the NVIDIA card; that is just the way the system was built.
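
If you want to verify this yourself, here is a rough sketch (my own illustration, not a Dell or NVIDIA tool; it assumes the Windows SDK and trims error handling) that uses Microsoft's DXGI API to list each graphics adapter and the display outputs attached to it. On this kind of muxed Optimus laptop, a TV connected over Thunderbolt/USB-C would be expected to show up under the Intel adapter, while a display on the HDMI port shows up under the GeForce.

// Rough sketch, assuming the Windows SDK; error handling trimmed.
// Lists each GPU and the display outputs DXGI reports as attached to it.
#include <dxgi1_1.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %s\n", i, desc.Description);

        ComPtr<IDXGIOutput> output;
        for (UINT j = 0; adapter->EnumOutputs(j, &output) != DXGI_ERROR_NOT_FOUND; ++j) {
            DXGI_OUTPUT_DESC od;
            output->GetDesc(&od);
            wprintf(L"  Output %u: %s (attached to desktop: %d)\n", j, od.DeviceName, od.AttachedToDesktop);
        }
    }
    return 0;
}

Build it as an ordinary Win32 console program and run it with the TV on each connection; you should see the output move between adapters.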

8 Wizard • 17K Posts • May 15th, 2017 09:00

"using my LG 55" OLED TV as a monitor, and the performance on Thunderbolt looked good, but was underperforming."

That is likely because you are trying to game at 4K on a small laptop with an NVIDIA GTX 1060. 4K has four times the pixels of 1080p, so it typically needs at least twice the GPU processing power of a more normal resolution.

You can try:
- Disabling HDR (normal color should be fine for gaming).

- Making sure the game is set to use the NVIDIA card and not the Intel one (see the sketch after this list).

- Trying both V-Sync settings, in the game or in the NVIDIA Control Panel.

- Lowering the resolution of the full-screen game (usually in the game itself).

- Lowering the game's visual settings (e.g., from High to Medium).

- Checking whether the 4K HDTV itself has a "gaming" or low-lag mode.
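
Regarding the "make sure the game uses the NVIDIA card" tip: for an off-the-shelf game that is normally done in the NVIDIA Control Panel's program settings (or the per-app graphics preference on newer Windows 10 builds). If you are writing your own code, one way to request the discrete GPU is DXGI's adapter-preference API. The sketch below is a rough illustration, assuming Windows 10 1803 or newer and the Windows SDK, not something from Dell or the posters above; it asks for the high-performance adapter, which on an Optimus laptop should normally be the GeForce.

// Rough sketch, assuming Windows 10 1803+ and the Windows SDK; error handling trimmed.
// Asks DXGI for the high-performance adapter, which should be the GeForce on an Optimus laptop.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (SUCCEEDED(factory->EnumAdapterByGpuPreference(
            0, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE, IID_PPV_ARGS(&adapter)))) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"High-performance adapter: %s\n", desc.Description);
        // Creating the Direct3D device on this adapter puts rendering on the dGPU.
    }
    return 0;
}

Note that even when the GeForce does the rendering, the frames are still scanned out by whichever GPU owns the physical port, which is why the Thunderbolt output stays on the Intel side.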
