Forcing Optimus to use Nvidia GT750M when on 4K external display

Mythstera

I have an Acer Aspire V laptop (Windows 10 64-bit) with an integrated Intel HD 4400 GPU and an Nvidia GT 750M.
When I plug in a 4K display over DisplayPort 1.2, I can only get a barely stable 50 Hz (using a custom resolution in the Intel graphics properties).

The laptop doesn't seem to want to use the Nvidia GPU, which is theoretically capable of driving 4K at 60 Hz.

I believe the problem lies with Nvidia Optimus, which prefers the Intel HD 4400 over the Nvidia GPU.

I updated all drivers directly from Intel and Nvidia (the ones that come from the laptop manufacturer, Acer, are outdated, from mid-2015). I went to the Nvidia Control Panel and set Global Settings to prefer the Nvidia GPU, but to no avail. There is also no option in the BIOS to use only the Nvidia GPU.

Does anyone have an idea how I can force Optimus to use only the Nvidia GPU when connecting to a 4K external display?

Thank you
 
Solution
Nope. The problem is a common misconception about how Optimus works.

Here's how Optimus really works:

Optimus uses the PCIe lanes and the iGPU (Intel) for frame transfer from the frame buffer to the screen. The dGPU itself has no display outputs connected to it, since it doesn't "need" any.

Here's how the process works:

1. The dGPU renders the frame.
2. The dGPU puts that frame in its frame buffer (VRAM).
3. The dGPU sends that frame through the PCIe bus to system RAM (into the iGPU's frame buffer); see the cost sketch after this list.
4. Intel HD Graphics takes that frame, runs it through its own display pipeline, and out to the screen.
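
To put rough numbers on step 3, here is a minimal sketch in Python. The 4-byte-per-pixel frame buffer and the PCIe 3.0 x8 throughput figure are assumptions for illustration, not measurements of this laptop:

# Rough cost of copying each rendered 4K frame from dGPU VRAM to the
# iGPU's frame buffer in system RAM (step 3 above). Illustrative only.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4  # assuming a 32-bit RGBA frame buffer

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
for fps in (50, 60):
    gb_per_s = frame_bytes * fps / 1e9
    print(f"{fps} fps: {frame_bytes / 1e6:.1f} MB/frame -> ~{gb_per_s:.2f} GB/s over PCIe")

# Even ~2 GB/s at 60 fps is well within PCIe 3.0 x8 (~7.9 GB/s), so the
# copy itself isn't the bottleneck; the modes the iGPU supports are.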

This is why "forcing" Optimus won't accomplish anything, no matter what you do. The iGPU can ONLY output display modes that IT supports, which does handicap a dGPU that can do more. But for 99% of laptop users this never matters, since they don't need to run 4K on a laptop that old.
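
And just to show the 50 Hz cap isn't the cable's fault, here is a back-of-the-envelope link-budget check. The 24-bit color, ~10% blanking overhead, and full 4-lane DP 1.2 HBR2 link are all assumptions about this setup:

# Does a 4K mode fit in a DP 1.2 link? Ballpark sketch.
DP12_PAYLOAD_GBPS = 17.28  # HBR2 x4 after 8b/10b encoding

def required_gbps(width, height, refresh_hz, bpp=24, blanking=1.10):
    """Approximate bandwidth a video mode needs, in Gbit/s."""
    return width * height * refresh_hz * bpp * blanking / 1e9

for hz in (50, 60):
    need = required_gbps(3840, 2160, hz)
    verdict = "fits" if need <= DP12_PAYLOAD_GBPS else "does NOT fit"
    print(f"3840x2160 @ {hz} Hz needs ~{need:.1f} Gbit/s -> {verdict} in DP 1.2")

Both modes fit comfortably in the link, so the 50 Hz ceiling comes from what the HD 4400's display engine will output, not from DisplayPort itself.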

I hope this helps! Sorry about that. Your only options are to get a new computer or to figure out some way to use an external dGPU dock.
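
If you want to confirm the GT 750M really is doing the rendering while the Intel GPU drives the display, a quick check like this works (assuming nvidia-smi shipped with your driver, which it normally does on Windows; some GeForce cards report "[Not Supported]" for utilization):

# Query the NVIDIA dGPU's load; on Optimus it can be busy rendering
# even though every display output is wired to the Intel iGPU.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,utilization.gpu", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "GeForce GT 750M, 37 %"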
 

Mythstera



Thanks for your explanation, TechyInAZ!