Hi - this should probably have been posted on the Nvidia forums, but they are out of service because they were hacked or some such. I also posted in the Laptop subforum - perhaps I should have chosen the Nvidia one? I was just thinking posting here would save me the mandatory "you shouldn't use a laptop for gaming" spam posts. I have tried to be somewhat detailed, so please bear with me.
I have an i5 2.3 GHz laptop with a GeForce GT 540M graphics adapter. This adapter supports Nvidia's "Optimus" feature, which switches between the GeForce adapter for 3D rendering and the onboard Intel HD 3000 for lighter tasks (like Aero) to optimize power consumption.
I am on Windows 7, and I have the latest set of Nvidia drivers installed (301.42).
Recently I found out that my laptop is using only the Intel HD 3000. I am going to use Skyrim as an example because it gives you a lot of easily accessible output, but I mostly play older games, so I hadn't noticed the change.
First off, I have tried adding the Skyrim, Skyrim Launcher and SKSE (a script extender for mods) executables to the "whitelist" in the Nvidia Control Panel, and I have also tried setting the GeForce as the preferred graphics adapter under Global Settings. I also tried using the context menu (right-clicking on the executable and selecting "Run with GeForce GT 540M"). None of it changed anything in the slightest.
When I load up the Skyrim Launcher it automatically detects my Intel HD, no matter what settings I have chosen in the Nvidia Control Panel - I seem to recall in the past having to set the GeForce as the preferred graphics adapter under Global Settings for Skyrim to recognize it, but that won't work anymore. I also seem to recall that hovering over the Nvidia icon in the system tray would show which programs were actively using the Nvidia GPU, and it would show that the Nvidia GPU kicked in if I played a video with VLC or watched YouTube. Now it constantly shows that 0 processes are using the GPU, no matter what I do.
I searched on the interweb and one guy said to change a line of text in Skyrim.ini and SkyrimPrefs.ini, substituting this:

sD3DDevice="Intel(R) HD Graphics Family"

for this:

sD3DDevice="Nvidia GeForce GT 540M"
I did that, but I still get the same choppy framerate when running the Skyrim or SKSE executable, and as soon as I start the Skyrim Launcher it reverts the ini files back to the Intel HD by itself.
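Since the launcher keeps reverting it, I ended up sketching a little script to re-apply the substitution right before I launch the game through SKSE. This is just a sketch - the ini location under Documents\My Games\Skyrim and the exact device strings are my assumptions, so check what your own ini files actually contain first:

```python
# Sketch: re-apply the sD3DDevice substitution before launching the game.
# The ini paths and device strings below are assumptions -- verify them
# against your own Skyrim.ini / SkyrimPrefs.ini.
from pathlib import Path

INTEL_LINE = 'sD3DDevice="Intel(R) HD Graphics Family"'
NVIDIA_LINE = 'sD3DDevice="Nvidia GeForce GT 540M"'

def force_nvidia_device(ini_path: Path) -> bool:
    """Replace the Intel sD3DDevice line with the Nvidia one.
    Returns True if the file was changed, False if nothing matched."""
    text = ini_path.read_text(encoding="utf-8")
    if INTEL_LINE not in text:
        return False
    ini_path.write_text(text.replace(INTEL_LINE, NVIDIA_LINE), encoding="utf-8")
    return True

if __name__ == "__main__":
    # Assumed location -- adjust to your own Documents\My Games\Skyrim folder.
    skyrim_dir = Path.home() / "Documents" / "My Games" / "Skyrim"
    for name in ("Skyrim.ini", "SkyrimPrefs.ini"):
        path = skyrim_dir / name
        if path.exists():
            changed = force_nvidia_device(path)
            print(name, "updated" if changed else "already set / line not found")
```

Of course this only patches the symptom - the launcher still rewrites the line every time it runs, so you'd have to re-run the script after every visit to the launcher.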
Skyrim also outputs a text file with information about the renderer after running the game. While trying all the different combinations of settings I deleted this file each time, and each time I ran Skyrim it regenerated the file with information about the Intel HD. So it's safe to say that, contrary to what the Nvidia Control Panel settings claim, Skyrim now uses the Intel HD exclusively.
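To save myself re-reading that file by hand after every test, I used a quick check like the one below. Note the file name and location (Documents\My Games\Skyrim\RendererInfo.txt) are my assumption about where Skyrim writes that renderer report - adjust it to wherever the file shows up on your machine:

```python
# Sketch: report which adapter Skyrim's renderer output file mentions.
# The file name/location (RendererInfo.txt under Documents\My Games\Skyrim)
# is an assumption -- point it at wherever your copy of Skyrim writes it.
from pathlib import Path

def detect_adapter(report_path: Path) -> str:
    """Return 'nvidia', 'intel', or 'unknown' based on the renderer report."""
    text = report_path.read_text(encoding="utf-8", errors="ignore").lower()
    if "nvidia" in text or "geforce" in text:
        return "nvidia"
    if "intel" in text:
        return "intel"
    return "unknown"

if __name__ == "__main__":
    report = Path.home() / "Documents" / "My Games" / "Skyrim" / "RendererInfo.txt"
    if report.exists():
        print("Skyrim rendered with:", detect_adapter(report))
    else:
        print("Renderer report not found at", report)
```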
Thing is, it wasn't always like this - I had noticed a bit of a performance drop lately (as I said, I mostly play old and not-so-GPU-intensive games), but I thought that was just Windows getting cluttered up. When I look at the skyrimprefs.ini I used previously, when the game ran fine - even the one I tweaked for performance - it had medium-ish settings, with DoF, Radial Blur and HDR lighting switched on. Now I run it on the lowest of low settings and still get bad framerates.
I found other "solutions" on the net, one of which was to deactivate the Intel HD and remove the GeForce in Device Manager, restart, and then reinstall the GeForce drivers (while keeping the Intel HD deactivated). All that resulted in was a messed-up desktop (another article said that no matter what, the Intel HD will ALWAYS control the desktop/Aero) with low resolutions that don't fit my 16:9 screen.
So I hope someone has experience with this and can help me out. Thanks.
P.S. DxDiag also shows the Intel HD no matter what I do (except when I deactivate the Intel HD in Device Manager - then it shows "Generic VGA adapter" or something along those lines).