Hello! I have a Lenovo Y50-70 with a GTX 960M, and whenever I play games my FPS drops. It stays low for a few minutes, then climbs back up to 48 FPS (my panel's refresh rate is 48 Hz). It holds 48 FPS for only a few minutes before dropping back down to around 20 FPS.
This happens in every game I've tried, from 2008 to 2017 releases. (Even COD4 fluctuates like this with V-Sync disabled.)
I've tried everything: adding RAM, changing power settings, overclocking, and so on, and the result is always the same: the FPS drops for a few minutes, then stabilizes again for a few minutes.
And this has been going on for years.
I downloaded EVGA Precision X1 so I could overclock, and I noticed that my GPU temperature sits at around 73-75 °C, then climbs up to 88-89 °C. While the temperature is rising, the FPS is stable at 48, but once the temperature gradually falls back to 73-75 °C, my FPS drops to 20.
Basically, the FPS goes up as the temperature goes up, and drops when the temperature drops.
I'm not sure, but maybe there's a correlation here?
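In case the raw numbers would help, here's a minimal sketch of how the temperature/clock correlation could be logged while a game runs. It's just an idea, not something I've verified: it assumes nvidia-smi (which ships with the NVIDIA driver) is on PATH, and some fields like power.draw may report "[Not Supported]" on a mobile GPU like the 960M.

```python
import csv
import subprocess
import time

# Query fields for nvidia-smi; clocks.gr is the graphics (core) clock.
# power.draw may show "[Not Supported]" on some mobile GPUs.
FIELDS = "timestamp,temperature.gpu,clocks.gr,clocks.mem,power.draw,utilization.gpu"

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(FIELDS.split(","))
    while True:
        # nvidia-smi is installed with the NVIDIA driver and should be on PATH.
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        writer.writerow([v.strip() for v in out.split(",")])
        f.flush()
        time.sleep(1)  # one sample per second
```

If the graphics clock in the log drops at the same moment the temperature falls back to 73-75 °C, that would at least confirm the FPS drop is the GPU downclocking rather than something in the games themselves.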
Help pls, this has been going on for a few years.
Here are the full specifications of my laptop:
Operating System: Windows 10 Education 64-bit (10.0, Build 17134)
Processor: Intel (R) Core (TM) i7-4720HQ CPU @ 2.60GHz (8 CPUs), ~2.6GHz
Memory: 16,384MB RAM
Available OS Memory: 16,296MB RAM
DirectX Version: DirectX 12
Card Name: NVIDIA GeForce GTX 960M
DAC Type: Integrated RAMDAC
Approx. Total Memory: 12203 MB
Display Memory: 4055 MB
Shared Memory: 8148 MB