http://en.wikipedia.org/wiki/Amdahl%27s_law
That gives a more detailed explanation of it. As you spread a task across more and more threads, it becomes harder and harder to keep them all well utilized on complex tasks, and with a GPU the shader cores aren't the only piece of hardware that determines performance. No matter how many cores you have, there are also the ROPs and other units to consider. Increasing the GPU frequency speeds up that other hardware as well, but increasing the core count does nothing for it: if you don't also increase the number of ROPs and the memory bandwidth to feed the extra cores, they can't be kept busy. That's just one example. All of this, along with the link at the top of this post, causes GPU shader scaling to get worse and worse, and there are even more reasons on top of that (such as the increasing distance, measured in transistors, between the different parts of the hardware).
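To make the diminishing returns concrete, here's a minimal sketch of Amdahl's law in Python. The 95% parallel fraction is just an assumed illustrative value, not a measurement from any real GPU workload:

```python
# A minimal sketch of Amdahl's law with an assumed serial fraction.
# The numbers are illustrative only, not measurements from real hardware.

def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    """Theoretical speedup when only parallel_fraction of the work
    can be split across 'workers'; the rest stays serial."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / workers)

# Even with 95% of the work parallelizable, each doubling of cores
# buys less and less: the serial 5% caps the speedup at 20x.
for cores in (1, 2, 4, 8, 16, 64, 256, 1024):
    print(f"{cores:5d} cores -> {amdahl_speedup(0.95, cores):6.2f}x speedup")
```

And that's the optimistic case; the ROP and bandwidth limits described above make real shader scaling worse than this idealized curve.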
I'm not very familiar with many lower-end mobile Nvidia cards, but if I had to guess, I'd expect the MSI to somewhat beat the Lenovo, which in turn would beat the rest of the laptops by a wider margin. The Lenovo's GPU has the higher frequency, but a less than 10% frequency advantage is unlikely to overcome a 35% core count advantage.
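As a rough back-of-the-envelope check (the baseline of 384 cores at 1.0 GHz is purely hypothetical, and this naive cores-times-clock estimate ignores the scaling losses discussed above):

```python
# Hypothetical baseline figures, only to illustrate why +35% cores
# tends to outweigh +10% clock in a naive throughput estimate.
baseline_cores = 384
baseline_clock_ghz = 1.0

msi_estimate = (baseline_cores * 1.35) * baseline_clock_ghz     # +35% cores
lenovo_estimate = baseline_cores * (baseline_clock_ghz * 1.10)  # +10% clock

print(f"MSI    (more cores):   {msi_estimate:.0f} relative units")
print(f"Lenovo (higher clock): {lenovo_estimate:.0f} relative units")
```

Even before accounting for the imperfect shader scaling, the core-count advantage comes out ahead in this simple estimate, which is why I'd lean toward the MSI on raw graphics.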
However, that's just raw graphics performance... Which laptop is better for you overall might not depend strictly on graphics performance. As for how the mentioned Asus fits in here, I'd need more time to look into it, and I'm out of time for the day until much later tonight.