http://www.techspot.com/news/68407-tackling-subject-gpu-bottlenecking-cpu-gaming-benchmarks-using.html
This article explains why reputable sites test actual CPU performance at 1080p rather than 1440p or even 4K. The summary (including points from the comments) is as follows:
1. Although perhaps not representative of all gamers, the vast majority of Steam gamers (http://store.steampowered.com/hwsurvey) do not game above 1080p. Currently, among single-monitor users, only 6.14% use a resolution greater than 1920x1080...& since that figure includes both "other" (i.e. non-standard resolutions, 1.8% of the total) & "larger" resolutions like 1920x1200 & 2560x1080, the actual share of users at 1440p or higher is only 2.72%. Even for multi-monitor setups, the percentage of Steam users with at least 1440 vertical pixels was only 5.93%. So, for anyone who yammers on about how 1080p isn't 'relevant' to gaming, what you're actually saying is, "I think my setup & resolution are more important than anyone else's, even if I'm in a very small minority".
2. Assuming you have unlimited graphic rendering resources, the amount of
game data processed by the CPU is going to be mostly the
same. Whether it's eventually going to show up on a 1080p/60Hz display or a mammoth triple 4K/120Hz display array, your CPU is sending the same amount of data to the GPU...because all that data says is, "here's X amount of objects, each with its assigned shape, size, texture, & location coordinates. Have fun rendering it". It's then up to the GPU to actually render everything & make it look pretty. This is why, when you look at benchmark reviews by reputable sites, they take the most powerful GPU they can possibly use
at the time, set it to a mid-range or "common" resolution, & then see what the CPUs can do. In a few cases -- usually when there's some sort of frame cap hard-coded into the game -- you'll see them also rerun the test at a higher resolution, or even run the test only at that higher resolution, because their GPU testing already determined that their best CPU/GPU combo was hitting the frame cap.
Techspot did this, for example, in their Dark Souls III review; they'd already determined that their top 3 GPUs -- the GTX 980Ti, R9 Fury X, & R9 Nano -- were still hitting the 60FPS cap at 1440p with their i7-6700K, so they skipped the CPU testing at 1080p entirely & went straight to 1440p (http://www.techspot.com/review/1162-dark-souls-3-benchmarks/page5.html)...& they still found that most of the common CPUs (including the Pentium G4400, FX-4320, & FX-8320e) were able to hit 50+ FPS on average when paired with a GTX 980Ti, with the Pentium G3258 (?!?) & Athlon X4 860K (?!?!?) right behind them (albeit those last 2 were hitting 90-100% CPU utilization & suffering much lower minimum frame rates). It's also interesting to note that the same review showed a Sandy Bridge i5 keeping up with the Skylake i7 at 1440p...not that I'm making any comment at this time about performance improvements (or lack thereof) on Intel chips from generation to generation...
Basically, 1080p is the "mainstream" resolution for gaming right now, & this is why "mainstream" GPUs like the GTX 1060 & RX 470/480 are geared towards that market. By extension, then, more powerful GPUs like the GTX 1070/980Ti, 1080, & 1080Ti/Titan X can reasonably be expected to handle 1080p gaming without any strain whatsoever. This means that, at 1080p, you can actually notice & identify how well the CPU performs in a particular game, especially when compared with other CPUs.
As for those who still whine about it & say, "But I want to see exactly how my system will perform at 1440p/4K resolution!!!":
1. First rule of benchmarking: no one talks about bench...LOL, sorry, wrong rules. First rule of benchmarking is just like with laboratory testing: unless you have an
identical setup, you can't expect identical results. That being said, most hardware running at stock speeds tends to perform pretty much the same (maybe within 1-3FPS either way). So if your setup is similar to the benchmarking setup, give yourself a 1-5FPS margin either way.
2. Compare your hardware to the tested hardware, then consider your resolution. The lower your resolution, the more likely the CPU result will be the limiting factor, so you should expect your results to match it. The higher your resolution, the more likely the GPU result is the limiting factor, so adjust your expectations accordingly. And at
all resolutions, whichever result (GPU or CPU) is
lower, assume your results are going to match it.
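That "take the lower of the two" rule can be sketched in a couple of lines. This is just my own illustration of the logic from point 2 above (the function name & sample numbers are mine, not from any review):

```python
def expected_fps(cpu_benchmark_fps, gpu_benchmark_fps):
    """Whichever component benchmarks lower at your resolution is the
    bottleneck, so it sets the ceiling for your combined system."""
    return min(cpu_benchmark_fps, gpu_benchmark_fps)

# A CPU good for ~100FPS paired with a GPU good for ~60FPS at your
# resolution means you should plan around roughly 60FPS.
print(expected_fps(100, 60))  # 60
# At a low resolution the GPU ceiling rises, so the CPU takes over:
print(expected_fps(100, 250))  # 100
```

Remember the 1-5FPS margin from point 1 still applies on top of this, since your setup won't exactly match the reviewer's.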
For example, my PC has an FX-8320 (stock speed) & an R9 380. My monitor is fairly old, so it's limited to 1600x900. Based on the pixel-count ratio between 1080p & my resolution (1.44:1), I can probably assume that I can multiply any 1080p results by that ratio...but just to be safe, I'd limit myself to a 1.2x ratio. Now, based on the CPU test results for For Honor (http://www.techspot.com/review/1333-for-honor-benchmarks/page3.html), I could reasonably have expected over 100FPS if I had a Titan XP card; since I don't, I'll look at their 1080p GPU results (http://www.techspot.com/review/1333-for-honor-benchmarks/). Ideally, I might expect at least 54FPS minimum & 72FPS average (assuming that 1.44x ratio), but I'll be conservative & assume I'll probably see 45FPS minimum & 60FPS average (using a 1.2x ratio), all at Extreme Quality. Note that this is nowhere near my CPU's expected performance, but that's because in this case my GPU is the limitation. Now, were I to upgrade to an RX 480 (8GB) or GTX 1060 (6GB), or an even more powerful GPU, I'd start to see my system topping out at maybe 100FPS, because then I'd be seeing the limitation from my CPU (74-89FPS minimum for both; 95-114FPS average for the RX, 98-118FPS average for the GTX), since the expected GPU performance would then be greater than the CPU performance.
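The ratio math above can be written out as a quick sketch. To be clear: the pixel-count scaling is just the rough rule of thumb I'm using, not anything Techspot publishes, & the helper function name is my own. Working backwards from my numbers, the implied 1080p results for the R9 380 are ~37.5FPS minimum / 50FPS average:

```python
def scale_1080p_result(fps_1080p, width, height, safety_cap=None):
    """Scale a 1080p benchmark result to another resolution by pixel count.

    Rough rule of thumb only: fewer pixels to render -> roughly
    proportionally more FPS, and only while the GPU (not the CPU)
    is the limiting factor.
    """
    ratio = (1920 * 1080) / (width * height)  # 1.44 for 1600x900
    if safety_cap is not None:
        ratio = min(ratio, safety_cap)  # be conservative
    return fps_1080p * ratio

# 50FPS average at 1080p, scaled to my 1600x900 monitor:
opt = scale_1080p_result(50, 1600, 900)                  # optimistic, full 1.44x
con = scale_1080p_result(50, 1600, 900, safety_cap=1.2)  # conservative, 1.2x
print(f"optimistic: {opt:.0f}FPS, conservative: {con:.0f}FPS")
# optimistic: 72FPS, conservative: 60FPS
```

The conservative cap is there because FPS rarely scales perfectly with pixel count; engine overhead & the CPU's share of the frame time don't shrink when the resolution drops.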
Hmmm...that turned out longer than I'd originally thought, but I think it helps show how this all works.