AMD Ryzen 7 vs Intel Core i7: Benchmark Battle

Status
Not open for further replies.

LordJ7X

Prominent
Mar 2, 2017
1
0
510
It would be good to know which memory was used in the Ryzen test bench... some testing has shown that Ryzen is affected by RAM speed. Also try 1440p and 4K, and maybe Fire Strike? And show the temperature each chip hits in every test vs. Intel... that's something the Intel fans always say, "AMD is a nuclear reactor," etc. etc... yet we have seen "the out of heatsink test" burn the Intel CPU... :p
 

Halonan

Prominent
Mar 2, 2017
3
0
510
I game in 4K. It takes a lot more GPU/CPU power to keep the frame rates high. Ryzen consistently outperforms Intel in 4K gaming benchmarks. That is precisely why it isn't shown here.
 

Armix_

Prominent
Mar 3, 2017
1
0
510
You people still have high hopes for Ryzen :) Anyway, it's an improvement over the old FX series.
 

Boof

Honorable
Apr 27, 2013
1
0
10,510
You need Windows 10 for Ryzen; it is not and will not be supported on legacy OS versions, due to its instruction sets not being compatible with anything older than Windows 10.
 


How can a first-time poster have any evidence to support the accusation in your last sentence, Halonan?
 

VirpZ

Estimable
Apr 26, 2014
1
0
4,510
It seems something does not hold up there. Why such a different set of benchmarks for the 6900K and Ryzen? Did your 4K monitor disappear somehow, along with your faith?
 

scirishman76

Distinguished
Mar 20, 2009
3
0
18,510
Agreed. Most of the benches I have seen have Ryzen winning in 4K and trading blows at 1440p while losing at 1080p. I also think some sites are intentionally not testing 4K. I read hardware reviews almost every day, and almost every other CPU launch includes 4K gaming tests... so why are so many reviewers leaving them out of Ryzen reviews? Especially since 1440p and 4K are not only where this CPU shines but also more realistic for people buying this kind of hardware.
 

Halonan

Prominent
Mar 2, 2017
3
0
510


You are correct, I was a first-time poster. But I have been designing my own systems since the mid-'80s, and I don't think your post count makes you any more of an authority on a subject. I do, however, notice a large disparity between the 1080p benchmarks and the 4K benchmarks.

As far as evidence, it is everywhere. The demand on the CPU does not decrease when you raise the display resolution. 4K uses more CPU (even more so with SLI or CrossFire); the workload on the CPU increases with resolution and frame rate. Running at a lower resolution does not magically put more stress on the CPU. (A good demonstration of this here: https://www.youtube.com/watch?v=AfoQD9pDwXM ).

My largest point of contention is this: if you game at 4K and get a tangible increase in performance with Ryzen compared to an Intel processor (using the same video card), then why show only the less demanding resolutions in the benchmarks?

I am not a fanboy. In fact, in the last decade, the only thing AMD has accomplished in the desktop CPU market has been to let the stronger company set prices even higher for lack of legitimate competition.

Now that we have competition, let's let it be a competition and test the system to today's industry standards on a level playing field.




 


My reference was to your masked accusation that Tom's was biased towards Intel in its tests.

Now another poster has pointed out that many other testers have done the same by not testing in 4K.
 

putergod

Honorable
Jan 10, 2014
1
0
10,510
Tom's has ALWAYS been Intel and nVidia biased! I've been an avid Tom's reader since the Athlon days, and even when AMD was kicking Intel's behind, Tom's was still highly Intel biased.
 

Halonan

Prominent
Mar 2, 2017
3
0
510
"My reference was to your masked accusation that Tom's was biased towards Intel in its tests. Now another poster has pointed out that many other testers have done the same by not testing in 4K."

I have no masked accusations. I love hardware, not conspiracy theories or brand names. I did not insinuate that Tom's was on the take. I am merely pointing out that the author had a winner in mind when he drafted this persuasive review. Why present benchmarks that give one processor a clear advantage over others?

Give a well rounded technical review with concise evidence and let the reader decide. To stack a deck, whether intentional or not, reduces the credibility of the author.

Techspot put together a good review and had some pretty interesting findings about resolution affecting Ryzen benchmarks.

"The key thing to note when comparing Ryzen against Intel CPUs is that the 1800X trails the 6900K by 12% at 1080p yet moving to 1440p shrunk that margin to a nearly insignificant 4%. You can likely guess what would happen at 4K."

http://www.techspot.com/review/1348-amd-ryzen-gaming-performance/page5.html

This thread is about benchmarks and performance, not Red vs. Blue. As a 4K gamer, I can attest that the CPU does play a role in higher-resolution gaming. I don't need numbers for 1080p alone; that is inconclusive. Gamers buying new hardware aren't looking to support yesterday's tech. They are buying 1440p or 4K monitors.




 

spdragoo

Distinguished
Herald
Oct 17, 2011
186
0
18,910
http://www.techspot.com/news/68407-tackling-subject-gpu-bottlenecking-cpu-gaming-benchmarks-using.html

This article explains why the reputable sites test actual CPU performance at 1080p rather than at 1440p or even 4K. But the summary (including from the comments) is as follows:

1. Although perhaps not representative of all gamers, the vast majority of Steam gamers (http://store.steampowered.com/hwsurvey) do not game above 1080p. Currently, among single-monitor users only 6.14% use a resolution greater than 1920x1080...& since that includes both "other" (i.e. non-standard resolutions, 1.8% of the total) & "larger" resolutions like 1920x1200 & 2560x1080, the actual share of users at 1440p or higher is only 2.72%. Even for multi-monitor setups, the percentage of Steam users with at least 1440 vertical pixels was only 5.93%. So, for anyone who yammers on about how 1080p isn't 'relevant' to gaming, what you're actually saying is, "I think my setup & resolution are more important than anyone else's, even if I'm in a very small minority".

2. Assuming you have unlimited graphics rendering resources, the amount of game data processed by the CPU is going to be mostly the same. Whether it's eventually going to show up on a 1080p/60Hz display or a mammoth triple 4K/120Hz display array, your CPU is sending the same amount of data to the GPU...because all that data says is, "here's X number of objects, each with its assigned shape, size, texture, & location coordinates. Have fun rendering it". It's then up to the GPU to actually render everything & make it look pretty. This is why, when you look at benchmark reviews by reputable sites, they take the most powerful GPU they can possibly use at the time, set it to a mid-range or "common" resolution, & then see what the CPUs can do. In a few cases -- usually when there's some sort of frame cap hard-coded into the game -- you'll see them rerun the test at a higher resolution, or simply run it only at a higher resolution, because their GPU testing already determined that their best CPU/GPU combo was hitting the frame cap.

Techspot did this, for example, in their Dark Souls III review; they'd already determined that their top 3 GPUs -- GTX 980Ti, R9 Fury X, & R9 Nano -- were still hitting the 60FPS cap at 1440p with their i7-6700K, so they didn't even bother doing any CPU testing at 1080p & skipped straight to 1440p (http://www.techspot.com/review/1162-dark-souls-3-benchmarks/page5.html)...& they still found that most of the common CPUs, when paired with a GTX 980Ti (including the Pentium G4400, FX-4320, & FX-8320e), were able to hit 50+ FPS on average, with the Pentium G3258 (?!?) & Athlon X4 860K (?!?!?) right behind them (albeit the last 2 were hitting 90-100% CPU utilization & suffering much lower minimum rates). It's also interesting to note that the same review showed a Sandy Bridge i5 keeping up with the Skylake i7 at 1440p...not that I'm making any comments at this time about performance improvements (or lack thereof) on Intel chips from generation to generation...

Basically, 1080p resolution is the "mainstream" for gaming right now, & this is why "mainstream" GPUs like the GTX 1060 & RX 470/480 are geared towards that market. By extension, then, more powerful GPUs like the GTX 1070/980Ti, 1080, & 1080Ti/Titan X can be reasonably expected to easily churn out 1080p resolution gaming without any strain whatsoever. This means that, at 1080p resolutions, you can expect to notice & identify how well the CPU performs in a particular game, especially when compared with other CPUs.
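
To make the bottleneck logic in point 2 concrete, here is a minimal sketch in Python. Every frame time in it is a made-up illustrative number, not a measurement from any review; the only point is that whichever stage is slower per frame sets the frame rate.

# Per-frame CPU work is roughly constant across resolutions, while
# per-frame GPU work grows with pixel count, so the slower of the two
# stages limits the frame rate. All timings are hypothetical.

def fps(cpu_ms, gpu_ms):
    """Frame rate is set by whichever stage takes longer per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast = 6.0   # hypothetical faster CPU: 6 ms of game logic per frame
cpu_slow = 8.0   # hypothetical slower CPU: 8 ms per frame

gpu_ms = {"1080p": 5.0, "1440p": 9.0, "4K": 20.0}   # hypothetical render times

for res, g in gpu_ms.items():
    print(f"{res}: {fps(cpu_fast, g):.0f} vs {fps(cpu_slow, g):.0f} FPS")

# 1080p: 167 vs 125 FPS  -> CPU-bound, the CPU gap is fully visible
# 1440p: 111 vs 111 FPS  -> GPU-bound, the two CPUs look identical
# 4K:     50 vs  50 FPS  -> GPU-bound, the two CPUs look identical

That is the whole argument for benchmarking CPUs at 1080p: it keeps the GPU term small enough that the CPU term is the one actually being measured.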

As for those who still whine about it & say, "But I want to see exactly how my system will perform at 1440p/4K resolution!!!":
1. First rule of benchmarking: no one talks about bench...LOL, sorry, wrong rules. The first rule of benchmarking is just like with laboratory testing: unless you have an identical setup, you can't expect identical results. That being said, most hardware running at stock speeds tends to perform pretty much the same (maybe within 1-3FPS either way). So if your setup is similar to the benchmarking setup, give it a 1-5FPS margin either way.
2. Compare your hardware to the tested hardware, then consider your resolution. The lower your resolution, the more likely the CPU result will be the limiting factor, so you should expect your results to match it. The higher your resolution, the more likely the GPU result is the limiting factor, so adjust your expectations accordingly. And at every resolution, whichever result (GPU or CPU) is lower, assume your results are going to match it.

For example, my PC has an FX-8320 (stock speed) & an R9 380. My monitor is fairly old, so it's limited to 1600x900. Based on the ratio between 1080p & my resolution in terms of pixels (1.44:1), I can probably assume that I can multiply any 1080p results by that ratio...but just to be safe, I'd limit myself to a x1.2 ratio. Now, based on the CPU test results for For Honor (http://www.techspot.com/review/1333-for-honor-benchmarks/page3.html), I could reasonably have expected over 100FPS if I had a Titan XP card; since I don't, I'll look at their 1080p results (http://www.techspot.com/review/1333-for-honor-benchmarks/). Ideally, I might be able to expect at least 54FPS minimum & 72FPS average (assuming that x1.44 ratio), but I'll be conservative & assume I'll probably see 45FPS minimum & 60FPS average (using a x1.2 ratio), all at Extreme quality. Note that this is nowhere near my CPU's expected performance, but that's because in this case there's a limitation from my GPU. Were I to upgrade to an RX 480 (8GB) or GTX 1060 (6GB), or an even more powerful GPU, then I would start to see my system topping out at maybe 100FPS, because then I'd start seeing the limitation from my CPU (74-89FPS minimum for both; 95-114FPS average for the RX, 98-118FPS average for the GTX) -- the expected GPU performance would be greater than the CPU performance.
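
The same estimate can be written out as a few lines of arithmetic. A sketch, assuming the 1080p baseline implied by the numbers above (roughly 37.5 FPS minimum / 50 FPS average for the R9 380, back-calculated from the 54/72 and 45/60 figures rather than taken from the review):

# The 1.44 pixel ratio and the conservative 1.2 multiplier are the
# rules of thumb described above; the 1080p baseline is an assumption.

pixel_ratio = (1920 * 1080) / (1600 * 900)   # = 1.44
safe_ratio = 1.2                             # conservative rule of thumb

min_1080p, avg_1080p = 37.5, 50.0            # assumed 1080p R9 380 baseline

for label, ratio in (("optimistic x1.44", pixel_ratio),
                     ("conservative x1.2", safe_ratio)):
    print(f"{label}: {min_1080p * ratio:.0f} min / {avg_1080p * ratio:.0f} avg FPS")

# optimistic x1.44:  54 min / 72 avg FPS
# conservative x1.2: 45 min / 60 avg FPS

Either way, the final expectation is still capped by whichever of the CPU or GPU figures is lower, exactly as rule 2 above says.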

Hmmm...that turned out longer than I'd originally thought, but I think it helps show how this all works.
 

Spank1

Honorable
Nov 27, 2013
2
0
10,510
The chipset will make a big difference at 4K. On Fire Strike Ultra I can get higher scores with CrossFired cards on an AMD FX-6300 than I can on an i5-4670K.
2 x 7850s, FX-6300 - 1354 points
2 x 7850s, 4670K - 874 points

2 x 7870s, 4670K - 1057 points

I tried a single 6870 with both chips as well on the 4K Fire Strike bench and they scored within 2% of each other.

Sorry the benches are old; I might start running some 4K tests with newer hardware to prove the point.

4K is pretty pointless for testing a CPU's gaming power, as the system as a whole matters more there.

1080p is what you want for a CPU test; then there is nowhere for it to hide. Clearly my i5 is superior to the FX-6300, but at 4K on a single card there is almost no difference, and with two cards the 2 x 16x PCI Express lanes come into play.

Maybe they should do some CrossFire or SLI benches vs. an X99 setup; guaranteed the X99 will win.
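
Putting rough percentages on the gaps above (a quick calculation that takes the quoted scores at face value, and reads the 874 as the 4670K paired with the same two 7850s):

# Relative differences between the Fire Strike Ultra scores quoted above.
scores = {
    "FX-6300 + 2x HD 7850": 1354,
    "i5-4670K + 2x HD 7850": 874,
    "i5-4670K + 2x HD 7870": 1057,
}

baseline = scores["i5-4670K + 2x HD 7850"]
for setup, score in scores.items():
    print(f"{setup}: {score} ({(score / baseline - 1) * 100:+.0f}% vs the i5 + 2x 7850)")

# FX-6300 + 2x HD 7850:  1354  (+55%)
# i5-4670K + 2x HD 7850:  874  (+0%)
# i5-4670K + 2x HD 7870: 1057  (+21%)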
 

Iowa__

Prominent
Mar 19, 2017
1
0
510
It would be good to give info on pricing at the time of writing. Pricing would make a big difference for someone who wants a good PC that can handle games but isn't a hardcore gamer. A hardcore gamer might still opt for the Intel, but if the AMD were half the price it could work for everyone else... cost is a big factor.
 