Game Developer Raises Concern About Wii U CPU Speed

Status
Not open for further replies.
[citation][nom]dragonsqrrl[/nom]I think a lot of news reports have blown this interview way out of proportion, especially ign. He doesn't say the processor is slower (in terms of raw performance) in any of the quotes I've read. He merely hints that the processor clocks may be lower than current gen consoles, and it should be abundantly clear to anyone who reads Tom's that this is in no way indicative of actual performance. Compare single threaded performance of a 3.4 GHz Ivy Bridge to a 3.4 GHz Prescott, and see what happens. I'm willing to bet you could under-clock the Ivy Bridge to sub 2 GHz and still achieve superior performance.[/citation]

Actually, you could set it to 1 GHz and it would still outdo the Pentium 4.
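The arithmetic behind that claim is just instructions-per-clock times frequency. Here's a toy sketch in Python; the IPC figures are rough assumptions picked for illustration, not measured numbers:

```python
# Toy model: effective single-thread throughput ~ IPC * clock.
# The IPC values below are rough assumptions for illustration only.
def throughput_gips(ipc, clock_ghz):
    """Billions of instructions retired per second."""
    return ipc * clock_ghz

prescott = throughput_gips(0.6, 3.4)  # Pentium 4 "Prescott" @ 3.4 GHz
ivy_1ghz = throughput_gips(2.0, 1.0)  # Ivy Bridge underclocked to 1 GHz

# Even at under a third of the clock, the assumed Ivy Bridge
# roughly matches the Prescott's throughput.
print(prescott, ivy_1ghz)
```

The point isn't the exact numbers; it's that per-clock work differs so much between architectures that comparing raw GHz across them tells you almost nothing.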
 
[citation][nom]shinmalothar[/nom]The thing about the Wii U (from what I have heard) is that the CPU and GPU have to render to two screens, the main display and the controller screen.I'm not an expert on these types of things but I'm guessing rendering completely separate visuals on two different screens can soon sap the processing power.I've seen various comparison videos for Assassins Creed 3 etc and although not very conclusive, the Wii U version still looked the best in my eyes so its not all bad.[/citation]

Yeah, because it takes so much to do that.
Two Nvidia 9300GE video cards can drive four screens for Microsoft Flight Simulator.
That's a lot of rendering and calculation on dated hardware, and it's no issue.

It's not the multiple screens; it's the level of detail, the activity on screen, and setting a standard of quality that just won't be possible on the mobile screen, and won't be up to par on a full HD TV with the components in the Wii U.
Or likely any of these "next" generation systems, for that matter. By the time they come out, they'll be two generations behind in terms of hardware.
 
[citation][nom]aggroboy[/nom]The casual market is growing towards social network games and mobile device apps.[/citation]
The casual market is growing in all directions. Obviously, Microsoft and Sony thought the casual market was important enough for them to put time and money into each creating and marketing their own motion sensors.
 
[citation][nom]dragonsqrrl[/nom]I think a lot of news reports have blown this interview way out of proportion, especially ign. He doesn't say the processor is slower (in terms of raw performance) in any of the quotes I've read. He merely hints that the processor clocks may be lower than current gen consoles, and it should be abundantly clear to anyone who reads Tom's that this is in no way indicative of actual performance. Compare single threaded performance of a 3.4 GHz Ivy Bridge to a 3.4 GHz Prescott, and see what happens. I'm willing to bet you could under-clock the Ivy Bridge to sub 2 GHz and still achieve superior performance.[/citation]

I bet that a 1-1.2 GHz Ivy Bridge would still beat a P4 at 3 GHz.
 
These rumors are getting ridiculous.
First it's the GPU speed that's the problem, but the CPU is fine.
Then it's "oh, the Wii U GPU is actually faster than the nextbox" (6770 for Wii U vs. 6670 for nextbox).
Now it's the Wii U CPU that's too slow, but the GPU is good!
WTF?!

Just wait for it to come out and we'll have a bunch of websites dissecting it and telling us what's finally in the thing.
 
[citation][nom]hasten[/nom]Holy cow you are out of the loop. The last I saw they were comparing capabilities to a 7850/70 - not using one, but a custom hybrid dual card solution of some sort. I believe that was on the PS. I'm sure someone here has more detail than that.[/citation]
Is that in terms of the card that they're using (e.g. they're using a 7850/70 equivalent card) or comparing the relative performance (e.g. it'll take a 7850/70 on a PC to get the console's graphics)? Because that's a pretty big difference.
 
CPU speed, graphics speed, blah blah blah. People have always had fun playing Nintendo games, end of story. I'm sure the processor is at least as fast as the original NES, and honestly, are there any games today with as much fun and replay value as the original Metroid or Super Mario? A game is fun because it's fun, not because of its specular lighting or 60 frames per second in HD with anti-aliasing. If a developer can't work with the hardware, then they're selling gimmicks, not fun gameplay. The dumbest beautiful game still sucks, and the funnest 16-bit Mario Kart game is still fun 20 years later.
 
[citation][nom]back_by_demand[/nom].A P4 clocked at 3.4ghz is massively slower than a Core i7 clocked at 2.8ghz, the clock speed race ended over 7 years ago and nobody won[/citation] A 2.2 GHz Athlon 64 CPU is faster than that P4. Even the bottom-end Celerons with Core 2 tech at 1.6 GHz are faster than ANY Pentium 4.
 
[citation][nom]hasten[/nom]That's actually completely untrue. Single player it might somewhere in the same realm but certainly not multiplayer. I don't know where you get your information, but it might be time for a new source. For instance try googling any of the 10,000 articles about bulldozer vs. sandy bridge performace.[/citation]

CPU is most definitely important. I have no idea where these people pull this stuff from. I get so sick of hearing this "CPU doesn't matter" stupidity that it just makes me want to scream.
 
While Nintendo employed clever user experience to win the hearts of last generation's console war, I'm not so sure consumers will be hooked by a second attempt.
 
Meanwhile, on the PC side, we get games that pull 85% of their power from the GPU. My CPU rarely needs to go above 40%, and I've got a 580 clocked at 900 MHz.
 
99% of games in 2012 are utter trash. The same boring six-hour military shooter like CoD, or some stereotypical RPG junk like Skyrim (no, it is rubbish; it's every fantasy cliché ever stuffed into one boring, repetitive plot). I miss the early '00s: games like Soldier of Fortune that actually had a long campaign with a *gasp* plot, games like Planescape: Torment and The Suffering that were innovative in their own way and not the same old same old, games like Far Cry that pushed the boundaries. Even cheese like BloodRayne and State of Emergency, etc. Now look at 2012: apart from shining stars like the Mass Effect series and Max Payne 3, what else is there? Spec Ops was decent and The Witcher 2 was spectacular, but the rest were AAA trash. I hope current gaming crashes and we go back to fresh, interesting releases that come out yearly, not every two or three years.
 
[citation][nom]hasten[/nom]That's actually completely untrue. Single player it might somewhere in the same realm but certainly not multiplayer. I don't know where you get your information, but it might be time for a new source. For instance try googling any of the 10,000 articles about bulldozer vs. sandy bridge performace.[/citation]

BF3 multiplayer is one of the few games where a default-core-config BD CPU with six or eight cores actually keeps up with Intel in gaming performance, so that's quite a poor example to use. Regardless, a Celeron 520 won't even be in the same performance range as an LGA 1155 i5 or better in BF3, unless you're using similarly low-end graphics and aren't going for high FPS.

[citation][nom]blubbey[/nom]Is that in terms of the card that they're using (e.g. they're using a 7850/70 equivalent card) or comparing the relative performance (e.g. it'll take a 7850/70 on a PC to get the console's graphics)? Because that's a pretty big difference.[/citation]

It was supposedly a GCN GPU based on Pitcairn, and there's also supposedly such a GPU for a console based on Tahiti. So the rumors suggest hardware similar to the 7800/7900 cards, so long as they have proper memory bandwidth, which would mean much more performance than a PC counterpart if the software is as optimized as it was on most previous consoles.

[citation][nom]slabbo[/nom]these rumors are getting ridiculous.First it's GPU speed, but the CPU is fine.Then it's oh Wii U GPU is actually faster than the nextbox (6770 for Wii U vs 6670 for nextbox).Now it's Wii U CPU speed too slow, but GPU is good!WTF?!Just wait for it to come out and we'll have a bunch of website dissecting it and telling us what's finally in the thing.[/citation]

Not really. This crap about the Wii U CPU is just someone who is apparently ignorant about CPU performance and is complaining about the clock frequency of a CPU rather than that CPU's actual performance.
 
[citation][nom]ashesofempires04[/nom]I won't buy a Wii U, even though the Zelda series is my favorite game franchise. I'll wait for my roommate to buy one, and I'll just play the new Zelda games on his.I doubt anyone is really surprised that Nintendo isn't willing to deliver up a truly next-gen console. They got drunk off of their staggering Wii sales, and assumed that it was because gamers everywhere thought it was the most amazing console ever. In reality the console's sales were driven by people who thought they could get fit with Wii Fit, or to have something for people to mess with at a party. Most people who bought it never bought anything beyond the console, and maybe Rock Band or some other party games. A lot of the people who bought it never bought any games at all, they simply played Wii Sports Resort when people came over.Gamers, the ones who actually sustain the industry with repeat purchases, never considered the console as anything other than something they had to buy in order to play Mario and Zelda franchise games on. Of all of the friends I have who bought a Wii, only 3 of them ever bought more than 3 games for it.[/citation]
Yes, I completely agree!
Zelda has to be one of my favorite game series, but I just can't bring myself to buy a whole console just for that game. I don't think they're drunk off their success with the Wii; I think they know the Wii was purchased by many people who would typically never play games.

My sister has a Wii and has 3 games (she's not a gamer). My brother-in-law has 2 games for the Wii, and I know many other people who have only 2 or 3 games for that console.

But, I do disagree and think that Nintendo is aware of that issue.

That's why the Wii U is targeted towards hardcore gamers... (Have you seen the "Pro" controllers?) Hardcore gamers love games and buy more and more of them. But gamers like myself find it hard to justify a new console when the only games they can't get on a different console are Zelda, Mario, and the rest of the Nintendo-only clan. I really would like it if Nintendo would just stop making consoles and keep making games, so I don't have to buy a Wii U just to play Zelda.

Pretty sure I won't be first in line for one of these consoles... In fact, if I can resist the Zelda temptation, I won't buy one at all. (I might be convinced to buy one, but it would take some cool features I don't currently know about.)
 
[citation][nom]Shin-san[/nom]I wonder how low the clock speed is, plus how much should we put into clock speeds? The 360 CPU and the PS3 Cell are both in-order CPUs, and it's likely the Wii-U will have an out-of-order CPU. This would make the Wii-U have more bang for each MHz. Remember how a single core of a Core2 CPU runs vs a 3.2 GHz Pentium IV[/citation]I think just about everybody here knows that clock speed alone doesn't tell you much. But the problem is that clock speed *is* still relevant when comparing two chips of the same architecture. He might have just been saying that he wanted the same chip, but clocked a little faster, for better final performance. He was probably just using the existing consoles as a point of reference, since he's not supposed to divulge exact specs. That way we have something to go by now... maybe it's 3 GHz?

Also, even though it's POWER7-based, and out-of-order, there are some unknown details, like SMT. Overall POWER7 + OoO execution will be a huge boon, even if it ends up lacking SMT. Final performance should be faster than current consoles, but it won't be night and day. We'll have to wait until the other two consoles get full sequels to see a larger jump in console performance. Then again, the first Wii did fine among casual gamers and families, so they should be OK.

I will say that they really need to try to improve the battery life of those oversized VMU controllers.
 
[citation][nom]elcentral[/nom]mean while on the pc side we get games to pull 85%of the power from the gpu, my cpu rearly needs to pull above 40% and i got a 580 clocked at 900mhz[/citation]40% of WHAT? Logical cores.... or physical ones? You realize that by itself, that % number might not mean jack sheet? For example... a 4 core CPU with SMT running a game using 4 threads would put the "CPU utilization" at sub-50%. That doesn't mean the CPU is only working at 50% (or lower) capacity.

That's not even considering the fact that most games can't FULLY utilize more than 3-4 cores. Scaling starts to drop off after 2-3 cores for many games, even if you are testing at minimum graphical settings to ensure no GPU bottleneck.
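The utilization point is easy to sketch. Assuming a hypothetical 4-core CPU with SMT (so 8 logical cores) running a game that fully saturates 4 threads, the OS readout works out like this:

```python
# Hypothetical example: why a "% CPU" readout is misleading with SMT.
physical_cores = 4
logical_cores = physical_cores * 2  # SMT exposes 2 logical cores each
busy_threads = 4                    # game fully saturating 4 threads

# Typical task-manager-style figure: busy logical cores / total logical cores.
reported_utilization = busy_threads / logical_cores * 100
print(reported_utilization)  # 50.0: looks half-idle, yet every
                             # physical core may be fully occupied
```

So a "40-50% CPU" number on an SMT chip can coexist with the CPU being the bottleneck; the counter measures busy logical cores, not spare physical throughput.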
 
[citation][nom]alextheblue[/nom]I think just about everybody here knows that clockspeed alone doesn't tell you much. But the problem is that clock speed *is* still relevant when comparing two chips of the same architecture. He might have just been saying that he wanted the same chip, but clocked a little faster, for better final performance. He was probably just using the existing consoles as a point of reference, since he's not supposed to divulge exact specs. That way we have something to go by now... maybe it's 3Ghz?Also, even though it's POWER7-based, and out-of-order, there are some unknown details, like SMT. Overall POWER7 + OoO execution will be a huge boon, even if it ends up lacking SMT. Final performance should be faster than current consoles, but it won't be night and day. We'll have to wait until the other two consoles get full sequels to see a larger jump in console performance. Then again, the first Wii did fine among casual gamers and families, so they should be OK.I will say that they really need to try to improve the battery life of those oversized VMU controllers.[/citation]

This chip isn't the same architecture, and that's what actually upset Harada. Pretty much any of the rumored Wii U chips, like POWER7, would run circles around the Xbox CPU regardless of clock speed, due to a far superior architecture. You're comparing architecture that's almost a decade old to modern supercomputer chips. The Wii U's CPU has likely been nerfed down for power/heat reasons, or it would be a supercomputer. Anyone who thinks the Wii U CPU is comparable to the Xbox's knows absolutely nothing about CPUs. From what I understand, Harada is quite upset at this interview and how it's spread, and has said not to trust the media. I can't blame him, since the context makes him look foolish.

His issue is not the CPU's power; it's the architecture. They're finding it difficult to work with, much like the PS3 was initially difficult. This is worse news for Nintendo than it was for Sony: Sony released the PS3 with third-party devs behind them from the PS2 era, so courting them back was not hard. Nintendo's success hinges not on power, but on bringing third-party devs to their system, and a difficult platform, even a powerful one, may discourage new devs. The fact that Tekken is coming to the Wii U at all, when Tekken games never come to Nintendo systems and have been best known as a PlayStation title, is a good sign in that regard. At the end of the day, controllers and CPUs don't matter here; titles do. The most common complaint isn't about power, but the fact that there just weren't many real games for the Wii.
 
Nintendo has always been good with low specs, managing to do more with less. They keep their games fun even though they run with fewer pixels, less RAM, and lower clock speeds than the competition. But this is just a new low for them... 854x480 on a 6-inch tablet is ridiculous. I mean, God, that's Archos-tier resolution. And it's not like they gave it some other aspect to compensate: it's not AMOLED, and it's not even multi-touch. It's using archaic resistive touchscreen tech.

I'm probably still going to get one, because as bad as all that is the last couple batches of DS have had even worse resolution, but man... I'm really disappointed in Nintendo.
 
[citation][nom]dragonsqrrl[/nom]I think a lot of news reports have blown this interview way out of proportion, especially ign. He doesn't say the processor is slower (in terms of raw performance) in any of the quotes I've read. He merely hints that the processor clocks may be lower than current gen consoles, and it should be abundantly clear to anyone who reads Tom's that this is in no way indicative of actual performance. Compare single threaded performance of a 3.4 GHz Ivy Bridge to a 3.4 GHz Prescott, and see what happens. I'm willing to bet you could under-clock the Ivy Bridge to sub 2 GHz and still achieve superior performance.[/citation]

Yeah, you're right, and I looked into it. It seems that a 1.3 GHz Ivy Bridge core is around the same speed as a 3.6 GHz P4 core, all while using less than half the power. Pretty hilarious to think that the masses still believe clock speed is the name of the game.
 