Wii U Specs Possibly Leaked: PowerPC CPU, AMD GPU

Status
Not open for further replies.

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]rohitbaran[/nom]Also, a CPU that is similar to the Xbox 360 CPU? Wow, Nintendo really seems keen to cut production costs by using old dated hardware lol![/citation]
The CPU is the one factor mentioned that may be accurate... but that would be down to a lucky guess rather than any actual information.

Keep in mind that both the 360 and PS3 have their CPUs for a reason: NEITHER of them intended for games to need all that raw CPU power. In fact, in cross-360/PS3-to-Wii games, the only concessions made tended to be in the graphics department, due to the weak GPU. The 360 only really needs its CPU because it needed the power to emulate the original Xbox. The PS3, likewise, was emulating the PS2, as well as accelerating Blu-ray playback.

The Wii, on the other hand, didn't need to emulate anything, since it was natively backwards-compatible with the GameCube, much like how a Core i7 will run stuff written for a Pentium 4. Similarly, the Wii U will have the same native BC with both the Wii and GameCube, so again, software emulation won't be necessary.

Of course, a similar tale will apply to the Xbox 720 and PS4: both will ostensibly re-use the same form of architecture, so they won't have to emulate the prior console. Hence, neither will boast a huge upgrade in their CPU power: this'll primarily be to save costs, as Microsoft and ESPECIALLY Sony felt the burn of a console that lost money on release.

[citation][nom]dimar[/nom]Fine, maybe I overdid the specs a little. All I want is to play games at 1080p with nice AA and 60 fps (nice and smooth, without screen tearing).[/citation]
Keep in mind that the standard for 3D games on a console is 30fps, not 60fps. Off the top of my head, I can't name any non-sports games on the 360 or PS3 that run at 60fps.

I'll add the qualification that the Wii, by contrast, standardized its games at 60fps. Of course, that was at a mere 720x480, without any AA, and generally at much lower detail.
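For a rough sense of scale, here's a hypothetical fill-rate comparison of those two targets. It counts raw pixels drawn per second only and deliberately ignores per-pixel shading cost, which is where HD rendering really gets expensive:

```python
def pixel_rate(width, height, fps):
    """Raw pixels drawn per second for a given resolution and frame rate."""
    return width * height * fps

# Wii-style target: SD resolution at 60fps
wii = pixel_rate(720, 480, 60)
# Typical HD-console target: 720p at 30fps
hd = pixel_rate(1280, 720, 30)
print(f"720x480 @ 60fps:  {wii / 1e6:.1f} Mpix/s")
print(f"1280x720 @ 30fps: {hd / 1e6:.1f} Mpix/s")
```

The raw pixel counts come out closer than you'd expect (about 20.7 vs 27.6 Mpix/s); the real gap is in how much work each pixel takes.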

[citation][nom]WolvenOne[/nom]Essentially, between confusing reporting, vague reporting, and outright technical impossibilities, we can definitively state that we know nothing more from this rumored leak than we did before.[/citation]
This sums it up quite perfectly. +1.

Though to address a couple other points:

- I'd actually missed any news that the dev kits had 1GB of RAM. Do you have a link for my future reference, mayhap?
- Yeah, the "EDRAM" part is definitely completely out there. I think it demonstrates that the writers were trying too hard and don't understand what EDRAM is. Even 96MB would be way too large to embed; while EDRAM may be more compact than SRAM, it's still pretty bulky, and even at 45nm that would take many times the die area of the 10MB of EDRAM found in the Xbox 360's GPU.
- You forgot to mention the "compelling reason" for separate memory pools. I'm assuming you were going to mention the bandwidth, or possibly the second concern of latency. (though GPUs tend to not suffer much there, hence the GDDR varieties that tend to trade timings for clock rate)
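A back-of-the-envelope area estimate shows why 96MB of embedded DRAM is implausible. The per-bit cell size and array overhead below are rough assumptions for a 45nm-class process, not sourced figures:

```python
# Rough die-area estimate for an embedded DRAM macro.
# ASSUMED figures: ~0.065 um^2 per bit cell at 45nm, plus a 2x factor
# for sense amps, decoders, wiring, and redundancy.
CELL_AREA_UM2 = 0.065
ARRAY_OVERHEAD = 2.0

def edram_area_mm2(megabytes):
    """Estimated die area (mm^2) for `megabytes` of eDRAM."""
    bits = megabytes * 1024 * 1024 * 8
    return bits * CELL_AREA_UM2 * ARRAY_OVERHEAD / 1e6  # um^2 -> mm^2

for mb in (10, 96):
    print(f"{mb:3d} MB -> roughly {edram_area_mm2(mb):.0f} mm^2")
```

Even with these generous assumptions, 96MB lands around 100 mm² of silicon — a sizable GPU's worth of die area spent on memory alone.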
 

leper84

Distinguished
Nov 2, 2010
28
0
18,580
[citation][nom]ikyung[/nom]I don't play console anymore, but SNES was in my opinion, the best console ever made. There was no gimmick. Just had a library of some of the best games ever made.Who cares if its 700mb or 1gig or 4gigs. It's all about launch titles. I guarantee if Nintendo puts something like Zelda, Mario, Kirby, Super Smash Bros, Pokemon, Chrono Trigger, Mega Man, etc. the Wii U will sell like hotcakes.But, I agree, the whole tablet controller gimmick just isn't what Nintendo should have gone for. Whether you hate Nintendo or love it, respect it. They made the most contribution and made gaming the way it is today.[/citation]

I can see the tablet controller actually being pretty sick if implemented right. One thing I'd like to see is an FPS: Nintendo could sell you a plastic rifle built to work with the motion sensor. Imagine a spot right around or opposite the ejection port (depending on your handedness) where the touchscreen could snap into place. It could display your total ammo and the ammo left in the current magazine; it could force you to pull a charging handle or even swap mags to reload; and it could still manage your inventory and let you swap ammunition types or weapons.

If they actually put in the hardware to let it run 3D @ 1080p, that would be some pretty intense immersion for your average home user.
 

shortbus25

Distinguished
May 17, 2009
18
0
18,560
The only people who buy Wiis are gamers with kids; we get them a game console so they can play games too and leave us alone while we're playing. It's not that it's a great console, or whether it's HD or not — it's for the little guys so they don't break our better, adult-type consoles/PCs!!!!
 

nao1120

Distinguished
Mar 27, 2009
31
0
18,580
[citation][nom]sony1978ak[/nom]eeew 768 of ram? you call that HD gaming LMAO hahahahahahaahaha. just like ps3 and 360 512 of ram even 1 gig ram is outdated hardware, how can they call HD gaming ? PC is the GOD OF HD Gaming and it will all ways be[/citation]

This isn't a bad thing. Would you rather dish out $2,000 every 3 years for a PC, or enjoy it for 6? Thankfully most games are made to run on that hardware, forcing developers to code better so we aren't left with a mess that requires top-end PC systems all the friggen time.
 

tanjo

Distinguished
Sep 24, 2011
97
0
18,590
Crap. I thought it was a portable, not just a controller. I assume it will use Bluetooth 4 (for the bandwidth needed to stream video and audio, plus backwards compatibility with other Wii gadgets)?
 

tomfreak

Distinguished
May 18, 2011
176
0
18,630
RAM is cheap these days; 4GB costs only 20 USD. Why would they fit the Wii U with only 768MB? If they fit it with 1.5GB-2GB, it wouldn't cost much more.
 

retrig

Distinguished
Dec 2, 2010
35
0
18,580
[citation][nom]shortbus25[/nom]The only people that buy Wii's are gamers with kids, we get them a game console so they can play games to and leave us alone while we are playing. Its not that its a great console or if they make it HD or not it's for the little guys so they don't break our better adult type consoles/PC's!!!![/citation]
Wrong. I have a PC, so there is no reason to purchase a 360 or PS3, as there are very few must have exclusive titles, and games are better on the PC anyway. Connect a 360 controller if you need a gamepad for the very few genres that benefit from it. Why a Wii? It's all about the exclusives. Mario and Zelda, among others.
 

hetneo

Distinguished
Aug 1, 2011
128
0
18,630
[citation][nom]of the way[/nom]I don't think HD means what you think it means. It really just is a buzzword now though I guess. And you are right, PCs will always reign supreme, if you have the money for it.[/citation]
You can't compare the price of the best PC with the price of the best console; if you do, you arrive at the "if you have the money for it" conclusion. If you compare performance per dollar (or euro, or pound), a PC comes out cheaper than any comparable console. If you want every game to look as good as it possibly can on PC and run better (read: 60+ fps), you need a lot of money. But if you just want games to run as smoothly as they do on consoles — not at maximum video quality, but still much better-looking than on consoles — the PC comes out on top as the cheaper option.
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]leper84[/nom]Imagine a spot right around or opposite (depending on your dexterity) of the ejection port where the touchscreen could snap into place.[/citation]
The only problem is the tablet's a bit large for it.

[citation][nom]nao1120[/nom]forcing the developers to code it better so we don't get left with a mess requiring top end PC systems all the friggen time.[/citation]
This isn't the result. Instead, devs just don't bother to push the envelope, deciding that the "diminishing returns" just aren't worth it. Case-in-point on this philosophy: compare Skyrim to Battlefield 3; the latter looks amazing even on the consoles, because making it PC-first REQUIRES that they do good coding to fit a reasonable amount into the console.

[citation][nom]tanjo[/nom]Crap. I thought it's a portable and not just a controller. I assume it will use Bluetooth 4 (for bandwidth used in streaming video and audio plus backwards compatibility for other Wii gadgets)?[/citation]
If I remember correctly, Bluetooth was used for the Wii, and it's the same standard for the Wii U, to the point where yes, all existing Wii input devices will work with the Wii U right out of the box.

[citation][nom]Tomfreak[/nom]RAM is cheap these days, 4GB just cost only 20USD. Why would they fit Wii U with 768MB only? if they fit it with 1.5GB-2GB it would nt cost much more..[/citation]
Nintendo, historically, has made a terrible habit of giving their machines way too little RAM, to the point where it's almost always the ACTUAL bottleneck on the system, and holding back the CPU and GPU. This has been the case since the NES, really. The flip side is that Nintendo has never been surpassed in RAM speed: the GC's memory actually was faster than the Xbox's, and the Wii, in spite of its weaker CPU & GPU, used the same 1400 MHz GDDR3 as the other two consoles.

[citation][nom]hetneo[/nom]If you compare performance per dollar or euro or pound, PC comes as cheaper than any comparable console.[/citation]
Heck, thanks to Llano, you can readily build a PC more powerful and capable than a console for under $300US.
 

yapchagi

Distinguished
Oct 24, 2010
9
0
18,510
[citation][nom]rohitbaran[/nom]768 MB of RAM seems pretty low. Also, a CPU that is similar to the Xbox 360 CPU? Wow, Nintendo really seems keen to cut production costs by using old dated hardware lol!BTW, showing that remote instead of the actual console isn't really helping in boosting the image of the console Toms. Even at launch, Nintendo weren't smart enough to present it properly and now, nearly everyone is showing that remote pic whenever Wii U is talked about.[/citation]

Yeah, just like we're showing the 360 controller to promote the 360, or the PS3 DualShock to promote the PS3, LOL. Doesn't make sense.
 

zakaron

Distinguished
Nov 7, 2011
20
0
18,560
I'm not too concerned about these specs. Console and PC development have always been different, mainly because of the rigid structure of console hardware that has to conform to a strict price point.

For example: I read about the PS2 architecture and how it compared to the PC, and the PS2 was doing things that PC hardware could barely pull off. RAM was expensive back then, so how'd they pull off the graphics with only 4MB of VRAM? Bandwidth. The bus speed between main RAM and VRAM was something like 24GB/sec — insane speed back then. The PC relies on loading as much texture data as possible into VRAM on the video card; that's why cards have so much memory. But the PS2's method was to constantly swap textures from RAM to VRAM and toss out what wasn't needed. That made for a lot of back and forth, but when you have the bandwidth, you can get away with very little VRAM.
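To see why bandwidth can substitute for capacity, here's a hypothetical streaming-budget calculation using the figures from the post above (the bandwidth number is the poster's claim, and the exercise is illustrative, not actual PS2 behavior):

```python
def mb_per_frame(bandwidth_gb_s, fps):
    """MB of data movable over the bus during a single frame."""
    return bandwidth_gb_s * 1024 / fps

budget = mb_per_frame(24.0, 60)   # claimed bus speed, 60fps target
refills = budget / 4              # against a 4MB VRAM pool
print(f"~{budget:.0f} MB movable per frame (~{refills:.0f} full VRAM refills)")
```

In other words, a bus that fast could in principle rewrite the entire 4MB pool over a hundred times per frame — which is why constant texture swapping was viable.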

Anyway, even the processor is not a concern. A quad-core PPC @ 3GHz is quite fast for a gaming machine; GPUs do most of the work nowadays anyway. Actually, Amiga had the right idea back in the '80s with custom chipsets. An A500 used a Motorola 68000 @ 7MHz, custom chips for graphics, audio, and I/O, and set aside 1MB of chip RAM. The 68000 basically did "management" tasks and let the custom chips work in parallel to do everything else.

What does concern me is the GPU. This is really the heart and soul of a gaming machine, since graphics (especially @ 1080p) take the biggest load. I watched a development diary for Killzone 2 on PS3 years back, and they showed techniques where they actually used several cores of the Cell for graphics work. The RSX chip just couldn't handle everything they needed, so they set aside one core of the Cell to do lighting overlays.

But the bottom line is that SOFTWARE sells a system. Period. I could name a dozen platforms that have gone under over the years, and the most common reason is a lack of good software. That's what makes or breaks a system, no matter how good the hardware is.
 

stupidvillager

Distinguished
Jul 29, 2009
5
0
18,510
This isn't system RAM the rumor is talking about. The rumor states it's embedded RAM, which would make it insanely fast. The Xbox 360 has 10MB. People see a number and, without knowing what they're talking about, say it's not enough. I'm not saying I really believe this rumor, though.
 

STravis

Distinguished
Nov 3, 2009
238
0
18,830
"Additionally, the console may get 768 MB of DRAM that is embedded with the CPU, but shared between the CPU and GPU."

This is the amount of RAM IN the CPU. The article doesn't indicate whether this is the only RAM (i.e., are there external DIMMs?).
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]SLABBO[/nom]Wow, that's a lot of eDRAM. Just as a comparison, Xbox 360 only has 10MB of eDRAM. You guys are mistaking this for just regular DRAM. eDRAM is on the chip...kinda like L1, L2, L3 cache...etc.eDRAM != DRAM.http://www2.renesas.com/process/en/edram.html[/citation]
By some designations, it doesn't necessarily HAVE to be on the same die to be "embedded;" after all, the 360's EDRAM was originally on a separate "daughter die" from the main GPU (though the ROPs *were* on that die). To me, it was mostly a difference in design: EDRAM's structure is built so it can be made on the same fabrication process normally used for CPUs (normal DRAM is made on a process that's incompatible with also making CPUs/GPUs). The GameCube's main memory was 24MB of this sort, even though it was placed in two separate packages on the motherboard, like normal DRAM. (With the Wii, it's placed on the same die as the GPU, which also serves as the memory controller on both consoles.)

[citation][nom]stupidvillager[/nom]This isn't system ram this rumor is talking about. The rumor states it is embedded ram, which would make it insanely fast.[/citation]
Well, the bandwidth is generally more important; regardless of the size, performance would be capped by that. Of course, if it's on-die, a 256-bit (or even 512-bit) interface becomes practical, whereas anything beyond 128-bit is normally implausible to route on a console's motherboard.
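The bus-width point can be made concrete with a hypothetical peak-bandwidth calculation; the 1400 MT/s effective transfer rate below is just an assumed example figure:

```python
def peak_bandwidth_gb_s(bus_bits, mega_transfers_per_s):
    """Peak bandwidth = (bus width in bytes) x (transfers per second)."""
    return (bus_bits / 8) * mega_transfers_per_s / 1000  # MT/s -> GB/s

# Same assumed transfer rate, varying only the interface width:
for bits in (128, 256, 512):
    bw = peak_bandwidth_gb_s(bits, 1400)
    print(f"{bits:3d}-bit @ 1400 MT/s -> {bw:.1f} GB/s")
```

Doubling the interface width doubles peak bandwidth at the same clock, which is why an on-die 512-bit bus can dwarf anything routable across a motherboard.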

[citation][nom]STravis[/nom](ie..is there External DIMMs?)[/citation]
DIMMs add significantly more cost and complexity versus soldering the RAM chips right onto the motherboard; hence the latter is the only method consoles have ever taken.
 

back_by_demand

Distinguished
Jul 16, 2009
1,599
0
19,730
[citation][nom]eddieroolz[/nom]Basically the Wii will hold back the console market yet again.[/citation]
Really? You think that Nintendo releasing an underpowered product will somehow exert peer-pressure on Sony and Microsoft to underpower their own products?
...
Not likely. The power required to push out an HD game is relatively old hat, technology-wise; compared against GPUs, a graphics card from four generations ago could manage it with ease. The next market is all based on gimmicks like Kinect, rather than pure pixel-power.
 
Guest
Come on. I expect more from Tom's Hardware. The Wii U CPU is based off i7 quad cores @ 3GHz+ in early dev kits. Now, we should all know an i7 outperforms the older CPU found in an Xbox 360, plus the Wii U has 4 cores, not 3. The Wii U's processor vastly outperforms the Cell processor family too. Developers hate Cell. The i7 family runs more efficiently/faster, has its own cache memory, and hyper-threads great, with advanced boost speeds.
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]Fan007[/nom]The Wii U CPU is based off i7 Quad cores[/citation]
Have you found any source to cite for this? I've seen nothing to indicate the Wii U uses any x86 CPU. Also, ALL CPUs these days "have their own cache memory." The 360's Xenon had 1MB of L2 cache, that was shared between its three cores.
 

9_breaker

Distinguished
Jul 24, 2011
31
0
18,580
[citation][nom]brickman[/nom]45nm, that's old tech. Also a higher nm makes for a hotter running CPU?[/citation]

Gulftown uses 45nm, and there's nothing wrong with it except the price.
 