nottheking
[citation][nom]rohitbaran[/nom]Also, a CPU that is similar to the Xbox 360 CPU? Wow, Nintendo really seems keen to cut production costs by using old dated hardware lol![/citation]
The CPU is the one factor mentioned that may be accurate... but that would be more a lucky guess than actual inside information.
Keep in mind that both the 360 and the PS3 have their CPUs for a reason: NEITHER was designed on the assumption that games would need all of that raw CPU power. In fact, in cross-platform 360/PS3-to-Wii ports, the concessions tended to be almost entirely in the graphics department, owing to the Wii's weak GPU. The 360 really only needs its CPU muscle to emulate the original Xbox; the PS3, likewise, used it to emulate the PS2 and to accelerate Blu-ray playback.
The Wii, on the other hand, didn't need to emulate anything, since it was natively backwards-compatible with the GameCube, much like how a Core i7 will run software written for a Pentium 4. Similarly, the Wii U will have the same native backwards compatibility with both the Wii and the GameCube, so again, software emulation won't be necessary.
Of course, a similar tale will apply to the Xbox 720 and PS4: both will ostensibly reuse the same basic architecture, so they won't have to emulate the prior console. Hence, neither will boast a huge upgrade in CPU power; this will primarily be about saving costs, as Microsoft and ESPECIALLY Sony felt the burn of launching a console that lost money at release.
[citation][nom]dimar[/nom]Fine, maybe I overdid the specs a little. All I want is to play games at 1080p with nice AA and 60 fps (nice and smooth, without screen tearing).[/citation]
Keep in mind that the standard for 3D games on a console is 30fps, not 60fps. Off the top of my head, I can't name any non-sports games on the 360 or PS3 that run at 60fps.
I'll add the qualification that the Wii, by contrast, did standardize its games at 60fps; of course, that was at a mere 720x480, without any AA, and with generally much lower detail.
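Just to put numbers on the 30-vs-60fps point, here's a minimal sketch of the per-frame time budget; it's plain arithmetic, with nothing console-specific assumed:
[code]
# Per-frame time budget at the two common console frame-rate targets.
for fps in (30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms to finish all CPU and GPU work per frame")

# 30 fps leaves ~33.3 ms per frame, while 60 fps halves that to ~16.7 ms, which is
# why most 3D titles on the 360/PS3 settled on 30 fps and why the Wii's simpler
# 720x480, no-AA workloads could fit into the tighter 60 fps budget.
[/code]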
[citation][nom]WolvenOne[/nom]Essentially, between confusing reporting, vague reporting, and outright technical impossibilities, we can definitively state that we know nothing more from this rumored leak than we did before.[/citation]
This sums it up quite perfectly. +1.
Though to address a couple other points:
- I'd actually missed any news that the dev kits had 1GB of RAM. Do you have a link for my future reference, mayhap?
- Yeah, the "EDRAM" part is definitely completely out there. I think it demonstrates that the writers were trying too hard and don't understand what EDRAM is. Also, even 96MB would be way too large to embed; while EDRAM may be more compact than SRAM, it's still pretty bulky; at 45nm this would still be larger than the 10MB EDRAM found in the Xbox 360's GPU.
- You forgot to mention the "compelling reason" for separate memory pools. I'm assuming you were going to bring up bandwidth, or possibly the secondary concern of latency (though GPUs tend not to suffer much there, hence the GDDR varieties that trade tighter timings for higher clock rates; see the second sketch below).
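On the eDRAM point, here's the back-of-envelope sketch I mentioned. The baseline numbers are my own assumptions (the 360's 10MB eDRAM on a 90nm process, with bit density scaling roughly as the square of the linear shrink), but they show why 96MB of embedded memory doesn't pass the smell test:
[code]
# Back-of-envelope: would 96 MB of eDRAM plausibly fit on-die at 45 nm?
# Assumed baseline: the Xbox 360's 10 MB eDRAM block, originally on a 90 nm process.
xbox360_edram_mb = 10
xbox360_node_nm = 90
rumored_edram_mb = 96
rumored_node_nm = 45

# Crude scaling rule: bit density grows roughly with the square of the linear shrink.
density_gain = (xbox360_node_nm / rumored_node_nm) ** 2   # ~4x more bits per mm^2
capacity_ratio = rumored_edram_mb / xbox360_edram_mb      # ~9.6x more bits needed

relative_area = capacity_ratio / density_gain
print(f"96 MB at 45 nm would need roughly {relative_area:.1f}x the silicon area "
      "of the 360's 10 MB eDRAM block")
# Result is ~2.4x: even granting a full two-node shrink, the pool would still dwarf
# the 360's already-sizeable eDRAM die, which is why the rumored figure looks bogus.
[/code]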
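And on the latency point, a small illustrative comparison; the timings and clocks below are representative assumptions rather than actual part specs, but they show how graphics memory trades looser cycle timings for higher clocks, leaving absolute latency roughly a wash:
[code]
# Illustrative only: CAS latency in nanoseconds = CAS cycles / command clock (GHz).
# The figures below are assumed, representative values, not datasheet numbers.
def cas_latency_ns(cas_cycles: int, command_clock_ghz: float) -> float:
    return cas_cycles / command_clock_ghz

ddr3_like = cas_latency_ns(cas_cycles=9, command_clock_ghz=0.667)   # ~13.5 ns
gddr_like = cas_latency_ns(cas_cycles=15, command_clock_ghz=1.25)   # ~12.0 ns

print(f"DDR3-ish system pool: ~{ddr3_like:.1f} ns CAS latency")
print(f"GDDR-ish graphics pool: ~{gddr_like:.1f} ns CAS latency")
# The looser timings (more cycles) are largely cancelled by the faster clock, so the
# real argument for a split pool is the graphics memory's bandwidth, not its latency.
[/code]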