So I'm getting voted down for pointing out that Tom's commenters can't read the articles? The Xbox 720 isn't likely to have an AMD CPU. Anyone with more than basic literacy skills would see that both this article and the original said AMD *GPU*, not *CPU*. It's not my fault most of you won't read.
Nor is it my fault if you want to believe that the next console will magically be able to sell high-end computer parts for $300 US. Historically, even the higher-end consoles have only reached about mid-range parts at the time of their release, which puts the extreme stuff out of the question. And I doubt Microsoft is going to eat a huge loss on the hardware: after all, the current (7th) generation was won by a console with laughably weak hardware, and Microsoft knows it.
[citation][nom]laxin84[/nom]If 360 is left on 1080 when all of consumer electronics starts making a bump to 3D 1080p or 4320p, it'll be a clearer win for Sony this time. They should be careful about jumping early for once. Sony could also potentially get in with 3D optical media, meaning 100GB+ blu-rays.I can say this for sure tho, twin GPUs most likely means 3D support out of the box (makes stereoscopic rendering a heckuva lot easier...)[/citation]
Remember that the power required to render graphics is proportional to the number of pixels. Even under ideal conditions, the 360 and PS3 only manage 0.92 megapixels (720p) for top-shelf games. To render the same games at 4320p (33.2 MP) would require a machine 36 times as powerful. Even being generous and allowing for FOUR generations of Moore's law, that still means beating the curve by +125%. Given that the Xbox 360 and PS3 were two of the most aggressively-powered machines out there (as two of the three consoles in history to sell at a loss upon release, alongside the original Xbox), thinking that Sony (or Microsoft) would go FURTHER has no logic to it. If anything, given that both companies failed to grab the market dominance they'd expected (that went to Nintendo), I'm betting both will dial back and produce machines that are profitable even at $300-400 US.
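If you want to check the scaling argument yourself, here's a quick back-of-envelope sketch using the numbers above (the four-doublings figure is the generous Moore's law allowance I'm granting):

```python
# Back-of-envelope check of the pixel-scaling argument.
px_720p = 1280 * 720    # ~0.92 MP, the typical 7th-gen render target
px_4320p = 7680 * 4320  # ~33.2 MP, so-called "4320p"

scale = px_4320p / px_720p        # pixel throughput multiplier needed
moore = 2 ** 4                    # four Moore's-law doublings = 16x
beyond_curve = scale / moore - 1  # fraction by which the need beats the curve

print(round(scale), moore, f"{beyond_curve:.0%}")  # prints: 36 16 125%
```

In other words, even spotting the hardware four full doublings, 4320p still demands more than double what the curve delivers.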
Also, dual-GPU setups don't really make 3D any easier; modern video cards are fully capable of rendering multiple buffers at once. That's how they accomplish effects such as reflective water: by rendering the scene twice. And while games are getting larger, the need for over-sized optical media is overblown: the only reason any 360 games have needed more than one disc so far is high-definition pre-rendered FMVs.
[citation][nom]kastraelie[/nom]You might be forgetting the unit cost (vendor) of a 6970 is $80[/citation]
Erm, what source can you cite for this? I'm going to attach a [citation needed] here.
[citation][nom]dragonsqrrl[/nom]Damn. And that's why you sweep the existing comments before you post...[/citation]
In my defense, because the comments have no "multi-quote" function, I just quote and reply as I go along. I did notice the extra comment already, though, and gave credit.
[citation][nom]supertrek32[/nom]Don't forget about all the driver optimizations which go into consoles. A console 6850 will blow a PC 6850 out of the water. Looking at hardware isn't going to tell you the whole story.[/citation]
Actually, no, that's not the case. There's no such thing as those magical "driver optimizations." Console games don't have to contend with the background bloat found on PCs (something astute enthusiasts clean out before gaming anyway), but given that DirectX, by its nature, optimizes everything, a console's GPU is NOT at any power advantage.
The only real advantage is that the developers hand-pick a single settings profile for the user, and an astute enthusiast on the PC can do the same for themselves. To make games look like they "do everything" on the console, devs resort to the following:
* Capping the framerate - No top-shelf title for either the 360 or PS3 runs at 60fps. Typically they cap at 30fps, though in some cases it's as low as 24 or 20. That outright lowers the bar.
* Using lower-res textures - This is painfully apparent in so many games. Of course, it's to be expected when 1-2GB of memory is the norm on PCs while a console is limited to about 256MB.
* Cutting LOD distances - Because the resolution is fixed, LOD and draw distances are scaled back JUST enough that most players won't notice the reduced view compared to the PC version, which will be running at 1050p or higher instead of 720p.
* Skimping on resolution or AA - Only a handful of top-shelf 360 titles have had BOTH a full 1280x720 resolution and AA. The only PS3 title to use AA at all was Final Fantasy XIII. Skyrim on the 360 renders at 1024x576, a mere 64% of the pixels of 720p.
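That 64% figure for Skyrim isn't hand-waving; it falls straight out of the pixel counts:

```python
# Pixel-count comparison: Skyrim's Xbox 360 render target vs. full 720p.
px_720p = 1280 * 720    # 921,600 pixels
px_skyrim = 1024 * 576  # 589,824 pixels actually rendered

print(f"{px_skyrim / px_720p:.0%}")  # prints: 64%
```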
So an astute enthusiast who KNOWS what they're looking at when watching gameplay, rather than just assuming, can produce comparable results on the PC with much, much weaker hardware than expected. As I said before, the 360's GPU is closest to a Radeon 3650, and the PS3's falls somewhere between a 7800GS and an 8600GT.
That, and there's no real information to substantiate the usual rumor that "the console will have the best PC GPU around!" We saw this with the 360: at first it was described as an X800XT, then an X1800XT, an X1900XT, and eventually even an HD 2900XT, even though that last card came out well after the console did. And all of the above were more potent than the 360's actual GPU (though the X800 only barely).
My predictions, which are based on a bit more logic, put the 720 at using something comparable, at best, to a Radeon 6700-series part, but more likely closer to a 6500/6600.