[citation][nom]Anomalyx[/nom]Sigh... there should be a requirement of at least a basic understanding of the differences between console and PC gaming before anyone posts a comparison. Yes, if you tried to run a PC game on the same grade of hardware that a PS3/360 has, it would suck. PS3 only has 256 MB of RAM! Try even running Windows with that. Consoles get away with it because they can develop the graphics to directly interface with the graphics cards, since they don't have to worry about everyone having a different card model. For PC gaming, they have to go through the DirectX interface for absolutely everything, which slows it waaaaaaaaaaaaaaaaaaaaaaaaaay down. Not to say DirectX is bad, but it's the price paid for having the choice between hundreds of different graphics cards. Summary: yes, the raw hardware in consoles is very old, but with the way console games can be developed, it's more than they'll ever need.[/citation]
That's true, but then explain one thing: why do the graphics still suck, and why do all games have to run at 720p to get playable framerates? Why is it that when I compare Mass Effect 2 on the PC and on the 360, the former looks like a totally different, far superior game? (Hell, it is!)
Because, despite that direct access to the hardware, the hardware's limit HAS been reached, no matter what they claim. They can keep insisting it hasn't while serving up crappy graphics, or they can give us a new generation of these pathetic excuses for a gaming platform and stop embarrassing themselves any further.