@hakesterman
http://www.anandtech.com/video/showdoc.aspx?i=2453&p=9
"The PlayStation 3’s RSX GPU shares the same “parent architecture” as the G70 (GeForce 7800 GTX), much in the same way that the GeForce 6600GT shares the same parent architecture as the GeForce 6800 Ultra."
"Remember that the RSX only has a 22.4GB/s link to its local memory bandwidth, which is less than 60% of the memory bandwidth of the GeForce 7800 GTX. In other words, it needs that additional memory bandwidth from the Cell’s memory controller to be able to handle more texture-bound games. If a good portion of the 15GB/s downstream link from the Cell processor is used for bandwidth between the Cell’s SPEs and the RSX, the GPU will be texture bandwidth limited in some situations, especially at resolutions as high as 1080p"
I compared the PS3's videocard to the 7800 series of Nvidia graphics cards, going from memory, and since there's no exact desktop equivalent I'd say that was pretty fair. The 7800 GTX doesn't match the PS3's videocard feature-for-feature, and each one is superior or inferior depending on where you compare, but overall it's pretty much a wash... and it's my opinion that a videocard comparable to a 7800 is pretty darn weak. Especially with only 256MB of RAM AND low memory bandwidth working against it. Actually, I chose that second quote on purpose because it does a good job of suggesting that the PS3's videocard is actually inferior to desktop 7800 cards, since we all know that memory bandwidth bottlenecks destroy performance at reasonable resolutions. At the very least it helps explain why the PS3 will always struggle like hell at 1080p.
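For what it's worth, the numbers in that quote check out, and a little arithmetic makes the 1080p point concrete. A quick sketch (the 38.4GB/s figure for the desktop 7800 GTX is from my memory of its spec, so treat it as an assumption):

```python
# Sanity-check the "less than 60%" bandwidth claim.
# Assumed spec: GeForce 7800 GTX local memory bandwidth = 38.4 GB/s
# (256-bit bus, 1200 MT/s GDDR3); RSX local bandwidth = 22.4 GB/s.
rsx_bw = 22.4
gtx_bw = 38.4
print(f"RSX has {rsx_bw / gtx_bw:.0%} of the 7800 GTX's local bandwidth")

# Per-pixel costs (fill, blending, texture fetches) scale with pixel
# count while memory bandwidth stays fixed, and 1080p has 2.25x the
# pixels of 720p -- that's the squeeze.
ratio = (1920 * 1080) / (1280 * 720)
print(f"1080p / 720p pixel count: {ratio}x")  # 2.25x
```

So a GPU that's already bandwidth-starved at 720p is being asked for more than double the per-frame traffic at 1080p.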
Blu-ray has poor read speed. Its advantage is that you have lots of space for big textures, but the PS3's videocard can't handle big textures for the reasons already discussed. *my conclusion* Blu-ray isn't sweet... it actually kinda sucks. At least when used to play games.
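To put "poor read speed" in rough numbers (the 2x BD and 12x DVD drive speeds are my recollection of the stock drives in each console, so treat them as assumptions):

```python
# Rough drive throughput comparison, using the base transfer rates:
# 1x Blu-ray is about 4.5 MB/s (36 Mbit/s), 1x DVD about 1.385 MB/s.
# Assumed drives: PS3 with a 2x constant-speed BD drive, Xbox 360
# with a 12x DVD drive that only hits its rated speed at the disc's
# outer edge.
BD_1X_MBPS = 4.5
DVD_1X_MBPS = 1.385

ps3_bd = 2 * BD_1X_MBPS           # ~9 MB/s, constant across the disc
x360_dvd_peak = 12 * DVD_1X_MBPS  # ~16.6 MB/s at the outer edge
print(f"PS3 Blu-ray: ~{ps3_bd} MB/s")
print(f"360 DVD peak: ~{x360_dvd_peak:.1f} MB/s")
```

Even granting the 360's drive slows toward the inner edge, the BD drive's ceiling is lower, which is why the big-disc advantage doesn't translate into faster loading.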
And as others have discussed here, the CPU kinda sucks too (the Cell has only one general-purpose core, the PPE; the rest are specialized SPEs, and it's hard to program in a way that actually utilizes the processor efficiently).
But I guess I'm the moron, eh? /sarcasm. Darn fanboy, you're defending an abomination as far as hardware choices go... I don't have time right now to make the full argument behind a statement like "The Xbox 360 has a superior graphics card to the PS3" (it does). I was just saying that the PS3's graphics card "sucks". And I guess that's relative to what you believe is acceptable performance. I think anything worse than a 50 dollar 8800GS pretty much sucks, which would mean I think the 360's graphics card "sucks" too.
But, looping back to the point, it's old graphics hardware that everyone already knows how to optimize for, that's already being pushed to 100% of its potential with nothing left to squeeze. And compared to what's available for desktop computers, the graphics card sucks. Actually the whole unit is a piece of shit that seems slapped together by someone just trying to use as much expensive proprietary crap as possible without actually increasing the device's ability to play games... which I thought was the entire purpose of the thing.
-Frankly, I'm amazed they get the level of performance out of the thing that they do!
That said, I might get one to play Assassin's Creed II on. Games are still about fun, not graphics or hardware superiority.