[citation][nom]shin0bi272[/nom]so if you wanted any eye candy above what dx9 provides ... sorry.[/citation]
Actually, even though the PS3 and 360 DO have DX9-level capability, they STILL won't match what you'd see on a PC at a similar setting, simply because the consoles don't have the power to drive it all. They're cut-down GPUs, made so that they could be cheap enough to build by the millions and sell at a console-level price.
[citation][nom]TA152H[/nom]The only issue would be if they started on the consoles and did a simple port, but it sounds like they have separate development teams for each platform, so presently is probably not a concern.[/citation]
The problem here is that countless other developers have given the same song-and-dance before, and in the end it usually isn't what they claim; they in fact wind up making a console game and more or less porting it to the PC. (a prominent example being Bethesda with Fallout 3) Many developers will later tack on additional excuses, such as claiming that "they can't make money on PC due to piracy, and consoles are pirate-proof."
So you do have to grant the doubters a bit; it's quite possible that the "separate development" amounts to little more than a separate interface team, plus whatever other groups are needed to port the game as it's worked on; after all, it's MUCH cheaper to port a game that's already made than to build the same game twice in parallel. And virtually all studios are driven by greed.
[citation][nom]cwolf78[/nom]I'm not sure where you're getting your facts, but the RSX is not "simply a cut-rate G71." It's not the same exact chip as some minor modifications were made at Sony's request - not to mention the RSX is designed to make use of the significantly higher bus bandwidth available on the PS3 as compared to the first-gen PCIe bus of that time.[/citation]
Actually, bandwidth differences are a mixed bag; the FlexIO interface could arguably be faster than PCI-e 1.0, although the figures usually quoted don't account for overhead and are uni-directional. It's also quite arguable that such amounts are overkill. (with cards of that era, it was shown that, for instance, running SLI on 2x8-lane slots instead of 2x16-lane slots made no performance difference) It could be argued that the PS3 might need this bandwidth, though: many of the games put on it wind up overflowing its 256MB video buffer and having to spill into some of the Cell's XDR memory. The fast interface means it suffers a far smaller penalty there than a PC would in a comparable situation, but that doesn't change the fact that it's still a situation one tries to AVOID. (since it still hurts performance, even if not as much)
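Just to put rough numbers on that (these are the commonly cited raw figures, so treat them as ballpark values that ignore protocol overhead, not measured throughput):

[code]
# Rough comparison of raw CPU<->GPU link bandwidth (commonly cited figures;
# real-world numbers are lower once protocol overhead is accounted for).
flexio_write_gbs = 20.0                  # Cell -> RSX, raw, one direction
flexio_read_gbs  = 15.0                  # RSX -> Cell, raw, one direction

pcie1_lane_gbs = 0.25                    # PCIe 1.0: 250 MB/s per lane, per direction
pcie1_x16_gbs  = pcie1_lane_gbs * 16     # 4.0 GB/s each way
pcie1_x8_gbs   = pcie1_lane_gbs * 8      # 2.0 GB/s each way

print(f"FlexIO (raw):  {flexio_write_gbs} GB/s write, {flexio_read_gbs} GB/s read")
print(f"PCIe 1.0 x16:  {pcie1_x16_gbs} GB/s per direction")
print(f"PCIe 1.0 x8:   {pcie1_x8_gbs} GB/s per direction")
[/code]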
The downside is that, memory-wise, the RSX has half the memory interface of the G71; it's 128-bit instead of 256-bit. (GPUs of that era used exactly one memory chip for every 32 bits of interface) This basically halves its memory bandwidth, to 22.4GB/sec, which puts it on a par with the G73-based GeForce 7600GT rather than the 7800 cards. Further underscoring this limitation, there are only 8 ROPs on the RSX, compared to the 16 of the full G71.
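For reference, the peak-bandwidth math is just bus width times effective transfer rate; here's a quick sketch (the memory clocks are the typical reference-board values as I recall them, so double-check the exact figures):

[code]
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
def mem_bandwidth_gbs(bus_bits, effective_mts):
    return (bus_bits / 8) * effective_mts / 1000.0   # GB/s

print(mem_bandwidth_gbs(128, 1400))   # RSX:             128-bit @ 1400 MT/s -> 22.4 GB/s
print(mem_bandwidth_gbs(128, 1400))   # GeForce 7600GT:  same interface      -> 22.4 GB/s
print(mem_bandwidth_gbs(256, 1200))   # GeForce 7800GTX: 256-bit @ 1200 MT/s -> 38.4 GB/s
print(mem_bandwidth_gbs(256, 1600))   # GeForce 7900GTX: 256-bit @ 1600 MT/s -> 51.2 GB/s
[/code]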
[citation][nom]cwolf78[/nom]But even if it were exactly the same as a PC-variant G71, those have no problems running HDR and MSAA simultaneously. Take FF13 for example. The mode it runs at on the PS3 is 1280x720 at 2X MSAA. And that game does make use of HDR lighting as well. (http://www.eurogamer.net/articles/ [...] off?page=1) I think you might be referring to the issues the Source engine had on Nvidia cards when they first added HDR support (HL2: Lost Coast). When that first came out, only ATI cards could run MSAA with HDR, whereas only one or the other could be enabled with an Nvidia product. But they eventually got MSAA+HDR working on Nvidia within a few patches.[/citation]
Lost Coast was a unique case, actually; it used an "older" HDR method that didn't require DirectX 9.0c, which allowed it to run on older Radeon 9- and X-series cards that only had DirectX 9 and DirectX 9.0b support. That older algorithm actually ran worse (taking both more shader power and more memory bandwidth), though, and was only feasible then because the 9- and X-series cards (especially the X800/X850 ones) had unusually strong DX9 shader capabilities, and at the time, Valve was REALLY favoring ATi over nVidia. This method also arguably looked worse than the more "modern" DX 9.0c HDR, too.
As for other games, the AA+HDR problem was never resolved with the G7x-based cards. For the G7x GPUs (including the RSX) the limitation was in hardware, not software, and at the most basic level: modern HDR algorithms (which are what the consoles use) rely on blending 16-bit floating-point color channels; however, the G7x cannot stack that blend, so if it's used for HDR, it cannot also be used for AA, which ALSO requires a blend pass to combine each sub-pixel's samples. So technically, while one CAN force the AA algorithm on such a GPU, one won't see any effect; because no blending step took place, you'd only see ONE of the 2-4 samples that were taken for the pixel. (careful inspection might show perhaps a very minor "warping" effect, since AA sub-samples are taken off-center; normally a pixel gets samples from all its portions rather than just one)
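To illustrate why that missing blend step matters, here's a toy sketch of a 4x multisample resolve (purely illustrative Python, nothing to do with actual G7x hardware): skip the averaging and you just get one of the four sub-samples back, i.e. an edge exactly as aliased as with AA off.

[code]
# Toy 4x MSAA resolve: a pixel on a bright edge holds 4 sub-samples (gray levels).
def resolve(samples):
    # The "blend" pass AA depends on: average all sub-samples together.
    return sum(samples) / len(samples)

def no_blend(samples):
    # What you effectively get if the blend can't be applied: one arbitrary sub-sample.
    return samples[0]

edge_pixel = [1.0, 1.0, 0.0, 0.0]   # half the sub-samples covered by the bright edge
print(resolve(edge_pixel))          # 0.5 -> smoothed edge
print(no_blend(edge_pixel))         # 1.0 -> hard edge, as if AA were never on
[/code]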
As far as FF XIII goes, you've corrected me on one thing, but it wasn't on HDR; I look at those videos and I can tell you, I don't see any HDR in the game; I see bloom, and lots of it. The lack of HDR in the comparison video (on EITHER side) is glaring; the camera makes a lot of transitions between areas with different light levels, where HDR WOULD'VE adjusted the exposure level to yield the whole "dynamic" effect. So apparently Square Enix opted for AA over HDR this time; I was mistaken to say "always," then. However, this still underscores my point, that the PS3 cannot do AA+HDR at the same time.
[citation][nom]cwolf78[/nom]That being said, I've been impressed with what they've been able to do with the PS3 as of late - using the additional Cell cores to add graphical features and effects and quality of those effects far beyond what the RSX alone is capable of rendering (such as in Uncharted 2). Saying that the PS3 is limited to DX9 level effects is not entirely correct, especially if Crytek is stating that they are making full use of all the hardware on each respective platform.[/citation]
To be honest, I've been a bit wary of some claims about using "all the PS3's power." Granted, using the Cell, one could use software to produce effects outside the scope of what the GPU could handle; theoretically any effect could be produced. (this is how effects were done on PCs before video cards were broadly standardized, and in fact is how graphics were originally done on PCs, back before the GPU ever existed) However, that comes with a severe caveat: it takes a lot of CPU processing power because, unlike the GPU, its circuitry isn't specifically designed for that sort of thing. Similarly, while the Cell has plenty of extra math power to spare due to its design, it is a tad short on instruction-handling capability; the SPEs are, in effect, "fast but dumb": they require the PPE to instruct them, otherwise they sit idle.
This makes them worthless for running the main game core, the main task of the CPU; this is why they typically go largely unused in games. However, if a big SIMD (single instruction, multiple data) task comes up, it will be a big enough load with few enough instructions that the PPE can keep the SPEs occupied for a while, leaving it time to do its own tasks after instructing the SPEs. This happens constantly if the chip is handling media, such as a Blu-ray movie; most instructions will entail thousands, if not millions, of math operations before the next instruction comes up. In that case, it will in fact be the PPE that sits idle most of the time, with only its comparatively weak math units (weaker than those of a single SPE) being used. Hence, the Cell's "hybrid" design was aimed not so much at keeping the entire chip busy, but at letting it act as either of two different chips, whichever the task at hand suited better, like a square block that could morph into a triangle to fit whichever hole it needed to go through.
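A rough sketch of that distinction (plain hypothetical Python, not actual Cell/SPE code): the SIMD-friendly case needs one piece of control work up front and then grinds through a huge batch of data, while the instruction-heavy case needs a fresh decision for every element, which is exactly where a control-oriented core like the PPE becomes the bottleneck.

[code]
# SIMD-friendly: one operation, applied to a huge batch of data.
# A PPE-style core sets this up once; SPE-style math units then chew
# through it for a long time without further hand-holding.
frame = [0.5] * 1_000_000            # e.g. a frame's worth of samples
gain = 1.2
processed = [s * gain for s in frame]

# Instruction-heavy: every element needs its own branching/decision work,
# so the control core is involved constantly and the math units sit starved.
def game_logic_step(entity):
    if entity["hp"] <= 0:
        return "remove"
    if entity["sees_player"]:
        return "attack"
    return "patrol"
[/code]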
Of course, it still is POSSIBLE to try to get it all used, but it's a dicey proposition. Some graphical side-effects are a safer bet than others, because they lean more toward SIMD than toward traditional instruction-heavy core functions. Though since the GPU can already handle all the simpler graphics functions without breaking a sweat, that limits the functions you'd want to try to the more complex ones. I'm definitely not saying it can't be done, (it obviously has been) but it is certainly something to be cautious with: once you factor in all the read/write operations to shuttle the data between the GPU and CPU sides, and then process the pre-effect frame and work over it, you could run out of PPE power very fast, slowing the whole game down to the point of being unplayable. My estimate is that you could probably get two, maybe three full-screen effects out of it, depending on complexity; a nice visual boost, but perhaps equivalent to only a 20-30% improvement over the base GPU power. (plus the benefit of circumventing the GPU's own hardware limitations)
[citation][nom]techguy378[/nom]At the time of its release the PS3 had the most advanced graphics of any desktop computer or gaming console.[/citation]
Actually... it was outdated before it even came out. Re-read all that I've said about it being a slightly-cut-down G71, which is the widely accepted description of the core. That means the higher-end, full-blown G71 cards (such as the 7900GTX) that were out a few months before the console had quite a bit more core GPU power, as well as VASTLY better memory bandwidth, since they had full 256-bit memory interfaces compared to the gimped 128-bit one on the RSX. (and that's to say nothing of the Radeon X19x0-series cards, which were beyond even that)