It's not better, it's just simpler and platform independent (hardware PhysX needs an NVIDIA GPU). Just look at the old "Tackle Alley" video for Backbreaker; Havok can't do that, but Euphoria (Backbreaker's engine) and PhysX can.
The PS3 COULD have been a viable (PC-lite) platform... but sadly, it was crippled by insufficient main memory. I installed Linux on mine and it was PAINFULLY slow. In the PS3's native environment, they further crippled PC-like functionality by making the keyboard and mouse operate in about as unintuitive and useless a way as could possibly be contrived. That, at least, could be fixed (it hasn't been yet to my knowledge), but the memory problem can't.
Physics, on the other hand, is right up the Cell's alley... the PS3 SHOULD finally be able to lever some differential advantage from the Cell investment if this takes off (and if they can keep the Cell's pipeline full... which is tough since THERE'S NOT ENOUGH MAIN MEMORY!).
Remember, in Windows, a 32-bit game can only address a maximum of 2GB of user space (the default limit for 32-bit processes).
Given that most console games are written in very low-level languages to get the most out of the hardware, the memory you would need to run a game on the PS3 is far less than the amount you would need in Windows.
My point, then, is that most OSes are unoptimized, coded in a way that guarantees compatibility over performance. Consoles, on the other hand, are built the other way around, made to make efficient use of system resources.
For example, in Windows (or almost any computer OS), if you perform an arithmetic operation, that naturally takes up memory. And I know for a fact that even when a variable is no longer needed, few bother to actually deallocate it from RAM (part of the reason being how few languages in common use support manual memory deallocation). For consoles (or any lightweight embedded software), you make sure to clean up after yourself. If something's not being used, it gets cleaned up to free resources.
In short, what you need 2GB for in Windows could easily be accomplished with half that amount, if the OS were optimized and programmers made sure to clean up after themselves when done. I'd argue that Far Cry 2 could be modified to run in 512MB of RAM without any major performance loss, if anyone ever wanted to put that much effort into the work.
I agree that for the PS3, its main memory is enough for console games with limited or no programs running in the background. But for using the PS3 as a PC-lite (as stated by Dave K) with Linux or another OS... having more RAM in the PS3 would definitely make our lives much easier.
Back on topic, I'm fully expecting an announcement by the Backbreaker team regarding Backbreaker by month's end. That will be the game that determines what goes on with PhysX, hence the partnership with NVIDIA.
[citation][nom]gamerk316[/nom]And your point with Main Memory? They use such a light OS, they don't need 2GB just to run the thing (I'm looking at you Windows).[/citation]
The point is that with 256 meg of CPU memory, the PS3 is ill-suited for general processing tasks, which is a shame because in every other respect it is reasonably capable. With even 1GB of main memory the PS3 would have made an excellent Linux box (well... it would have if Sony hadn't chosen to block access to most of the GPU). Furthermore, the Cell processor's ability to pipeline huge quantities of data is going to be hampered by that lack of memory... the system is likely to have problems keeping it fed.
The result is that I will be surprised if any game producer is ever able to fully leverage the Cell's capabilities... which is a shame.
This just sounds like more of NVIDIA rubbing Sony's butt. Obviously they are happier with Sony than they were with MS on the original Xbox, all because MS wouldn't let NVIDIA raise prices on their GPUs even though the price was set in the contract... The more I read about NVIDIA, the less I like them as a company. They aren't an 800 lb gorilla as they were once described, but more like an 800 lb baby that cries when deals don't go their way, and when they do, sucks up as much as it can.
[citation][nom]gamerk316[/nom]My point, then, is that most OSes are unoptimized, coded in a way that guarantees compatibility over performance. Consoles, on the other hand, are built the other way around, made to make efficient use of system resources. For example, in Windows (or almost any computer OS), if you perform an arithmetic operation, that naturally takes up memory. And I know for a fact that even when a variable is no longer needed, few bother to actually deallocate it from RAM (part of the reason being how few languages in common use support manual memory deallocation). For consoles (or any lightweight embedded software), you make sure to clean up after yourself. If something's not being used, it gets cleaned up to free resources. In short, what you need 2GB for in Windows could easily be accomplished with half that amount, if the OS were optimized and programmers made sure to clean up after themselves when done. I'd argue that Far Cry 2 could be modified to run in 512MB of RAM without any major performance loss, if anyone ever wanted to put that much effort into the work.[/citation]
Modern development tools have very detailed memory management approaches. Develop in .NET and you get variable scoping at a very granular level (within virtually ANY element you can have a memory scope); when a variable goes out of scope it becomes eligible for collection and its memory is freed. If your program is properly structured... it won't have any dead variables lying around taking up space. Decisions can be made to scope variables globally... but those decisions tend to be made for performance reasons, so those variables should be increasing code efficiency, not hurting it.
Furthermore... the problem the PS3 has with lack of memory has less to do with optimization and more to do with the nature of the beast. Games tend to deal with processing large blocks of data, and large blocks of data take up a LOT of memory.
The throughput of a Cell processor (or better yet... a group of them) working on a large array operation is pretty phenomenal... and all that data has to come FROM somewhere and go TO somewhere. That somewhere is memory... and the PS3 ain't got enough to allow game makers to allocate large buffers to keep the Cell fed. That means either a limit on the volume of data, or lots of paging to disk. Limit the volume of data and you limit your options: simpler textures, less complex physics. Page to disk and the paging operation becomes your bottleneck, leaving the processing system waiting for work.
Additionally... there have been plenty of studies around the idea that people can 'hand optimize' to improve on the code produced by modern compilers. The result: in VERY SELECT cases, hand-optimized code can perform better than compiler-optimized code... but in general it does not. Modern processors are simply too complex for most programmers to optimize for effectively. Programmers COULD have optimized Far Cry 2 to run in 512 meg of memory, but not without making tradeoffs in performance, and why would they want to? When memory costs a few bucks a gig, programmers are (as they should be) working to most effectively leverage the amount the target system has (which for a new gaming rig these days will be 4-8 gig)... there is no free lunch in programming; forcing the entire Far Cry 2 package to run in 512 meg would be nonsensical. Is Far Cry 2 the most efficient code out there? Doubtful... but unless the technical direction on the project was totally incompetent, there were design decisions behind every choice to increase the memory footprint.
It's popular for the uninformed to complain about the memory requirements of modern operating systems (they tend to pick on Microsoft because it's easy, I think). The fact of the matter is that modern operating systems are designed to run on modern hardware... Vista 64 runs like gangbusters on my new quad box with 8 gig of RAM; I don't have it installed on my old Athlon 64 machine.
Why would Microsoft want to design their OS to run in 512 meg? That would just mean their system would be sub-optimized on my 8 gig machine. Design the OS for a window of system capabilities... optimize it there. ROUGHLY, Vista is happiest with 4-8 gig, XP was happy with 1-4 gig, 2000 was happy with 256 meg to 1 gig, and NT ran quite nicely on 128 to 256 meg. Every time a new OS is released, a few people complain about how it takes up 'SO MUCH' memory.
You want an OS that runs well in 256 meg of memory? Install NT or 2000... why would you EVER expect Microsoft to make their new flagship OS run in that little memory when it would clearly compromise the efficiency and power of the OS running on new machines?