[citation][nom]gamerk316[/nom]My point for the above is that most OS's are unoptimized, and coded in such a way as to guarantee compatibility over performance. Consoles, on the other hand, are made the other way around, made to make efficient use of system resources. For example, in Windows (or most any computer OS), if you perform an arithmetic function, that naturally takes up memory. And I know for a fact, even when that variable is no longer needed, few bother to actually deallocate that variable from RAM (part of the reason being how few languages in use support memory deallocation). For consoles (or any lightweight embedded software), you make sure to clean up after yourself. If it's not being used, it gets cleaned up to free up resources. In short, what you need 2GB for in Windows could easily be accomplished with half that amount, if the OS were optimized and programmers made sure to clean up after themselves when done. I argue that Far Cry 2 could be modified to run on 512MB RAM without any major performance loss, if anyone ever wanted to put that much effort into the work.[/citation]
Umm... no.
Modern development tools have very detailed memory management approaches. Develop in .NET and you get variable scoping at a very granular level (within virtually ANY element you can have a memory scope); when a variable goes out of scope, it becomes eligible for garbage collection and its memory is reclaimed. If your program is properly structured, it won't have any dead variables lying around taking up space. Decisions can be made to scope variables globally... but those decisions tend to be made for performance reasons, so those variables should be increasing code efficiency, not hurting it.
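To make that concrete, here's a minimal Java sketch (the .NET CLR behaves the same way for this purpose); the buffer size is just an illustrative number:

[code]
import java.util.Arrays;

public class ScopeDemo {
    public static void main(String[] args) {
        // 'workBuffer' exists only inside this block. Once the block
        // exits, nothing references the array, so the garbage collector
        // is free to reclaim its ~8 MB the next time it runs.
        {
            double[] workBuffer = new double[1_000_000];
            Arrays.fill(workBuffer, 1.5);
            System.out.println("sum of first 10: "
                    + Arrays.stream(workBuffer).limit(10).sum());
        }
        // workBuffer is out of scope here: referencing it is a compile
        // error, and its memory is eligible for collection. A global
        // (static) reference, by contrast, would pin the array in
        // memory for the life of the program.
    }
}
[/code]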
Furthermore... the problem the PS3 has with lack of memory is less to do with optimization and more to do with the nature of the beast. Games process large blocks of data, and large blocks of data take up a LOT of memory.
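Some back-of-the-envelope numbers (my own illustrative assumptions, not figures from any actual game) show how fast that adds up:

[code]
public class TextureFootprint {
    public static void main(String[] args) {
        // A single uncompressed 2048x2048 texture at 4 bytes per pixel (RGBA).
        long width = 2048, height = 2048, bytesPerPixel = 4;
        long oneTexture = width * height * bytesPerPixel;   // 16 MiB
        System.out.printf("One texture: %d MiB%n", oneTexture / (1024 * 1024));
        // Just 16 such textures fill the PS3's 256 MB of video memory,
        // before a single vertex, animation, or sound is loaded.
        System.out.printf("16 textures: %d MiB%n", 16 * oneTexture / (1024 * 1024));
    }
}
[/code]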
The throughput of a Cell processor (or worse yet... a group of them) working on a large array operation is pretty phenomenal... and all that data has to come FROM somewhere and go TO somewhere. That somewhere is memory... and the PS3 ain't got enough to let game makers allocate large buffers to keep the Cell fed. That means either limiting the volume of data or doing lots of paging to disk. Limit the volume of data and you limit your options: simpler textures, less complex physics. Page to disk and the paging operation becomes your bottleneck, leaving the processor waiting for work.
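Here's a rough sketch of why paging kills you. The throughput figures are assumptions for illustration (a processor that chews through data at GB/s rates versus a drive streaming tens of MB/s), not measured PS3 numbers:

[code]
public class PagingBottleneck {
    public static void main(String[] args) {
        double dataMiB = 512;        // a working set that won't fit in RAM
        double processMiBs = 10_000; // assumed ~10 GB/s effective compute rate
        double diskMiBs = 25;        // assumed ~25 MB/s sustained disk read

        System.out.printf("Time to process 512 MiB: %.3f s%n", dataMiB / processMiBs);
        System.out.printf("Time to page in 512 MiB: %.1f s%n", dataMiB / diskMiBs);
        // The processor finishes its work in ~50 ms, then sits idle for
        // ~20 seconds waiting on the disk. The paging operation, not the
        // Cell, sets the pace.
    }
}
[/code]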
Additionally... there have been plenty of studies on the idea that people can 'hand optimize' to improve on the code produced by modern compilers. The result: in VERY SELECT cases, hand-optimized code can perform better than compiler-optimized code... but in general it does not. Modern processors are simply too complex for most programmers to optimize for effectively. Programmers COULD have made Far Cry 2 run in 512meg of memory, but not without making tradeoffs in performance, and why would they want to? When memory costs a few bucks a gig, programmers are (as they should be) working to most effectively leverage the amount the target system has (which for a new gaming rig these days is 4-8gig). There is no free lunch in programming; forcing the entire Far Cry 2 package to run in 512meg would be nonsensical. Is Far Cry 2 the most efficient code out there? Doubtful... but unless the technical direction on the project was totally incompetent, there were design decisions behind every choice to increase the memory footprint.
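A classic example of 'hand optimization' is manual loop unrolling. Here's a quick Java sketch of both versions; on a modern JIT (HotSpot, or the .NET CLR) the straightforward loop usually gets unrolled and vectorized automatically, so the manual version rarely wins and is much easier to get wrong:

[code]
import java.util.Random;

public class UnrollDemo {
    // The straightforward version: the JIT will typically unroll and
    // vectorize this loop on its own.
    static long sumSimple(int[] a) {
        long s = 0;
        for (int i = 0; i < a.length; i++) s += a[i];
        return s;
    }

    // The "hand optimized" version: 4-way manual unrolling with
    // independent accumulators. Usually no faster than sumSimple on a
    // modern JIT, and much harder to read.
    static long sumUnrolled(int[] a) {
        long s0 = 0, s1 = 0, s2 = 0, s3 = 0;
        int i = 0;
        for (; i + 3 < a.length; i += 4) {
            s0 += a[i];
            s1 += a[i + 1];
            s2 += a[i + 2];
            s3 += a[i + 3];
        }
        for (; i < a.length; i++) s0 += a[i]; // leftover elements
        return s0 + s1 + s2 + s3;
    }

    public static void main(String[] args) {
        int[] data = new Random(42).ints(10_000_000, 0, 100).toArray();
        // Naive timing only; a fair comparison needs a real benchmark
        // harness (warmup, multiple runs, etc.).
        long t0 = System.nanoTime();
        long r1 = sumSimple(data);
        long t1 = System.nanoTime();
        long r2 = sumUnrolled(data);
        long t2 = System.nanoTime();
        System.out.printf("simple:   %d in %.1f ms%n", r1, (t1 - t0) / 1e6);
        System.out.printf("unrolled: %d in %.1f ms%n", r2, (t2 - t1) / 1e6);
    }
}
[/code]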
It's popular for the uninformed to complain about the memory requirements of modern operating systems (they tend to pick on Microsoft because it's easy, I think). The fact of the matter is that modern operating systems are designed to run on modern hardware... Vista 64 runs like gangbusters on my new quad box with 8 gig of RAM; I don't have it installed on my old Athlon 64 machine.
Why would Microsoft want to design their OS to run on 512meg? That would just mean their system would be sub-optimized on my 8 gig machine. Design the OS for a window of system capabilities... optimize it there. ROUGHLY: Vista is happiest on 4-8gig; XP was happy with 1-4gig; 2000 was happy with 256meg to 1gig; NT ran quite nicely on 128meg to 256meg. Every time a new OS is released, a few people complain about how it takes up 'SO MUCH' memory.
If you want an OS that runs well in 256meg of memory, install NT or 2000... why would you EVER expect Microsoft to work to make their new flagship OS run in that little memory when it would clearly compromise the efficiency and power of the OS on new machines?