I'm laughing at all the BS flying around and the bad-mouthing / language from little kids.
No, virtualized XP on a Vista / 7 machine will not run faster than native XP on the same box. Virtualization always produces overhead: privileged instructions get trapped and emulated, and the guest burns cycles in wait-states while the hypervisor does its bookkeeping. Because systems rarely utilize more than 50% of their available HW resources, it's often a good idea to share that HW across multiple systems and cut costs. That's why virtualization is now pretty common in data-centers.
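If anyone wants to measure that overhead instead of arguing about it: run the exact same I/O-heavy script once on native XP and once inside the Virtual PC session and compare the numbers. A throwaway sketch (Python, purely illustrative; the numbers will vary wildly with your hardware and VM settings):

```python
# Micro-benchmark sketch: time a batch of small synchronous writes.
# In a VM every fsync goes through the hypervisor's emulated disk
# controller, so this path magnifies virtualization overhead.
import os
import tempfile
import time

def timed_writes(count=2000, size=4096):
    """Time `count` fsync'd writes of `size` bytes to a temp file."""
    buf = b"x" * size
    fd, path = tempfile.mkstemp()
    start = time.time()
    try:
        for _ in range(count):
            os.write(fd, buf)
            os.fsync(fd)  # force the write through the (possibly virtual) disk
    finally:
        os.close(fd)
        os.remove(path)
    return time.time() - start

if __name__ == "__main__":
    print("2000 fsync'd 4 KiB writes took %.2f s" % timed_writes())
```

Native will win every time; the only question is by how much on your box.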
But a desktop is not a data-center; there is no need to run XP Mode (just vanilla MS Virtual PC with a pre-installed XP disk image) if all your work needs to be done within XP. Any XP running on MS Virtual PC has the exact same vulnerabilities as stock XP; it's the exact same OS. It doesn't pass the common-sense test to have someone power on and log into a Windows 7 system only to power on and log into the XP virtual session to do their office work.
There are hundreds of XP-era applications within the DoD that simply won't work on NT 6+ yet. In time the program managers will have their applications ported / tested / configuration-managed and then a scheduled release. That takes a year at least, and that's from the moment the decision to go with Vista+ is made; that decision alone can take six months to a year. Corporations are not much quicker; their entire business relies on their automation just working. Big business does not treat automation as a hobby, and people don't just "upgrade" whatever they're working on just 'cause. There needs to be a documented requirement, then a risk analysis / mitigation strategy developed. Then you have resource analysis / requirements determination, then prototype fielding, then testing / development, then eventually configuration management and a scheduled product release. Companies that don't follow these steps tend to spend lots of money troubleshooting / tracking down problems.
On top of that, Vista's kernel is atrocious in its memory management. Superfetch was a good idea, but it broke the OS on anything with less than 4 GB of memory. The way it works in Vista is that it ~always~ reads and loads into memory all your recently accessed documents / programs. It doesn't distinguish between temporary internet files, your last-played game's resource files, your PowerPoint documents, or your Office PST file. Your HDD can only read one thing at a time, and Vista will fill the I/O queue with read requests for all these things you're not needing to use. Then when you load the actual program you want, Vista starts paging out to swap all that stuff it loaded earlier, consuming even more disk I/Os. It always seems to consume those disk I/Os at the worst moments, the times when you need them. Windows 7 is a bit smarter about how it prioritizes I/O / memory space. The biggest thing to remember is that if you consume all available memory as a cache, then when you do need more memory something must be paged out to disk. Paging to disk causes lag and delays every other disk I/O; there is no way around that. Consuming all available memory will lead to disk paging.
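If you're curious how aggressive Superfetch is on your own box, the setting lives in the registry and you can read it without changing anything. A minimal sketch (Python 3, reading the documented EnableSuperfetch value under PrefetchParameters; as far as I know, if the value is absent the default of 3, cache everything, is in effect):

```python
# Read the current Superfetch mode from the registry (Vista / 7).
# Read-only, no admin rights needed.
import winreg

KEY = (r"SYSTEM\CurrentControlSet\Control\Session Manager"
       r"\Memory Management\PrefetchParameters")

# Documented meanings of the EnableSuperfetch DWORD:
MEANING = {0: "disabled", 1: "boot files only",
           2: "applications only", 3: "boot + applications (default)"}

try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
        value, _type = winreg.QueryValueEx(key, "EnableSuperfetch")
except OSError:
    value = 3  # value absent: the default (cache everything) is in effect

print("EnableSuperfetch = %d (%s)" % (value, MEANING.get(value, "unknown")))
```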
The amount of memory required in your system is not some mythical "set" amount; it all depends on what your habits are. If you run lots of different things, not necessarily at the same time, then Superfetch will try to load ~all~ of it into cache memory regardless of what you're trying to do. I run 8+ GB of memory on a very beefy system and I still had to turn off Superfetch on Windows 7 x64 (see the sketch at the end if you want to script that). I do too many games / have too much stuff going on for Windows to try to manage my memory footprint.

Windows Vista also keeps a copy of everything in the DWM (Desktop Window Manager) twice. Each application gets its own DWM surface to write into, but the DWM then copies the contents of those application surfaces into its own memory and then to the display. So every GUI takes up twice as much display memory. Windows 7 just passes a reference to the application's display memory rather than copying it, which helps a ton.
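For anyone wanting to script the Superfetch-off step rather than clicking through Services.msc: on Vista / 7 the Superfetch service is hosted under the service name SysMain (as far as I know), and changing it needs an elevated prompt. A quick sketch (Python 3 shelling out to the stock sc tool):

```python
# Stop and disable the Superfetch service ("SysMain" on Vista / 7).
# Run from an elevated (admin) prompt; same effect as Services.msc ->
# Superfetch -> Startup type: Disabled.
import subprocess

# Stop the running instance first; ignore failure if it's already stopped.
subprocess.run(["sc", "stop", "SysMain"], check=False)

# Keep it from coming back at boot. Note: sc's syntax really is
# "start= disabled", with the space after the equals sign.
subprocess.run(["sc", "config", "SysMain", "start=", "disabled"], check=True)
```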