All told, the Wii's hardware isn't QUITE as much weaker than its competitors' as it looks. A lot of the graphics-hardware gains the 360 and PS3 have are quickly eaten up by the fact that they target HD resolutions for all their games. Any astute PC gamer (or just anyone who looks at PC gaming benchmarks) can tell you that cranking the resolution is one of the quickest ways to tax hardware. So that's part of how the Wii manages to get ports of almost all 360 and PS3 games on much weaker hardware: they run at only 720x480, with neither AA nor HDR. (two other taxing options)
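Some quick back-of-the-envelope pixel math illustrates the point (these are just the standard resolution figures, nothing measured from actual games):

```python
# Rough per-frame pixel counts at each resolution, relative to the Wii's 480p.
resolutions = {
    "Wii (480p)": (720, 480),
    "HD 720p": (1280, 720),
    "Full HD 1080p": (1920, 1080),
}

wii_pixels = 720 * 480  # 345,600 pixels per frame

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / wii_pixels:.1f}x the Wii)")
```

So before AA or HDR even enter the picture, a 1080p console is pushing roughly six times the pixels per frame that the Wii is.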
Much of the CPU power difference is simply because the Xbox 360 and PS3 don't even devote their full CPUs to games; parts are reserved for other tasks, particularly software emulation of previous consoles, as well as HD video playback. (downloaded video for the 360, and Blu-Ray for the PS3; that stuff eats up a LOT of CPU power, for those who didn't check)
As a result, I'm wondering how a Wii successor, if it's not a new generation of console (and instead a "half-step upgrade" like, say, the DSi), would manage to keep the same form factor while housing much bigger hardware. Granted, it could PROBABLY fit if they used 45nm parts instead of the 90nm ones the original 360, PS3, and Wii used, but I'm not so sure about heat production.
As far as the rest of it... I do agree that full backwards compatibility is a must. Nintendo made a smart move with the Wii, designing its architecture as a full superset of the GameCube's; that's a novel idea for consoles, but rather old hat for PCs, where a modern Core i7 will still run code written for a 31-year-old 8086 CPU. If the architecture still seems sound, they could likely build onto it yet again, yielding a console that natively plays both GameCube and Wii games without needing either software emulators or separate, expensive BC hardware.
As far as the actual hardware in it... Nintendo's strategy has always centered on making a profit on the hardware itself. (actually, the only three consoles that launched selling at a loss were the Xbox, Xbox 360, and PS3, of which the 360 now makes a profit) Fortunately, chip prices have fallen drastically in the past few years, particularly right AFTER the 7th-generation consoles launched. So if they stick with plain GDDR3 for the memory (which is RELATIVELY cheap), they could easily cram 512MB in; the current cost of that is maybe $20-25US. Similarly, the Wii's CPU and GPU are pretty tiny, so even just switching to 45nm would let them quadruple the transistor count while keeping cost and die size about the same.
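The "quadruple the transistors" bit is just geometric scaling of a full node shrink; a rough sketch (ideal scaling only, real-world yields and costs obviously vary):

```python
# A full node shrink (90nm -> 45nm) halves linear feature size, so the
# same circuit takes one quarter of the die area -- or equivalently,
# the same die area holds about 4x the transistors.
old_nm, new_nm = 90, 45

area_scale = (new_nm / old_nm) ** 2  # 0.25: new area as a fraction of old
transistor_scale = 1 / area_scale    # 4.0: transistors at the same area

print(f"Same die area holds {transistor_scale:.0f}x the transistors at {new_nm}nm")
```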
Of course, then come some of the OTHER questions... Like whether they'll ever get around to letting people buy them in colors OTHER than white, like they implied/promised they would. And what sort of videos, if any, it could play. Once the CPU power is increased, even HD video wouldn't really be out of reach, but even for DVD, royalty fees are a concern; the DVD patents run until, I think, 2015, so before then something like $10US of every "Wii 2's" cost would have to go toward that. And of course, Blu-Ray capability would require LOADS of things, not the least of which is a contract with Sony to license loads of their technology. (as a result, I don't think it's gonna happen... Same reason the 360 has never gotten a Blu-Ray player)