[citation][nom]nottheking[/nom]You took the words right out of my mouth.

What on Earth could you mean by that? Current desktop GPUs are over an order of magnitude beyond what the Xbox 360 and PS3 have under the hood. To put it into perspective, the floating-point capability of the 360's R500 Xenos is a mere 192 gigaFLOPS, compared to 3,789 gigaFLOPS (3.789 teraFLOPS) for the Radeon HD 7970. In terms of texture & pixel throughput, the 360 is only about 4.29 times as potent as the original Xbox; the 7970 is 14.8 times that of the Xbox 360 (or 63.5 times as powerful as the Xbox!).

Also, GPUs are best NOT using the x86 architecture, as Larrabee's troubles have shown us. x86 isn't good for such units: it's best for the CPU, and the 360, unlike the original Xbox (but like the GameCube, Wii, and PS3), uses IBM's PowerPC architecture, which seems to be a lock for being the architecture of all three next-gen consoles.

What sort of universe did you live in? Gaming appeared before the PC, and definitely before consoles. There was no such thing as "gaming hardware" then. And throughout the ages, PC gaming has almost always been the cutting-edge side. The ONLY potential exception was the year or so between the release of the Nintendo 64 and more fleshed-out 3D accelerators for the PC. Other than that brief exception, consoles have always been a case of settling for less. There were also the "home computers" (basically competing macro-architectures to the IBM PC), such as the Commodore Amiga, which also featured a strong selection of games that, technology-wise, put the consoles to shame. While the NES was limited to 3-6 colors for each object on-screen, the Amiga was capable of using 64. Later versions of the Amiga caught up with the 256-color modes (13h, Mode X, and Mode Y, all of them VGA) in use on the PC from 1987 onward.
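For what it's worth, the gap those peak-throughput figures describe is easy to sanity-check (numbers as quoted above; these are peak-rate comparisons only, not real-world performance):

```python
# Peak single-precision throughput, in gigaFLOPS (figures as quoted above).
xenos_gflops = 192       # Xbox 360 "Xenos" GPU
hd7970_gflops = 3789     # Radeon HD 7970

# Over an order of magnitude: roughly 19.7x the 360's peak FLOPS.
flops_ratio = hd7970_gflops / xenos_gflops
print(f"HD 7970 vs Xenos: {flops_ratio:.1f}x")       # -> 19.7x

# Chaining the quoted texture/pixel-throughput ratios:
# (360 vs Xbox, ~4.29x) times (HD 7970 vs 360, ~14.8x) vs the original Xbox.
print(f"HD 7970 vs original Xbox: {4.29 * 14.8:.1f}x")  # -> 63.5x
```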
Doom is a rather good example of what PC graphics were capable of by the early 1990s.

All computing hardware (barring the N64, again) that was "specialized for gaming" on a non-PC platform was specialized to be CUT-DOWN: it invariably wasn't as powerful, and the "specialization" was engineering done to reduce the price and (in the case of handhelds) power consumption as well. Almost all of the ideas taken were, in fact, recycled from the PC and other computers. The clever "backgrounds and sprites" mode that made smooth-scrolling 2D gaming plausible on inexpensive hardware was simply a re-purposed "text-mode" display lifted from the PC: the "background" was a text layer changed to use square characters, and the "sprites" were similarly derived from the hardware cursor; the only significant change was moving the tileset for both from a hard-coded ROM chip to a freely re-writable section of RAM. Over time, they simply made it more advanced by increasing the number of cursors, increasing the color depth, and allowing multiple layers of "text" to be shown at once.

You've got it backwards: the Xbox, Xbox 360, and PS3 were the ONLY consoles to release at a loss. Sony made money on every PS1 and PS2 sold, from release day onward. That's why they debuted with what were, at the time, seen as horrendously high $300 US MSRPs.

As for royalties without making the hardware... that's kinda easy. After all, Sony doesn't make the PS3 anyway; Foxconn does. (Foxconn ALSO makes the 360 and Wii the same way: using guts manufactured by IBM, ATi/nVidia, Samsung, Toshiba, etc.) Similarly, many more obscure console and home computer designs were ENTIRELY outsourced: a good example would be Microsoft's MSX, a system sold mainly in Japan that did quite well, and was famous for being the place where three powerhouse gaming series got their start: Castlevania, Dragon Quest, and yes, Metal Gear.
The MSX machines out there actually bore the branding of whatever company manufactured the unit, not that of Microsoft (ironically, one of those makers was Sony). Microsoft simply made a standard for these machines and licensed out the base technology: the individual Japanese electronics companies built designs that fit the specification but could vary wildly on many attributes. It was almost like a DirectX for console design.

No, it's safe to assume the Wii U (at least the base unit; no word on the tablet) will be vastly more powerful than either the Xbox 360 or PS3, simply due to Moore's Law. Simply following it would've meant that even keeping with the "Wii" line would've resulted in something equal to the 360 around the year 2009... and keep in mind it's 3 years later, so even if they went the same "cheap and tiny" route, you're looking at something about four times as powerful: the same power gap as between the Xbox 360 and the original Xbox.

Similarly, word is that the Wii U's GPU will be based on one of AMD's two now-old architectures: VLIW5 or VLIW4 (respectively, the Radeon HD 5000 and HD 6000 series, though both are still recycled for lower-end 7000 parts). Almost every single GPU made with those is significantly above the R500. Also, both VLIW5 (in the iteration used in the Radeon HD 5800 series) and VLIW4 sport dedicated tessellation support, which likewise would give the Wii U a serious boost.

Most comparisons here come from either ill-informed fanboys trying to deride the Wii U before it's even out, or simply the uninformed listening to word that a lot of the Wii U's initial titles will be cheap cross-ports of every single Xbox 360 and PS3 title out there.
(read: "Shovelware Bandwagon: The Next Generation," going boldly where every publisher has already gone before) The reason they largely won't get upgrades to make use of the Wii U's increased capabilities is that it'd be too much investment for games that, in all reality, a LOT of the base will already have on the 360 or PS3, and they're all just looking for a quick way to cash in here. This is also the same reason the Wii got a lot of direct PS2 ports in spite of being several times as powerful, and why most cross 360/PS3 games make use of NEITHER'S strengths and simply stick with the lowest common denominator between them.

Sony started making money on each console sold from around the time the PS3 Slim came out. By now they've actually recovered all the losses they made prior to that point, leaving the original Xbox as history's only console not to make a net profit on sales across its lifespan. (IIRC Microsoft is still out a few hundred million on it, but has more than recovered that with the 360.)

This is actually incorrect. GPU-wise, there is literally no difference in "efficiency" between consoles and PCs. Any perceived difference is a combination of two factors:

1. The settings on the console are fixed at what the developers found to be the best tradeoff of visuals and performance. You can get the same effect on the PC if you're willing to dig into the .ini file and tweak until you find the best settings for your own machine.

2. The perceived difference is often illusory, based on assumptions that the console is doing things it actually isn't, such as the assumption that Xbox 360 games run at 1280x720 (Skyrim, for instance, actually runs at 1024x576) or that EITHER the PS3 or the 360 is getting 60fps, when almost all games are capped at 30fps.

CPU-wise it's a bit of a different story: even if you remove all the bloat from PCs (which CAN be done while still running Windows fine), a lot of PC+console games are just console games ported to the PC so cheaply and inefficiently that the result is all but EMULATING the console.

I feel that Sega's approach to hardware was flawed as well: they focused WAY too much on "beating Nintendo," so their designs for the Sega Saturn and Dreamcast reflected a lot of decisions that emphasized beating Nintendo's EXISTING console, not the one that'd come out a year or two after Sega's. Sega only managed to avoid this mentality with the Genesis (aka Mega Drive), which, coincidentally, was the one most fiercely competitive with Nintendo's systems; while technically inferior to the Super NES, the margin was close, making it FAR more impressive for a machine released in 1989 than the Super NES released in 1991. It only really fell behind in multi-layer blended transparency support and in the SNES having a vastly superior audio system, the first to support compression. (Sega would not implement audio compression support until the DC, which by then was too little, too late.)[/citation]
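That "re-purposed text mode" idea from the quote is easy to picture in code. Here's a toy sketch (made-up tile size and palette values, nothing hardware-accurate): the background is a grid of tile indices pointing into a tileset held in RAM, so scrolling and animation become cheap index rewrites rather than per-pixel redraws.

```python
# Toy model of "text-mode" tile graphics: the background is a grid of
# tile indices pointing into a tileset held in RAM (not a fixed ROM font).
TILE = 8  # 8x8-pixel tiles, a common size on 8/16-bit hardware

def render_background(tilemap, tileset):
    """Compose a pixel buffer by stamping each referenced tile."""
    rows, cols = len(tilemap), len(tilemap[0])
    frame = [[0] * (cols * TILE) for _ in range(rows * TILE)]
    for ty, row in enumerate(tilemap):
        for tx, index in enumerate(row):
            tile = tileset[index]          # animate by rewriting RAM tiles
            for y in range(TILE):
                for x in range(TILE):
                    frame[ty * TILE + y][tx * TILE + x] = tile[y][x]
    return frame

# Two trivial 8x8 tiles: solid color 0 and solid color 1.
tileset = [[[c] * TILE for _ in range(TILE)] for c in (0, 1)]
tilemap = [[0, 1], [1, 0]]                 # 2x2 checkerboard of tiles
frame = render_background(tilemap, tileset)
```

Changing one entry in `tilemap` repaints an entire 8x8 region, which is why this scheme was so cheap on hardware of the era.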
With Sega, there is also the factor that they did make better systems, like the Saturn, but never thought of the people who had to develop for them. It was like the early Cell days where everyone complained, but because of how successful they were, people just got used to it.
Also, with the Saturn, if I'm correct, they went and built it for arcade ports, something that was next to impossible on a Nintendo system and sluggish even on a PS1.
But as for console graphics, they are able to pull more out, and I've always wondered just by how much. Yeah, .ini tweaks can help, but base code on a console exclusive will be more efficient than on, let's say, a PC exclusive, because the PC has to account for X different CPUs and GPUs where a console only needs one. I believe that with the Wii U, they will have enough of a performance boost from being console-only that when they use tessellation, it will have a very minimal effect on overall gameplay; whether that's at 30 or 60fps doesn't matter too much.
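On that performance-boost point, the "about four times as powerful" estimate from the quoted post is just compound doubling. A quick sketch (the 18-month doubling period is the usual rule-of-thumb assumption, not anything confirmed about the Wii U's hardware):

```python
def moores_law_ratio(years, doubling_period_years=1.5):
    """Rough performance multiplier after `years`, assuming a fixed
    doubling period (a rule of thumb, not a law of physics)."""
    return 2 ** (years / doubling_period_years)

# Three years past a hypothetical 360-class baseline in 2009:
print(f"{moores_law_ratio(3):.0f}x")  # -> 4x, matching the quoted estimate
```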