Rumor: MS to Make 'Huge' Xbox 720 Announcement at CES

Page 5
Status
Not open for further replies.

Nonesuch

Distinguished
Jul 23, 2008
7
0
18,510
[citation][nom]molo9000[/nom]I think it's extremely unlikely that they're going to go with power hungry Bulldozer chips... or x86 chips in general. Has any gaming console in history ever used an x86 processor?Wasn't the rumor a few weeks ago that it's going to be an ARM based CPU with specialized co-processors on the same die? That would explain the 6 "cores".Dual GPU sounds weird though...[/citation]

The first Xbox used a Pentium 3.
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
So I'm getting voted down for pointing out that Tom's commenters can't read the articles? The Xbox 720 isn't likely to have an AMD CPU. Anyone with more than basic literacy skills would see that both this article and the original said AMD *GPU*, not *CPU*. It's not my fault most of you won't read.

Nor is it my fault if you want to believe that the next console will magically be able to sell high-end computer parts for $300US. Higher-end consoles have historically only reached about mid-range parts at the time of their release, so this puts the extreme stuff out of the question. And I doubt that Microsoft is going to go for a huge loss on the hardware: after all, the current (7th) generation was won by a console with laughably weak hardware, and Microsoft knows that.
[citation][nom]laxin84[/nom]If 360 is left on 1080 when all of consumer electronics starts making a bump to 3D 1080p or 4320p, it'll be a clearer win for Sony this time. They should be careful about jumping early for once. Sony could also potentially get in with 3D optical media, meaning 100GB+ blu-rays.I can say this for sure tho, twin GPUs most likely means 3D support out of the box (makes stereoscopic rendering a heckuva lot easier...)[/citation]
Remember that the power required to render graphics is proportional to the number of pixels; even under ideal conditions, the 360 and PS3 are only managing 0.92 megapixels (720p) for top-shelf games. To render the same games at 4320p (33.2 MP) would require a machine 36 times as powerful. Even being generous and allowing for FOUR generations of Moore's law, that still means beating the curve by +125%. Given that the Xbox 360 and PS3 were two of the most aggressively-powered machines out there (as two of the three consoles in history to sell at a loss upon release, alongside the original Xbox), thinking that Sony (or Microsoft) would go FURTHER just has no logic to it. If anything, given that both companies failed to grab the market dominance they'd expected (with that being taken by Nintendo), I'm betting both will dial back and produce machines that will be profitable even at $300-400US.
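The arithmetic above is easy to verify. A minimal sketch, assuming render cost scales linearly with pixel count (a simplification that ignores per-vertex and per-frame costs):

```python
# Back-of-the-envelope check of the pixel-scaling argument.
p720 = 1280 * 720            # 921,600 px  (~0.92 MP)
p4320 = 7680 * 4320          # 33,177,600 px (~33.2 MP)

scale = p4320 / p720         # fill-rate work needed: 36.0x
moore = 2 ** 4               # four Moore's-law doublings = 16x
beyond_curve = scale / moore - 1   # how far past the curve you'd need to go

print(scale)                       # 36.0
print(round(beyond_curve * 100))   # 125 (i.e. +125% beyond four doublings)
```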

Also, dual-GPU setups don't really make 3D any easier; modern video cards are fully capable of rendering multiple buffers at once: that's how they accomplish effects such as reflective water, by rendering the scene twice. And while games are getting larger, the need for over-sized optical media is overblown: the only reason, so far, that 360 games have needed more than one disc is high-definition pre-rendered FMVs.

[citation][nom]kastraelie[/nom]You might be forgetting the unit cost (vendor) of a 6970 is $80[/citation]
Erm, what source can you cite for this? I'm going to attach a [citation needed] here.

[citation][nom]dragonsqrrl[/nom]Damn. And that's why you sweep the existing comments before you post...[/citation]
In my defense, since the comments have no "multi-quote" function, I just quote and reply as I go along. I did notice the extra comment afterward, though, and granted credit.

[citation][nom]supertrek32[/nom]Don't forget about all the driver optimizations which go into consoles. A console 6850 will blow a PC 6850 out of the water. Looking at hardware isn't going to tell you the whole story.[/citation]
Actually, no, that's not the case. There's no such thing as magical "driver optimizations." Console CPUs don't have to deal with the background bloat found on PCs (something astute enthusiasts avoid when gaming anyway), but given that DirectX by nature already optimizes everything, a console's GPU is NOT at any advantage power-wise.

The only advantage seen is that the developers hand-pick a single setting for the user. An astute enthusiast on the PC can do the same thing for themselves. To make games look like they "do everything" on the console, devs resort to the following:

* Capping the Framerate - There are no top-shelf titles for either the 360 or PS3 that run at 60fps. Typically, they cap to 30fps, though in some cases it's as low as 24 or 20. That outright lowers the bar.
* Using lower-res textures - This is something painfully apparent in so many games. Of course, this is to be expected when 1-2GB is the norm and a console's limited to about 256MB.
* Cutting LOD distances - Because there's a fixed resolution, LOD and draw distances are scaled back JUST right so that most players won't notice the reduced view compared to the PC version, which will be running at 1050p or higher instead of 720p.
* Skimping on resolution or AA - Only a handful of top-shelf 360 titles had BOTH a 1280x720 resolution and AA. The only one for the PS3 that used AA at all was Final Fantasy XIII. Skyrim on the 360 runs at 1024x576, a mere 64% of 720p.

So, an astute enthusiast who KNOWS what they're looking at when watching gameplay, rather than just assuming, will be able to produce comparable results on the PC with much, much weaker hardware than expected. As I'd said before, the 360's GPU is more comparable to a Radeon 3650, and the PS3's falls somewhere between a 7800GS and an 8600GT.
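The sub-HD figure quoted in the list above checks out; a quick sketch using the reported 360 render resolution for Skyrim:

```python
# Verifying the "64% of 720p" claim for Skyrim on the 360.
native = 1280 * 720   # full 720p: 921,600 px
skyrim = 1024 * 576   # reported 360 render resolution: 589,824 px

print(skyrim / native)   # 0.64 -> exactly 64% of the pixels of 720p
```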

That, and there's no real information to substantiate the usual rumors that "the console will have the best PC GPU around!" We saw this with the 360: at first it was described as an X800XT, then an X1800XT, then an X1900XT, and eventually even an HD 2900XT, even though that last card came out well after the console did... and all of the above were more potent than the 360's actual GPU (though the X800 only barely).

My predictions, which are based on a bit more logic, put the 720 at using something, at best, comparable to a Radeon 6700 series... But more likely closer to a 6500/6600.
 

jms22

Distinguished
Mar 20, 2008
12
0
18,560
LOTS of people here were stating that there was NEVER going to be a new Xbox console. What do you say now? hahaha... ALSO, game developers don't give a crap about your little console-vs-PC hardware debate. They only care about $$$MONEY$$$. And as long as they keep making lots of it with game consoles, what do you think they're going to do?
 

blubbey

Distinguished
Jun 2, 2010
116
0
18,630
[citation][nom]hetneo[/nom]6 years, since November of 2005.[/citation]
Can't remember where, but I read it won't be released until 2013 or so. Factor in development time like the 360 had, too: at least a year, if not two.
 

masterjere

Distinguished
Nov 11, 2011
4
0
18,510
I don't think I've ever been MORE excited about a system than I am about this one! Fantastic post!!! Six cores and dual video cards! Finally! What a system! That's like a PC/Mac gaming machine in a console, I LOVE IT!
 

kartu

Distinguished
Mar 3, 2009
379
0
18,930
[citation][nom]molo9000[/nom]Has any gaming console in history ever used an x86 processor?Wasn't the rumor a few weeks ago that it's going to be an ARM based CPU with specialized co-processors on the same die? [/citation]
Cough, first Xbox, cough?
This ain't the mobile market; even old Celerons would laugh at the performance current ARM CPUs can offer.
 

Specter0420

Distinguished
Apr 8, 2010
11
0
18,560
[citation][nom]jeffredo[/nom]Whatever they do I hope the new XBOX is beefy enough to avoid the spillover console dumbing-down that is affecting many cross-platform games. PC hardware suffers from game developers gimping their games so current gen consoles can run it. Our hardware doesn't get to utilize their full power as a result.[/citation]
Yes, but even if the next-gen consoles are 4x as powerful as a gaming machine from a year in the future, the ports will still be dumbed down to work with a controller's few buttons instead of native mouse/keyboard support. They'll be less mod-friendly and won't have as many (or as effective) graphics options.
 

jgutz2006

Distinguished
Jul 7, 2009
120
0
18,630
[citation][nom]timoseewho[/nom]Why do they call it an Xbox 720? Because when you see it, you do a 720 and walk away.[/citation]
I can see everyone in retail stores spinning in a couple circles before walking away... hah
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]kartu[/nom]Cough, first xbox, cough?This ain't mobile market, even old Celerons would laugh at performance that current ARM CPUs can offer.[/citation]
Well, maybe not OLD Celerons... But yeah, no one really understands engineering, so they can't put things into perspective. Namely, they can't grasp the concepts of "trade-offs" and "scalability." In other words, everyone assumes there's an "ultimate one-size-fits-all" solution. If there were, engineering simply wouldn't need to exist.

ARM is absolutely TERRIBLE in performance-per-clock; Cortex A8/A9 and the like, as I recall, are worse than old Pentium 4s. They'd make terrible processors for desktops and servers. Their advantage is that they can consume well under a watt. However, this isn't very scalable: while it reigns as champ for performance-per-watt, it can only do that in the low-power sector to begin with, and hence can't compete with server chips. Too bad so many enthusiasts won't realize that.

[citation][nom]Specter0420[/nom]Yes but even if the next gen consoles are 4x as powerful as a gaming machine from a year in the future the ports will still be dumbed-down to work with the few controller buttons vs native mouse/keyboard support. They will be less mod friendly and wont have as many (or as effective) graphics options.[/citation]
Yep, and of course no console bests high-end gaming PCs when it comes out; historically, the Nintendo 64 and original Xbox came KINDA close (no other consoles did), but usually there's a pretty hefty gap, since no one wants to buy a $1,000US console (see: Neo Geo), and selling console hardware at a loss tends not to pan out well (see: original Xbox, PS3).
 

gavenr

Distinguished
Nov 17, 2011
9
0
18,510
I'm glad to see a new Xbox coming out... tired of CoD having the same style of game. Hopefully with the new Xbox they can push the envelope.
 
G

Guest

Guest
@nottheking

and your post started out so very promising, what with the engineering and everything

ARM chips are actually very efficient per clock if utilized correctly. Due to their inherent nature (RISC architecture with short pipelines), clock for clock they are more efficient than your x86 chips (and that's why they will always trump x86 in the mobile market). x86's advantage comes from its ability to execute complex instructions within a single cycle, something that may take an ARM chip several cycles to complete; but when tackling simple instructions, an ARM chip will trump an x86 chip every time, clock for clock.
 

jrob8604

Distinguished
Nov 18, 2011
1
0
18,510
I call BS. Every other report I've read points at a late 2013 release.
First of all, if they're only now getting custom dev kits out, there's no way that's enough development time to launch the triple-A titles a console needs to succeed. Next, 2GB of DDR3 RAM is garbage. Most computers can come with up to 8GB of DDR3 for around $60; with Microsoft mass-producing, that cost could come to about $30-40 per unit. The dual GPU also seems a bit far-fetched, but engineers will have a lot more time to develop and customize hardware for a new console.

They will probably announce the new xbox is in development but I don't expect them to give a 2012 date.
 

oneblackened

Distinguished
Sep 16, 2010
27
0
18,580
[citation][nom]molo9000[/nom]I think it's extremely unlikely that they're going to go with power hungry Bulldozer chips... or x86 chips in general. Has any gaming console in history ever used an x86 processor?Wasn't the rumor a few weeks ago that it's going to be an ARM based CPU with specialized co-processors on the same die? That would explain the 6 "cores".Dual GPU sounds weird though...[/citation]
Yes, actually, the original Xbox used a modified PIII Coppermine.
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]ARMcycle[/nom]ARM chips are actually very efficient per clock if utilized correctly, due to their inherited nature (RISC architecture with short pipes) clock for clock they are more efficient then your x86 (and that why they will always trump x86 in the mobile market), x86 advantage comes from their ability to handle and execute complex commands within a single cycle, something which may take an ARM chip several cycles to complete, but when tackling simple instructions an ARM chip will trump a x86 chip every time clock for clock.[/citation]
"Utilized correctly" would have to mean "used purely in a synthetic benchmark designed solely to make ARM the winner." ARM is tolerable at basic integer operations, but is woefully unprepared for floating-point. Even in most integer work, benchmarks demonstrate that it can't really match the per-clock performance of an Intel Atom; and an Atom, in turn, is considered on a par with the Pentium 4.

Of course, you may have missed my entire point: the weakness in ARM I alluded to was the potency of the CPUs on a per-clock-cycle basis. By nature, RISC-oriented CPUs (I say "oriented" because the line between RISC and CISC is hardly distinct; Thumb, after all, is a very CISC-style instruction set) tend to be weak in that regard. CISC-oriented CPUs tend to be superior there, and it shows: even ARM's best, like the Cortex A8/A9, can't keep up with the weakest examples of x86, let alone a more modern design like Phenom or Bulldozer, never mind Sandy Bridge.

Again, the issue here is that ARM was built for performance-per-WATT, which it delivers quite admirably; while a 1 GHz ARM CPU will yield only a tiny fraction of the performance of a 1 GHz Sandy Bridge core, it will do so while using an even TINIER fraction of the power. The trade-off is that even if you pushed ARM to its clock-speed limits, it would cease to be competitive, because its performance-per-watt would go DOWN. Hence ARM is very dominant in the mobile market (where battery life is king) but poses no real threat of taking other markets. (aside, of course, from those where PPW is a chief concern, like many kinds of embedded systems)
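The trade-off described above can be sketched numerically. This is an illustration with made-up coefficients, NOT measured data for any real chip; it only assumes the standard dynamic-power relation (power scales with capacitance, voltage squared, and frequency) and that reaching higher clocks requires higher voltage:

```python
# Why pushing a low-power core to high clocks hurts performance-per-watt.
# Dynamic power ~ C * V^2 * f; performance assumed proportional to clock.
def perf_per_watt(freq_ghz, volts, capacitance=1.0):
    perf = freq_ghz                            # performance ~ clock (simplification)
    power = capacitance * volts**2 * freq_ghz  # dynamic power model
    return perf / power                        # reduces to 1 / (C * V^2)

low = perf_per_watt(1.0, 0.9)    # modest clock at low voltage
high = perf_per_watt(3.0, 1.3)   # same core pushed hard, needing more voltage

print(low > high)   # True: efficiency drops as clocks (and voltage) rise
```

Frequency cancels out of the ratio, so efficiency is governed by the voltage needed to sustain the clock, which is the crux of the argument: clocking the chip up sacrifices the one metric ARM wins on.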
 

Supertrek32

Distinguished
Nov 13, 2008
268
0
18,930
[citation][nom]WhysoBluepandabear[/nom]Blow it out of the water how? Consoles of course lock the FPS - but other than that, it's not as if a 6850 in a console would produce better quality than a 6850 in a PC. You also forget overclocking - good luck doing it in the cramped console box - and even if you did, you can't change graphic settings on a console to take advantage of it. Consoles simply lock FPS and change certain settings that aren't quite as important to console gamers, as they are to PC gamers who are literally 1 ft away from the screen.[/citation]
You either chose to completely ignore the main part of that post, "driver optimizations," or are horribly misinformed. When you optimize drivers to the degree consoles have, you get very noticeable gains on the same piece of hardware. Look at professional graphics cards: they cost 10x the price for essentially the same hardware. Why do you think people buy them? The drivers. Running an optimized program on them shows huge gains. The GPUs in the next Xbox will no doubt have the same done for them (for games instead of engineering applications, obviously). Tom's has even run articles on this, so go read them and get your facts straight.
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]supertrek32[/nom]You either chose to completely ignore the main part of that post, "driver optimizations," or are horribly misinformed.[/citation]
No, you're the horribly misinformed one. There's no such thing as "driver optimization" for games, or at least none that would make a console any better at handling games than a PC's video card. 'Cuz guess what? Consumer video cards are bought for playing games. The gains for professional applications on workstation cards exist because those are entirely different applications.

What actually happens on consoles is that games run at lower resolutions, settings, and framerates, and the developers don't directly tell you. Hence most of the uninitiated (read: almost all console gamers) are entirely unaware that they're playing a reduced game versus the PC version, and so over-estimate the console's capability. It's kinda telling when people with ancient, weak video cards are playing The Elder Scrolls V: Skyrim and still running it on "ultra."
 

slabbo

Distinguished
Feb 11, 2009
192
0
18,630
Who here trusts Microsoft with 2 GPUs when they couldn't even stop 1 from overheating? They'd better get their act together if they plan to rush this out.
 