Rumor: Next-gen Xbox SoC "Oban," Already in Production

Status
Not open for further replies.

bens1

Distinguished
Jan 21, 2011
2
0
18,510
[citation][nom]alidan[/nom]consoles dont use direct x, [/citation]

Actually, the name Xbox comes from "DirectX Box" (widely documented).

Also, when the article talks about DirectX 11 features, it's talking about hardware features such as tessellation, which will be applicable whether they use DirectX or not.

That said, though, the GPU does seem a bit weak. Traditionally, consoles initially come out with a GPU that is better than most PC GPUs available at the time, and then the PC catches up and overtakes them... but if it's being launched with a last-gen GPU, then it's out of date already.
 

drwho1

Distinguished
Jan 10, 2010
367
0
18,930
outdated or not, it should still be better than what 360 users currently have.

I prefer Sony consoles, but I'm still happy for this sort of good news.
 

gamerk316

Distinguished
Jul 8, 2008
325
0
19,060
Not shocked at all about the GPU. I can now say "I TOLD YOU SO".

A console does not need nearly as much horsepower as a PC, because you don't go through a heavy OS, and you typically don't code against high-level APIs. You code the vast majority of time-intensive processes in assembly. As such, a powerful GPU is a luxury, not a requirement.

Nevermind the heat requirements, something MSFT is VERY keenly aware of...
 

g00fysmiley

Distinguished
Apr 30, 2010
476
0
18,930
[citation][nom]darkchazz[/nom]HD 6670 ?! LMAOAnd here I was thinking my GTX560ti rig wont be as powerful as the new xbox... well seems like it will be outdated the day it comes out..What the heck just get it out already... I'm tried of the shit ports..Will be interesting to see how the devs will take full advantage of that hardware, it's about time they make full use of DX11.[/citation]

From a hardware perspective, consoles are dated from the day they come out. It's how they sell an all-in-one unit for $300-400; for that price you aren't going to get a GTX 560 and an i5-2500.

What they do have going for them is optimization. If you can tweak and design a game to work with an exact hardware configuration, then you can make it look better than you'd expect from a comparable PC.

You are correct on it holding PC gamers back, though. What this tells me is that my 450s in SLI are likely all I'll need for a few years... though I might jump in on the next die shrink for power/heat purposes (two 450s in my PC combined with two 450s in my wife's PC makes for one warm office in the summer when both of us are gaming).

Hopefully, as PC gamers, we will continue to support those studios who do make games for PC that use the hardware and look better than what consoles are capable of.
 

SteelCity1981

Distinguished
Sep 16, 2010
249
0
18,830
Well, there goes all that AMD Radeon 7800 talk right out the window. The Xbox 360's GPU was more comparable to high-end PC GPUs of its time than this next-gen Xbox's is right now. Not to say that the Radeon 6670 is a bad GPU, but by 2013, when the console is released, it will already be two generations behind AMD's midrange PC GPUs, let alone way behind AMD's high-end GPUs, and that's a problem given that they expect this console to be their flagship for at least six years.
 

Marco925

Distinguished
Aug 11, 2008
530
0
18,930
[citation][nom]choji7[/nom]Really hoping for something with bit more of a powerful gpu than a 6670 equivalent. Us PC gamers are going to be stuck with these hardware limiters for several years.The comments in the ign link really shows how little people know about consoles. Many of them seem to believe that consoles are somehow vastly more efficient than PCs so the weaker gpu is ok, the custom boards and basic OS can only help games performance so much. Shame they don't seem to get that consoles normally render at lower res and then scale up the image.[/citation]
Though with the prices of modern video cards, I don't want to pay $1,000 for a console just to get the top video card, because that alone costs $700.
 

in_the_loop

Distinguished
Dec 15, 2007
48
0
18,580
I'm very disappointed.
The upgrade needed to push for better graphics, which would also lead to better PC graphics (through ports or just general competition), is just this?
I checked the "Graphics Card Hierarchy Chart" here at Tom's, and the 6670 is six tiers lower than the mainstream Nvidia 560/560 Ti, which I use.
The 560 Ti is alright, but it struggles with some demanding titles at 1920x1200 with the highest settings.
And that is today!
Yes, I know that the Xbox is better optimized than the PC counterpart, but six tiers?
And even if it were equal to the 560 Ti, six tiers higher, it would still be mediocre performance when (or if) it arrives in late 2013.
I mean, the integrated GPUs in Intel's and AMD's processors will probably be as fast by then!
Let's hope the rumors are wrong this time!
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
This is a curious (and dubious) claim. The 6670 is in fact 20% LESS powerful than the 4770 in raw processing power, both for shaders and textures. In terms of memory bandwidth, the 6670 does have a 25% improvement, by virtue of its otherwise-identical memory subsystem clocking its GDDR5 at an effective 4.0 GHz instead of 3.2 GHz.
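For reference, that bandwidth figure is easy to sanity-check. A minimal sketch, assuming the 128-bit memory bus both cards are commonly listed with (the function name is mine):

```python
# Peak GDDR5 bandwidth = (bus width in bytes) x (effective data rate).
def bandwidth_gbs(bus_bits, effective_gtps):
    """Peak memory bandwidth in GB/s for a given bus width and
    effective GDDR5 data rate (in GT/s)."""
    return bus_bits / 8 * effective_gtps

hd4770 = bandwidth_gbs(128, 3.2)          # 51.2 GB/s
hd6670 = bandwidth_gbs(128, 4.0)          # 64.0 GB/s
print(f"{hd6670 / hd4770 - 1:.0%}")       # 25% -- the improvement cited above
```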

However, relying on the memory clock is going to be a non-starter when Nintendo's involved... since, historically, each machine they've put out has emphasized having the fastest RAM available, or at least plausible. This pretty much rules out the Wii U using anything OTHER than GDDR5, clocked at anything less than 4.0 GHz (up to the current maximum of 5.5 GHz is a distinct possibility).

As far as comparison goes, in terms of raw throughput, the PS3's GPU was 10% beyond that of the Xbox 360; both had the same number of TMUs and shader vector units (the 360 had 48 independent SPs, while the PS3 had 24 two-vector pipelines), with the PS3 being clocked 10% higher (and arguably, it might've had slightly improved power through its 8 dedicated vertex shaders).
[citation][nom]memadmax[/nom]I remember back in the day when consoles could spank anything a PC could muster.....God, I feel old........[/citation]
That actually only happened once: with the Nintendo 64. Of course, it's worth noting that there was also a bigger price gap then, too; if you were building a decent gaming rig circa 1990, you were using a 33 MHz 486, a VGA card, a MIDI card, and 16MB of RAM. Such a setup made even SNK's Neo-Geo look laughable (let alone the Sega Genesis or SNES), but would set you back $3,000+ US, a steep price compared to less than $200 US for mainstream competing consoles.

The Nintendo 64's superiority came mostly from the simple fact that through the early-to-mid 1990s, PC hardware development stagnated: that 1990 PC I just mentioned wouldn't become obsolete until at least 1996. It's somewhat analogous to how the current console hardware market has left a lot of software development stagnant.

[citation][nom]alidan[/nom]consoles dont use direct x, so features in dx aren't tied to the gpu.[/citation]
Actually, the exact opposite is true: both the Xbox and Xbox 360 rely on DirectX. For the 360, it's a unique version that ROUGHLY equates to 9.0c as found on the PC: it falls short on some specs (only equating to DX9) but on some actually surpasses them slightly, nearly reaching DX10. Overall, it's very comparable to PC 9.0c cards; almost all shaders written for one can be used on the other and vice-versa.

[citation][nom]alidan[/nom]people complaining about how the "hardware is already dated" how many games today take full advantage of everything available to them?[/citation]
Pretty much all flagship Xbox 360 and PS3 titles, and all Wii titles that actually bother to try looking good. (like, say, Conduit 2)

[citation][nom]alidan[/nom]what im saying is that both consoles have a tessellation unit in them, and when done right will produce great graphics regardless of the hardware. as long as the consoles can push higher rez textures, and can push more pollies (4million ish) than they will be great.and on the pc side, you will most likely see the benefit being higher detail settings. with engines taking advantage of tessellation, and probably doing it a hell of allot better than now, you will most likely see a huge leap in graphic quality on the pc, and probably specs for better graphics being reduced a bit too, because of better engines.[/citation]
I think you've bought WAY too much into the tessellation hype. Contrary to popular belief, it doesn't magically insert an infinite number of polygons per second. Rather, it allows for a very smooth, on-the-fly adjustment of the polygon level of a model. The drawback of tessellation is that even after the unit finishes its work, the stream processors STILL have to handle the setup for each polygon and vertex, just as if tessellation weren't in use.

The real advantage of tessellation is that it'll make many forms of LOD scaling obsolete, or vastly easier: no longer will a game have to include multiple meshes for the same model, as it can include a single one and generate all the others on the fly. The quality improvement won't necessarily be direct; the first instance will be that sharp "LOD tiers" will be eliminated, putting an end to times where an object will "pop" between different detail levels: ideally, tessellation can adjust a model's detail by an increment of one polygon at a time. The second, indirect improvement will be an increase in higher-polygon models: they WON'T be faster to render than before, but they will be practical to use for close-ups, because the work of making multiple meshes will be eliminated, as will the increased disk and RAM usage.
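The "LOD tiers" versus smooth-tessellation contrast above can be sketched in a few lines. This is a toy illustration only; the cutoff distances and the scaling formula are my own inventions, not taken from any real engine:

```python
def discrete_lod(distance, tiers=(10.0, 30.0, 60.0)):
    """Classic LOD: pick one of a few pre-built meshes by distance.
    Crossing a cutoff swaps the whole mesh -- the visible 'pop'."""
    for level, cutoff in enumerate(tiers):
        if distance < cutoff:
            return level                  # 0 = full-detail mesh
    return len(tiers)                     # lowest-detail mesh

def tessellation_factor(distance, base=64.0, min_factor=1.0):
    """Tessellation: scale subdivision continuously with distance,
    so detail changes roughly one increment at a time."""
    return max(min_factor, base / (1.0 + distance))

# Stepping across a tier boundary jumps an entire mesh...
print(discrete_lod(29.9), discrete_lod(30.1))                      # 1 2
# ...while the tessellation factor barely moves at the same spot.
print(round(tessellation_factor(29.9) - tessellation_factor(30.1), 3))  # 0.013
```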

If I were to make a technical comparison, I'd say that tessellation is to meshes what trilinear filtering (along with mip-mapping) is to textures. Similarly, the GPU has dedicated fixed-function units to handle mip-mapping, along with trilinear filtering.
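To make that analogy concrete: the mip level a GPU samples is a continuous function of how much the texture is minified, and trilinear filtering blends the two nearest levels. A small sketch (the function name is mine; real hardware derives the ratio from screen-space derivatives):

```python
import math

def mip_level(texels_per_pixel):
    """Continuous mip level for a given minification ratio.
    Trilinear filtering blends the two integer levels this value
    falls between, so textures never visibly 'pop' between levels --
    the same effect tessellation aims to give meshes."""
    return max(0.0, math.log2(texels_per_pixel))

print(mip_level(1.0))             # 0.0   -> full-resolution base texture
print(mip_level(4.0))             # 2.0   -> quarter-resolution mip
print(round(mip_level(3.0), 3))   # 1.585 -> blend of mips 1 and 2
```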

[citation][nom]memadmax[/nom]this next gen, is realistically be the first gen that could last 10 years as we have all the tools right now to render photo real in real time.[/citation]
Photo-realism was also claimed for both the 6th and the 7th (current) generations. Hell, I could probably find some claims for FIFTH-generation consoles; someone was probably nutty enough to claim that about the Nintendo 64, or possibly even the Sega Saturn or PlayStation.

[citation][nom]memadmax[/nom]i don't see dx12 or such being as show stopping as 11 was (what 10 should have been) dx 12 will most likely be a better rendering system than 11, no real new features to be added. if games use opencl physics, we get better versions on the pcif game employ dx11 graphics, we get better versions on the pctell me one way consoles will hold the pc back this gen, because i cant see it, and i would love to know, and please to not site gpu advancements from 10 years ago, because 10 years ago we werent in an area of diminishing returns.[/citation]
The actual GPU "diminishing returns" chiefly come from visuals, not technology and features. You yourself espoused the virtues of tessellation, indicating that you still believe that even today earth-shattering features are still coming out. The lack of a DirectX 12 or 13 so far (since we're having a major release every couple of years) doesn't mean those features don't exist.

Just look at the current generation: 7th-gen consoles are very badly holding back PC development. The same will happen in the 8th generation, just starting from a different level.

[citation][nom]Steveymoo[/nom]Microsoft really needs to look into making direct x more of an open standard, or using opengl for it's consoles. Locking down a console to use a pre-defined set of instructions and features, really limits studios on what they can do. Imagine what you could create with a completely free reign on code and shading techniques![/citation]
You're mistaking an abstraction layer for a programming language, it would seem. If you want to hand-code your own shaders, everyone is still perfectly free to use ASM for them; HLSL is merely an OPTION.
 

kinggraves

Distinguished
May 14, 2010
445
0
18,940
Oh noes, not LAST-generation GPUs! Why not use the 7xxx series, which isn't even out yet for consumer use? Except there's been very little change between the low-end 5xxx, 6xxx, and 7xxx series; they're pretty much the same thing. And on top of that, these are custom chips BASED on the 6670, NOT actual 6670s, and they can use more recent processes. Something around the 6670's level can do 1080p, and 1080p is the maximum the new Xbox needs to do: not anything higher, not a ton of graphics bells and whistles, just 1080p with low power draw and heat output. It doesn't have to do multi-monitor, high resolutions, or any of the other things your PC GPU can do.

PC gamers, you're missing the big picture here. A 6xxx-level GPU means DX11 and tessellation. That means when games are ported from the new Xbox, they can be ported in DX11 with tessellation, unlike the DX9-locked console ports now. You're getting ports anyway as long as consoles are still making money; better take what you can get.
 

Polaski

Distinguished
Nov 4, 2011
1
0
18,510
It's a disappointment, it's true, but imagine a 560 Ti... the price would go up a lot! Get real, people... get the full picture.
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]alidan[/nom] like i said in my above post, that is getting downvoted probably because i hit a nerve with the people who want gaming to require 4 6990 to run even a measly 20fps,[/citation]
No, you got down-voted because what you posted was largely complete nonsense, and you began it with a FLAGRANTLY incorrect claim: "consoles don't use DirectX." As bens1 said, the very NAME "Xbox" is well known as a shortening of "DirectX Box"; the design of both the original and the 360 is mostly PC-based, simply stripped down to the point where DirectX constitutes the vast majority of the OS.

[citation][nom]alidan[/nom](as next gen will be the one that really pushes the engines games can make due to the hardware in those chipsets. all they need to do is increase texture resolutions, i dont want to see a bad testure unless im trying to get within 1 foot of a wall or im trying to posission an object so its right in front of the camera.its funny to me how little people know about graphics, and still complain about graphics at the same time, i'm still waiting for someone to tell me the ground breaking graphical advancement that wont be supported because consoles cant handle it at all.[/citation]
You're asking for people to tell you what the next DirectX will support. Just because none of us know the future years in advance doesn't mean that there's nothing major to do; this makes you sound like the mythical US Patent Office employee in the late 19th century who, as legend had it, quit his job and recommended the office be closed because, as he put it, "there was nothing left to invent."

You can safely trust that there will be a DirectX 12, and that it will bring serious, major improvements over DirectX 11. Again, just because no one is telling you what will be included in the FUTURE doesn't mean it doesn't exist. Imagine how someone back in 2004 would have sounded proclaiming that the Xbox 360 wouldn't be missing out on anything; no one knew then that tessellation would be a big thing in DirectX 11, or about DX10's memory virtualization, or, perhaps more significantly, DX10's elimination of the "capability bits."

[citation][nom]gamerk316[/nom]Not shocked at all about the GPU. I can now say "I TOLD YOU SO".A console does not need nearly as much horsepower as a PC, because you don't go through a heavy OS, and typically do not code in higher level API's. You code the vast majority of time-intensive processes in assembly.[/citation]
Assembly stopped being the chief means of programming a console starting around the 5th generation.

When it comes to a GPU, due to the highly-efficient nature of how they work, the same raw GPU will get effectively the same results in either a PC or console. The PS3 and 360 are close, respectively, to the power of a GeForce 8600GT or Radeon 3650. If you actually run a game on the PC on equal settings, the PC and console will perform close to identically. Consoles have a nasty habit of using lower settings in some things than are publicly revealed, so that what many people would think are "equal" on the PC are in fact higher. The PS3 and 360 don't use anisotropic filtering, either use lower resolutions or don't use AA, and their framerates are actually capped at 30fps, to name a few.

The claims of "GPU efficiency" are the result of backwards thinking here: people buy into claims that the consoles are "rendering on maximum," so they assign more power to them than they actually have. And when people who actually know what the hell they're talking about show up to correct the first group on the console GPU, the first group just twists the logic to assume that it's magically somehow "more efficient." There's no OS overhead on a GPU.
 

noblerabbit

Distinguished
Oct 14, 2010
84
0
18,580
So the chips are done and in production, but there's no sign of this console until 20 months from now? Somebody please fire that product manager now.
 

zak_mckraken

Distinguished
Jan 16, 2004
868
0
18,930
[citation][nom]mandus[/nom]hm... i wonder where we can find similar performance power at this moment ... /sarcasm off[/citation]
For around $300? Nowhere.

I really like the update. I don't care much about 3D, although it's nice, but the multi-display support sounds great! Not only will we be able to game on a surround-screen system, but how about multi-player on separate screens? Hell yeah! LAN gaming on a single console! :)
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]kinggraves[/nom]Why not use the 7xxx series which isn't even out yet for consumer use?[/citation]
Um, what is this, then? It's been out for more than two weeks now.

[citation][nom]kinggraves[/nom]Except there's been very little change between the low end 5xxx, 6xxx, and 7xxx series, pretty much the same thing.[/citation]
That's because the lower-end parts recycle old GPU designs: the 6770 and everything below it are, in fact, Evergreen GPUs, and while the 6790/6800 series' "Barts" was NEW, it was still based on Evergreen's VLIW5 architecture rather than Northern Islands' VLIW4. A similar story goes for the Radeon 7000s: the 7300 unashamedly uses the antiquated Cedar core, everything up to the 7670 still uses the VLIW5 architecture akin to the 6800's, and the 7700 series appears to be based on VLIW4, not GCN.

If a console wanted to be on the cutting edge here, it could still downsize a bit but rely on a newer architecture: for instance, take GCN and, while still only having 480 or 640 SPs, benefit from its more effective design.

[citation][nom]kinggraves[/nom]Something around 6670 level can do 1080p, and 1080p is the maximum the new XBox needs to do, not anything higher, not a ton of graphics bells and whistles, just 1080p with low power draw and heat output. It doesn't have to do multi monitor, or high resolutions, or any of the other things your PC GPU can do.[/citation]
But what good is that if, once the cost of upgrading from 576-720p to 1080p is paid (a lot of Xbox games fail to render even at 720p, Skyrim among them), the power left over only allows for incremental improvements to things like shaders and effects? No one would've called the Xbox 360 visually impressive if its only major technical graphical improvement had been rendering at 1280x720 instead of the Xbox's 720x480.

[citation][nom]kinggraves[/nom]PC gamers, you're missing the big picture here. A 6xxx level GPU means DX11 and tesselation. That means when games are ported from the new XBox, they can be ported in DX11 with tesselation, unlike the DX9 locked console ports now. You're getting ports anyway as long as consoles are still making money, better take what you can get.[/citation]
That's a horrible attitude; I would still prefer a return to the 5th- and 6th-generation eras, when consoles got ports from the PC, not the other way around. An upgrade so that ports from the consoles will start using what the PC is already technically capable of isn't something to PRAISE; it's something about which one says "it's about darned time already."

[citation][nom]zak_mckraken[/nom]For around 300$? Nowhere.[/citation]
Actually, that's quite feasible. As IGN pointed out, the GPU currently costs about $80 US (some on Newegg go for as low as $70 US). You could readily pack in the rest of a PC for $220 US. Also, don't forget it's very likely the Xbox 720 will MSRP at $400 US, not $300 US.

[citation][nom]zak_mckraken[/nom]but how about multi-player on separate screens? Hell yeah! LAN gaming on a single console![/citation]
That would require adding multiple physical display connectors to the machine: those would add extra cost that would NEVER go down with miniaturization (unlike the silicon parts) and be of dubious added value (not many people would have the multiple displays on hand, etc.). I'd prefer that developers start implementing split-screen multiplayer more often.
 

cknobman

Distinguished
May 2, 2006
277
0
18,930
For everyone saying the 6670 is not bad: do you realize that by the time this thing comes out in a year and a half, it will be two-plus generations old?

Gimme a break, Microsoft. On the bright side, I guess I won't have to share my gaming budget between platforms in the future and will be able to devote 100% of it back to the PC.

I refuse to buy into a new generation of consoles based on three-to-four-year-old tech.
 

master9716

Distinguished
Jul 27, 2006
73
0
18,580
4K resolutions are coming this year; they need to prepare for that. Devices capable of dual 4K resolutions will also be needed, since the life of these devices will be long. Dual 6670s can do 4K, but 8K might be tricky. They should move up to 6770s so that 8K resolutions can be possible.
 

wild9

Distinguished
May 20, 2007
456
0
18,930
Don't think it matters much if the GPU is an ATI 6000 derivative as opposed to 7000. As long as there's no API in the way it would still kick ass, even compared to the 7000 series on a PC. I think the only advantage the 7000 series would have on the PC is higher resolutions and GPGPU transcoding.
 

Antimatter79

Distinguished
Jul 24, 2009
103
0
18,640
Since the GPU will be discrete, how come the consoles don't have the option of upgrading through the manufacturer or a licensed upgrade center? That way, games could have several graphics options from normal to ultra, something along those lines. I have wanted to get a PS3 or an Xbox for the few exclusives I would bother playing, but the non-native-HD graphics on many games, muddy textures, and jaggy aliasing have kept me away. I feel like the current consoles have been a huge ripoff, boasting HD graphics when most of the time that's clearly not the case.
 

husker

Distinguished
Oct 2, 2009
428
0
18,930
[citation][nom]joytech22[/nom]Wow.. my old 8800GTS has more horsepower than the 6770, of course without the DX11, Tessellation etc.. Oh and power consumption was a lot higher but still...[/citation]
Wow.. my team of draft horses can pull more than a modern pickup truck, of course without the speed, comfort, etc.. Oh and they require a different power source and upkeep but still...
 