Will the Next Microsoft Console Be Called Xbox Infinity?


blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]Shin-san[/nom]The Cell is an architecture, and don't take my word for it. The SPEs were designed to be scalable in number easily. The one in the Roadrunner is actually an enhanced version of the one in the PS3. IBM, Toshiba, and Sony all lost interest in improving it. The Vita using ARM is a prime example of the lost interest. There's two supercomputers, but how many more? Also, that's actually one of the problems with the PS3. Sony hinted that it's orders of magnitude faster than the Xbox 360, so people designed games for the Xbox 360 thinking porting wouldn't be a problem. It turns out that the 360 in quite a few cases was more powerful. Also, the CPU being in a supercomputer doesn't mean much for video games. It shows that it makes a good supercomputer for those cases it was made for, but doesn't necessarily translate into video games. This is Tomshardware, and by being here, you should know about frame rate benchmarks and how they better show off performance compared to corporate PR[/citation]

The Cell is not an architecture. Sandy Bridge can be scaled too, and SB-E proves that, but that doesn't make the i7-2600 and the other CPUs based on Sandy Bridge architectures; each is still a CPU, not an architecture. The architecture behind Cell is not the processor itself. No amount of arguing will change that, because it is a fact whether or not you dispute it. Furthermore, there were several PS3 supercomputers. If you think that the Cell can't be used for gaming, then you're wrong.

The problem lies purely in how difficult it is to code for. Vector processing can cover most of the workloads that a game imposes on a CPU, if developers can figure out how to implement the code for it. Given that difficulty, the limited success is understandable. This is Tom's Hardware, and you should know that frame rates are very subjective: a poorly optimized platform doesn't come anywhere near the frame rates that it could deliver if it were coded for properly.
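To make "vector processing" concrete, here's a rough sketch in plain Python/NumPy (purely illustrative; this is not Cell or SPE code): the same per-entity physics step written one entity at a time versus across the whole array at once, which is the data layout and access pattern that SIMD- and SPE-style hardware rewards.

[code]
# Illustrative only: ordinary NumPy, not Cell/SPE code. The point is the
# difference between scalar, per-entity work and one wide vector operation.
import numpy as np

n = 100_000
pos = np.zeros((n, 3), dtype=np.float32)        # entity positions
vel = np.random.rand(n, 3).astype(np.float32)   # entity velocities
dt = np.float32(1.0 / 60.0)                     # one 60 fps frame

def step_scalar(pos, vel, dt):
    # One entity at a time: hard for vector hardware to exploit.
    for i in range(len(pos)):
        pos[i] += vel[i] * dt

def step_vectorized(pos, vel, dt):
    # The whole array in one operation: the shape of code that wide
    # vector units chew through.
    pos += vel * dt

step_vectorized(pos, vel, dt)   # one frame's worth of movement, in one op
[/code]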

The 360 isn't more powerful, but it is easier to code for, and because of that it can be optimized for more easily. You should know better than to assume that a benchmark of less-than-fully-optimized software defines the potential of the hardware.
 

bigdog44

Honorable
Apr 6, 2012
29
0
10,580
Back to the "Infinity".
The name is probably appropriate, because hopefully broadband penetration and average speeds in the next 5-10 years will be sufficient to end the current hardware cycles. After that, it's just about accessories and product shrinkage. On a side note, if the IBox does have a 6-8x performance increase across all components, I think that's pretty respectable. If you look at Intel CPUs over an 8-year gap in the Extreme Edition category and account for the so-called "Moore's law" increase (~16x), you will find that, with a few exceptions, the increase in performance is half that or less. That includes clock rate, IPC, HT, multi-core, buses, etc.
 

blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]bigdog44[/nom]Back to the "Infinity".The name is probably appropriate because hopefully broadband penetration/average speed in the next 5-10 years will be sufficient to end the current hardware cycles. After that its just about accessories and product shrinkage. On a side-note, if the IBox does have a 6-8x performance increase across all components, I think thats pretty respectable. If you look at intel cpus over an 8 year gap in the Extreme Edition category and account for the so-called "Moores law" increase (~ 16x), you will find that with a few exceptions the increase in performance is half that or less. That includes clock rate, IPC, HT, multi-core, buses,etc.[/citation]

Moore's law has nothing to do with performance. Moore's law refers to transistor density, or more specifically, the number of transistors that can be put in a given area without sacrificing affordability. Also, the i7-3960X is undoubtedly faster than you're giving it credit for compared to the top P4 EEs from 8 years ago.
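For what it's worth, the ~16x figure itself is just compounding: the common statement of Moore's law is a doubling of transistor count roughly every two years, so an 8-year gap works out to four doublings. A quick back-of-the-envelope sketch:

[code]
# Where the ~16x comes from: doublings compound. Note this is transistor
# count, not performance.
years = 8
doubling_period = 2          # years per doubling; some quote 18 months
factor = 2 ** (years / doubling_period)
print(factor)                # 16.0 (an 18-month period gives ~2^5.3, ~40x)
[/code]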
 

Console_Gamer

Honorable
Jul 6, 2012
4
0
10,510
Damn, I love it when PC fanboys talk about how powerful their PCs are, and then you have your clueless "console gamers" (like me) who jump into the conversation talking about the PlayStations and Xboxes and PS Vitas (LOLxD)... I'm just kidding. So what if you like the keyboard and mouse over the joysticks; most PC gamers are just mad because they don't like using the joystick(s), or maybe playing with "the joystick". It's all personal preferences at the end of the day. Shit, I know PC games are supporting gamepads more and more natively these days, and at the end of the day it's just all about the money.

Microsoft knows the profits of keeping it in-house, and the Steam service is never getting on Xbox for that matter... why would Microsoft let Steam in when the profit margin is higher keeping everything in the Microsoft store? Xbox Live is practically Steam already. It already sells its software there, and it doesn't need a third party (Steam) to sell its proprietary software.

BTW, I remember the days of PC gaming... I benchmarked my PC more than I played games on it anyway. Upgrading parts as an enthusiast is pretty pricey over time. I always wanted to upgrade my PC and remember never being satisfied with it long enough to completely enjoy playing games anyway. There's always some upgrading that needs to be done to it, overclocking and cooling... and then yeah, Facebook and YouTube too (oh wait, I think you can do those things now too with "apps"). :D j/k

Sorry for the blabbering, but I'm one of those people on Tom's Hardware who probably doesn't even care about the article and is more here for bashing or reading the comments. :D :D :D
 

blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]Console_Gamer[/nom]Damn, i love it when PC fanboys talk about how powerful their PC(s) are and then again you have your clueless "console gamers" (like me) who jumps into the conversation talking about the Playstations and XBOX's and PSP Vitas (LOLxD)... I'm just kidding, so what if you like the keyboard and mouse over the joysticks... most PC gamers are just mad because they don't like using the joystick(s) or maybe playing with "the joystick". it's all personal preferences at the end of the day. shit, i know PC games are supporting gamepads more or more natively these days, and the end of the day it's just all about the money.Microsoft knows the profits of keeping it within, and Steam service is never getting on Xbox for that matter... why would Microsoft let Steam in when the profit margin is higher keeping the Microsoft store. Xbox live is practically Steam already. it already sells it's software there and it doesn't need a third party (Steam) to sell its proprietary software.btw, I remember the days of PC gaming... i benchmarked my PC more than i played games on it anyways. upgrading parts as an enthusiast is pretty pricey over time. i always wanted to upgrade my PC and remember never being satisfy with it long enough to completely enjoy playing games anyways. there's always some upgrading that needs to be done to it, overclocking and cooling... and then yeah, facebook and youtube too (oh wait, i think you can now do those things now too with "apps"). j/k sorry for the blabbering but i'm one of those people on tom's hardware who probably don't even care about the article but more about bashing or reading the comments.[/citation]

I don't waste my time benchmarking my computers more than I actually use them. I only benchmark right after a change that makes me want to. Upgrading also isn't expensive unless you don't do it properly. If you upgrade often, then you should sell the previous components right before or right after the replacement parts come out, if you can. That way you get top dollar for them (unless there's something wrong with them, in which case you should have been more careful) and the upgrade is almost free. If you can't sell them right at that point, then at least get close and you'll still get plenty of cash for the previous part, unless it was junk.
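To put rough numbers on that (made-up prices, purely illustrative):

[code]
# Hypothetical prices, only to show why sale timing matters.
gpu_price = 300.0            # what the current card cost at launch
resale_near_refresh = 220.0  # sold right as its replacement launches
resale_a_year_later = 120.0  # sold after sitting through a price drop

print(gpu_price - resale_near_refresh)  # 80.0  -> effective upgrade cost
print(gpu_price - resale_a_year_later)  # 180.0 -> more than double
[/code]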

Heck, I've had occasions where selling the old part brought in more than the upgrade cost me. As for consoles, most PC gamers dislike console games not because of the controls, but because of the ancient graphics quality. The games look like they're from more than five years ago, yet they also have low frame rates and such. That's not appealing compared to even a $500 computer, and heck, an even cheaper computer can still be several times better than a console.
 

Console_Gamer

Honorable
Jul 6, 2012
4
0
10,510
[citation][nom]blazorthon[/nom]I don't waste my time benchmarking my computers more than I do other work. I only benchmark immediately after a change that makes me want to do it. Upgrading also isn't expensive unless you don't do it properly. If you upgrade often, then you should sell the previous components right before or right after the replacement parts come out, if you can. That way you get top dollar for them (unless there's something wrong with them in which case you should have been more careful) and the upgrade is almost free. If you can't sell them right before or right after, then at least get close and you'll get plenty of cash for the previous part unless it was junk.Heck, I've had some occasions where an upgrade made more money than I paid for it. Most PC gamers dislike console games not because of the controls, but because of the ancient graphics quality. The games look like they're from more than five years ago yet they also have low frame rates and such. This is not appealing compared to even a $500 computer and heck, an even cheaper computer can still be several times better than a console.[/citation]

I don't want to nitpick what you said here, but some of it is agreeable. It's true that consoles are always going to be inferior to PCs; a console is practically a PC in a small form factor, and I'm pretty sure you're aware of that already. I've had my 1055T Thuban chip for 2.5 years now, and it seems like it's almost time for an upgrade, including the mobo and the 5770 that I have in there. But if I upgrade and try to sell those parts, I'm still looking at spending at least $200 after selling the old components. Of course I could just get another 5770 for $70 and put it in CrossFire, but then again, why do that when you can grab a mid-range 7000-series card that is on par with or better than the 5770 and stays future-proof? One upgrade always leads to another, unless you're one of those guys who still has a Pentium D paired with a 6990.
 

bigdog44

Honorable
Apr 6, 2012
29
0
10,580
[citation][nom]blazorthon[/nom]Moore's law has nothing to do with performance. Moore's law refers to transistor density or more specifically, the amount of transistors that can be put in a given area without sacrificing affordability. Also, the i7-3960X is undoubtedly faster than you're giving it credit for compared to the top P4 EEs from 8 years ago.[/citation]

I'm well aware of what Moore's law was as originally intended, but nowadays it's seen as a self-fulfilling prophecy in competitive manufacturing, and it's tied to the perception of processing power.
 

blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]bigdog44[/nom]Im well aware of what Moores law was as originally intended, but nowadays its seen as a self-fulfilling prophecy in competitive manufacturing and tied to the perception of processing power.[/citation]

It was always a self-fulfilling observation, and it's still not related to processing power; some people simply mistake it for being related to processing power.

[citation][nom]Console_Gamer[/nom]i don't want to nitpick what you said here but some things you said are agreeable. it's true that consoles are always going to inferior to PC's. a console is practically a PC in a small form factor, and i'm pretty sure you're aware of that already. i've had my 1055t thuban chip for 2.5 years now and it seems like it's almost about time for an upgrade including the mobo, and the 5770 that i have in there. but if i upgrade and try to sell those parts, im still looking on spending atleast 200.00 after selling the old components. of course i can just get another 5770 for 70.00 and put it in crossfire but then again why do that when u can just grab a mid range 7000 series that is on par or better than the 5770 to stay futureproof. one upgrade always leads to another, unless you're like one of those guys that still have a Pentium D combined with a 6990.[/citation]

You're looking at less of a return on the sale because those are no longer new components. I specifically said that, in order to get top dollar, they should be sold around the launch of their replacements, or a little before that launch. It's a little riskier, but if worse comes to worst, you should still be able to sell the old parts without taking much of a loss. If you don't upgrade everything often, then you take greater losses.
 

bigdog44

Honorable
Apr 6, 2012
29
0
10,580
[citation][nom]blazorthon[/nom]It was always a self-fulfilling observation. It's still not related to processing power. Some people simply mistake it for being related to processing power.You're looking at less of a return on the sell because those are not newer components. I specifically said that in order to get top dollar, they should be sold around the launch of their replacements or a little before that launch. It's a little more risky, but worst comes to worst, you should be able to buy old parts without taking a loss. If you don't upgrade everything often, then you take greater losses.[/citation]
I think my post was misunderstood. I'm not arguing with Moore's law as originally intended with regard to transistor density. My point was the processing power implicitly tied to Moore's law by the competitive electronics manufacturers, Intel included. It's like they're saying, "OK, now we have 2x the transistors; what are we gonna do with them?", and how that translates to the consumer. It's in the "tick-tock"...
 

Shin-san

Distinguished
Nov 11, 2006
169
0
18,630
[citation][nom]blazorthon[/nom]The Cell is not an architecture.
...
The architecture for Cell is not the processor itself. No amount of arguing will change that because it is a fact whether or not you dispute it.[/citation]
IBM calls it an architecture, one that IBM, Sony, and Toshiba haven't shown any interest in updating. No harping on a website will change the fact that IBM calls it one. There's the PS3 Cell and the PowerXCell, two different versions.

[citation][nom]blazorthon[/nom]If you think that the Cell can't be used for gaming, then you're wrong.[/citation]People make games on TI-83s, so that's not the issue. The CPU hasn't really proven itself to be orders of magnitude faster than the Xbox 360's. The fact that it's used in a supercomputer doesn't change the fact that there are plenty of Xbox 360 games that run better.

[citation][nom]blazorthon[/nom]The problem lies purely in how difficult it is to code for.[/citation]That's only part of it; the other part is that it's not as efficient. Any SPE overhead could really knock down the numbers.

[citation][nom]blazorthon[/nom]Vector processing can be used for most workloads that a game imposes on a CPU if developers can figure out how to implement the code for it.[/citation]That doesn't help code that runs better on a non-vector processor. This is why we still have CPUs despite GPUs being able to crunch more.

[citation][nom]blazorthon[/nom]Given how difficult it is to code for, the limited success is justified. This is Tomshardware and you should know that frame rates are very subjective and a poorly optimized platform doesn't necessarily give anywhere near the frame rates that it could give if it was coded for properly.The 360 isn't more powerful, but it is easier to code for and can be more easily optimized for because of this. You should know better than to assume that an non-fully optimized benchmark defines the potential of the hardware.[/citation]
Frame rates aren't really subjective. A game either has a higher frame rate or it doesn't. You are also blaming programmers instead of asking yourself: how much of it is the hardware? By now, Sony should have shown third parties how to use the SDK better.

It's up to Sony to show external developers how to get it running faster than the 360. Also, optimizing nowadays usually isn't about unlocking hardware. It's more often about keeping hardware from running like crap, which is a lot harder on a weaker CPU.
 

blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]Shin-san[/nom]IBM calls it an architecture, in which IBM, Sony, and Toshiba hasn't shown any interest in updating. No harping on a website will change the fact that IBM calls it one. There's the PS3 Cell and the PowerXCell, both different cersions.People make games on Ti-83s, so that's not the issue. The CPU hasn't really proven itself to be orders of magnitude over the Xbox 360's. The very fact that it's used in a supercomputer doesn't change the fact that there's plenty of Xbox 360 games that run better.That's only part of it, but the other part is that it's not as efficient. Any SPE overhead could really knock down the numbersThat doesn't help code that runs better on a non-vector PU. This is why we still have CPUs despite GPUs being able to crunch more.Frame rates aren't really subjective. It either has a higher frame rate or not. You are also blaming programmers instead of asking yourself the question: how much of it is the hardware? By now, Sony should have shown third parties how to use the SDK better. It's up to Sony to show external developers how to get it faster than the 360. Also, optimizing nowadays usually isn't unlocking hardware. It's more often used to get hardware from running like crap, which is a lot harder on a weaker CPU.[/citation]

They named the architecture after the processor built on it. The processor itself is not the architecture; the Cell processor implements the Cell architecture. The processor has substantially greater throughput, when used properly, than the Xbox 360's Xenon CPU does. However, "orders of magnitude" would probably be a huge exaggeration.

Did you ever notice how more and more CPU tasks are being made to work on GPUs? Soon enough, CPUs won't be necessary except for much more specialized tasks and legacy support. Knowing how to use the SDK better doesn't mean that they can use it fully optimally. It has been used better and improvements have been made, but it's still nowhere near perfect even after all of these years. That just shows how difficult it is to get tasks that aren't embarrassingly parallel, or at least highly parallel, to run at full speed on such a parallel processor with such an obscure and new architecture. At least with GPUs, it's one task running on many cores instead of multiple tasks running in tandem on many cores.

Basically, with the Cell, developers need to learn not to waste resources at any given time in order to get full usage and performance. This is hard when it's not a single simple operation that gets run over and over on different data, like, say, a compression or encryption program. Instead, the devs need to keep the SPEs busy with multiple tasks at once, and that is much more difficult to coordinate without wasting a lot of performance. Chances are that the Cell will never be fully used by any game. It has the raw performance and then some, but it can't easily be fully, or even well, utilized.
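Here's a loose sketch of that scheduling problem in plain Python (purely illustrative; this is not Cell or SPE code, and the job lengths are made up):

[code]
# Data parallelism vs. task parallelism, in miniature.
from concurrent.futures import ThreadPoolExecutor
import time

# GPU-style: one operation over many items. Balancing is trivial because
# every item costs roughly the same.
def kernel(x):
    return x * x

# Cell-style: dissimilar jobs of uneven length. Keeping six SPE-like
# workers all busy takes hand-tuned scheduling.
def physics():     time.sleep(0.010)
def audio():       time.sleep(0.001)
def pathfinding(): time.sleep(0.003)

with ThreadPoolExecutor(max_workers=6) as pool:
    squares = list(pool.map(kernel, range(10_000)))   # easy to keep busy
    jobs = [pool.submit(f) for f in (physics, audio, pathfinding)]
    for j in jobs:
        j.result()
    # Physics dominates here: while it runs, most of the pool sits idle
    # unless the developer splits it into smaller pieces by hand.
[/code]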

Oh, and yes, frame rates are subjective. If a game's devs somehow managed to write perfectly for the Cell, it could probably extract more than twice the performance (far more than merely twice isn't unreasonable) of the most efficient game on the PS3 right now. Whether or not the GPU could keep up would decide how much of a frame rate boost you'd actually see, but it would still probably be a huge improvement. If Skyrim could go from a fairly CPU-limited game to a much more CPU-independent game merely by being updated to use more modern, optimal code, then imagine the difference that could be made by using all of the SPEs and the PPE fully, instead of only a few of them, and those only partially.
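To show why the GPU decides how much of a CPU gain shows up on screen, here's a toy model with made-up frame times, treating the CPU and GPU as pipelined stages so the slower one sets the frame rate:

[code]
# Hypothetical numbers: a CPU-bound game gets a 2.5x CPU speedup, but the
# frame rate only rises until the GPU becomes the new bottleneck.
cpu_ms_before, cpu_ms_after = 50.0, 20.0   # CPU cost per frame
gpu_ms = 33.0                              # GPU cost per frame, unchanged

fps_before = 1000.0 / max(cpu_ms_before, gpu_ms)   # 20 fps, CPU-bound
fps_after  = 1000.0 / max(cpu_ms_after,  gpu_ms)   # ~30 fps, GPU-bound
print(fps_before, fps_after)
[/code]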

Optimizing never was unlocking hardware; it was always about using that hardware better. Whether it's done to make crap hardware usable or to make higher-end hardware even better is a pointless distinction. Besides, that isn't even true anyway. Both Nvidia and AMD keep releasing new drivers to make all of their cards faster (both companies did so yet again recently), not just the low-end cards, which actually tend to get less optimization than the high end, especially since AMD uses a different architecture for its low-end cards than for its high-end, and now its mid-range, cards.
 

Shin-san

Distinguished
Nov 11, 2006
169
0
18,630
[citation][nom]blazorthon[/nom]They named the architecture after the processor built on it. The processor itself is not the architecture. The architecture is not the Cell processor, it is the Cell architecture. [/citation]The Cell was named after the architecture. Source: IBM, Sony, Toshiba, and the IEEE.

[citation][nom]blazorthon[/nom]Knowing how to use the SDK better doesn't mean that they can use it fully optimally. It has been used better and improvements have been made, but it's still not nearly perfect even after all of these years.[/citation]So, developers not knowing how to code the chip better won't make a difference? The problem is amplified far more with the Cell. There were unoptimized Cell benchmarks, and it was competitive with a Pentium II-III.

[citation][nom]blazorthon[/nom]Chances are that the Cell will never be fully used by any game. It has the raw performance and then some, but it can't easily be fully or even well utilized.Oh and yes, frame rates are subjective. If a game that had devs who somehow managed to write perfectly for the Cell, then it would probably be able to use more than twice as much performance (far more than merely twice as much isn't unreasonable) than the most efficient game on the PS3 right now. [/citation]Perhaps. Maybe. Then again, what are the chances it'll happen before the Xbox 720 gets released?

[citation][nom]blazorthon[/nom]Whether or not the GPU could keep up would help decide how much of a frame rate boost you would get, but still, it would probably be a huge improvement. If Skyrim could go from a fairly CPU limited game to a much more CPU independent game merely from being updated to use more modern optimal code, then imagine the difference that would be made by being able to use all SPEs and the core fully instead of just a few and only partially with those few in the Cell. [/citation]Something tells me that we might not get that difference. It still has to deal with the GPU. Not only that, it has to handle feeding the graphics on top of everything else. The moment you diversify tasks is the moment you have more overhead, and that slams the Cell.

[citation][nom]blazorthon[/nom]Optimizing never was unlocking hardware. It was always being able to better use that hardware. Whether or not it was to make crap hardware usable instead of making higher end hardware even better is a worthless sentiment.[/citation]In the Cell's case, making it perform better than the Nintendo GameCube.
 

blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]Shin-san[/nom]The Cell was named after the architecture. Source: IBM, Sony, Toshiba, and the IEEESo, developers not knowing how to code the chip better won't make a difference? The problem is far more amplified with the Cell. There were unoptimized Cell benchmarks, and it was competitive with a Pentium II-III. Perhaps. Maybe. Then again, what are the chances it'll happen before the Xbox 720 gets released? Something tells me that we might not get that difference. It still has to deal with the GPU. Not only that, it has to deal with that plus feeding the graphics. The moment you diversify tasks, the moment you have more overhead, which slams the CellIn the Cell's case, making it perform better than the Nintendo Gamecube[/citation]

Alright, so the processor is named after the architecture. It still isn't the architecture, and you should have been more specific at the time, but my bad regardless. I never said that better developers don't make a difference; I said that they still probably won't get fully optimal performance out of the PS3's Cell processor. I don't think that there would be such debilitating overhead as you claim. Also, the PS3 performs far better than the GameCube even in its older games, so beating the GameCube is not an issue. Not even the Wii can come close to the PS3 in performance, and it's far faster than the GameCube. Heck, the PS2 performs better than the GameCube, and the PS2 most certainly does not perform better than the PS3.
 

Shin-san

Distinguished
Nov 11, 2006
169
0
18,630
[citation][nom]blazorthon[/nom]Alright, so the processor is named after the architecture. It still isn't the architecture and you should have been more specific at the time, but my bad regardless. I never said that better developers don't make a difference. I said that they still probably won't get fully optimal performance out of the PS3's Cell processor. I don't think that there would be such debilitating overhead as you claim. Also, the PS3 performs far better than the gamecube even with the older games, so beating the Gamecube is not an issue. Not even the Wii can come close in performance and it's far faster than the game cube. Heck, the PS2 performs better than the Gamecube and the PS2 most certainly does not perform better than the PS3.[/citation]It's not about running faster than the GameCube, but about running at "not GameCube" levels. There are those who say that the GC is faster than the PS2, but since the PS2 was first, the GameCube wasn't pushed to its fullest potential. Sound familiar? In fact, a lot of the promises that Sony made for the PS2 weren't reached. We've already seen more of the same.

Also, the PS3 doesn't have to beat just the Xbox 360. It's going to have to contend with the Wii U and the Xbox 720/Durango/8/whatever it will be called. The Wii U comes out this year, and it does seem weaker than expected; then again, developers might have been stuck with early dev kits. The Xbox whatever is expected in 2013-2014.

Let's say they "unleash the power of the Cell!" What graphics card level will the PS3 be pushed to? Will it stay at the RSX level? If it ends up like running two RSXs, then that brings it up to around a GeForce 8800 GT.
 

blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]Shin-san[/nom]It's not about running faster than the Gamecube, but running at "not Gamecube" levels. There's those that say that the GC is faster than the PS2, but since the PS2 was first, the Gamecube wasn't put to its fullest potential. Sound familiar? In fact, a lot of the promises that Sony made for the PS2 weren't reached. We've already have seen more of the same.Also, the PS3 doesn't have to beat just the Xbox 360. It's going to have to get ready to contend with the Wii-U and the Xbox 720/Durango/8/whatever it will be called. The Wii-U comes out this year, and it does seem weaker than expected. Then again, they might have been stuck with early dev kits. The Xbox whatever is expected 2013-2014. Let's say they "Unleash the power of the Cell!" What graphics card level will the PS3 be pushed to? Will it be at the RSX level? If it will make it like running two RSXs, then that brings it up to around a Geforce 8800GT.[/citation]

Why does the nearly six-year-old PS3 need to compete with new consoles in performance? That's what the PS4 (or whatever it gets called) should be for... Also, the PS3 most certainly is not so slow as to run at "GameCube performance levels", even in the first few games with the inferior level of optimization that they had. It was still far faster than the Wii (maybe twice as fast, maybe more or less than that; I'd have to check), which is itself something like twice as fast as the GameCube.
 

Shin-san

Distinguished
Nov 11, 2006
169
0
18,630
[citation][nom]blazorthon[/nom]Why does the nearly decade old PS3 need to compete with new consoles in performance? That's what the PS4 (or whatever it get's called) should be for... Also, the PS3 most certainly is not so slow as to run at "Gamecube performance levels" even with the first few games and the inferior level of optimization that they had. It was still far faster than the Wii (maybe twice as fast, maybe more or less than twice as fast, I would have to check that one) which is something like twice as fast as the Gamecube.[/citation]That's because they fought to keep it from performing at that level. Also, the next console generation is expected to start this year, within five months. If they are going to unleash anything, it has to be soon; otherwise it'll get to the point where it would be like Atari bragging that it got better performance out of the 7800 than the NES. Impressive, yes, but not that useful.
 

blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]Shin-san[/nom]That's because they fought to keep it from performing at the level. Also, the next console generation is expected to start this year within 5 months. If they are going to unleash anything, it has to be soon, otherwise it'll get to the point where it would be like Atari bragging that they got better performance out of the 7800 over the NES. Impressive, yes, but not that useful.[/citation]

If the next PlayStation is better than the next Xbox, then I wouldn't call that useless, unless the extra performance can't actually be utilized any better than the next Xbox's can (kinda like the PS3 versus the Xbox 360). If it's even just 10% faster on average in games, then that extra performance can be put to use in some way.

Do they need to do it soon? That would be very important, but "need" might be too strong a word. Besides, who's to say that Sony won't release anything?
 

Shin-san

Distinguished
Nov 11, 2006
169
0
18,630
[citation][nom]blazorthon[/nom]If the next PS is better than the next Xbox, then I wouldn't call it not useful unless the extra performance can't be utilized better than the next Xbox can (kinda like the PS3 versus the Xbox 360). If it's even just 10% faster on average in games, then that extra performance can be put to use in some way.Do they need to do it soon? That would be very important, but need might be too strong of a word. Besides that, who's to prove that Sony won't release anything?[/citation]Oh yes. With how long graphics-pushing games take to make, most graphics-heavy games started today will be released in 2014-2015.
 

freggo

Distinguished
Nov 22, 2008
778
0
18,930
[citation][nom]fb39ca4[/nom]Eyefinity is AMD's multi monitor gaming feature, Comcast has Xfinity.[/citation]

You are right, of course. So, make that TWO lawsuits coming ;-)
 

blazorthon

Distinguished
Sep 24, 2010
761
0
18,960
[citation][nom]freggo[/nom]You are right of course. So, make that TWO lawsuits coming ;-)[/citation]

I doubt that there will be legal action taken against MS over the name :)
 