Sony Working on Multi-Core Design for PS4?

Status
Not open for further replies.
[citation][nom]sonnywoj[/nom]yeah, its to bad the world ends in 2012, wont even get to see ps4....lol[/citation]
lmao, that's pretty good. I'll be offering anyone pennies for all their good worldly possessions since they won't be needing them after 2012.
 
[citation][nom]stevo777[/nom]I have nothing against PC gaming. But, the advantage of PC gaming vs. PS3 is being overstated. If anyone pays attention to news, many entities are building supercomputers with chained PS3's as they are very cost effective and have high throughput. Also, in cases like folding@home, the PS3's are faster than anything else. In case people don't know what throughput is, a system is only as fast as it's slowest part. PC's have slow parts--especially, the very antiquated x86 lineage. Because of legacy concerns, the PC has many bottlenecks that do not exist in the PS3. On the flip side, the PS3 does lack RAM amount, yet that RAM is fast. With the Cell, legacy was not a concern, so innovation was possible. Intel seems desperate to hang onto the x86 strategy, so PC's will still have the same bottlenecks going forward. True, you can keep boosting the GPU tech, but, there is no way around the rest of the system. Consoles on the other hand don't have to worry about such problems, so a system can be designed from the ground up that is very fast. RAM should continue to be the weak point of the consoles, but they can still be very robust. I think some heads will be turned this year when Gran Turismo HD comes out (GT5). This game should showcase some of the true potential of the PS3.[/citation]
I know you love the PS3, but many developers have stated that they have had trouble coding games to get them up to the same framerate as the 360. Neither console can even approach the framerates available on a decent gaming computer (not to mention resolution and filters/effects). Also, your information on Folding@home is wrong: the ATI and Nvidia GPU clients put out far more computational power than any others, with Nvidia far ahead due to CUDA. If/when we see a DirectCompute or OpenCL client, the ATI cards will be at or above the Nvidia ones, with both far ahead of all the others.
 
ugh, Sony SHOULD use Larrabee, not as a replacement for the GPU, but as a replacement for the Cell!!

oh, and let's not forget the little downside of the Xbox's coder-friendly multi-core arch: it's SLOWER! it's NOT AS POWERFUL!! As Sony proved with the PS3, and Intel has proved with Larrabee, it is clear that the common GPU is here to stay for a long while! rendering is hard work, the hardest, and the only thing that can do it fast ATM is a dedicated GPU.

what is NOT here to stay is the idea of a big general-purpose multi-core CPU like the one in the Xbox! truly general stuff that needs that arch is a fairly light workload. the Cell was absolutely the right idea: one small, cheap general processor to manage everything, and lots of more specialised worker cores to do the grunt work. this is the idea behind GPGPU and should be the idea behind Larrabee, not graphics.
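
For what it's worth, that "one cheap manager core plus lots of worker cores" idea is basically a fork/join pattern. Here's a minimal sketch in plain Python (hypothetical names, threads standing in for cores — nothing to do with the actual Cell SDK): a general-purpose control routine splits the grunt work into chunks, farms them out to a worker pool, and gathers the results.

```python
from concurrent.futures import ThreadPoolExecutor

def worker_kernel(chunk):
    # Stand-in for an SPE-style job: a tight numeric loop over one chunk.
    return sum(x * x for x in chunk)

def manager(data, n_workers=6, chunk_size=1024):
    # Stand-in for the PPE-style control core: split, dispatch, gather.
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(worker_kernel, chunks))

if __name__ == "__main__":
    # On real hardware each worker would run on its own core with local memory.
    print(manager(list(range(10000))))
```

Obviously the hard part on the Cell is the DMA and local-store management, which a thread pool hides from you — that's exactly the "coder-friendly vs. powerful" trade-off being argued here.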

the Cell was not the problem with the PS3, the graphics card was: it wasn't powerful enough, and having to use the Cell to help it along is what makes the PS3 so hard to code for.

my perfect next-gen console would be just a Larrabee (or other Cell-like thing) for processing and some powerful GPU for graphics. or a cheap single-core processor for management and a really powerful GPGPU-enabled GPU for graphics and other intensive work.

I think the former is actually more likely (even if not with a Larrabee), because GPGPU is better suited to PCs, where sometimes you need graphics power and sometimes you need that power for other things. in a games console, you need both at the same time.
 
[citation][nom]commandersozo[/nom]I bet it would play Crysis!I'm sorry, I swear! I won't say it again, I... holy crap, who hired ninjas! No! No! In all seriousness now: If a more typical multicore processor woul make the system easier to understand for developers, then I say go for it. Yeah, Sony can justify it by saying "It's all part of the plan" and "It extends the lifecycle" but if developers don't develop good games, what's the point?[/citation]


How about starting off by using a multi-core solution to begin with? the Xbox's 3-core system was interesting, but since it's the only multi-core console, I'm not even sure how much it contributed over, say, another multi-core design, since there isn't one to compare against. Not sure it's even necessary in a console. Though it would add a lot of cost to a console, maybe DX11 and multi-GPU could be viable? at least it would push coding for multi-GPU and DX11 games.

I don't actually care what they do with the console, as long as it makes these developers make games for my PC with updated coding for all the new and not-so-new goodies that haven't really been optimised for yet.
 
I say no to the Intel chip anyway; they are way too expensive, and Intel won't make the money they need if they keep it in house or co-develop with IBM. That is why Microsoft got screwed with the original Xbox: they were using the expensive PIII chips from Intel and had to pay out the ass for them, and thus were unable to reduce costs over time.

I think that unifying the graphics with the main processor will also be beneficial in that it can share the same memory source. I hope they also put more RAM in the new consoles; 256MB or even 512MB doesn't seem like much these days, graphically speaking.
 