Crysis 2 is a 'Solved Challenge' for Consoles


chickenhoagie

Distinguished
Feb 12, 2010
311
0
18,930
Who cares if it comes out for consoles? If people want to play the storyline, then so be it. Not everyone can afford a gaming computer, so they play all their games on consoles. Stop bashing console users just because you (select PC gamers) like to spend thousands of dollars on a PC just to play a freaking video game. This is coming from someone who has a 360 AND a gaming desktop plus a gaming laptop, so call me biased, but I'm not. I think the graphics of this game will look perfectly fine on the 360 and of course flawless on the PC - that's how it always works, people. Nothing new.
 

Userremoved

Distinguished
Feb 27, 2010
157
0
18,630
[citation][nom]chickenhoagie[/nom]who cares if it comes out for consoles? if people want to play the storyline than so be it. not everyone can afford a gaming computer, so they like to play all their games on consoles. stop bashing on consoler users just because you (select PC gamers) like to spend thousands of dollars on a PC just to play a freaking video game. This is coming from someone who has a 360 AND has a gaming desktop + a gaming laptop, so call me biased, but i'm not. I think graphics of this game will look perfectly fine on the 360 and of course flawless on the PC - this is how it always works people..nothing new[/citation]
My old business Compaq from 2006 with an upgraded card (9500gt) and a single core can play the game.
 

Startingline13

Distinguished
Oct 17, 2008
38
0
18,580
Actually, I had the opportunity to use the program at GDC this year on a number of occasions, along with a few training sessions, and quite frankly, it looks amazing. With four screens running at once, the developer has the ability to drop in and out of the development kit on the fly on either of the consoles, as well as the PC, while another individual is making changes to the level.

The visual fidelity was obviously superior on the PC; however, it still looked incredible on the consoles. It's the same engine, just with some graphical fidelity removed (e.g. draw distance, texture resolutions, AA on PS3).

Overall, you won't be disappointed in what you see at all. Believe me.
 

Regulas

Distinguished
May 11, 2008
520
0
18,930
[citation][nom]soldier37[/nom]Unless this is a port from consoles, of course the console versions will look worse! Their generations behind current PC hardware! I have a quad core at 4ghz,8Gb DDR3,a DX11 5870 and 2560 x 1600 res on 30 inches and the console games run at 720p no AA or AF even on the best HDTV it wont compare. And I have a PS3 also on a Samsung 46" LED HDTV 240htz but use it only for media/Blu Ray.[/citation]
You must be overclocked to hit 4GHz? I leave my Q9650 at stock speed (3GHz) and it does fine, as does my GTX 285. But I am only pushing 1920 x 1200 on a 24" monitor; your beast has a lot more pixels. I do have 8GB of DDR2 1066 RAM (got a great deal at Newegg), but not all of it is being used right now on XP 32-bit. Looking ahead, though, I know that one day, probably soon, I will buy an OEM copy of Windows 7 64-bit and another HD and dual-boot my gaming rig.
PS: All my real-life computing (banking, online shopping, etc.) is done on my Ubuntu laptop, which came with 7 but got wiped for Linux.
 

Regulas

Distinguished
May 11, 2008
520
0
18,930
[citation][nom]billj214[/nom]All I can say is with a Crysis Engine running on the Xbox you can expect to see a lot of "Ring of Death" consoles due to overheating![/citation]
I love it and it may be true.
 

Regulas

Distinguished
May 11, 2008
520
0
18,930
[citation][nom]soldier37[/nom]Unless this is a port from consoles, of course the console versions will look worse! Their generations behind current PC hardware! I have a quad core at 4ghz,8Gb DDR3,a DX11 5870 and 2560 x 1600 res on 30 inches and the console games run at 720p no AA or AF even on the best HDTV it wont compare. And I have a PS3 also on a Samsung 46" LED HDTV 240htz but use it only for media/Blu Ray.[/citation]
You are missing out on a couple of really fun games if you only use your PS3 for Blu-ray movies. Ratchet & Clank: Tools of Destruction is great, and I hear the new one is even better. Don't waste your money on FF XIII though; it sucks.
 
G

Guest

Guest
Let's understand something. CryEngine 2 was so poorly written that a game developed over two years ago still cannot function correctly on the latest and greatest hardware. The problem isn't the hardware; the problem is the engine itself. Also, looking at the game's imagery itself, it's not that great anyway. What I mean is, when I bump everything to max visual settings in Crysis, my jaw doesn't drop. My goodness, the graphics workload, with 3D, in the new Call of Duty game is far more complex and well written. That's why it looks so good and performs so well. I love how, whenever they come out with new hardware that's got twice the performance and they run benchmarks on it, they insist on benchmarking Crysis. Then they wonder why it still doesn't perform well. It doesn't matter what you are running, CryEngine 2 still sucks. It will always suck. Please stop running benchmarks against it. It's like running benchmarks against a broken app... Oh wait, that's exactly what it is... lol
 

bv90andy

Distinguished
Apr 2, 2009
391
0
18,930
I played Crysis on a PC with 1.2 GB of DDR1, an AMD 2200 and a 7300 GT and was impressed by the results. With a little OC I was able to play at 30-40 fps on medium-to-high settings. No AA or other post-processing, of course.
 

alders

Distinguished
Mar 1, 2010
6
0
18,510
[citation][nom]TA152H[/nom]If that's really your opinion, you've identified yourself as a complete idiot. But, at least, you're open about it.People with a brain prefer the mental stimulation of an intriguing game, not "eye-candy", to use your gay term. It can be pretty, but ultimately, if it's not stimulating in any other way, it's just not going to keep people interested. Ms. Pacman was played a lot more than any of the much more visually advanced games made now (although I never liked it), so game play is pretty important. [/citation]

Yeah, but back when Ms. Pac-Man came out (1981) everything was 2D, so its graphics were basically at the height of what was possible; you can't compare.

To me graphics are important, but gameplay is equally important. For example, Far Cry 2 had good graphics but was, in my opinion, the worst game I have ever played, so I didn't play it. I'm yet to come across a game with good gameplay and shit graphics that I've stopped playing because the graphics are bad.

Crysis, for me, is a brilliant game. I was taken in by the videos and trailers of the visuals, and when I played it, I loved it even more. I like the sandbox idea; I like that you can (for the most part) attack from wherever you want, doing whatever you want.
 

alders

Distinguished
Mar 1, 2010
6
0
18,510
[citation][nom]techguy378[/nom]At the time of its release the PS3 had the most advanced graphics of any desktop computer or gaming console. The PS3 has more than enough horse power to play Crysis and Crysis 2 at maximum detail and 1920x1080 resolution.[/citation]

Yeah, but they don't play at that resolution... Also, the PS3 has 256MB of RAM, while a standard computer today comes with at least 1024MB (4x more memory) and gaming computers at least 2048MB (8x more). That's a large reason why consoles don't have things such as AA: not enough memory to cope.

I know there won't be a MASSIVE difference between DX9 and DX11, but PS3 = DX9 and PC = DX11; there's no possible way a console can do DX11. And yes, Crysis is DX11.

That being said, I've seen videos of the console version, and it looks pretty darn good. Whether the Xbox 360 will explode, I don't know, and whether the PS3 Slim will burst into flames is also unknown.
 

mlopinto2k1

Distinguished
Apr 25, 2006
817
0
18,930
[citation][nom]afforess[/nom]In English: "We sacrificed graphics on the consoles so console fanboys can pay $10 dollars more for the same game with worse graphics than the PC. Brilliant!"[/citation]
It's people like you that make this world take a step back and say, we're doing something wrong.
 

mlopinto2k1

Distinguished
Apr 25, 2006
817
0
18,930
[citation][nom]bv90andy[/nom]I played Crysis on a PC with 1.2 GB of DDR1, AMD 2200 and a 7300 GT and was impressed by the results. With a little OC I was able to play at 30-40 fps on medium to high settings. no AA or other post processing of course.[/citation]What, at 800x600? I have a Q6600 with 6GB of RAM and an overclocked GTX 260 and was barely able to play the game on High. "Barely able" meaning to actually play THROUGH the game without wanting to rip my face off.
 

mlopinto2k1

Distinguished
Apr 25, 2006
817
0
18,930
[citation][nom]techguy378[/nom]At the time of its release the PS3 had the most advanced graphics of any desktop computer or gaming console. The PS3 has more than enough horse power to play Crysis and Crysis 2 at maximum detail and 1920x1080 resolution.[/citation]You're being sarcastic... right?
 

Userremoved

Distinguished
Feb 27, 2010
157
0
18,630
[citation][nom]mlopinto2k1[/nom]What, at 800x600? I have a Q6600 with 6GB's of ram and a GTX260 o/c'ed and was barely able to play the game on High. "barely able" meaning to actually play THROUGH the game without me wanting to rip my face off.[/citation]
Have you tried a lower resolution?
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]shin0bi272[/nom]so if you wanted any eye candy above what dx9 provides ... sorry.[/citation]
Actually, even though the PS3 and 360 DO have DX9 capability, they still won't match up to what you'd see on a PC at a similar level, simply because the consoles don't have the power to drive it all. They're cut-down GPUs, made so that they could be cheap enough to manufacture by the millions and sell at a console-level price.

[citation][nom]TA152H[/nom]The only issue would be if they started on the consoles and did a simple port, but it sounds like they have separate development teams for each platform, so presently is probably not a concern.[/citation]
The problem here is that countless other developers have given the same song-and-dance before, and in the end it usually is not what they claim; they in fact wind up making a console game, and more or less porting it to the PC. (a prominent example of this would be Bethesda, and Fallout 3) With many developers, they'll later tack on additional excuses, such as claiming that "they can't make money on PC due to piracy, and consoles are pirate-proof."

So you do have to grant a bit to doubters; it's quite possible that the "separate development" would rest largely in just the interface department, and the other groups necessary to port the game as it's worked on; after all, it's MUCH cheaper to port a game that's already made than to basically make the same game twice in parallel. And virtually all studios are driven by greed.

[citation][nom]cwolf78[/nom]I'm not sure where you're getting your facts, but the RSX is not "simply a cut-rate G71." It's not the same exact chip as some minor modifications were made at Sony's request - not to mention the RSX is designed to make use of the significantly higher bus bandwidth available on the PS3 as compared to the first-gen PCIe bus of that time.[/citation]
Actually, bandwidth differences are a mixed bag; the FlexIO interface could arguably be faster than PCIe 1.0, albeit the figures given do not account for overhead and are uni-directional. It's also quite arguable that such amounts are overkill. (As with such cards, it was shown that, for instance, running SLI with two 8-lane cards instead of two 16-lane cards made no performance difference.) It could be argued that the PS3 might need this bandwidth, though, as many of the games put on it wind up overflowing its 256MB buffer and having to spill into some of the Cell's XDR; the interface means it suffers a far smaller penalty than a PC would in a comparable situation, but that doesn't change the fact that it's still a situation one tries to AVOID. (It still hurts performance, even if not as much.)

The downside is that, memory-wise, the RSX has half the memory interface of the G71: 128-bit instead of 256-bit. (A GPU always has exactly one memory chip for every 32 bits of interface.) This basically halves its memory bandwidth, to 22.4GB/sec, which puts it on a par with the G73-based GeForce 7600 GT rather than the 7800 cards. Further underscoring this limitation, there are only 8 ROPs on the RSX, compared to the G71's 16.
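To put rough numbers behind that halving, here's a quick back-of-the-envelope sketch; the data-rate figures are just the commonly quoted ones, so treat them as assumptions:

[code]
# Peak theoretical memory bandwidth = bus width (bits) / 8 * effective data rate.
# Data-rate figures below are the commonly quoted ones, i.e. assumptions.

def mem_bandwidth_gbps(bus_width_bits, data_rate_mtps):
    """Peak bandwidth in GB/s from bus width and effective transfer rate (MT/s)."""
    return bus_width_bits / 8 * data_rate_mtps / 1000

print(mem_bandwidth_gbps(128, 1400))  # RSX: 128-bit GDDR3 @ ~1400 MT/s -> 22.4 GB/s
print(mem_bandwidth_gbps(256, 1600))  # 7900GTX: 256-bit GDDR3 @ ~1600 MT/s -> 51.2 GB/s
[/code]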

[citation][nom]cwolf78[/nom]But even if it were exactly the same as a PC-variant G71, those have no problems running HDR and MSAA simultaneously. Take FF13 for example. The mode it runs at on the PS3 is 1280x720 at 2X MSAA. And that game does make use of HDR lighting as well. (http://www.eurogamer.net/articles/ [...] off?page=1) I think you might be referring to the issues the Source engine had on Nvidia cards when they first added HDR support (HL2: Lost Coast). When that first came out, only ATI cards could run MSAA with HDR, whereas only one or the other could be enabled with an Nvidia product. But they eventually got MSAA+HDR working on Nvidia within a few patches.[/citation]
Lost Coast was a unique case, actually; it used an "older" HDR method that wasn't DirectX 9.0c. This allowed it to run on older Radeon 9- and X-series cards, which only had DirectX 9 and DirectX 9.0b support. That older algorithm actually ran worse (taking both more shader power and more memory bandwidth), though, and was only feasible then because the 9- and X-series cards (especially the X800/X850 ones) had unusually strong DX9 shader capabilities, and at the time Valve was REALLY favoring ATi over nVidia. This method also arguably looked worse than the more "modern" DX 9.0c HDR, too.

As for other games, the AA+HDR problem was never resolved on the G7x-based cards. For the G7x GPUs (including the RSX) the limitation was in hardware, not software, and at the most basic level: modern HDR algorithms (which are what the consoles use) require blending 16-bit floating-point color channels; however, the G7x cannot stack this blend. If it's used for HDR, it cannot be used for AA, which ALSO requires a blend pass to combine each sub-pixel's samples. So technically, while one CAN force the AA algorithm on such a GPU, one won't see any effect: because no blending step takes place, you'd only see ONE of the 2-4 samples that were taken for the pixel. (Careful inspection might show a very minor "warping" effect, since AA sub-samples are taken off-center; normally the pixel gets samples from all portions, rather than just one.)
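If it helps to picture that blend (resolve) step, here's a toy sketch; it's purely illustrative and has nothing to do with how the G7x is actually programmed:

[code]
# Toy MSAA resolve: each pixel stores several sub-pixel samples, and the blend
# pass averages them into the final colour. Skip the blend and you're left
# displaying a single off-centre sample, i.e. no visible anti-aliasing.

def resolve_pixel(samples, blend=True):
    if blend:
        return sum(samples) / len(samples)  # the blend pass that AA depends on
    return samples[0]                       # no blend available: one sample only

edge_samples = [1.0, 1.0, 0.0, 0.0]              # 4x MSAA straddling a polygon edge
print(resolve_pixel(edge_samples))               # 0.5 -> smoothed edge
print(resolve_pixel(edge_samples, blend=False))  # 1.0 -> hard edge, slightly offset
[/code]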

As far as FF XIII goes, you've corrected me on one thing, but it wasn't HDR; I look at those videos and I can tell you, I don't see any HDR in the game; I see bloom, and lots of it. The lack of HDR in the comparison video (on EITHER side) is glaring; the camera makes a lot of transitions between areas with different light levels, where HDR WOULD'VE adjusted the exposure level to yield the whole "dynamic" effect. So apparently, Square Enix opted for AA over HDR this time; I was mistaken to say "always," then. However, this still underscores my point: the PS3 cannot do AA+HDR at the same time.

[citation][nom]cwolf78[/nom]That being said, I've been impressed with what they've been able to do with the PS3 as of late - using the additional Cell cores to add graphical features and effects and quality of those effects far beyond what the RSX alone is capable of rendering (such as in Uncharted 2). Saying that the PS3 is limited to DX9 level effects is not entirely correct, especially if Crytek is stating that they are making full use of all the hardware on each respective platform.[/citation]
To be honest, I've been a bit wary of some claims about using "all the PS3's power." Granted, using the Cell, one could use software to produce effects outside the scope of what the GPU could handle; theoretically any effect could be produced. (This is how effects were done on PCs before the broad standardization of video cards, and in fact how graphics were originally done on PCs, back before the GPU ever existed.) However, that still comes with a severe caveat: it'd take a lot of CPU processing power because, unlike the GPU, its circuitry isn't specifically designed for that sort of thing. Similarly, while the Cell has plenty of extra math power to spare due to its design, it is a tad short on instructional capability; the SPEs are, in effect, "fast but dumb": they require the PPE to instruct them; otherwise they sit idle.

This makes them worthless for running the main game core, the primary task of the CPU; this is why they typically go largely unused in games. However, if a big SIMD (single instruction, multiple data) task comes up, it will be a big enough load with few enough instructions that the PPE can keep the SPEs occupied for a while, leaving it time to do its own tasks after instructing them. This happens constantly when the chip is handling media, such as a Blu-ray movie; most instructions entail thousands, if not millions, of math operations before the next instruction comes up. In that case it is in fact the PPE that sits idle most of the time, with only its comparatively weak math units (weaker than a single SPE's) being used. Hence, the "hybrid" design of the Cell was aimed not so much at keeping the entire chip busy as at letting it act like whichever of two chips suited the task better, like a square-shaped block that could morph into a triangle to fit whichever hole it needed to go through.
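A crude way to see why batch size is what matters here (all numbers are invented purely for illustration):

[code]
# Crude model of the PPE/SPE split: the PPE pays a fixed dispatch cost per batch
# of SIMD work, the SPEs then chew through the math. Big batches (media decode,
# full-screen effects) leave the PPE nearly idle; tiny batches keep it busy
# doing nothing but dispatching. All timings are made up for illustration.

def ppe_busy_fraction(dispatch_us, batch_math_us):
    """Fraction of each batch's wall time the PPE spends just feeding the SPEs."""
    return dispatch_us / (dispatch_us + batch_math_us)

print(ppe_busy_fraction(5.0, 5000.0))  # big SIMD batch: PPE ~0.1% occupied
print(ppe_busy_fraction(5.0, 10.0))    # tiny batches: PPE ~33% occupied
[/code]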

Of course, it still is POSSIBLE to try to get it all used, but it's a dicey proposition. Some graphical side-effects are a safer bet than other attempts, because they tend to lean more toward SIMD than traditional instruction-heavy core functions. Though since the GPU can already handle all of the simpler graphics functions without breaking a sweat, that limits the functions you'd want to try to the more complex ones. I'm definitely not saying it can't be done (it obviously has been), but it is certainly something to be cautious with: once you factor in all the read/write operations to pull and replace the data between the CPU side and the GPU side, and then process the pre-effect frame and work over it, you could wind up running out of PPE power very fast, slowing the whole game down to the point of being unplayable. My estimate is that you could probably get up to two, maybe three full-screen effects out of it, depending on complexity; a nice visual boost, but perhaps equivalent to only a 20-30% improvement over the base GPU power. (Plus the benefit of circumventing the GPU's own hardware limitations.)

[citation][nom]techguy378[/nom]At the time of its release the PS3 had the most advanced graphics of any desktop computer or gaming console.[/citation]
Actually... it was outdated before it even came out. Re-read everything I've said about it being a slightly cut-down G71, which is the widely accepted description of the core. That means the higher-end, full-blown G71 cards (such as the 7900GTX) that were out a few months before the console had quite a bit more core GPU power, as well as VASTLY better memory bandwidth, since they had full 256-bit memory interfaces compared to the gimped 128-bit one on the RSX. (And that's to say nothing of the Radeon X19x0 series cards, which were beyond even that.)
 

falchard

Distinguished
Jun 13, 2008
421
0
18,930
It might be popular to port a PC shooter to the consoles instead of the other way around. PC shooters tend to be in a different league than ones designed specifically for consoles.
 

nottheking

Distinguished
Jan 5, 2006
311
0
18,930
[citation][nom]falchard[/nom]Might be popular to port a PC shooter to the consoles instead of the other way around. PC shooters tend to be on a different league then ones designed specifically for consoles.[/citation]
Indeed, I personally favor this approach, which is also why I feel EVERYONE would be better off if developers did not strive for day-one console releases. Instead, release the PC version first, then port it to the consoles a few months later. The PC version won't run poorly, and the console version will get something it can't get otherwise: forced creativity. Because they'd be trying to shoehorn the PC version's features into the console, developers would be FAR more likely to find ways of pushing the console's envelope. The result: console games that look better than they otherwise would.

This approach works; just look back at what LucasArts did when they ported most of their PC games to console. The crowning achievement there was managing to squeeze virtually the entirety of Indiana Jones and the Infernal Machine onto the Nintendo 64. They wound up rewriting the console's entire microcode, inventing a few new texture formats and compression algorithms, and doing other things in the process, but demonstrated capabilities far beyond what anyone thought a console of the day could do. I just wish I could see this sort of thing today; the closest thing I've seen to this kind of "pushing the envelope" was The Conduit on the Wii, where they essentially tried to shoehorn a 360 game onto the Wii (just skipping the original 360 development).
 

badaxe2

Distinguished
Aug 27, 2008
180
0
18,630
I wonder how much money they spent trying to get their new engine to run on consoles.....probably enough to be worse off than if they released Crysis 2 as an optimized PC exclusive.
 
G

Guest

Guest
I wonder why they don't use Shiny Entertainment's approach from their game 'Sacrifice': increasing visual quality depending on the framerate.
As long as you can set your desired framerate or quality level, the engine can choose the variables and display the game according to your hardware.
In the past I set that game to 25-30 fps, and with a hardware upgrade came a visual quality upgrade!
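Something along these lines would be a minimal sketch of that idea; the names and thresholds are hypothetical, not taken from Sacrifice's actual engine:

[code]
# Frame-rate-driven quality: pick a target fps, then nudge one global quality
# scalar up or down each frame based on the measured frame time. The engine
# would derive draw distance, LOD bias, particle counts, etc. from this scalar.
# Names and tuning values are hypothetical.

TARGET_FPS = 30.0
STEP = 0.01
quality = 1.0  # 0.0 = lowest detail, 1.0 = highest

def update_quality(frame_time_ms):
    global quality
    actual_fps = 1000.0 / frame_time_ms
    if actual_fps < TARGET_FPS:              # too slow: shed detail
        quality = max(0.0, quality - STEP)
    elif actual_fps > TARGET_FPS * 1.1:      # comfortably fast: restore detail
        quality = min(1.0, quality + STEP)
    return quality

print(update_quality(40.0))  # 40 ms frame (25 fps) -> quality drops to 0.99
print(update_quality(25.0))  # 25 ms frame (40 fps) -> quality climbs back to 1.0
[/code]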
 