Carmack: Gaming Hardware Reaching Limits


JAYDEEJOHN

So, it's the game devs. You listening, Mr. Game Dev Carmack?
We've got the hardware ready; where are the games? I call BS. SUUUUUUURE it's the HW. Using SLI/CF we see scaling at 90% at the higher resolutions, and we have Lucid Hydra coming out. Each gen, the CPU "keeps up" because each new generation of GPUs has to drive a higher resolution.
Make better games, make our cards stretch their legs, and turn off the BS tap, because it ain't gonna flow here.
 

frost7500

I find it funny how many people here are bashing Carmack. Some of you probably don't even realize he created the FPS genre with Wolfenstein and then Doom. Give the man some credit. When he says we have hit a wall in terms of hardware, ask yourself this question: just who exactly is willing to create an engine that would take advantage of today's current hardware? Better yet, ask yourself this: who is willing to create an engine to take advantage of tomorrow's hardware? Answer: nobody. In other words, we have hit the wall.
 

LePhuronn

[citation][nom]frost7500[/nom]Who is willing to create an engine to take advantage of tomorrow's hardware? Answer: nobody. In other words, we have hit the wall.[/citation]

id did with Tech 4. Even the GeForce 6800 Ultra of the time couldn't handle the engine on Ultra quality (500MB of uncompressed textures and all that), and it took a few card generations to get there. And so did Crytek with CryEngine 2: even now, what does it take to run Crysis at full whack?

But I'd have to agree with jaydeejohn here: if Carmack is correct in saying we're hitting a wall, then that's the perfect opportunity to stop making new engines over and over again and start doing something with the kit you have. How about some GAMES instead of glorified tech demos? If the hardware means you've hit the technical limit, then put your time into what you're supposed to be doing and make some games.
 

amnotanoobie

He moves on to the prospect of cloud computing, where gamers won't need to worry about the hardware limitations within the home, but rather the capabilities of hardware mounted somewhere else.

I find it funny that he suggests cloud computing when consistently performing, low-latency, high-bandwidth, and cheap connections aren't really available to everyone. Saying they've hit the wall on consoles might be true, but the advancements on the PC are leaps beyond what the consoles can offer today.

[citation][nom]frost7500[/nom]Just who exactly is willing to create an engine that would take advantage of today's current hardware? Better yet, ask yourself this: who is willing to create an engine to take advantage of tomorrow's hardware? Answer: nobody. In other words, we have hit the wall.[/citation]

I think you're forgetting about Oblivion and Crysis; they pushed the hardware available at the time to its limits. The problem with Carmack's statement is that he didn't clarify that he was referring to consoles, as some developers are already at the X360's limits. On PCs, developers have around 1.5x to 2x more processing and storage headroom.

I have yet to see a game that properly and fully utilizes a quad-core processor and pushes a GTX 275 or HD 4890.
 

mikeyman2171

I think a wall was actually hit, but it's not a hardware wall, because innovation will continue to make better, faster, cheaper hardware that wasn't available before. I think the wall that was hit is the creativity wall. A lot of developers have become accustomed to building the same game they made years back and just making it prettier every other year. The problem is that their core audience got tired of the same old thing, and when sales started to suffer, most of them decided to blame the hardware rather than go back to the drawing board and create something new that we would all want to play.

I do not agree with the earlier statement that it's the consoles' fault, because the console market is a completely different market from the PC gaming market; always has been, always will be. Again, the fault lies with the developer. It's cheaper to develop for a "locked down" hardware set: you only have to develop for one GPU type and spec one audio codec, unless you decide to develop for all consoles, but even then your game doesn't have to take up the same amount of space as it would on PC, because each console gets its own version on its own disc. To do that on the PC would be expensive; imagine roughly 30 different versions of Crysis, one for each video card out at the time, and that's just the NVIDIA stuff!

Obviously the developer is looking at the return on their investment, and unfortunately for the PC gamer it's just cheaper to develop for the console and port to the PC, even cheaper if the target console is the 360, since it uses either XNA or DirectX. Ultimately it really comes down to a lack of creativity in the industry. What good are a great engine and awesome hardware if no one develops the entertaining games that take advantage of them?
 

Vermil

But Carmack isn't right. Now, he is specifically thinking about consoles, and what's topmost in his mind is the increasing lifespan of the consoles. And from that he pulls *reasons* out of the air. And goes wrong. Ultimately, of course, we will run into physical limitations, but please, we're nowhere near them yet. Which makes this sort of reasoning just as uninteresting today as it has been whenever it has been brought up in the past (I remember a fool who figured 100MHz was a limit and who couldn't imagine 10 million transistors on a chip). Hardware progress slowed down? No it damn well hasn't. Look at hard drives and graphics cards, for instance, and the tremendous increase in performance in recent years. And multi-cores and software multithreading are opening up the lanes for CPUs again. Next, the best value from more transistors will be to increase the processor's numeric performance, vectorized and parallelized, hence Intel Larrabee and AMD Fusion. There are immense gains available there, and there are immense software opportunities to utilize such performance.
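To make the vectorized-and-parallelized point concrete, here's a minimal sketch (hypothetical code, not from any real engine) of splitting one big numeric job across every core, which is the shape of work Larrabee/Fusion-class hardware is aimed at:

[code]
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Illustrative sketch only: sum a large array by giving each hardware
// thread its own non-overlapping slice, so no locking is needed.
int main() {
    const size_t n = 1 << 24;                  // ~16M elements
    std::vector<float> data(n, 1.0f);

    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(cores, 0.0);
    std::vector<std::thread> pool;

    for (unsigned t = 0; t < cores; ++t) {
        pool.emplace_back([&, t] {
            const size_t begin = n * t / cores, end = n * (t + 1) / cores;
            double sum = 0.0;
            for (size_t i = begin; i < end; ++i) // simple inner loop, easy
                sum += data[i];                  // for compilers to vectorize
            partial[t] = sum;
        });
    }
    for (auto& th : pool) th.join();

    double total = 0.0;
    for (double p : partial) total += p;
    printf("sum of %zu floats on %u threads: %.0f\n", n, cores, total);
    return 0;
}
[/code]

The same pattern (independent slices, vectorizable inner loops) is what both wide SIMD units and many-core parts reward.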

And all tech & economy thingy futures, points towards distributed computing rather than cloud computing. Oh, I don't doubt that some software companies will be able to charge more and earn more that way, but technical progress don't favor that road. It reached its pinnacle back in the days of the X-terminal. Since then technology have gone way far in the opposite direction, still does and will forever. FFS use your heads! Technology and production economies places immense computing and storage powers, cheaply into every mans hand. What do you do? Centralize all computing and storage? And use network bandwidth to display results? No dam it! You use bandwidth to distribute computing and storage! The push towards cloud computing is driven by other concerns: Taking away power and control from the user and charge more. Just a few years ago, someone else had the same idea. Remember the net-PC?
 

demonhorde665

"It’s great and there’s going to be at least another generation like that, although interestingly we are coasting towards some fundamental physical limits on things. We’ve already hit the megahertz wall and eventually there’s going to be a power density wall from which you won’t get more processing out there."


Heard it a million times.

The megahertz wall is a myth; we never "hit" a wall, companies just got more focused on this whole multicore thing and stopped trying to push clock speeds further. I doubt we'll hit a "processing wall" either, at least not in what we can physically do; perhaps only in the sense that consumers won't need the processor power. New discoveries are still to be made, and who knows what is around tomorrow's corner? I stopped buying the old "we are reaching limits" story a long time ago; next year they could find a whole new way to do things, and that could set off another technological revolution/evolution.

Consider this: computers were first born in the 40s, and by the 60s we had discovered how to fit the switching capacity of a room full of vacuum tubes onto a single "chip". Today we can fit a million times that on one of these chips, with multiple cores, so who is to say what we may or may not do in the future? Many people say it is impossible to go faster than light, but not two years ago scientists claimed tachyons exist and that they do indeed travel faster than light. So I have learned to take these "limit" predictions with a grain of salt, because if there is anything that humans as a race have undoubtedly proven, it is the simple fact that "if we can dream it, we can do it at some point".
 

hack__you

I think gaming via cloud computing will only be viable when most of us have at least a 5 Gbps internet connection, the same speed as USB 3.0. On average we're just at around 2 Mbps, I think... so maybe in 5 years?
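For a rough sanity check on those numbers, here's some back-of-envelope math (the resolution, frame rate, and compression ratio are assumptions, not measurements):

[code]
#include <cstdio>

// Back-of-envelope: bandwidth to stream a game's video output.
// All numbers here are illustrative assumptions.
int main() {
    const double width = 1920, height = 1080; // 1080p frame
    const double bits_per_pixel = 24;         // 8-bit RGB
    const double fps = 60;                    // target frame rate

    const double raw_bps = width * height * bits_per_pixel * fps;
    printf("Uncompressed 1080p60: %.2f Gbps\n", raw_bps / 1e9); // ~2.99 Gbps

    // Assume roughly 100:1 video compression (H.264-class, guessed ratio)
    const double compressed_bps = raw_bps / 100.0;
    printf("At ~100:1 compression: %.1f Mbps\n", compressed_bps / 1e6); // ~30 Mbps
    return 0;
}
[/code]

So on those assumptions, streaming the picture needs tens of Mbps rather than multi-Gbps links; the harder problem is the round-trip latency Carmack mentions.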
 

Vermil

@hack_you: Why would it be viable then? Why would it be viable to concentrate computing resources in one spot and distribute video over 5 Gbps? Things like that will ONLY ever be "viable" if there are monopolies driving them. MS and Intel would wish it, I'm sure. They don't want everybody to have virtually the leading edge of technology and virtually the same product and power; competition is forcing them towards that. But they'd rather sell something that is far, far below their top product to the mainstream customer, for the same or a higher price than they charge today, and then charge along a steep ladder for additional features and power from those who need it or are able to pay more. Intel would love selling the future "Core i7"s only to big computing centres, servers, mainframes and supercomputers for top dollar, and selling future "Atom"s to you for the same price they sell consumer Core i7s today. And MS wants to do the same with OSes. And they're going to sell this with all kinds of BS which, unfortunately, most of you are going to swallow just fine, as soon as they get rid of competition.
If you care at all about the future of computing, use Linux or at least Open Office for as much as possible, and use AMD CPUs (you don't really give up much; Athlon II and Phenom II are very OK and excellent value).
 

thearm

I agree, hack_you.

@vermil

[citation][nom]Vermil[/nom] use Linux or at least Open Office for as much as possible, and use AMD CPUs (you don't really give up much, Athlon II and Phenom II are very OK and excellent value).[/citation]

You think AMD wouldn't LOVE to be the only kid on the block? If you do, you know nothing about business. They all think the same way. Intel, MS and Nvidia are not the devil. They are successful and have a successful product that dominates. That's all. There was a time when AMD dominated, but now they don't have the product to compete. End of story. The ONLY reason people don't like said companies is because they are not the underdog. EVERYONE loves the underdog. Well, I don't. I go with whoever performs, and AMD can't perform... yet.

I'm not even going to get started on Linux, you fanboy. Dedication is a good thing, but not in this case.
 

annymmo

There are also some intriguing technologies in development, at the experimental stage.

Optical computers with optical processors: much faster (1000x? more?), less power hungry.

@thearm
Linux performs better than Windows in many areas.
 

Regulas

Come on John, you are reaching now, dude: cloud gaming? Go back to PCs and think of the kiddie consoles as secondary. If your upcoming Tech 5 engine cannot scale up and give my Q9650 & GTX 285 setup a workout, then you have not done your job and have been converted to the dark side, aka console junkie.
 

AdamB5000

Is it possible that software puts pressure on hardware and hardware puts pressure on software? Perhaps Crysis put enough pressure on hardware that the companies worked hard to push faster video computing. Maybe we need "another Crysis" to show us that our current hardware isn't good enough for our gaming software.
 

eccentric909

[citation][nom]amnotanoobie[/nom]I find it funny that he suggests cloud computing when consistently performing, low-latency, high-bandwidth, and cheap connections aren't really available to everyone.[/citation]

"He said that while latency is currently a "fundamental issue," online storage provides many "intriguing possibilities."

I don't get why people are bashing what he said about cloud computing, by saying "But latency sucks!", when he already said that is an issue holding the cloud computing concept back. It's not like he's saying the only way to go forward is with cloud computing, just that he sees the possiblities in that technology when the fundamental issue of latency isn't such a problem anymore.

At least, that is how I read that part of his statement. /shrug
 

eyemaster

I don't know if cloud computing is the way to go. Pretty soon we'll see silicon chips change the way they are made. Quantum computers and such are in the future, but we don't know how close. Laser interconnects instead of copper wires, and who knows what other tech is brewing in big corporations.

Silicon chips as we know them today will change, and the MHz race will change. Maybe even the way we program games will change. Instead of single-threaded games with a few multithreaded blocks, we could have X full threads running (sort of like two games running in parallel) but accessing the same memory blocks: say, one full continuous loop for the user interface (I know, overkill, but this is an example), another full loop for environments, another loop for drawing the screen, another loop for whatever else, instead of the current way of doing things, which is ONE total loop that branches out (or is it still that way?).
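Something like this toy sketch (made-up code, purely to illustrate the shape): two independent full loops on two threads sharing one piece of state, instead of a single loop that branches into update-then-draw:

[code]
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

// Toy illustration only: one thread advances the world, another
// "draws" whatever state is current, sharing a single atomic value.
std::atomic<int> world_state{0};
std::atomic<bool> running{true};

void simulation_loop() {            // full loop #1: advance the world
    while (running) {
        ++world_state;              // pretend physics/AI happened here
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

void render_loop() {                // full loop #2: draw current state
    while (running) {
        printf("drawing world state %d\n", world_state.load());
        std::this_thread::sleep_for(std::chrono::milliseconds(33));
    }
}

int main() {
    std::thread sim(simulation_loop), draw(render_loop);
    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    sim.join();
    draw.join();
    return 0;
}
[/code]

Real engines need more careful synchronization than one atomic, but that's the basic idea of several full loops instead of one.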
 

bounty

What is the max size power supply the average gamer is willing to live with? Remember the days when you could get a gaming laptop that didn't light you on fire when you used it on your lap? Or when you didn't need 140mm fans and water cooling? The power wall is real, folks.
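To put a rough number on it: first-order dynamic power in a CMOS chip goes roughly as P ≈ α·C·V²·f (switching activity × switched capacitance × supply voltage squared × clock frequency). Supply voltage can't drop much further without transistors switching unreliably, so past a point every extra MHz shows up almost directly as extra watts in the same die area.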

There was a time when you could set a new land speed record in a car every year; then it became every 2 years, then once every 5 years. Now it's maybe every 15 years. Unless there is a fundamental change (quantum computing, etc.), I expect PC performance to be similar.
 