Carmack: Gaming Hardware Reaching Limits

Status
Not open for further replies.

lancelot123

Distinguished
May 12, 2008
I try not to listen to this crazy old man. He makes fine games, but it seems like he doesn't think before he talks. It is always one crazy comment after another.
 
Guest
Didn't he say a little while back that there's more to be seen from the PS3 but that it's just "too difficult" to code for? I can't imagine that somebody out there won't finally get it right and build a PS3 optimized engine or SDK...but I guess stranger things have happened.

On the other hand, some of the best games I've played aren't on the cutting edge of technology. Photo-realism alone will not make a great game.
 

Dave_69

Distinguished
Mar 27, 2009
"...where gamers won't need to worry about the hardware limitations within the home, but the capabilities of the hardware mounted somewhere else."

And this is why LAN parties are so fun - the connections are so much quicker! (I'm looking at you, Blizzard!)
 

fuser

Distinguished
Aug 4, 2008
"He moves on into the prospect of cloud computing, where gamers won't need to worry about the hardware limitations within the home, but the capabilities of the hardware mounted somewhere else"

Please! Cloud computing for games? I have enough trouble streaming low res porn over my crappy AT&T DSL line. I think I'll stick with games that run on MY video card. Thanks.
 

nukemaster

Distinguished
Moderator
I think Carmack just likes to hear himself talk nowadays.
id Software has almost always had great game engines, well optimized and all that. Keep doing that. The Quake III engine was so good to play and work with (modder friendly).

As far as cloud computing goes, I would really worry about latency, since even to good game servers I get 40-60ms (17ms direct to a friend 4 hours away). So they would need servers in almost every large city for this to work.
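
To put rough numbers on that latency worry, here's a quick Python sketch (the encode/decode cost and the 100ms input-to-photon tolerance are illustrative assumptions, not measurements):

    # Rough cloud-gaming latency budget (illustrative assumptions only).
    # At 60fps a new frame is due every ~16.7ms; the cloud service must
    # render, encode, transmit, and decode within the player's tolerance.
    FRAME_TIME_MS = 1000 / 60         # ~16.7ms per frame at 60fps
    ENCODE_DECODE_MS = 10             # assumed video encode + decode cost
    TOLERANCE_MS = 100                # assumed input-to-photon tolerance

    for rtt_ms in (17, 40, 60, 100):  # RTTs from the post, plus a worse case
        total_ms = rtt_ms + ENCODE_DECODE_MS + FRAME_TIME_MS
        verdict = "OK" if total_ms <= TOLERANCE_MS else "too laggy"
        print(f"RTT {rtt_ms:3d}ms -> ~{total_ms:5.1f}ms input-to-photon ({verdict})")

Under those assumed numbers, a 17-60ms RTT squeaks in, while 100ms blows the budget, which is why server proximity matters so much.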
 

chripuck

Distinguished
Oct 30, 2008
I find it hilarious that there hasn't been any decent jump in video card technology in the past 2 years, yet my $180 GTX 260 can play any game at 1680x1050 with most if not all eye candy on, a resolution that was unheard of just a few years ago.

I'll believe that hardware power is becoming an issue when gaming actually pushes my card.
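
For scale, here is the pixel-count arithmetic behind that resolution claim (the comparison resolutions are my own picks, in Python):

    # Pixels per frame: 1680x1050 vs. common resolutions from a few years back.
    resolutions = {
        "1024x768":  1024 * 768,    # typical resolution a few years earlier
        "1280x1024": 1280 * 1024,
        "1680x1050": 1680 * 1050,   # the resolution quoted in the post
    }
    base = resolutions["1024x768"]
    for name, pixels in resolutions.items():
        print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x a 1024x768 frame)")

So 1680x1050 pushes roughly 2.2x the pixels of a 1024x768 frame.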
 
Guest
I think Carmack has been pretty accurate when talking about hardware and where it's heading. He's one of the leading, if not the leading, engineers in game engine design, and you have to have a high-level understanding of hardware to create engines.
 
Guest
You know what this proves:

PC > consoles, yet again. lol hehe :) BTW, how long have these consoles been going? Almost 4 years? In tech terms, if I'm right in saying so, that is actually pretty darn old. GFX cards from 4 years ago are totally demolished by the power of today's, so it is actually amazing that console games look as good as they do (I know, optimising and all that, but still, give the consoles a little leeway, right?).
 

Kami3k

Distinguished
Jan 17, 2008
Hahaha, this fool again? Gee, it seems that when people are way past their prime they say the dumbest things.

I don't even have to say why what he says is insane.
 

Kaiser_25

Distinguished
Jul 9, 2009
I work in the semiconductor industry, and he is right: we are approaching a brick wall. Unless there is a fundamental breakthrough, we are going to see a MAJOR slowdown in tech expansion.
 

JAYDEEJOHN

Distinguished
Moderator
Aug 23, 2006
The guy's being an idiot. Unlike CPUs, node shrinks bring GPUs more available power and higher speeds. We haven't maxed out core speeds on GPUs yet, and as long as there are node shrinks, there's more to add.
This is parallel computing, where having more units actually helps, unlike CPUs, which are more serial and rely too heavily on software.
Get a grip, John C.
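
That serial-vs-parallel point is basically Amdahl's law. A minimal Python sketch (the 99% and 50% parallel fractions are made-up illustrations, not real workload numbers):

    # Amdahl's law: speedup from N parallel units is capped by the
    # fraction of the work that stays serial.
    def amdahl_speedup(parallel_fraction, n_units):
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / n_units)

    # A GPU-style workload (nearly all parallel) keeps scaling as shaders
    # are added; a CPU-style workload (half serial) stalls almost at once.
    for n in (2, 8, 64, 512):
        gpu_like = amdahl_speedup(0.99, n)  # assumed 99% parallel
        cpu_like = amdahl_speedup(0.50, n)  # assumed 50% parallel
        print(f"{n:4d} units: GPU-like {gpu_like:6.1f}x, CPU-like {cpu_like:4.2f}x")

With those assumed fractions, 512 units buy a GPU-like workload roughly an 84x speedup but a half-serial workload barely 2x, which is why adding shaders keeps paying off.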
 

stevo777

Distinguished
Jan 8, 2008
I guess the guy's never heard of graphene and metamaterials. Graphene-based processors are coming in a few years, as well as all kinds of new stuff. I'm getting this info from EE Times, Popular Mechanics, etc., which have been around for decades.
 

JAYDEEJOHN

Distinguished
Moderator
Aug 23, 2006
Well, we went from what, 650MHz to 1GHz last gen. That's the largest jump seen in a while. And having more die space just means more shaders, etc., which makes the GPU work faster.
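
Taking those figures at face value, the jump works out like this (trivial arithmetic on the 650MHz and 1GHz numbers quoted above):

    # Relative clock jump from the figures in the post above.
    old_mhz, new_mhz = 650, 1000
    ratio = new_mhz / old_mhz
    print(f"{ratio:.2f}x, i.e. roughly a {(ratio - 1) * 100:.0f}% clock increase")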
 

doomtomb

Distinguished
May 12, 2009
I think hardware advancements have slowed down, but they're not done progressing. There will always be more advancements. One thought on cloud computing: what if my internet is down? Now my console is dead too.
 
Guest
I've had SLI 8800GTs and an E4600 overclocked to 3.0GHz for over 2 years now. When Crysis first came out it was taxing at 1680x1050, but completely playable. I just popped it in again yesterday, and it runs WAY better with all the driver improvements since release. I still can't find a game that is REALLY taxing on my (probably below average now) gaming system.

I used to buy graphics cards and in 6 months be wondering why Half-Life was struggling so much on my 3dfx Voodoo 1 card. Now it's been 2 years and I still don't see the need to upgrade, other than maybe an SSD in the near future. I blame consoles; they've dumbed down the graphics and slowed graphical improvements big time, now that everything goes console first, then gets ported to PC. The upside is that maybe game developers will stop trying to make everything so shiny and pretty, and concentrate on what counts: is the game fun, and does it actually ENTERTAIN?
 

gto127

Distinguished
Jan 8, 2008
Carmack is a genius, but I believe he's wrong on this one. If the video hardware makers didn't find a way to keep making faster hardware, they would lose money. They will find a way to keep going faster while lowering power, whether it be diamond semiconductors or quantum mechanics. They will find a way. $$$$
 

Shadow703793

Distinguished
Feb 3, 2007
chripuck said:
"I find it hilarious that there hasn't been any decent jump in video card technology in the past 2 years, yet my $180 GTX 260 can play any game at 1680x1050 with most if not all eye candy on, a resolution that was unheard of just a few years ago. I'll believe that hardware power is becoming an issue when gaming actually pushes my card."

1. Heat is the limiting factor.
2. We have yet to see more games like Crysis (in terms of GPU demand).
3. Most PC games in the future will be ports from consoles, where the CPU will become a bigger factor than the GPU (GTA IV, anyone?).
 