NVIDIA And AMD Vs. Intel: A New Graphics War Emerges That Should Have Been Avoided

G

Guest

Guest
I have to disagree. The current gaming market is getting stale and falling into the same category of "relatively boring products and customer disinterest," because the gaming community is very narrow, and if you look at the latest graphics card releases you will notice the trend: Yay... faster GPU... faster memory... 128 stream processors...
Intel is offering a truly new way to bring gaming, physics, GPGPU, multicore gaming, and more together in a whole new way. I am personally bored of the last, current, and next generations of gaming. Intel's idea could push the gaming world forward in graphics detail, speed, and effects beyond what the worthless DX10(.1) has to offer. NVIDIA and ATI need this shake-up to reinvigorate the aging technology.
As with general computing, gaming should be moving to threaded applications and games instead of just faster GPUs. If you take a current multi-threaded application, run it on a single-core CPU, and then move it to a quad core or more, the speed of the application multiplies.
I think if NVIDIA or ATI had a true multicore GPU (not the generic X2 or SLI/CrossFire platforms), as Intel is planning, and did the threading at that level instead of the CPU level, performance would go up 2x or more. Throw physics processing on one GPU core, sound on another (processed before the sound card, of course), rendering on another, vectors, and so on.
In the future there will be no distinction between a CPU and a GPU; a single "processor" will perform all of these functions. Parallel processing is the way of the future.
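To make the threading point concrete, here is a minimal sketch (plain C++ with std::thread; the array-sum workload is made up purely for illustration) of how a data-parallel task can be split across however many cores the machine has:

```cpp
// Minimal sketch: split a large array sum across N worker threads.
// The workload and sizes are illustrative only.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t kElements = 50000000;   // ~50M doubles, made-up size
    std::vector<double> data(kElements, 1.0);

    const unsigned threads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(threads, 0.0);
    std::vector<std::thread> pool;

    const std::size_t chunk = kElements / threads;
    for (unsigned t = 0; t < threads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t + 1 == threads) ? kElements : begin + chunk;
        // Each core sums its own slice; no shared writes, so no locks are needed.
        pool.emplace_back([&, t, begin, end] {
            partial[t] = std::accumulate(data.begin() + begin, data.begin() + end, 0.0);
        });
    }
    for (auto& th : pool) th.join();

    const double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "sum = " << total << " using " << threads << " threads\n";
}
```

On a quad-core machine the per-slice sums run concurrently, which is the "multiply the speed" effect described above, within the usual limits of memory bandwidth and whatever parts of the program stay serial.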
 

greenmachineiijh

Distinguished
Oct 18, 2005
13
0
18,560
I have to disagree. The current gaming market is getting stale and falling into the same category of "relatively boring products and customer disinterest," because the gaming community is very narrow, and if you look at the latest graphics card releases you will notice the trend: Yay... faster GPU... faster memory... 128 stream processors...
Intel is offering a truly new way to bring gaming, physics, GPGPU, multicore gaming, and more together in a whole new way. I am personally bored of the last, current, and next generations of gaming. Intel's idea could push the gaming world forward in graphics detail, speed, and effects beyond what the worthless DX10(.1) has to offer. NVIDIA and ATI need this shake-up to reinvigorate the aging technology.
As with general computing, gaming should be moving to threaded applications and games instead of just faster GPUs. If you take a current multi-threaded application, run it on a single-core CPU, and then move it to a quad core or more, the speed of the application multiplies.
I think if NVIDIA or ATI had a true multicore GPU (not the generic X2 or SLI/CrossFire platforms), as Intel is planning, and did the threading at that level instead of the CPU level, performance would go up 2x or more. Throw physics processing on one GPU core, sound on another (processed before the sound card, of course), rendering on another, vectors, and so on.
In the future there will be no distinction between a CPU and a GPU; a single "processor" will perform all of these functions. Parallel processing is the way of the future.
 
G

Guest

Guest
Also, take an economics course if you are going to write about economics and speculate on a market. This is good for everyone.
 
G

Guest

Guest
Competition will drive prices down and produce better products at the same time. This is how we win.
 

Slobogob

Distinguished
Aug 10, 2006
8
0
18,510
@JayDog/greenmachineiijh
GPUs are already highly parallel - more so than any CPU available now or coming in the next few years.
 
G

Guest

Guest
Wow...

First off, with Nvidia purchasing Ageia, that's exactly what's going on, JayDog. Take a look at what's currently happening in the market before you make a comment. These X2 and CrossFire/SLI solutions are just the first step toward creating a true multi-core GPU. Although, I'll be honest, the progression is slow, especially for AMD.

Secondly, these "wars" are a good thing, but only when they're not in a stalemate. Having a proper "war" between these graphics giants means they are constantly trying to outdo each other. Take a look at AMD vs. Intel a few years ago (yeah, I know it's not GPUs, but the point is the same): AMD tried something completely different and succeeded (Athlon), then Intel did something different again and had a big success of its own (Core 2 Duo). The problem is that AMD is suffering too much to push the envelope. With this calm, Intel and Nvidia can just keep doing the same old thing (come on, what's the real difference between the 8800 and the 9600?). This comfort that Intel and Nvidia have right now is what's halting progress.

Although, having Intel step in could be both a good thing and a bad thing. The good is that Intel has the funds to do just about anything; with another push to a new (hopefully better) technology, AMD and Nvidia should respond in kind. The bad news is that AMD may not be able to keep up. Even if they get through the merger (or acquisition or whatever it is), if Intel's solution is a big hit, AMD could just fold. I don't see that much financial pressure on Nvidia; they have a much larger bankroll than AMD. As an AMD (ATI) fanboy (I like the CCC interface and hardware concepts much more on the AMD side of the fence), I don't want to see AMD go bankrupt.

There is the chance, though, that Intel will not target AMD at all. If AMD falls, Intel will be the only real desktop processor manufacturer, and that would cause big problems for the company, something I'm sure they are aware of. (I don't count Nvidia as a processor manufacturer, because my understanding is that Nvidia's step into the processor market is just another way to off-load a few calculations from the CPU onto the graphics card. I could be wrong on that, though.) Having AMD out of the picture could trigger a monopoly investigation in the processor market, and no business wants to go through an ordeal like that.
 

mbulut

Distinguished
Apr 19, 2008
2
0
18,510
Using general purpose processors, it is difficult to compete with dedicated ones if performance is the number one parameter. Integrating dedicated chips (sound, graphics, physics, ...) into a single chip sounds like the best way to go. The human brain is a good example of such an integrated structure.
 
G

Guest

Guest
In my opinion, the war will benefit customers, because it isn't only a market competition but also a technology competition. Microsoft beat Apple mostly through the market, not through technology, but video cards are not the same as an OS: they depend entirely on their technology, the look of the games, and the 3DMark results they deliver. So if Intel does better than AMD and NVIDIA, I think AMD and NVIDIA will do better in the next season, or week, or day, hour, minute, second...
both in their prices and their technology.
Thanks in advance; I am not good at written English.
 
G

Guest

Guest
Competition is indeed good for consumers. However, the markets we are discussing are oligopolies with a tendency toward monopoly. For economists this is far from perfect competition. The highly specialized nature of these markets creates these oligopolies, which aren't known for great competitiveness; usually collusion is more likely. In that sense the current competition isn't really a big "force". Despite this, it could, and probably would, be pretty bad if Intel managed to capture a large share of both markets (read: CPU and GPU).

The extent to which such a (semi-)monopoly is bad for consumers depends on the driving forces in these markets. If software developers push hardware requirements, then hardware manufacturers have to keep up with demand; if they don't, a competitor will enter the market and supply the goods (CPUs/GPUs). At the very least, the presence of such a threat reduces the exploitability of the market. If software developers instead follow hardware developments, i.e. use whatever hardware is available, then a hardware monopoly is much more exploitable: manufacturers could "milk" certain technologies endlessly.

In any event, I think the demand for better hardware is very limited, which hampers progress. The number of gamers who really demand better graphics and act on it (e.g. buy high-end graphics cards/CPUs) is limited. Only some of the best-selling games have high-end graphics, and it is questionable to what extent those graphics influence sales. I think it boils down to gameplay, i.e. software development is more important. Consoles have okay but not great graphics, and they capture a large part of the gaming market. With respect to the PC, two major titles come to mind: WoW and COD4, both big sellers and both with "ancient" graphics (please don't start on COD4's graphics; they are worse than HL2's, period). In any case it will be interesting to see how matters unfold.
 

korsen

Distinguished
Jul 20, 2006
40
0
18,580
Hellooooooooooooooooooooo. Programming, anyone? Games need to be coded for a very broad hardware spectrum. Drivers need to be polished for every game, and for every update to every game. There's a lot of work going on there.

Of course, it would be the easiest thing on the planet to throw on 256 ROPs, 1,024 unified shaders, etc., and just crank everything up 10x. But who wants to do that? It doesn't make money, and it forces you to keep innovating. Nvidia and AMD will stick to incremental increases to milk the sector for what it's worth.

Screw that. If I owned one of those companies I'd throw my nuts at the wall and just create this massive card that obliterates my competition, then sit back and watch while it takes them 5-10 years to design, code, and push out something that comes close, because they're all running around like headless chickens.

I think AMD is never getting the CPU crown back, ever, and unless ATI finds out where it left its head when it merged with AMD, they're never getting the graphics crown back either. Statistically, ATI should be crushing Nvidia, but they're lagging 30% behind despite having 30% more hardware. No idea why they didn't take advantage when Nvidia screwed up their drivers for such a long time.
 

campdude

Distinguished
Apr 20, 2008
3
0
18,510
I'm not buying a video card until that Larrabee video card comes out with benchmarks and hits the shelves. I don't think I am the only consumer holding onto their money, waiting to see how this Intel thing plays out.
If anything, AMD and Nvidia may see lower sales up until the release of Larrabee (depending on the success Larrabee has).
Consumers will wait to have the luxury of choosing among three video card companies.
Rob Enderle might change his opinion if Larrabee turns out to be the best card on the market, and even buy one himself.
 

animehair

Distinguished
Apr 16, 2008
10
0
18,560
I agree with some of the posts saying this will only be a very good thing for consumers in the end. I think we will see some much-needed technological advances due to this new competition. Personally, the whole SLI/CrossFire approach went over my head a bit. Expecting people to buy two graphics cards always seemed silly to me. It really turns me off computer gaming... I would much rather buy a PS3 or Xbox 360 console, because I feel as though my wallet is being taken advantage of with computer gaming.

However, I do like that Nvidia is coming out with dual-GPU cards, and I think that dual/quad-core GPUs make more sense than SLI/CrossFire setups.
 
G

Guest

Guest
@campdude:

I doubt Larrabee will be much more than a better-performing version of current IGPs like AMD's 780G. At this point it doesn't sound feasible to actually make it that powerful. Think about it: is it easy to jam an 8800 GTX (just an example) into a CPU? I think not.
 
G

Guest

Guest
Hmmmm... I can't help but think that greenmachine and JayDog are paid Intel shills; it's not possible to double post by accident with different screen names, LOL. They probably both copied and pasted the same prepared post two minutes apart.

Besides, that comment about "multicore GPUs" is way off the mark; that whole 320-stream-processor design (you know, parallel processing: "the future") is already the same idea as multicore, just done much better.

 

holmebeef

Distinguished
Apr 20, 2008
1
0
18,510
Boo Hoo ... we don't want a war ... sniffle, sniffle ...

This isn't the US in Iraq, Rob, and people aren't going to die. War (or competition, to be less melodramatic) drives innovation, better prices, and better customer service. This is the best thing to happen to PC graphics in a long time. Perhaps we can finally solve the heat dissipation and huge energy consumption that graphics cards are currently saddled with.
 
G

Guest

Guest
A few points:
The driving forces behind faster/better graphics computing are always going to be a mix of software and hardware engineering. Software developers are always going to need better ways to get the results they are looking for in their games: faster shading, texturing, anti-aliasing, etc. But these problems can be solved in many different ways that are completely invisible to the programmer; they are much more hardware problems, solved by nVidia/ATI. The change Intel is bringing to the table is a reaction to the software's needs, but it brings new ideas and new ways to solve the same problems, because Intel has different ideas about the hardware. When the limitations of current hardware start to show, it's always good to explore other hardware approaches, even if that means you rock the programmer's world (64-bit computing, SSE, multi-core, even GPUs weren't standard back in the day).
Plus, Intel's Larrabee is "supposed" to introduce more than just graphics power. Any computation that would benefit from stream processing could take advantage of this new hardware. Even if Intel doesn't break into the graphics market directly, they will have plenty of other avenues to explore and gain momentum. I like hardware that does more rather than less, and if I can get my graphics card to help out in my other applications, I'd be all for that.
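To give a concrete (and purely hypothetical) picture of what such a stream-processing workload looks like, here is a minimal SAXPY-style sketch in plain C++: the same independent operation applied across a whole array, which is the pattern that GPUs and Larrabee-like designs are built to accelerate.

```cpp
// Stream-style kernel sketch: y[i] = a * x[i] + y[i] for every element.
// Each element is independent, so the loop maps naturally onto many
// small cores or wide vector units; the sizes here are illustrative only.
#include <cstddef>
#include <vector>

void saxpy(float a, const std::vector<float>& x, std::vector<float>& y) {
    for (std::size_t i = 0; i < x.size() && i < y.size(); ++i) {
        y[i] = a * x[i] + y[i];   // no dependence between iterations
    }
}

int main() {
    std::vector<float> x(1 << 20, 1.0f), y(1 << 20, 2.0f);
    saxpy(3.0f, x, y);            // every element could be processed in parallel
}
```

Graphics, physics, and many media workloads share this element-independent shape, which is why hardware built for it can be useful well beyond rendering.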
 