I wasn’t planning on building a new gaming PC but the RTX 5080 just changed my mind

I'm kind of in a similar boat.

- RTX 3080 12GB
- 13700K
- 64GB DDR5
- ASRock Z790

Fortunately all I need to upgrade is the GPU, so I've been eyeing the 5080 too. Just gonna wait to see the real-world benchmarks and reviews. Hopefully it lives up to the hype.

The Zotac 3080 12GB has served me admirably for several years but has shown its age in more recent, graphically demanding titles, especially with things like ray tracing enabled. Not having frame generation tech sucks.
 
I heard TSMC is trying to screw the world over right now on 2nm wafer prices, and that Intel and Samsung are going to come in to compete, with Intel going directly to 1.8nm with Panther Lake. Might be better to wait for those 2nm and 1.8nm chips, and hopefully PCIe 6.

Yeah, I think $1K extra for the 5090 is ridiculous for a pitiful extra 16GB of VRAM for AI. I've run Llama 70B models on just CPU and system memory because nothing that size fits on these cards anyway. If you're heavy into video editing I'd say just go for it; you should get a huge speedup on gaming and on converting your movies to AV1.
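The "nothing fits on these cards" point is easy to sanity-check with back-of-the-envelope math: weight storage is roughly parameter count times bytes per weight. This is a rough sketch, not a measured figure; FP16 = 2 bytes and 4-bit quantization = 0.5 bytes are common conventions, and real runtime use adds overhead for activations and KV cache on top.

```python
# Rough VRAM footprint of an LLM's weights alone (ignores activations/KV cache).

def model_gb(params_billion: float, bytes_per_weight: float) -> float:
    """Approximate weight storage in GB (1 GB = 1e9 bytes)."""
    return params_billion * 1e9 * bytes_per_weight / 1e9

fp16 = model_gb(70, 2.0)  # 140 GB -- far beyond any consumer card
q4 = model_gb(70, 0.5)    # 35 GB -- still over a 5090's 32 GB

print(f"70B @ FP16: {fp16:.0f} GB, @ 4-bit: {q4:.0f} GB")
```

Even aggressively quantized, a 70B model overflows a 32GB card, which is why CPU plus system RAM ends up being the fallback.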
 
Just out of curiosity, when does the return on investment turn sour on these new GPUs?
When the 2nm wafers start production. I think Apple and a few others told TSMC to go to hell this year on their prices. End of 2025 or 2026 looks like when they'll hit, so I wouldn't be surprised by a 6090 next year on a 2nm process.

Then again, they may wait for PCIe 6 to become mainstream before releasing it, which would make the 5090 the best for many years to come. Panther Lake is out, but only to select people, and Intel is finding overheating issues with PCIe 6 at the moment, so we may not see new CPUs hit the market until 2026, with 2027 as a mass-adoption period once motherboards are out for it.

You can blame TSMC for holding the world back this year; sad, really. Karma is going to bite them for their greed, I suspect. They could have lowered their wafer prices and made tons of money; now everyone will ditch them and get Intel, Samsung, and some Japanese company to make their chips once those processes mature, by the end of the year I suspect.
 
My concern is the 16GB on the 5080. Is that going to be enough for the next 4-5 years? Is it even enough now for some games?
I'll need to rebuild mine as well, as it's running a 5900X, so it will bottleneck on the new cards.
 
"it’s still quite capable of handling most games so long as I properly tweak the settings"

Uhh, no, your computer is quite capable of playing any game for quite some time to come. Calling a computer ancient because it can't run every game on ultra is silly. There's also little reason to upgrade an entire PC that's three years old instead of just swapping the graphics card.
 
My concern is the 16GB on the 5080. Is that going to be enough for the next 4-5 years? Is it even enough now for some games?
I'll need to rebuild mine as well, as it's running a 5900X, so it will bottleneck on the new cards.
You'd need to upgrade from PCIe 4 to 5 to use the new cards. I have a 5950X as well; my only option on this old PC is a 4090.

I ordered 128GB of ECC memory for it last week, so I'll repurpose it in the basement as a NAS when I'm ready to upgrade.
 
When the 2nm wafers start production. I think Apple and a few others told TSMC to go to hell this year on their prices. End of 2025 or 2026 looks like when they'll hit, so I wouldn't be surprised by a 6090 next year on a 2nm process.

Then again, they may wait for PCIe 6 to become mainstream before releasing it, which would make the 5090 the best for many years to come. Panther Lake is out, but only to select people, and Intel is finding overheating issues with PCIe 6 at the moment, so we may not see new CPUs hit the market until 2026, with 2027 as a mass-adoption period once motherboards are out for it.

You can blame TSMC for holding the world back this year; sad, really. Karma is going to bite them for their greed, I suspect. They could have lowered their wafer prices and made tons of money; now everyone will ditch them and get Intel, Samsung, and some Japanese company to make their chips once those processes mature, by the end of the year I suspect.
I think my question was when do these new GPUs become too expensive to be worth the cost? It seems to me that newer, faster, more powerful hardware gives developers license to be sloppy with their code. I've also read that players are starting to move away from resource-intensive games with fine details in favor of less intensive gameplay releases that are easier to play within a wider community. I'm not a gamer, so I don't know. Duke Nukem, Quake, Doom, and Tetris are about my speed. I'm just curious. It seems at some point the expense doesn't provide much reward. And where does that place console gaming systems in the mix?

Thanks...
 
Fake frames are not a compelling reason. Lol, ancient... my gaming PC is from 2017. A 3080 Ti can already play at 60fps at 4K. That's not enough now? I've never dropped 4 grand on a PC.
 
Sitting here with a 3080 that I was somehow able to snag early on for MSRP, and now 4+ years in I'm also thinking about an upgrade this generation. I don't ever see myself buying a top-end card like the 5090, but even with the 5080 I'll need to wait and see what the performance/price ratio looks like. The 3080 was $699 MSRP for a GPU on the same 628mm² GA102 die as the 3090, i.e. good bang for the buck. Now with the 5080 we are at $999 for the xx80-series card on the smaller 377mm² GB203 die vs. the 5090's 744mm² GB202.

At this point the 5070 Ti might be the better move for me. It's closer to a 5080 than a 5070 (same GB203 die, same amount of memory, 256-bit bus width, etc.). The main thing is that it's $250 cheaper, so averaged out over time, if I were to keep a 5080 for 4 years (~$250/year), the 5070 Ti could be upgraded after only 3 years at the same ~$250/year. Of course you could make the same argument for a 5070 at $550 MSRP, but I use a 4K TV as my monitor and I'd be looking for a bigger performance uplift to stay at that resolution in newer titles; otherwise I might as well just stay with the 3080.
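The upgrade-cadence argument above amortizes to roughly the same yearly cost either way. A quick sketch, using the MSRPs quoted in the thread ($999 for the 5080, $999 − $250 = $749 for the 5070 Ti) and the poster's hypothetical keep durations:

```python
# Amortized cost-per-year for two upgrade cadences; prices and years
# are the thread's figures, not a recommendation.

def cost_per_year(msrp: float, years_kept: float) -> float:
    return msrp / years_kept

gpu_5080 = cost_per_year(999, 4)    # keep the 5080 for 4 years
gpu_5070ti = cost_per_year(749, 3)  # upgrade the 5070 Ti after 3 years

print(f"5080: ${gpu_5080:.0f}/yr, 5070 Ti: ${gpu_5070ti:.0f}/yr")
```

Both work out to about $250/year; the 5070 Ti cadence just means fresher hardware more often at the same spend.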

I'm also going to keep an eye on AMD and the 9070 XT. Not anything I'm going to preorder, but if it can outperform the regular 5070 I would consider going that way too, depending on where the street price ends up vs. Nvidia.
 
Incorrect; PCIe is forwards and backwards compatible. A PCIe 5.0 GPU will work in a 4.0 slot; it will just be limited to PCIe 4.0 bandwidth.
Happy to hear it; there's no way a 5090 would saturate PCIe 4.0 when older cards couldn't even saturate a PCIe 3.0 slot.
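For reference, the ceilings being discussed can be derived from the per-generation transfer rates. A rough sketch, assuming an x16 link and the 128b/130b line encoding that PCIe 3.0 through 5.0 use (real-world throughput is a bit lower still due to protocol overhead):

```python
# Approximate usable bandwidth of a PCIe x16 link per generation.
# Gens 3/4/5 run at 8/16/32 GT/s per lane with 128b/130b encoding.

def x16_bandwidth_gbs(gt_per_s: float, lanes: int = 16) -> float:
    """Usable GB/s: GT/s * (128/130 payload fraction) / 8 bits-per-byte * lanes."""
    return gt_per_s * (128 / 130) / 8 * lanes

for gen, rate in [(3, 8), (4, 16), (5, 32)]:
    print(f"PCIe {gen}.0 x16: ~{x16_bandwidth_gbs(rate):.1f} GB/s")
```

That puts PCIe 4.0 x16 at roughly 31.5 GB/s, twice the 3.0 figure, so a card that never saturated a 3.0 slot has even more headroom on 4.0.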
 
I think my question was when do these new GPUs become too expensive to be worth the cost? It seems to me that newer, faster, more powerful hardware gives developers license to be sloppy with their code. I've also read that players are starting to move away from resource-intensive games with fine details in favor of less intensive gameplay releases that are easier to play within a wider community. I'm not a gamer, so I don't know. Duke Nukem, Quake, Doom, and Tetris are about my speed. I'm just curious. It seems at some point the expense doesn't provide much reward. And where does that place console gaming systems in the mix?

Thanks...
Honestly, they are too expensive; the 5090 can't even saturate a PCIe 4.0 slot. Why would anyone spend that much money when they could buy a cheap used car for the same price? Not many would. Only people into video transcoding, people wanting as much VRAM as possible to run DeepSeek AI models, or people looking for bragging rights on the most frames per second in gaming would have them :)

The problem is that Nvidia needs competition; this is what happens when a company monopolizes the market. Low supply plus high demand equals high prices.

Best to wait for older models to come down in price on eBay.
 