If you’re playing PC games and not using supersampling, you’re gaming wrong

Aug 18, 2024
I understand that, for some reason, "SuperSampling" has lost its original meaning; however, DLSS, FSR, and XeSS are not supersampling as you've described.

These technologies are upscalers: they render the game at a resolution lower than your monitor's native and then upscale the image using various methods. The result is a drastic improvement in FPS.

True supersampling, on the other hand, renders the game at a higher resolution than your monitor can display and then downscales back to your target resolution. This will eat your performance for breakfast, especially if you are already struggling to reach 60 FPS.
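To put rough numbers on the difference (a back-of-the-envelope sketch of my own; the scale factors are illustrative, with 2/3 per axis standing in for a typical "Quality" upscaling preset):

```python
# Rough per-frame shading workload: upscaling vs. true supersampling
# on a 2560x1440 display. Scale factors are per axis.

NATIVE = (2560, 1440)

def render_pixels(scale):
    """Pixels actually shaded per frame at the given per-axis scale."""
    w, h = NATIVE
    return int(w * scale) * int(h * scale)

native = render_pixels(1.0)        # full native resolution
upscaled = render_pixels(2 / 3)    # e.g. a "Quality" upscaling preset
supersampled = render_pixels(2.0)  # 2x2 supersampling (4x the pixels)

print(f"upscaled:     {upscaled / native:.2f}x native workload")      # 0.44x
print(f"supersampled: {supersampled / native:.2f}x native workload")  # 4.00x
```

Hence the FPS gain in one direction and the "performance for breakfast" in the other.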
 

jesster10

Honorable
Jun 13, 2018
I understand that, for some reason, "SuperSampling" has lost its original meaning; however, DLSS, FSR, and XeSS are not supersampling as you've described.

These technologies are upscalers: they render the game at a resolution lower than your monitor's native and then upscale the image using various methods. The result is a drastic improvement in FPS.

True supersampling, on the other hand, renders the game at a higher resolution than your monitor can display and then downscales back to your target resolution. This will eat your performance for breakfast, especially if you are already struggling to reach 60 FPS.
100%. Upscaler is the much more accurate term. But we know how much PC marketing teams care about technical accuracy and informative naming.
 
Aug 18, 2024
100%. Upscaler is the much more accurate term. But we know how much PC marketing teams care about technical accuracy and informative naming.
Yeah, I definitely agree on that point, especially because the technology that popularized the upscaling idea confusingly has "Super Sampling" in its initialism. However, the author of the article is using the traditional definition of the word and applying it to the upscalers.
 

jesster10

Honorable
Jun 13, 2018
I love this article because I feel the exact same. DLSS isn't perfect, but let's all go back 10 years and look at our antialiasing options then...

Supersampling, which produced flawless results, but for a cost higher than path tracing these days.
Multisampling was the tried-and-true method, but it missed all transparencies and still had crawling edges and specular flickering in motion. And depending on the engine, it was very expensive to enable.
Post-processing filters like FXAA and SMAA were a great cheap alternative that cleaned up the whole image, but still suffered from blurring and/or crawling edges and specular flickering. (My favorite solution was turning on one option in-game and injecting the other one to clean up the image further, then sharpening the end result.)
The first time I saw TXAA, I knew it was the future - it fixed almost all the annoying motion-based aliasing that other solutions couldn't touch. At the cost of MSAA plus a little too much blur...
Other game engines integrated increasingly effective temporal antialiasing options and I was amazed at the results - games were really starting to look like offline CG.
I suppose Nvidia took what they learned with TXAA and eventually turned it into DLSS.
And of course AMD and Intel have followed suit with slightly less capable options.

In conclusion, even FSR is an amazing option compared to the antialiasing options of the past, and you GAIN performance by enabling it. DLSS is basically magic.
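A very rough way to see why those options sat where they did on the cost spectrum (my own illustrative numbers, not benchmarks; real costs vary a lot by engine):

```python
# Toy cost model for the anti-aliasing options above.
# "shading" = approximate pixel-shading work relative to no AA;
# "samples" = color/coverage samples stored per pixel.
# Illustrative numbers only, not measurements.

aa_options = {
    # technique:  (shading, samples)
    "no AA":      (1.0, 1),
    "4x SSAA":    (4.0, 4),  # shades every sample: flawless but brutal
    "4x MSAA":    (1.0, 4),  # shades once per pixel, multisamples edges
    "FXAA/SMAA":  (1.1, 1),  # cheap post-process pass over the image
    "TAA":        (1.1, 1),  # reuses samples from previous frames
}

for name, (shading, samples) in aa_options.items():
    print(f"{name:9s} ~{shading:.1f}x shading, {samples} sample(s)/px")
```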
 
Aug 18, 2024
I love this article because I feel the exact same. DLSS isn't perfect, but let's all go back 10 years and look at our antialiasing options then...

Supersampling, which produced flawless results, but for a cost higher than path tracing these days.
Multisampling was the tried-and-true method, but it missed all transparencies and still had crawling edges and specular flickering in motion. And depending on the engine, it was very expensive to enable.
Post-processing filters like FXAA and SMAA were a great cheap alternative that cleaned up the whole image, but still suffered from blurring and/or crawling edges and specular flickering. (My favorite solution was turning on one option in-game and injecting the other one to clean up the image further, then sharpening the end result.)
The first time I saw TXAA, I knew it was the future - it fixed almost all the annoying motion-based aliasing that other solutions couldn't touch. At the cost of MSAA plus a little too much blur...
Other game engines integrated increasingly effective temporal antialiasing options and I was amazed at the results - games were really starting to look like offline CG.
I suppose Nvidia took what they learned with TXAA and eventually turned it into DLSS.
And of course AMD and Intel have followed suit with slightly less capable options.

In conclusion, even FSR is an amazing option compared to the antialiasing options of the past, and you GAIN performance by enabling it. DLSS is basically magic.
Agreed 100%. People hate TAA like some kind of cult, but it is far better than FXAA and SMAA while being far cheaper than MSAA and SSAA. And as for upscaling, not using it is almost as dumb as maxing out ultra settings: you gain so little visually while sacrificing so much performance, especially when DLSS Quality looks better than native in most new games.
 
Aug 19, 2024
I can't say I fully agree. In CP 2077 @ 1440p I can definitely see a difference between Quality DLSS and native. Even if DLSS works wonders, the lower resolution source image still comes through somewhat when there's any movement of your perspective.

Where I find DLSS really shines is in combination with DLDSR. Running a game @ 1920p on a 1440p monitor with Quality DLSS renders it internally close to the native resolution (DLSS upscales to the higher target, then DLDSR downscales back to 1440p), and it's basically the best anti-aliasing technique bar none. It's one of the best ways to boost image quality in a game.
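For what it's worth, the arithmetic lands exactly on native at the 2.25x DLDSR factor (a sketch of my own, assuming Nvidia's commonly cited 2/3 per-axis render scale for DLSS Quality):

```python
# Internal render resolution when stacking DLDSR and DLSS Quality
# on a 1440p monitor. Assumes the 2.25x DLDSR pixel factor
# (1.5x per axis) and a 2/3 per-axis scale for DLSS Quality.

NATIVE = (2560, 1440)
DLDSR_AXIS = 1.5           # sqrt(2.25) per axis
DLSS_QUALITY_AXIS = 2 / 3

# DLDSR presents a larger virtual display to the game...
virtual = (round(NATIVE[0] * DLDSR_AXIS), round(NATIVE[1] * DLDSR_AXIS))

# ...DLSS renders below it and upscales, then DLDSR downscales to native.
internal = (round(virtual[0] * DLSS_QUALITY_AXIS),
            round(virtual[1] * DLSS_QUALITY_AXIS))

print(virtual)   # (3840, 2160): the supersampled target
print(internal)  # (2560, 1440): back at native resolution
```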
 
Aug 19, 2024
Imagine thinking you need UPSCALING* (not supersampling) to play a modern game at decent framerates, instead of blaming the developers for making games so poorly optimized that you actually need third-party technologies just to play decently.
 

Looming Dementia

Prominent
Jun 18, 2023
100%. Upscaler is the much more accurate term. But we know how much PC marketing teams care about technical accuracy and informative naming.
I know. It's maddening. I've seen so many people make videos to explain how to tweak the settings in games, and they'll repeatedly say something like, "... but we aren't actually upscaling it. We're downing the resolution to 1620p, instead of our desktop resolution of 4K."

No, what the person (in the video I'm thinking of) just described is upscaling, and then he declared that it's wrong that the game labeled it upscaling. 🤦 He's rendering it at 1620p, and the system is then upscaling that image to 2160p. People need to learn what words mean before they try to educate others about things that use those words.
 

sfjuocekr

Prominent
May 29, 2023
If you are playing a game that needs upscaling to run better, you are gaming with the wrong intentions to begin with.

If your system can't handle 4K, like most people, just play the game at 1080p. Can't play at 1080p? You go down to 720p.

You do not need to waste more resources on already struggling hardware...

Oh wait, we are comparing top of the line hardware with unrealistic settings in unrealistic scenarios. Stop feeding the upscaling hype, you don't need it and it looks downright awful.
 

Looming Dementia

Prominent
Jun 18, 2023
If you are playing a game that needs upscaling to run better, you are gaming with the wrong intentions to begin with.

If your system can't handle 4K, like most people, just play the game at 1080p. Can't play at 1080p? You go down to 720p.

You do not need to waste more resources on already struggling hardware...

Oh wait, we are comparing top of the line hardware with unrealistic settings in unrealistic scenarios. Stop feeding the upscaling hype, you don't need it and it looks downright awful.
You know, there are a lot of people out there with 30-series and 40-series cards who will have a much better visual experience running with higher detail settings and DLSS enabled. Much better.

Also, if you're playing on a 60+" large screen, you come out WAY ahead outputting at 4K with upscaling, compared to outputting at 1080p. If you can run with frame gen, even better. That's the sort of thing that this article is talking about, and it's real.

If you're playing on something tiny, like a 20" or 22" monitor, then they aren't talking to you. Go on with your day.
 
Aug 20, 2024
I would personally beg to differ, depending on the circumstances. I have compared DLSS and FidelityFX, and for my taste I prefer to enable these features on a game-by-game basis, based on how well they actually work, since some games handle visual fidelity with these technologies better than others. I'm also on the fence about ray tracing depending on the game, which affects things quite a bit: if I decide not to use ray tracing, it frees up a ton of resources I can put toward a natively rendered high frame rate in the first place.

My issue is that the little bit of, let's say, "definition loss" is extremely noticeable to me in certain games, where the trade-off for FPS and graphical settings isn't worth it; the clarity of the generated image is noticeably worse in some games than in others. Despite the higher fidelity settings and the bump in FPS, it's just not always worthwhile for me.

I get where you're coming from, but there are trade-offs (albeit small, yes) to this great tech we have available, and I really enjoy being able to dial in my games, with or without these technologies, to fit the exact experience I like from said game 🤷🏼‍♂️
 

sfjuocekr

Prominent
May 29, 2023
You know, there are a lot of people out there with 30-series and 40-series cards who will have a much better visual experience running with higher detail settings and DLSS enabled. Much better.

Also, if you're playing on a 60+" large screen, you come out WAY ahead outputting at 4K with upscaling, compared to outputting at 1080p. If you can run with frame gen, even better. That's the sort of thing that this article is talking about, and it's real.

If you're playing on something tiny, like a 20" or 22" monitor, then they aren't talking to you. Go on with your day.
Yes, I am one of them. I have a 65" screen with an RTX 3080 fed by a 5800X3D and sit at two meters from the screen.

For me upscaling is not the answer at all: it introduces artifacts that distract from what is going on, it introduces movement where there is none, and it adds latency by default. When I play games, most of them run at 1080p with an integer scale to 4K (2x per axis). If you wanted to run 1440p scaled to 4K, you'd be using a FRACTIONAL scale, effectively scaling an image that doesn't fit pixel-perfect. Which means? You are looking at a blurry image, because pixels need to be interpolated, and then you NEED a sharpening filter to make it look crisp again. That is not better; it is worse.
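The integer-vs-fractional point is just division (a quick sketch, with a 4K panel assumed as the target):

```python
# Integer vs. fractional scaling to a 3840x2160 (4K) panel.
# An integer factor maps each source pixel to an exact NxN block of
# screen pixels; a fractional factor forces interpolation (the blur).

TARGET = (3840, 2160)

def axis_scale(source):
    """Per-axis scale factor needed to fill the target panel."""
    return TARGET[0] / source[0], TARGET[1] / source[1]

print(axis_scale((1920, 1080)))  # (2.0, 2.0): clean 2x integer scale
print(axis_scale((2560, 1440)))  # (1.5, 1.5): fractional, needs a filter
```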

Like I said, if a game can't run at 4K and your machine is already struggling at 1080p, there is absolutely no reason to spend more resources on upscaling.

Besides all of that, all these arguments are based on review systems with specs that the average gamer could only dream of... a system where there is absolutely no need for upscaling. We are already comparing 100+ FPS to 100+ FPS for no reason other than to waste more energy on a game where it is not needed at all.

For me it is Motion Plus all over again: just slap a scaler on the output, apply a sharpening filter, and interpolate between frames. Now you've changed the image just enough for people to perceive it as better when it is not.