Maxwell’s Dynamic Super Resolution explored

One of the more intriguing capabilities Nvidia introduced with the GeForce GTX 970 and 980 is a feature called Dynamic Super Resolution, or DSR, for short. DSR is a way for a fast GPU to offer improved image quality on a lower-resolution display. Nvidia bills it as a means of getting 4K quality on a 2K display.

That sounds a little too good to be true, but still, my interest is piqued. Given that a whopping 77% of TR readers have monitors with a resolution of 1920×1200 or lower, I suspect DSR might become a very popular feature among PC gamers. Naturally, then, I’ve decided to take a closer look at DSR, to see exactly what it is, how it works, and what sort of images it produces.

So DSR is supersampling, right?

Let’s start at the beginning. In graphics, antialiasing is any of numerous methods intended to deal with a fundamental problem. GPUs are attempting to represent objects with all sorts of funky contours, from diagonal lines to curved surfaces to complex, irregular shapes, yet the final images must be mapped to a regular, fixed grid of square pixels. That’s less than ideal. The human eye does a spectacular job of recognizing patterns, so we tend to fixate on the jagged edges and crawling effects caused by mapping irregular shapes to a regular matrix of pixels.


Jaggies are easy to spot—and distracting in motion

Today’s graphics card control panels and in-game settings menus are littered with a dizzying collection of antialiasing options intended to address this problem. The various methods generally represent different sets of tradeoffs between image quality and performance.

As I noted in my initial review of the new GeForce cards, I’ve been part of a small chorus of people calling on Nvidia to enable supersampled antialiasing in its graphics control panel for some time now. Supersampling is the gold standard of antialiasing methods in terms of image quality and is widely used in offline rendering by the likes of Pixar. The performance hit is pretty drastic, though: 4X supersampling generally takes four times as long to render. Graphics cards used to offer a supersampling option in their control panels as a matter of course, but SSAA has fallen out of favor as more efficient edge-based AA methods like multisampling have grown more popular.

Happily, with the abundant power offered by the GeForce GTX 970 and 980, Nvidia has decided to expose an extra-high-quality rendering mode once again. DSR isn’t quite supersampling, but it is pretty closely related.

Supersampling is often described as rendering a scene at a higher resolution—something like 2X or 4X the number of visible pixels—and then scaling the image down to fit the display. That’s not a bad way to visualize what’s happening, but in a very nerdy sense, that description isn’t entirely accurate. Supersampling is really about taking multiple samples from different locations within the same pixel and blending them in order to get a higher-fidelity final result. Proper supersampling can grab samples from anywhere within a pixel, and the best routines may use a rotated grid or quasi-random sample pattern in order to achieve better results. Even the old 3dfx Voodoo cards, from back in the early days of 3D accelerators, took their samples from a rotated grid.
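
To make that distinction concrete, here's a tiny Python sketch of per-pixel supersampling against a made-up scene function. The rotated-grid offsets below are one plausible pattern for illustration, not any particular GPU's actual sample positions:

```python
def coverage(x, y):
    """Toy scene: returns 1.0 above a diagonal edge, else 0.0."""
    return 1.0 if y > 0.6 * x else 0.0

def supersample_pixel(px, py, offsets):
    """Average several sample points taken inside one pixel."""
    return sum(coverage(px + dx, py + dy) for dx, dy in offsets) / len(offsets)

# 4X rotated-grid sample offsets within the unit pixel (one common
# arrangement; real hardware picks its own positions)
ROTATED_GRID = [(0.125, 0.375), (0.375, 0.875),
                (0.625, 0.125), (0.875, 0.625)]

# A single center sample at the edge can only report 0 or 1;
# the four-sample average lands in between, smoothing the jaggy.
single = coverage(10.5, 6.5)                         # 1.0
multi = supersample_pixel(10.0, 6.0, ROTATED_GRID)   # 0.75
```

The intermediate value from the multi-sample average is exactly the gradient that softens a stair-stepped edge on screen.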


An example of rotated grid supersampling from a 3dfx Voodoo card

Oddly enough, Nvidia’s DSR really is about rendering a scene at a higher resolution and scaling it down to fit the target display. If you ask DSR to render a game at 4X the native res, say at 3840×2160 when the target display is 1920×1080, then the result should be similar to what you’d get from 4X supersampling.

The benefits are the same. The extra sample info improves every pixel—not only does it smooth object edges, but it also oversamples texture info, shader effects, the works. The performance hit is the same, too. The GPU will perform like it would when rendering to a 4K display, perhaps a little slower due to the overhead caused by scaling the image down to the target resolution.

The twist with DSR is that it can scale images down from resolutions that aren’t 2X or 4X the size of the target display. For example, DSR could render a game internally at 2560×1440 and scale it down to fit a 1920×1080 monitor. That’s just… funky, if you’re thinking in terms of supersampling. But it does seem to work.

In order to make DSR scale down gracefully from weird resolutions, Nvidia uses a 13-tap Gaussian filter. This downscaling filter is probably quite similar to the filters used to scale video down from higher resolutions, like when showing a 1080p video on a 720p display. The fact that this filter uses 13 taps, or samples, is a dead giveaway about how it works: it grabs samples not just from within the target pixel area but also from outside of the pixel boundary.
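
Nvidia hasn't published DSR's actual kernel, but the general idea of a Gaussian-weighted, multi-tap downscale from an arbitrary ratio can be sketched in a few lines of Python. The sigma value and tap placement here are assumptions for illustration:

```python
import math

def gaussian_downscale_1d(src, scale, sigma=0.8, taps=13):
    """Downscale a 1-D signal by an arbitrary (even fractional)
    factor using a Gaussian-weighted tap filter."""
    out_len = int(len(src) / scale)
    out = []
    for i in range(out_len):
        # Source-space center of this output pixel
        center = (i + 0.5) * scale - 0.5
        total = weight_sum = 0.0
        for t in range(taps):
            j = int(round(center)) - taps // 2 + t
            if 0 <= j < len(src):
                # Taps reach beyond the output pixel's own footprint,
                # which is what softens the result
                w = math.exp(-((j - center) ** 2) / (2 * sigma ** 2))
                total += src[j] * w
                weight_sum += w
        out.append(total / weight_sum)
    return out

# Scale a 2560-sample row down to 1920: a 1.33x linear ratio that
# plain integer-box averaging can't hit cleanly
row = [float(x % 2) for x in range(2560)]
downscaled = gaussian_downscale_1d(row, 2560 / 1920)
```

Because the sample centers can sit anywhere between source pixels, the same routine handles a 4X integer ratio and an oddball 1.78X ratio alike.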

We’ll get into some examples shortly, but the effect of blending in info from neighboring pixels is easy enough to anticipate. This downscaling filter will blur or soften images somewhat, granting them a more cinematic look. The effect is similar to the tent filters AMD used in its old CFAA scheme or, more recently, to the kernel employed by Nvidia’s own TXAA technique.

GeForce 8800 GTS: CSAA 8X
Radeon HD 2900 XT: CFAA 8X (4X MSAA + wide tent)

Some PC gamers seem to have a strong negative reaction to anything that reduces the sharpness of on-screen images, which is probably one reason why AMD sadly doesn’t offer CFAA with tent filters any longer. I happen to think Nvidia is on the right track here, though. In motion, from frame to frame, these softer images result in less high-frequency noise or sparkle. The images produced by the DSR filter convey a sense of solidity and consistency that I find very pleasing. Unfortunately, I can’t really demonstrate that reality using static screenshots or compressed video; you’ll have to try DSR for yourself in order to see what I mean.

Anyhow, that’s enough theory without many concrete examples—my apologies for that. Let’s look at how to enable DSR and what sort of images it produces.

Enabling DSR

Making use of DSR isn’t too terribly complicated. You need a Maxwell-based GeForce video card, like the GTX 970 or 980, and a relatively low-res display. The DSR options aren’t available with a GTX 980 connected to a 4K monitor, but they do show up at 2560×1440 and below.

To find them, navigate to the “3D settings” section of the Nvidia control panel.

You can then choose which DSR modes will be exposed to games by checking a series of options in a dialog box. I turned ’em all on, so I could choose from a whole host of DSR resolutions up to 3840×2400—four times the native res of my 1920×1200 display.

The “smoothness” slider lets you control the sharpness or softness of DSR’s downscaling filter. The default of 33% works well, in my opinion, but it’s possible to drop the “smoothness” to 0%, which will be very sharp and noisy, or to 100%, which is pretty smeary. I have some examples of what happens when you tweak this slider coming up.
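
Nvidia doesn't document how the slider maps onto the filter, but it's plausible that it widens or narrows the Gaussian. This hypothetical sketch (the slider-to-sigma mapping is entirely made up) shows how a wider kernel shifts weight from the center sample to its neighbors, softening the output:

```python
import math

def kernel_weights(sigma, taps=13):
    """Normalized weights of a 13-tap Gaussian, centered on the
    target pixel (index 6)."""
    half = taps // 2
    w = [math.exp(-(t ** 2) / (2 * sigma ** 2)) for t in range(-half, half + 1)]
    total = sum(w)
    return [x / total for x in w]

# Entirely hypothetical slider-to-sigma mapping, just to show the trend:
# a narrow kernel keeps nearly all of the weight on the center sample,
# while a wide one bleeds it into the neighbors
for smoothness in (0.0, 0.33, 1.0):
    sigma = 0.3 + 1.2 * smoothness
    center_weight = kernel_weights(sigma)[6]
```

At the narrow end the filter barely differs from a straight point sample (sharp and noisy); at the wide end most of the weight comes from surrounding pixels (smeary), which matches how the 0% and 100% settings behave on screen.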

After you’ve enabled DSR in the control panel, it will “fool” your games by offering them resolutions higher than the display’s native capacity. The example above comes from Guild Wars 2. I’m able to select from seven different resolutions that are higher than my display’s native size.

First-world DSR problems

Once you’re using a high-res DSR mode, you’ll be faced with some of the same problems that owners of 4K monitors often see.


The “larger” Guild Wars 2 interface at 1920×1200


The same “larger” interface scaled down from DSR 3840×2400 to 1920×1200

Not all games are well-equipped to handle really high-PPI (or effective PPI) displays. Guild Wars 2 has a number of different interface size options, but even the largest one results in some pretty tiny icons and a minuscule mini-map in the example above. The situation is improving as newer games adapt to a high-PPI world, but remember that DSR is most likely to be a live option with older games, those with less demanding GPU requirements than brand-new, top-shelf titles. These issues vary with the type of game and the sort of interface it uses, though. I suspect quite a few older games will work just fine.

Then again, not every new game is demanding enough to make DSR impractical. For the sake of testing, I eliminated all doubt on this front by doubling up on GTX 980 cards.

Yes, that’s right, I used dual GTX 980s in SLI to drive a 1920×1200 display.

That may sound like overkill, but remember that the cards are rendering internally at 3840×2400—and 4K resolutions are still challenging for any single-GPU config. With two GTX 980 cards, Unreal Engine 3 titles like Thief are not a problem.

Other games, like Guild Wars 2, Tomb Raider, and Battlefield 4, need every ounce of power that both of these GM204 GPUs could muster and sometimes still want more.

Notice above that I’ve included results from a native 4K display (at 3840×2160). As you can see, using DSR at 3840×2400 and scaling down to our 1920×1200 display for output is slightly slower than rendering natively at 4K.

DSR image quality in action

You can’t capture screenshots of DSR’s output using traditional software tools. You’ll just end up with a high-res screenshot, not the native-resolution downscaled result. In order to grab DSR images, I fired up my FCAT rig and captured five-second-long snippets of completely uncompressed 1920×1200 video to a quad-SSD RAID 0 array. I was then able to choose an appropriate screenshot from each video clip. That’s what you’re seeing below.

There are lots of screenshots in these comparisons. You can click the buttons beneath the images in order to switch from one shot to the next. We’ll start with a simple example from Guild Wars 2 showing some high-contrast edges. We have examples below from the display’s native 1920×1200 resolution and several DSR modes, up to “3840 DSR,” which renders internally at 3840×2400. I’ve also added examples using GW2‘s built-in “supersampling” option, its built-in FXAA post-process antialiasing filter, and the combination of 2560×1600 DSR with FXAA.


Flip through the screenshots, and you’ll see the obvious jaggies in the first image start to become smoother as the internal DSR resolution rises. The dark, near-horizontal edge across the bottom of the frame is a nice litmus test. The fine geometry of the vines hanging down on the left side of the shot is better resolved by the higher-res DSR modes, too. Also, the silhouettes of the leaves on the right of the image become smoother and more organic in the higher DSR resolutions.

Now I’ll ruin it for you. Look at the vine that shoots diagonally across the bottom right corner of the screenshot. Now look at it in 3840 DSR mode. You’ll notice that there’s kind of a dark halo effect around the vine in DSR. It’s worst in the 3840 mode since the sky texture happened to be brighter, resulting in higher contrast, as I pulled that screenshot. That’s one downside of DSR’s Gaussian downscaling filter. I don’t think it’s the end of the world, but it’s not ideal, either.

I’m not quite sure what GW2‘s built-in supersampling option is doing—perhaps it’s 2X supersampling?—but it’s not very effective at eliminating jaggies.

FXAA is a different case. Created by Timothy Lottes while he worked at Nvidia, fast approximate antialiasing examines the frame after it’s been completely rendered, detects edges in the image, and smooths along them. FXAA is very fast—the performance overhead is almost negligible on a high-end GPU—and can be quite effective, but it doesn’t know anything about the underlying geometry in a frame; it only knows about objects at the pixel level. As a result, FXAA can lead to subtly shifting silhouettes from one frame to the next. I’ve noticed this effect in shooters where there’s a gun positioned in front of the camera. The weapon’s outline can subtly morph as it bobs around. I’ll show you another one of FXAA’s weaknesses shortly. That said, FXAA is quite effective in the static shot above.
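
Real FXAA is far more sophisticated than anything that fits in a few lines, but this toy sketch (the contrast threshold and blend weights are arbitrary) illustrates why the technique operates at the pixel level only: it infers edges from luma contrast in the finished frame, with no access to the geometry underneath:

```python
def luma(rgb):
    """Perceived-brightness estimate, the kind of signal FXAA keys on."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_like(image):
    """Toy post-process filter: where local luma contrast is high,
    blend the pixel toward its neighbors. Real FXAA adds sub-pixel
    tests and an edge-direction search, but shares this sketch's
    core limitation: it sees only finished pixels, never geometry."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbors = [image[y - 1][x], image[y + 1][x],
                         image[y][x - 1], image[y][x + 1]]
            lumas = [luma(p) for p in neighbors] + [luma(image[y][x])]
            if max(lumas) - min(lumas) > 0.25:  # arbitrary edge threshold
                out[y][x] = tuple(
                    0.5 * c + 0.5 * sum(n[i] for n in neighbors) / 4
                    for i, c in enumerate(image[y][x]))
    return out

# A hard black/white vertical edge gets blended toward gray
img = [[(0.0,) * 3] * 2 + [(1.0,) * 3] * 2 for _ in range(4)]
smoothed = fxaa_like(img)
```

Since the filter can only blend colors that are already in the frame, details that rasterized away to nothing (like sub-pixel ropes and wires) are beyond its reach.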

FXAA is even more effective when combined with DSR at 2560×1600. The dark, near-horizontal edge in that shot is almost ridiculously soft and smooth. There’s nothing wrong with layering on multiple AA methods that work differently. Doing so can produce stunning results.

This next example shows us how well the different AA modes resolve fine geometry. Look especially at the ropes stretching between the tips of the windmill’s blades as you cycle through the images. Also concentrate on the arrow-like tops of the masts supporting those blades.


The reality becomes immediately obvious: without AA, the ropes are a blotchy, partial mess. Adding FXAA to the mix—with its ignorance of the underlying geometry—does nothing to help the situation.

With more internal samples, DSR is another story entirely. The ropes look a little better at DSR 2560×1600, and at DSR 3840×2400, they’re vastly improved. Although the difference is apparent in screenshots, it’s even more dramatic in motion. Without DSR, the ropes seem to sparkle, and portions of them disappear and reappear as the mill rotates. At 3840 DSR, the ropes appear properly as solid, fine silhouettes moving through the sky.

I’ve gotta say, though, I went back and looked at this same scene on a true 4K monitor. Yeah, uh, that’s even better, to put it gently.

More DSR image quality in action

We’ll use this sample scene from Skyrim to demonstrate the impact of tweaking DSR’s “smoothness” slider. Each of the images comes from DSR 3840 downscaled to a 1920×1200 native resolution.


The differences between the individual steps can be difficult to detect, so you might want to start by clicking on “0%” and then on “100%” to compare between the extremes. I see the contrasts most readily in several places. For edge antialiasing, watch the high-contrast edges around the dude’s cloak and around that dragon thingy on the right. To see changes in sharpness elsewhere, watch the stitches on that guy’s cloak and the interior texture on the wooden beam to the right.

My sense is that Nvidia’s default of 33% smoothness is pretty soft. That choice works out well along the high-contrast edges; those look feathery smooth in the example above. The wood texture on that right-hand beam does lose some of its definition, though, compared to the 0% or 15% levels. So do the stitches in that cloak. As I’ve said, I really don’t mind this minor loss of sharpness, since the softer images produce less sparkle and crawl while in motion.

The loss of definition is even more pronounced at the 66% and 100% smoothness settings, and it begins to look like overkill. I suspect that these higher amounts of blurring might become useful when scaling down from weird DSR ratios like 1.5X or 1.7X. In our example, we’re scaling down from four times the native resolution, so the sharper filter settings are sufficient to make things look nice.

While we’re on the subject of smoothness and blurring, let’s have a look at this same scene using some other AA methods.


Check out the wood grain on that right-hand beam in the shots above. Flip between the native-res or MSAA screenshots and the FXAA one, and you’ll see that FXAA blurs that texture considerably, likely because it detects the grain in the wood as edges and wants to smooth them. Whoops.

Next, watch the wood grain as you flip between the other images and the DSR 3840 shot. Although DSR does soften the final image somewhat, it also samples the underlying texture multiple times per pixel. The result is a more detailed rendition of the wood grain than in any of the other shots. Also, that wooden surface should be represented more consistently from one frame to the next, even as the camera shifts, since DSR pulls multiple samples from the underlying texture.

For my next trick, I’m going to try looking at AA methods using an example from a busy, low-contrast scene. Kids, this is high-risk behavior, so don’t try it at home. Here’s a shot from Crysis 3 with a ton of on-screen detail. This sort of thing presents a formidable challenge for most current AA techniques.

I’ve circled a few areas in this scene where I’d like to focus our attention. The mess of tree branches and foliage on the left is complex and organic. Without AA, it’s just kind of a pixelated mess. In the middle of the shot, the tree branch full of tiny, single-pixel leaves looks like it’s being viewed through a screen door; in motion, these leaves will shimmer and pop unrealistically. On the right is a series of near-horizontal edges that are marred by jaggies.


Let’s talk first about the branches on the left. As we move from the native resolution to 2560 DSR and then to 3840 DSR, two contrasting things happen. The areas of the screen stippled with single-pixel leaves become softer and less contrasty, while the darker tree branches become more clearly defined. The smallest branches, made of fine geometry, are easier to pick out. Pretty much the same is true of the tree branch and leaves in the middle of the shot, too.

On the right, the edges in the building look a little blurry at 2560 DSR, but switching to 3840 DSR increases the clarity of the image while further reducing jaggies.

Overall, my takeaway is that DSR—especially at a 4X ratio—offers a considerable improvement in image quality for this extremely complex scene.

Crysis 3 has a couple of other popular AA methods built in, so I’ve included screenshots from them. SMAA is a post-process AA method similar to FXAA, but its 2X mode incorporates some spatial multisampling, as well. 2X SMAA somewhat reduces the jaggies on the building ledge, but it doesn’t clarify the fine geometry in the leaves and branches nearly as well as 3840 DSR.

Nvidia’s TXAA combines multisampling with a couple of tricks: varying sample patterns temporally and borrowing some samples, lightly weighted, from neighboring pixels. In this last respect, it’s similar to DSR’s downscaling filter. As you can see, TXAA softens on-screen images somewhat. The results, I think, are pretty darned good. TXAA looks strikingly similar to 2560 DSR in this example, and it handles nearly everything well, even the fine geometry in the tree branches on the left.

TXAA has to be integrated into the game engine in order to work, and it’s only available to owners of GPUs in Nvidia’s Kepler and Maxwell families. If it’s an option for you, though, it’s worth considering. I’d say TXAA is a nice consolation prize for owners of Kepler-based GeForce cards who can’t yet get access to DSR.

The final verdict

Now that you’ve seen DSR in static screenshots, I hope you can begin to appreciate what it does. I’m not sure I’d say it gives you 4K image quality on lower-resolution displays, but DSR can offer tangible and sometimes dramatic improvements in fidelity. Other AA methods sometimes come close, as we saw with TXAA in our Crysis 3 example, but DSR at four times the native resolution offers the best image quality currently available without hacking somebody’s drivers. The combo of 4X oversampling for every single pixel and soft scaling to reduce temporal noise is pretty spectacular.

Just keep in mind that it’s expensive. Rendering a scene in DSR’s 3840×2400 mode is at least as taxing for the GPU as driving a native 4K display. You’ll want to use DSR in cases where a game, even at its highest quality settings, doesn’t present any real challenge for your video card otherwise. Happily, the GTX 970 and 980 are fast enough that such situations shouldn’t be terribly rare.

For those more demanding scenarios where DSR isn’t appropriate, Nvidia has another trick up its sleeve known as MFAA. I discussed it briefly here in my GTX 980 and 970 review. MFAA promises the quality of 4X MSAA at the performance cost of 2X MSAA by employing various forms of dark magic. Unfortunately, we’re still waiting for Nvidia to deliver a driver with MFAA enabled. Once they do, hopefully we can take a closer look at it, too.

I have to scale down my sentences to 140 characters in order to fit them on Twitter.

Comments closed
    • ufoman
    • 5 years ago

    That’s an extremely nifty article exploring DSR’s capabilities. I’ve particularly liked the smoothness comparison. I’ll definitely include TR in my daily tech site rotation 🙂

    • WhatMeWorry
    • 5 years ago

    Fascinating article and enjoy the advancement in technological, but maybe I’m too old and can’t really appreciate the improvement, (I remember Zork) but I can’t get over how impressive the non-improved or “worst” case images are.

    To my bloodshot eyes, a lot of the new images are just “different”. Hey, you don’t need a new GPU or monitor, just get failing eyesight.

    • Ninjitsu
    • 5 years ago

    It’s curious, I’ve seen TR mention quite a few times in recent history that SSAA is disabled in the NVCP, but my GTX 560 can use it…maybe they disabled it for Kepler and later archs.

    • dashbarron
    • 5 years ago

    I can hear Damage cackling in his lab, “I used two 980’s for 1080p, and noone can stop me!”

    • itachi
    • 5 years ago

    Well that’s pretty damn cool to see new innovating technology! I personally couldn’t see any difference I bet it’s my browser, surely I can’t be that tired, or maybe… but it does sound good, make it worthy to invest in a “hypothetical 980Ti” even while having a 1200p display like me ? 😉 until I can blast for a 4k screen or a 1440p or 1600p 144hz assuming they’re fast enough for FPS, oh by the way there was that news talking about this on some other websites, I would be very curious what you guys at Techreport think of these displays which should be coming (hopefully) soon ! really excited to hear news from that IPS-Type panel as they call it and hope they make crazy screens, i’d be more interested in that due to the very high budget required for smooth 4k experience.

    • TwoEars
    • 5 years ago

    While it’s a cool piece of tech I wouldn’t personally use it. I’m very sensitive to stutter and anything that isn’t buttery smooth tends to break the immersion for me, but a few jagged edges on the other hand I rarely notice. I rarely focus on detail in games but look at the big picture if that makes sense.

    So I’d rather have the performance overhead to make sure I don’t experience any slowdowns.

    But that’s my personal preference of course, each to his own and all that.

    • l33t-g4m3r
    • 5 years ago

    You can already do this on existing nvidia cards without the blur filter. It’s called downsampling.
    [url<]http://www.neogaf.com/forum/showthread.php?t=509076[/url<]

      • Meadows
      • 5 years ago

      Might not scale down as nicely as with DSR’s Gaussian filter. (I don’t know what algorithm the “regular” downscaling uses.)

      On the other hand, the method you linked works nicely, at least. Sadly, my 660 Ti can at most exceed my native resolution by exactly 50% in both dimensions. If I set it any higher, even just 1% higher, the driver hangs. I’d suspect the analogue connection has something to do with it but I’m not overdriving the RAMDAC because I followed the instructions you linked and left the output parameters at the intended native resolution. (Input parameters, exposed to Windows, were raised.)

      Also, on this analogue connection, I’m not getting the option to set scaling to “GPU”. “Display” is the only option I’m allowed to use, which may also have something to do with it.

      Still, exceeding my native resolution by 50% “for free” is pretty cool: that would equal something like 2880×1620 for someone with a full HD monitor. But I’d still rather have DSR because I’m sure its implementation is better than this.

        • l33t-g4m3r
        • 5 years ago

        Right. It is a hack that has some limitations, but it does work much better with a digital connection where the GPU scaling option is available. ATI owners can do downsampling too, but it’s several times more work, as it requires additional software and steps.

        For the most part, all I have to do is select a higher resolution in game, and the driver does the rest. Works about the same as DSR with low/no smoothness. The screenshots in the [url=http://www.neogaf.com/forum/showthread.php?t=509076<]link[/url<] prove that it works.

        Considering all that DSR is, I don't think it's a feature "exclusive" to Maxwell, but a software tweak added to nvidia's control panel for those cards, much like FXAA. We'll see if it gets backported to Kepler. I would also prefer a smart filter like SMAA over their normal blur filter. That would be more ideal.

        Also, considering how well nvidia's scaling works, I don't see why game streaming has to change resolution to 720p. They should just scale down your desktop res to 720, which wouldn't screw up the layout of your desktop shortcuts. Which of course, is really a MS problem that hasn't been fixed since 9x. Will w10 fix it? Probably not. They just finally fixed the command prompt after all, which will be w10's new killer feature, like the task manager was for w8. I digress.

        • jessterman21
        • 5 years ago

        Yep, same here – 150% h/w of native. And yep – this is a much slicker implementation – with a sharpness slider 🙂

          • Meadows
          • 5 years ago

          Regardless, this “hack” did smooth out some games, but many of the edges remained noticeable because of the oddball resolution.

          It would probably be perfect had I managed an even 200%, but at 150% the performance penalty was already insane. I tried it in Metro 2033 for example: I got 10 fps in a specific scene where I’d previously manage 40 instead.

            • jessterman21
            • 5 years ago

            I don’t use it in Metro 2033 because the “being attacked” red-screen thing only lights up part of the screen when I use downsampling…

            And because Metro…

            • Terra_Nocuus
            • 5 years ago

            Having DSR on in Metro 2033 gives me different types of visual distortion / corruption. At one level, it’s fine black lines across the entire screen, at another, it’s red lines. So I just stick with native res.

    • kamikaziechameleon
    • 5 years ago

    Very interesting article indeed.

    • GrimDanfango
    • 5 years ago

    I tried out DSR on Elite Dangerous, running at 5120×2880 on my GTX 980/ROG Swift.

    It looks absolutely stunning! This really comes into its own when you have very high detailed distant textures, like looking at the outside of a Coriolis station (or a planet, etc) – the textures seem to mipmap far less, presumably as the game thinks they’re being projected much larger than they really are, so all the windows and detailing are absolutely pin-sharp! It looks absolutely incredible compared to rubbish old 2560×1440 resolution 😛

    Of course, even a 980 can only just about struggle to run at that resolution – GPU-Z also reckoned I was very nearly hitting the 4GB limit of the card’s RAM too! Still, it was *borderline* playable 🙂

      • Milo Burke
      • 5 years ago

      1440p with DSR sounds fantastic, but 5120×2880 is 14.7 megapixels! 78% more pixels to render than the already heavy load of 4k!

      But I think it will be a nice place when we have the power to run it.

        • GrimDanfango
        • 5 years ago

        Well, it was a fun experiment, but I personally don’t think supersampling will ever be the best way to do things. Real time graphics is all about the most efficient ways to achieve a particular effect, and supersampling is by its very nature a brute-force approach.
        No matter how much GPU power we have available, there will always be a better way to apply it. With a ROG Swift, that way is almost inevitably to make it run buttery-smooth at 120hz+. Temporal resolution is something that has been very much lacking until recently (and will be utterly vital when the Rift comes to market!)

        Supersampling will always have its place making older games look as crisp and detailed as possible, but I don’t think it’ll ever really be a viable technique for anything cutting edge.

          • GrimDanfango
          • 5 years ago

          I can now add that The Vanishing of Ethan Carter looks absolutely sublime at 5120×2880!
          It just about manages a stable 30-40 fps most of the time too… except when you use the wibbly-detective-power and it has to render two scenes at once. Alas, not *quite* enough to make use of G-Sync to smooth things out.

          It really does cause this game to make the leap from very impressive to borderline photorealistic though. It being a relatively slow game, I’m inclined to take the frame rate hit just to enjoy it this way.
          Bring on GM200-über-Titan-thingy! DSR might be a tad more viable with a 384-bit bus and 6-8GB of RAM 🙂

            • jessterman21
            • 5 years ago

            Who’d have ever thought we’d call a UE3 game borderline-photorealistic? Can’t wait for it to go on sale 🙂

            • GrimDanfango
            • 5 years ago

            Just goes to show how little the engine has to do with good or bad visual design 🙂

    • floodo1
    • 5 years ago

    I think i’ll stick with real super sampling on my 290x 🙂

    • Rakhmaninov3
    • 5 years ago

    Great review. Totally would have tipped it if there’d been a tip button available (hint hint)

      • Damage
      • 5 years ago

      You can pay what you want to support TR here:

      [url<]https://techreport.com/subscriptions.x[/url<]

      In return, you'll get a subscription with some nice features like single-page view, as well, that you can use if you'd like.

    • Chrispy_
    • 5 years ago

    Is it just me, or does FXAA hold up pretty well for “free” AA?

      • Waco
      • 5 years ago

      I force it on in everything just because of that…even if the underlying app is forcing other AA methods.

      I wouldn’t mind playing with this in older games (that have nice UI scaling) on my 770…it’s too bad it’s restricted to big Maxwell.

      • BlackStar
      • 5 years ago

      FXAA is too blurry, but SMAA looks awesome and it is essentially free on a modern GPU.

    • swaaye
    • 5 years ago

    My favorite use for downscaling is my 1360×768 50″ plasma TV. Running a 2720×1536 downscale via my 560 Ti is such an improvement it has eliminated my desire for a TV upgrade.

      • jessterman21
      • 5 years ago

      Yep – downsampling is the reason I’ve settled with my 900p screen until 1080p Gsync monitors hit $199.

    • Meadows
    • 5 years ago

    What’s up with all the “meh, blur” comments by the way? Is this some sort of shill campaign by AMD?

      • Prestige Worldwide
      • 5 years ago

      Mostly just a few individuals who seem to like to post comments to provide evidence that they didn’t read the article.

    • deruberhanyok
    • 5 years ago

    Thank you guys so much for this writeup!

    The very first thought I had after looking at that windmill screenshot was about the plethora of 27″ 1080p monitors out there now, with good quality IPS displays and low response times (of which I have one), and the question of upgrading the display.

    I’d been thinking to pick up a 2560×1440 27″ display to go along with a 970 when I got it, but now that I’ve seen DSR in action I’m not sure it would be worth it. Maybe better to just do the DSR thing and hold out on the monitor upgrade until 4k+freesync is readily available, a few years from now.

    • Nictron
    • 5 years ago

    Thanks for a good review of the DSR functionality. It is always a pleasure finding these on TR.

    I will stick to my 27″ 2560×1440 for now. Though I am tempted by either a Dell, Samsung or LG curved wide 34″ screens with G-Sync running 4K, I can dream :).

    • Bensam123
    • 5 years ago

    You know, stuff like this is great for photos, but if you play it, it doesn’t feel like you’re playing at said resolution when you ‘mess’ up the textures enough. Even though you’re still operating at 1080p, it’ll go down hill from there due to the lack of sharpness.

    I think the best methods will always involve just the edge and hopefully just that, without getting anything else in the scene involved.

      • chuckula
      • 5 years ago

      FYI, I upthumbed your post but apparently there’s some hateraid around.

        • Bensam123
        • 5 years ago

        Are you demanding a pat on your back or are you rabble-rousing?

      • Bauxite
      • 5 years ago

      I just said ****it and bought a TV capable of 4K@60 with low input lag, along with a 980, as the TV needs HDMI 2.0 to work. It’s the ideal option compared to [i]any[/i] kind of AA running at 1080p, and if the game is less demanding I can still run some of the AA options as well. Won’t be for everyone: it’s not that cheap (but still far cheaper than a 4K@60 PC monitor) and needs a big desk, but PC monitors just aren’t keeping up on price and features. Maybe when ____ sync goes mainstream and isn’t TN + overpriced.

        • Bensam123
        • 5 years ago

        Aye… I don’t think this stuff really works in the intended way. It smooths images and makes them artistic, but at the cost of perceived resolution.

        Even though this is downscaled 4K, it almost seems like less than 1080p due to the filters.

    • sfarabi
    • 5 years ago

    [quote]MFAA promises the quality of 4X MSAA at the performance cost of 2X MSAA by employing various forms of dark magic.[/quote] That sentence made me chuckle. I would love to see DSR compared against the native “uber sampling” option from The Witcher 2.

      • Meadows
      • 5 years ago

      GPU vendors have promised “the quality of 4X MSAA at the performance cost of 2X MSAA” quite a few times over the years.
      I’ll believe it when I see it.

    • drfish
    • 5 years ago

    Something I’ve done that has a similar effect to this is streaming games from my desktop to my laptop with Steam. My laptop is only 1366×768 but if I stream 1920×1080 games to it they do look markedly improved (specifically Civ5). Streaming 2560×1440 to it starts to give you the same problems you would expect from 4K though…

      • Theolendras
      • 5 years ago

      Nice to know. I was wondering if the gain would be mostly negated by the encoding process (I know it happens after rendering ends), but still…

      So having a 4K-capable desktop in the basement might actually improve the image quality of my upcoming HTPC/in-home streaming setup in the living room with a 1080p screen.

      More and more, it looks like the PS4 will have to lock in some quite nice exclusives to get my cash…

    • Anovoca
    • 5 years ago

    Currently I am running one 2560×1440 and two 1920×1080 monitors. I am curious whether I can use this tool to trick games into thinking all three monitors are 2560×1440 and let me play games three-wide. Looking at the screenshots of how to enable this in the control panel, I think that’s unlikely, but it’s worth a try at least.

    • mad_one
    • 5 years ago

    Thank you for this article! I would have liked to see the different sharpness settings for the lower scale factors, which are probably a harder case for the filter. At 2x, 0% and 15% seem superior to me.

    What both the article and most comments seem to ignore is that this is most useful as a “last resort” AA technique for titles that do not work with driver SSAA or MSAA, e.g. Defense Grid 2. There, the only thing that works is FXAA, and DSR is vastly superior to that.

      • puppetworx
      • 5 years ago

      [quote]What both the article and most comments seem to ignore is that this is most useful as a "last resort"[/quote] I agree. It's an expensive process, but if you're sitting looking at a poorly ported console title with scant AA options, it's a really nice option to have. I only recently played BioShock Infinite: Burial at Sea, and the anti-aliasing options were abysmal: you can choose from 'On' or 'Off'. I even resorted to SweetFX to try to enhance the AA, but to no avail. Those first steps into Rapture were an [url=http://i.imgur.com/mo7P3GQ.jpg]abomination[/url]. A process like this would have been an enormous improvement.

    • rutra80
    • 5 years ago

    Ok so they made a kind of SSAA prosthesis with blur, better than nothing I guess…

    • Wirko
    • 5 years ago

    A noob question: do any of the AA methods take the LCD subpixel arrangement into account? Not just in games; Photoshop etc. could also take advantage of that when downscaling a raster image for display.

      • hechacker1
      • 5 years ago

      I doubt it. My 6950 (soon to be replaced with a GTX 970) offers lots of AA methods. One of the reasons I don’t use “edge-detect” or the various wide and narrow filters is that they blur the entire screen.

      Yeah, they look good, but they also happen to blur the GUI and text, which sucks. Combined with the fact that Steam uses subpixel rendering for fonts (ClearType), it just looks screwed up when those features are enabled.

      So I’d rather have sharp, inferior methods of in-game AA, so that at least text and the GUI render sharply. If only you could decouple those aspects.

        • Cannonaire
        • 5 years ago

        Do you know if the Steam overlay uses ClearType even in Windows 8? I seem to remember Microsoft started using a different kind of font smoothing after Windows 7.

      • Damage
      • 5 years ago

      I’m not aware of any GPU or gaming AA modes that are sub-pixel aware. I think that means there aren’t any in use right now. 🙂

        • Cannonaire
        • 5 years ago

        It might be nice in a few instances, but wouldn’t sub-pixel-aware AA modes result in awful screenshots when viewed on other monitors? I cringe when I see the cyan/magenta/yellow fringing around upscaled screenshots of text as it is.

          • Damage
          • 5 years ago

          Yeah, sub-pixel-aware AA modes don’t look nice at the multi-pixel level. Not sure why that would be a problem, really, in the grand scheme, but there you have it.

        • BlackStar
        • 5 years ago

        SMAA is subpixel aware, which is why it looks so much better than FXAA.

          • Damage
          • 5 years ago

          SMAA 2X and higher includes a component of spatial multisampling, which means it gathers samples at a sub-pixel level from the underlying scene geometry.

          However, we’re not talking about that. We’re talking about the LCD subpixel arrangement of the display.

          Some AA schemes, like MS ClearType, change their output to take advantage of the positioning of the RGB subpixels on the display in order to achieve a higher effective resolution.

          I don’t believe SMAA does anything of that nature. If it did, users would have to tune it to match the LCD subpixel arrangements of their monitors.
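To make the distinction concrete, here is a toy sketch of display-dependent subpixel rendering in the ClearType vein. This is NOT ClearType's actual algorithm — the function name, weights, and sampling are my own invention, purely for illustration. The idea: on an RGB-striped LCD, three horizontally adjacent ink-coverage samples can each drive one subpixel of a single output pixel, tripling effective horizontal resolution for black-on-white text.

```python
# Toy illustration (not ClearType's actual algorithm) of why
# subpixel-aware rendering must match the panel's subpixel layout.

def subpixel_pack(coverage, layout="RGB"):
    """Map groups of three horizontal ink-coverage samples (0..1)
    onto one output pixel's (R, G, B) channels."""
    pixels = []
    for i in range(0, len(coverage) - 2, 3):
        r, g, b = coverage[i], coverage[i + 1], coverage[i + 2]
        if layout == "BGR":   # a BGR panel needs the mirrored order,
            r, b = b, r       # which is why users would have to tune it
        # Each channel darkens in proportion to the ink covering it.
        pixels.append((1.0 - r, 1.0 - g, 1.0 - b))
    return pixels

stem = [0.0, 1.0, 0.0]   # ink over only the middle third of the pixel
edge = [1.0, 0.0, 0.0]   # ink over only the left third

print(subpixel_pack(stem))          # [(1.0, 0.0, 1.0)]
print(subpixel_pack(edge))          # [(0.0, 1.0, 1.0)]
print(subpixel_pack(edge, "BGR"))   # [(1.0, 1.0, 0.0)] -- panel-specific
```

The last two lines show Damage's point: the same one-third-pixel stroke produces different channel values on RGB and BGR panels, so any such scheme has to be tuned per display — and the colored output is exactly the fringing that looks wrong in screenshots viewed elsewhere.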

    • auxy
    • 5 years ago

    What is with the idiots in this comment thread? “Another blurry AA method”, “another NVIDIA gimmick”, COME ON! This is what real graphics fans (like me) have been waiting a decade for, ever since super-sampling fell out of fashion in the GeForce 4 era. Comments like this infuriate me so badly I want to crush your tiny pinheads.

    I remember back in 2005 I used to hack the registry so I could make Final Fantasy XI run at 4x resolution internally. I have no idea what method the game used to downsample that to the output resolution, but it was a gorgeous effect and it blew my mind. [url=http://www2.picturepush.com/photo/a/13358890/1024/FFXI/pol-2009-06-18-01-46-22-61.jpg]Here's an example.[/url] It's a PlayStation 2 game, so the models and textures are very simple, but look how nice and clean all the geometry is. I've been addicted to super-sampling ever since, from setting /renderscale 2.0 in [url=http://www4.picturepush.com/photo/a/13129302/img/Champions/GameClient-2013-05-09-01-58-35-48.png]Champions Online[/url] and more recently [url=http://www2.picturepush.com/photo/a/14195795/img/Neverwinter/GameClient-2014-05-31-08-52-06-58.png]Neverwinter[/url], to using SGSSAA on everything from [url=http://www1.picturepush.com/photo/a/14220759/img/TERA-Online/TERA-2014-06-20-07-11-31-37.png]TERA Online[/url] to [url=http://www3.picturepush.com/photo/a/13988401/img/BLR/BLR-2013-11-09-10-32-50-06.png]Blacklight Retribution[/url] and [url=http://www1.picturepush.com/photo/a/14328319/img/PSO2/pso20140829-012738-001.png]Phantasy Star Online 2[/url], and even more recently, with durante's [url=http://blog.metaclassofnil.com/?tag=gedosato]GeDoSaTo[/url], running [url=http://www3.picturepush.com/photo/a/14304041/img/Dark-Souls-II/Majula-Downsampled.png]Dark Souls II[/url] and [url=http://www3.picturepush.com/photo/a/14193871/img/Mass-Effect/screenshot-2014-05-17-04-47-40.png]Mass Effect[/url] in 4K downsampled to 1920. [url=http://www1.picturepush.com/photo/a/14012349/img/DarkSouls/screenshot-2014-02-06-01-58-23.png]The first Dark Souls game[/url] could achieve the same effect using some settings in DSfix. I also have used GeDoSaTo on Borderlands 2, see [url=http://www3.picturepush.com/photo/a/14308276/img/Borderlands-2/screenshot-2014-08-17-13-14-25.png]here[/url]!
    I'm really glad some newer games have started to implement super-sampling settings in-game, like the newer Tomb Raider and [url=http://www4.picturepush.com/photo/a/13243142/img/Remember-Me/RememberMe-2013-06-06-00-36-01-34.png]Remember Me[/url], which I found [url=http://www1.picturepush.com/photo/a/13243139/img/Remember-Me/RememberMe-2013-06-04-01-37-05-96.png]visually[/url] [url=http://www2.picturepush.com/photo/a/13243135/img/Remember-Me/RememberMe-2013-06-04-00-52-13-84.png]stunning[/url], but lacking as a game. Seriously, DSR is a godsend, and quite frankly anyone who doesn't think so is not likely to be someone I will ever agree with about graphics. Higher resolution improves any signal, and games are no exception; even simplistic lo-fi games like the old FFXI or Minecraft still look better with less artifacting. [i]*I do wish it supported Lanczos filtering instead of Gaussian, though...[/i]

      • rahulahl
      • 5 years ago

      When I played God of War on an emulator, I played at super-high resolutions downscaled, and the result was PS2 games looking awesome. But when I tried this on my games, I didn’t get any such benefit, mainly because at 1440p my resolution is already high enough that making it 4K gives diminishing returns. So for the blurriness and the high performance hit, I get a minor AA improvement. Sorry buddy, but it’s not for everyone. Maybe if you have an old game or a really low-resolution screen it might be worth using. But its use is gonna be limited.

        • auxy
        • 5 years ago

        Nah, this is nonsense.

        1920×1080 is not “really low resolution”; most people are still using this resolution or similar, including most enthusiasts, as shown by the recent TR hardware survey. I have several 2560x monitors here and I stick with my triple-1920x setup because 2560 just isn’t enough of an improvement over 1920×1080, since the DPI isn’t actually any higher when you have to move to an inconveniently large 27″ display. (Also, my single GK110 GPU handles games in 3840×2160 quite nicely, while 5120×2880 is generally unplayable.)

        If you’re getting blurriness when downsampling, you’re doing it wrong. More likely you just don’t know what you’re doing and didn’t set it up correctly, both because doing so is a bit unintuitive and also from your post you don’t seem especially informed. 2560×1440 is not a high enough resolution to start saying things like “my resolution is already high enough”.

        Just like the other guy who posted a similar-but-even-less-informed comment, DSR is not “blurry”. Did you even read the article? It’s considerably LESS blurry than other AA methods, especially FXAA and TXAA. Given how ill-informed you are, I really don’t even know why I bother writing this reply post, but maybe it will help someone else who comes along.

          • rahulahl
          • 5 years ago

          I just compared 480p from the PS2 to 1440p on my monitor.
          Why is it wrong for me to say that 480p benefits more from this than 1440p does?

          I might be ill-informed, but what I tell you is what I saw. I saw no reason to upscale my games from 1440p to 4K. I tried it anyway, since I get about 300 FPS in my game with maxed AA. The results, though, were blurry for me. Initially I didn’t even know it was supposed to be blurry; the blurry textures just annoyed me even though I didn’t know what was causing it. I only realized after playing a couple of games that the reason it felt blurry was because it actually was. I didn’t go looking for imperfections; it just hit me too blatantly to ignore.

          If a setting meant to make your game pretty results in a performance hit, that’s one thing. I mean, some people would prefer performance, while others might prefer the visual improvement.
          But here it’s not quite that simple, because this setting degrades a certain visual aspect and incurs a big performance hit at the same time.

          And yes, you might consider 1440p not good enough, but for me, there’s no reason to go any higher. I do have the Dell 15 ultrabook, which comes with a 4K display in a 15-inch format.
          I did try games on it the day I got it just to see what 4K gaming feels like, and I didn’t see any visual improvement worth the performance hit. So yeah, as far as I’m concerned, 1440p is high enough.

      • jessterman21
      • 5 years ago

      Couldn’t agree more, and I absolutely adore playing PCSX2 and Dolphin with 3x SSAA; they both have an internal scaler built in.

      I’ve been downsampling on my 900p monitor since I picked up my GTX 660 and now I don’t play any game at less than 1080p. There is some blur, for sure, but it’s worth it for the much sharper textures and less aliasing everywhere. Combining downsampling with SweetFX is a great solution, since you can sharpen up the final image to make it look more native.

      • aggies11
      • 5 years ago

      I think both sides are “right” 😉

      One of my pet peeves about computer games is that scenes often look just a little too “crisp” compared to, say, reality. That’s just the nature of square-pixel-based rendering: there is always going to be extra “detail”/“noise” added to the image, not just edge aliasing, but really in all facets of the image, since it can’t be rendered 1:1.

      So I think all the people complaining about the loss of detail are, in effect, correct. However, I’d say its proponents argue that the details you’re “losing” are actually artifacts introduced by the flawed rendering process.

      Personally I like the “smoothed” effect, especially in scenes like the Crysis 3 one, where it really gives a more “diffuse” lighting effect. It makes the scene seem to have more depth, more akin to what you would see in reality or possibly on film. The lack of all the extra detail makes it seem more real/believable.

      But I can definitely see how some might prefer (or become attached to) the extra detail found in computer-rendered games.

      TLDR – it is getting rid of detail; that’s the whole point. The argument is that the detail it’s removing consists of imperfections in the image that aren’t supposed to be there. But I can see how some people could prefer that extra detail, since it makes the image appear sharper. (I.e., look at a sharpening filter in a photo editing program: it adds detail/noise not found in the original photo, making it appear sharper.)

      PS. I’m not against more detail personally. In fact, for games such as Skyrim, where the texture detail is so abysmally low, I actually used a post-process sharpening filter (but then applied an FXAA smoothing filter after that, to make the added detail seem “baked in” and not artificially sharp). The end result gave the appearance of more detail while still maintaining the diffuse lighting/“realistic” feel.
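The sharpening behavior described above can be sketched with a classic unsharp mask. This is a minimal 1D example of the general technique, not any particular game filter; the 3-tap box blur and the function names are illustrative choices. The key point: sharpening adds back an exaggerated difference between the image and a blurred copy, creating overshoot next to edges that was not in the original signal — which the eye reads as extra "detail".

```python
def blur3(signal):
    """Simple 3-tap box blur, clamping at the edges."""
    out = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, len(signal) - 1)]
        out.append((left + signal[i] + right) / 3.0)
    return out

def unsharp_mask(signal, amount=1.0):
    """sharpened = original + amount * (original - blurred)"""
    blurred = blur3(signal)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A hard edge from dark (0.0) to bright (1.0):
edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
print(unsharp_mask(edge))
# Overshoot appears on both sides of the edge: the value just before
# it dips below 0 and the value just after rises above 1, which is
# the added "crispness" (and noise) that was never in the original.
```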

    • arunphilip
    • 5 years ago

    Just curious – which games do the first screenshot (the barrels) and the second one (the church roof/steeple) come from?

      • auxy
      • 5 years ago

      From the old 3DMark2001.

        • Terra_Nocuus
        • 5 years ago

        I thought the church roof was from Oblivion, but I couldn’t place the barrels 🙂

          • Damage
          • 5 years ago

          The first one is from the old 3DMark, like auxy said, and the second is indeed from Oblivion.

          • auxy
          • 5 years ago

          You’re right, sorry; I didn’t read the entirety of the OP. My mistake! ( *´艸`)

        • swaaye
        • 5 years ago

        3DMark 2000 actually. 🙂

          • Entroper
          • 5 years ago

          The best 3DMark.

            • Cannonaire
            • 5 years ago

            I agree. I can still hum the music from the demo sequence in 3DMark 2000.

    • ferdinandh
    • 5 years ago

    I keep reading that 1920×1080 is such a low resolution that a fast card is overkill for it. I have a 660 Ti, and it cannot handle Far Cry 3 on high at 1920×1200. Is the 970 so much faster that I could run Far Cry 3 on high at 60+ fps?
    And I love playing Freedom Fighters with the heavy 32x AA options from the control panel. Nothing flickers, and everything is smooth without being blurry 🙂

      • Meadows
      • 5 years ago

      Nvidia’s own numbers claim the GTX 970 is about 30% faster than the 660 Ti, and the GTX 980 about 50% faster. TR’s results are similar.

      I also own a 660 Ti, but mine is greatly overclocked. I’m entirely satisfied with it that way for the time being. Perhaps that’s your best bet in the near term.
      (For reference: I set its top speed at 1160 MHz and the memory to 7010 MT/s. Performs pretty decently. I don’t even remember what the base specs are, because it was already factory overclocked when I bought it.)

      Edit: yes, +30-50% performance is very significant. For reference, I pulled the above approximate numbers from this comparison page: [url]http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/performance[/url]

      • Prestige Worldwide
      • 5 years ago

      [url]http://lmgtfy.com/?q=gtx+970+far+cry+3+benchmark[/url] Patronizing aside, yes: according to the first Google result, Tom's shows Far Cry 3 on the High detail preset at 1080p with 4x MSAA running at a minimum of 64 fps and an average of 77.4 fps.

    • Vinceant
    • 5 years ago

    I have used it and like it quite a lot… but I don’t use it very often because it rearranges every danged window on my secondary monitor and puts them all on my primary. So every time I exit a game I’ve used it on, I have to spend two minutes cleaning up the mess DSR made. I also can’t read my chats and such while running a game using DSR. This is almost a dealbreaker for me. I’m not sure why the main monitor’s resolution screws up window arrangement on the secondary monitor so much.

    The other issue I have with it is extremely minor, but somewhat important to me: it can only scale based on the native panel resolution. You cannot scale based on a different resolution or pixel ratio. This is annoying for recording, as my monitor is 16:10, but when I record something I run and record it in 16:9 (because that’s what everything uses now -_-). So I could never use it to record without getting a completely different monitor. Oh well. Again, pretty minor, but annoying nonetheless.

      • Meadows
      • 5 years ago

      Does your secondary monitor work normally while DSR is in action on the primary one?

      If it does, then this is an ugly bug you should report.
      If not, then it’s still a bug but easier to explain.

        • Vinceant
        • 5 years ago

        Yes, it displays my desktop background, but all windows on it move to the primary monitor. I have found a way around this, though: if I change my primary monitor’s desktop resolution to 4K before launching the game, the problem doesn’t occur. This is a relatively easy workaround, but still annoying. It’d be nice if the bug didn’t occur at all.

        Also, I think this has more to do with how desktop space is allocated in windows than DSR itself. I can have the reverse effect occur if I open a game with a ridiculously low resolution.

          • Meadows
          • 5 years ago

          You might want to report the bug nonetheless.
          [url]https://nvidia-submit.custhelp.com/app/answers/list[/url]

      • morphine
      • 5 years ago

      DisplayFusion (dual-monitor software) may help you with your window positioning woes. Plus, it’s awesome.

        • Vinceant
        • 5 years ago

          I just looked into DisplayFusion per your suggestion, and I’m not sure what it does that Windows 8 doesn’t already do natively, nor why it would help this situation. Care to elaborate?

          • morphine
          • 5 years ago

          Multi-monitor and per-monitor configurable taskbars, proper alt+tab behavior, providing clickable icons for moving windows between monitors, saving all the window positions, a ton of configurable keyboard shortcuts for window management, plus other niceties like wallpaper and icon management, easy audio device selection, etc.

          Although, like I said, do try it first. If the problem you have is indirectly created by NV’s drivers, there’s a chance DF might not be able to do anything about it.

    • B.A.Frayd
    • 5 years ago

    Bleh. Looks like another AA method that blurs the textures and lowers the image quality for me. Not to mention, it adds a horrendous burden on the GPU, killing framerates in the process.

      • MathMan
      • 5 years ago

      I think there’s no argument that ordinary brute-force super-sampling, with simple averaging of four samples per pixel, is better than one sample per pixel. It’s more correct simply on first principles.

      However, plain averaging is not as correct from a signal-theory point of view as Gaussian filtering. I suspect that setting the smoothing slider to 0% will be extremely similar to plain averaging. At that point, it’s hard to find any fault with DSR.
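The difference between the two filters can be sketched in a few lines. This is a rough 1D illustration, not Nvidia's actual DSR filter — the sample positions, sigma, and function names are my own guesses for demonstration. A box filter averages only the samples inside the output pixel's own footprint; a Gaussian also pulls in weighted samples from neighboring pixels, which is where the perceived blur comes from.

```python
import math

def box_average(footprint):
    # Plain box filter: average only the samples inside this
    # output pixel's own footprint, all weighted equally.
    return sum(footprint) / len(footprint)

def gaussian_average(samples, positions, sigma=0.8):
    # Gaussian filter: weight every nearby sample by its distance
    # from the output pixel center (x = 0), including samples that
    # belong to neighboring output pixels.
    weights = [math.exp(-x * x / (2 * sigma * sigma)) for x in positions]
    return sum(w * s for w, s in zip(weights, samples)) / sum(weights)

# 1D row of supersamples around one output pixel: the pixel's own two
# samples are bright (1.0), its neighbors' samples are dark (0.0).
samples   = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]
positions = [-2.5, -1.5, -0.5, 0.5, 1.5, 2.5]

box = box_average(samples[2:4])               # exactly 1.0
gauss = gaussian_average(samples, positions)  # pulled below 1.0: blur
print(box, gauss)
```

The box result stays at full brightness, while the Gaussian result is dragged toward the dark neighbors — which matches the idea that the smoothing slider trades sharpness for a more correct (less aliased) reconstruction.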

      • pandemonium
      • 5 years ago

      Agreed. If softening edges comes at the price of blurring textures, I’ll pass. Natively high resolution > low resolution with demanding AA.

      What we need is an edge detection AA that doesn’t suck.

      Edit: I apologize for interrupting the Nvidia circle jerk. Carry on.

        • auxy
        • 5 years ago

        Did you even read the article?

          • pandemonium
          • 5 years ago

          What are you even talking about? Did you even COMPARE THE SAMPLES WITH YOUR EYEBALLS?

      • Meadows
      • 5 years ago

      Did you even read the article?

        • auxy
        • 5 years ago

        ( `ー´)ノ

    • Arclight
    • 5 years ago

    Redundant piece of tech that makes games look blurry.

    • rahulahl
    • 5 years ago

    I tried it on CS:GO.
    The result was a blurry mess, at least for me. I kept getting distracted by blurry textures on boxes and such that I had never even noticed before. When I went back to native 1440p, it was all nice and crisp again.

    Personally, I don’t think that the technique is good enough for me to use even if it had a 0% performance hit. But with such a high performance penalty, I wouldn’t recommend it. I would rather get my 144+ frames than use this.

      • Terra_Nocuus
      • 5 years ago

        Were you running 1440p x2 or x4 (i.e., 5120×2880 or 10240×5760) on your 1440p display, or were you running 1080p x2/x4 [i]upscaled[/i] to 1440p? That could explain the blurries.

        • rahulahl
        • 5 years ago

        I have a 1440p monitor and the game resolution was set to 4k.

      • Prestige Worldwide
      • 5 years ago

      CS:GO is a game that needs this feature the least. You can just max out the MSAA setting to 16xQ within the game’s video settings. Maxing out AA at 1080p/120Hz, I get well over 100 fps in heavy smoke and up to 300 fps in less demanding situations on a single GTX 670.

        • Theolendras
        • 5 years ago

        Well, that way you don’t benefit from the subpixel improvement as much, though…

    • MadManOriginal
    • 5 years ago

    The AA improvements certainly look nice where they’re intended to. But it also seems to lend a slightly diffuse, fuzzy quality to certain areas and textures. I suppose that’s the nature of the beast with this AA method. Not having glaring jaggies is nice, but I would just as soon have sharp textures.

      • Meadows
      • 5 years ago

      I didn’t see a loss of texture detail. (Sometimes, shading or shader effects would act a little differently, presumably because of the resolution, but the textures don’t get any worse.)

      If you want objective sharpness regardless, then you’ll just set the smoothing factor to 0%, I guess. That might be what you’re looking for.

      It’s also worth noting that chasing sharpness at all costs means deluding oneself. While the movie and advertising industries both overuse sharpness to great effect, it’s not realistic at all, and games with DSR actually looked “more realistic” than they would otherwise.
      After all, look outside. Real-world “textures” don’t pop either; only the resolution is higher.

        • MadManOriginal
        • 5 years ago

        You didn’t try very hard, then. Check out the last comparison shot on this page: [url]https://techreport.com/review/27102/maxwell-dynamic-super-resolution-explored/4[/url] and look at the building on the right.

          • Meadows
          • 5 years ago

          See, this is exactly what I was talking about.

          Crysis 3 applies a noise effect (similar to what some movies use nowadays) because your eyes confuse noise for fine detail, so it makes scenes look “better” to a human viewer.
          With DSR, this noise is generated across an area four times larger, then scaled down, essentially averaging it away.

          Edit: zoom in to observe the noise better, or open the images in an editor. The effect is hard to see at native resolution, which is kind of the point, by the way.

          So what you see is not a loss of detail but the quasi-removal of a stupid effect by scaling it down. Nevertheless, if you also set “smoothing” to 0%, that may yet increase edge contrast in some places, but I doubt it would affect those particular textures, for example.
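The noise-averaging claim is easy to verify numerically. The sketch below models film grain as Gaussian noise (purely an illustrative assumption, not Crysis 3's actual grain shader): averaging four independent samples per output pixel, as a 4x-area downscale does, cuts the grain's standard deviation roughly in half.

```python
import random
import statistics

# Model film grain as zero-mean Gaussian noise (illustrative only).
random.seed(42)
sigma = 0.1  # grain amplitude, arbitrary

# Native rendering: one noise sample per output pixel.
native = [random.gauss(0, sigma) for _ in range(10000)]

# 4x DSR: four independent noise samples per output pixel,
# averaged together on the downscale.
downsampled = [sum(random.gauss(0, sigma) for _ in range(4)) / 4
               for _ in range(10000)]

# stdev of a mean of 4 i.i.d. samples is sigma / 2, so the grain
# fades to about half strength after the downscale.
print(statistics.stdev(native), statistics.stdev(downsampled))
```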

      • Prestige Worldwide
      • 5 years ago

      If anything, there seemed to be more texture detail in the Skyrim example using 4K on 1080.

      • odizzido
      • 5 years ago

      Yeah, the entire scene becomes blurry, and that’s something I don’t like. I picked up a 970, so I might try it out sometime, but based on this I don’t think I’ll actually use it much.

    • bender
    • 5 years ago

    The improvements from this AA technique are pretty awesome. I imagine that, as you mentioned several times, seeing the results in motion is required to really appreciate what’s being done here.

    I’m wondering, though: since this involves rendering (e.g.) 4x the pixels, would it look better to use this averaging technique on a lower-resolution monitor or to just render to a real 4K display? Also, I imagine that modern cards just don’t have the horsepower (or memory?) to pull this off for a graphically intense game on a 4K display.

    I’ll bet that would look amazing though!

      • Meadows
      • 5 years ago

      Using this on a 4K display is not only overkill, it should almost [i]require[/i] two GPUs with today’s technology. Perhaps more. This is because the GPU would render at resolutions *larger* than 4K, all the way up to 8K, internally.

      • Damage
      • 5 years ago

      I mentioned in the article that a true 4K display (and I meant one at high PPI like this here 28″ panel) looks even better at 1:1 than a lower res display with DSR. That’s generally true, but I have seen places where shader sparkle is an issue on hi-PPI displays.

      What you really want, eventually, is 16X supersampling (or better!) with a stochastic sample pattern on a high-PPI display. At 144Hz solid. Strapped to your face. 🙂

        • Milo Burke
        • 5 years ago

        WHERE CAN I BUY THIS??!?

        [quote]What you really want, eventually, is 16X supersampling (or better!) with a stochastic sample pattern on a high-PPI display. At 144Hz solid. Strapped to your face. :)[/quote]

    • Meadows
    • 5 years ago

    [quote]I'm able select from five different resolutions that are higher than my display's native size.[/quote] I counted seven. [quote]Notice above that I've included results from a native 4K display (at 3840x2160). As you can see, using DSR at 3840x2400 and scaling down to our 1920x1200 display for output is slightly slower than rendering natively at 4K.[/quote] That's mostly because 3840×2400 has more pixels than "native 4K". Since you used a 16:10 aspect ratio screen instead of a full-HD spec one, DSR will scale up while keeping the original aspect ratio. It should then be noted that 3840×2160 is about 8.3 megapixels, while 3840×2400 is already 9.2, an 11% increase in effort compared to "native 4K".
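The pixel arithmetic above checks out; as a quick sanity check (variable names are mine, purely for illustration):

```python
# Pixel counts for DSR on TR's 16:10 panel vs. "native 4K" (16:9).
dsr_16_10 = 3840 * 2400   # what 4x DSR renders for a 1920x1200 display
native_4k = 3840 * 2160   # a UHD "4K" panel

extra_work = dsr_16_10 / native_4k - 1
print(f"{dsr_16_10:,} vs {native_4k:,} pixels: {extra_work:.1%} more")
# 9,216,000 vs 8,294,400 pixels: 11.1% more
```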

      • Damage
      • 5 years ago

      [quote]That's mostly because 3840×2400 has more pixels than "native 4K". Since you used a 16:10 aspect ratio screen instead of a full-HD spec one, DSR will scale up while keeping the original aspect ratio.[/quote] I didn't say otherwise. I was simply noting a useful comparison. DSR at a similar internal resolution offers similar performance.

        • Meadows
        • 5 years ago

        You didn’t say otherwise, true.
        But you suggested from the get-go that downsampling might add overhead on top of rendering at a high resolution, so it felt to me like you were hammering that point.

        I’d say writing “only slightly slower” instead of “slightly slower” might suggest your original intent more accurately.

          • Damage
          • 5 years ago

          Perhaps. Would be a very minor adjustment to the text.

          I think there are better uses of my time. 😉

      • TardOnPC
      • 5 years ago

      [quote]That’s mostly because 3840×2400 has more pixels than “native 4K.”[/quote]

      Excellent point. Running 4x DSR on a 1080p display might close the gap between the referenced numbers, which are already close and promising. A 1080p display with DSR seems to offer the most options. Some games (Arma III, Van Helsing, Watch Dogs) look fuzzy at 1080p on a 4K display, while others (Arkham Origins, Castlevania LOS2) look fine.

      I’d like to see 1080p 4x DSR vs. 3840×2160 screenshots.

      • Meadows
      • 5 years ago

      It should also be noted that the Crysis 3 screenshot made me go “mother of god, I need a Maxwell card” more than I’d like to admit.

      The improvement in definition is immense.

      The GTX 980 review did mention NVidia making this feature available to older GPUs via a driver update eventually, so I’m looking forward to “eventually”.

    • Forge
    • 5 years ago

    Confess, you were rummaging through ancient pictures as part of your recent hard disk cleanup, and you just wanted to get these screenshots up one more time. 😉

    Good memories, though. I do miss my Voodoo5 often. It wouldn’t manage anything by today’s standards, even my Intel integrated would beat it into a coma, but 3Dfx managed a very seamless and polished presentation. Ahh, dem AAs. All the Rage128 and GeForce DDR people had such envy, usually masked with indignant rage-face. Good times!

      • Rakhmaninov3
      • 5 years ago

      Rage

    • jihadjoe
    • 5 years ago

    DSR looking really good over there. Looking forward to your analysis of “MF’in AA”.

    • Crayon Shin Chan
    • 5 years ago

    Wow, I remember that first image – is that from 3DMark 2000? I remember back then this was used to promote the Voodoo 5…

      • Damage
      • 5 years ago

      From my GeForce 3 review: [url]https://techreport.com/review/2515/nvidia-geforce3-graphics-processor/19[/url]

      • clocks
      • 5 years ago

      Maybe my 40-year-old eyes just aren’t that good, but I have a hard time finding huge discrepancies in graphics these days. For instance, the Xbone vs. the PS4: yeah, if you pause the picture and look really, REALLY close, you can see a small difference. Is it really a big deal?

      I was always a graphics whore in my younger days, but now, most of the time, I’d have a hard time telling whether a game is running at 720p or 1080p.

        • Damage
        • 5 years ago

        We are in a much better place now in real-time graphics. Many of the differences in image quality are pretty small compared to the early days. Nothing wrong with that!

          • clocks
          • 5 years ago

          Indeed. Not like back in the day, when one system displayed 4 colors at a time while another could do 16. I could spot the difference between Atari and ColecoVision, NES and Master System, SNES and Genesis, etc.

          Now, I watch the comparison video of Destiny showing ps3/ps4/xbox360/xbone and the game looks great on all four systems! I have a hard time telling the difference. Diminishing returns are here.

      • derFunkenstein
      • 5 years ago

      I had the exact same flashback, from the second test. Good call.
