Nvidia’s DLAA Could Be a Huge Step Forward for Anti-Aliasing


Nvidia has announced that an upcoming update to The Elder Scrolls Online will introduce DLSS support, helping lower-end GPUs run the game more effectively. But that's not the only feature coming: Nvidia also intends to introduce a new Deep Learning Anti-Aliasing mode, or DLAA for short.

The Elder Scrolls Online’s Creative Director, Rich Lambert, said the following:

While we were working on adding NVIDIA DLSS, we also worked with them on some new tech that we’re going to be the first game that’s ever done this before. This is debuting their new tech, which is called NVIDIA DLAA. It’s the same kind of concept, you won’t get a performance boost out of this but what you get is absolutely incredible anti-aliasing. It’s unbelievable, it’s crazy how good it is.

What’s the Difference Between DLSS and DLAA?

Lambert's comments about DLSS and DLAA working in a conceptually similar fashion may inadvertently confuse some folks. DLSS stands for Deep Learning Super Sampling, and supersampling is itself a type of anti-aliasing. It's typically considered the best type in terms of output image quality, but it's also the most expensive.

Supersampling renders an image at a much higher resolution internally, then downsamples the frame to the display's native resolution. Traditional supersampling applies this to everything in-frame. This is great when you have a powerful GPU and a low-resolution monitor; activating 4xSSAA on a 1080p display means rendering internally at 3840×2160, yielding output quality close to native 4K. The reason SSAA works so well is the same reason it isn't a preferred solution for many gamers: rendering an image at 4x the internal resolution puts a heavy load on the GPU.
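As a toy illustration (our own sketch, not Nvidia's or ZeniMax's code), 4x supersampling amounts to rendering at twice the width and height, then averaging each 2×2 block of samples down to a single output pixel:

```python
import numpy as np

def downsample_4x(hi_res_frame, out_w, out_h):
    """Average each 2x2 block of a frame rendered at 2x width/height
    (4x the pixel count) into one output pixel -- the resolve step
    of 4x supersampling."""
    blocks = hi_res_frame.reshape(out_h, 2, out_w, 2, -1)
    return blocks.mean(axis=(1, 3))

# Hypothetical renderer output: a 2x-resolution frame (8x4, RGB)
# destined for a 4x2 output image.
hi_res = np.random.rand(4, 8, 3)
frame = downsample_4x(hi_res, out_w=4, out_h=2)
print(frame.shape)  # (2, 4, 3)
```

The averaging is what smooths jagged edges: each output pixel blends four samples taken at sub-pixel positions, at the cost of shading four times as many pixels.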

Lambert's comment isn't a lot to extrapolate from, but we can make some guesses about DLAA based on what we know about how DLSS works. Nvidia's DLSS uses AI upscaling models trained on high-resolution images to convert a lower-resolution output into a higher-resolution version. If you turn DLSS on and set the game to render at 4K, your GPU actually renders natively at 1440p. Nvidia's tensor cores then operate on the 1440p data stream to convert it to a 4K output that's intended to closely match (and occasionally even exceed) the quality of the original.
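In code, that resolution arithmetic looks like this (a sketch on our part; the 2/3 per-axis scale is simply the ratio implied by the 4K-to-1440p example above):

```python
def dlss_internal_resolution(target_w, target_h, scale=2/3):
    """Resolution the GPU actually renders at before AI upscaling.
    The 2/3 per-axis scale matches the 4K -> 1440p example; other
    DLSS quality modes use different scale factors."""
    return round(target_w * scale), round(target_h * scale)

print(dlss_internal_resolution(3840, 2160))  # (2560, 1440)
```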

Much of Nvidia's work on DLSS seems as though it would have applications in DLAA. Instead of rendering at 1440p and upscaling to create 4K output, Nvidia would presumably take the native-resolution signal, with its full-resolution motion vectors and image data, and work on that directly.

From Nvidia’s DLSS presentation. The company is already apparently working with much of what it would need to implement DLAA.

DLAA may differ from DLSS in other ways, but it’s not immediately clear how. DLSS already incorporates both spatial and temporal data into its upscaling approach, so Nvidia wouldn’t have to reinvent the wheel.

Because DLAA presumably operates at your display's native resolution, it wouldn't improve game performance the way DLSS does. What it might do, however, is deliver the benefits of higher levels of anti-aliasing at a smaller performance hit than engaging those methods natively.

Instead of enabling 4K DLSS and seeing better frame rates than native 4K, I suspect the goal is to enable, say, 4x DLAA and get better anti-aliasing quality, or a smaller performance hit at a given quality level, compared with enabling 4x SSAA, MSAA, or what have you.
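Back-of-the-envelope pixel counts make that trade-off concrete. This is our own simplified arithmetic, assuming 4xSSAA shades four times the pixels, DLSS Quality shades at 2/3 scale per axis, and DLAA shades at native resolution:

```python
def shaded_pixels(w, h, mode):
    """Rough per-frame shading workload for a w x h display under
    the simplifying assumptions described above."""
    if mode == "4xSSAA":
        return 4 * w * h            # 2x width and 2x height
    if mode == "DLSS Quality":
        return round(w * 2 / 3) * round(h * 2 / 3)
    return w * h                    # native rendering, incl. DLAA

for mode in ("4xSSAA", "DLSS Quality", "DLAA"):
    print(mode, shaded_pixels(3840, 2160, mode))
# 4xSSAA shades ~33.2M pixels, DLSS Quality ~3.7M, DLAA ~8.3M
```

By this rough accounting, DLAA would shade the same pixels as ordinary native rendering, leaving only the AI pass itself as overhead, while 4x supersampling quadruples the shading work.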

The potential advantage to Nvidia here, in addition to enabling higher image quality at a lower absolute performance cost, is that it establishes DLAA as a technology that's unreservedly about improving image quality over the native baseline. With DLSS, Nvidia has to accept that native resolution may look better, even if DLSS is very good. With DLAA, Nvidia may be able to offer a feature that strictly improves image quality, at a smaller performance hit than anti-aliasing typically requires.

We don't have any additional details to share, but if ZeniMax is being allowed to talk about the feature, we'll probably hear more in the not-too-distant future. It's exciting to contemplate what AI might bring to anti-aliasing specifically, because running AA workloads through AI cores could relieve a substantial amount of pressure on the GPU, depending on which AA methods Nvidia offers and supports.

