Deep Learning Super Sampling (DLSS) is a technology developed by Nvidia that uses deep learning to upscale a lower-resolution image into one that looks like a native higher-resolution image. The technology is advertised as providing image quality comparable to a much higher rendering resolution without the corresponding video card overhead.[1]
Nvidia advertised DLSS as a key feature of the GeForce RTX 20 series GPUs when they launched in September 2018.[2] At that time, the technology was available in only a few video games (namely Battlefield V[3] and Metro Exodus), because the algorithm had to be trained specifically on each game to which it was applied, and the results were usually not as good as simple resolution upscaling.[4][5]
In 2019, the video game Control shipped with ray tracing and an improved version of DLSS which, however, did not use deep learning.[6][7]
In April 2020, Nvidia advertised an improved version named DLSS 2.0, available in upcoming games, which it said uses machine learning but no longer needs to be trained on every game to which it is applied.[2] Benchmarks on Control suggest that a 1080p image upscaled from a 720p render can match the quality of native 1080p rendering while retaining close to 720p performance.[8] A side effect of DLSS 2.0 is that it does not appear to work well with anti-aliasing techniques such as MSAA or TXAA; performance is severely reduced when these techniques are enabled on top of DLSS.[9]
As of April 2020, DLSS 2.0 must still be integrated on a per-game basis by the game developers.
Release history

Release 1.0 (February 2019): First version, using AI and specifically trained for certain games, including Battlefield V and Metro Exodus.
The neural network is trained by Nvidia on supercomputers, using "ideal" ultra-high-resolution images of video games together with low-resolution images of the same games. The resulting model is stored in the video card driver. Nvidia is said to use DGX-1 servers to perform the training of the network.
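The idea of pairing "ideal" high-resolution frames with low-resolution versions of the same frames can be sketched as follows. This is a minimal illustration of how super-resolution training pairs are typically built by downsampling; the function name, the box-average downsampling, and the scale factor are assumptions for illustration, not Nvidia's actual (unpublished) pipeline.

```python
import numpy as np

def make_training_pair(hires, factor=2):
    """Build a (low-res input, high-res target) pair by
    box-averaging each factor x factor block of the "ideal"
    high-resolution frame. Illustrative only."""
    h, w = hires.shape[:2]
    trimmed = hires[:h - h % factor, :w - w % factor]
    lowres = trimmed.reshape(
        h // factor, factor, w // factor, factor, -1
    ).mean(axis=(1, 3))
    return lowres, hires

# Usage: a synthetic 8x8 "frame" with 3 color channels.
frame = np.random.rand(8, 8, 3)
x, y = make_training_pair(frame)
# x is the 4x4x3 low-res input, y the 8x8x3 target
```

A real pipeline would render the low-resolution input in-engine rather than downsample it, so that aliasing artifacts match what the network sees at inference time.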
The neural network stored in the driver compares the actual low-resolution image with the reference and produces a full high-resolution result. The inputs used by the trained neural network are the low-resolution, aliased images rendered by the game engine, and the low-resolution motion vectors from the same images, also generated by the game engine. The motion vectors tell the network in which direction objects in the scene are moving from frame to frame, so that it can estimate what the next frame will look like.[10]
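The role of motion vectors can be illustrated with a simple reprojection step: each pixel of the previous frame is warped along its motion vector to predict where its content appears in the current frame. This is a toy sketch under assumed conventions (integer per-pixel (dy, dx) offsets, nearest-neighbor sampling); it is not Nvidia's implementation, which feeds this information into a neural network rather than using it directly.

```python
import numpy as np

def reproject(prev_frame, motion):
    """Warp the previous frame along per-pixel motion vectors.
    prev_frame: (H, W) intensities.
    motion: (H, W, 2) integer (dy, dx) offsets saying how far
    each pixel's content moved since the previous frame."""
    h, w = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Each output pixel fetches from where its content came from,
    # clamped to the frame borders.
    src_y = np.clip(ys - motion[..., 0], 0, h - 1)
    src_x = np.clip(xs - motion[..., 1], 0, w - 1)
    return prev_frame[src_y, src_x]

# Usage: a scene where everything moved one pixel to the right.
prev = np.arange(16.0).reshape(4, 4)
mv = np.zeros((4, 4, 2), dtype=int)
mv[..., 1] = 1  # +1 pixel in x for every pixel
est = reproject(prev, mv)
# est is prev shifted right by one column (edges clamped)
```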
Tensor Cores have been available since the Nvidia Volta GPU microarchitecture, which was first used in the Tesla V100 line of products.[12] Their specificity is that each Tensor Core operates on 4 × 4 matrices of 16-bit floating point numbers, and they appear to be designed for use at the CUDA C++ level, even at the compiler level.[13]
The Tensor Cores use CUDA warp-level primitives across 32 parallel threads to take advantage of their parallel architecture.[14] A warp is a set of 32 threads which are configured to execute the same instruction.
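Lockstep execution within a warp can be modeled with a classic warp-level pattern: a "shuffle down" reduction, where at each step every active lane adds the value held by the lane a fixed offset above it, so that lane 0 ends with the sum over all 32 lanes. The Python below only models the data flow of this pattern for illustration; on a real GPU each step would be one `__shfl_down_sync` instruction executed simultaneously by all 32 threads.

```python
import numpy as np

# One value per lane of the 32-thread warp.
vals = np.arange(32, dtype=np.int64)

offset = 16
while offset >= 1:
    # Every lane executes the same add in lockstep (SIMT):
    # lane i reads the value of lane i + offset, then adds it.
    vals[:32 - offset] += vals[offset:32]
    offset //= 2

# After 5 steps, lane 0 holds the warp-wide sum 0 + 1 + ... + 31.
```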
^"Nvidia RTX DLSS: Everything you need to know". Digital Trends. 2020-02-14. Retrieved 2020-04-05. Deep learning super sampling uses artificial intelligence and machine learning to produce an image that looks like a higher-resolution image, without the rendering overhead. Nvidia's algorithm learns from tens of thousands of rendered sequences of images that were created using a supercomputer. That trains the algorithm to be able to produce similarly beautiful images, but without requiring the graphics card to work as hard to do it.
^"Battlefield V DLSS Tested: Overpromised, Underdelivered". techspot.com. 2019-02-19. Retrieved 2020-04-06. Of course, this is to be expected. DLSS was never going to provide the same image quality as native 4K, while providing a 37% performance uplift. That would be black magic. But the quality difference comparing the two is almost laughable, in how far away DLSS is from the native presentation in these stressful areas.
^"AMD Thinks NVIDIA DLSS is not Good Enough; Calls TAA & SMAA Better Alternatives". techquila.co.in. 2019-02-15. Retrieved 2020-04-06. Recently, two big titles received NVIDIA DLSS support, namely Metro Exodus and Battlefield V. Both these games come with NVIDIA's DXR (DirectX Raytracing) implentation that at the moment is only supported by the GeForce RTX cards. DLSS makes these games playable at higher resolutions with much better frame rates, although there is a notable decrease in image sharpness. Now, AMD has taken a jab at DLSS, saying that traditional AA methods like SMAA and TAA "offer superior combinations of image quality and performance."
^"Nvidia Very Quietly Made DLSS A Hell Of A Lot Better". Kotaku. 2020-02-22. Retrieved 2020-04-06. The benefit for most people is that, generally, DLSS comes with a sizeable FPS improvement. How much varies from game to game. In Metro Exodus, the FPS jump was barely there and certainly not worth the bizarre hit to image quality.
^"NVIDIA DLSS 2.0 Update Will Fix The Geforce RTX Cards' Big Mistake". techquila.co.in. 2020-03-24. Retrieved 2020-04-06. As promised, NVIDIA has updated the DLSS network in a new Geforce update that provides better, sharper image quality while still retaining higher framerates in raytraced games. While the feature wasn't used as well in its first iteration, NVIDIA is now confident that they have successfully fixed all the issues it had before
^"Evaluating NVIDIA DLSS 2.0 Quality And Performance In Mech 5 And Control". hothardware.com. 2020-03-27. Retrieved 2020-04-07. One side effect of DLSS is that it doesn't seem to play nicely with MSAA (forced through the drivers) or TXAA enabled in the game. Performance actually tanked pretty hard with either of those anti-aliasing methods on top of DLSS 2.0, with the Quality mode only performing around half as fast as no DLSS
^"Using CUDA Warp-Level Primitives". Nvidia. 2018-01-15. Retrieved 2020-04-08. NVIDIA GPUs execute groups of threads known as warps in SIMT (Single Instruction, Multiple Thread) fashion