NVIDIA just announced the release of two new developer SDKs, RTXGI and DLSS 2.0. RTXGI is a global illumination library for DXR-compatible hardware that …



Comments (16)

  1. You're right, the explanation of DLSS was pretty meh. Not the worst, but parts of it sounded like the opposite of what it's actually doing. On their own servers, NVIDIA renders frames of various games or scenes at two resolutions, one low and one high, but of the same exact frame. They then feed the low-resolution frame into a neural network and ask it to output a higher-resolution image, and they compare that upscaled frame with the high-resolution render to score how well the network recreated the frame from less detail. At the beginning the output will be extremely bad, but over hundreds of millions, or billions, of attempts, with the network adjusting its internal numbers through some very complex math, it gets closer and closer to the high-resolution image. After sufficient training, the values for the weights and biases in the network are packaged up, shipped as part of the driver, and run on the Tensor cores of Turing-based RTX cards. It doesn't stop there, either: the training is continuous, so the network will only get better over time, though usually with diminishing returns. Every so often NVIDIA may issue an update, and you can download new drivers containing the updated network values. So while the results may show some artifacting, over time it will become less and less noticeable.
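The training loop the commenter describes can be sketched in miniature. This is not NVIDIA's actual network or pipeline (the real model, loss, and data are proprietary); it is a toy illustration of the same idea: render one "frame" at two resolutions, upscale the low-resolution copy with a small learned model, score it against the high-resolution target, and nudge the weights by gradient descent. The gradient-step size, image sizes, and the tiny 2x2-kernel "network" are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# "High-resolution render" of a frame: an 8x8 gradient with a little noise.
hi = np.linspace(0, 1, 64).reshape(8, 8) + 0.05 * rng.standard_normal((8, 8))

# "Low-resolution render" of the *same* frame: 2x2 average pooling -> 4x4.
lo = hi.reshape(4, 2, 4, 2).mean(axis=(1, 3))

# Toy "network": one learned 2x2 kernel; each low-res pixel is expanded
# into a 2x2 block scaled by w. Starting at all-ones = plain pixel replication.
w = np.ones((2, 2))

def upscale(x, w):
    # out[2i+a, 2j+b] = x[i, j] * w[a, b]  (a Kronecker product)
    return np.kron(x, w)

losses = []
step_size = 0.5
for step in range(200):
    out = upscale(lo, w)
    err = out - hi
    losses.append(float((err ** 2).mean()))  # MSE vs. the high-res target
    # Exact gradient of the MSE with respect to each kernel entry w[a, b]:
    # err[a::2, b::2] selects the output pixels that w[a, b] influenced.
    grad = np.empty_like(w)
    for a in range(2):
        for b in range(2):
            grad[a, b] = 2.0 * (err[a::2, b::2] * lo).sum() / hi.size
    w -= step_size * grad  # "adjusting its internal numbers"

print(f"loss before: {losses[0]:.4f}  after: {losses[-1]:.4f}")
```

After training, the learned `w` plays the role of the packaged weights the comment mentions: it is just a small array of numbers that can be shipped and applied to new low-resolution frames, which is why DLSS updates can arrive inside a driver download.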

