Nvidia takes a ‘noisy’ ray traced image and uses AI to predict what it would look like after thousands of passes

Nvidia is looking to rewrite the ray trace rendering rule book by using Artificial Intelligence (AI) to massively accelerate the time-consuming process, rather than simply using brute force processing.

The GPU manufacturer is using AI to significantly lower the number of light ray bounces needed to get a ‘correct picture’.

Nvidia says the speed-up is in the region of 8x on the same GPU hardware. Nvidia’s AI-enhanced ray tracing works by rendering a ‘noisy’ image with minimal light ray bounces, then predicting what the final image would look like if it had been rendered with thousands of bounces.
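To see why brute-force path tracing needs thousands of passes, it helps to recall that Monte Carlo noise shrinks only with the square root of the sample count. The sketch below is a hypothetical illustration (a toy 1D integrand standing in for a pixel’s radiance, not Nvidia’s renderer): error at one sample per pixel is dramatically worse than at 4,000, which is the gap the AI denoiser is trained to bridge.

```python
import random
import statistics

def render_pixel(samples_per_pixel, rng):
    """Toy Monte Carlo pixel estimate: average random samples of a
    stand-in radiance integrand (3x^2 on [0, 1], whose true mean is 1.0)."""
    total = 0.0
    for _ in range(samples_per_pixel):
        x = rng.random()
        total += 3.0 * x * x
    return total / samples_per_pixel

rng = random.Random(42)
trials = 200

# Average absolute error against the known true value of 1.0.
err_1 = statistics.mean(abs(render_pixel(1, rng) - 1.0) for _ in range(trials))
err_4000 = statistics.mean(abs(render_pixel(4000, rng) - 1.0) for _ in range(trials))

# Monte Carlo error falls roughly as 1/sqrt(N), so 4,000 samples give
# roughly sqrt(4000) ~ 63x less noise than a single sample per pixel.
print(f"mean abs error at 1 spp:    {err_1:.3f}")
print(f"mean abs error at 4000 spp: {err_4000:.4f}")
```

This is why skipping from 1 path per pixel straight to a predicted clean image, rather than paying for thousands more samples, yields such large savings.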

To give the neural network its knowledge, Nvidia has trained it on tens of thousands of image pairs, where one image is rendered with a single path per pixel and the ‘reference image’ uses 4,000 paths per pixel. The neural network learns how to map the different types of noise to the correct de-noised pixels.
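The training setup above can be sketched in miniature. This is a deliberately simplified stand-in, not Nvidia’s network: a one-parameter ‘denoiser’ (a learnable blend between each pixel and its local average) is fitted against noisy/reference pairs, mirroring how the real network learns its noise-to-clean mapping from 1 spp versus 4,000 spp image pairs. All signal shapes and noise levels are invented for illustration.

```python
import random

rng = random.Random(0)

def make_pair(width=64):
    """Hypothetical training pair: a smooth 1D 'reference' signal and a
    noisy version of it, standing in for 4,000 spp vs 1 spp renders."""
    reference = [0.5 + 0.5 * (i / width) ** 2 for i in range(width)]
    noisy = [r + rng.gauss(0.0, 0.2) for r in reference]
    return noisy, reference

def denoise(noisy, alpha):
    """One-parameter 'denoiser': blend each pixel with its local average.
    A real neural denoiser has millions of parameters; alpha is a stand-in."""
    out = []
    for i in range(len(noisy)):
        lo, hi = max(0, i - 2), min(len(noisy), i + 3)
        local_avg = sum(noisy[lo:hi]) / (hi - lo)
        out.append((1 - alpha) * noisy[i] + alpha * local_avg)
    return out

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

# 'Training': pick the alpha that minimises error against the references,
# mirroring how the network learns to map noise to de-noised pixels.
pairs = [make_pair() for _ in range(50)]
best_alpha = min((a / 100 for a in range(101)),
                 key=lambda a: sum(mse(denoise(n, a), r) for n, r in pairs))

noisy, reference = make_pair()
print(f"learned blend weight: {best_alpha:.2f}")
print(f"noisy MSE:    {mse(noisy, reference):.4f}")
print(f"denoised MSE: {mse(denoise(noisy, best_alpha), reference):.4f}")
```

The same logic scales up: swap the single blend weight for a convolutional network and the toy signals for rendered image pairs, and you have the shape of the supervised training Nvidia describes.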

Nvidia is making its AI de-noising rendering technology available in the OptiX 5.0 SDK, a ray trace development kit that can be used by third parties for developing GPU renderers. OptiX is already used in Nvidia IRAY and mental ray, so we expect AI-enhanced ray tracing to appear in these products in the future. IRAY is available for SolidWorks, Siemens NX, Rhino, 3ds Max and other 3D tools. The technology could also have big implications for physically-based rendering in VR.

Relative performance of GPUs, CPUs and AI enhanced ray trace rendering

Nvidia has shared some performance figures for OptiX and AI-optimised ray tracing, including comparisons of its Pascal and forthcoming Volta architectures. See chart above.

The OptiX SDK will come with a fully trained neural network. Nvidia says it should work very well on scenes similar to those in its training set, which includes products, cars and the like.

However, it acknowledges that it won’t work perfectly for every scene. In the future, it may give developers or users the ability to use their own data and repurpose the trained network through the process of ‘transfer learning’. Nvidia is also applying AI to other areas of graphics, including AI for anti-aliasing, to smooth the jagged edges or stepped effect on lines.

The company has also announced a new personal deep learning supercomputer, the Nvidia DGX Station. The water-cooled deskside system is ‘whisper quiet’ and features 4 x Tesla V100 GPUs (16GB) and 256GB of system memory.

If you enjoyed this article, subscribe to AEC Magazine for FREE