Researchers at Dartmouth College and their industry partners have developed software techniques that make lighting in computer-generated images more realistic.
The new techniques target real-time graphics, which must maintain the illusion of interactivity as scenes change in response to user movements. Such graphics are used in applications including video games, scientific visualization tools, and extended reality.
Both papers show how developers can create sophisticated lighting effects by adapting a well-known rendering technique called ray tracing.
“Over the past decade, ray tracing has dramatically increased the realism and visual richness of computer-generated images in movies, where producing a single frame can take hours,”
said Wojciech Jarosz, an associate professor of computer science at Dartmouth, who played a key role in both research projects.
“Our papers describe two very different approaches to bringing realistic ray-traced lighting within the limits of real-time graphics.”
The first project, developed with NVIDIA, explores the possibilities for future games when developers integrate the company’s RTX ray-tracing platform. New games have begun to use RTX to improve shadows and reflections, but the quality and complexity of the lighting is currently limited by the small number of rays that can be traced per frame.
The new technique, called ReSTIR (reservoir-based spatiotemporal importance resampling), creates realistic lighting and shadows from millions of artificial light sources. ReSTIR dramatically increases rendering quality on a computer’s graphics card by reusing rays traced for nearby pixels and for previous frames. The technique can be integrated into the design of future games and runs up to 65 times faster than previous rendering techniques.
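The “reservoir” in ReSTIR’s name refers to weighted reservoir sampling, a streaming technique for picking one candidate from a long stream in proportion to its importance without storing the whole stream. The sketch below is a toy illustration of that core idea only, not the paper’s GPU implementation; all names (`Reservoir`, `pick_light`, `importance`) are illustrative.

```python
import random

class Reservoir:
    """Minimal streaming reservoir that holds one sample,
    chosen with probability proportional to its weight."""
    def __init__(self):
        self.sample = None   # the candidate currently held
        self.w_sum = 0.0     # running sum of weights seen so far

    def update(self, sample, weight):
        self.w_sum += weight
        # Replace the held sample with probability weight / w_sum;
        # over the whole stream this yields a weight-proportional pick.
        if self.w_sum > 0 and random.random() < weight / self.w_sum:
            self.sample = sample

def pick_light(lights, importance):
    """Stream candidate lights through the reservoir instead of
    storing or sorting millions of them."""
    r = Reservoir()
    for light in lights:
        r.update(light, importance(light))
    return r.sample
```

In the real system, reservoirs computed at neighboring pixels and in previous frames can themselves be merged as weighted candidates, which is what lets each pixel benefit from rays it never traced itself.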
The second project, a collaboration with Activision, describes how the company has incorporated increasingly realistic lighting effects into its games. Traditionally, video games produce their lighting in real time using what are called “baked” solutions: complex ray-traced lighting is computed only once, through a time-consuming offline process. Lighting created this way can be displayed cheaply during play, but it is tied to a fixed scene layout and therefore cannot easily react to the movement of characters and cameras.
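The split between the expensive offline bake and the cheap runtime lookup can be sketched as follows. This is a toy illustration of the general baking idea under assumed names (`bake_lighting`, `shade`), not Activision’s actual pipeline.

```python
def bake_lighting(scene_points, expensive_ray_trace):
    """Offline step: may take hours, runs once per level.
    The result is stored with the game as a lookup table."""
    return {p: expensive_ray_trace(p) for p in scene_points}

def shade(point, lightmap, fallback=0.0):
    """Runtime step: a cheap table lookup per frame.
    Anything not covered by the bake (e.g. a moved object)
    gets no baked lighting, illustrating the rigidity of
    the static approach."""
    return lightmap.get(point, fallback)
```

The weakness is visible in `shade`: the table only knows about the scene exactly as it was when the bake ran.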
The paper describes how Activision gradually developed its UberBake system from this static approach into one that can reflect small lighting changes in response to player interactions, such as turning lights on and off or opening and closing doors. Because UberBake evolved over many years to run in shipping games, it had to work on a wide range of hardware, from powerful PCs to previous-generation consoles.
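One way to support such interactions while keeping the cheap baked lookup is to precompute lighting for each discrete scene state (door closed, door open) and blend between them at runtime. The snippet below is a toy stand-in for that idea only; the real system is far more involved, and the function name is illustrative.

```python
def blend_baked(light_closed, light_open, t):
    """Linearly interpolate between two precomputed lighting
    solutions as, say, a door swings from closed (t = 0.0)
    to fully open (t = 1.0). Each list holds per-point
    light intensities from an offline bake."""
    return [(1.0 - t) * c + t * o
            for c, o in zip(light_closed, light_open)]
```

The runtime cost is a per-point lerp, cheap enough even for previous-generation consoles, while all ray tracing still happens offline.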
“Video games are played by millions of people around the world,”
said Dario Seyb, a PhD student at Dartmouth who co-authored the paper.
“With so many people interacting with video games, this technology can have a huge impact.”