Multiple gaming technologies are coming together to enable real-time ray tracing, a technique that lets processors replicate the physics of light, something video games have had to fake until now. Nvidia's new RTX tech will work with Microsoft's just-announced DirectX Raytracing (DXR) API to unlock a tool that should make games look better and save developers time.

Nvidia, Microsoft, and their partners are demonstrating real-time ray tracing at the Game Developers Conference in San Francisco this week. Some studios have had access to the technology for a couple of months, and Nvidia claims they are already getting stunning results.

The key here is that real-time lighting in games today is a finicky and often broken facsimile of how light behaves in reality. That means developers spend a lot of time and money tuning their specific lighting model to get it to function as it should. Ray tracing eliminates that work because the lighting just works: if a beam of sunlight hits a red wall, the wall reflects that color with precision.
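To make the "red wall" idea concrete, here is a toy sketch (not Nvidia's RTX or Microsoft's DXR, just standard diffuse shading as any ray tracer might compute it) of how a traced ray's color falls out of simple physics rather than hand-tuned tricks: the light that bounces off a surface is the incoming light multiplied, per color channel, by the surface's reflectance.

```python
def shade(light_color, surface_reflectance, cos_angle):
    """Lambertian diffuse shading: the color a ray picks up at a surface
    is the per-channel product of incoming light and surface reflectance,
    scaled by the cosine of the angle of incidence."""
    cos_angle = max(0.0, cos_angle)  # surfaces facing away from the light get none
    return tuple(light * refl * cos_angle
                 for light, refl in zip(light_color, surface_reflectance))

sunlight = (1.0, 1.0, 1.0)   # white light, RGB in [0, 1]
red_wall = (0.9, 0.1, 0.1)   # reflects mostly red

# White sunlight hitting the red wall head-on comes back red automatically:
print(shade(sunlight, red_wall, 1.0))  # (0.9, 0.1, 0.1)
```

There is no artist-tuned parameter here: the wall looks red because it reflects red, which is exactly the "it just works" property the article describes.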


Hollywood effects houses have used ray tracing for more than a decade. The technique enables artists to build digital scenes that are lit correctly without hours of manual adjustment, and it was one of the big advances in automating CGI in film, alongside AI-animated characters.

Don't expect RTX-powered ray tracing to pop up in all of your games tomorrow. Instead, it will roll out over the next couple of years, but Nvidia says adoption is inevitable because ray tracing is so much more effective, even if it is computationally expensive.

RTX now works in conjunction with game-making tools like Unreal Engine and Unity, and Electronic Arts is incorporating it into its proprietary Frostbite engine. Nvidia is also testing RTX right now with Alan Wake developer Remedy, Metro 2033 studio 4A Games, and certain EA developers.