Shimmering reflections on water. Light glinting off objects as they move. Shadows of different depth cast by multiple light sources, all of them shifting in real time.
Until now, the world of video games has lacked this kind of naturalistic lighting. Effects have to be programmed in, meaning that they usually remain static and do not constantly adjust with the movement in a scene. The result is an inevitable flatness that detracts from the feeling of realism.
A computer graphics revolution that is just reaching video games could be about to change that. Known as ray tracing, the technique has been common for a decade in animated movies, where complex scenes are rendered in large server farms.
The ability to pack more computing power into smaller, cheaper chips, along with new techniques to simulate sophisticated lighting effects, has now brought this capability to consumer devices. PC graphics cards with the technology first appeared a year ago, and it will be in the next generation of games consoles.
Ray tracing certainly arrives with some big claims behind it. “This is the biggest advance in computer graphics in my lifetime,” said Rev Lebaredian, a veteran graphics expert and head of the simulation group at chipmaker Nvidia. It is the technology that will eventually take gamers across the “uncanny valley”, he said — the gulf that exists when a digital technology falls just short of simulating the real world, leaving a sense of creepiness.
The technology involves a fundamental change to how scenes inside animations are “lit”. As the name suggests, it mimics the way photons move in the real world, using algorithms that calculate the angle that light would reflect off different objects in an animation. Rather than programme in each effect, game developers need only place their light sources and let the algorithms do the rest.
The technique works from the gamer’s viewpoint, sending a notional ray to an object in the game, then following it back from that object to a light source. That way, the computer only needs to calculate the light beams that actually reach the viewer’s eye, not those that bounce off objects in other directions.
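The backward-tracing idea described above can be sketched in a few lines. This is a minimal illustration, not any engine’s actual API: the scene (a single sphere and a point light) and all function names are assumptions made for the example. A ray is cast from the eye toward the scene, and shading is computed only for rays that actually hit something, by following the hit point back to the light.

```python
import math

# Small vector helpers (tuples of three floats).
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def hit_sphere(origin, direction, centre, radius):
    """Nearest distance along a unit-length ray to the sphere, or None on a miss."""
    oc = sub(origin, centre)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant; direction is unit-length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(eye, direction, centre, radius, light):
    """Trace one ray from the eye; follow it back to the light source."""
    t = hit_sphere(eye, norm(direction), centre, radius)
    if t is None:
        return 0.0  # the ray misses every object: background, no light reaches the eye
    d = norm(direction)
    point = tuple(e + t * x for e, x in zip(eye, d))
    normal = norm(sub(point, centre))
    to_light = norm(sub(light, point))
    # Lambertian term: brightness depends on the angle between the surface
    # normal and the direction back toward the light source.
    return max(0.0, dot(normal, to_light))
```

A sphere directly ahead of the eye, lit from the eye’s side, shades brightly (`shade((0, 0, 0), (0, 0, -1), (0, 0, -3), 1.0, (0, 0, 0))` returns 1.0), while a ray pointed away from it returns 0.0 — no per-object lighting was hand-programmed, only the light’s position was placed.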
Calculating the individual light beams puts considerable computing strain on the system. “It’s very resource-intensive,” said Pat Moorhead, a US chip analyst. “The good thing is you can mix it with other effects”, bringing elements of the technology to a scene to generate the most eye-catching effects — for instance, using it to handle just reflections.
The pace at which ray tracing finds its way to a mass gamer audience now depends on three things: cost, performance and content.
Prices of the Nvidia graphics cards needed to support ray tracing start at $349. But they could come down fast: the cheapest cards are likely to fall to $149 by next year, said Mr Moorhead.
A second issue concerns performance. The high demands that ray tracing places on a computer mean that the machine can render fewer frames per second, particularly for the highest-resolution images. Adding ray tracing degrades performance by 30 per cent to 40 per cent, according to Mr Moorhead. But he added that many games look just fine at between 30 and 60 fps, opening a big part of the market to the technology.
That leaves the chicken-and-egg problem that faces all new gaming technologies: when will there be enough high quality games to make it worth buying the more expensive hardware needed for ray tracing — and vice versa?
Game developers normally take three to five years to adapt to capabilities such as this in a new generation of chips, said Mr Lebaredian, though he added that the pace is moving faster with ray tracing. Part of the reason is financial. A large slice of the $100m that can go into producing a high-end video game goes into the hand-produced art, he said. With an algorithm that produces lighting effects automatically as scenes change, much of the manual effort goes away.