Saturday, April 27, 2024

A.I. helps scientists inch closer to the ‘holy grail’ of computer graphics

Video: https://www.youtube.com/watch?v=n4DKNbf4Dfc

Computer scientists at the University of California, San Diego, and UC Berkeley devised a way to make animals in movies and video games more realistic by improving the look of computer-generated fur. It might not sound like much but the researchers call photorealistic fur a “holy grail” of computer graphics.

“Creating photorealistic … characters has long been one of the holy grails of computer graphics in film production, virtual reality, and for predictive design,” Ravi Ramamoorthi, a professor of computer science at UC San Diego, who worked on the project, told Digital Trends. “Realistic rendering of animal fur is a key aspect to creating believable animal characters in special effects, movies, or augmented reality.”

To do so, they leveraged artificial intelligence to better model the way light bounces among the individual fibers of an animal pelt, which has a surprisingly significant effect on realism.

Existing models were designed to depict human hair rather than animal fur. While both hair and fur fibers contain an interior cylinder called a medulla, the medulla in fur is much bigger than in hair and scatters light in unusual ways, a complexity most existing models have not accounted for.

But the team from UC San Diego and UC Berkeley turned to a concept called subsurface scattering and employed an A.I. algorithm to lend a hand.

“A key innovation is to relate fur rendering to subsurface scattering, which has earlier been used for things like clouds or human skin,” Ramamoorthi said. “There are many techniques to render subsurface scattering efficiently, but the parameters are completely different physically from those used to describe fur reflectance. We have introduced a simple neural network that relates them, enabling one to translate a fur reflectance model to comparable subsurface parameters for fast rendering.”
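The mapping Ramamoorthi describes can be pictured as a small neural network that takes fur reflectance parameters in and produces subsurface scattering parameters out. The sketch below is purely illustrative, not the authors' actual network: the parameter names (medulla radius ratio, fiber roughness, absorption, scattering albedo, mean free path) are hypothetical stand-ins, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: medulla radius ratio, fiber roughness, absorption.
FUR_PARAMS = 3
# Hypothetical outputs: scattering albedo, mean free path.
SSS_PARAMS = 2
HIDDEN = 8

# Randomly initialized weights; a real system would train these on pairs of
# (fur reflectance parameters, best-fit subsurface scattering parameters).
W1 = rng.standard_normal((FUR_PARAMS, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, SSS_PARAMS))
b2 = np.zeros(SSS_PARAMS)

def fur_to_subsurface(x):
    """One forward pass: fur reflectance params -> subsurface params."""
    h = np.tanh(x @ W1 + b1)                      # small nonlinear hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # squash outputs to (0, 1)

fur = np.array([0.6, 0.1, 0.3])   # one example fiber description
sss = fur_to_subsurface(fur)       # two subsurface parameters in (0, 1)
print(sss)
```

Once such a translation exists, a renderer can hand the converted parameters to any of the fast, well-established subsurface scattering techniques instead of simulating light transport through each fiber's medulla directly, which is where the speedup comes from.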

In terms of speed, Ramamoorthi said his team’s model can generate more accurate simulations ten times faster than the current state-of-the-art models. They shared their findings last week at the SIGGRAPH Asia conference in Thailand.

The new model has future potential in fields from virtual reality to video games, but Ramamoorthi seemed most enthusiastic about its current use for special effects in films.

“Our fur reflectance model is already used, for example in the Rise of the Planet of the Apes, nominated for a visual effects Oscar this year,” he said.
