
You can now moonwalk on the moon with Nvidia’s A.I. and ray tracing tech


Apollo Moon Landing with RTX Technology; courtesy of Nvidia

In addition to launching 10 new RTX Studio laptops targeting creators at SIGGRAPH, Nvidia also announced some of the work its research teams have been doing in artificial intelligence, augmented reality, and computer graphics. For the 50th anniversary of the Apollo 11 lunar landing, Nvidia showed how the ray tracing technology in its RTX graphics cards can visually enhance the images NASA captured 50 years ago. At SIGGRAPH, Nvidia’s commemoration of Apollo 11 goes a step further, giving space fans the chance to superimpose themselves into a short video clip, as if they were astronauts Neil Armstrong and Buzz Aldrin, using A.I. and the power of ray tracing to render the video in real time.

“What we’re doing is using artificial intelligence to aim a camera at people just in their street clothes, to be able to do 3D pose estimation,” explained Nvidia’s Rob Estes. “We know where they are in 3D space, and we know how their limbs are moving, so we’re drawing that using ray tracing and placing you, with your movement, as an astronaut in the scene.”

Doing the moonwalk on the moon walk

Hollywood visual effects directors have been doing something similar for years using a green screen and motion-capture actors wearing suits dotted with markers to replicate limb movement, but ray tracing and A.I. take this a step further by eliminating the need for specialized equipment and actors. “You could do this anywhere you can film somebody,” Estes said of the benefits of A.I.-enabled ray tracing in this lunar example. Essentially, SIGGRAPH attendees can film a short video montage of themselves in a spacesuit like Aldrin’s and Armstrong’s, doing the moonwalk as if they were on the moon as part of the Apollo 11 mission.

“We’re going to have a mock-up of the lunar surface and the lunar lander, and we’re going to get to let them see what it would be like for them to be on the moon,” he said, with all the lighting effects and rendering performed in real time. “This is very leading-edge research, and nobody has done this before. You’re combining A.I. and ray tracing in a way that has many, many practical benefits.” These benefits not only aid Hollywood and its visual effects teams but also designers and researchers trying to solve hard problems with pose estimation, Nvidia added.
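
Nvidia hasn’t published code for the demo, but the pipeline Estes describes (camera footage in, 3D pose estimation, joint retargeting onto an astronaut rig, ray-traced output) can be sketched in miniature. In the Python sketch below, every function is a hypothetical stand-in, not Nvidia’s API:

```python
# Illustrative sketch only: camera frame -> 3D pose estimation ->
# retarget joints onto an astronaut rig -> render. The estimator and
# renderer below are stand-ins, not Nvidia's code.
import numpy as np

JOINTS = ["hip", "spine", "head", "l_shoulder", "l_elbow", "l_wrist",
          "r_shoulder", "r_elbow", "r_wrist", "l_knee", "r_knee"]

def estimate_pose_3d(frame: np.ndarray) -> dict:
    """Stand-in for a learned 3D pose estimator: one (x, y, z)
    position per joint from a single RGB frame."""
    rng = np.random.default_rng(0)  # placeholder output
    return {j: rng.standard_normal(3) for j in JOINTS}

def retarget(pose: dict, scale: float = 1.15) -> dict:
    """Map the subject's joints onto a bulkier astronaut rig. Real
    retargeting solves for bone rotations; uniformly scaling the
    hip-relative offsets is enough to show the idea."""
    root = pose["hip"]
    return {j: root + scale * (p - root) for j, p in pose.items()}

def render_astronaut(rig: dict) -> None:
    """Stand-in for the real-time ray tracer that lights and shades
    the astronaut in the lunar scene."""
    for joint, pos in rig.items():
        print(f"{joint:10s} -> {np.round(pos, 2)}")

frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # one camera frame
render_astronaut(retarget(estimate_pose_3d(frame)))
```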

Foveated rendering with prescription glasses

Nvidia wants to make rendering for augmented and virtual reality applications appear more realistic and crisp, and it is applying its foveated rendering technology to accomplish this. What the team of researchers at Nvidia has done is add support for prescription lenses, a first for the industry. Though this is still at an early research stage, Nvidia envisions a day when wearers of prescription eyeglasses won’t need glasses separate from their augmented reality devices.

Nvidia is working with several types of displays to build VR and AR displays that match the prescription of the glasses you may already wear. “This is a big deal,” Estes said. “You’ve never been able to see 20/20 before, because there just wasn’t the visual acuity for these displays.”

In the same way that ray tracing can bring life-like cinematic effects to video games, foveated rendering can make AR scenes and images appear more realistic and better resolved by conserving graphics power. Rather than rendering the entire scene at full fidelity, foveated rendering lets creators render in high fidelity only the middle of the scene, where your eyes typically focus, while the peripheral areas are rendered in less detail to save GPU power.
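
As a rough illustration of the savings, here is a minimal foveated-rendering sketch in Python. The per-pixel shader and the gaze point are stand-ins (a real system would use an eye tracker and a full renderer): the frame is shaded at full resolution only inside a window around the gaze, and at a quarter of the linear resolution everywhere else.

```python
# Minimal foveated-rendering sketch (not Nvidia's implementation):
# shade the foveal window at full resolution and the periphery at a
# quarter of the linear resolution, then composite. shade() stands in
# for a real renderer; gaze would come from an eye tracker.
import numpy as np

H, W = 480, 640

def shade(ys: np.ndarray, xs: np.ndarray) -> np.ndarray:
    """Toy per-pixel shader: a checkerboard standing in for a scene."""
    return (((ys // 20) + (xs // 20)) % 2).astype(np.float32)

def render_foveated(gaze, radius=80):
    gy, gx = gaze
    # Periphery: shade every 4th pixel, then nearest-neighbor upscale,
    # cutting peripheral shading cost to roughly 1/16.
    ys, xs = np.mgrid[0:H:4, 0:W:4]
    img = np.repeat(np.repeat(shade(ys, xs), 4, axis=0), 4, axis=1)[:H, :W]
    # Fovea: full-resolution shading only inside the gaze window.
    y0, y1 = max(gy - radius, 0), min(gy + radius, H)
    x0, x1 = max(gx - radius, 0), min(gx + radius, W)
    fy, fx = np.mgrid[y0:y1, x0:x1]
    img[y0:y1, x0:x1] = shade(fy, fx)
    return img

frame = render_foveated(gaze=(240, 320))  # gaze at frame center
print(frame.shape)  # (480, 640)
```

With a 160 x 160 foveal window in a 640 x 480 frame, only about 8% of the pixels get full-rate shading in this sketch.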

“So we’re doing work to make sure that we’re tracking where your gaze is, and applying this so that you get faster frame rates and better graphics in doing augmented reality, or this could be applied to virtual reality as well,” said Estes.

GauGAN, the A.I. artist, is freed


In the past, Nvidia showed how its GauGAN drawing tool can turn even the art-challenged among us into artists, using artificial intelligence to let you draw complex, life-like landscapes with just a few simple strokes. The A.I.-drawn scenes, Nvidia revealed, have also been used as backdrops in some big-name Hollywood movies, with studios overlaying the elements they wanted to set the overall mood of the clip.

Rather than using A.I. to identify elements of a scene, like a dog or a cat, GauGAN, as its name implies, uses a generative adversarial network, or GAN. When the user draws a horizon line to specify where the ocean meets the sky, GauGAN begins to “draw” the scene with clouds and waves; the user can then add rocks and sea cliffs to this seascape, and the tool renders the scene with proper shadows and reflections. So instead of identifying objects in an image, the GAN helps fill in the image with a realistic creation and rendering of the scene, Nvidia explained.
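
GauGAN grew out of Nvidia’s published SPADE research, but the sketch below doesn’t invoke the real model. With made-up class IDs, it only illustrates the kind of input such a model consumes: a semantic label map with a horizon line and a rock, one-hot encoded as conditioning for the generator.

```python
# Sketch of the input a GauGAN-style model consumes: a semantic label
# map where each pixel carries a class ("sky", "sea", "rock"), one-hot
# encoded as conditioning for the generator. Class IDs here are made
# up; the generator itself is elided.
import numpy as np

H, W = 256, 256
CLASSES = {"sky": 0, "sea": 1, "rock": 2}

# "Draw" the scene: a horizon line at 40% height splits sky from sea.
label_map = np.full((H, W), CLASSES["sea"], dtype=np.int64)
label_map[: int(0.4 * H), :] = CLASSES["sky"]

# Add a rock: a small rectangle breaking the waterline.
label_map[90:140, 180:230] = CLASSES["rock"]

# One-hot encode to (num_classes, H, W): the conditioning tensor the
# generator would turn into a photorealistic seascape.
one_hot = np.eye(len(CLASSES), dtype=np.float32)[label_map]  # (H, W, C)
one_hot = one_hot.transpose(2, 0, 1)                         # (C, H, W)
print(one_hot.shape)  # (3, 256, 256)
```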

Given the tool’s popularity, Nvidia revealed that it will be expanding access to GauGAN to everyone. At this time, Estes said, there are no plans to monetize GauGAN, despite its creative work in the movie industry; the company is simply relishing the joy that other people get from using it.
