Friday, March 29, 2024

I used an app to create 3D models with my iPhone, and it’s shockingly great


The pace of innovation in artificial intelligence image generation is phenomenal. One company — Luma Labs — provides an excellent example of a practical, yet hugely entertaining use of the latest technology applied to 3D images.

Contents

  • What is Luma AI?
  • Luma AI iPhone compatibility
  • How to use Luma AI
  • Getting better all the time
  • The highs and lows of 3D scanning
  • Luma AI price and availability

Luma AI is in beta testing on the iPhone and will eventually come to Android as well. I got into the beta test group and can share what this amazing app does and how easy it is to get incredible results.

What is Luma AI?


Luma AI is an app and a service developed by Luma Labs. It captures three-dimensional images using a technique known as Neural Radiance Fields (NeRF). It’s similar to the ray-tracing technique that makes the graphics in high-end gaming look so realistic.
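Under the hood, a NeRF is a neural network that maps a 3D position and viewing direction to a color and a density, and a pixel is rendered by compositing samples along the camera ray that passes through it. Here is a rough sketch of that compositing step in Python — purely illustrative, with made-up names, and not Luma's actual code:

```python
import numpy as np

def composite_ray(densities, colors, deltas):
    """Alpha-composite samples along one camera ray (the NeRF
    volume-rendering quadrature). `densities` has shape (n,),
    `colors` is (n, 3), and `deltas` is the spacing between samples."""
    # Opacity contributed by each sample interval
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance: how much light survives to reach each sample
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = trans * alphas
    # The weighted sum of sample colors gives the pixel color
    return (weights[:, None] * colors).sum(axis=0)
```

A fully opaque sample hides everything behind it, which is how solid surfaces emerge from what is otherwise a translucent volume.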

NeRFs have been around for a few years now but until very recently existed primarily in research facilities. With the explosion of AI image generation, headlined by photorealistic DALL-E renderings, NeRFs are beginning to be explored by a much broader audience. The first wave of new NeRF software required some developer skills: installing software packages from GitHub, then training the AI on a set of photos. It was a bit much for the average person.

Luma Labs is about to make the process dramatically simpler with its Luma AI app. From start to finish, the entire process can be managed from an iPhone, and the end result is more accessible as well.

Luma AI iPhone compatibility


Since Apple made a point of demonstrating the 3D depth-measuring capabilities of its LiDAR sensor, you might expect Luma AI to require the more expensive iPhone 14 Pro or iPhone 14 Pro Max to capture 3D models. However, the clever developers at Luma Labs use photogrammetry instead. That makes this technology compatible with iPhones as old as the iPhone 11.

In the future, the app will become available on Android and there’s already a web version in beta testing as well. In an interview, Luma Labs CEO Amit Jain said the iPhone app is expected to be ready for public release in a few weeks.

How to use Luma AI


To use Luma AI, you simply circle slowly around an object at three different heights. An AR overlay guides you through the process, which takes a few minutes and becomes easier after a few tries. Before long, you'll be able to capture a medium-sized object like a chair in a couple of minutes.

An object of any size can be handled because, to Luma AI, the subject is just a series of images. Whether you circle a cup, a statue, or a building, the general idea remains the same.

The app will let you know when it has enough images, and when that happens, a Finish button will appear. You can also keep circling and filling in gaps in the AR cloud of rings and rectangles that represent the photos taken so far. The app automatically stops the capture when an ideal number of photos has been collected. There's also a freeform mode that lets you capture even more photos, at different angles and distances. You can see the process in the YouTube video I created below. It's an iPhone app, so it's a portrait video.

Processing is the next step, which happens on Luma Labs' servers. After an hour or so, the finished NeRF is available in the app in several forms. The first is a generated video showing a fly-by of the object in its natural environment. Next is an interactive version that lets you spin the view by dragging a finger or mouse across the image.

Most impressive of all, the subject of the capture, extracted from the background, is also available. With this representation, you can pivot the 3D object on any axis and zoom in to see it more closely. The sharpness depends on how many images were collected and how slow and stable you were during the capture process.

Getting better all the time

Luma Labs is updating the app and service at a remarkable pace. Within a week of my receiving the beta test invitation, two powerful new features arrived that greatly expand the possibilities. The first is a web upload option that allows you to capture video without the app, then upload it to Luma Labs' website for processing. The results appear online and in the app.

This means it’s possible to use any of the iPhone’s camera modes, capture video with a dedicated camera, or even record video with AR glasses like Ray-Ban Stories. For example, a drone video becomes even more epic when you can smooth the motion and change direction after you’ve already landed. Luma Labs shared a good example showing an aerial view of autumn leaves in this tweet.

Fall in Palo Alto is gorgeous! 🍂 https://t.co/EwNkiv0DQV pic.twitter.com/hdd7iBLYgV

— Luma AI (@LumaLabsAI) October 22, 2022

The other new feature opens up 3D editing, painting, and 3D printing opportunities. The 3D meshes can be exported with textures in OBJ or glTF format. They aren't optimized, but they can be viewed with textures intact even in an online viewer such as the free, open-source website Online3DViewer.
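Part of why generic viewers handle these exports so easily is that OBJ is a plain-text format: each vertex is just a "v x y z" line. As an illustration (this is not part of Luma's tooling, and `obj_vertices` is a made-up helper), a few lines of Python can pull the vertex positions out of an exported mesh:

```python
def obj_vertices(lines):
    """Collect vertex positions ('v x y z' records) from Wavefront
    OBJ text, skipping faces, normals, texture coords, and comments."""
    verts = []
    for line in lines:
        parts = line.split()
        if parts and parts[0] == "v":   # vertex position record
            verts.append(tuple(float(c) for c in parts[1:4]))
    return verts
```

Since `open(path)` yields an iterable of lines, the same function works directly on a file handle for an `.obj` export.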

Sprout Sprite Fairy Figurine

It’s also possible to open the 3D files in a mesh editor like the free, open-source MeshLab to delete any stray artifacts that appear as floating blobs, as well as to clean up and simplify the model before exporting it in a variety of formats. The figurine featured above is about three inches tall and was sculpted by my wife, Tracey, for her business, ALittleCharacter. Luma AI captured a remarkable amount of detail in the sculpture and the log it was resting upon. The log could have been selected and removed in MeshLab as well.

The highs and lows of 3D scanning

Kyle Russell shared a dessert display from a party, mentioning that he asked the adults to wait for their treats so he could capture it as a digital diorama.

Used @LumaLabsAI at a birthday party last night, made a bunch of adults not eat dessert so I could circle the table with my phone to make a 3D AI dream of the setup like a very cool person pic.twitter.com/sP0vVPB3yx

— Kyle Russell (@kylebrussell) October 30, 2022

Although Luma AI can process video, it relies on still images to construct a three-dimensional scene. That means any movement of the subject can reduce the quality or clarity of the capture. A 3D image of a seated person, as shown in Albert Bozesan's tweet, comes out well. In the same tweet, a second capture of a sculpture shows what happens when there's movement within the scene: people who walked near the subject appear in the background as distorted shapes.

Took two @LumaLabsAI #NeRFs by a Bavarian lake today. Great way to capture memories, feels like Minority Report. #Tegernsee pic.twitter.com/HLC0ekF7uD

— Albert Bozesan (@AlbertBozesan) October 30, 2022

Luma AI price and availability

Luma AI is currently in beta testing, and invitations are periodically given out via the company’s Twitter account. If you have a compatible iPhone and an interest in this technology, you might be able to get early access. There’s also a waitlist on Luma Labs’ website.

Luma Labs CEO Jain indicated that pricing is yet to be determined and depends on how broad the user base turns out to be and how the results of the scans are used. Based on these statements, there might be a professional subscription with more advanced features and a less expensive personal tier. For the time being, the app remains free to use.
