Wednesday, April 24, 2024

What does the DepthVision camera do on the Note 10+?


Best answer: The Note 10+’s DepthVision camera consists of two physical sensors and allows the Note 10+ to accurately separate a subject from its surroundings. Because of the way it can define the edge of an object, it’s useful for things like portrait photography and AR measurements and effects, and it plays a big part in the Note 10+’s 3D scanning application.

  • Phone camera like no other: Samsung Galaxy Note 10+ (From $1,100 at Samsung)

Cool name, even cooler feature

The Note 10+ has a camera system you won’t find on any other phone. Using five different sensors — four camera lenses and an infrared sensor used to measure Time of Flight — it can capture everything from wide-angle shots to selfies, and it can also scan objects into 3D-mapped images. One of the coolest features you’ll find is what Samsung calls DepthVision.


DepthVision consists of a standalone camera sensor and an infrared light sensor that, as mentioned, is used to calculate Time of Flight. The two combined allow the Note 10+ to accurately find the edges of an object and “pull” it away from its surroundings, isolating it from any foreground or background scenery in the frame. On its own, this is very useful for Samsung’s 3D scanning tool and S Pen AR effects, but that’s not all DepthVision brings to the table.
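Conceptually, “pulling” a subject away from its surroundings boils down to thresholding a per-pixel depth map: keep everything at roughly the subject’s distance, drop everything else. Here’s a minimal Python sketch of that idea; the `subject_mask` function, its center-pixel heuristic, and the 0.3 m tolerance are illustrative assumptions, not Samsung’s actual segmentation pipeline.

```python
import numpy as np

def subject_mask(depth_map: np.ndarray, tolerance_m: float = 0.3) -> np.ndarray:
    """Build a rough foreground mask from a depth map (meters per pixel).

    Assumes the subject sits at the center of the frame, as it usually
    does in a portrait shot; everything within `tolerance_m` of the
    center pixel's depth is treated as part of the subject.
    """
    h, w = depth_map.shape
    subject_depth = depth_map[h // 2, w // 2]  # depth at the frame center
    return np.abs(depth_map - subject_depth) < tolerance_m

# Example: a tiny fake depth map with a "subject" at ~1.2 m and a wall at ~3 m.
depth = np.array([
    [3.0, 3.0, 3.0, 3.0],
    [3.0, 1.2, 1.3, 3.0],
    [3.0, 1.2, 1.2, 3.0],
    [3.0, 3.0, 3.0, 3.0],
])
print(subject_mask(depth).astype(int))  # 1s mark the isolated subject
```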

With the data about an object — be it a toy teddy bear, a person, your favorite pet, or anything else — isolated from the rest of what the camera can see, you can then start to measure the actual size of that object. Since the Time of Flight sensor knows how far away an object is from the back of the camera lens, the software can calculate how long the object is across any of its surfaces, and with that data, it can make a very well-educated guess about the other objects in the camera’s field of view. AR applications that measure objects become possible, for example, and the camera also gains very valuable distance information between the lens, the subject of a photo, and the background.
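If you want a rough feel for the math, the measuring trick comes down to two small calculations: a Time of Flight distance (the infrared pulse travels to the object and back at the speed of light) and a pinhole-camera projection that turns a size in pixels into a size in meters once the depth is known. The Python sketch below is a back-of-the-envelope illustration; the timing and focal-length numbers are made up for the example, not specs of the Note 10+’s sensors.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the object: the IR pulse travels there and back."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

def real_width_m(width_px: float, depth_m: float, focal_length_px: float) -> float:
    """Pinhole-camera estimate of an object's real width from its width in pixels."""
    return width_px * depth_m / focal_length_px

# Example: a pulse returning after ~8 nanoseconds puts the object ~1.2 m away;
# if it spans 600 px through a lens with a ~2900 px focal length, it's ~0.25 m wide.
depth = tof_distance_m(8e-9)
print(round(depth, 2), "m away")
print(round(real_width_m(600, depth, 2900), 2), "m wide")
```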

This makes DepthVision’s sensor data really valuable for portrait photography. Bokeh looks bad when it’s applied indiscriminately: not all objects are the same distance away, so not all objects should get the same level of blur or manipulated color highlights. On a “real” camera, bokeh is a side effect of a shallow depth of field, but on a smartphone, with its tiny sensor, fixed lens, and digital shutter, it’s the product of an algorithm.
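To see why per-pixel depth matters here, picture a blur whose strength scales with each pixel’s distance from the focal plane instead of being applied uniformly. The Python sketch below is a toy version of that idea for a grayscale image; `depth_aware_bokeh` and its parameters are illustrative names only, and real portrait modes do far more edge refinement than this simple blend.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def depth_aware_bokeh(image: np.ndarray, depth_map: np.ndarray,
                      focus_depth_m: float, max_sigma: float = 6.0) -> np.ndarray:
    """Blend a sharp and a blurred copy of a grayscale image, per pixel.

    Pixels near `focus_depth_m` stay sharp; the farther a pixel sits from
    the focal plane, the more of the blurred copy it receives. A crude
    stand-in for algorithmic bokeh, not a production portrait pipeline.
    """
    # Per-pixel distance from the focal plane, normalized to the 0..1 range.
    defocus = np.abs(depth_map - focus_depth_m)
    defocus = defocus / max(defocus.max(), 1e-6)

    # Weighted blend: 0 -> fully sharp, 1 -> fully blurred.
    blurred = gaussian_filter(image, sigma=max_sigma)
    return (1.0 - defocus) * image + defocus * blurred
```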

It’s possible to create good bokeh using machine learning, and we’ve seen Google, Apple, and Huawei do a pretty good job of it when paired with lenses that capture the necessary data. Samsung could have tried to use Bixby’s machine learning abilities to do the same, but adding DepthVision’s data to the mix means the Galaxy Note 10+ should be able to deliver photos with excellent focus and more true-to-life bokeh than it could by using flat sensor data alone.

We expect to see some amazing shots from the Note 10+ and DepthVision will be a big part of the reason why.

Biggest and best

Samsung Galaxy Note 10+


From $1,100 at Samsung

An amazing camera system.

It’s a powerhouse of a phone that will do absolutely everything you want, with a camera system unlike any phone you’ve seen before. Great photos, AR, and even 3D scanning are all possible with the Note 10+.
