What does the DepthVision camera do on the Note 10+?

Galaxy Note 10 Plus (Image credit: Android Central)

Best answer: The Note 10+'s DepthVision camera consists of two physical sensors and allows the Note 10+ to accurately separate a subject from its surroundings. Because of the way it can define the edge of an object, it's useful for things like portrait photography, AR measurements and effects, and plays a big part in the Note 10+'s 3D scanning application.

Cool name, even cooler feature

The Note 10+ has a camera system you won't find on any other phone. Using five different sensors (four camera lenses and an infrared sensor that measures Time of Flight), it can capture everything from wide-angle shots to selfies, and it can even scan objects into 3D mapped images. One of the coolest features you'll find is what Samsung calls DepthVision.

DepthVision consists of a standalone camera sensor and an infrared light sensor that, as mentioned, is used to calculate Time of Flight. The two combined allow the Note 10+ to accurately find the edges of an object and "pull" it away from its surroundings, isolating it from any foreground or background scenery in the frame. On its own, this is very useful for Samsung's 3D scanning tool and S Pen AR effects, but that's not all DepthVision brings to the table.
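To give a rough sense of how Time of Flight works in general, the sensor times how long emitted infrared light takes to bounce off a surface and come back, then converts that round trip into a distance. This minimal sketch shows the underlying physics, not Samsung's specific implementation; the function name is purely illustrative:

```python
# Illustration of the time-of-flight principle: the sensor emits infrared
# light and times its round trip to each point in the scene.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface from the round-trip travel time of a light pulse."""
    # The pulse travels out and back, so halve the total path length.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after ~6.67 nanoseconds came from ~1 meter away.
print(f"{tof_distance(6.67e-9):.3f} m")  # -> 1.000 m
```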

With the data about an object (be it a toy teddy bear, a person, your favorite pet, or anything else) isolated from the rest of what the camera can see, you can then start to measure the actual size of that object. Since the Time of Flight sensor knows how far away an object is from the camera, the software can calculate how long the object is across any of its surfaces, and from that it can make a well-educated guess about the other objects in the camera's field of view. AR applications that measure objects become possible, along with valuable distance information between the camera lens, the subject of a photo, and the background.
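The geometry behind that kind of measurement is the textbook pinhole camera model, not anything Samsung has published: once the depth to an object is known, its apparent size in pixels can be converted into a real-world size. A minimal sketch, with a hypothetical function and made-up numbers:

```python
def real_width_m(depth_m: float, width_px: float, focal_px: float) -> float:
    """Approximate real-world width of an object under a pinhole camera model.

    depth_m:   distance from camera to object (e.g., from a ToF sensor)
    width_px:  how many pixels wide the object appears in the frame
    focal_px:  the lens's focal length expressed in pixels
    """
    # Similar triangles: real width / depth == pixel width / focal length.
    return depth_m * width_px / focal_px

# A subject 1.5 m away that spans 400 px, on a lens with a 1200 px focal
# length, is roughly half a meter wide.
print(f"{real_width_m(1.5, 400, 1200):.2f} m")  # -> 0.50 m
```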

This makes DepthVision's sensor data really valuable for portrait photography. Bokeh looks bad when it's done poorly; not all objects are the same distance away, so not all objects should get the same level of blur or manipulated color highlights. On a "real" camera, bokeh is a side effect of a shallow depth of field, but on a smartphone, with its tiny fixed lens and digital shutter, it's the product of an algorithm.
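To make that idea concrete, here's a minimal sketch of depth-dependent blur: pixels near the focal plane stay sharp, and pixels farther from it get a larger blur radius. This is an assumption about how such an algorithm could work, not Samsung's actual pipeline, and the function name and constants are invented for illustration:

```python
import numpy as np

def blur_strength(depth_map: np.ndarray, focus_depth: float,
                  max_radius: float = 8.0) -> np.ndarray:
    """Per-pixel blur radius that grows with distance from the focal plane.

    Pixels at focus_depth stay sharp; pixels in front of or behind it get
    progressively more blur, mimicking a shallow depth of field.
    """
    # Normalize the depth offset and scale it to a blur radius in pixels.
    offset = np.abs(depth_map - focus_depth)
    return np.clip(offset / focus_depth, 0.0, 1.0) * max_radius

# Toy depth map (meters): subject at 1.5 m, background at 4 m.
depth = np.array([[1.5, 1.5, 4.0],
                  [1.5, 4.0, 4.0]])
print(blur_strength(depth, focus_depth=1.5))
# Subject pixels get radius 0; background pixels get the full 8 px blur.
```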

It's possible to create good bokeh using machine learning, and we've seen Google, Apple, and Huawei do a pretty good job of it when paired with lenses that capture the necessary data. Samsung could have tried to use Bixby's machine learning abilities to do the same, but adding DepthVision's data to the mix means the Galaxy Note 10+ should be able to deliver photos with excellent focus and more true-to-life bokeh than it could using flat sensor data alone.

We expect to see some amazing shots from the Note 10+, and DepthVision will be a big part of the reason why.

Jerry Hildenbrand
Senior Editor — Google Ecosystem

Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Threads.