Pokémon could soon hide behind your couch thanks to an ARCore update
What you need to know
- A new ARCore Depth API helps your phone understand how far away real-world objects are.
- Virtual objects in an AR application can now interact with physical objects, hiding behind them or even illuminating them in a realistic way.
- Phones with ToF sensors will also be able to perceive depth on moving objects, opening the door to new kinds of AR/XR games and apps.
Augmented reality, or AR, is a pretty cool concept. The idea is to use your device's camera (in this case, a smartphone's) to capture your surroundings and place virtual objects alongside the physical ones. Pokémon Go is likely the first real AR experience most smartphone users had, but there's always been one big issue that breaks the illusion: virtual objects don't really look like they're part of the real world.
That all changes with today's launch of the ARCore Depth API, an update to Google's ARCore toolkit that lets developers tap into a new depth map generated from the phone's camera. Over 200 million Android phones are compatible with this new functionality, and devices with ToF sensors will be able to provide dynamic occlusion, while phones without one can only hide virtual objects behind static real-world objects.
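For developers, opting in is a small configuration change rather than a rewrite. The sketch below is a minimal Kotlin example (ours, not from Google's announcement) of how an app might turn on depth for an existing ARCore session and grab the per-frame depth image; exact usage can vary by SDK version.

```kotlin
import android.media.Image
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Enable the Depth API on a device that supports it. AUTOMATIC depth works on
// supported phones with or without a ToF sensor; without one, ARCore estimates
// depth from camera motion instead.
fun enableDepth(session: Session) {
    val config = Config(session)
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Each frame, pull the latest depth map as a DEPTH16 image. It may not be
// available for the first few frames while ARCore gathers enough data.
fun latestDepthImage(frame: Frame): Image? =
    try {
        frame.acquireDepthImage()
    } catch (e: NotYetAvailableException) {
        null
    }
```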
What in the world does all that mean? Well, if you have a phone with a ToF sensor, like the Galaxy Note 10+, you could feasibly play AR Frogger from atop a highway bridge. The phone would be able to use that ToF sensor to detect the physical size and location of cars as they move, and your virtual hopping frog could get squashed if you make the wrong move. There's no such game quite yet, but there's no longer a technical limitation stopping developers from making one. And while phones without a ToF sensor can't use moving objects for occlusion, they can still use static objects, such as couches and chairs, in the depth calculation.
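To give a sense of what a game would actually work with: the depth map arrives as a DEPTH16 image, where each pixel holds an estimated distance in millimetres. A rough Kotlin sketch of reading that value at a given pixel (the helper name is ours, not part of the SDK) might look like this:

```kotlin
import android.media.Image
import java.nio.ByteOrder

// Hypothetical helper: read the estimated distance to the real-world surface, in
// millimetres, at pixel (x, y) of an ARCore DEPTH16 depth image. An AR Frogger
// style game could poll values like this to tell when a real car has moved
// between the camera and the virtual frog.
fun depthMillimetersAt(depthImage: Image, x: Int, y: Int): Int {
    val plane = depthImage.planes[0]
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val buffer = plane.buffer.order(ByteOrder.nativeOrder())
    // Each DEPTH16 sample is an unsigned 16-bit distance in millimetres.
    return buffer.getShort(byteIndex).toInt() and 0xFFFF
}
```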
To demonstrate the stark contrast between the old and new understanding of depth, the two GIFs above show a virtual cat standing behind a chair. With the old ARCore (left), the cat looks as if it belongs in an M.C. Escher painting: its back half sits in front of the chair while its front half appears to be coming from behind it. The new ARCore Depth API (right) understands that a chair is actually present in the scene and includes it in the depth calculation, properly occluding the back half of the cat, as if it were really standing on the living room floor.
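Under the hood, that occlusion decision is a simple per-pixel comparison, normally done in a fragment shader rather than on the CPU: if the real-world surface ARCore measured is closer to the camera than the virtual object at that pixel, the virtual pixel isn't drawn. A minimal illustrative sketch, with hypothetical names:

```kotlin
// Minimal sketch of the per-pixel occlusion test behind the cat-and-chair demo.
fun isVirtualPixelHidden(virtualDepthMm: Int, realDepthMm: Int): Boolean {
    // If the real surface (the chair) is closer to the camera than the virtual
    // cat at this pixel, skip drawing the cat here so it appears to be behind.
    return realDepthMm < virtualDepthMm
}
```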
It's not just good for virtual pet demonstrations, though. A more real-world example is the Houzz app, which lets users preview furniture and other decorations right in their own homes via their smartphones. Previewing furniture in AR isn't anything new, as brands like IKEA and Amazon have offered it for a while now. As with the cat demo, though, the difference is the ability to place virtual furniture in your house and see how it looks in front of, alongside, and behind your existing pieces.
Google is also working with several video game developers to better integrate this technology into current and upcoming AR-enabled titles. Unity, one of the most popular game engines around, will include support for Google's latest ARCore updates, and Ralph Hauwert, Vice President of Platforms at Unity Technologies, weighed in on the update.