Experiencing Real-time 3D Interaction with Depth Maps for
Mobile Augmented Reality in DepthLab
Ruofei Du, Eric Turner, Maksym Dzitsiuk, Luca Prasso, Ivo Duarte, Jason Dourgarian,
Joao Afonso, Jose Pascoal, Josh Gladstone, Nuno Cruces, Shahram Izadi, Adarsh Kowdle,
Konstantine Tsotsos, David Kim
Google LLC†
Figure 1. Real-time interactive components enabled by DepthLab: (a) virtual objects colliding with real stairs; (b) virtual avatar path planning and
geometry-aware shadows; (c) AR snow effect; (d) virtual flooding effects bounded by physical walls; (e) scene relighting with three virtual point lights;
(f) occlusion-aware rendering of a virtual cat behind the real bed. Please refer to the main paper [1] and the accompanying video for more results.
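As a hedged illustration of how a single depth map can drive interactions like the collisions in Figure 1(a) (the function names and thresholds below are our own, not the DepthLab API), a physics collision reduces to testing a virtual object's depth against the physical depth observed at its screen location:

```python
# Illustrative sketch only (hypothetical names, not the DepthLab API):
# a falling virtual object collides with the real scene once its depth
# would pass behind the physical surface seen at the same pixel.

def depth_at(depth_map, u, v):
    """Look up metric depth at normalized screen coordinates (u, v)."""
    rows, cols = len(depth_map), len(depth_map[0])
    return depth_map[min(int(v * rows), rows - 1)][min(int(u * cols), cols - 1)]

def collides(depth_map, u, v, object_depth_m):
    """The object hits a real surface when it is at least as far from the
    camera as the physical geometry at its pixel."""
    return object_depth_m >= depth_at(depth_map, u, v)

# Toy 2x2 depth map (meters): a near step in the lower half of the frame.
depth_map = [[3.0, 3.0],
             [1.5, 1.5]]
# An object at 1.6 m projected into the lower half collides with the step.
hit = collides(depth_map, 0.25, 0.9, 1.6)
```

A full implementation would also estimate the surface normal from neighboring depth samples to compute a bounce direction, but the core test is this per-pixel depth comparison.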
ABSTRACT
We demonstrate DepthLab [1], a playground for interactive
augmented reality experiences leveraging the shape and depth
of the physical environment on a mobile phone. Based on the
ARCore Depth API, DepthLab encapsulates a variety of depth-
based UI/UX paradigms, including geometry-aware rendering
(occlusion, shadows, texture decals), surface interaction be-
haviors (physics, collision detection, avatar path planning),
and visual effects (relighting, 3D-anchored focus and aperture
effects, 3D photos). We have open-sourced our software at
https://github.com/googlesamples/arcore-depth-lab to facilitate future research and development in depth-aware mobile
† Corresponding author. Please contact Ruofei Du at me@duruofei.com and/or David Kim at contact@davidkim.de.
Permission to make digital or hard copies of part or all of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for profit or commercial advantage and that copies bear this notice and the full citation
on the first page. Copyrights for third-party components of this work must be honored.
For all other uses, contact the owner/author(s).
UIST ’20 Adjunct, October 20–23, 2020, Virtual Event, USA
© 2020 Copyright is held by the author/owner(s).
ACM ISBN 978-1-4503-7514-6/20/10.
http://dx.doi.org/10.1145/3379350.3416136
AR experiences. With DepthLab, we aim to help mobile developers effortlessly integrate depth into their AR experiences and amplify the expression of their creative vision.
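To make the geometry-aware rendering mentioned above concrete, the occlusion test at its core can be sketched as a per-pixel depth comparison between the virtual fragment and the real-world depth map. This is a minimal sketch under our own naming, not DepthLab's actual shader code:

```python
# Hedged sketch (not DepthLab's implementation): per-pixel occlusion of a
# virtual fragment against the real-world depth map supplied by a depth
# API such as the ARCore Depth API.

def is_occluded(virtual_depth_m, real_depth_m, eps_m=0.05):
    """A virtual fragment is hidden when the physical surface at the same
    pixel is closer to the camera, within a tolerance eps_m that absorbs
    depth-sensor noise."""
    return real_depth_m + eps_m < virtual_depth_m

def composite(virtual_rgb, camera_rgb, virtual_depth_m, real_depth_m):
    """Show the camera pixel wherever the virtual fragment is occluded."""
    if is_occluded(virtual_depth_m, real_depth_m):
        return camera_rgb
    return virtual_rgb

# Example: a virtual cat fragment 3.0 m away, behind a real bed at 2.2 m,
# resolves to the camera pixel (the cat is hidden, cf. Figure 1(f)).
pixel = composite((255, 255, 255), (40, 30, 20), 3.0, 2.2)
```

In practice this comparison runs per fragment on the GPU with smooth alpha falloff near the depth boundary rather than a hard cutoff, but the decision rule is the same.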
Author Keywords
Depth map; interactive 3D graphics; real time; interaction;
augmented reality; mobile AR; rendering; GPU; ARCore.
CCS Concepts
• Human-centered computing → Mixed / augmented reality; User interface toolkits;
INTRODUCTION
Real-time depth data is readily available on mobile phones
with passive or active sensors, and on VR/AR devices. However, this rich data about our environment remains underexplored and is rarely leveraged in mainstream AR applications.
In this demonstration paper, we present DepthLab [1], an open-source library built on the ARCore Depth API [2] that
encapsulates a variety of real-time UI/UX features for depth,
including geometry-aware rendering and physics simulation,
surface interaction behaviors, and visual effects. Our goal is
to bring these advanced features to mobile AR experiences