• Stars: 777
  • Rank: 58,500 (Top 2%)
  • Language: C#
  • License: Apache License 2.0
  • Created: over 4 years ago
  • Updated: 6 months ago

Repository Details

ARCore Depth Lab is a set of Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering. (UIST 2020)

ARCore Depth Lab - Depth API Samples for Unity

Copyright 2020 Google LLC

Depth Lab is a set of ARCore Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering. Some of these features have been used in this Depth API overview video.

DepthLab examples

ARCore Depth API is enabled on a subset of ARCore-certified Android devices. iOS devices (iPhone, iPad) are not supported. Find the list of devices with Depth API support (marked with Supports Depth API) here: https://developers.google.com/ar/devices. See the ARCore developer documentation for more information.

Download the pre-built ARCore Depth Lab app on Google Play Store today.

Branches

ARCore Depth Lab has two branches: master and arcore_unity_sdk.

The master branch contains a subset of Depth Lab features in v1.1.0 and is built upon the recommended AR Foundation 4.2.0 (preview 7) or newer. The master branch supports features including oriented 3D reticles, depth map visualization, collider with depth mesh, avatar locomotion, raw point cloud visualization, recording and playback.

The arcore_unity_sdk branch contains the full features of Depth Lab and is built upon ARCore SDK for Unity v1.24.0 or newer. We recommend using the master branch to build new projects with the AR Foundation SDK and refer to this branch when necessary.

Getting started

These samples target Unity 2020.3.6f1 and require AR Foundation 4.2.0-pre.7 or newer, ARCore Extensions 1.24 or newer. The ARCore Extensions sources are automatically included via the Unity package manager.
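Because the dependencies are resolved through the Unity package manager, the project's Packages/manifest.json carries the relevant entries. A sketch of what those entries might look like for the versions named above (the exact package names and the Extensions git URL should be verified against your Unity setup):

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "4.2.0-pre.7",
    "com.unity.xr.arcore": "4.2.0-pre.7",
    "com.google.ar.core.arfoundation.extensions": "https://github.com/google-ar/arcore-unity-extensions.git"
  }
}
```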

This project only builds with the Android build platform. Build the project to an Android device instead of using the Play button in the Unity editor.

Sample features

The sample scenes demonstrate three different ways to access depth. Features supported in the master branch are labeled with ⭐; the remaining features can be found in the arcore_unity_sdk branch.

  1. Localized depth: Sample single depth values at certain texture coordinates (CPU).
    • Oriented 3D reticles
    • Character locomotion on uneven terrain
    • Collision checking for AR object placement
    • Laser beam reflections
    • Rain and snow particle collision
  2. Surface depth: Create a connected mesh representation of the depth data (CPU/GPU).
    • Point cloud fusion ⭐
    • AR shadow receiver
    • Paint splat
    • Physics simulation
    • Surface retexturing
  3. Dense depth: Process depth data at every screen pixel (GPU).
    • False-color depth map
    • AR fog
    • Occlusions
    • Depth-of-field blur
    • Environment relighting
    • 3D photo
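A localized (CPU) depth lookup reduces to converting a normalized texture coordinate into a row/column index of the depth image and reading a 16-bit millimeter value. A minimal sketch of that conversion in plain C#, independent of any ARCore API (the row-major ushort buffer layout is an assumption for illustration):

```csharp
using System;

public static class DepthLookup
{
    // Samples a depth image stored as row-major 16-bit millimeter values
    // at a normalized UV coordinate, returning the depth in meters.
    public static float SampleDepthMeters(ushort[] depthMm, int width, int height, float u, float v)
    {
        // Clamp UVs to [0, 1] so edge samples stay in bounds.
        u = Math.Clamp(u, 0f, 1f);
        v = Math.Clamp(v, 0f, 1f);
        int x = Math.Min((int)(u * width), width - 1);
        int y = Math.Min((int)(v * height), height - 1);
        return depthMm[y * width + x] / 1000f;
    }
}
```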

Building samples

Individual scenes can be built and run by enabling a particular scene (e.g., OrientedReticle to try out the oriented 3D reticle) and the ARFDepthComponents object in the scene. Remember to disable the ARFDepthComponents object in individual scenes when building all demos with the DemoCarousel scene.

We also provide a demo user interface that allows users to seamlessly switch between examples. Please make sure to set the Build Platform to Android and verify that the main DemoCarousel scene is the first enabled scene in the Scenes In Build list under Build Settings. Enable all scenes that are part of the demo user interface.

Assets/ARRealismDemos/DemoCarousel/Scenes/DemoCarousel.unity
Assets/ARRealismDemos/OrientedReticle/Scenes/OrientedReticle.unity
Assets/ARRealismDemos/DepthEffects/Scenes/DepthEffects.unity
Assets/ARRealismDemos/Collider/Scenes/Collider.unity
Assets/ARRealismDemos/AvatarLocomotion/Scenes/AvatarLocomotion.unity
Assets/ARRealismDemos/PointCloud/Scenes/RawPointClouds.unity

The following scenes can be found in the arcore_unity_sdk branch, but are not yet available with the AR Foundation SDK.

Assets/ARRealismDemos/MaterialWrap/Scenes/MaterialWrap.unity
Assets/ARRealismDemos/Splat/Scenes/OrientedSplat.unity
Assets/ARRealismDemos/LaserBeam/Scenes/LaserBeam.unity
Assets/ARRealismDemos/Relighting/Scenes/PointsRelighting.unity
Assets/ARRealismDemos/DepthEffects/Scenes/FogEffect.unity
Assets/ARRealismDemos/SnowParticles/Scenes/ArCoreSnowParticles.unity
Assets/ARRealismDemos/RainParticles/Scenes/RainParticlesScene.unity
Assets/ARRealismDemos/DepthEffects/Scenes/DepthOfFieldEffect.unity
Assets/ARRealismDemos/Water/Scenes/Water.unity
Assets/ARRealismDemos/CollisionDetection/Scenes/CollisionAwareObjectPlacement.unity
Assets/ARRealismDemos/ScreenSpaceDepthMesh/Scenes/ScreenSpaceDepthMesh.unity
Assets/ARRealismDemos/ScreenSpaceDepthMesh/Scenes/StereoPhoto.unity

Sample project structure

The main sample assets are placed inside the Assets/ARRealismDemos folder. Each subfolder contains sample features or helper components.

AvatarLocomotion

The AR character in this scene follows user-set waypoints while staying close to the surface of an uneven terrain. This scene uses raycasting and depth lookups on the CPU to calculate a 3D point on the surface of the terrain.

Collider

This physics simulation playground uses screen-space depth meshes to enable collisions between Unity's rigid-body objects and the physical environment.

After pressing an on-screen button, a Mesh object is procedurally generated from the latest depth map. This is used to update the sharedMesh parameter of the MeshCollider object. A randomly selected primitive rigid-body object is then thrown into the environment.
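Once the depth-derived Mesh exists, swapping it into the collider is a small step in Unity. A sketch of that update, assuming the UnityEngine API; BuildMeshFromDepth is a hypothetical stand-in for the sample's procedural mesh generation:

```csharp
using UnityEngine;

public class DepthColliderUpdater : MonoBehaviour
{
    public MeshCollider environmentCollider;

    // Called from the on-screen button: rebuild the collision mesh
    // from the most recent depth map, then let physics use it.
    public void OnUpdateColliderPressed()
    {
        Mesh depthMesh = BuildMeshFromDepth();
        environmentCollider.sharedMesh = null;  // force the collider to refresh
        environmentCollider.sharedMesh = depthMesh;
    }

    Mesh BuildMeshFromDepth()
    {
        // Hypothetical: generate vertices/triangles from the latest depth map.
        return new Mesh();
    }
}
```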

CollisionDetection

This AR object placement scene uses depth lookups on the CPU to test collisions between the vertices of virtual objects and the physical environment.

Common

This folder contains scripts and prefabs that are shared between the feature samples. For more details, see the Helper Classes section below.

DemoCarousel

This folder contains the main scene, which provides a carousel user interface. This scene allows the user to seamlessly switch between different features. A scene can be selected by directly touching a preview thumbnail or dragging the carousel UI to the desired position.

DepthEffects

This folder contains three dense depth shader processing examples.

The DepthEffects scene contains a fragment-shader effect that can transition from the AR camera view to a false-color depth map. Warm colors indicate closer regions in the depth map; cold colors indicate farther regions.
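The warm-to-cold mapping can be approximated by interpolating from red (near) to blue (far) over a normalized depth value. A plain-C# sketch of that idea (the actual sample uses a fragment shader and a richer palette):

```csharp
using System;

public static class DepthPalette
{
    // Maps a depth value to an RGB triple: warm (red) when near, cold (blue) when far.
    // maxDepthMeters normalizes the range; values beyond it clamp to pure blue.
    public static (float r, float g, float b) DepthToColor(float depthMeters, float maxDepthMeters)
    {
        float t = Math.Clamp(depthMeters / maxDepthMeters, 0f, 1f);
        return (1f - t, 0f, t); // linear red-to-blue ramp
    }
}
```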

The DepthOfFieldEffect scene contains a simulated bokeh fragment-shader effect, which blurs the regions of the AR view that are not at the user-defined focus distance. The focus anchor, a 3D point locked to the physical environment and always in focus, is set by touching the screen.

The FogEffect scene contains a fragment-shader effect that adds a virtual fog layer on the physical environment. Close objects will be more visible than objects further away. A slider controls the density of the fog.

LaserBeam

This laser reflection scene allows the user to shoot a slowly moving laser beam by touching anywhere on the screen.

This uses:

  • The DepthSource.GetVertexInWorldSpaceFromScreenXY(..) function to look up a raycasted 3D point
  • The ComputeNormalMapFromDepthWeightedMeanGradient(..) function to look up the surface normal based on a provided 2D screen position.
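Given the surface normal recovered from the depth map, computing the beam's bounce is standard vector reflection. A self-contained sketch of that step in plain C# (not the sample's actual code):

```csharp
using System;

public static class LaserReflection
{
    // Reflects an incoming direction about a unit surface normal: R = D - 2(D.N)N.
    public static (float x, float y, float z) Reflect(
        (float x, float y, float z) d, (float x, float y, float z) n)
    {
        float dot = d.x * n.x + d.y * n.y + d.z * n.z;
        return (d.x - 2f * dot * n.x, d.y - 2f * dot * n.y, d.z - 2f * dot * n.z);
    }
}
```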

MaterialWrap

This experience allows the user to change the material of real-world surfaces through touch. This uses depth meshes.

OrientedReticle

This sample uses depth hit testing to obtain the 3D position and surface normal of a raycasted screen point.

PointCloud

This sample computes a point cloud on the CPU using the depth array. Press the Update button to compute a point cloud based on the latest depth data.

RawPointClouds

This sample fuses point clouds with the raw depth maps on the CPU using the depth array. Drag the confidence slider to change the visibility of each point based on the confidence value of the corresponding raw depth.

RainParticles

This sample uses the GPU depth texture to compute collisions between rain particles and the physical environment.

Relighting

This sample uses the GPU depth texture to computationally re-light the physical environment through the AR camera. Areas of the physical environment close to the artificial light sources are lit, while areas farther away are darkened.

ScreenSpaceDepthMesh

This sample uses depth meshes. A template mesh containing a regular grid of triangles is created once on the CPU. The GPU shader displaces each vertex of the regular grid based on the reprojection of the depth values provided by the GPU depth texture. Press Freeze to take a snapshot of the mesh and press Unfreeze to revert back to the live updating mesh.
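The template grid is ordinary index-buffer bookkeeping: a width × height vertex grid has (width − 1) × (height − 1) cells, each split into two triangles. A plain-C# sketch of that one-time CPU step (the per-vertex displacement itself happens in the sample's shader):

```csharp
using System;

public static class GridMesh
{
    // Builds the triangle index buffer for a regular width x height vertex grid.
    // Each grid cell becomes two triangles (6 indices), wound consistently.
    public static int[] BuildTriangleIndices(int width, int height)
    {
        var indices = new int[(width - 1) * (height - 1) * 6];
        int i = 0;
        for (int y = 0; y < height - 1; y++)
        {
            for (int x = 0; x < width - 1; x++)
            {
                int v = y * width + x;
                // First triangle of the cell.
                indices[i++] = v;
                indices[i++] = v + width;
                indices[i++] = v + 1;
                // Second triangle of the cell.
                indices[i++] = v + 1;
                indices[i++] = v + width;
                indices[i++] = v + width + 1;
            }
        }
        return indices;
    }
}
```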

StereoPhoto

This sample uses depth meshes and ScreenSpaceDepthMesh. After freezing the mesh, we cache the current camera's projection and view matrices, move the camera along a circular path, and perform projection mapping onto the depth mesh with the cached camera image. Press Capture to create the animated 3D photo and press Preview to go back to camera preview mode.

SnowParticles

This sample uses the GPU depth texture to compute collisions between snow particles and the physical environment, and to determine the orientation of each snowflake.

Splat

This sample uses the Oriented Reticle and the depth mesh in placing a surface-aligned texture decal within the physical environment.

Water

This sample uses a modified GPU occlusion shader to create a flooding effect with artificial water in the physical environment.

Helper classes

DepthSource

A singleton instance of this class holds references to the CPU array and GPU texture of the depth map, the camera intrinsics, and many other depth-lookup and coordinate-transformation utilities. This class acts as a high-level wrapper for the MotionStereoDepthDataSource class.

DepthTarget

Each GameObject containing a DepthTarget becomes a subscriber to the GPU depth data. DepthSource will automatically update the depth data for each DepthTarget. At least one instance of DepthTarget has to be present in the scene in order for DepthSource to provide depth data.
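The subscriber relationship can be pictured as a simple registry: each target registers a callback when it becomes active, and the source fans new depth data out to every registered target each frame. A minimal plain-C# sketch of the pattern (the names echo the sample's classes, but this implementation is illustrative only):

```csharp
using System;
using System.Collections.Generic;

public static class DepthSourceSketch
{
    static readonly List<Action<ushort[]>> subscribers = new List<Action<ushort[]>>();

    // A DepthTarget-like component registers a callback when enabled.
    public static void Subscribe(Action<ushort[]> onDepth) => subscribers.Add(onDepth);

    // Called once per frame with the latest depth image; notifies every target.
    public static void PublishDepth(ushort[] depthMm)
    {
        foreach (var s in subscribers) s(depthMm);
    }
}
```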

MotionStereoDepthDataSource

This class contains low-level operations and direct access to the depth data. It should only be used by advanced developers.

User privacy requirements

You must prominently disclose the use of Google Play Services for AR (ARCore) and how it collects and processes data in your application. This information must be easily accessible to end users. You can do this by adding the following text on your main menu or notice screen: "This application runs on Google Play Services for AR (ARCore), which is provided by Google LLC and governed by the Google Privacy Policy".

Related publication

Please refer to https://augmentedperception.github.io/depthlab/ for our paper, supplementary material, and presentation published in ACM UIST 2020: "DepthLab: Real-Time 3D Interaction With Depth Maps for Mobile Augmented Reality".

References

If you use ARCore Depth Lab in your research, please reference it as:

@inproceedings{Du2020DepthLab,
  title = {{DepthLab: Real-time 3D Interaction with Depth Maps for Mobile Augmented Reality}},
  author = {Du, Ruofei and Turner, Eric and Dzitsiuk, Maksym and Prasso, Luca and Duarte, Ivo and Dourgarian, Jason and Afonso, Joao and Pascoal, Jose and Gladstone, Josh and Cruces, Nuno and Izadi, Shahram and Kowdle, Adarsh and Tsotsos, Konstantine and Kim, David},
  booktitle = {Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology},
  year = {2020},
  publisher = {ACM},
  pages = {829--843},
  series = {UIST '20},
  doi = {10.1145/3379337.3415881}
}

or

Ruofei Du, Eric Turner, Maksym Dzitsiuk, Luca Prasso, Ivo Duarte, Jason Dourgarian, Joao Afonso, Jose Pascoal, Josh Gladstone, Nuno Cruces, Shahram Izadi, Adarsh Kowdle, Konstantine Tsotsos, and David Kim. 2020. DepthLab: Real-Time 3D Interaction With Depth Maps for Mobile Augmented Reality. Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (UIST '20), 829-843. DOI: http://dx.doi.org/10.1145/3379337.3415881.

We would like to also thank Levana Chen, Xinyun Huang, and Ted Bisson for integrating DepthLab with AR Foundation.

Additional information

You may use this software under the Apache 2.0 License.
