  • Stars: 2,904
  • Rank: 15,008 (top 0.4%)
  • Language: C#
  • License: Other
  • Created: almost 6 years ago
  • Updated: 30 days ago


Repository Details

Example content for Unity projects based on AR Foundation

AR Foundation Samples

Example AR scenes that use AR Foundation 5.1 and demonstrate its features. Each feature is used in a minimal sample scene with example code that you can modify or copy into your project.

This sample project depends on four Unity packages:

Which version should I use?

The main branch of this repository uses AR Foundation 5.1 and is compatible with Unity 2021.2 and later. To access sample scenes for previous versions of AR Foundation, refer to the table below for links to other branches.

Unity Version    AR Foundation Version
2023.2 (alpha)   5.1 (prerelease)
2023.1 (beta)    5.1 (prerelease)
2022.2           5.0
2021.3           4.2
2020.3           4.1

How to use these samples

Build and run on device

You can build the AR Foundation Samples project directly to device, which can be a helpful introduction to using AR Foundation features for the first time.

To build to device, follow the steps below:

  1. Install Unity 2021.2 or later and clone this repository.

  2. Open the Unity project at the root of this repository.

  3. As with any other Unity project, go to Build Settings, select your target platform, and build this project.

Understand the sample code

All sample scenes in this project can be found in the Assets/Scenes folder. To learn more about the AR Foundation components used in each scene, see the AR Foundation Documentation. Each scene is explained in more detail below.

Table of Contents

Sample scene(s) and descriptions:

  • Simple AR: Demonstrates basic Plane detection and Raycasting
  • Camera: Scenes that demonstrate Camera features
  • Plane detection: Scenes that demonstrate Plane detection
  • Image tracking: Scenes that demonstrate Image tracking
  • Object tracking: Demonstrates Object tracking
  • Face tracking: Scenes that demonstrate Face tracking
  • Body tracking: Scenes that demonstrate Body tracking
  • Point clouds: Demonstrates Point clouds
  • Anchors: Demonstrates Anchors
  • Meshing: Scenes that demonstrate Meshing
  • Environment Probes: Demonstrates Environment Probes
  • Occlusion: Scenes that demonstrate Occlusion
  • Check support: Demonstrates checking for AR support on device
  • Interaction: Demonstrates AR Foundation paired with the XR Interaction Toolkit package
  • Configuration Chooser: Demonstrates AR Foundation's Configuration Chooser
  • Debug Menu: Visualize trackables and configurations on device
  • ARKit: iOS-specific sample scenes
  • ARCore Record Session: Demonstrates ARCore's session recording feature

Simple AR

This is a good starting sample that enables point cloud visualization and plane detection. There are buttons on screen that let you pause, resume, reset, and reload the ARSession.

When a plane is detected, you can tap on the detected plane to place a cube on it. This uses the ARRaycastManager to perform a raycast against the plane. If the plane is in TrackingState.Limited, it will highlight red. In the case of ARCore, this means that raycasting will not be available until the plane is in TrackingState.Tracking again.
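As a rough illustration of that flow (a hedged sketch, not the scene's actual script; the component and field names here are made up), a tap-to-place handler built on ARRaycastManager might look like:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative tap-to-place component (not the sample's own script).
public class TapToPlaceCube : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject cubePrefab;

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Raycast against detected planes; hits are sorted by distance from the camera.
        if (raycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            var hitPose = s_Hits[0].pose;
            Instantiate(cubePrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

TrackableType.PlaneWithinPolygon restricts hits to points inside a detected plane's boundary.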

  • Pause: Pauses the ARSession, meaning device tracking and trackable detection (e.g., plane detection) is temporarily paused. While paused, the ARSession does not consume CPU resources.
  • Resume: Resumes a paused ARSession. The device will attempt to relocalize, and previously detected objects may shift around as tracking is reestablished.
  • Reset: Clears all detected trackables and effectively begins a new ARSession.
  • Reload: Completely destroys the ARSession GameObject and re-instantiates it. This simulates the behavior you might experience during scene switching.

Camera

CPU Images

This sample shows how to acquire and manipulate textures obtained from AR Foundation on the CPU. Most textures in AR Foundation (e.g., the pass-through video supplied by the ARCameraManager, and the human depth and human stencil buffers provided by the AROcclusionManager) are GPU textures. Computer vision and other CPU-based applications often require the pixel buffers on the CPU, which would normally involve an expensive GPU readback. AR Foundation provides an API for obtaining these textures on the CPU for further processing, without incurring the costly GPU readback.

The relevant script is CpuImageSample.cs.
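As a minimal sketch of the underlying API (assuming an ARCameraManager reference wired up in the Inspector; the class name is illustrative and not taken from the sample):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative reader; CpuImageSample.cs in the project does more (display, depth, stencil).
public class CameraCpuImageReader : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Acquire the latest camera image on the CPU; it must be disposed when done.
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image)
        {
            var conversionParams = new XRCpuImage.ConversionParams(
                image, TextureFormat.RGBA32, XRCpuImage.Transformation.MirrorY);

            var buffer = new NativeArray<byte>(
                image.GetConvertedDataSize(conversionParams), Allocator.Temp);

            // Convert the native image planes to RGBA32 without a GPU readback.
            image.Convert(conversionParams, buffer);

            // ... run computer vision or copy the buffer into a Texture2D here ...

            buffer.Dispose();
        }
    }
}
```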

The resolution of the camera image is affected by the camera's configuration. The current configuration is indicated at the bottom left of the screen inside a dropdown box which lets you select one of the supported camera configurations. The CameraConfigController.cs demonstrates enumerating and selecting a camera configuration. It is on the CameraConfigs GameObject.
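Enumerating and applying a configuration boils down to ARCameraManager.GetConfigurations and the currentConfiguration property; a hedged sketch (CameraConfigController.cs is the full version):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative only: list the supported camera configurations and apply the first one.
public class CameraConfigExample : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void Start()
    {
        using (var configurations = cameraManager.GetConfigurations(Allocator.Temp))
        {
            foreach (var config in configurations)
                Debug.Log($"{config.width}x{config.height} @ {config.framerate}");

            if (configurations.Length > 0)
                cameraManager.currentConfiguration = configurations[0];
        }
    }
}
```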

Where available (currently iOS 13+ only), the human depth and human stencil textures are also available on the CPU. These appear inside two additional boxes underneath the camera's image.

Basic Light Estimation

Demonstrates basic light estimation information from the camera frame. You should see values for "Ambient Intensity" and "Ambient Color" on screen. The relevant script is BasicLightEstimation.cs.
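The values are delivered through ARCameraManager.frameReceived as ARLightEstimationData; a minimal sketch of reading them (an illustrative component, not the sample's script), assuming light estimation is enabled on the ARCameraManager:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative subscriber; BasicLightEstimation.cs does something similar for the UI.
public class LightEstimationLogger : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        var light = args.lightEstimation;

        // Each field is nullable; it has a value only when the platform provides it.
        if (light.averageBrightness.HasValue)
            Debug.Log($"Ambient intensity: {light.averageBrightness.Value}");

        if (light.colorCorrection.HasValue)
            Debug.Log($"Ambient color: {light.colorCorrection.Value}");
    }
}
```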

HDR Light Estimation

This sample attempts to read HDR lighting information. You should see values for "Ambient Intensity", "Ambient Color", "Main Light Direction", "Main Light Intensity Lumens", "Main Light Color", and "Spherical Harmonics". Most devices only support a subset of these six, so some will be listed as "Unavailable." The relevant script is HDRLightEstimation.cs.

On iOS, this is only available when face tracking is enabled and requires a device that supports face tracking (such as an iPhone X, XS or 11). When available, a virtual arrow appears in front of the camera which indicates the estimated main light direction. The virtual light direction is also updated, so that virtual content appears to be lit from the direction of the real light source.

When using HDRLightEstimation, the sample will automatically pick the supported camera facing direction for you, for example World on Android and User on iOS, so it does not matter which facing direction you select in the ARCameraManager component.

Background Rendering Order

Produces a visual example of how changing the background rendering order between BeforeOpaqueGeometry and AfterOpaqueGeometry affects a rudimentary AR application. Leverages occlusion, where available, to demonstrate AfterOpaqueGeometry support for AR occlusion.

Camera Grain (ARKit)

This sample demonstrates the camera grain effect. Once a plane is detected, you can place a cube on it with a material that simulates the camera grain noise in the camera feed. See the CameraGrain.cs script. Also see CameraGrain.shader which animates and applies the camera grain texture (through linear interpolation) in screenspace.

This sample requires a device running iOS 13 or later and Unity 2020.2 or later.

EXIF Data

This sample demonstrates how to access the camera frame's EXIF metadata. You should see values for all the supported EXIF tags on screen. Refer to ExifDataLogger.cs for more details.

This sample requires iOS 16 or newer.

Plane Detection

Feathered Planes

This sample demonstrates basic plane detection, but uses a better looking prefab for the ARPlane. Rather than being drawn as exactly defined, the plane fades out towards the edges.

Toggle Plane Detection

This sample shows how to toggle plane detection on and off. When off, it will also hide all previously detected planes by disabling their GameObjects. See PlaneDetectionController.cs.
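The core of that logic is small; a hedged sketch (PlaneDetectionController.cs in the project is the authoritative version):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative toggle: disable the manager and hide the planes it has already detected.
public class PlaneDetectionToggle : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    public void TogglePlaneDetection()
    {
        planeManager.enabled = !planeManager.enabled;

        // Hide or show previously detected planes by toggling their GameObjects.
        foreach (var plane in planeManager.trackables)
            plane.gameObject.SetActive(planeManager.enabled);
    }
}
```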

Plane Classification

This sample shows how to query for a plane's classification. Some devices attempt to classify planes into categories such as "door", "seat", "window", and "floor". This scene enables plane detection using the ARPlaneManager, and uses a prefab which includes a component which displays the plane's classification, or "none" if it cannot be classified.
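Reading the classification from an ARPlane looks roughly like this (an illustrative sketch assuming AR Foundation 5.x, where ARPlane exposes a classification property; this is not the sample's own prefab component):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative: attach to the ARPlane prefab to observe its classification.
[RequireComponent(typeof(ARPlane))]
public class PlaneClassificationLabel : MonoBehaviour
{
    ARPlane m_Plane;
    PlaneClassification m_LastLogged = PlaneClassification.None;

    void Awake() => m_Plane = GetComponent<ARPlane>();

    void Update()
    {
        // PlaneClassification.None is reported when the device cannot classify the plane.
        if (m_Plane.classification != m_LastLogged)
        {
            m_LastLogged = m_Plane.classification;
            Debug.Log($"Plane {m_Plane.trackableId} classified as {m_LastLogged}");
        }
    }
}
```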

Plane Masking

This sample demonstrates basic plane detection, but uses an occlusion shader for the plane's material. This makes the plane appear invisible, but virtual objects behind the plane are culled. This provides an additional level of realism when, for example, placing objects on a table.

Move the device around until a plane is detected (its edges are still drawn) and then tap on the plane to place/move content.

Image Tracking

There are two samples demonstrating image tracking. The image tracking samples are supported on ARCore and ARKit. To enable image tracking, you must first create an XRReferenceImageLibrary. This is the set of images to look for in the environment. Click here for instructions on creating one.

You can also add images to the reference image library at runtime. This sample includes a button that adds the images one.png and two.png to the reference image library. See the script DynamicLibrary.cs for example code.
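At its core, adding an image at runtime means casting the manager's reference library to MutableRuntimeReferenceImageLibrary and scheduling an add job. A hedged sketch (the texture, image name, and physical size below are illustrative; DynamicLibrary.cs handles validation and more cases):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative runtime add; the texture must be CPU-readable.
public class RuntimeImageAdder : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager;
    [SerializeField] Texture2D imageToAdd;

    public void AddImage()
    {
        if (trackedImageManager.referenceLibrary is MutableRuntimeReferenceImageLibrary mutableLibrary)
        {
            // The physical width in meters is optional but improves initial pose estimates.
            mutableLibrary.ScheduleAddImageWithValidationJob(imageToAdd, "my-image", 0.1f);
        }
    }
}
```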

Run the sample on an ARCore or ARKit-capable device and point your device at one of the images in Assets/Scenes/ImageTracking/Images. They can be displayed on a computer monitor; they do not need to be printed out.

Basic Image Tracking

At runtime, AR Foundation generates an ARTrackedImage for each detected reference image. This sample uses the TrackedImageInfoManager.cs script to overlay the original image on top of the detected image, along with some metadata.

Image Tracking With Multiple Prefabs

With the PrefabImagePairManager.cs script, you can assign a different prefab to each image in the reference image library.

You can also change prefabs at runtime. This sample includes a button that switches between the original and an alternative prefab for the first image in the reference image library. See the script DynamicPrefab.cs for example code.

Object Tracking

Similar to the image tracking sample, this sample detects a 3D object from a set of reference objects in an XRReferenceObjectLibrary. Click here for instructions on creating one.

To use this sample, you must have a physical object the device can recognize. The sample's reference object library is built using two reference objects. The sample includes printable templates which can be printed on 8.5x11 inch paper and folded into a cube and cylinder.

Alternatively, you can scan your own objects and add them to the reference object library.

This sample requires iOS 12 or above.

Face Tracking

There are several samples showing different face tracking features. Some are ARCore specific and some are ARKit specific.

Face Pose

This is the simplest face tracking sample and simply draws an axis at the detected face's pose.

This sample uses the front-facing (i.e., selfie) camera.

Face Mesh

This sample instantiates and updates a mesh representing the detected face. Information about the device support (e.g., number of faces that can be simultaneously tracked) is displayed on the screen.

This sample uses the front-facing (i.e., selfie) camera.

Face Regions (ARCore)

"Face regions" are an ARCore-specific feature which provides pose information for specific "regions" on the detected face, e.g., left eyebrow. In this example, axes are drawn at each face region. See the ARCoreFaceRegionManager.cs.

This sample uses the front-facing (i.e., selfie) camera.

Blend Shapes (ARKit)

"Blend shapes" are an ARKit-specific feature which provides information about various facial features on a scale of 0..1. For instance, "wink" and "frown". In this sample, blend shapes are used to puppet a cartoon face which is displayed over the detected face. See the ARKitBlendShapeVisualizer.cs.

This sample uses the front-facing (i.e., selfie) camera.
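Under the hood, the coefficients come from the ARKitFaceSubsystem. A rough sketch of reading them for a tracked face (illustrative only; the mapping to the cartoon face lives in ARKitBlendShapeVisualizer.cs):

```csharp
#if UNITY_IOS && !UNITY_EDITOR
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

// Illustrative: log each blend shape coefficient for a tracked face.
[RequireComponent(typeof(ARFace))]
public class BlendShapeLogger : MonoBehaviour
{
    [SerializeField] ARFaceManager faceManager;

    void Update()
    {
        var face = GetComponent<ARFace>();
        if (faceManager.subsystem is ARKitFaceSubsystem arkitFaceSubsystem)
        {
            using (var coefficients = arkitFaceSubsystem.GetBlendShapeCoefficients(face.trackableId, Allocator.Temp))
            {
                foreach (var coefficient in coefficients)
                    Debug.Log($"{coefficient.blendShapeLocation}: {coefficient.coefficient}");
            }
        }
    }
}
#endif
```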

Eye Lasers, Eye Poses, and Fixation Point (ARKit)

These samples demonstrate eye and fixation point tracking. Eye tracking produces a pose (position and rotation) for each eye in the detected face, and the "fixation point" is the point the face is looking at (i.e., fixated upon). EyeLasers uses the eye pose to draw laser beams emitted from the detected face.

This sample uses the front-facing (i.e., selfie) camera and requires an iOS device with a TrueDepth camera.

Rear Camera (ARKit)

iOS 13 adds support for face tracking while the world-facing (i.e., rear) camera is active. This means the user-facing (i.e., front) camera is used for face tracking, but the pass-through video uses the world-facing camera. To enable this mode in AR Foundation, you must enable an ARFaceManager, set the ARSession tracking mode to "Position and Rotation" or "Don't Care", and set the ARCameraManager's facing direction to "World". Tap the screen to toggle between the user-facing and world-facing cameras.
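In code, that configuration amounts to requesting the world-facing camera while face tracking is enabled. A hedged sketch, assuming the managers are assigned in the Inspector (the sample itself toggles this at runtime):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative configuration for ARKit face tracking with the world-facing camera.
public class RearCameraFaceTrackingSetup : MonoBehaviour
{
    [SerializeField] ARSession session;
    [SerializeField] ARCameraManager cameraManager;
    [SerializeField] ARFaceManager faceManager;

    void Start()
    {
        faceManager.enabled = true;

        // Face tracking with the rear camera requires "Position and Rotation" (or "Don't Care") tracking.
        session.requestedTrackingMode = TrackingMode.PositionAndRotation;

        // Request the world-facing (rear) camera for the pass-through video.
        cameraManager.requestedFacingDirection = CameraFacingDirection.World;
    }
}
```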

The sample code in DisplayFaceInfo.OnEnable shows how to detect support for these face tracking features.

When using the world-facing camera, a cube is displayed in front of the camera whose orientation is driven by the face in front of the user-facing camera.

This feature requires a device with a TrueDepth camera and an A12 bionic chip running iOS 13.

Body Tracking

Body Tracking 2D

This sample demonstrates 2D screen space body tracking. A 2D skeleton is generated when a person is detected. See the ScreenSpaceJointVisualizer.cs script.

This sample requires a device with an A12 bionic chip running iOS 13 or above.

Body Tracking 3D

This sample demonstrates 3D world space body tracking. A 3D skeleton is generated when a person is detected. See the HumanBodyTracker.cs script.

This sample requires a device with an A12 bionic chip running iOS 13 or above.

Point Clouds

This sample shows all feature points over time, not just the current frame's feature points as the "AR Default Point Cloud" prefab does. It does this by using a slightly modified version of the ARPointCloudParticleVisualizer component that stores all the feature points in a Dictionary. Since each feature point has a unique identifier, it can look up the stored point and update its position in the dictionary if it already exists. This can be a useful starting point for custom solutions that require the entire map of point cloud points, e.g., for custom mesh reconstruction techniques.
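The accumulation idea reduces to keying points by their identifiers; a minimal sketch (not the visualizer's actual code):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative accumulation of feature points across frames, keyed by identifier.
[RequireComponent(typeof(ARPointCloud))]
public class PointCloudAccumulator : MonoBehaviour
{
    readonly Dictionary<ulong, Vector3> m_Points = new Dictionary<ulong, Vector3>();

    ARPointCloud m_PointCloud;

    void Awake()     => m_PointCloud = GetComponent<ARPointCloud>();
    void OnEnable()  => m_PointCloud.updated += OnUpdated;
    void OnDisable() => m_PointCloud.updated -= OnUpdated;

    void OnUpdated(ARPointCloudUpdatedEventArgs args)
    {
        if (!m_PointCloud.positions.HasValue || !m_PointCloud.identifiers.HasValue)
            return;

        var positions = m_PointCloud.positions.Value;
        var identifiers = m_PointCloud.identifiers.Value;

        // Add new points; update the positions of points seen before.
        for (var i = 0; i < positions.Length; i++)
            m_Points[identifiers[i]] = positions[i];
    }
}
```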

This sample has two UI components:

  • A button in the lower left which allows you to switch between visualizing "All" the points and just those in the "Current Frame".
  • Text in the upper right which displays the number of points in each point cloud (ARCore & ARKit will only ever have one).

Anchors

This sample shows how to create anchors as the result of a raycast hit. The "Clear Anchors" button removes all created anchors. See the AnchorCreator.cs script.

This script can create two kinds of anchors (see the sketch after this list):

  1. If a feature point is hit, it creates a normal anchor at the hit pose using the GameObject.AddComponent<ARAnchor>() method.
  2. If a plane is hit, it creates an anchor "attached" to the plane using the AttachAnchor method.
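A hedged sketch of both cases, assuming an ARAnchorManager and ARPlaneManager are present in the scene (AnchorCreator.cs is the complete version):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative anchor creation from a raycast hit (not the sample's own script).
public class AnchorExample : MonoBehaviour
{
    [SerializeField] ARAnchorManager anchorManager;
    [SerializeField] ARPlaneManager planeManager;

    public ARAnchor CreateAnchor(ARRaycastHit hit)
    {
        // Case 2: the hit is on a plane, so attach the anchor to that plane.
        if ((hit.hitType & TrackableType.PlaneWithinPolygon) != 0)
        {
            var plane = planeManager.GetPlane(hit.trackableId);
            if (plane != null)
                return anchorManager.AttachAnchor(plane, hit.pose);
        }

        // Case 1: otherwise create a regular anchor at the hit pose.
        var anchorObject = new GameObject("Anchor");
        anchorObject.transform.SetPositionAndRotation(hit.pose.position, hit.pose.rotation);
        return anchorObject.AddComponent<ARAnchor>();
    }
}
```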

Meshing

These meshing scenes use features of some devices to construct meshes from scanned data of real world surfaces. These meshing scenes will not work on all devices.

For ARKit, this functionality requires at least iPadOS 13.4 running on a device with a LiDAR scanner.

Classification Meshes

This scene demonstrates mesh classification functionality. With mesh classification enabled, each triangle in the mesh surface is identified as one of several surface types. This sample scene creates submeshes for each classification type and renders each mesh type with a different color.

This scene only works on ARKit.

Normal Meshes

This scene renders an overlay on top of the real world scanned geometry illustrating the normal of the surface.

Occlusion Meshes

At first, this scene may appear to be doing nothing. However, it is rendering a depth texture on top of the scene based on the real world geometry, which allows the real world to occlude virtual content. The scene has a script on it that fires a red ball into the scene when you tap. To see the occlusion working, fire some red balls into a space and then move the device camera behind another real world object: the virtual red balls are occluded by the real world object.

Environment Probes

This sample demonstrates environment probes, a feature which attempts to generate a 3D texture from the real environment and applies it to reflection probes in the scene. The scene includes several spheres which start out completely black, but will change to shiny spheres which reflect the real environment when possible.

Occlusion

SimpleOcclusion

This sample demonstrates occlusion of virtual content by real world content through the use of environment depth images on supported Android and iOS devices.

Depth Images

This sample demonstrates raw depth texture images from different methods:

  • Environment depth (certain Android devices and Apple devices with the LiDAR sensor)
  • Human stencil (Apple devices with an A12 bionic chip (or later) running iOS 13 or later)
  • Human depth (Apple devices with an A12 bionic chip (or later) running iOS 13 or later)

Check Support

Demonstrates checking for AR support and logs the results to the screen. The relevant script is SupportChecker.cs.
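Support checking is driven by ARSession.CheckAvailability and ARSession.state; a minimal sketch (SupportChecker.cs in the project logs the results to the screen):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative startup availability check (not the sample's SupportChecker.cs).
public class SupportCheckExample : MonoBehaviour
{
    [SerializeField] ARSession session;

    IEnumerator Start()
    {
        // Asks the platform whether AR is supported and updates ARSession.state.
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
        }
        else
        {
            Debug.Log("AR is supported. Enabling the session.");
            session.enabled = true;
        }
    }
}
```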

Interaction

This sample scene demonstrates the functionality of the XR Interaction Toolkit package. In the scene, you are able to place a cube on a plane which you can translate, rotate and scale with gestures. See the XR Interaction Toolkit Documentation for more details.

Configuration Chooser

Demonstrates how to use the AR Foundation session's ConfigurationChooser to swap between rear and front-facing camera configurations.

Debug Menu

The AR Foundation Debug Menu allows you to visualize trackables and configurations on device.

ARKit

These samples are only available on iOS devices.

Coaching Overlay

The coaching overlay is an ARKit-specific feature which will overlay a helpful UI guiding the user to perform certain actions to achieve some "goal", such as finding a horizontal plane.

The coaching overlay can be activated automatically or manually, and you can set its goal. In this sample, we've set the goal to be "Any plane", and for it to activate automatically. This will display a special UI on the screen until a plane is found. There is also a button to activate it manually.

The sample includes a MonoBehaviour to define the settings of the coaching overlay. See ARKitCoachingOverlay.cs.

This sample also shows how to subscribe to ARKit session callbacks. See CustomSessionDelegate.

This sample requires iOS 13 or above.

Thermal State

This sample contains the code required to query for an iOS device's thermal state so that the thermal state may be used with C# game code. This sample illustrates how the thermal state may be used to disable AR Foundation features to reduce the thermal state of the device.

AR World Map

An ARWorldMap is an ARKit-specific feature which lets you save a scanned area. ARKit can optionally relocalize to a saved world map at a later time. This can be used to synchronize multiple devices to a common space, or for curated experiences specific to a location, such as a museum exhibition or other special installation. Read more about world maps here. A world map will store most types of trackables, such as reference points and planes.

The ARWorldMapController.cs performs most of the logic in this sample.
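Saving a map goes through the ARKitSessionSubsystem. A hedged sketch of the request (serialization to disk and loading are omitted; ARWorldMapController.cs covers the full flow):

```csharp
#if UNITY_IOS && !UNITY_EDITOR
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

// Illustrative: request the current ARWorldMap from a running ARKit session.
public class WorldMapSaver : MonoBehaviour
{
    [SerializeField] ARSession session;

    public IEnumerator SaveWorldMap()
    {
        if (!(session.subsystem is ARKitSessionSubsystem arkitSession) ||
            !ARKitSessionSubsystem.worldMapSupported)
            yield break;

        var request = arkitSession.GetARWorldMapAsync();
        while (!request.status.IsDone())
            yield return null;

        if (request.status.IsError())
        {
            Debug.LogError($"World map request failed: {request.status}");
        }
        else
        {
            // Serialize the map to bytes and write it to disk here.
            var worldMap = request.GetWorldMap();
            Debug.Log("Obtained an ARWorldMap.");
            worldMap.Dispose();
        }

        request.Dispose();
    }
}
#endif
```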

This sample requires iOS 12 or above.

Geo Anchors

ARKit's ARGeoAnchors are not yet supported by ARFoundation, but you can still access this feature with a bit of Objective-C. This sample uses a custom ConfigurationChooser to instruct the Apple ARKit XR Plug-in to use an ARGeoTrackingConfiguration.

This sample also shows how to interpret the nativePtr provided by the XRSessionSubsystem as an ARKit ARSession pointer.

This sample requires an iOS device running iOS 14.0 or later, an A12 chip or later, location services enabled, and cellular capability.

AR Collaboration Data

Similar to an ARWorldMap, a "collaborative session" is an ARKit-specific feature which allows multiple devices to share session information in real time. Each device will periodically produce ARCollaborationData which should be sent to all other devices in the collaborative session. ARKit will share each participant's pose and all reference points. Other types of trackables, such as detected planes, are not shared.

See CollaborativeSession.cs. Note there are two types of collaboration data: "Critical" and "Optional". "Critical" data is available periodically and should be sent to all other devices reliably. "Optional" data is available nearly every frame and may be sent unreliably. Data marked as "optional" includes data about the device's location, which is why it is produced very frequently (i.e., every frame).

Note that ARKit's support for collaborative sessions does not include any networking; it is up to the developer to manage the connection and send data to other participants in the collaborative session. For this sample, we used Apple's MultipeerConnectivity Framework. Our implementation can be found here.

You can create reference points by tapping on the screen. Reference points are created when the tap results in a raycast which hits a point in the point cloud.

This sample requires iOS 13 or above.

High Resolution CPU Image

This sample demonstrates high resolution CPU image capture on iOS 16 and newer. See the High Resolution CPU Image package documentation to learn more about this feature.

Camera Exposure

This sample shows how to lock the device camera and set the camera exposure mode, duration, and ISO. See CameraExposureController.cs for example code.

This sample requires iOS 16 or newer and a device with an ultra-wide camera.

Camera White Balance

This sample shows how to lock the device camera and set the camera white balance mode and gains. See CameraWhiteBalanceController.cs for example code.

This sample requires iOS 16 or newer and a device with an ultra-wide camera.

Camera Focus

This sample shows how to lock the device camera and set the camera focus mode and lens position. See CameraFocusController.cs for example code.

This sample requires iOS 16 or newer and a device with an ultra-wide camera.

ARCore Record Session

This sample demonstrates the session recording and playback functionality available in ARCore. This feature allows you to record the sensor and camera telemetry during a live session, and then replay it at a later time. When replayed, ARCore runs on the target device using the recorded telemetry rather than live data. See ARCoreSessionRecorder.cs for example code.

Additional demos

While no longer actively maintained, Unity has a separate AR Foundation Demos repository that contains some larger samples including localization, mesh placement, shadows, and user onboarding UX.

Community and Feedback

The intention of this repository is to provide a means for getting started with the features in AR Foundation. The samples are intentionally simplistic, with a focus on teaching basic scene setup and APIs. If you have a question, find a bug, or would like to request a new feature concerning any of the AR Foundation packages or these samples, please submit a GitHub issue. New issues are reviewed regularly.

Contributions and Pull Requests

We are not accepting pull requests at this time. If you find an issue with the samples, or would like to request a new sample, please submit a GitHub issue.
