DISCONTINUATION OF PROJECT
This project will no longer be maintained by Intel.
Intel has ceased development and contributions to this project, including, but not limited to, maintenance, bug fixes, new releases, and updates.
Intel no longer accepts patches to this project.
If you have an ongoing need to use this project, are interested in independently developing it, or would like to maintain patches for the open source software community, please create your own fork of this project.
Contact: [email protected]
Depth camera capture in HTML5
- Moving boxes using hands (or a paper): this demo shows a live depth-captured mesh interacting with scene objects, combining rendering of the 3D world with depth-captured hands (or other objects) and Bullet Physics. Run the live demo.
- Simple background removal, implemented as a flood fill from the background color to similarly colored pixels (sketched below). It works only with simple backgrounds, e.g. the room walls in the demo gif. Check the tutorial article and run the live demo.
- Typing in the air: this tutorial shows how to use the depth stream and WebGL transform feedback to do simple gesture recognition (sketched below). Check the tutorial article and run the live demo.
- 3D point cloud rendering: this demo shows how to render and synchronize depth and color video on the GPU (the depth-to-3D step is sketched below). Check the tutorial article and run the live demo.
- HTML5 Depth Capture tutorial shows how to access the depth stream; check the tutorial article and run the live demo.
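For orientation, here is what the flood-fill background removal boils down to, as a minimal CPU-side sketch rather than the demo's actual code: starting from a seed pixel assumed to be background (the top-left corner here), spread to neighbouring pixels whose color is within a tolerance and make them transparent. The `removeBackground` function and its `tolerance` parameter are hypothetical names, not part of the demo.

```js
// Minimal flood-fill background removal sketch (not the demo's actual code).
// Assumes the seed pixel belongs to the background and `tolerance` is a
// hypothetical per-channel color distance threshold.
function removeBackground(imageData, seedX, seedY, tolerance) {
  const { width, height, data } = imageData;
  const index = (x, y) => (y * width + x) * 4;
  const seed = index(seedX, seedY);
  const target = [data[seed], data[seed + 1], data[seed + 2]];
  const similar = (i) =>
    Math.abs(data[i] - target[0]) <= tolerance &&
    Math.abs(data[i + 1] - target[1]) <= tolerance &&
    Math.abs(data[i + 2] - target[2]) <= tolerance;

  const visited = new Uint8Array(width * height);
  const stack = [[seedX, seedY]];
  while (stack.length > 0) {
    const [x, y] = stack.pop();
    if (x < 0 || y < 0 || x >= width || y >= height) continue;
    const flat = y * width + x;
    if (visited[flat]) continue;
    visited[flat] = 1;
    const i = index(x, y);
    if (!similar(i)) continue;   // stop at pixels that don't match the background
    data[i + 3] = 0;             // make the background pixel fully transparent
    stack.push([x + 1, y], [x - 1, y], [x, y + 1], [x, y - 1]);
  }
  return imageData;
}
```

Applied to a canvas, this would look like `ctx.putImageData(removeBackground(ctx.getImageData(0, 0, w, h), 0, 0, 24), 0, 0)`.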
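The transform feedback technique used in the typing-in-the-air tutorial lets a WebGL2 vertex shader process the depth samples and hand the results back to JavaScript without rasterizing anything. The sketch below shows only that mechanism with a placeholder shader; the tutorial's actual gesture-recognition shader and buffer layout are different, and error checking is omitted for brevity.

```js
// WebGL2 transform feedback sketch: run a vertex shader over an array of
// depth values and read the captured output back to JavaScript.
const vsSource = `#version 300 es
in float inDepth;
out float outDepth;
void main() {
  outDepth = inDepth * 2.0; // placeholder "processing"
}`;
const fsSource = `#version 300 es
precision mediump float;
out vec4 unused;
void main() { unused = vec4(0.0); }`;

function createProgram(gl) {
  const program = gl.createProgram();
  for (const [type, source] of [[gl.VERTEX_SHADER, vsSource],
                                [gl.FRAGMENT_SHADER, fsSource]]) {
    const shader = gl.createShader(type);
    gl.shaderSource(shader, source);
    gl.compileShader(shader);
    gl.attachShader(program, shader);
  }
  // The varyings to capture must be declared before linking.
  gl.transformFeedbackVaryings(program, ['outDepth'], gl.SEPARATE_ATTRIBS);
  gl.linkProgram(program);
  return program;
}

function processDepth(gl, depthSamples /* Float32Array */) {
  const program = createProgram(gl);
  gl.useProgram(program);

  // Input: one float per "vertex".
  const input = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, input);
  gl.bufferData(gl.ARRAY_BUFFER, depthSamples, gl.STATIC_DRAW);
  const loc = gl.getAttribLocation(program, 'inDepth');
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 1, gl.FLOAT, false, 0, 0);

  // Output buffer that receives the captured varying.
  const output = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, output);
  gl.bufferData(gl.ARRAY_BUFFER, depthSamples.byteLength, gl.STATIC_READ);
  gl.bindBuffer(gl.ARRAY_BUFFER, null); // must not stay bound during the draw

  const tf = gl.createTransformFeedback();
  gl.bindTransformFeedback(gl.TRANSFORM_FEEDBACK, tf);
  gl.bindBufferBase(gl.TRANSFORM_FEEDBACK_BUFFER, 0, output);

  gl.enable(gl.RASTERIZER_DISCARD); // vertex stage only, no fragments needed
  gl.beginTransformFeedback(gl.POINTS);
  gl.drawArrays(gl.POINTS, 0, depthSamples.length);
  gl.endTransformFeedback();
  gl.disable(gl.RASTERIZER_DISCARD);

  gl.bindBufferBase(gl.TRANSFORM_FEEDBACK_BUFFER, 0, null);
  gl.bindTransformFeedback(gl.TRANSFORM_FEEDBACK, null);

  // Read the processed values back.
  const result = new Float32Array(depthSamples.length);
  gl.bindBuffer(gl.ARRAY_BUFFER, output);
  gl.getBufferSubData(gl.ARRAY_BUFFER, 0, result);
  return result;
}
```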
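The key math in the point cloud demo is deprojecting each depth pixel into a 3D camera-space point through the pinhole camera model using the depth camera's intrinsics. Here is a small sketch of that step; the demo performs the equivalent math in a shader, and the `intrinsics` object below (principal point, focal length, depth scale) is a hypothetical shape standing in for however those values are actually obtained.

```js
// Deproject one depth pixel (u, v, rawDepth) to a 3D point in camera space.
// `intrinsics` is a hypothetical structure: principal point (cx, cy) and focal
// length (fx, fy) in pixels, plus a scale converting raw depth units to meters.
function deproject(u, v, rawDepth, intrinsics) {
  const { cx, cy, fx, fy, depthScale } = intrinsics;
  const z = rawDepth * depthScale;   // depth in meters
  const x = ((u - cx) / fx) * z;     // pinhole camera model
  const y = ((v - cy) / fy) * z;
  return [x, y, z];
}

// Build a point cloud from a width x height depth frame (e.g. a Uint16Array).
function pointCloud(depthFrame, width, height, intrinsics) {
  const points = [];
  for (let v = 0; v < height; v++) {
    for (let u = 0; u < width; u++) {
      const d = depthFrame[v * width + u];
      if (d !== 0) points.push(deproject(u, v, d, intrinsics)); // 0 = no data
    }
  }
  return points;
}
```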
To capture and manipulate a depth camera stream in HTML5, you'll need:
- Chrome browser version 62 or later (the official release; no additional extensions needed),
- an Intel® RealSense™ 3D camera plugged into a USB 3.0 port:
  - SR300 (or related cameras like the Razer Stargazer or Creative BlasterX Senz3D) or R200,
- a Windows, Linux or ChromeOS PC.
These are the constraints of the current implementation; the plan is to support other depth cameras, as well as OS X and Android. A minimal sketch of opening the depth stream in the browser follows.
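As a rough illustration of the capture itself, the depth stream can be selected and opened with standard media capture calls. Two assumptions here, not taken from this README: the depth sensor shows up as a regular `videoinput` device whose label contains "Depth", and device labels are only reported after the page has been granted camera access at least once. The demos ship their own, more robust selection helper.

```js
// Sketch: open the depth stream with standard media capture APIs.
async function getDepthStream() {
  // Ask for any camera first so that enumerateDevices() reports device labels,
  // then immediately stop the probe stream.
  const probe = await navigator.mediaDevices.getUserMedia({ video: true });
  probe.getTracks().forEach((track) => track.stop());

  const devices = await navigator.mediaDevices.enumerateDevices();
  const depthDevice = devices.find(
    (d) => d.kind === 'videoinput' && /depth/i.test(d.label)
  );
  if (!depthDevice) throw new Error('No depth camera found');

  // Request the depth device explicitly by its id.
  return navigator.mediaDevices.getUserMedia({
    video: { deviceId: { exact: depthDevice.deviceId } },
  });
}

// Usage: attach the stream to a <video> element for further processing.
// getDepthStream().then((stream) => { videoElement.srcObject = stream; });
```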