Dynablox
An online volumetric mapping-based approach for real-time detection of diverse dynamic objects in complex environments.
Table of Contents
- Credits
- Setup
- Examples
Paper
If you find this package useful for your research, please consider citing our paper:
- Lukas Schmid, Olov Andersson, Aurelio Sulser, Patrick Pfreundschuh, and Roland Siegwart. "Dynablox: Real-time Detection of Diverse Dynamic Objects in Complex Environments" in ArXiv Preprint, 2023. [ ArXiv | Video ]
@inproceedings{schmid2023dynablox,
  title={Dynablox: Real-time Detection of Diverse Dynamic Objects in Complex Environments},
  author={Schmid, Lukas and Andersson, Olov and Sulser, Aurelio and Pfreundschuh, Patrick and Siegwart, Roland},
  booktitle={ArXiv Preprint 2304.10049},
  year={2023},
  volume={},
  number={},
  pages={},
  doi={}
}
Video
A brief overview of the problem, approach, and results is available on YouTube.
Setup
There is a Docker image available for this package; check the usage instructions on the Docker Hub page.
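A minimal sketch of using such an image, with a placeholder image name (substitute the actual name listed on Docker Hub):

```bash
# Pull the image (placeholder name -- use the one from the Docker Hub page).
docker pull <dockerhub-user>/dynablox
# Run it with host networking so the container can reach a roscore on the host.
docker run -it --net=host <dockerhub-user>/dynablox
```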
Installation
- Note on Versioning: This package was developed on Ubuntu 20.04 using ROS Noetic. Other versions may also work, but support cannot be guaranteed.
1. If not already done so, install ROS. We recommend using `Desktop-Full`.

2. If not already done so, set up a catkin workspace:

        mkdir -p ~/catkin_ws/src
        cd ~/catkin_ws
        catkin init
        catkin config --extend /opt/ros/$ROS_DISTRO
        catkin config --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo
        catkin config --merge-devel
3. Install system dependencies:

        sudo apt-get install python3-vcstool python3-catkin-tools ros-$ROS_DISTRO-cmake-modules protobuf-compiler autoconf git rsync -y
4. Clone the repo using SSH keys:

        cd ~/catkin_ws/src
        git clone git@github.com:ethz-asl/dynablox.git
5. Install ROS dependencies:

        cd ~/catkin_ws/src
        vcs import . < ./dynablox/ssh.rosinstall --recursive
6. Build:

        catkin build dynablox_ros
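After a successful build, source the workspace so ROS can find the Dynablox packages (a standard catkin step, shown here for the workspace layout used above):

```bash
source ~/catkin_ws/devel/setup.bash
```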
Datasets
To run the demos we use the Urban Dynamic Objects LiDAR (DOALS) Dataset. To download the data and pre-process it for our demos, use the provided script:
roscd dynablox_ros/scripts
# Or your preferred data destination.
./download_doals_data.sh /home/$USER/data/DOALS
**Note:** The dataset will be released shortly!

We further collected a new dataset featuring diverse dynamic objects in complex scenes. To download the processed, ready-to-run data for our demos, use the provided script:
roscd dynablox_ros/scripts
# Or your preferred data destination.
./download_dynablox_data.sh /home/$USER/data/Dynablox
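As a quick sanity check after downloading, the bags can be inspected with standard ROS tooling; the path below assumes the destination used above and the `ramp_1` sequence referenced later in this README:

```bash
rosbag info /home/$USER/data/Dynablox/processed/ramp_1.bag
```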
Examples
Running a DOALS Sequence
1. If not already done so, download the DOALS dataset as explained in the Datasets section above.

2. Adjust the dataset path in `dynablox_ros/launch/run_experiment.launch`:

        <arg name="bag_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/bag.bag" />

   Alternatively, the path can be overridden on the command line; see the sketch after this list.

3. Run:

        roslaunch dynablox_ros run_experiment.launch

4. You should now see dynamic objects being detected as the sensor moves through the scene.
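If you prefer not to edit the launch file, `roslaunch` can override a declared argument default from the command line; a minimal sketch, assuming `bag_file` is exposed as a top-level argument of `run_experiment.launch` as shown above:

```bash
roslaunch dynablox_ros run_experiment.launch \
    bag_file:=/home/$USER/data/DOALS/hauptgebaeude/sequence_1/bag.bag
```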
Running a Dynablox Sequence
**Note:** The dataset will be released shortly!

1. If not already done so, download the Dynablox dataset as explained in the Datasets section above.

2. Adjust the dataset path in `dynablox_ros/launch/run_experiment.launch` and set `use_doals` to false:

        <arg name="use_doals" default="false" />
        <arg name="bag_file" default="/home/$(env USER)/data/Dynablox/processed/ramp_1.bag" />

3. Run:

        roslaunch dynablox_ros run_experiment.launch

4. You should now see dynamic objects being detected as the sensor moves through the scene.
Running and Evaluating an Experiment
Running an Experiment
1. If not already done so, download the DOALS dataset as explained in the Datasets section above.

2. Adjust the dataset path in `dynablox_ros/launch/run_experiment.launch`:

        <arg name="bag_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/bag.bag" />

3. In `dynablox_ros/launch/run_experiment.launch`, set the `evaluate` flag, adjust the ground truth data path, and specify where to store the generated output data (a quick way to inspect this folder is sketched after this list):

        <arg name="evaluate" default="true" />
        <arg name="eval_output_path" default="/home/$(env USER)/dynablox_output/" />
        <arg name="ground_truth_file" default="/home/$(env USER)/data/DOALS/hauptgebaeude/sequence_1/indices.csv" />

4. Run:

        roslaunch dynablox_ros run_experiment.launch

5. Wait until the dataset has finished processing; Dynablox should shut down automatically afterwards.
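For orientation, a hedged sketch of inspecting the generated output; the file names are the ones referenced elsewhere in this README, while the exact folder layout may differ:

```bash
# List everything written by the evaluated run.
ls -R /home/$USER/dynablox_output
# Expected contents (layout assumed): per-experiment data including clouds.csv
# (point-wise segmentation), timings.txt (run-times), and config.txt (the
# configuration used for the run).
```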
Analyzing the Data
- Printing the Detection Performance Metrics:
  - Run:

        roscd dynablox_ros/src/evaluation
        python3 evaluate_data.py /home/$USER/dynablox_output

  - You should now see the performance statistics for all experiments in that folder:

        1/1 data entries are complete.
        Data             object_IoU    object_Precision  object_Recall
        hauptgebaeude_1  89.8 +- 5.6   99.3 +- 0.4       90.3 +- 5.6
        All              89.8 +- 5.6   99.3 +- 0.4       90.3 +- 5.6

- Inspecting the Segmentation:
  - Run:

        roslaunch dynablox_ros cloud_visualizer.launch file_path:=/home/$USER/dynablox_output/clouds.csv

- Inspecting the Run-time and Configuration: Additional information is automatically stored in `timings.txt` and `config.txt` for each experiment.
Advanced Options
- Adding Drift to an Experiment: To run an experiment with drift, specify one of the pre-computed drift rollouts in `dynablox_ros/launch/run_experiment.launch` (a command-line sketch follows after this list):

        <arg name="drift_simulation_rollout" default="doals/hauptgebaeude/sequence_1/light_3.csv" />

  All pre-computed rollouts can be found in `drift_simulation/config/rollouts`. Note that the specified rollout needs to match the sequence being played. For each sequence, there exist 3 rollouts for each intensity. Alternatively, use `drift_simulation/launch/generate_drift_rollout.launch` to create new rollouts for other datasets.

- Changing the Configuration of Dynablox: All parameters of Dynablox are listed in `dynablox_ros/config/motion_detector/default.yaml`; feel free to tune the method for your use case!
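A minimal sketch of selecting a drift rollout without editing the launch file, assuming `drift_simulation_rollout` is exposed as a top-level argument of `run_experiment.launch` as shown above:

```bash
roslaunch dynablox_ros run_experiment.launch \
    drift_simulation_rollout:=doals/hauptgebaeude/sequence_1/light_3.csv
```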