
iPlanner: Imperative Path Planning. An end-to-end learning planning framework using a novel unsupervised imperative learning approach

Imperative Path Planner (iPlanner)

Overview

Welcome to the code repository for iPlanner: Imperative Path Planning. iPlanner is trained with an innovative Imperative Learning approach and uses only front-facing depth images for local path planning.

A video showing the functionalities of iPlanner is available here: Video

Keywords: Navigation, Local Planning, Imperative Learning

License

This code is released under the MIT License.

Author: Fan Yang
Maintainer: Fan Yang, [email protected]

The iPlanner package has been tested under ROS Noetic on Ubuntu 20.04. This is research code, and any fitness for a particular purpose is disclaimed.

Method

Installation

Dependencies

To run iPlanner, you need to install PyTorch. We recommend installing it through Anaconda; see the official Anaconda and PyTorch websites for their respective installation instructions.

Please follow the instructions provided in the INSTALL.md file to set up your environment and install the necessary packages. You can find the INSTALL.md file in the root directory of the project.

Simulation Environment Setup

Please refer to the Autonomous Exploration Development Environment repository (Website) to set up the Gazebo simulation environment, and switch to the noetic_rgbd_camera branch.

Building

To build the repository and point it at the right Python interpreter, use the command below:

catkin build iplanner_node -DPYTHON_EXECUTABLE=$(which python)

The Python interpreter should be the one you set up earlier, with PyTorch and PyPose installed. If you use an Anaconda environment, activate it first and verify which interpreter is active:

which python

Training

Go to the iplanner folder:

cd <your_imperative_planner_path>/iplanner

Pre-trained Network and Training Data

Download the pre-trained network weights plannernet.pt here and put the file into the models folder. Note that this pre-trained network has not been adapted to real-world data.

You can also collect data yourself, either inside the simulation environment or in the real world. Launch the data collection node:

roslaunch iplanner_node data_collector.launch

Provide the necessary topic names in config/data_params.yaml. The collected data is stored under data/CollectedData, with one subfolder per environment; the environment name is set in config/data_params.yaml under env_name.
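As an illustration, an excerpt of config/data_params.yaml might look like the following. The key names and values here are hypothetical — use the ones actually present in the file shipped with the repository:

```yaml
# Hypothetical excerpt of config/data_params.yaml — key names are
# illustrative only; consult the file in the repository.
env_name: forest              # subfolder created under data/CollectedData
depth_topic: /camera/depth/image
odom_topic: /state_estimation
main_freq: 2.5                # collection frequency [Hz]
```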

For each environment, the collected data is organized as follows:

Environment Data
├── camera
│   ├── camera.png
│   └── split.pt
├── camera_extrinsic.txt
├── cloud.ply
├── color_intrinsic.txt
├── depth
│   ├── depth.png
│   └── split.pt
├── depth_intrinsic.txt
├── maps
│   ├── cloud
│   │   └── tsdf1_cloud.txt
│   ├── data
│   │   └── tsdf1_map.txt
│   └── params
│       └── tsdf1_param.txt
└── odom_ground_truth.txt

You can download the example data we provided using the Google Drive link here.

Generating Training Data

Navigate to the iplanner folder within your project using the following command:

cd <<YOUR WORKSPACE>>/src/iPlanner/iplanner

Run the Python script to generate the training data. The environments for which data should be generated are specified in the file collect_list.txt. You can modify the data generation parameters in the config/data_generation.json file.

python data_generation.py

Once you have the training data ready, use the following command to start the training process. You can specify different training parameters in the config/training_config.json file.

python training_run.py

Run iPlanner ROS node

Launch the simulation environment without the default local planner:

roslaunch vehicle_simulator simulation_env.launch

Run the iPlanner ROS node without visualization:

roslaunch iplanner_node iplanner.launch

Or run the iPlanner ROS node with visualization:

roslaunch iplanner_node iplanner_viz.launch

Path Following

To ensure the planner executes the planned path correctly, you need to run an independent controller or path follower. Follow the steps below to set up the path follower using the provided launch file from the iplanner repository:

Download the default path follower, iplanner_path_follow, into your workspace. Navigate to your workspace's source directory using the following command:

cd <<YOUR WORKSPACE>>/src

Then clone the repository:

git clone https://github.com/MichaelFYang/iplanner_path_follow.git

Compile the path follower using the following command:

catkin build iplanner_path_follow

Please note that this repository is a fork of the path following component from CMU-Exploration. You are welcome to explore and try different controllers or path followers suitable for your specific robot platform.

Waypoint Navigation

To send waypoints through RViz, please download the RViz waypoint plugin. Navigate to your workspace's source directory using the following command:

cd <<YOUR WORKSPACE>>/src

Then clone the repository:

git clone https://github.com/MichaelFYang/waypoint_rviz_plugin.git

Compile the waypoint rviz plugin using the following command:

catkin build waypoint_rviz_plugin

SmartJoystick

Press the LB button on the joystick. When you see the following output on the screen:

Switch to Smart Joystick mode ...

the SmartJoystick feature is enabled. It takes the joystick command as the motion intention and runs iPlanner in the background for low-level obstacle avoidance.

Config files

The parameter file data_params.yaml is for data collection.

  • data_params.yaml contains:
    • main_freq — the ROS node running frequency
    • odom_associate_id — depending on the SLAM setup, the odometry base may not be under the robot base frame

The parameter file vehicle_sim.yaml is for the iPlanner ROS node.

  • vehicle_sim.yaml contains:
    • main_freq — the ROS node running frequency
    • image_flap — whether the incoming camera image needs to be flipped upside down, depending on the camera setup
    • crop_size — the size to which incoming camera images are cropped
    • is_fear_act — whether to use the predicted collision probability to stop
    • joyGoal_scale — the maximum goal distance sent by the joystick in SmartJoystick mode
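Putting the parameters above together, a vehicle_sim.yaml might look like the sketch below. The keys are the ones documented above, but the values are illustrative placeholders, not the shipped defaults:

```yaml
# Sketch of config/vehicle_sim.yaml — values are placeholders.
main_freq: 5            # ROS node running frequency [Hz]
image_flap: true        # flip the incoming image upside down
crop_size: [360, 640]   # crop incoming camera images to this size
is_fear_act: true       # stop based on the predicted collision probability
joyGoal_scale: 5.0      # max joystick goal distance in SmartJoystick mode [m]
```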

Reference

If you use this codebase in your research, we kindly ask that you cite our work:

  • Yang, F., Wang, C., Cadena, C., & Hutter, M. (2023). iPlanner: Imperative Path Planning. Robotics: Science and Systems Conference (RSS). Daegu, Republic of Korea, July 2023.
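For convenience, the citation above can be written as a BibTeX entry along these lines (the entry key and field layout are ours, reconstructed from the reference):

```bibtex
@inproceedings{yang2023iplanner,
  title     = {iPlanner: Imperative Path Planning},
  author    = {Yang, F. and Wang, C. and Cadena, C. and Hutter, M.},
  booktitle = {Robotics: Science and Systems (RSS)},
  address   = {Daegu, Republic of Korea},
  month     = {July},
  year      = {2023}
}
```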

Author

This codebase has been developed and maintained by Fan Yang. Should you have any queries or require further assistance, you may reach out to him at [email protected]
