Code for "Dense Object Nets: Learning Dense Visual Object Descriptors By and For Robotic Manipulation"

Updates

  • September 4, 2018: Tutorial and data now available! We have a tutorial available here, which walks step-by-step through getting this repo running.
  • June 26, 2019: We have updated the repo to pytorch 1.1 and CUDA 10. For code used for the experiments in the paper see here.

Dense Correspondence Learning in PyTorch

In this project we learn Dense Object Nets, i.e. dense descriptor networks for previously unseen, potentially deformable objects, and potentially classes of objects.

We also demonstrate using Dense Object Nets for robotic manipulation tasks.
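
The core idea, in brief: a fully-convolutional network maps an RGB image to a descriptor image with a D-dimensional vector per pixel, and correspondences between views (or between deformed configurations of an object) are found by nearest-neighbor lookup in descriptor space. The sketch below is only a toy illustration of that interface, not the network or API used in this repo; DenseDescriptorNet and best_match are hypothetical names.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseDescriptorNet(nn.Module):
    """Toy fully-convolutional network mapping an RGB image to a
    D-dimensional descriptor per pixel (illustrative stand-in only)."""
    def __init__(self, descriptor_dim=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, descriptor_dim, kernel_size=1),
        )

    def forward(self, rgb):                        # rgb: (B, 3, H, W)
        coarse = self.features(rgb)                # (B, D, H/4, W/4)
        # Upsample back to full resolution so every pixel gets a descriptor.
        return F.interpolate(coarse, size=rgb.shape[-2:],
                             mode="bilinear", align_corners=False)

def best_match(descriptor_image_b, query_descriptor):
    """Nearest-neighbor lookup: the pixel in image B whose descriptor is
    closest to a query descriptor taken from image A."""
    d, h, w = descriptor_image_b.shape
    flat = descriptor_image_b.reshape(d, -1)                  # (D, H*W)
    dists = torch.norm(flat - query_descriptor[:, None], dim=0)
    idx = torch.argmin(dists).item()
    return divmod(idx, w)                                     # (row, col)

if __name__ == "__main__":
    net = DenseDescriptorNet(descriptor_dim=3)
    image_a, image_b = torch.rand(1, 3, 240, 320), torch.rand(1, 3, 240, 320)
    desc_a, desc_b = net(image_a)[0], net(image_b)[0]
    query = desc_a[:, 120, 160]      # descriptor at pixel (120, 160) in image A
    print("match in B:", best_match(desc_b, query))

Picking a pixel on an object in one image and looking up its best match in another is exactly the operation that later lets a robot grasp the "same point" on an object across configurations.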

Dense Object Nets: Learning Dense Visual Object Descriptors By and For Robotic Manipulation

This is the reference implementation for our paper:

PDF | Video

Pete Florence*, Lucas Manuelli*, Russ Tedrake

Abstract: What is the right object representation for manipulation? We would like robots to visually perceive scenes and learn an understanding of the objects in them that (i) is task-agnostic and can be used as a building block for a variety of manipulation tasks, (ii) is generally applicable to both rigid and non-rigid objects, (iii) takes advantage of the strong priors provided by 3D vision, and (iv) is entirely learned from self-supervision. This is hard to achieve with previous methods: much recent work in grasping does not extend to grasping specific objects or other tasks, whereas task-specific learning may require many trials to generalize well across object configurations or other tasks. In this paper we present Dense Object Nets, which build on recent developments in self-supervised dense descriptor learning, as a consistent object representation for visual understanding and manipulation. We demonstrate they can be trained quickly (approximately 20 minutes) for a wide variety of previously unseen and potentially non-rigid objects. We additionally present novel contributions to enable multi-object descriptor learning, and show that by modifying our training procedure, we can either acquire descriptors which generalize across classes of objects, or descriptors that are distinct for each object instance. Finally, we demonstrate the novel application of learned dense descriptors to robotic manipulation. We demonstrate grasping of specific points on an object across potentially deformed object configurations, and demonstrate using class general descriptors to transfer specific grasps across objects in a class.
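
As the abstract notes, the descriptors are learned entirely from self-supervision: registered RGBD views of the same scene supply pixel matches and non-matches, and a pixelwise contrastive loss pulls matching descriptors together while pushing non-matches apart by at least a margin. Below is a minimal sketch of such a loss under those assumptions; the function name, tensor shapes, and margin value are illustrative, not the repo's implementation.

import torch
import torch.nn.functional as F

def pixelwise_contrastive_loss(desc_a, desc_b, matches_a, matches_b,
                               non_matches_a, non_matches_b, margin=0.5):
    """Sketch of a pixelwise contrastive loss between two descriptor images.

    desc_a, desc_b: (D, H, W) descriptor images for images A and B.
    matches_*:      (N, 2) pixel coordinates (row, col) of corresponding points.
    non_matches_*:  (M, 2) pixel coordinates of points known NOT to correspond.
    """
    def gather(desc, pixels):
        # Pull out the descriptor at each (row, col) location -> (K, D)
        return desc[:, pixels[:, 0], pixels[:, 1]].t()

    d_match = torch.norm(gather(desc_a, matches_a) - gather(desc_b, matches_b), dim=1)
    d_non   = torch.norm(gather(desc_a, non_matches_a) - gather(desc_b, non_matches_b), dim=1)

    match_loss = (d_match ** 2).mean()                      # pull matches together
    non_match_loss = (F.relu(margin - d_non) ** 2).mean()   # push non-matches beyond the margin
    return match_loss + non_match_loss

# Example call with random data (real matches come from the scene's 3D reconstruction):
desc_a, desc_b = torch.rand(3, 240, 320), torch.rand(3, 240, 320)
pix = lambda n: torch.randint(0, 240, (n, 2))
print(pixelwise_contrastive_loss(desc_a, desc_b, pix(64), pix(64), pix(256), pix(256)))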

Citing

If you find this code useful in your work, please consider citing:

@article{florencemanuelli2018dense,
  title={Dense Object Nets: Learning Dense Visual Object Descriptors By and For Robotic Manipulation},
  author={Florence, Peter and Manuelli, Lucas and Tedrake, Russ},
  journal={Conference on Robot Learning},
  year={2018}
}

Tutorial

Code Setup

Dataset

Training and Evaluation

Miscellaneous

Git management

To prevent the repo from growing in size, we recommend always doing "restart and clear outputs" before committing any Jupyter notebooks. If you'd like to save what your notebook looks like, you can always "download as .html", which is a great way to snapshot the state of that notebook and share it.
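
If you want to automate the output clearing, a small helper using the nbformat library can strip outputs in place. This is a sketch of one way to do it, not a script shipped with this repo.

import sys
import nbformat

def clear_outputs(path):
    """Strip cell outputs and execution counts from a notebook in place,
    roughly what "restart and clear outputs" does in Jupyter."""
    nb = nbformat.read(path, as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []
            cell.execution_count = None
    nbformat.write(nb, path)

if __name__ == "__main__":
    for notebook_path in sys.argv[1:]:
        clear_outputs(notebook_path)

Running it on a notebook before "git add" keeps the committed diff down to just the code and markdown cells.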

More Repositories

  1. drake (C++, 3,265 stars): Model-based design and verification for robotics.
  2. LabelFusion (Python, 385 stars): A Pipeline for Generating Ground Truth Labels for Real RGBD Data of Cluttered Scenes
  3. director (Python, 178 stars): A robotics interface and visualization framework, with extensive applications for working with http://drake.mit.edu
  4. gcs-science-robotics (Python, 155 stars): Motion Planning around Obstacles with Convex Optimization by Marcucci et al, 2023
  5. drake-external-examples (C++, 101 stars): Examples of how to use Drake in your own project.
  6. drake-ros (Python, 89 stars): Experimental prototyping (for now)
  7. LittleDog (MATLAB, 67 stars): Example of quadruped planning and control in drake using LittleDog, a small quadruped robot from Boston Dynamics.
  8. spartan (Python, 31 stars): A project repo for robotics research and applications using drake and director.
  9. models (Starlark, 25 stars): Shareable model files (urdf/sdf + meshes, etc) for our robotics projects
  10. drake-iiwa-driver (C++, 24 stars)
  11. xfoil (Fortran, 24 stars): Fork of Mark Drela's XFOIL software (http://raphael.mit.edu/xfoil) which builds using cmake.
  12. libbot (C, 23 stars): Deprecated git mirror of the svn repository formerly located at https://code.google.com/p/libbot. Please use the upstream https://github.com/libbot2/libbot2 or new fork https://github.com/RobotLocomotion/libbot2 instead.
  13. drake-blender (Python, 14 stars): Drake glTF Render Client-Server API using Blender
  14. meshConverters (C++, 12 stars): Simple wrapper on vcg to give small platform-independent mesh converters
  15. RobotLocomotion.github.io (HTML, 11 stars): Automatically generated documentation for Drake.
  16. Hubo (MATLAB, 11 stars): Humanoid robot from KAIST
  17. avl (Fortran, 10 stars): Fork of Mark Drela's AVL software (http://raphael.mit.edu/avl) which builds using cmake.
  18. robot-plan-runner (C++, 10 stars)
  19. 6-881-examples (Python, 7 stars)
  20. realsense2-lcm-driver (C++, 6 stars): Publishes RealSense data to LCM
  21. drake-ci (CMake, 6 stars): Continuous integration scripts for Drake.
  22. cmake (CMake, 5 stars): Some useful shared cmake utilities
  23. manip_dataset (Python, 5 stars): Repo with instructions on how to download the dataset
  24. drake-franka-driver (C++, 4 stars): Driver software for the Franka robots.
  25. homebrew-director (Ruby, 3 stars): Homebrew formulae for Drake.
  26. ros_drake (CMake, 2 stars)
  27. drake-schunk-driver (C++, 2 stars): Driver software for the Schunk gripper.
  28. bazel-external-data (Python, 2 stars): Handle external large files for use with Bazel.
  29. drake-python3.7 (C++, 2 stars): Work in progress. A fork of the Drake toolbox for model-based design and verification for robotics that uses Python 3.7 on Ubuntu 18.04 (Bionic Beaver) and pip to manage Python dependencies.
  30. ros-drake-vendor (CMake, 1 star): Maintainer scripts that package Drake in the ROS build farm
  31. ipopt-mirror (Fortran, 1 star)
  32. eigen-mirror (C++, 1 star)
  33. ros_drake-release (1 star)
  34. Pigeon (MATLAB, 1 star): Full pigeon model w/ Harvard
  35. pdc-ros (Python, 1 star): Simple wrapper for Kuka pytorch tools + ROS