• Stars: 656
• Rank: 68,675 (Top 2%)
• Language: Python
• License: MIT License
• Created: over 5 years ago
• Updated: 4 months ago


Repository Details

iGibson: A Simulation Environment to train Robots in Large Realistic Interactive Scenes

iGibson is a simulation environment providing fast visual rendering and physics simulation based on Bullet. iGibson ships with fifteen fully interactive, high-quality scenes and hundreds of large 3D scenes reconstructed from real homes and offices, and it is compatible with datasets like CubiCasa5K and 3D-Front, providing 8000+ additional interactive scenes. iGibson's features include domain randomization, integration with motion planners, and easy-to-use tools to collect human demonstrations. With these scenes and features, iGibson allows researchers to train and evaluate robotic agents that use visual signals to solve navigation and manipulation tasks such as opening doors, picking up and placing objects, or searching in cabinets.
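
Agents in environments like this are typically trained through a standard reset/step loop: reset the scene, read a visual observation, choose an action, and repeat until the episode ends. The sketch below uses a self-contained toy environment purely to illustrate that loop; the class, observation keys, and reward scheme are hypothetical stand-ins, not iGibson's actual API.

```python
import random

class ToyNavEnv:
    """Hypothetical stand-in for a visual navigation environment.

    Not the real iGibson API; it only illustrates the reset/step loop
    an agent runs against such a simulator.
    """

    def __init__(self, width=4, goal=(3, 3), max_steps=50):
        self.width = width
        self.goal = goal
        self.max_steps = max_steps

    def reset(self):
        self.pos = [0, 0]
        self.steps = 0
        return self._obs()

    def _obs(self):
        # A real simulator returns rendered RGB frames; here a flat dummy "image".
        return {"rgb": [0.0] * 12, "position": tuple(self.pos)}

    def step(self, action):
        # action: 0 moves +x, anything else moves +y (clamped to the grid)
        axis = 0 if action == 0 else 1
        self.pos[axis] = min(self.width - 1, self.pos[axis] + 1)
        self.steps += 1
        reached = tuple(self.pos) == self.goal
        done = reached or self.steps >= self.max_steps
        reward = 1.0 if reached else 0.0
        return self._obs(), reward, done, {}

# Standard interaction loop with a random policy.
env = ToyNavEnv()
obs = env.reset()
done = False
total = 0.0
while not done:
    action = random.choice([0, 1])
    obs, reward, done, info = env.step(action)
    total += reward
```

The same loop shape carries over unchanged when the environment is a full physics simulator; only the observation contents and action space grow richer.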

Latest Updates

[8/9/2021] Major update to iGibson to reach iGibson 2.0. For details, please refer to our arXiv preprint.

  • iGibson 2.0 supports object states, including temperature, wetness level, cleanliness level, and toggled and sliced states, which are necessary to cover a wider range of tasks.
  • iGibson 2.0 implements a set of predicate logic functions that map the simulator states to logic states like Cooked or Soaked.
  • iGibson 2.0 includes a virtual reality (VR) interface to immerse humans in its scenes to collect demonstrations.
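
The predicate-logic functions in the second bullet can be pictured as simple thresholding of continuous simulator state into binary logic states. The sketch below is a toy illustration of that idea; the function names, state keys, and threshold values are illustrative assumptions, not iGibson's actual implementation or constants.

```python
# Toy mapping from continuous simulator states to binary logic predicates,
# in the spirit of states like Cooked or Soaked. All thresholds are
# illustrative assumptions, not iGibson's real values.

def is_cooked(obj_state, cook_temp=70.0):
    # An object counts as Cooked once the maximum temperature it has
    # reached passes the (assumed) cooking threshold.
    return obj_state["max_temperature"] >= cook_temp

def is_soaked(obj_state, soak_level=0.5):
    # Soaked once the wetness level passes the (assumed) threshold.
    return obj_state["wetness"] >= soak_level

apple = {"max_temperature": 85.0, "wetness": 0.1}
towel = {"max_temperature": 20.0, "wetness": 0.9}

assert is_cooked(apple) and not is_soaked(apple)
assert is_soaked(towel) and not is_cooked(towel)
```

Keeping the mapping in pure functions of simulator state makes task success checks (e.g. "the food is Cooked") cheap to evaluate at every step.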

[12/1/2020] Major update to iGibson to reach iGibson 1.0. For details, please refer to our arXiv preprint.

  • Release of the iGibson dataset, which includes 15 fully interactive scenes and 500+ object models annotated with materials and physical attributes on top of existing 3D articulated models.
  • Compatibility with importing CubiCasa5K and 3D-Front scene descriptions, adding more than 8000 extra interactive scenes!
  • New features in iGibson: physically based rendering, 1-beam and 16-beam LiDAR, domain randomization, motion-planning integration, tools to collect human demos, and more!
  • Code refactoring, a better class structure, and cleanup.
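
Domain randomization, listed among the features above, generally means re-sampling visual and physical parameters every episode so that a trained policy does not overfit to one fixed scene configuration. A minimal, generic sketch of that pattern follows; the parameter names and ranges are illustrative assumptions, not iGibson's actual randomization settings.

```python
import random

# Parameter names and ranges are illustrative assumptions,
# not iGibson's actual domain randomization settings.
RANDOMIZATION_RANGES = {
    "floor_friction": (0.4, 1.0),   # physics parameter (float range)
    "light_intensity": (0.5, 1.5),  # rendering parameter (float range)
    "texture_id": (0, 9),           # which texture variant to apply (int range)
}

def sample_episode_params(rng):
    """Draw one set of randomized scene parameters for a training episode."""
    params = {}
    for name, (lo, hi) in RANDOMIZATION_RANGES.items():
        if isinstance(lo, int) and isinstance(hi, int):
            params[name] = rng.randint(lo, hi)   # inclusive integer draw
        else:
            params[name] = rng.uniform(lo, hi)   # continuous draw
    return params

# One fresh draw per episode; seeding keeps experiments reproducible.
rng = random.Random(0)
episode_params = [sample_episode_params(rng) for _ in range(3)]
```

The simulator would then apply each draw to the scene before `reset`, so every episode exposes the agent to slightly different appearance and physics.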

[05/14/2020] Added dynamic light support 🔦

[04/28/2020] Added support for Mac OSX 💻

Citation

If you use iGibson or its assets and models, please consider citing the following publications:

@inproceedings{li2022igibson,
  title     = {iGibson 2.0: Object-Centric Simulation for Robot Learning of Everyday Household Tasks},
  author    = {Li, Chengshu and Xia, Fei and Mart\'in-Mart\'in, Roberto and Lingelbach, Michael and Srivastava, Sanjana and Shen, Bokui and Vainio, Kent Elliott and Gokmen, Cem and Dharan, Gokul and Jain, Tanish and Kurenkov, Andrey and Liu, Karen and Gweon, Hyowon and Wu, Jiajun and Fei-Fei, Li and Savarese, Silvio},
  booktitle = {Proceedings of the 5th Conference on Robot Learning},
  pages     = {455--465},
  year      = {2022},
  editor    = {Faust, Aleksandra and Hsu, David and Neumann, Gerhard},
  volume    = {164},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--11 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v164/li22b/li22b.pdf},
  url       = {https://proceedings.mlr.press/v164/li22b.html},
}

@inproceedings{shen2021igibson,
  title        = {iGibson 1.0: a Simulation Environment for Interactive Tasks in Large Realistic Scenes},
  author       = {Bokui Shen and Fei Xia and Chengshu Li and Roberto Mart\'in-Mart\'in and Linxi Fan and Guanzhi Wang and Claudia P\'erez-D'Arpino and Shyamal Buch and Sanjana Srivastava and Lyne P. Tchapmi and Micael E. Tchapmi and Kent Vainio and Josiah Wong and Li Fei-Fei and Silvio Savarese},
  booktitle    = {2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year         = {2021},
  pages        = {accepted},
  organization = {IEEE}
}

Documentation

The documentation for iGibson can be found here: iGibson Documentation. It includes an installation guide (with data download instructions), a quickstart guide, code examples, and the APIs.

If you want to know more about iGibson, you can also check out our webpage, the iGibson 2.0 arXiv preprint, and the iGibson 1.0 arXiv preprint.

Downloading the Dataset of 3D Scenes

For instructions to install iGibson and download the datasets, visit the installation guide and the dataset download guide.

There are other datasets we link to iGibson. We include support for CubiCasa5K and 3D-Front scenes, adding more than 10000 extra interactive scenes to use in iGibson! Check our documentation on how to use them.

We also maintain compatibility with datasets of 3D reconstructed large real-world scenes (homes and offices) that you can download and use with iGibson. For Gibson Dataset and Stanford 2D-3D-Semantics Dataset, please fill out this form. For Matterport3D Dataset, please fill in this form and send it to [email protected]. Please put "use with iGibson simulator" in your email. Check our dataset download guide for more details.

Using iGibson with VR

If you want to use the iGibson VR interface, please visit the [VR guide (TBA)].

Contributing

This is the GitHub repository for the iGibson 2.0 release (pip package igibson). For iGibson 1.0, please use the 1.0 branch. Bug reports, suggestions for improvement, and community developments are encouraged and appreciated. Please consider creating an issue or sending us an email.

Support for our previous version of the environment, Gibson, can be found in the following repository.

Acknowledgments

iGibson uses code from a few open-source repositories. Without the efforts of these folks (and their willingness to release their implementations under permissible copyleft licenses), iGibson would not be possible. We thank these authors for their efforts!

More Repositories

1. GibsonEnv - Gibson Environments: Real-World Perception for Embodied Agents (C, 864 stars)
2. taskonomy - Taskonomy: Disentangling Task Transfer Learning [Best Paper, CVPR 2018] (Python, 845 stars)
3. cs131_notes - Class notes for CS 131 (TeX, 736 stars)
4. CS131_release - Released assignments for Stanford's CS131 course on Computer Vision (Jupyter Notebook, 454 stars)
5. OmniGibson - A platform for accelerating Embodied AI research built upon NVIDIA's Omniverse engine. Join our Discord for support: https://discord.gg/bccR5vGFEx (Python, 425 stars)
6. ReferringRelationships (Python, 260 stars)
7. 3DSceneGraph - The data skeleton from "3D Scene Graph: A Structure for Unified Semantics, 3D Space, and Camera" http://3dscenegraph.stanford.edu (Python, 237 stars)
8. JRMOT_ROS - Source code for JRMOT: A Real-Time 3D Multi-Object Tracker and a New Large-Scale Dataset (Python, 145 stars)
9. RubiksNet - Official repo for the ECCV 2020 paper RubiksNet: Learnable 3D-Shift for Efficient Video Action Recognition (Python, 99 stars)
10. feedback-networks - The repo of Feedback Networks, CVPR 2017 (Lua, 89 stars)
11. ntp - Neural Task Programming (81 stars)
12. STR-PIP - Spatiotemporal Relationship Reasoning for Pedestrian Intent Prediction (Python, 74 stars)
13. bddl (Jupyter Notebook, 67 stars)
14. robovat - RoboVat: A unified toolkit for simulated and real-world robotic task environments (Python, 67 stars)
15. iGibsonChallenge2021 (Python, 55 stars)
16. behavior - Code to evaluate a solution in the BEHAVIOR benchmark: starter code, baselines, submodules to the iGibson and BDDL repos (Python, 52 stars)
17. atp-video-language - Official repo for the CVPR 2022 (Oral) paper Revisiting the "Video" in Video-Language Understanding; contains code for the Atemporal Probe (ATP) (Python, 47 stars)
18. GibsonSim2RealChallenge - GibsonSim2RealChallenge @ CVPR 2020 (Python, 35 stars)
19. moma - A dataset for multi-object multi-actor activity parsing (Jupyter Notebook, 34 stars)
20. NTP-vat-release - The PyBullet wrapper (Vat) for Neural Task Programming (Python, 34 stars)
21. mini_behavior - MiniGrid implementation of BEHAVIOR tasks (Python, 28 stars)
22. BehaviorChallenge2021 (Python, 25 stars)
23. HMS - Code base of "Multi-Layer Semantic and Geometric Modeling with Neural Message Passing in 3D Scene Graphs for Hierarchical Mechanical Search" (Python, 25 stars)
24. ac-teach - Code for the CoRL 2019 paper AC-Teach: A Bayesian Actor-Critic Method for Policy Learning with an Ensemble of Suboptimal Teachers (Python, 24 stars)
25. STGraph - Codebase for the CVPR 2020 paper "Spatio-Temporal Graph for Video Captioning with Knowledge Distillation" (22 stars)
26. cavin (Python, 20 stars)
27. alignment - ELIGN: Expectation Alignment as a Multi-agent Intrinsic Reward (Python, 19 stars)
28. Sonicverse (HTML, 17 stars)
29. Gym - Custom version of OpenAI Gym (Python, 14 stars)
30. causal_induction - Codebase for "Causal Induction from Visual Observations for Goal-Directed Tasks" (Python, 12 stars)
31. keto (Python, 12 stars)
32. Lasersuite - Forked robosuite for the LASER project (Python, 11 stars)
33. perls2 - PErception and Robotic Learning System v2 (Python, 11 stars)
34. STIP (Python, 10 stars)
35. behavioral_navigation_nlp - Code for translating natural-language navigation instructions into a high-level plan for behavioral robot navigation (Python, 9 stars)
36. bullet3 (C++, 8 stars)
37. arxivbot (Python, 8 stars)
38. egl_probe - A helpful module for listing available GPUs for EGL rendering (C, 6 stars)
39. ssai - Socially Situated AI (4 stars)
40. ig_navigation (Python, 4 stars)
41. omnigibson-eccv-tutorial (Jupyter Notebook, 4 stars)
42. RL-Pseudocode (AppleScript, 4 stars)
43. ARPL - Adversarially Robust Policy Learning (Python, 4 stars)
44. sail-blog-new-post - The repository for making new post submissions to the SAIL Blog (HTML, 3 stars)
45. behavior-website-old (HTML, 2 stars)
46. behavior-baselines (Python, 2 stars)
47. behavior-website (SCSS, 1 star)
48. iris - IRIS: Implicit Reinforcement without Interaction at Scale for Control from Large-Scale Robot Manipulation Datasets (1 star)
49. bullet3_ik - PyBullet frozen at version 1.9.5, purely for using its IK implementation (C++, 1 star)