Movement Primitives

Dynamical movement primitives (DMPs), probabilistic movement primitives (ProMPs), and spatially coupled bimanual DMPs for imitation learning.

Movement primitives are a common group of policy representations in robotics. There are many types and variations. This repository focuses mainly on imitation learning, generalization, and adaptation of movement primitives for Cartesian motions of robots. It provides implementations in Python and Cython and can be installed directly from PyPI.
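
For a first impression of the API, the following minimal sketch imitates a demonstrated 1D trajectory with a DMP, reproduces it, and adapts it to a new goal. It follows the library's DMP interface; exact argument names may differ slightly between versions.

import numpy as np
from movement_primitives.dmp import DMP

# Demonstration: a smooth 1D trajectory over one second.
T = np.linspace(0.0, 1.0, 101)
Y = np.cos(2.0 * np.pi * T)[:, np.newaxis]

# Learn the forcing term from the demonstration and reproduce the movement.
dmp = DMP(n_dims=1, execution_time=1.0, dt=0.01, n_weights_per_dim=10)
dmp.imitate(T, Y)
T_replay, Y_replay = dmp.open_loop()

# Generalize: the transformation system adapts the movement to a new goal.
dmp.configure(start_y=Y[0], goal_y=np.array([0.5]))
T_new, Y_new = dmp.open_loop()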

Features

  • Dynamical Movement Primitives (DMPs) for
    • positions (with fast Runge-Kutta integration)
    • Cartesian position and orientation (with fast Cython implementation)
    • Dual Cartesian position and orientation (with fast Cython implementation)
  • Coupling terms for synchronization of position and/or orientation of dual Cartesian DMPs
  • Propagation of DMP weight distribution to state space distribution
  • Probabilistic Movement Primitives (ProMPs)

Left: Example of dual Cartesian DMP with RH5 Manus. Right: Example of joint space DMP with UR5.

API Documentation

The API documentation is available here.

Install Library

This library requires Python 3.6 or later, and pip is recommended for the installation. In the following instructions, we assume that the command python refers to Python 3. If you use the system's Python installation, you might have to add the flag --user to any installation command.

I recommend installing the library via pip:

python -m pip install movement_primitives[all]

or clone the git repository and install it in editable mode:

python -m pip install -e .[all]

If you don't want to install all dependencies, just omit [all]. Alternatively, you can install the dependencies with

python -m pip install -r requirements.txt

You could also just build the Cython extension with

python setup.py build_ext --inplace

or install the library with

python setup.py install

Examples

You will find a lot of examples in the subfolder examples/. Here are just some highlights to showcase the library.

Potential Field of 2D DMP

A Dynamical Movement Primitive defines a potential field that superimposes several components: transformation system (goal-directed movement), forcing term (learned shape), and coupling terms (e.g., obstacle avoidance).

Script
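
As an illustration of a coupling term, the following sketch adds obstacle avoidance to a 2D DMP. It assumes the CouplingTermObstacleAvoidance2D class from the library; its constructor takes further tuning parameters that are omitted here.

import numpy as np
from movement_primitives.dmp import DMP, CouplingTermObstacleAvoidance2D

# Demonstrate a straight 2D movement from (0, 0) to (1, 1).
T = np.linspace(0.0, 1.0, 101)
Y = np.column_stack((T, T))

dmp = DMP(n_dims=2, execution_time=1.0, dt=0.01, n_weights_per_dim=10)
dmp.imitate(T, Y)

# The coupling term pushes the trajectory away from an obstacle at (0.5, 0.5).
coupling_term = CouplingTermObstacleAvoidance2D(np.array([0.5, 0.5]))
T_avoid, Y_avoid = dmp.open_loop(coupling_term=coupling_term)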

DMP with Final Velocity

Not all DMPs allow a final velocity > 0. In this example, we analyze the effect of changing final velocities in an appropriate variation of the DMP formulation that allows setting the final velocity.

Script
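
A minimal sketch of this variant, assuming the DMPWithFinalVelocity class and its configure method:

import numpy as np
from movement_primitives.dmp import DMPWithFinalVelocity

# Demonstration that ends with zero velocity.
T = np.linspace(0.0, 1.0, 101)
Y = np.cos(2.0 * np.pi * T)[:, np.newaxis]

dmp = DMPWithFinalVelocity(n_dims=1, execution_time=1.0, dt=0.01, n_weights_per_dim=10)
dmp.imitate(T, Y)

# Request a nonzero velocity at the end of the reproduced movement.
dmp.configure(goal_yd=np.array([1.0]))
T_replay, Y_replay = dmp.open_loop()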

ProMPs

The LASA Handwriting dataset learned with ProMPs. The dataset consists of 2D handwriting motions. The first and third columns of the plot show demonstrations, and the second and fourth columns show the imitated ProMPs with a 1-sigma interval.

Script
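
The following sketch trains a ProMP on multiple demonstrations, using synthetic data instead of the LASA dataset. It follows the library's ProMP interface; exact method signatures may differ between versions.

import numpy as np
from movement_primitives.promp import ProMP

# Several noisy demonstrations of the same 1D movement.
n_demos, n_steps = 10, 101
T = np.linspace(0.0, 1.0, n_steps)
Ts = np.tile(T, (n_demos, 1))
rng = np.random.RandomState(0)
Ys = np.cos(2.0 * np.pi * Ts)[..., np.newaxis] + 0.05 * rng.randn(n_demos, n_steps, 1)

promp = ProMP(n_dims=1, n_weights_per_dim=10)
promp.imitate(Ts, Ys)

mean = promp.mean_trajectory(T)                 # mean of the learned distribution
samples = promp.sample_trajectories(T, 5, rng)  # 5 sampled trajectories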

Conditional ProMPs

Probabilistic Movement Primitives (ProMPs) define distributions over trajectories that can be conditioned on viapoints. In this example, we plot the resulting posterior distribution after conditioning on varying start positions.

Script
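
The following sketch conditions a trained ProMP on a start position. It assumes the condition_position method of the ProMP class; its signature is simplified here.

import numpy as np
from movement_primitives.promp import ProMP

# Train a ProMP on noisy demonstrations as in the previous sketch.
T = np.linspace(0.0, 1.0, 101)
rng = np.random.RandomState(0)
Ts = np.tile(T, (10, 1))
Ys = np.cos(2.0 * np.pi * Ts)[..., np.newaxis] + 0.05 * rng.randn(10, 101, 1)

promp = ProMP(n_dims=1, n_weights_per_dim=10)
promp.imitate(Ts, Ys)

# Condition the trajectory distribution on a start position of 0.5 at t = 0.
conditioned = promp.condition_position(np.array([0.5]), t=0.0)
mean = conditioned.mean_trajectory(T)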

Cartesian DMPs

A trajectory is created manually, imitated with a Cartesian DMP, converted to a joint trajectory by inverse kinematics, and executed with a UR5.

Script
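
A minimal sketch of a Cartesian DMP on a synthetic pose trajectory. The CartesianDMP class represents each state as a position plus a unit quaternion; the exact state layout assumed below may differ between versions.

import numpy as np
from movement_primitives.dmp import CartesianDMP

# Pose trajectory: position (x, y, z) followed by a unit quaternion (w, x, y, z).
T = np.linspace(0.0, 1.0, 101)
Y = np.zeros((len(T), 7))
Y[:, 0] = T    # move along the x-axis
Y[:, 3] = 1.0  # identity orientation

dmp = CartesianDMP(execution_time=1.0, dt=0.01, n_weights_per_dim=10)
dmp.imitate(T, Y)
T_replay, Y_replay = dmp.open_loop()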

Contextual ProMPs

We use a dataset of Mronga and Kirchner (2021), in which a dual-arm robot rotates panels of varying widths. 10 demonstrations were recorded for 3 different panel widths through kinesthetic teaching. The panel width is the context over which we generalize with contextual ProMPs. We learn a joint distribution of contexts and ProMP weights, and then condition the distribution on the contexts to obtain a ProMP adapted to the context. Each color in the above visualizations corresponds to a ProMP for a different context.

Script

Dependencies that are not publicly available:

Dual Cartesian DMP

This library implements specific dual Cartesian DMPs to control dual-arm robotic systems like humanoid robots.

Scripts: Open3D, PyBullet

Dependencies that are not publicly available:

Coupled Dual Cartesian DMP

We can introduce a coupling term in a dual Cartesian DMP to constrain the relative position, orientation, or pose of two end-effectors of a dual-arm robot.

Scripts: Open3D, PyBullet
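
The following sketch outlines how a coupling term could be attached to a dual Cartesian DMP. The class name CouplingTermDualCartesianDistance and its constructor parameters are assumptions based on the coupling terms this library provides; check the API documentation for the exact interface.

import numpy as np
from movement_primitives.dmp import DualCartesianDMP, CouplingTermDualCartesianDistance

# Two stacked pose trajectories: (x, y, z, qw, qx, qy, qz) per arm -> 14 columns.
T = np.linspace(0.0, 1.0, 101)
Y = np.zeros((len(T), 14))
Y[:, 0] = T      # left arm moves along x
Y[:, 3] = 1.0    # identity orientation (left)
Y[:, 7] = T      # right arm moves along x
Y[:, 8] = 0.3    # constant offset in y
Y[:, 10] = 1.0   # identity orientation (right)

dmp = DualCartesianDMP(execution_time=1.0, dt=0.01, n_weights_per_dim=10)
dmp.imitate(T, Y)

# Constrain the distance between the two end effectors during execution
# (desired_distance and lf are assumed parameter names).
ct = CouplingTermDualCartesianDistance(desired_distance=0.2, lf=(1.0, 0.0))
T_replay, Y_replay = dmp.open_loop(coupling_term=ct)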

Dependencies that are not publicly available:

Propagation of DMP Distribution to State Space

If we have a distribution over DMP parameters, we can propagate it to state space through an unscented transform. On the left, we see the original demonstration of a dual-arm movement in state space (two 3D positions and two quaternions) and the distribution of several DMP weight vectors projected to state space. On the right, we see several dual-arm trajectories sampled from the distribution in state space.

Script

Dependencies that are not publicly available:

Build API Documentation

You can build the API documentation with sphinx. You can install all dependencies with

python -m pip install movement_primitives[doc]

... and build the documentation from the folder doc/ with

make html

It will be located at doc/build/html/index.html.

Test

To run the tests, some additional Python libraries are required:

python -m pip install -e .[test]

The tests are located in the folder test/ and can be executed with python -m nose test

This command searches for all files whose names contain test and executes all functions whose names start with test_.

Contributing

You can report bugs in the issue tracker. If you have questions about the software, please use the discussions section. To add new features or documentation, or to fix bugs, you can open a pull request on GitHub. Directly pushing to the main branch is not allowed.

The recommended workflow to add a new feature, add documentation, or fix a bug is the following:

  • Push your changes to a branch (e.g., feature/x, doc/y, or fix/z) of your fork of the repository.
  • Open a pull request to the main branch of the main repository.

This is a checklist for new features:

  • Are there unit tests?
  • Does it have docstrings?
  • Is it included in the API documentation?
  • Does it pass flake8 and pylint?
  • Should it be mentioned in the readme?
  • Should it be included in an example script?

Non-public Extensions

Scripts from the subfolder examples/external_dependencies/ require access to git repositories (URDF files or optional dependencies) and datasets that are not publicly available. They are available on request (email [email protected]).

Note that the library itself does not have any non-public dependencies! They are only required to run all of the examples.

MoCap Library

# untested: pip install git+https://git.hb.dfki.de/dfki-interaction/mocap.git
git clone [email protected]:dfki-interaction/mocap.git
cd mocap
python -m pip install -e .
cd ..

Get URDFs

# RH5
git clone [email protected]:models-robots/rh5_models/pybullet-only-arms-urdf.git --recursive
# RH5v2
git clone [email protected]:models-robots/rh5v2_models/pybullet-urdf.git --recursive
# Kuka
git clone [email protected]:models-robots/kuka_lbr.git
# Solar panel
git clone [email protected]:models-objects/solar_panels.git
# RH5 Gripper
git clone [email protected]:motto/abstract-urdf-gripper.git --recursive

Data

Most scripts assume that your data is located in the folder data/. You should put a symlink there that points to your actual data folder.

Related Publications

This library implements several types of dynamical movement primitives and probabilistic movement primitives. These are described in detail in the following papers.

[1] Ijspeert, A. J., Nakanishi, J., Hoffmann, H., Pastor, P., Schaal, S. (2013). Dynamical Movement Primitives: Learning Attractor Models for Motor Behaviors, Neural Computation 25 (2), 328-373. DOI: 10.1162/NECO_a_00393, https://homes.cs.washington.edu/~todorov/courses/amath579/reading/DynamicPrimitives.pdf

[2] Pastor, P., Hoffmann, H., Asfour, T., Schaal, S. (2009). Learning and Generalization of Motor Skills by Learning from Demonstration. In 2009 IEEE International Conference on Robotics and Automation, (pp. 763-768). DOI: 10.1109/ROBOT.2009.5152385, https://h2t.iar.kit.edu/pdf/Pastor2009.pdf

[3] Muelling, K., Kober, J., Kroemer, O., Peters, J. (2013). Learning to Select and Generalize Striking Movements in Robot Table Tennis. International Journal of Robotics Research 32 (3), 263-279. https://www.ias.informatik.tu-darmstadt.de/uploads/Publications/Muelling_IJRR_2013.pdf

[4] Ude, A., Nemec, B., Petric, T., Morimoto, J. (2014). Orientation in Cartesian space dynamic movement primitives. In IEEE International Conference on Robotics and Automation (ICRA) (pp. 2997-3004). DOI: 10.1109/ICRA.2014.6907291, https://acat-project.eu/modules/BibtexModule/uploads/PDF/udenemecpetric2014.pdf

[5] Gams, A., Nemec, B., Zlajpah, L., Wächter, M., Asfour, T., Ude, A. (2013). Modulation of Motor Primitives using Force Feedback: Interaction with the Environment and Bimanual Tasks. In 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 5629-5635). DOI: 10.1109/IROS.2013.6697172, https://h2t.anthropomatik.kit.edu/pdf/Gams2013.pdf

[6] Vidakovic, J., Jerbic, B., Sekoranja, B., Svaco, M., Suligoj, F. (2019). Task Dependent Trajectory Learning from Multiple Demonstrations Using Movement Primitives. In International Conference on Robotics in Alpe-Adria Danube Region (RAAD) (pp. 275-282). DOI: 10.1007/978-3-030-19648-6_32, https://link.springer.com/chapter/10.1007/978-3-030-19648-6_32

[7] Paraschos, A., Daniel, C., Peters, J., Neumann, G. (2013). Probabilistic movement primitives, In C.J. Burges and L. Bottou and M. Welling and Z. Ghahramani and K.Q. Weinberger (Eds.), Advances in Neural Information Processing Systems, 26, https://papers.nips.cc/paper/2013/file/e53a0a2978c28872a4505bdb51db06dc-Paper.pdf

[8] Maeda, G. J., Neumann, G., Ewerton, M., Lioutikov, R., Kroemer, O., Peters, J. (2017). Probabilistic movement primitives for coordination of multiple human–robot collaborative tasks. Autonomous Robots, 41, 593-612. DOI: 10.1007/s10514-016-9556-2, https://link.springer.com/article/10.1007/s10514-016-9556-2

[9] Paraschos, A., Daniel, C., Peters, J., Neumann, G. (2018). Using probabilistic movement primitives in robotics. Autonomous Robots, 42, 529-551. DOI: 10.1007/s10514-017-9648-7, https://www.ias.informatik.tu-darmstadt.de/uploads/Team/AlexandrosParaschos/promps_auro.pdf

[10] Lazaric, A., Ghavamzadeh, M. (2010). Bayesian Multi-Task Reinforcement Learning. In Proceedings of the 27th International Conference on International Conference on Machine Learning (ICML'10) (pp. 599-606). https://hal.inria.fr/inria-00475214/document

Funding

This library was initially developed at the Robotics Innovation Center of the German Research Center for Artificial Intelligence (DFKI GmbH) in Bremen. During this phase, the work was supported by a grant of the German Federal Ministry of Economic Affairs and Energy (BMWi, FKZ 50 RA 1701).
