An open-source autonomous car project
Introduction • Documentation • Screenshots • Contribute
A must-see YouTube video about this project: here ❗❗
a.k.a. "Roomba that does not suck"
Written in Python 🐍 (mostly).
Contents:
Introduction
This repository contains the main software and documentation for the Tonic project. The project aims to create an open-source autonomous driving system, along with its hardware prototype implementation. Some essential parts of the project live in other related repos; see the list of related repos. The core idea of how this works is as follows:
- After setting up the robot/car, drive it manually, and dump the video and steering feed (this part is called data taking).
- Create a 3D mapping of the environment with Tonic/autonomous.
- Define checkpoints, through which the machine will drive.
- Program the car to drive on the defined paths.
All of this should be as cheap as possible: a Raspberry Pi and only a single camera.
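The checkpoint-driving idea in the last two steps can be sketched in a few lines. This is a hypothetical illustration, not Tonic's actual API: the function names and the reach threshold are assumptions, and a real controller would feed the heading error into the steering servo.

```python
import math

def steering_to_checkpoint(pose, checkpoint):
    """Given a robot pose (x, y, heading in radians) and a checkpoint (x, y),
    return the heading error to steer away (positive = turn left)."""
    dx = checkpoint[0] - pose[0]
    dy = checkpoint[1] - pose[1]
    target = math.atan2(dy, dx)
    err = target - pose[2]
    # Wrap the error to [-pi, pi] so the car takes the shorter turn.
    return math.atan2(math.sin(err), math.cos(err))

def next_checkpoints(pose, checkpoints, reached=0.3):
    """Drop leading checkpoints already reached within `reached` metres
    (hypothetical threshold) and return the remaining path."""
    while checkpoints and math.hypot(
        checkpoints[0][0] - pose[0], checkpoints[0][1] - pose[1]
    ) < reached:
        checkpoints = checkpoints[1:]
    return checkpoints
```

For example, a car at the origin facing along the x-axis that must reach checkpoint (1, 1) gets a heading error of π/4, i.e. "turn 45° left".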
Features
- Camera live feed and recording.
- Live steering system and recording.
- Working IMU live streaming and recording.
- Working odometry live streaming and recording.
- Qt GUI client for driving and data taking.
- SLAM mapping and navigation, implemented with ORB_SLAM2 and its custom fork, custom Python bindings, and serialisation.
How does it work
As of now, this repository (mmajewsk/Tonic) contains guides and software for building, running, and steering the car 🚘 for data taking. The code is divided into Tonic/control and Tonic/car.
Tonic/control contains the code meant to run on your laptop/PC/Mac, which controls the Raspberry Pi running Tonic/car.
The machine and the control interface communicate over a WiFi network using sockets.
Sensors, camera, and steering are each implemented as a separate socket-based service. You can steer the car with the keyboard on your PC while watching the live camera feed. All of the sensor, steering, and video data can be dumped to files on the PC. You don't need to turn on all of the sensors to make this work.
The odometry and IMU are not necessary to create an environment map.
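The per-sensor socket services above can be sketched with a length-prefixed framing scheme, a common way to split a TCP byte stream back into discrete messages. This is a minimal illustration, not Tonic's actual wire format: the 4-byte big-endian length prefix and the function names are assumptions.

```python
import socket
import struct

def serve_frames(frames, host="127.0.0.1", port=5005):
    """Serve a sequence of byte payloads (camera frames, steering packets, ...)
    to one client, each prefixed with its length so the stream can be split."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    conn, _ = srv.accept()
    for frame in frames:
        conn.sendall(struct.pack("!I", len(frame)) + frame)
    conn.close()
    srv.close()

def recv_exact(sock, n):
    """Read exactly n bytes, looping because recv() may return fewer."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed early")
        buf += chunk
    return buf

def read_frames(host="127.0.0.1", port=5005, count=1):
    """Client side: reconnect the length-prefixed stream into payloads."""
    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.connect((host, port))
    frames = []
    for _ in range(count):
        (size,) = struct.unpack("!I", recv_exact(cli, 4))
        frames.append(recv_exact(cli, size))
    cli.close()
    return frames
```

Running each sensor as its own small service like this is what lets you turn individual sensors on and off without touching the rest of the system.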
Ok so how do I start
- Take a look at previous versions and the current one in videos and screenshots.
- First start by assembling the hardware.
- Then set up the machine and interface software.
- Do the data-taking run, recording steering and video data, as described here.
To make your machine drive autonomously, follow the guide in the Tonic/autonomous repo.
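The data-taking run dumps the steering and video feeds to files so they can be replayed or fed to the mapping pipeline later. A minimal sketch of one possible record format is shown below; the JSON-lines layout and field names are hypothetical, not Tonic's actual dump format.

```python
import json
import time

def dump_record(fh, kind, payload):
    """Append one timestamped record (e.g. a steering value or a reference
    to a saved video frame) as a JSON line. Field names are hypothetical."""
    fh.write(json.dumps({"t": time.time(), "kind": kind, "data": payload}) + "\n")

def load_records(fh):
    """Read a dump back, e.g. for replay or for building the 3D map."""
    return [json.loads(line) for line in fh if line.strip()]
```

Timestamping every record is the important part: it is what lets the steering feed be aligned with the video frames after the run.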
Contribute
🧑‍🔧 This project is meant to be open for everyone, and contributions are welcome. If you would like to help, see what's listed in the issues here, or add something yourself.
You can also join the 🗣️ Discord server if you are looking for quick help, or just want to say hi ;)
Related repos
- My fork of ORB_SLAM2
- My fork of Osmap - Dumps ORB_SLAM2 to file
- My fork of PythonBindings - this one combines Osmap with the ORB_SLAM2 Python bindings!
- TonicSlamDunk - install scripts for all of the above, including scripts for Ubuntu and a Dockerfile.
- TonicOrange - exemplary use of ORB_SLAM2 for pathfinding (moved to Tonic/autonomous).