
Repository Details

Object (e.g. pedestrian, vehicle) tracking by an Extended Kalman Filter (EKF), with fused data from both lidar and radar sensors.

Object Tracking with Extended Kalman Filter

Objective

Utilize sensor data from both LIDAR and RADAR measurements for object (e.g. pedestrians, vehicles, or other moving objects) tracking with the Extended Kalman Filter.

Demo: Object tracking with both LIDAR and RADAR measurements

gif_demo1

In this demo, the blue car is the object to be tracked, but the tracked object can be any type, e.g. a pedestrian, vehicle, or other moving object. We continuously receive both LIDAR (red circle) and RADAR (blue circle) measurements of the car's location in the defined coordinate system, but there may be noise and errors in the data. We therefore need a way to fuse the two types of sensor measurements to estimate the location of the tracked object.

Therefore, we use an Extended Kalman Filter to compute the estimated location (green triangle) of the blue car. The estimated trajectory (green triangles) is compared with the ground-truth trajectory of the blue car, and the error is displayed as RMSE in real time.
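
For reference, the RMSE simply compares each estimated state with the corresponding ground-truth state over all time steps. Below is a minimal sketch of such a metric, assuming the states are stored as Eigen vectors; the function and variable names are illustrative, not necessarily the repository's.

```cpp
#include <vector>
#include <Eigen/Dense>

// Root-mean-squared error between the estimated states and the ground truth.
// Each VectorXd holds one state, e.g. [px, py, vx, vy], for one time step.
Eigen::VectorXd CalculateRMSE(const std::vector<Eigen::VectorXd>& estimations,
                              const std::vector<Eigen::VectorXd>& ground_truth) {
  Eigen::VectorXd rmse = Eigen::VectorXd::Zero(4);
  if (estimations.empty() || estimations.size() != ground_truth.size()) {
    return rmse;  // invalid input: return zeros
  }
  for (std::size_t i = 0; i < estimations.size(); ++i) {
    Eigen::VectorXd residual = estimations[i] - ground_truth[i];
    residual = residual.array() * residual.array();  // element-wise square
    rmse += residual;
  }
  rmse = rmse / static_cast<double>(estimations.size());  // mean
  rmse = rmse.array().sqrt();                             // root
  return rmse;
}
```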

In the autonomous driving case, a self-driving car obtains both lidar and radar measurements of the objects to be tracked, and then applies the Extended Kalman Filter to track those objects based on the two types of sensor data.


Code & Files

1. Dependencies & environment

2. My project files

(Note: the hyperlinks only work if you are on the homepage of this GitHub repo; if you are viewing it on "github.io", you can be redirected by clicking View the Project on GitHub at the top.)

  • CMakeLists.txt is the CMake build file.

  • data folder contains test lidar and radar measurements.

  • Docs folder contains documents that describe the data.

  • src folder contains the source code.

3. Code Style

4. How to run the code

  1. Clone this repo.
  2. Make a build directory: mkdir build && cd build
  3. Compile: cmake .. && make
    • On Windows, you may need to run: cmake .. -G "Unix Makefiles" && make
  4. Run it by either of the following commands:
    • ./ExtendedKF ../data/obj_pose-laser-radar-synthetic-input.txt ./output.txt
    • ./ExtendedKF ../data/sample-laser-radar-measurement-data-1.txt ./output.txt

5. Release History

  • 0.2.1

    • Docs: Add a sample video for vehicle tracking
    • Date 3 May 2017
  • 0.2.0

    • Fix: Normalize the angle for EKF updates with Radar
    • Fix: Initialize several variables
    • Date 2 May 2017
  • 0.1.1

    • First proper release
    • Date 1 May 2017
  • 0.1.0

    • Initiate the repo and add the functionality of pedestrian tracking with lidar data.
    • Date 28 April 2017

System details

1. Demos

Demo 1: Tracking with both LIDAR and RADAR measurements

In this demo, both LIDAR and RADAR measurements are used for object tracking.

gif_demo1

Demo 2: Tracking with only LIDAR measurements

In this demo, only LIDAR measurements are used for the object tracking.

gif_demo2

Demo 3: Tracking with only RADAR measurements

In this demo, only RADAR measurements are used for object tracking; they are noisier than the LIDAR measurements.

gif_demo3

From these three demos, we can see that:

  • RADAR measurements tend to be noisier than LIDAR measurements.
  • Extended Kalman Filter tracking that fuses measurements from both LIDAR and RADAR can reduce the noise/errors in the sensor measurements and provide robust estimates of the tracked object's location.

Note: the advantage of RADAR is that it can estimate the object's speed directly via the Doppler effect.

2. What does a LIDAR measurement look like

The LIDAR produces a 3D measurement px, py, pz. But for the case of driving on the road, we can simplify the pose of the tracked object to px, py, and one rotation. In other words, we can use only px and py to indicate the position of the object, and one rotation to indicate its orientation. In the real world, where roads can be very steep, you have to consider the z axis as well. In applications like airplanes and drones, you definitely want to consider pz as well.
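
Since LIDAR measures position only, its measurement model is linear and can be written as z = H x with a constant H matrix. A minimal sketch, assuming the common 4D state [px, py, vx, vy] (the function name here is illustrative):

```cpp
#include <Eigen/Dense>

// Lidar measures position only, so the measurement model is linear:
// z = H * x, with x = [px, py, vx, vy] and z = [px, py].
Eigen::MatrixXd LidarMeasurementMatrix() {
  Eigen::MatrixXd H(2, 4);
  H << 1, 0, 0, 0,
       0, 1, 0, 0;
  return H;
}
```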

3. What does a RADAR measurement look like
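
Unlike LIDAR, RADAR reports the tracked object in polar coordinates: the range rho (radial distance), the bearing angle phi, and the range rate rho_dot (the radial velocity measured via the Doppler effect, as noted above). Mapping the Cartesian state into this measurement space is non-linear, which is why the radar update needs the Extended Kalman Filter. Below is a minimal sketch of that mapping, assuming the 4D state [px, py, vx, vy]; the function name is illustrative, not necessarily the repository's.

```cpp
#include <cmath>
#include <Eigen/Dense>

// Map the Cartesian state [px, py, vx, vy] into radar measurement space
// [rho, phi, rho_dot]. This is the non-linear h(x) used in the radar update.
Eigen::VectorXd RadarMeasurementFunction(const Eigen::VectorXd& x) {
  const double px = x(0), py = x(1), vx = x(2), vy = x(3);
  const double rho = std::sqrt(px * px + py * py);  // range
  const double phi = std::atan2(py, px);            // bearing
  // Guard against division by zero when the object is at the origin.
  const double rho_dot = (rho > 1e-6) ? (px * vx + py * vy) / rho : 0.0;

  Eigen::VectorXd z(3);
  z << rho, phi, rho_dot;
  return z;
}
```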

4. Comparison of LIDAR, RADAR and Camera

Sensor type                        LIDAR     RADAR    Camera
Resolution                         medium    low      high
Direct velocity measurement        no        yes      no
All-weather performance            bad       good     bad
Sensor size                        large     small    small
Senses non-line-of-sight objects   no        yes      no

Note:

  • LIDAR operates at infrared wavelengths; RADAR operates at millimeter wavelengths.
  • LIDAR is the most affected by dirt and small debris.

A comparison figure from another perspective.

5. How does the Extended Kalman Filter Work

6. Extended Kalman Filter vs. Kalman Filter

  • x is the mean state vector.
  • F is the state transition function.
  • P is the state covariance matrix, indicating the uncertainty of the object's state.
  • u is the process noise, which is a Gaussian with zero mean and covariance as Q.
  • Q is the covariance matrix of the process noise.
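
For reference, these symbols appear in the standard Kalman filter prediction step:

```
x' = F x + u
P' = F P F^T + Q
```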

For EKF

  • To calculate the predicted state vector x′, the prediction function f(x) is used instead of the F matrix.
  • The F matrix will be replaced by Fj (the Jacobian matrix of f) when calculating P′.

  • y is the innovation term, i.e. the difference between the measurement and the prediction. To compute it, we transform the state into measurement space using the measurement function, so that the measurement and the prediction can be compared directly.
  • S is the predicted measurement covariance matrix, also called the innovation covariance matrix.
  • H is the measurement function.
  • z is the measurement.
  • R is the covariance matrix of the measurement noise.
  • I is the identity matrix.
  • K is the Kalman filter gain.
  • Hj and Fj are the Jacobian matrices.
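
Putting these symbols together, the standard (linear) Kalman filter update step is:

```
y = z - H x'
S = H P' H^T + R
K = P' H^T S^{-1}
x = x' + K y
P = (I - K H) P'
```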

For EKF

  • To calculate the innovation y, the measurement function h(x') is used instead of the H matrix.
  • The H matrix in the Kalman filter will be replaced by Hj (the Jacobian matrix of h(x')) when calculating S, K, and P.
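
Below is a minimal sketch of how the Jacobian Hj of the radar measurement function h(x) sketched earlier can be computed for the 4D state [px, py, vx, vy]; it is illustrative, and the repository's actual implementation may differ.

```cpp
#include <cmath>
#include <Eigen/Dense>

// Jacobian of h(x) = [rho, phi, rho_dot] with respect to x = [px, py, vx, vy].
Eigen::MatrixXd CalculateJacobian(const Eigen::VectorXd& x) {
  const double px = x(0), py = x(1), vx = x(2), vy = x(3);
  const double c1 = px * px + py * py;
  Eigen::MatrixXd Hj = Eigen::MatrixXd::Zero(3, 4);
  if (c1 < 1e-8) {
    return Hj;  // avoid division by zero near the origin
  }
  const double c2 = std::sqrt(c1);
  const double c3 = c1 * c2;

  Hj <<  px / c2,  py / c2,  0,  0,
        -py / c1,  px / c1,  0,  0,
         py * (vx * py - vy * px) / c3,
         px * (vy * px - vx * py) / c3,
         px / c2,  py / c2;
  return Hj;
}
```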

All Kalman filters have the same three steps:

  1. Initialization
  2. Prediction
  3. Update

A standard Kalman filter can only handle linear equations. Both the Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF, which will be discussed in the next project) allow you to use non-linear equations; the difference between the EKF and the UKF is how they handle non-linear equations. The Extended Kalman Filter uses the Jacobian matrix to linearize non-linear functions, while the Unscented Kalman Filter does not need to linearize them; instead, it takes representative points from a Gaussian distribution.
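
Concretely, the EKF's linearization is a first-order Taylor expansion of the non-linear measurement (or process) function around the current mean estimate mu:

```
h(x) ≈ h(mu) + Hj (x - mu)
```

where Hj is the Jacobian of h evaluated at mu; the UKF instead propagates a set of sigma points through h directly.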
