  • Stars: 1,713
  • Rank: 27,138 (Top 0.6%)
  • Language: Python
  • License: Other
  • Created: over 6 years ago
  • Updated: about 2 years ago

Repository Details

PyTorch pre-trained model for real-time interest point detection, description, and sparse tracking (https://arxiv.org/abs/1712.07629)

Research @ Magic Leap

SuperPoint Weights File and Demo Script

Introduction

This repo contains the pretrained SuperPoint network, as implemented by the originating authors. SuperPoint is a research project at Magic Leap. The SuperPoint network is a fully convolutional deep neural network trained to detect interest points and compute their accompanying descriptors. The detected points and descriptors can thus be used for various image-to-image matching tasks. For more details, please see the paper: https://arxiv.org/abs/1712.07629.

This demo showcases a simple sparse optical flow point tracker that uses SuperPoint to detect points and match them across video sequences. The repo contains two core files: (1) a PyTorch weights file and (2) a Python deployment script that defines the network, loads images, and runs the PyTorch weights on them, creating a sparse optical flow visualization. Here are videos of the demo running on various publicly available datasets:

  • Freiburg RGBD
  • KITTI
  • Microsoft 7 Scenes
  • MonoVO

Dependencies

This repo depends on a few standard Python modules, plus OpenCV and PyTorch. These commands usually work for installing the two libraries (tested on Mac and Ubuntu):

pip install opencv-python
pip install torch
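
A quick way to confirm that both libraries are importable, and whether a GPU is visible for the --cuda flag used below:

import cv2
import torch

print("OpenCV:", cv2.__version__)
print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())  # must be True to use --cuda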

Running the Demo

This demo runs the SuperPoint network on an image sequence and computes points and descriptors from the images using a helper class called SuperPointFrontend. The tracks are formed by the PointTracker class, which finds sequential pairwise nearest neighbors using two-way matching of the points' descriptors (a sketch of this matching step follows the list). The demo script uses a helper class called VideoStreamer, which can process inputs from three different input streams:

  1. A directory of images, such as .png or .jpg
  2. A video file, such as .mp4 or .avi
  3. A USB webcam
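
The two-way matching in PointTracker keeps a pair of points only when each is the other's nearest neighbor in descriptor space and their distance is below --nn_thresh. Because SuperPoint descriptors are unit length, L2 distances reduce to dot products. Here is a minimal NumPy sketch of the idea; the function name and return format are illustrative, not the repo's exact API:

import numpy as np

def two_way_match(desc1, desc2, nn_thresh=0.7):
    # desc1: 256xN1 and desc2: 256xN2 arrays of unit-length descriptors.
    if desc1.shape[1] == 0 or desc2.shape[1] == 0:
        return np.zeros((2, 0), dtype=int)
    # For unit vectors, ||a - b|| = sqrt(2 - 2 * a.b).
    dmat = np.sqrt(np.clip(2.0 - 2.0 * (desc1.T @ desc2), 0.0, 4.0))
    nn12 = np.argmin(dmat, axis=1)         # best match in image 2 for each point in image 1
    nn21 = np.argmin(dmat, axis=0)         # best match in image 1 for each point in image 2
    ids1 = np.arange(desc1.shape[1])
    mutual = nn21[nn12] == ids1            # keep only mutual (two-way) nearest neighbors
    close = dmat[ids1, nn12] < nn_thresh   # enforce the descriptor distance threshold
    keep = mutual & close
    return np.stack([ids1[keep], nn12[keep]])   # 2xM array of (index1, index2) pairs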

Run the demo on the provided directory of images in CPU mode:

./demo_superpoint.py assets/icl_snippet/

You should see the following output from the ICL-NUIM sequence snippet:

Run the demo on the provided .mp4 file in GPU mode:

./demo_superpoint.py assets/nyu_snippet.mp4 --cuda

You should see the following output from the NYU sequence snippet:

Run a live demo via webcam (id #1) in CPU mode:

./demo_superpoint.py camera --camid=1

Run the demo on a remote GPU (no display) on 640x480 images and write the output to myoutput/:

./demo_superpoint.py assets/icl_snippet/ --W=640 --H=480 --no_display --write --write_dir=myoutput/

Additional useful command-line parameters

  • Use --H to change the input image height (default: 120).
  • Use --W to change the input image width (default: 160).
  • Use --display_scale to scale the output visualization image height and width (default: 2).
  • Use the --cuda flag to enable the GPU.
  • Use --img_glob to change the image file extension (default: *.png).
  • Use --min_length to change the minimum track length (default: 2).
  • Use --max_length to change the maximum track length (default: 5).
  • Use --conf_thresh to change the point confidence threshold (default: 0.015).
  • Use --nn_thresh to change the descriptor matching distance threshold (default: 0.7).
  • Use --show_extra to show more computer vision outputs.
  • Press the q key to quit.
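
Putting these pieces together, the demo's main loop wires the three helper classes to the parameters above. The following is a rough sketch assuming the class interfaces defined in demo_superpoint.py; check the script for exact signatures and defaults:

from demo_superpoint import SuperPointFrontend, PointTracker, VideoStreamer

# Mirror the CPU-mode defaults from the command-line parameters above.
fe = SuperPointFrontend(weights_path='superpoint_v1.pth', nms_dist=4,
                        conf_thresh=0.015, nn_thresh=0.7, cuda=False)
tracker = PointTracker(max_length=5, nn_thresh=0.7)
vs = VideoStreamer('assets/icl_snippet/', camid=0, height=120, width=160,
                   skip=1, img_glob='*.png')

while True:
    img, status = vs.next_frame()          # grayscale float32 image in [0, 1]
    if status is False:
        break
    pts, desc, heatmap = fe.run(img)       # 3xN points (x, y, conf), 256xN descriptors
    tracker.update(pts, desc)              # two-way match against the previous frame
    tracks = tracker.get_tracks(min_length=2)  # keep tracks at least --min_length long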

BibTeX Citation

@inproceedings{detone18superpoint,
  author    = {Daniel DeTone and
               Tomasz Malisiewicz and
               Andrew Rabinovich},
  title     = {SuperPoint: Self-Supervised Interest Point Detection and Description},
  booktitle = {CVPR Deep Learning for Visual SLAM Workshop},
  year      = {2018},
  url       = {http://arxiv.org/abs/1712.07629}
}

Additional Notes

  • We do not intend to release the SuperPoint training or evaluation code, so please do not email us to ask for it.
  • We do not intend to release the Synthetic Shapes dataset used to bootstrap the SuperPoint training, so please do not email us to ask for it.
  • We use bilinear rather than the bicubic interpolation described in the paper to sample descriptors, as it is faster and gave us similar results.
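
For reference, this kind of bilinear descriptor sampling can be expressed with PyTorch's grid_sample. A minimal sketch, assuming a 1x256xHcxWc coarse descriptor map from the network and (x, y) point coordinates in the full-resolution HxW image (names here are illustrative):

import torch
import torch.nn.functional as F

def sample_descriptors(coarse_desc, pts, H, W):
    # coarse_desc: 1x256xHcxWc dense descriptor map from the network.
    # pts: 2xN tensor of (x, y) pixel coordinates in the HxW input image.
    samp = pts.t().clone().float()
    samp[:, 0] = samp[:, 0] / (W / 2.0) - 1.0   # grid_sample expects coords in [-1, 1]
    samp[:, 1] = samp[:, 1] / (H / 2.0) - 1.0
    samp = samp.view(1, 1, -1, 2)
    desc = F.grid_sample(coarse_desc, samp, mode='bilinear', align_corners=True)
    desc = desc.reshape(coarse_desc.shape[1], -1)    # 256xN
    return desc / desc.norm(dim=0, keepdim=True)     # re-normalize to unit length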

Legal Disclaimer

Magic Leap is proud to provide its latest samples, toolkits, and research projects on GitHub to foster development and gather feedback from the spatial computing community. Use of the resources within this repo is subject to (a) the license(s) included herein, or (b) if no license is included, Magic Leap's Developer Agreement, which is available on our Developer Portal. If you need more, just ask on the forums! We're thrilled to be part of a well-meaning, friendly, and welcoming community of millions.

More Repositories

  1. SuperGluePretrainedNetwork (Python, 3,056 stars)
     SuperGlue: Learning Feature Matching with Graph Neural Networks (CVPR 2020, Oral)
  2. Atlas (Python, 1,750 stars)
     Atlas: End-to-End 3D Scene Reconstruction from Posed Images
  3. DELTAS (Python, 95 stars)
     Inference code for DELTAS: Depth Estimation by Learning Triangulation And densification of Sparse points (ECCV 2020)
  4. prismatic (JavaScript, 38 stars)
     Prismatic is a declarative JS library for creating 3D content for the Helio browser.
  5. Magic-Leap-Toolkit-Unity (C#, 34 stars)
  6. MRTK-MagicLeapOne (C#, 31 stars)
     An extension that adds compatibility with Magic Leap features, such as hand tracking and 6DoF controller support, to Microsoft's Mixed Reality Toolkit (MRTK).
  7. xr-kit-samples-unity (C#, 26 stars)
     Magicverse SDK sample project
  8. UnityTemplate (C#, 21 stars)
  9. MagicLeapUnityExamples (C#, 17 stars)
     Examples for the Magic Leap Unity SDK, configured so developers can start developing for the Magic Leap platform quickly.
  10. LeapBrush (C#, 15 stars)
      Magic Leap 2's AR Cloud reference application for Unity that lets you draw in AR with other ML2 devices.
  11. MagicLeapUnitySDK (C#, 12 stars)
      Magic Leap Unity Developer SDK
  12. arcloud (Shell, 12 stars)
      AR Cloud from Magic Leap allows for shared experiences using features such as mapping, localization, and spatial anchors.
  13. detached_explainer (6 stars)
  14. IconCreationPlugin (Python, 6 stars)
  15. Desktop-Companion-App-Developer-Tools (C++, 4 stars)
      Desktop Companion App developer tools
  16. 3DBrainVisualizer (C#, 3 stars)
  17. developer-portal-docs (JavaScript, 3 stars)
      Home to the documentation and API section of the developer portal
  18. ml1-spectator-mode (C#, 3 stars)
      A Unity project that shows how to record a spectator view on the Magic Leap 1 using two co-located Magic Leap headsets.
  19. kernel-lumin (C, 2 stars)
  20. c_api_samples (C++, 2 stars)
      Lumin SDK C API samples
  21. tfmodules (Go, 2 stars)
      Repository for Terraform modules compatible with RenovateBot
  22. MagicLeapXRKeyboard (C#, 2 stars)
      A keyboard that can be used in any project that supports Unity's XR Interaction Toolkit.
  23. SpatialAnchorsExample (ShaderLab, 2 stars)
      A Unity app that uses Magic Leap 2's Spatial Anchors API and a JSON file to create content that persists in a Space across reboots. If the user is not localized, the app allows them to localize using a QR code.
  24. wifi-direct-shared-experience-sample (C#, 2 stars)
      A shared-experience sample app that uses a Wi-Fi Direct Service Discovery Android native Unity plug-in.
  25. c3 (Python, 1 star)
  26. ML1MarkerAndImageTrackingExample (C#, 1 star)
      Tutorials and demo projects for tracking images and ArUco markers with the Magic Leap
  27. wifi-direct-plugin-sample (Java, 1 star)
      A sample Android plugin for Unity that uses Wi-Fi Direct Service Discovery. The project is an Android app harness written in Java, and the plugin is an Android Activity contained in a Java module.
  28. com.magicleap.spectator.networkanchors (C#, 1 star)
      A lightweight package for the Magic Leap 1 that makes creating colocation experiences easier using a shared origin.
  29. MagicLeapReadyPlayerMe (C#, 1 star)
      Package containing scripts and samples for using Ready Player Me avatars with the Magic Leap 2.
  30. MagicLeapPhotonFusionExample (C#, 1 star)
      An example project demonstrating how to use Photon Fusion to create a colocation application for the Magic Leap 2. It provides a simple multiuser, colocated application and demonstrates the basics of creating a shared AR experience.