
TransNet V2: Shot Boundary Detection Neural Network

This repository contains code for TransNet V2: An effective deep network architecture for fast shot transition detection.

Our reevaluation of other publicly available state-of-the-art shot boundary methods (F1 scores):

| Model | ClipShots | BBC Planet Earth | RAI |
| --- | --- | --- | --- |
| TransNet V2 (this repo) | 77.9 | 96.2 | 93.9 |
| TransNet (github) | 73.5 | 92.9 | 94.3 |
| Hassanien et al. (github) | 75.9 | 92.6 | 93.9 |
| Tang et al., ResNet baseline (github) | 76.1 | 89.3 | 92.8 |

🎥 USE IT ON YOUR VIDEO

โžก๏ธ See inference folder and its README file. โฌ…๏ธ

🚀 PYTORCH VERSION for inference RELEASED

See the inference-pytorch folder and its README file.
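Likewise, a hedged sketch of the PyTorch path. The module, class, and weight-file names below follow the inference-pytorch README and should be verified there; the dummy input shape (batch x frames x height x width x RGB, with 48x27 frames) is also an assumption to check against that README.

```python
# Hedged sketch; module name, class name, and weight filename are
# assumptions taken from the inference-pytorch README.
import torch
from transnetv2_pytorch import TransNetV2

model = TransNetV2()
model.load_state_dict(torch.load("transnetv2-pytorch-weights.pth"))
model.eval()

with torch.no_grad():
    # Dummy input: batch x frames x height x width x RGB channels, uint8.
    # Real inputs should be video frames resized to 48x27 RGB.
    video = torch.zeros(1, 100, 27, 48, 3, dtype=torch.uint8)
    single_frame_pred, all_frame_pred = model(video)
    # The raw outputs are logits; map them to probabilities.
    single_frame_pred = torch.sigmoid(single_frame_pred)
```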

REPLICATE THE WORK

Note that the training datasets are tens of gigabytes in size, and hundreds of gigabytes once exported.

You do not need to train the network; use the code and instructions in the inference folder to detect shots in your videos.

This repository contains everything needed to run any experiment for the TransNet V2 network, including network training and dataset creation. All experiments should be runnable in this NVIDIA Docker file.

In general, the following steps need to be done to replicate our work (in the training folder):

  1. Download RAI and BBC Planet Earth test datasets (link). Download ClipShots train/test dataset (link). Optionally get IACC.3 dataset.
  2. Edit and run consolidate_datasets.py in order to transform ground truth from all the datasets into one common format.
  3. Take some videos from ClipShotsTrain aside as a validation dataset.
  4. Run create_dataset.py to create all train/validation/test datasets.
  5. Run training.py ../configs/transnetv2.gin to train a model.
  6. Run evaluate.py /path/to/run_log_dir epoch_no /path/to/test_dataset for proper evaluation (an illustrative F1 computation is sketched after this list).
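For intuition about the F1 numbers reported above and measured in step 6, here is an illustrative (not the repository's) shot-boundary F1 computation; the frame-index boundary representation and the matching tolerance are assumptions made for the example, and evaluate.py remains authoritative.

```python
def f1_score(predicted, ground_truth, tolerance=2):
    """Illustrative F1 for shot boundaries given as frame indices.

    A prediction counts as a hit if it lands within `tolerance` frames
    of a ground-truth boundary that has not been matched yet.
    """
    matched = set()
    hits = 0
    for p in predicted:
        for i, g in enumerate(ground_truth):
            if i not in matched and abs(p - g) <= tolerance:
                matched.add(i)
                hits += 1
                break
    precision = hits / len(predicted) if predicted else 0.0
    recall = hits / len(ground_truth) if ground_truth else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Example: three predicted boundaries against four ground-truth ones.
# Precision 3/3, recall 3/4, so F1 = 2*(1.0*0.75)/1.75 ≈ 0.857.
print(f1_score([10, 55, 120], [11, 54, 90, 121]))
```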

CREDITS

If you find this work useful, please cite us ;)