• Stars: 1
• Language: HTML
• Created: over 1 year ago
• Updated: 10 months ago


More Repositories

1. torchdistill — Python, 1,083 stars
   A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
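As a rough illustration of the knowledge distillation objective this kind of framework implements, here is a minimal sketch of the classic soft-label distillation loss (temperature-scaled KL term plus cross-entropy). This is a generic example, not torchdistill's actual API; the function name and default hyperparameters are illustrative.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-softened distributions;
    # the T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: standard cross-entropy with ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    # Blend the two terms; alpha trades off mimicking the teacher vs. fitting labels.
    return alpha * soft + (1 - alpha) * hard
```

In practice a student network is trained by minimizing this loss against a frozen teacher's logits batch by batch.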
2. head-network-distillation — Jupyter Notebook, 32 stars
   [IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
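The split-computing idea behind this work can be sketched in a few lines: the first layers of a network (the "head") run on the edge device, the intermediate tensor is sent over the network, and the remaining layers (the "tail") finish inference on a server. The toy backbone and split point below are illustrative, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

# Toy backbone; in the papers above the head is additionally distilled and
# compressed so the transmitted tensor is small.
backbone = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),   # head: runs on-device
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),  # tail: runs server-side
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10),
)
head = backbone[:2]   # executed on the edge device
tail = backbone[2:]   # executed on the server

x = torch.randn(1, 3, 32, 32)
z = head(x)   # intermediate representation transmitted to the server
y = tail(z)   # server completes the forward pass
```

Because `tail(head(x))` equals a full forward pass, the split changes where computation happens without changing the model's output.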
3. supervised-compression — Python, 29 stars
   [WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"
4. hnd-ghnd-object-detectors — Jupyter Notebook, 24 stars
   [ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [ACM MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challenges and Preliminary Results"
5. sc2-benchmark — Python, 19 stars
   [TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
6. uci-cs273a-project — Python, 6 stars
   Basic support for final projects in UCI CS 273A: Machine Learning
7. bottlefit-split_computing — Python, 5 stars
   [IEEE WoWMoM 2022] "BottleFit: Learning Compressed Representations in Deep Neural Networks for Effective and Efficient Split Computing"
8. split-beam — Shell, 4 stars
   [ICDCS 2023] "SplitBeam: Effective and Efficient Beamforming in Wi-Fi Networks Through Split Computing"
9. section-categorization — Python, 1 star
   Automated Section Categorization in Scientific Papers
10. hadoop-example — Java, 1 star
    Run the example code on your computer to understand how Hadoop works