torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.

head-network-distillation
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"

supervised-compression
[WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"

hnd-ghnd-object-detectors
[ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [ACM MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challenges and Preliminary Results"

sc2-benchmark
[TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"

bottlefit-split_computing
[IEEE WoWMoM 2022] "BottleFit: Learning Compressed Representations in Deep Neural Networks for Effective and Efficient Split Computing"

split-beam
[ICDCS 2023] "SplitBeam: Effective and Efficient Beamforming in Wi-Fi Networks Through Split Computing"

yoshitomo-matsubara.github.io
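Most of the repositories above build on knowledge distillation, where a compact student network is trained to match a larger teacher's softened output distribution. As background only, the classic soft-target distillation loss (Hinton et al., 2015) can be sketched in plain Python; this is a minimal illustration and makes no assumptions about torchdistill's actual API:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence from the student's to the teacher's softened
    distribution, scaled by T^2 as in Hinton et al. (2015)."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# A student whose logits match the teacher's incurs zero loss.
print(distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))  # → 0.0
```

In practice this term is combined with the ordinary cross-entropy on hard labels; the higher temperature exposes the teacher's relative preferences among wrong classes, which is the signal the student learns from.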
section-categorization
Automated Section Categorization in Scientific Papers

hadoop-example
Run example codes on your computer to understand how Hadoop works
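The core idea hadoop-example demonstrates is MapReduce: a map phase emits key-value pairs, the framework shuffles and sorts them by key, and a reduce phase aggregates each group. The canonical word-count job can be simulated locally in plain Python; this is a self-contained sketch of the programming model, not code from the repository:

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Map phase: emit a (word, 1) pair for every word in a line."""
    for word in line.lower().split():
        yield (word, 1)

def reducer(key, values):
    """Reduce phase: sum the counts for one word."""
    return (key, sum(values))

def run_job(lines):
    """Simulate Hadoop's map -> shuffle/sort -> reduce pipeline."""
    # Map: apply the mapper to every input line
    pairs = [kv for line in lines for kv in mapper(line)]
    # Shuffle/sort: group pairs by key, as Hadoop does between phases
    pairs.sort(key=itemgetter(0))
    # Reduce: aggregate each key's group of values
    return dict(
        reducer(key, (v for _, v in group))
        for key, group in groupby(pairs, key=itemgetter(0))
    )

print(run_job(["hello world", "hello hadoop"]))
# → {'hadoop': 1, 'hello': 2, 'world': 1}
```

On a real cluster the same mapper and reducer would run as separate distributed tasks (e.g. via Hadoop Streaming), with the shuffle/sort handled by the framework rather than an in-memory sort.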