KD_methods_with_TF
Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added).

Knowledge_distillation_via_TF2.0
Code for recent knowledge distillation algorithms and benchmark results, implemented with the TF2.0 low-level API.
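
For orientation, most of the methods in these two repositories extend the classic soft-target loss of Hinton et al. (2015). The sketch below is that generic baseline in TF2, not the exact loss of any method here; the `temperature` and `alpha` values are illustrative defaults.

```python
import tensorflow as tf

def distillation_loss(teacher_logits, student_logits, labels,
                      temperature=4.0, alpha=0.9):
    # Soften both output distributions with the temperature.
    soft_teacher = tf.nn.softmax(teacher_logits / temperature)
    log_soft_student = tf.nn.log_softmax(student_logits / temperature)
    # Cross-entropy against the softened teacher, scaled by T^2 so that
    # gradient magnitudes stay comparable across temperatures.
    kd = -tf.reduce_mean(tf.reduce_sum(soft_teacher * log_soft_student, axis=1))
    kd *= temperature ** 2
    # Ordinary cross-entropy on the hard labels.
    ce = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=student_logits))
    return alpha * kd + (1.0 - alpha) * ce
```
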
Zero-shot_Knowledge_Distillation
Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019).
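
The core trick of that paper is to synthesize "data impressions" from the teacher alone: sample a class distribution from a Dirichlet prior and optimize a noise input until the teacher's softened prediction matches it. A rough sketch under assumed placeholders (the `teacher` callable, 32x32x3 inputs, 10 classes, and the step/temperature values are all illustrative, not the repository's settings):

```python
import numpy as np
import tensorflow as tf

def synthesize_data_impression(teacher, target_probs, steps=500,
                               temperature=20.0, shape=(1, 32, 32, 3)):
    # Start from noise and optimize the input so the teacher's softened
    # prediction matches the sampled target distribution.
    x = tf.Variable(tf.random.normal(shape))
    opt = tf.keras.optimizers.Adam(0.01)
    for _ in range(steps):
        with tf.GradientTape() as tape:
            log_p = tf.nn.log_softmax(teacher(x, training=False) / temperature)
            loss = -tf.reduce_sum(target_probs * log_p)  # cross-entropy
        opt.apply_gradients([(tape.gradient(loss, x), x)])
    return tf.convert_to_tensor(x)

# Class targets are drawn from a Dirichlet over the label space (10 classes here).
target_probs = tf.constant(np.random.dirichlet(np.ones(10)), dtype=tf.float32)
```
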
SSKD_SVD

GALA_TF2.0
TensorFlow 2.0 implementation of "Symmetric Graph Convolutional Autoencoder for Unsupervised Graph Representation Learning" (ICCV 2019).

Lightweighting_Cookbook
This project aims to build a cookbook for neural network training and lightweighting, covering three lightweighting solutions: knowledge distillation, filter pruning, and quantization.

Variational_Information_Distillation
Reproduction of VID (CVPR 2019); work in progress.

TF2-jit-compile-on-multi-gpu
TensorFlow 2 training code with JIT compilation on multiple GPUs.
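
A minimal sketch of the idea, assuming TF >= 2.5 for `tf.function(jit_compile=True)` (older releases use `experimental_compile=True`); the model and optimizer are placeholders, not the repository's code. Only the forward/backward pass is XLA-compiled, so the cross-replica gradient all-reduce stays outside the compiled function.

```python
import tensorflow as tf

# Replicate variables across all visible GPUs.
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(10, input_shape=(784,))])
    optimizer = tf.keras.optimizers.SGD(0.1)

# XLA-compile the forward and backward pass of a single replica.
@tf.function(jit_compile=True)
def compute_grads(x, y):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=y, logits=model(x, training=True)))
    return loss, tape.gradient(loss, model.trainable_variables)

def replica_step(x, y):
    loss, grads = compute_grads(x, y)
    # The cross-replica all-reduce happens here, outside XLA.
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

@tf.function
def distributed_step(x, y):
    per_replica = strategy.run(replica_step, args=(x, y))
    return strategy.reduce(tf.distribute.ReduceOp.MEAN, per_replica, axis=None)
```

In practice batches would be fed through `strategy.experimental_distribute_dataset` so each replica receives its own shard.
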
EKG
Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning.
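
EKG's ensemble-guided search is not reproduced here, but for readers new to filter pruning, the baseline such methods improve on can be as simple as ranking filters by weight magnitude (the L1-norm criterion of Li et al., 2017); the kernel below is random data purely for illustration.

```python
import tensorflow as tf

def l1_filter_scores(conv_kernel):
    # conv_kernel: [kh, kw, c_in, c_out]; one L1 score per output filter.
    return tf.reduce_sum(tf.abs(conv_kernel), axis=[0, 1, 2])

# Usage: drop the weakest half of 32 filters by magnitude.
kernel = tf.random.normal([3, 3, 16, 32])
order = tf.argsort(l1_filter_scores(kernel))  # ascending: weakest first
keep = tf.sort(order[16:])                    # the 16 strongest filters
pruned = tf.gather(kernel, keep, axis=3)      # -> [3, 3, 16, 16]
```
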
Google_Colab_tutorial
A Google Colab tutorial with simple network training and TensorBoard.

Autoslim_TF2
Implementation of AutoSlim using TensorFlow 2.

MHGD
Presentation materials for Multi-head Graph Distillation (BMVC 2019 oral).

SSKD

IEPKT
Implementation of "Interpretable embedding procedure knowledge transfer" on AAAI2021sseung0703.github.io
tensorflow2.0_wo_keras

CNN_via_Tensorflow2_low-level
Convolutional neural network implementation with the TensorFlow 2.0 low-level API only.
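
A minimal sketch of the style these two repositories use: only `tf.Variable`, `tf.nn` ops, and a hand-written SGD step, with no Keras layers or optimizers. The layer sizes and MNIST-like 28x28 input are assumptions for illustration, not the repository's exact model.

```python
import tensorflow as tf

# Parameters as raw tf.Variables; a 28x28 grayscale input is assumed.
w_conv = tf.Variable(tf.random.normal([3, 3, 1, 16], stddev=0.1))
b_conv = tf.Variable(tf.zeros([16]))
w_fc = tf.Variable(tf.random.normal([14 * 14 * 16, 10], stddev=0.1))
b_fc = tf.Variable(tf.zeros([10]))

def forward(x):  # x: [batch, 28, 28, 1]
    h = tf.nn.relu(tf.nn.conv2d(x, w_conv, strides=1, padding='SAME') + b_conv)
    h = tf.nn.max_pool2d(h, ksize=2, strides=2, padding='SAME')  # -> [batch, 14, 14, 16]
    return tf.reshape(h, [-1, 14 * 14 * 16]) @ w_fc + b_fc

@tf.function
def train_step(x, y, lr=0.1):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=y, logits=forward(x)))
    variables = [w_conv, b_conv, w_fc, b_fc]
    for v, g in zip(variables, tape.gradient(loss, variables)):
        v.assign_sub(lr * g)  # plain SGD update, no Keras optimizer
    return loss
```
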
ACCESS_KD
This project addresses a problem of ZSKT (NeurIPS 2019).

aingo03304.github.io
Minjae's blog.

sseung0703