Seunghyun Lee (@sseung0703)

Top repositories

1. Knowledge_distillation_via_TF2.0 (Python, 104 stars)
   Code for recent knowledge distillation algorithms and benchmark results, built on the TF2.0 low-level API.
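The core objective shared by most of the distillation repositories above is Hinton-style knowledge distillation: match the student's temperature-softened output distribution to the teacher's. A minimal NumPy sketch (function names are illustrative, not taken from the repository):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T gives a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened outputs, scaled by
    # T^2 so its gradient magnitude stays comparable to the hard-label loss.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return (T ** 2) * kl.mean()
```

In training, this term is typically added to the ordinary cross-entropy on ground-truth labels with a weighting coefficient.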
2. Zero-shot_Knowledge_Distillation (Python, 49 stars)
   "Zero-Shot Knowledge Distillation in Deep Networks" (ICML 2019).
3. SSKD_SVD (Python, 49 stars)
4. GALA_TF2.0 (Python, 42 stars)
   TensorFlow 2.0 implementation of "Symmetric Graph Convolutional Autoencoder for Unsupervised Graph Representation Learning" (ICCV 2019).
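Graph convolutional encoders of this kind build on a symmetrically normalized propagation operator over the graph adjacency matrix. A minimal NumPy sketch of that operator (the helper name is illustrative; this is the generic GCN-style normalization, not GALA's full encoder/decoder):

```python
import numpy as np

def normalized_adjacency(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}.
    # Multiplying node features by this matrix smooths each node's
    # representation toward its neighbors'.
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # degree of each node
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    return D_inv_sqrt @ A_hat @ D_inv_sqrt
```

A graph-conv layer is then roughly `activation(P @ X @ W)` where `P` is this operator, `X` the node features, and `W` a learned weight matrix.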
5. Lightweighting_Cookbook (Python, 23 stars)
   A cookbook for neural network training and lightweighting, covering three lightweighting solutions: knowledge distillation, filter pruning, and quantization.
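Of those three solutions, filter pruning is the simplest to sketch: rank a convolution layer's filters by a saliency score and keep only the strongest. A minimal NumPy illustration of L1-norm filter ranking (not the cookbook's actual code; the function name and the HWIO weight layout are assumptions):

```python
import numpy as np

def prune_filters(weights, keep_ratio=0.5):
    # L1-norm filter pruning: score each output filter (last axis of an
    # HWIO conv weight tensor) by the sum of absolute weights, then keep
    # the top `keep_ratio` fraction of filters.
    norms = np.abs(weights).sum(axis=(0, 1, 2))          # one score per filter
    n_keep = max(1, int(round(keep_ratio * norms.size)))
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])     # indices of kept filters
    return weights[..., keep], keep
```

In a real pipeline the matching input channels of the next layer are pruned as well, and the slimmed network is fine-tuned to recover accuracy.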
6. Variational_Information_Distillation (Python, 20 stars)
   Reproduction of Variational Information Distillation (VID, CVPR 2019); work in progress.
7. TF2-jit-compile-on-multi-gpu (Python, 17 stars)
   TensorFlow 2 training code with JIT compilation on multiple GPUs.
8. EKG (Python, 17 stars)
   Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning.
9. ADNet-tensorflow (Python, 16 stars)
10. Google_Colab_tutorial (Jupyter Notebook, 14 stars)
    Google Colab tutorial with simple network training and TensorBoard.
11. Autoslim_TF2 (Python, 12 stars)
    Implementation of AutoSlim using TensorFlow 2.
12. MHGD (5 stars)
    Presentation materials for Multi-head Graph Distillation (BMVC 2019 oral).
13. SSKD (4 stars)
14. IEPKT (Python, 4 stars)
    Implementation of "Interpretable Embedding Procedure Knowledge Transfer" (AAAI 2021).
15. sseung0703.github.io (SCSS, 4 stars)
16. tensorflow2.0_wo_keras (Python, 3 stars)
17. CNN_via_Tensorflow2_low-level (Python, 3 stars)
    Convolutional neural network implementation with the TensorFlow 2.0 low-level API only.
18. ACCESS_KD (Jupyter Notebook, 1 star)
    This project solves a problem of ZSKT (NeurIPS 2019).
19. aingo03304.github.io (CSS, 1 star)
    Minjae's Blog.
20. sseung0703 (1 star)