Paper Lists of Neural Architecture Search
This document lists papers on Neural Architecture Search (NAS) published from 2017 to February 2021. We collected these papers from 13 conferences and journals, including ACL, IJCAI, AAAI, JMLR, ICLR, EMNLP, CVPR, UAI, ICCV, NeurIPS, ECCV, INTERSPEECH, and ICML, covering most NAS research directions. We also categorize the papers into popular topics and collect the code for them where available.
Outline
- Code
We provide code links for the papers where code is available. Parentheses around a code link indicate that the code was not written by the authors of the paper, e.g., (github).
- Type
We classify the papers according to their type and group papers of the same type together. Please refer to the outline above for the specific classification criteria.
- Task
We summarize the tasks of these papers according to the experiments described in them. For notational simplicity, we use short names of the tasks (in alphabetical order).
- ASR (Automatic Speech Recognition)
- speech recognition
- speaker verification
- speaker identification
- acoustic scene classification
- keyword spotting
- spoken language identification
- multilingual speech recognition
- CL (Classification)
- object classification
- scene classification
- point cloud classification
- node classification
- IC (Image Classification)
- IR (Image Recognition)
- IRT (Image ResToration)
- image inpainting
- image denoising
- image de-raining
- image restoration
- LM (Language Model)
- MT (Machine Translation)
- NER (Named Entity Recognition)
- O (Other)
- OD (Object Detection)
- SE (Segmentation)
- instance segmentation
- OAR segmentation
- SS (Semantic Segmentation)
- VU (Video Understanding)
- Info
Info provides extended information about each paper, including the Title, Authors, Abstract, and BibTeX entry.
- A Short List
For a quick look at the field, we summarize a short list of must-read papers. In this section, we list some highly cited papers and also recommend some papers (in bold) that may be helpful for beginners.
Statistics
Figure 1: The number of papers of each Type and the annual totals over the past four years. (See Outline for Type details.)
Figure 2: The number of papers in each Task area in recent years. (See Outline for Task details.)
Figure 3: The word cloud for NAS.
A Short List
A Long List
1 Surveys - S
Title | Venue | Code | Year | Info |
---|---|---|---|---|
Neural architecture search: A survey | JMLR | - | 2018 | details |
A Survey on Neural Architecture Search | - | - | 2019 | details |
Reinforcement Learning for Neural Architecture Search: A Review | IVC | - | 2019 | details |
A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions | - | - | 2020 | details |
Evaluating The Search Phase of Neural Architecture Search | ICLR | github | 2020 | details |
NAS evaluation is frustratingly hard | ICLR | github | 2020 | details |
AutoML: A survey of the state-of-the-art | Knowl Based Syst | - | 2021 | - |
2 Methods
2.1 Search Space - SP
2.2 Search Strategy
2.2.1 Reinforcement Learning Methods - RL
2.2.2 Gradient-based Methods - G
2.2.3 Evolutionary Algorithms - EA
2.2.4 Bayesian Optimization - BO
Title | Venue | Task | Code | Year | Info |
---|---|---|---|---|---|
Neural Architecture Search with Bayesian Optimisation and Optimal Transport | NeurIPS | IC | github | 2018 | details |
Learnable Embedding Space for Efficient Neural Architecture Compression | ICLR | IC | github | 2019 | details |
Posterior-Guided Neural Architecture Search | AAAI | IC | github | 2020 | details |
Bridging the Gap between Sample-based and One-shot Neural Architecture Search with BONAS | NeurIPS | IC | github | 2020 | details |
Interpretable Neural Architecture Search via Bayesian Optimisation with Weisfeiler-Lehman Kernels | ICLR | IC | - | 2021 | details |
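As a rough illustration of what the Bayesian optimization methods in this table have in common, below is a minimal, self-contained sketch of a BO-style NAS loop. It is not the implementation of any paper listed here: the toy search space, the placeholder `evaluate` function, and the simplistic neighbor-averaging surrogate are all assumptions made only to keep the example runnable; the papers above replace the surrogate with, e.g., a Gaussian process, a graph kernel, or a learned predictor.

```python
# Minimal sketch of a Bayesian-optimization-style NAS loop (illustrative only).
import random

# Toy cell-based search space: one operation per edge (hypothetical).
OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]
NUM_EDGES = 4

def sample_architecture():
    """Draw a random architecture from the toy search space."""
    return tuple(random.choice(OPS) for _ in range(NUM_EDGES))

def evaluate(arch):
    """Placeholder for the expensive step of training/validating a network
    built from `arch`; returns a made-up score so the sketch runs."""
    return sum(len(op) for op in arch) / 100.0 + random.gauss(0, 0.01)

def surrogate_score(arch, history):
    """Very crude surrogate: average observed score of previously evaluated
    architectures that share at least half of their operations with `arch`.
    Stands in for the GP / graph-kernel / predictor models in the papers."""
    scores = [score for a, score in history
              if sum(x == y for x, y in zip(a, arch)) >= NUM_EDGES // 2]
    return sum(scores) / len(scores) if scores else 0.0

def bo_search(n_init=5, n_iter=20, n_candidates=50):
    # Start with a few randomly sampled, truly evaluated architectures.
    history = [(a, evaluate(a))
               for a in (sample_architecture() for _ in range(n_init))]
    for _ in range(n_iter):
        # Acquisition step: among cheap random candidates, pick the one the
        # surrogate likes best, then spend a real evaluation only on it.
        candidates = [sample_architecture() for _ in range(n_candidates)]
        best = max(candidates, key=lambda a: surrogate_score(a, history))
        history.append((best, evaluate(best)))
    return max(history, key=lambda item: item[1])

if __name__ == "__main__":
    arch, score = bo_search()
    print("best architecture:", arch, "score: %.3f" % score)
```

The design choice these methods share is that the expensive true evaluation is spent only on architectures selected by the surrogate and acquisition step, so far fewer networks need to be trained than with random search.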
2.3 Performance Prediction - PD
2.4 Others - O
3 Systems
Team Members
The project is maintained by
Yongyu Mu, Zefan Zhou, Zhongxiang Yan, Chi Hu, Yinqiao Li, Tong Xiao, and Jingbo Zhu
Natural Language Processing Lab., School of Computer Science and Engineering, Northeastern University
NiuTrans Research
For any questions, please feel free to contact us (heshengmo [at] foxmail [dot] com or li.yin.qiao.2012 [at] hotmail [dot] com)