  • Stars: 2,961
  • Rank: 15,305 (top 0.4%)
  • Language: Python
  • Created: almost 5 years ago
  • Updated: 10 months ago

Pretrained Language Model

This repository provides the latest pretrained language models and their related optimization techniques developed by Huawei Noah's Ark Lab.

Directory structure

  • PanGu-α is a large-scale autoregressive pretrained Chinese language model with up to 200B parameters. The models are developed under MindSpore and trained on a cluster of Ascend 910 AI processors.
  • NEZHA-TensorFlow is a pretrained Chinese language model, developed under TensorFlow, that achieves state-of-the-art performance on several Chinese NLP tasks.
  • NEZHA-PyTorch is the PyTorch version of NEZHA.
  • NEZHA-Gen-TensorFlow provides two GPT models: Yuefu (乐府), a Chinese classical poetry generation model, and a general-purpose Chinese GPT model.
  • TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster at inference (a usage sketch follows this list).
  • TinyBERT-MindSpore is a MindSpore version of TinyBERT.
  • DynaBERT is a dynamic BERT model with adaptive width and depth.
  • BBPE provides a byte-level vocabulary building tool and its corresponding tokenizer.
  • PMLM is a probabilistically masked language model. Trained without the complex two-stream self-attention, PMLM can be treated as a simple approximation of XLNet.
  • TernaryBERT is a weight ternarization method for BERT models, developed under PyTorch.
  • TernaryBERT-MindSpore is the MindSpore version of TernaryBERT.
  • HyperText is an efficient text classification model based on hyperbolic geometry.
  • BinaryBERT is a weight binarization method for BERT models that uses ternary weight splitting, developed under PyTorch.
  • AutoTinyBERT provides a model zoo that can meet different latency requirements.
  • PanGu-Bot is a Chinese pretrained open-domain dialog model built on the GPU implementation of PanGu-α.
  • CeMAT is a universal sequence-to-sequence multilingual pretrained language model for both autoregressive and non-autoregressive neural machine translation tasks.
  • Noah_WuKong is a large-scale Chinese vision-language dataset and a group of benchmarking models trained on it.
  • Noah_WuKong-MindSpore is a MindSpore version of Noah_WuKong.
  • CAME is a Confidence-guided Adaptive Memory Efficient Optimizer (an optimizer sketch follows this list).
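As a concrete illustration of how these releases are typically consumed, here is a minimal sketch of feature extraction with a TinyBERT checkpoint through the Hugging Face transformers library. It assumes the general-distillation checkpoint huawei-noah/TinyBERT_General_4L_312D published on the Hugging Face Hub; a local path to a checkpoint downloaded from this repository's release page would work the same way.

```python
# Minimal sketch: feature extraction with a 4-layer TinyBERT checkpoint.
# Assumption: the general-distillation checkpoint is available on the
# Hugging Face Hub as "huawei-noah/TinyBERT_General_4L_312D".
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "huawei-noah/TinyBERT_General_4L_312D"  # 4 layers, hidden size 312

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

inputs = tokenizer("TinyBERT is 7.5x smaller than BERT-base.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings with shape (batch, sequence_length, 312).
print(outputs.last_hidden_state.shape)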

More Repositories

  1. Efficient-AI-Backbones: Efficient AI backbones including GhostNet, TNT and MLP, developed by Huawei Noah's Ark Lab. (Python, 4,021 stars)
  2. HEBO: Bayesian optimisation and reinforcement learning library developed by Huawei Noah's Ark Lab. (Jupyter Notebook, 3,266 stars)
  3. Efficient-Computing: Efficient computing methods developed by Huawei Noah's Ark Lab. (Jupyter Notebook, 1,116 stars)
  4. AdderNet: Code for the paper "AdderNet: Do We Really Need Multiplications in Deep Learning?" (Python, 952 stars)
  5. trustworthyAI: Trustworthy AI related projects. (Python, 949 stars)
  6. SMARTS: Scalable Multi-Agent RL Training School for Autonomous Driving. (Python, 922 stars)
  7. bolt: A deep learning library with high performance and heterogeneous flexibility. (C++, 896 stars)
  8. noah-research: Noah Research. (Python, 867 stars)
  9. vega: AutoML tool chain. (Python, 840 stars)
  10. VanillaNet (Python, 810 stars)
  11. Speech-Backbones: The main repository of open-sourced speech technology by Huawei Noah's Ark Lab. (Jupyter Notebook, 547 stars)
  12. streamDM: Stream data mining library for Spark Streaming. (Scala, 492 stars)
  13. Pretrained-IPT (Python, 406 stars)
  14. xingtian: A componentized library for the development and verification of reinforcement learning algorithms. (Python, 305 stars)
  15. benchmark (HTML, 274 stars)
  16. Disout: Code for the AAAI 2020 paper "Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks" (Disout). (Python, 219 stars)
  17. BGCN: A TensorFlow implementation of "Bayesian Graph Convolutional Neural Networks" (AAAI 2019). (Python, 152 stars)
  18. BHT-ARIMA: Code for the paper "Block Hankel Tensor ARIMA for Multiple Short Time Series Forecasting" (AAAI-20). (Python, 97 stars)
  19. multi_hyp_cc: A Multi-Hypothesis Approach to Color Constancy (CVPR 2020). (Python, 82 stars)
  20. Efficient-NLP (Python, 79 stars)
  21. streamDM-Cpp: Stream machine learning in C++. (C++, 68 stars)
  22. Federated-Learning (Python, 15 stars)