• Stars: 23
• Rank: 1,010,814 (Top 21%)
• Created over 2 years ago
• Updated over 1 year ago


Repository Details

Learning to Model Editing Processes

More Repositories

1. can-wikipedia-help-offline-rl (Python, 91 stars)
   Official code for "Can Wikipedia Help Offline Reinforcement Learning?" by Machel Reid, Yutaro Yamada, and Shixiang Shane Gu

2. diffuser (44 stars)
   DiffusER: Discrete Diffusion via Edit-based Reconstruction (Reid, Hellendoorn & Neubig, 2022)

3. m2d2 (Python, 44 stars)
   M2D2: A Massively Multi-domain Language Modeling Dataset (EMNLP 2022) by Machel Reid, Victor Zhong, Suchin Gururangan, and Luke Zettlemoyer

4. lewis (Python, 27 stars)
   Official code for "LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer" (ACL-IJCNLP 2021 Findings) by Machel Reid and Victor Zhong

5. subformer (Python, 11 stars)
   Code for the EMNLP 2021 Findings paper "Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers" by Machel Reid, Edison Marrese-Taylor, and Yutaka Matsuo

6. afromt (Python, 8 stars)
   Code for the EMNLP 2021 paper "AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages" by Machel Reid, Junjie Hu, Graham Neubig, and Yutaka Matsuo

7. vcdm (Python, 8 stars)
   Official implementation of "VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling" (EMNLP 2020)

8. paradise (Python, 6 stars)
   Code for "PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining" (NAACL 2022) by Machel Reid and Mikel Artetxe

9. abci-utils (Shell, 4 stars)
   Utilities for the ABCI cluster (https://abci.ai/)

10. twitter-search-api (Python, 1 star)
    Open-source Twitter Search API. Developed during an internship at the Numada Lab with Professor Muneyoshi Numada at the University of Tokyo