can-wikipedia-help-offline-rl
Official code for "Can Wikipedia Help Offline Reinforcement Learning?" by Machel Reid, Yutaro Yamada, and Shixiang Shane Gu

diffuser
DiffusER: Discrete Diffusion via Edit-based Reconstruction (Reid, Hellendoorn & Neubig, 2022)

m2d2
M2D2: A Massively Multi-domain Language Modeling Dataset (EMNLP 2022) by Machel Reid, Victor Zhong, Suchin Gururangan, and Luke Zettlemoyer

editpro
Learning to Model Editing Processes

subformer
Code for the Subformer, from the EMNLP 2021 Findings paper "Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers" by Machel Reid, Edison Marrese-Taylor, and Yutaka Matsuo

afromt
Code for the EMNLP 2021 paper "AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages" by Machel Reid, Junjie Hu, Graham Neubig, and Yutaka Matsuo

vcdm
Official implementation of "VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling" (EMNLP 2020)

paradise
PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining (NAACL 2022) by Machel Reid and Mikel Artetxe

abci-utils
Utilities for the ABCI (https://abci.ai/) cluster

twitter-search-api
Open-source Twitter Search API, developed during an internship at the Numada Lab under Professor Muneyoshi Numada at the University of Tokyo