IR From Bag-of-words to BERT and Beyond through Practical Experiments
This is the official repository of "IR From Bag-of-words to BERT and Beyond through Practical Experiments", an ECIR 2021 full-day tutorial using the PyTerrier and OpenNIR search toolkits.
About the tutorial
Advances from the natural language processing community have recently sparked a renaissance in the task of ad-hoc search. In particular, large contextualized language modeling techniques, such as BERT, have equipped ranking models with a far deeper understanding of language than previous bag-of-words (BoW) models. Applying these techniques to a new task is tricky, requiring knowledge of deep learning frameworks and significant scripting and data munging. In this full-day tutorial, we build up from foundational retrieval principles to the latest neural ranking techniques. We provide background on classical (e.g., BoW), modern (e.g., Learning to Rank) and contemporary (e.g., BERT) search ranking and re-ranking techniques. Going further, we detail and demonstrate how these can be easily applied experimentally to new search tasks in a new declarative style of conducting experiments, exemplified by the PyTerrier and OpenNIR search toolkits.
The tutorial is interactive: it is broken into sessions, each of which mixes explanatory presentation with hands-on activities using prepared Jupyter notebooks running on the Google Colab platform.
At the end of the tutorial, participants will be comfortable accessing classical inverted index data structures, building declarative retrieval pipelines, and conducting experiments using state-of-the-art neural ranking models.
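To give a flavour of this declarative experiment style, below is a minimal sketch of indexing-free retrieval and evaluation with PyTerrier. The Vaswani test collection and the choice of metrics are illustrative assumptions on our part, not something prescribed by the tutorial notebooks.

```python
# A minimal sketch of a declarative PyTerrier experiment, assuming
# python-terrier is installed (pip install python-terrier). The "vaswani"
# collection and the metrics below are illustrative choices.
import pyterrier as pt
if not pt.started():
    pt.init()

# The Vaswani collection ships with topics, qrels, and a pre-built index
dataset = pt.get_dataset("vaswani")
index = dataset.get_index()

# Two bag-of-words retrieval pipelines over the same inverted index
bm25 = pt.BatchRetrieve(index, wmodel="BM25")
tfidf = pt.BatchRetrieve(index, wmodel="TF_IDF")

# A declarative experiment: run both systems on the topics and
# evaluate their rankings against the relevance judgments
print(pt.Experiment(
    [bm25, tfidf],
    dataset.get_topics(),
    dataset.get_qrels(),
    eval_metrics=["map", "ndcg"],
))
```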
Authors
- Sean MacAvaney, University of Glasgow, UK
- Craig Macdonald, University of Glasgow, UK
- Nicola Tonellotto, University of Pisa, IT
Contents
- Part 1: Classical IR: indexing, retrieval and evaluation
- Part 2: Modern Retrieval Architectures: PyTerrier data model and operators, towards re-rankers and learning-to-rank
- Part 3: Contemporary Retrieval Architectures: neural re-rankers such as BERT, EPIC, ColBERT (a pipeline in this style is sketched after this list)
- Part 4: Recent Advances beyond the classical inverted index: neural inverted index augmentation, nearest neighbor search, dense retrieval
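As a taste of Parts 2 and 3, the sketch below chains a BM25 retriever into a neural re-ranker using PyTerrier's pipeline operators. It is adapted from the public example in the PyTerrier_T5 plugin's README; the plugin, its MonoT5ReRanker class, and the "text" field name are assumptions about that plugin's API rather than something defined in this repository.

```python
# A hedged sketch of a BM25 >> neural re-ranker pipeline (Parts 2-3),
# assuming the pyterrier_t5 plugin is installed; MonoT5ReRanker and the
# "text" field follow that plugin's README and may change across versions.
import pyterrier as pt
if not pt.started():
    pt.init()
from pyterrier_t5 import MonoT5ReRanker

# Pre-built index for candidate retrieval; ir_datasets view for document text
index = pt.get_dataset("vaswani").get_index()
dataset = pt.get_dataset("irds:vaswani")

bm25 = pt.BatchRetrieve(index, wmodel="BM25")
monoT5 = MonoT5ReRanker()  # downloads a pre-trained monoT5 model

# >> composes transformers: retrieve the top 100 candidates with BM25,
# attach their text, then re-rank the candidates with monoT5
pipeline = (bm25 % 100) >> pt.text.get_text(dataset, "text") >> monoT5

pt.Experiment(
    [bm25, pipeline],
    dataset.get_topics(),
    dataset.get_qrels(),
    eval_metrics=["map"],
)
```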
Useful Links
- PyTerrier: [Github] [Documentation]
- OpenNIR: [Github] [Documentation]
- PyTerrier_ColBERT: [Github]
- PyTerrier_T5: [Github]
- PyTerrier_doc2query: [Github]
- PyTerrier_DeepCT: [Github]
- PyTerrier_ANCE: [Github]
Citation Policy
If you make use of any of these slides, notebooks, or additional PyTerrier plugins, please cite our tutorial abstract:
@inproceedings{ecir2021-tut-bow2b,
  author = {MacAvaney, Sean and Macdonald, Craig and Tonellotto, Nicola},
  title = {IR From Bag-of-words to BERT and Beyond through Practical Experiments: An ECIR 2021 tutorial with PyTerrier and OpenNIR},
  booktitle = {Proceedings of the 43rd European Conference on Information Retrieval Research},
  year = {2021}
}
Feedback
If you attended our ECIR 2021 tutorial, we would appreciate your (anonymous) feedback via this quiz: https://forms.office.com/r/2WbpLiQmWV