• Stars: 3
• Rank: 3,963,521 (Top 79%)
• Language: TeX
• License: MIT License
• Created: about 5 years ago
• Updated: almost 2 years ago

Repository Details

Code for the TREC 2019 Fair Ranking Track

More Repositories

1. dis12-2020: DIS12 - Information Retrieval (11 stars)
2. repro_eval: A Python Interface to Reproducibility Measures of System-Oriented IR Experiments (Jupyter Notebook, 11 stars)
3. dis12-bdk24-2021 (10 stars)
4. dis25-2021: DIS25 - Natural Language Processing (Jupyter Notebook, 8 stars)
5. Qbias: A Dataset on Media Bias in Search Queries and Query Suggestions (Jupyter Notebook, 5 stars)
6. datasets: Datasets in the IR-Group (R, 5 stars)
7. labelstudio-to-fonduer: This small module connects Label Studio with Fonduer by creating a Fonduer labeling function for gold labels from a Label Studio export. Documentation: https://irgroup.github.io/labelstudio-to-fonduer/ (Python, 5 stars)
8. trec-covid: As part of the TREC-COVID challenge, the Information Retrieval Research Group at Technische Hochschule Köln develops search and retrieval algorithms to support the search for relevant information on COVID-19. (Python, 4 stars)
9. ir_metadata (Jupyter Notebook, 4 stars)
10. ecir2022-uqv-sim: Validating Simulations of User Query Variants (Python, 4 stars)
11. gelic: German Library Indexing Collection (Python, 3 stars)
12. sigir2020-measure-reproducibility: How to Measure the Reproducibility of System-oriented IR Experiments (MATLAB, 3 stars)
13. SUIR (Jupyter Notebook, 2 stars)
14. validating-synthetic-usage-data (Python, 2 stars)
15. LWDA2023-IR-community (Jupyter Notebook, 2 stars)
16. ipm-reproducibility: An in-depth Investigation on the Behaviour of Measures to Quantify Reproducibility (Jupyter Notebook, 1 star)
17. CLEF2023-LongEval-IRC (Smalltalk, 1 star)
18. clef2017: Accompanying material for CLEF2017 presentation (1 star)