Anserini <img src="docs/anserini-logo.png" width="300" />

Anserini is a toolkit for reproducible information retrieval research. By building on Lucene, we aim to bridge the gap between academic information retrieval research and the practice of building real-world search applications. Among other goals, our effort aims to be the opposite of this.* Anserini grew out of a reproducibility study of various open-source retrieval engines in 2016 (Lin et al., ECIR 2016). See Yang et al. (SIGIR 2017) and Yang et al. (JDIQ 2018) for overviews.

❗ Anserini was upgraded from JDK 11 to JDK 21 at commit 272565 (2024/04/03), which corresponds to the release of v0.35.0.

💥 Try It!

Anserini is packaged in a self-contained fatjar, which also provides the simplest way to get started. Assuming you've already got Java installed, fetch the fatjar:

wget https://repo1.maven.org/maven2/io/anserini/anserini/0.38.0/anserini-0.38.0-fatjar.jar

The following command will generate a SPLADE++ ED run with the dev queries (encoded using ONNX) on the MS MARCO passage corpus:

java -cp anserini-0.38.0-fatjar.jar io.anserini.search.SearchCollection \
  -index msmarco-v1-passage.splade-pp-ed \
  -topics msmarco-v1-passage.dev \
  -encoder SpladePlusPlusEnsembleDistil \
  -output run.msmarco-v1-passage-dev.splade-pp-ed-onnx.txt \
  -impact -pretokenized

To evaluate:

java -cp anserini-0.38.0-fatjar.jar trec_eval -c -M 10 -m recip_rank msmarco-passage.dev-subset run.msmarco-v1-passage-dev.splade-pp-ed-onnx.txt
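
The fatjar also bundles prebuilt indexes for simpler bag-of-words runs. As a rough sketch (assuming the prebuilt Lucene index name `msmarco-v1-passage` is available in this release), a BM25 baseline over the same dev queries might look like:

```bash
# Sketch of a BM25 bag-of-words baseline run; assumes the prebuilt index
# name `msmarco-v1-passage` is bundled with this fatjar release.
java -cp anserini-0.38.0-fatjar.jar io.anserini.search.SearchCollection \
  -index msmarco-v1-passage \
  -topics msmarco-v1-passage.dev \
  -output run.msmarco-v1-passage-dev.bm25.txt \
  -bm25
```

Evaluation then proceeds exactly as above, pointing trec_eval at the new run file.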

See detailed instructions for the current fatjar release of Anserini (v0.38.0) to reproduce regression experiments on the MS MARCO V2.1 corpora for TREC 2024 RAG, on MS MARCO V1 Passage, and on BEIR, all directly from the fatjar!

Also, Anserini comes with a built-in webapp for interactive querying along with a REST API that can be used by other applications. Check out our documentation here.

<details> <summary>Older instructions</summary> </details>

🎬 Installation

Most Anserini features are exposed in the Pyserini Python interface. If you're more comfortable with Python, start there. That said, Anserini forms an important building block of Pyserini, so it remains worthwhile to learn about Anserini itself.

You'll need Java 21 and Maven 3.9+ to build Anserini. Clone our repo with the --recurse-submodules option to make sure the eval/ submodule also gets cloned (alternatively, use git submodule update --init). Then, build using Maven:

mvn clean package
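
Putting the steps together, a typical from-scratch build might look like the following sketch (the GitHub URL is the standard castorini/anserini location):

```bash
# Clone with submodules so that tools/ (evaluation scripts) comes along.
git clone --recurse-submodules https://github.com/castorini/anserini.git
cd anserini

# Build Anserini; artifacts (including the fatjar) land under target/.
mvn clean package
```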

The tools/ directory, which contains evaluation tools and other scripts, is actually a separate repository (anserini-tools), integrated as a Git submodule so that it can be shared across related projects. Build it as follows (you might see warnings, which are okay to ignore):

cd tools/eval && tar xvfz trec_eval.9.0.4.tar.gz && cd trec_eval.9.0.4 && make && cd ../../..
cd tools/eval/ndeval && make && cd ../../..

With that, you should be ready to go. The onboarding path for Anserini starts here!

<details> <summary>Windows tips</summary>

If you are using Windows, use WSL2 to build Anserini. Refer to the WSL2 installation documentation if you haven't already set it up.

Note that on Windows without WSL2, tests may fail due to encoding issues; see #1466. A simple workaround is to skip tests by adding -Dmaven.test.skip=true to the mvn command above. See #1121 for additional discussion on debugging Windows build errors.
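
Concretely, the workaround above amounts to:

```bash
mvn clean package -Dmaven.test.skip=true
```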

</details>

⚗️ End-to-End Regression Experiments

Anserini is designed to support end-to-end experiments on various standard IR test collections out of the box. Each of these end-to-end regressions starts from the raw corpus, builds the necessary index, performs retrieval runs, and generates evaluation results. See individual pages for details.
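
Each regression is driven by the run_regression.py script (the same one used in the BEIR instructions further down). As a sketch, substituting whichever regression id you care about for the assumed name `msmarco-v1-passage`:

```bash
# Builds the index from the raw corpus, runs retrieval, and verifies effectiveness.
# `msmarco-v1-passage` is an assumed regression id; substitute the one you want.
python src/main/python/run_regression.py --index --verify --search \
  --regression msmarco-v1-passage
```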

<details> <summary>MS MARCO V1 Passage Regressions</summary>

MS MARCO V1 Passage Regressions

|   | dev | DL19 | DL20 |
|---|---|---|---|
| **Unsupervised Sparse** | | | |
| Lucene BoW baselines | 🔑 | 🔑 | 🔑 |
| Quantized BM25 | 🔑 | 🔑 | 🔑 |
| WordPiece baselines (pre-tokenized) | 🔑 | 🔑 | 🔑 |
| WordPiece baselines (Huggingface) | 🔑 | 🔑 | 🔑 |
| WordPiece + Lucene BoW baselines | 🔑 | 🔑 | 🔑 |
| doc2query | 🔑 | | |
| doc2query-T5 | 🔑 | 🔑 | 🔑 |
| **Learned Sparse (uniCOIL family)** | | | |
| uniCOIL noexp | 🫙 | 🫙 | 🫙 |
| uniCOIL with doc2query-T5 | 🫙 | 🫙 | 🫙 |
| uniCOIL with TILDE | 🫙 | | |
| **Learned Sparse (other)** | | | |
| DeepImpact | 🫙 | | |
| SPLADEv2 | 🫙 | | |
| SPLADE++ CoCondenser-EnsembleDistil | 🫙🅾️ | 🫙🅾️ | 🫙🅾️ |
| SPLADE++ CoCondenser-SelfDistil | 🫙🅾️ | 🫙🅾️ | 🫙🅾️ |
| **Learned Dense (HNSW indexes)** | | | |
| cosDPR-distil | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| BGE-base-en-v1.5 | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| OpenAI Ada2 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 |
| Cohere English v3.0 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 |
| **Learned Dense (flat indexes)** | | | |
| cosDPR-distil | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| BGE-base-en-v1.5 | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| OpenAI Ada2 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 |
| Cohere English v3.0 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 |
| **Learned Dense (inverted; experimental)** | | | |
| cosDPR-distil w/ "fake words" | 🫙 | 🫙 | 🫙 |
| cosDPR-distil w/ "LexLSH" | 🫙 | 🫙 | 🫙 |

<details> <summary>Deprecated instructions for learned dense models using corpora in jsonl format</summary>

|   | dev | DL19 | DL20 |
|---|---|---|---|
| **Learned Dense (HNSW indexes)** | | | |
| cosDPR-distil | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| BGE-base-en-v1.5 | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| OpenAI Ada2 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 |
| Cohere English v3.0 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 |
| **Learned Dense (flat indexes)** | | | |
| cosDPR-distil | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| BGE-base-en-v1.5 | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| OpenAI Ada2 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 |
| Cohere English v3.0 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 | full: 🫙 int8: 🫙 |
</details>

Key:

Available Corpora for Download

| Corpora | Size | Checksum |
|---|---|---|
| Quantized BM25 | 1.2 GB | 0a623e2c97ac6b7e814bf1323a97b435 |
| uniCOIL (noexp) | 2.7 GB | f17ddd8c7c00ff121c3c3b147d2e17d8 |
| uniCOIL (d2q-T5) | 3.4 GB | 78eef752c78c8691f7d61600ceed306f |
| uniCOIL (TILDE) | 3.9 GB | 12a9c289d94e32fd63a7d39c9677d75c |
| DeepImpact | 3.6 GB | 73843885b503af3c8b3ee62e5f5a9900 |
| SPLADEv2 | 9.9 GB | b5d126f5d9a8e1b3ef3f5cb0ba651725 |
| SPLADE++ CoCondenser-EnsembleDistil | 4.2 GB | e489133bdc54ee1e7c62a32aa582bc77 |
| SPLADE++ CoCondenser-SelfDistil | 4.8 GB | cb7e264222f2bf2221dd2c9d28190be1 |
| cosDPR-distil (parquet) | 38 GB | c8a204fbc3ccda581aa375936af43a97 |
| BGE-base-en-v1.5 (parquet) | 39 GB | b235e19ec492c18a18057b30b8b23fd4 |
| OpenAI-ada2 (parquet) | 75 GB | fa3637e9c4150b157270e19ef3a4f779 |
| Cohere embed-english-v3.0 (parquet) | 16 GB | 40c5caf33476746e93ceeb75174b8d64 |

<details> <summary>Deprecated corpora for learned dense models using corpora in jsonl format</summary>

| Corpora | Size | Checksum |
|---|---|---|
| cosDPR-distil (jsonl, deprecated) | 57 GB | e20ffbc8b5e7f760af31298aefeaebbd |
| BGE-base-en-v1.5 (jsonl, deprecated) | 59 GB | 353d2c9e72e858897ad479cca4ea0db1 |
| OpenAI-ada2 (jsonl, deprecated) | 109 GB | a4d843d522ff3a3af7edbee789a63402 |
| Cohere embed-english-v3.0 (jsonl, deprecated) | 38 GB | 06a6e38a0522850c6aa504db7b2617f5 |

</details> <hr/> </details> <details> <summary>MS MARCO V1 Document Regressions</summary>

MS MARCO V1 Document Regressions

|   | dev | DL19 | DL20 |
|---|---|---|---|
| **Unsupervised Lexical, Complete Doc*** | | | |
| Lucene BoW baselines | + | + | + |
| WordPiece baselines (pre-tokenized) | + | + | + |
| WordPiece baselines (Huggingface tokenizer) | + | + | + |
| WordPiece + Lucene BoW baselines | + | + | + |
| doc2query-T5 | + | + | + |
| **Unsupervised Lexical, Segmented Doc*** | | | |
| Lucene BoW baselines | + | + | + |
| WordPiece baselines (pre-tokenized) | + | + | + |
| WordPiece + Lucene BoW baselines | + | + | + |
| doc2query-T5 | + | + | + |
| **Learned Sparse Lexical** | | | |
| uniCOIL noexp | ✓ | ✓ | ✓ |
| uniCOIL with doc2query-T5 | ✓ | ✓ | ✓ |

Available Corpora for Download

| Corpora | Size | Checksum |
|---|---|---|
| MS MARCO V1 doc: uniCOIL (noexp) | 11 GB | 11b226e1cacd9c8ae0a660fd14cdd710 |
| MS MARCO V1 doc: uniCOIL (d2q-T5) | 19 GB | 6a00e2c0c375cb1e52c83ae5ac377ebb |

<hr/> </details> <details> <summary>MS MARCO V2 Passage Regressions</summary>

MS MARCO V2 Passage Regressions

|   | dev | DL21 | DL22 | DL23 |
|---|---|---|---|---|
| **Unsupervised Lexical, Original Corpus** | | | | |
| baselines | + | + | + | + |
| doc2query-T5 | + | + | + | + |
| **Unsupervised Lexical, Augmented Corpus** | | | | |
| baselines | + | + | + | + |
| doc2query-T5 | + | + | + | + |
| **Learned Sparse Lexical** | | | | |
| uniCOIL noexp zero-shot | ✓ | ✓ | ✓ | ✓ |
| uniCOIL with doc2query-T5 zero-shot | ✓ | ✓ | ✓ | ✓ |
| SPLADE++ CoCondenser-EnsembleDistil (cached queries) | ✓ | ✓ | ✓ | ✓ |
| SPLADE++ CoCondenser-EnsembleDistil (ONNX) | ✓ | ✓ | ✓ | ✓ |
| SPLADE++ CoCondenser-SelfDistil (cached queries) | ✓ | ✓ | ✓ | ✓ |
| SPLADE++ CoCondenser-SelfDistil (ONNX) | ✓ | ✓ | ✓ | ✓ |

Available Corpora for Download

| Corpora | Size | Checksum |
|---|---|---|
| uniCOIL (noexp) | 24 GB | d9cc1ed3049746e68a2c91bf90e5212d |
| uniCOIL (d2q-T5) | 41 GB | 1949a00bfd5e1f1a230a04bbc1f01539 |
| SPLADE++ CoCondenser-EnsembleDistil | 66 GB | 2cdb2adc259b8fa6caf666b20ebdc0e8 |
| SPLADE++ CoCondenser-SelfDistil | 76 GB | 061930dd615c7c807323ea7fc7957877 |

<hr/> </details> <details> <summary>MS MARCO V2 Document Regressions</summary>

MS MARCO V2 Document Regressions

|   | dev | DL21 | DL22 | DL23 |
|---|---|---|---|---|
| **Unsupervised Lexical, Complete Doc** | | | | |
| baselines | + | + | + | + |
| doc2query-T5 | + | + | + | + |
| **Unsupervised Lexical, Segmented Doc** | | | | |
| baselines | + | + | + | + |
| doc2query-T5 | + | + | + | + |
| **Learned Sparse Lexical** | | | | |
| uniCOIL noexp zero-shot | ✓ | ✓ | ✓ | ✓ |
| uniCOIL with doc2query-T5 zero-shot | ✓ | ✓ | ✓ | ✓ |

Available Corpora for Download

| Corpora | Size | Checksum |
|---|---|---|
| MS MARCO V2 doc: uniCOIL (noexp) | 55 GB | 97ba262c497164de1054f357caea0c63 |
| MS MARCO V2 doc: uniCOIL (d2q-T5) | 72 GB | c5639748c2cbad0152e10b0ebde3b804 |

<hr/> </details> <details> <summary>MS MARCO V2.1 Segmented Document Regressions</summary>

MS MARCO V2.1 Segmented Document Regressions

The MS MARCO V2.1 corpora were derived from the V2 corpora for the TREC 2024 RAG Track. Instructions for downloading the corpus can be found here. The experiments below use passage-level qrels.

|   | RAG 24 |
|---|---|
| baselines | + |

<hr/> </details> <details> <summary>MS MARCO V2.1 Document Regressions</summary>

MS MARCO V2.1 Document Regressions

The MS MARCO V2.1 corpora were derived from the V2 corpora for the TREC 2024 RAG Track. Instructions for downloading the corpus can be found here. The experiments below use topics and document-level qrels originally targeted at the V2 corpora that have been "projected" over to the V2.1 corpora. These should be treated like dev topics for the TREC 2024 RAG Track; actual qrels for that track were generated at the passage level.

|   | dev | DL21 | DL22 | DL23 | RAGgy dev |
|---|---|---|---|---|---|
| **Unsupervised Lexical, Complete Doc** | | | | | |
| baselines | + | + | + | + | + |
| **Unsupervised Lexical, Segmented Doc** | | | | | |
| baselines | + | + | + | + | + |

<hr/> </details> <details> <summary>BEIR (v1.0.0) Regressions</summary>

BEIR (v1.0.0) Regressions

Key:

See instructions below the table for how to reproduce results programmatically.

| Corpus | F1 | F2 | MF | U1 | S1 | BGE (flat) | BGE (HNSW) |
|---|---|---|---|---|---|---|---|
| TREC-COVID | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| BioASQ | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| NFCorpus | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| NQ | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| HotpotQA | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| FiQA-2018 | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Signal-1M(RT) | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| TREC-NEWS | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Robust04 | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| ArguAna | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Touche2020 | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Android | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-English | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Gaming | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Gis | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Mathematica | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Physics | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Programmers | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Stats | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Tex | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Unix | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Webmasters | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Wordpress | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Quora | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| DBPedia | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| SCIDOCS | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| FEVER | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Climate-FEVER | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| SciFact | 🔑 | 🔑 | 🔑 | 🫙 | 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |

<details> <summary>Deprecated BGE instructions using corpora in jsonl format</summary>

| Corpus | BGE (flat) | BGE (HNSW) |
|---|---|---|
| TREC-COVID | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| BioASQ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| NFCorpus | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| NQ | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| HotpotQA | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| FiQA-2018 | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Signal-1M(RT) | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| TREC-NEWS | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Robust04 | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| ArguAna | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Touche2020 | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Android | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-English | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Gaming | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Gis | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Mathematica | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Physics | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Programmers | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Stats | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Tex | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Unix | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Webmasters | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| CQADupStack-Wordpress | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Quora | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| DBPedia | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| SCIDOCS | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| FEVER | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| Climate-FEVER | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |
| SciFact | full: 🫙🅾️ int8: 🫙🅾️ | full: 🫙🅾️ int8: 🫙🅾️ |

</details>

To reproduce the above results programmatically, use the following commands to download and unpack the data:

wget https://rgw.cs.uwaterloo.ca/pyserini/data/$COLLECTION -P collections/
tar xvf collections/$COLLECTION -C collections/

Substitute the appropriate $COLLECTION from the table below.

| $COLLECTION | Size | Checksum |
|---|---|---|
| beir-v1.0.0-corpus.tar | 14 GB | faefd5281b662c72ce03d22021e4ff6b |
| beir-v1.0.0-corpus-wp.tar | 13 GB | 3cf8f3dcdcadd49362965dd4466e6ff2 |
| beir-v1.0.0-unicoil-noexp.tar | 30 GB | 4fd04d2af816a6637fc12922cccc8a83 |
| beir-v1.0.0-splade-pp-ed.tar | 43 GB | 9c7de5b444a788c9e74c340bf833173b |
| beir-v1.0.0-bge-base-en-v1.5.parquet.tar | 194 GB | c279f9fc2464574b482ec53efcc1c487 |
| beir-v1.0.0-bge-base-en-v1.5.tar (jsonl, deprecated) | 294 GB | e4e8324ba3da3b46e715297407a24f00 |
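
For example, fetching and unpacking the SPLADE++ ED corpus from the table above expands to:

```bash
# $COLLECTION substituted with beir-v1.0.0-splade-pp-ed.tar
wget https://rgw.cs.uwaterloo.ca/pyserini/data/beir-v1.0.0-splade-pp-ed.tar -P collections/
tar xvf collections/beir-v1.0.0-splade-pp-ed.tar -C collections/
```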

Once you've unpacked the data, the following commands will loop over all BEIR corpora and run the regressions:

MODEL="$MODEL"
CORPORA=(trec-covid bioasq nfcorpus nq hotpotqa fiqa signal1m trec-news robust04 arguana webis-touche2020 cqadupstack-android cqadupstack-english cqadupstack-gaming cqadupstack-gis cqadupstack-mathematica cqadupstack-physics cqadupstack-programmers cqadupstack-stats cqadupstack-tex cqadupstack-unix cqadupstack-webmasters cqadupstack-wordpress quora dbpedia-entity scidocs fever climate-fever scifact)

for c in "${CORPORA[@]}"
do
    echo "Running $c..."
    python src/main/python/run_regression.py --index --verify --search --regression beir-v1.0.0-${c}.${MODEL} > logs/log.beir-v1.0.0-${c}-${MODEL} 2>&1
done

Substitute the appropriate $MODEL from the table below.

| Key | $MODEL |
|---|---|
| F1 | flat |
| F2 | flat-wp |
| MF | multifield |
| U1 (cached) | unicoil-noexp.cached |
| S1 (cached) | splade-pp-ed.cached |
| S1 (ONNX) | splade-pp-ed.onnx |
| BGE (flat, full; cached) | bge-base-en-v1.5.parquet.flat.cached |
| BGE (flat, int8; cached) | bge-base-en-v1.5.parquet.flat-int8.cached |
| BGE (HNSW, full; cached) | bge-base-en-v1.5.parquet.hnsw.cached |
| BGE (HNSW, int8; cached) | bge-base-en-v1.5.parquet.hnsw-int8.cached |
| BGE (flat, full; ONNX) | bge-base-en-v1.5.parquet.flat.onnx |
| BGE (flat, int8; ONNX) | bge-base-en-v1.5.parquet.flat-int8.onnx |
| BGE (HNSW, full; ONNX) | bge-base-en-v1.5.parquet.hnsw.onnx |
| BGE (HNSW, int8; ONNX) | bge-base-en-v1.5.parquet.hnsw-int8.onnx |
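
For example, the F1 condition on a single corpus (rather than the full loop) corresponds to:

```bash
# One corpus, one condition: the F1 (flat) regression on TREC-COVID.
MODEL="flat"
python src/main/python/run_regression.py --index --verify --search \
  --regression beir-v1.0.0-trec-covid.${MODEL} > logs/log.beir-v1.0.0-trec-covid-${MODEL} 2>&1
```
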
<hr/> </details> <details> <summary>Cross-lingual and Multi-lingual Regressions</summary>

Cross-lingual and Multi-lingual Regressions

<hr/> </details> <details> <summary>Other Regressions</summary>

Other Regressions

<hr/> </details>

📃 Additional Documentation

The experiments described below are not associated with rigorous end-to-end regression testing and thus provide a lower standard of reproducibility. For the most part, manual copying and pasting of commands into a shell is required to reproduce our results.

<details> <summary>MS MARCO V1</summary>

MS MARCO V1

</details> <details> <summary>MS MARCO V2</summary>

MS MARCO V2

</details> <details> <summary>TREC-COVID and CORD-19</summary>

TREC-COVID and CORD-19

</details> <details> <summary>Other Experiments and Features</summary>

Other Experiments and Features

</details>

🙋 How Can I Contribute?

If you've found Anserini to be helpful, we have a simple request for you to contribute back. In the course of reproducing baseline results on standard test collections, please let us know if you're successful by sending us a pull request with a simple note, like what appears at the bottom of the page for Disks 4 & 5. Reproducibility is important to us, and we'd like to know about successes as well as failures. Since the regression documentation is auto-generated, pull requests should be sent against the raw templates. Then the regression documentation can be generated using the bin/build.sh script. In turn, you'll be recognized as a contributor.

Beyond that, there are always open issues we would appreciate help on!

📜 Release History

<details> <summary>older... (and historic notes)</summary>

📜 Historical Notes

</details>

✨ References

🙏 Acknowledgments

This research is supported in part by the Natural Sciences and Engineering Research Council (NSERC) of Canada. Previous support came from the U.S. National Science Foundation under IIS-1423002 and CNS-1405688. Any opinions, findings, and conclusions or recommendations expressed do not necessarily reflect the views of the sponsors.