Search engines use highly complex ranking functions to return high-quality results. Recently, major engines have started using transformer models such as BERT for better ranking, but this can significantly increase computational cost. One approach to improving efficiency is to use neural networks to automatically derive better inverted index structures that can efficiently approximate complex transformer-based ranking systems. The goal of this research project is to improve the state of the art in this area in one or more ways, for example by learning index structures for pairs of terms, by pruning learned indexes, or by finding new query processing methods that can deal with the wacky impact score distributions that occur in these learned structures.
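To make the core idea concrete, here is a minimal sketch of an impact-scored inverted index of the kind produced by learned sparse models (e.g. DeepImpact or SPLADE, both on the reading list below). Each posting stores a precomputed integer impact rather than a raw term frequency, and query processing simply sums impacts; the terms and scores used here are illustrative, not taken from any real model.

```python
from collections import defaultdict

# Toy impact-scored inverted index: term -> [(doc_id, impact), ...].
# In a learned index these impacts come from a neural model, and their
# distribution can look very different from BM25-style scores.
index = {
    "neural":  [(0, 7), (2, 3)],
    "ranking": [(0, 4), (1, 9), (2, 1)],
}

def score_query(index, query_terms):
    """Term-at-a-time evaluation: score each document by summing the
    precomputed impacts of the query terms it contains."""
    scores = defaultdict(int)
    for term in query_terms:
        for doc_id, impact in index.get(term, []):
            scores[doc_id] += impact
    # Rank documents by total impact, highest first.
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(score_query(index, ["neural", "ranking"]))
# doc 0: 7 + 4 = 11, doc 1: 9, doc 2: 3 + 1 = 4
```

Pruning a learned index or changing how these impact distributions are traversed (document-at-a-time vs. score-at-a-time) changes only this evaluation loop, which is what makes the efficiency questions above well contained.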

Week 1
  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (paper)

Document Expansion

  • Document Expansion by Query Prediction (paper)
  • From doc2query to docTTTTTquery (paper)
  • Doc2Query--: When Less is More (paper)

Information Retrieval

  • Passage Re-ranking with BERT (paper)
  • Context-Aware Sentence/Passage Term Importance Estimation For First Stage Retrieval (paper)
  • Context-Aware Document Term Weighting for Ad-Hoc Search (paper)
  • Efficiency Implications of Term Weighting for Passage Retrieval (paper)
  • ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT (paper)
  • SPLADE: Sparse Lexical and Expansion Model for First Stage Ranking (paper)
  • Wacky Weights in Learned Sparse Representations and the Revenge of Score-at-a-Time Query Evaluation (paper)
  • Learning Passage Impacts for Inverted Indexes (paper)
  • Faster Learned Sparse Retrieval with Guided Traversal (paper)
  • A Few Brief Notes on DeepImpact, COIL, and a Conceptual Framework for Information Retrieval Techniques (paper)

Text REtrieval Conference (TREC)

  • Overview of the TREC 2019 Deep Learning Track (paper)
  • Overview of the TREC 2020 Deep Learning Track (paper)
  • Overview of the TREC 2021 Deep Learning Track (paper)