
Hi there 👋 Welcome to my teaching materials!

I'm working on Information Retrieval at the Vienna University of Technology (TU Wien), mainly focusing on the award-winning master-level Advanced Information Retrieval course. I try to create engaging, fun, and informative lectures and exercises – both in-person and online!

Please feel free to open up an issue or a pull request if you want to add something, find a mistake, or think something should be explained better!

Contents


<a id="air-2022"></a>

Advanced Information Retrieval 2021 & 2022

🏆 Won the Best Distance Learning Award 2021 @ TU Wien

Information Retrieval is the science behind search technology. The most visible instances are certainly the large Web search engines, the likes of Google and Bing, but information retrieval appears everywhere we have to deal with unstructured data (e.g. free text).

A paradigm shift. Starting in 2019, the Information Retrieval research field began an enormous paradigm shift towards utilizing BERT-based language models in various forms, achieving huge leaps in result quality with large-scale training data. This course aims to showcase a slice of these advances in state-of-the-art IR research towards the next generation of search engines.


New in 2022: Use GitHub Discussions to ask questions about the lecture!


Syllabus

The AIR syllabus overview

Lectures

In the following we provide links to recordings, slides, and closed captions for our lectures. Here is a complete playlist on YouTube.

| Topic | Description | Recordings | Slides | Text |
| --- | --- | --- | --- | --- |
| 0: Introduction 2022 | Infos on requirements, topics, organization | YouTube | PDF | Transcript |
| 1: Crash Course IR Fundamentals | We explore two fundamental building blocks of IR: indexing and ranked retrieval | YouTube | PDF | Transcript |
| 2: Crash Course IR Evaluation | We explore how we evaluate ranked retrieval results and common IR metrics (MRR, MAP, NDCG) | YouTube | PDF | Transcript |
| 3: Crash Course IR Test Collections | We get to know existing IR test collections, look at how to create your own, and survey potential biases & their effect in the data | YouTube | PDF | Transcript |
| 4: Word Representation Learning | We take a look at word representations and basic word embeddings, including a usage example in Information Retrieval | YouTube | PDF | Transcript |
| 5: Sequence Modelling | We look at CNNs and RNNs for sequence modelling, including the basics of the attention mechanism | YouTube | PDF | Transcript |
| 6: Transformer & BERT | We study the Transformer architecture; pre-training with BERT; the HuggingFace ecosystem, where the community can share models; and an overview of Extractive Question Answering (QA) | YouTube | PDF | Transcript |
| 7: Introduction to Neural Re-Ranking | We look at the workflow (including training and evaluation) of neural re-ranking models and some basic neural re-ranking architectures | YouTube | PDF | Transcript |
| 8: Transformer Contextualized Re-Ranking | We learn how to use Transformers (and the pre-trained BERT model) for neural re-ranking, both for the best possible results and for more efficient approaches that trade off quality for performance | YouTube | PDF | Transcript |
| 9: Domain Specific Applications (guest lecture by @sophiaalthammer) | We learn about different task settings, challenges, and solutions in domains other than web search | YouTube | PDF | Transcript |
| 10: Dense Retrieval ❤ Knowledge Distillation | We learn about the (potential) future of search: dense retrieval. We study the setup, specific models, and how to train DR models. Then we look at how knowledge distillation greatly improves the training of DR models and how topic-aware sampling yields state-of-the-art results | YouTube | PDF | Transcript |
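To give a flavor of the evaluation lecture, here is a minimal sketch of the three mentioned metrics (MRR, MAP, nDCG) in plain Python. The function names and the input format (a list of relevance grades by rank position) are our own illustration, not the course's reference implementation:

```python
import math

def mrr_at(relevances, k=10):
    """Reciprocal rank of the first relevant result in the top-k (0 if none)."""
    for i, rel in enumerate(relevances[:k], start=1):
        if rel > 0:
            return 1.0 / i
    return 0.0

def average_precision(relevances):
    """Mean of the precision values at each relevant position (binary relevance)."""
    hits, precisions = 0, []
    for i, rel in enumerate(relevances, start=1):
        if rel > 0:
            hits += 1
            precisions.append(hits / i)
    return sum(precisions) / max(hits, 1)

def ndcg_at(relevances, k=10):
    """Graded-relevance DCG, normalized by the DCG of the ideal ordering.
    (Uses the linear-gain DCG variant; an exponential-gain variant also exists.)"""
    def dcg(rels):
        return sum(rel / math.log2(i + 1) for i, rel in enumerate(rels, start=1))
    ideal = dcg(sorted(relevances, reverse=True)[:k])
    return dcg(relevances[:k]) / ideal if ideal > 0 else 0.0

ranking = [0, 1, 0, 2]  # relevance grades by rank position
print(round(mrr_at(ranking), 2))  # first relevant result at rank 2 -> 0.5
```

Averaging these per-query values over a whole test collection gives the aggregate scores typically reported in IR papers.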

Neural IR & Extractive QA Exercise

In this exercise your group implements neural network re-ranking models, uses pre-trained extractive QA models, and analyzes their behavior with respect to our FiRA data.

📃 To the 2021 assignment

📃 To the 2022 assignment
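The exercise follows the common two-stage shape of a re-ranking pipeline: a first-stage retriever produces candidate documents, and a trained model re-scores and re-orders them. As a rough sketch of that shape only, here is a toy re-ranker where a bag-of-words cosine similarity stands in for the neural scoring model (the real assignment uses learned neural architectures, not this):

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)  # Counter returns 0 for missing terms
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def rerank(query: str, candidates: list) -> list:
    """Re-score first-stage candidates and return them best-first.
    The scorer here is a stand-in for a trained neural re-ranking model."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(doc.lower().split())), doc) for doc in candidates]
    return [doc for _, doc in sorted(scored, key=lambda x: x[0], reverse=True)]

docs = ["neural ranking models for search",
        "cooking recipes for students",
        "search engines rank documents with neural models"]
print(rerank("neural search models", docs)[0])  # -> "neural ranking models for search"
```

In the actual exercise, the `cosine` stand-in is replaced by a neural model that scores each query-document pair, while the surrounding retrieve-score-sort workflow stays the same.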


<a id="workflow"></a>

Our Time-Optimized Content Creation Workflow for Remote Teaching

Our workflow creates an engaging remote learning experience for a university course, while minimizing the post-production time of the educators. We make use of ubiquitous and commonly free services and platforms, so that our workflow is inclusive for all educators and provides polished experiences for students. Our learning materials provide for each lecture: 1) a recorded video, uploaded on YouTube, with exact slide timestamp indices, which enables an enhanced navigation UI; and 2) a high-quality flow-text automated transcript of the narration with proper punctuation and capitalization, improved with a student participation workflow on GitHub. We automate the transformation and post-production between raw narrated slides and our published materials with custom tools.
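One step of this pipeline, turning per-slide timestamps into the chapter markers YouTube parses from a video description, can be sketched in a few lines. The input format (a list of start-second/title pairs) is a hypothetical illustration, not our actual tooling; see the workflow folder for the real transformation tools:

```python
def to_chapters(slides):
    """Render (start_seconds, title) pairs as 'M:SS Title' lines.
    YouTube builds its chapter navigation from such lines in a video
    description, provided the first chapter starts at 0:00."""
    lines = []
    for start, title in slides:
        minutes, seconds = divmod(int(start), 60)
        lines.append(f"{minutes}:{seconds:02d} {title}")
    return "\n".join(lines)

slides = [(0, "Welcome"), (95, "Indexing"), (640, "Ranked Retrieval")]
print(to_chapters(slides))
# 0:00 Welcome
# 1:35 Indexing
# 10:40 Ranked Retrieval
```

(For videos over an hour, an `H:MM:SS` format would be needed; this sketch keeps to minutes for brevity.)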

Workflow Overview

Head over to our workflow folder for more information and our custom Python-based transformation tools. Or check out our full paper, published at the SIGCSE Technical Symposium 2022, for an in-depth evaluation of our methods:

A Time-Optimized Content Creation Workflow for Remote Teaching
Sebastian Hofstätter, Sophia Althammer, Mete Sertkan and Allan Hanbury
https://arxiv.org/abs/2110.05601