UnKD

This is an implementation of our WSDM 2023 paper "Unbiased Knowledge Distillation for Recommendation", based on PyTorch.

Requirements
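
No exact package versions are pinned here; a minimal environment sketch, assuming only PyTorch (stated above) plus the usual scientific Python stack, could look like:

```bash
# Sketch only: PyTorch is the stated dependency; numpy/scipy/pandas are assumptions,
# and no specific versions are pinned by the authors here.
pip install torch numpy scipy pandas
```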

Datasets

Preprocess the data by running python preprocess_xxx.py.
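
For example (the dataset suffix below is a hypothetical placeholder; use the actual preprocess_xxx.py scripts shipped in the repository):

```bash
# "mydataset" is a hypothetical placeholder for the dataset-specific script suffix;
# run the real preprocess_xxx.py script that matches your dataset.
python preprocess_mydataset.py
```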

Parameters

The key parameters are defined in distill_new_api.py.
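
As an illustration only, a distillation run is typically configured on the command line; the flag names below are assumptions for this sketch, not the confirmed arguments of distill_new_api.py (check its argument parser for the real names):

```bash
# Hypothetical invocation -- every flag name below is an assumption for illustration;
# consult the argument parser in distill_new_api.py for the actual parameter names.
python distill_new_api.py \
    --dataset mydataset \
    --teacher_dim 200 \
    --student_dim 20 \
    --alpha 0.5
```

Typical flags for such a script cover the dataset, the teacher and student embedding sizes, and the weight of the distillation loss; the actual set for this repository may differ.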

Commands

We provide the following commands for our method and the baselines. Two steps are required; an illustrative sketch of both steps is given after this list.

1. Train a teacher model:

2. Reproduce the results:
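
The commands below are only a sketch of these two steps; the teacher-training script name and every flag are assumptions for illustration (distill_new_api.py is the only script actually named above), so substitute the repository's real entry points and arguments:

```bash
# Step 1: train a teacher model.
# "main_teacher.py" and its flags are hypothetical placeholders, not confirmed scripts.
python main_teacher.py --dataset mydataset --dim 200

# Step 2: reproduce the results by distilling a student from the trained teacher
# with distill_new_api.py (flag names are assumptions; see the Parameters section).
python distill_new_api.py --dataset mydataset --teacher_dim 200 --student_dim 20
```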

Citation

If you use our code in your research, please cite our paper.