Infinite Recommendation Networks: A Data-Centric Approach (Distill-CF)

This repository contains the implementation of Distill-CF (using ∞-AE) from the paper "Infinite Recommendation Networks: A Data-Centric Approach" [arXiv], where we leverage the NTK of an infinitely-wide autoencoder to perform data distillation, i.e., instead of training models on large datasets, we train them on terse, high-fidelity, synthetic data summaries generated by Distill-CF.
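
For intuition, here is a heavily simplified, hypothetical sketch of the bilevel optimization behind this idea: a small synthetic user-item matrix is learned such that a recommender fitted in closed form on the synthetic data reconstructs the real interactions well. The sketch substitutes a plain linear kernel for the actual ∞-AE NTK and a continuous sigmoid relaxation for the paper's Gumbel-based sampling, and uses toy random data and the optax optimizer purely for illustration; it shows the structure of the objective, not the exact Distill-CF procedure implemented in distill.py.

```python
import jax
import jax.numpy as jnp
import optax  # illustrative choice of optimizer library (assumption)

def kernel(a, b):
    # Stand-in linear kernel; the paper uses the NTK of an infinitely-wide autoencoder.
    return a @ b.T

def outer_loss(synth_logits, X_real, lam=0.1):
    # Continuous relaxation of the synthetic user-item matrix
    # (the paper instead samples interactions via Gumbel sampling).
    S = jax.nn.sigmoid(synth_logits)
    K_ss = kernel(S, S)
    K_rs = kernel(X_real, S)
    # Closed-form "training" of the recommender on the synthetic data ...
    alpha = jnp.linalg.solve(K_ss + lam * jnp.eye(S.shape[0]), S)
    # ... and evaluation of its reconstructions on the real data.
    preds = K_rs @ alpha
    return jnp.mean((preds - X_real) ** 2)

# Toy "real" data: 256 users x 100 items; synthetic user budget of 32.
key = jax.random.PRNGKey(0)
X_real = (jax.random.uniform(key, (256, 100)) < 0.05).astype(jnp.float32)
synth_logits = jnp.zeros((32, 100))

opt = optax.adam(1e-2)
opt_state = opt.init(synth_logits)
loss_grad = jax.jit(jax.value_and_grad(outer_loss))

for step in range(200):
    loss, grads = loss_grad(synth_logits, X_real)
    updates, opt_state = opt.update(grads, opt_state)
    synth_logits = optax.apply_updates(synth_logits, updates)
```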

The paper also proposes ∞-AE: a SoTA implicit-feedback recommendation model that has a closed-form solution and only a single hyper-parameter. We provide ∞-AE's code in a separate GitHub repository.
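
As a rough illustration (not the official ∞-AE code), the closed-form solution amounts to kernelized ridge regression with the autoencoder's NTK over user interaction rows. The sketch below uses the neural-tangents library with a placeholder architecture, toy data, and an arbitrary regularization value; all of these are assumptions for illustration only.

```python
import jax.numpy as jnp
from jax import random
from neural_tangents import stax

# Toy binary user-item interaction matrix: rows = users, columns = items.
# In practice this would be the (pre-processed) training matrix.
key = random.PRNGKey(0)
X = (random.uniform(key, (64, 100)) < 0.05).astype(jnp.float32)  # 64 users, 100 items

# An autoencoder-style network; in the infinite-width limit, `kernel_fn`
# gives its NTK in closed form. Widths/depth here are placeholders.
_, _, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(), stax.Dense(100)
)

lam = 0.1                                   # regularization: the single hyper-parameter
K = kernel_fn(X, X, 'ntk')                  # user-user NTK Gram matrix
alpha = jnp.linalg.solve(K + lam * jnp.eye(K.shape[0]), X)

# Recommendation scores for the training users themselves; for unseen users,
# replace the first argument of kernel_fn with their interaction rows.
scores = K @ alpha
print(scores.shape)  # (64, 100)
```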

If you find any module of this repository helpful for your own research, please consider citing the below paper. Thanks!

@inproceedings{inf_ae_distill_cf,
  title={Infinite Recommendation Networks: A Data-Centric Approach},
  author={Sachdeva, Noveen and Dhaliwal, Mehak Preet and Wu, Carole-Jean and McAuley, Julian},
  booktitle={Advances in Neural Information Processing Systems},
  series={NeurIPS '22},
  year={2022}
}

Code Author: Noveen Sachdeva (nosachde@ucsd.edu)


Setup

Environment Setup

pip install -r requirements.txt

Data Setup

This repository already includes the pre-processed data for the ML-1M, Amazon Magazine, and Douban datasets, as described in the paper. The pre-processing code is in preprocess.py.


How to distill data?

To run data distillation on a single GPU:

CUDA_VISIBLE_DEVICES=0 python distill.py

To run a grid search over the distillation hyper-parameters:

python grid_search_distill.py

Results sneak-peek

Figure: Performance of ∞-AE when trained on data synthesized by Distill-CF.

Figure: Performance of ∞-AE versus the number of users (log-scale) sampled according to different sampling strategies, on the HR@10 and PSP@10 metrics. Results for the Netflix dataset have been clipped due to memory constraints; other results can be found in the [paper](https://arxiv.org/abs/2206.02626).

Below are the nDCG@10 results for the datasets used in the paper:

| Dataset | PopRec | MF | NeuMF | MVAE | LightGCN | EASE | ∞-AE (Full) | ∞-AE (Distill-CF) |
|---|---|---|---|---|---|---|---|---|
| Amazon Magazine | 8.42 | 13.1 | 13.6 | 12.18 | 22.57 | 22.84 | 23.06 | 23.81 |
| MovieLens-1M | 13.84 | 25.65 | 24.44 | 22.14 | 28.85 | 29.88 | 32.82 | 32.52 |
| Douban | 11.63 | 13.21 | 13.33 | 16.17 | 16.68 | 19.48 | 24.94 | 24.20 |
| Netflix | 12.34 | 12.04 | 11.48 | 20.85 | Timed out | 26.83 | 30.59* | 30.54 |

Note: The user synthesis budget for Distill-CF is only 500 users for this table. ∞-AE's results on the Netflix dataset (marked with a *) are obtained by training on only 5% of the total users; all other methods, however, are trained on the full dataset.


MIT License