Infinite Recommendation Networks (∞-AE)

This repository contains the implementation of ∞-AE from the paper "Infinite Recommendation Networks: A Data-Centric Approach" [arXiv], where we leverage the NTK of an infinitely-wide autoencoder for implicit-feedback recommendation. Notably, ∞-AE admits a closed-form solution and has only a single hyperparameter, λ, yet outperforms significantly more complicated SoTA models.

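For a rough picture of what the closed-form solve looks like, below is a minimal, hypothetical sketch using the neural-tangents library; the architecture, width, and default λ are placeholder assumptions for exposition, and main.py remains the authoritative implementation.

import jax.numpy as jnp
from neural_tangents import stax

# Hypothetical sketch: the depth and width below are placeholder assumptions,
# not the exact architecture used in main.py.
_, _, kernel_fn = stax.serial(stax.Dense(512), stax.Relu(), stax.Dense(512))

def infinite_ae_scores(X, lamda=1.0):
    # X: (num_users, num_items) binary implicit-feedback matrix.
    K = kernel_fn(X, X, 'ntk')                                  # user-user NTK
    alpha = jnp.linalg.solve(K + lamda * jnp.eye(K.shape[0]), X)
    return K @ alpha                                            # reconstructed item scores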
The paper also proposes Distill-CF, which uses ∞-AE for data distillation to create terse, high-fidelity, synthetic data summaries for model training. We provide Distill-CF's code in a separate GitHub repository.

If you find any module of this repository helpful for your own research, please consider citing the paper below. Thanks!

@inproceedings{inf_ae_distill_cf,
  title={Infinite Recommendation Networks: A Data-Centric Approach},
  author={Sachdeva, Noveen and Dhaliwal, Mehak Preet and Wu, Carole-Jean and McAuley, Julian},
  booktitle={Advances in Neural Information Processing Systems},
  series={NeurIPS '22},
  year={2022}
}

Code Author: Noveen Sachdeva (nosachde@ucsd.edu)


Setup

Environment Setup

pip install -r requirements.txt
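To quickly confirm that the installed environment can see your GPU (assuming JAX is among the pinned requirements), an optional sanity check is:

import jax
print(jax.devices())   # should list a GPU device if CUDA is configured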

Data Setup

This repository already includes the pre-processed data for the ML-1M, Amazon Magazine, and Douban datasets, as described in the paper. The pre-processing code is in preprocess.py.
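To prepare a new dataset in the same implicit-feedback format, a rough, hypothetical sketch of the usual conversion follows; the file name, column names, and rating threshold here are assumptions, and preprocess.py is the authoritative reference.

import numpy as np
import pandas as pd
from scipy.sparse import csr_matrix

# Hypothetical conversion from explicit ratings to implicit feedback;
# 'ratings.csv' and its columns (user, item, rating) are assumptions.
df = pd.read_csv('ratings.csv')
df = df[df.rating >= 4]                       # binarize: keep positive interactions
users = df.user.astype('category').cat.codes.to_numpy()
items = df.item.astype('category').cat.codes.to_numpy()
X = csr_matrix((np.ones(len(df)), (users, items)))   # sparse user-item matrix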


How to train ∞-AE?

CUDA_VISIBLE_DEVICES=0 python main.py
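Since λ is ∞-AE's only hyperparameter, tuning reduces to repeating the closed-form solve. A hypothetical sweep, reusing infinite_ae_scores from the sketch above (X_train, X_val, and ndcg_at_10 are assumed stand-ins for your data split and evaluation helper, not part of this repository):

for lamda in (0.01, 0.1, 1.0, 10.0):
    scores = infinite_ae_scores(X_train, lamda)   # X_train: held-in interactions
    print(lamda, ndcg_at_10(scores, X_val))       # ndcg_at_10 is hypothetical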

Results sneak-peek

Below are the nDCG@10 results (in %) for the datasets used in the paper:

Dataset            PopRec   MF      NeuMF   MVAE    LightGCN    EASE    ∞-AE
Amazon Magazine    8.42     13.1    13.6    12.18   22.57       22.84   23.06
MovieLens-1M       13.84    25.65   24.44   22.14   28.85       29.88   32.82
Douban             11.63    13.21   13.33   16.17   16.68       19.48   24.94
Netflix            12.34    12.04   11.48   20.85   Timed out   26.83   30.59*

Note: ∞-AE's results on the Netflix dataset (marked with a *) are obtained by training on only 5% of the total users; all other methods, however, are trained on the full dataset.
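As a hypothetical illustration of that user subsampling (the exact sampling protocol is described in the paper; the uniform scheme and seed here are assumptions):

import numpy as np

# Hypothetical 5% uniform user subsample; X is a (num_users, num_items)
# interaction matrix as in the sketches above.
rng = np.random.default_rng(0)
keep = rng.choice(X.shape[0], size=int(0.05 * X.shape[0]), replace=False)
X_small = X[keep]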


MIT License