NETE (NEural TEmplate)

Papers

- Lei Li, Yongfeng Zhang, Li Chen. Generate Neural Template Explanations for Recommendation. CIKM'20.
- Lei Li, Li Chen, Yongfeng Zhang. Towards Controllable Explanation Generation for Recommender Systems via Neural Template. WWW'20 Demo.

A small unpretrained Transformer version is available at PETER!

A large pretrained GPT-2 version is available at PEPLER!

A small ecosystem for Recommender Systems-based Natural Language Generation is available at NLG4RS!

Code dependencies

Datasets to download

For those who are interested in how to obtain (feature, opinion, template, sentiment) quadruples, please refer to Sentires-Guide.
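For illustration only, below is a minimal Python sketch of how such (feature, opinion, template, sentiment) quadruples might be represented and loaded. The tab-separated file layout, the file name `quadruples.tsv`, and the field encodings are assumptions for the sake of the example, not the actual output format of Sentires-Guide.

```python
# Illustrative sketch only: the file layout, field names, and sentiment
# encoding below are assumptions, not the actual Sentires-Guide format.
from typing import List, NamedTuple


class Quadruple(NamedTuple):
    feature: str    # item aspect mentioned in a review, e.g. "battery"
    opinion: str    # opinion word attached to the feature, e.g. "long-lasting"
    template: str   # review sentence serving as the explanation template
    sentiment: int  # e.g. +1 for positive, -1 for negative (assumed encoding)


def load_quadruples(path: str) -> List[Quadruple]:
    """Read tab-separated (feature, opinion, template, sentiment) records."""
    records = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            feature, opinion, template, sentiment = line.rstrip("\n").split("\t")
            records.append(Quadruple(feature, opinion, template, int(sentiment)))
    return records


if __name__ == "__main__":
    # Hypothetical file name, used only to show the intended call pattern.
    for quad in load_quadruples("quadruples.tsv"):
        print(quad.feature, quad.opinion, quad.sentiment)
```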

Citations

@inproceedings{CIKM20-NETE,
	title={Generate Neural Template Explanations for Recommendation},
	author={Li, Lei and Zhang, Yongfeng and Chen, Li},
	booktitle={CIKM},
	year={2020}
}

@inproceedings{WWW20-NETE,
	title={Towards Controllable Explanation Generation for Recommender Systems via Neural Template},
	author={Li, Lei and Chen, Li and Zhang, Yongfeng},
	booktitle={WWW Demo},
	year={2020}
}