Commonsense Knowledge Aware Conversation Generation with Graph Attention

Introduction

Commonsense knowledge is vital to many natural language processing tasks. In this paper, we present a novel open-domain conversation generation model to demonstrate how large-scale commonsense knowledge can facilitate language understanding and generation. Given a user post, the model retrieves relevant knowledge graphs from a knowledge base and then encodes the graphs with a static graph attention mechanism, which augments the semantic information of the post and thus supports better understanding of the post. Then, during word generation, the model attentively reads the retrieved knowledge graphs and the knowledge triples within each graph to facilitate better generation through a dynamic graph attention mechanism, as shown in Figure 1.
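To make the attention mechanism concrete, here is a minimal sketch of attending over knowledge-triple embeddings, in the spirit of the static graph attention described above. The function name, the shapes, and the use of a single dot-product scoring step are illustrative assumptions, not the paper's exact formulation (which scores head/tail pairs through learned projections).

```python
import numpy as np

def graph_attention(triples, query):
    """Illustrative sketch: attend over knowledge triples.

    triples: (n, d) array, each row an embedding of one (head, relation, tail) triple.
    query:   (d,) vector representing the post.
    Returns a graph vector: the attention-weighted sum of triple embeddings.
    """
    scores = triples @ query                 # (n,) relevance of each triple to the post
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ triples                 # (d,) aggregated graph representation

rng = np.random.default_rng(0)
triples = rng.normal(size=(5, 8))  # 5 triples, embedding size 8
query = rng.normal(size=8)
g = graph_attention(triples, query)
print(g.shape)  # (8,)
```

In the full model, a vector like `g` augments the post representation on the encoder side, while the decoder re-attends over graphs and triples at every generation step (the dynamic mechanism).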

[Figure 1: Overview of CCM with static and dynamic graph attention.]

This project is a TensorFlow implementation of our work, CCM.

Dependencies

Quick Start

Details

Training

You can change the model parameters using:

--units xxx             number of hidden units
--layers xxx            number of RNN layers
--batch_size xxx        batch size to use during training
--per_checkpoint xxx    steps between saving and evaluating the model
--train_dir xxx         training directory
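As a sketch of how these flags could be parsed, here is a stand-alone `argparse` example. The defaults shown are assumptions for illustration (the repository's actual code and default values may differ, e.g. it may use `tf.app.flags` instead).

```python
import argparse

# Hypothetical flag parsing mirroring the options listed above;
# default values here are illustrative assumptions.
parser = argparse.ArgumentParser(description="CCM training options")
parser.add_argument("--units", type=int, default=512, help="number of hidden units")
parser.add_argument("--layers", type=int, default=2, help="number of RNN layers")
parser.add_argument("--batch_size", type=int, default=128, help="batch size during training")
parser.add_argument("--per_checkpoint", type=int, default=1000,
                    help="steps between saving and evaluating the model")
parser.add_argument("--train_dir", type=str, default="./train", help="training directory")

# Example: override two options on the command line.
args = parser.parse_args(["--units", "256", "--layers", "3"])
print(args.units, args.layers)  # 256 3
```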

Evaluation


Paper

Hao Zhou, Tong Yang, Minlie Huang, Haizhou Zhao, Jingfang Xu, Xiaoyan Zhu.
Commonsense Knowledge Aware Conversation Generation with Graph Attention.
IJCAI-ECAI 2018, Stockholm, Sweden.

Please kindly cite our paper if you find the paper and the code helpful.

Acknowledgements

We thank Prof. Minlie Huang and Prof. Xiaoyan Zhu for their kind help, and our teammates for their support.

License

Apache License 2.0