Awesome
New record achieved by ERNIE_English (2019/06/13)
We achieved a new best R_10@1 score of 85.67% on the Ubuntu Corpus by incorporating ERNIE_English, an English pre-trained model from Baidu. Please refer to DMTK (the Dialogue Modeling ToolKit) for more details: https://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/PaddleDialogue
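For readers unfamiliar with the metric, R_10@1 is the fraction of test cases in which the true response is ranked first among 10 candidate responses. The sketch below illustrates how such a score can be computed from model scores; the function name and the convention that the ground-truth response sits at index 0 are illustrative assumptions, not the DMTK evaluation code.

```python
# Minimal sketch of R_10@1 (recall at 1 out of 10 candidates).
# Assumes each test example has 10 candidate scores, with the
# ground-truth response at index 0 (a common Ubuntu Corpus convention);
# this is an illustrative helper, not the DMTK implementation.

def recall_10_at_1(batched_scores):
    """batched_scores: list of 10-element lists of model scores,
    where index 0 holds the score of the true response."""
    hits = 0
    for scores in batched_scores:
        # A hit means the true response received the highest score
        # among the 10 candidates.
        if scores[0] == max(scores):
            hits += 1
    return hits / len(batched_scores)

# Example: two test cases, the first ranked correctly, the second not -> 0.5
print(recall_10_at_1([
    [0.9, 0.1, 0.2, 0.0, 0.3, 0.1, 0.2, 0.4, 0.1, 0.0],
    [0.2, 0.8, 0.1, 0.0, 0.3, 0.1, 0.2, 0.4, 0.1, 0.0],
]))
```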
Baidu NLP Dialogue team
The dialogue team at Baidu NLP is a group of engineers and researchers who truly believe in technology and work together to accelerate the development of open-domain dialogue systems.
Our work includes, but is not limited to, the fundamental technology of neural dialogue systems (seq2seq generation and context-response matching), knowledge-driven dialogue, and life-long learning dialogue systems with reinforcement learning; we also provide system-level solutions for open-domain chatbots.
Together we have built the largest Chinese human-computer conversation system and support many businesses such as DuerOS, the largest chatbot in China. Our life-long learning system interacts with hundreds of millions of Chinese users every day, learns through imitation and user feedback, and distills knowledge from conversations to become smarter.
We will release source code from our previous work in the future, as a small contribution to the human-computer conversation community.
Publications
- Proactive Human-Machine Conversation with Explicit Conversation Goals. ACL 2019, Full Paper, poster
- Multi-Turn Response Selection for Chatbots with Deep Attention Matching Network. ACL 2018, Full Paper, oral
- Multi-View Response Selection for Human-Computer Conversation. EMNLP 2016, Full Paper, poster
- Shall I Be Your Chat Companion? Towards an Online Human-Computer Conversation System. CIKM 2016, Full Paper, oral
Connect to our Chatbot Service
Chinese developers can enable their own smart devices to chat with users on open-domain topics by using our open chatbot service. The usage manual (in Chinese) is available at http://ai.baidu.com/forum/topic/show/497679.