Self-Attention-Based Message-Relevant Response Generation for Neural Conversation Model (1805.08983v1)
Abstract: Using a sequence-to-sequence framework, many neural conversation models for chit-chat succeed in generating natural responses. Nevertheless, these models tend to give generic responses that are not specific to the given message, and this remains a challenge. To alleviate this tendency, we propose a method that promotes message-relevant and diverse responses in neural conversation models by using self-attention, which is both effective and time-efficient. Furthermore, we investigate why and how self-attention is effective through an in-depth comparison with standard dialogue generation. Experimental results show that the proposed method improves on standard dialogue generation across various evaluation metrics.
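The abstract does not spell out the exact attention formulation, so the following is only a minimal sketch of one common variant, additive self-attention pooling over encoder hidden states, whose summary vector could condition a seq2seq decoder toward message-relevant responses. The class name `SelfAttentionPooling` and the dimension `attn_dim` are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttentionPooling(nn.Module):
    """Additive self-attention over encoder states (hypothetical sketch):
    scores each time step, softmax-normalizes the scores, and returns a
    weighted sum as a fixed-size summary of the input message."""

    def __init__(self, hidden_dim: int, attn_dim: int = 128):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, attn_dim)
        self.score = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, enc_states: torch.Tensor, mask: torch.Tensor = None):
        # enc_states: (batch, seq_len, hidden_dim)
        energy = self.score(torch.tanh(self.proj(enc_states))).squeeze(-1)  # (batch, seq_len)
        if mask is not None:
            # Exclude padded positions from the softmax.
            energy = energy.masked_fill(~mask, float("-inf"))
        weights = F.softmax(energy, dim=-1)                    # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), enc_states)  # (batch, 1, hidden_dim)
        return context.squeeze(1), weights


if __name__ == "__main__":
    # Toy usage: summarize random encoder states for a batch of two messages.
    batch, seq_len, hidden = 2, 7, 32
    states = torch.randn(batch, seq_len, hidden)
    attn = SelfAttentionPooling(hidden)
    summary, weights = attn(states)
    print(summary.shape, weights.shape)  # torch.Size([2, 32]) torch.Size([2, 7])
```

In a seq2seq dialogue model, such a summary vector might be concatenated with the decoder input at each step so that generation stays grounded in the message; whether the paper uses pooling or per-step attention is not determinable from the abstract alone.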
- Jonggu Kim
- Doyeon Kong
- Jong-Hyeok Lee