Learning Conversational Systems that Interleave Task and Non-Task Content

Zhou Yu, Alexander Rudnicky, Alan Black

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 4214-4220. https://doi.org/10.24963/ijcai.2017/589

Task-oriented dialog systems have been applied to a variety of tasks, such as automated personal assistants, customer service providers, and tutors. These systems work well when users have clear and explicit intentions that are well aligned with the systems' capabilities. However, they fail when users' intentions are not explicit. To address this shortcoming, we propose a framework that interleaves non-task content (i.e., everyday social conversation) into task conversations. When the task content fails, the system can still keep the user engaged with the non-task content. We trained a policy using reinforcement learning algorithms to promote coherence and consistency across long conversations, so that the system can transition smoothly between task and non-task content. To test the effectiveness of the proposed framework, we developed a movie promotion dialog system. Experiments with human users indicate that a system that interleaves social and task content achieves a better task success rate and is also rated as more engaging than a pure task-oriented system.
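The core idea in the abstract, a reinforcement-learning policy that decides at each turn whether to reply with task or non-task (social) content, can be illustrated with a minimal tabular Q-learning sketch. Everything below is assumed for illustration only: the (turn, engagement) state, the two-action set, the toy user simulator, and the reward shaping are not the authors' actual policy, features, or reward design.

```python
# Minimal sketch: a Q-learning policy that chooses between task content and
# non-task (social) content. All states, rewards, and the user simulator are
# illustrative assumptions, not the paper's implementation.
import random
from collections import defaultdict

ACTIONS = ["task", "non_task"]  # assumed two-way action set


def simulate_user(state, action):
    """Toy user simulator: engaged users accept task content more often.

    Returns (next_state, reward, done). Purely illustrative.
    """
    turn, engagement = state
    if action == "task":
        success = random.random() < 0.3 + 0.5 * engagement
        reward = 2.0 if success else -1.0
        if not success:
            engagement = max(0.0, engagement - 0.1)
    else:  # social content keeps the user engaged at a small cost in progress
        reward = 0.2
        engagement = min(1.0, engagement + 0.2)
    done = turn >= 9 or (action == "task" and reward > 0)
    return (turn + 1, round(engagement, 1)), reward, done


def train(episodes=5000, alpha=0.1, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning over (turn, engagement) states."""
    q = defaultdict(float)
    for _ in range(episodes):
        state, done = (0, 0.5), False
        while not done:
            if random.random() < epsilon:
                action = random.choice(ACTIONS)  # explore
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])  # exploit
            next_state, reward, done = simulate_user(state, action)
            best_next = max(q[(next_state, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = next_state
    return q


if __name__ == "__main__":
    q = train()
    # Inspect the learned preference at an early, moderately engaged turn.
    state = (0, 0.5)
    print({a: round(q[(state, a)], 2) for a in ACTIONS})
```

Under these toy dynamics the learned policy tends to insert social turns when engagement is low and switch to task content once engagement rises, which mirrors, in spirit, the interleaving behavior the framework aims for.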
Keywords:
Natural Language Processing: Dialogue
Natural Language Processing: NLP Applications and Tools