India | Information Technology | Volume 6 Issue 2, February 2017 | Pages: 2219 - 2230
Dynamic Memory Inference Network for Natural Language Inference (2017)
Abstract: Natural Language Inference (NLI) classifies the relationship between two sentences as entailment, contradiction, or neutral, and underpins many NLP applications, including question answering, text summarization, and information retrieval. In this paper, the NLI task is framed as a question-answering problem, motivating the application of Dynamic Memory Networks (DMNs) to improve performance. The paper's main contribution is to demonstrate the effectiveness of DMNs' episodic memory outside their original domain: the model incrementally updates and revisits its memory, yielding a more nuanced view of the interactions between the two sentences and thereby improving the accuracy of its inference. The paper further analyses how the attention mechanisms within the DMN architecture enhance the model's ability to focus on the words and phrases essential to the task. Through extensive experimentation, the study highlights the impact of integrating episodic memory updates with attention mechanisms, demonstrating the resulting gains on NLI and the potential to enhance other NLP tasks.
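The episodic memory mechanism the abstract describes, where attention selects relevant facts and the memory is iteratively updated over multiple passes, can be sketched roughly as follows. This is a minimal, simplified illustration (all function names, the blending update, and the dimensions are assumptions; a real DMN uses learned encoders and a GRU-based memory update rather than the fixed blend shown here):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def episode_update(memory, facts, question):
    # attention scores: similarity of each fact to the current memory and question
    scores = facts @ (memory + question)
    weights = softmax(scores)
    # attended context: attention-weighted sum of the fact vectors
    context = weights @ facts
    # simplified memory update: blend old memory with the new context
    # (a real DMN would pass these through a learned GRU instead)
    return 0.5 * memory + 0.5 * context

rng = np.random.default_rng(0)
d = 8
facts = rng.normal(size=(5, d))   # encoded sentence "facts" (e.g., the premise)
question = rng.normal(size=d)     # encoded query (e.g., the hypothesis)
memory = question.copy()          # episodic memory initialized to the question

for _ in range(3):                # several episodic passes refine the memory
    memory = episode_update(memory, facts, question)
```

Each pass re-attends over the facts conditioned on the updated memory, which is what lets the model incrementally focus on different words and phrases across iterations.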
Keywords: Natural Language Inference, Dynamic Memory Networks, Attention