Long short-term memory over recursive structures

Proceedings title: 32nd International Conference on Machine Learning (ICML 2015)
Conference: 32nd International Conference on Machine Learning, July 6-11, 2015, Lille, France
Abstract: The chain-structured long short-term memory (LSTM) has been shown to be effective in a wide range of problems, such as speech recognition and machine translation. In this paper, we propose to extend it to tree structures, in which a memory cell can reflect the histories of multiple child cells or multiple descendant cells in a recursive process. We call the model S-LSTM; it provides a principled way of considering long-distance interaction over hierarchies, e.g., language or image parse structures. We leverage the model for semantic composition to understand the meaning of text, a fundamental problem in natural language understanding, and show that it outperforms a state-of-the-art recursive model when its composition layers are replaced with S-LSTM memory blocks. We also show that utilizing the given structures yields better performance than ignoring them. Copyright © 2015 by the author(s).
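The abstract describes a memory block whose cell combines the histories of multiple child cells. As a rough illustration, below is a minimal sketch of one such composition step for a binary tree node, assuming the common tree-LSTM gating pattern (an input gate, one forget gate per child, and an output gate); the class, weight names, and initialization are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SLSTMNode:
    """Sketch of an S-LSTM-style memory block composing two children.

    Illustrative only: gates follow the generic binary tree-LSTM
    pattern; names and initialization are assumptions, not the
    paper's notation.
    """

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        shape = (dim, 2 * dim)  # each gate reads [h_left; h_right]
        self.W_i  = rng.standard_normal(shape) * 0.1  # input gate
        self.W_fl = rng.standard_normal(shape) * 0.1  # forget gate, left child
        self.W_fr = rng.standard_normal(shape) * 0.1  # forget gate, right child
        self.W_o  = rng.standard_normal(shape) * 0.1  # output gate
        self.W_u  = rng.standard_normal(shape) * 0.1  # candidate cell content
        self.b = np.zeros(dim)

    def compose(self, h_l, c_l, h_r, c_r):
        """Combine two child (hidden, cell) pairs into the parent's."""
        h = np.concatenate([h_l, h_r])
        i   = sigmoid(self.W_i  @ h + self.b)  # how much new content to write
        f_l = sigmoid(self.W_fl @ h + self.b)  # how much left-child memory to keep
        f_r = sigmoid(self.W_fr @ h + self.b)  # how much right-child memory to keep
        o   = sigmoid(self.W_o  @ h + self.b)  # how much of the cell to expose
        u   = np.tanh(self.W_u  @ h + self.b)  # candidate update
        # Child memories flow recursively into the parent cell:
        c = f_l * c_l + f_r * c_r + i * u
        return o * np.tanh(c), c
```

Applied bottom-up along a parse tree, each internal node's `compose` call gives the cell access, through the forget gates, to the accumulated memories of all its descendants, which is the long-distance interaction over hierarchies the abstract refers to.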
Publication date: 2015
Publisher: International Machine Learning Society
Affiliation: National Research Council Canada; Information and Communication Technologies
Peer reviewed: Yes
NPARC number: 23000279
Record identifier: 3f52b3f9-330f-4ca8-95b8-1c3a68fc4fa7
Record created: 2016-07-04
Record modified: 2016-07-04