Understanding Gating Operations in Recurrent Neural Networks through Opinion Expression Extraction

Extracting opinion expressions from text is an essential task in sentiment analysis, usually treated as a word-level sequence labeling problem. In such problems, compositional models with multiplicative gating operations provide an efficient way to encode context and to select critical information. In this paper, we therefore adopt Long Short-Term Memory (LSTM) recurrent neural networks to address the task of opinion expression extraction and explore the internal mechanisms of the model. The proposed approach is evaluated on the Multi-Perspective Question Answering (MPQA) opinion corpus. The experimental results demonstrate improvements over previous approaches, including the state-of-the-art method based on simple recurrent neural networks. We also provide a novel micro-level perspective for analyzing the run-time behavior of the model and gain new insights into how the LSTM selects its sources of information through flexible connections and multiplicative gating operations.
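
For context, the "multiplicative gating operations" the abstract refers to are the element-wise products that control information flow in an LSTM cell. A minimal sketch of the standard LSTM update (the classic Hochreiter–Schmidhuber formulation; the paper may use a variant, so this is the generic form rather than the authors' exact model):

$$
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)} \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate cell state)} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(multiplicative gating)} \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$

Here $W$, $U$, and $b$ are learned parameters, $\sigma$ is the logistic sigmoid, and $\odot$ denotes the element-wise product. The gates realize the "source selection" described in the abstract: $f_t$ scales how much of the previous cell state $c_{t-1}$ survives, $i_t$ scales how much of the new candidate $\tilde{c}_t$ enters, and $o_t$ modulates what is exposed as the hidden state $h_t$ used for sequence labeling.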
Authors: Wang, Xin; Liu, Yuanchao; Liu, Ming; Sun, Chengjie; Wang, Xiaolong
Journal: Entropy
Year: 2016