Recursive Subtree Composition in LSTM-Based Dependency Parsing
Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Languages, Department of Linguistics and Philology (Computational Linguistics). ORCID iD: 0000-0001-8844-2126
IBM (IBM Research AI)
Uppsala University, Disciplinary Domain of Humanities and Social Sciences, Faculty of Languages, Department of Linguistics and Philology (Computational Linguistics). ORCID iD: 0000-0002-7873-3971
2019 (English). In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers) / [ed] Jill Burstein; Christy Doran; Thamar Solorio. Stroudsburg: Association for Computational Linguistics, 2019, p. 1566-1576. Conference paper, Published paper (Refereed)
Abstract [en]

The need for tree structure modelling on top of sequence modelling is an open issue in neural dependency parsing. We investigate the impact of adding a tree layer on top of a sequential model by recursively composing subtree representations (composition) in a transition-based parser that uses features extracted by a BiLSTM. Composition seems superfluous with such a model, suggesting that BiLSTMs capture information about subtrees. We perform model ablations to tease out the conditions under which composition helps. When ablating the backward LSTM, performance drops and composition does not recover much of the gap. When ablating the forward LSTM, performance drops less dramatically and composition recovers a substantial part of the gap, indicating that a forward LSTM and composition capture similar information. We take the backward LSTM to be related to lookahead features and the forward LSTM to the rich history-based features, both crucial for transition-based parsers. To capture history-based information, composition is better than a forward LSTM on its own, but it is even better to have a forward LSTM as part of a BiLSTM. We correlate results with language properties, showing that the improved lookahead of a backward LSTM is especially important for head-final languages.
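To illustrate the composition operation the abstract refers to, here is a minimal sketch (not the authors' implementation): when a transition attaches a dependent to a head, the head's vector is updated from the concatenation of the two vectors. The dimensions, the `tanh(W [head; dep])` composition function, and the toy BiLSTM outputs are all illustrative assumptions.

```python
import math
import random

DIM = 4
random.seed(0)

# Hypothetical composition weights, shape DIM x (2*DIM); a trained
# parser would learn these jointly with the rest of the network.
W = [[random.uniform(-0.5, 0.5) for _ in range(2 * DIM)] for _ in range(DIM)]

def compose(head_vec, dep_vec):
    """Return an updated head representation from tanh(W [head; dep])."""
    concat = head_vec + dep_vec  # concatenation, length 2*DIM
    return [math.tanh(sum(W[i][j] * concat[j] for j in range(2 * DIM)))
            for i in range(DIM)]

# Toy stand-ins for BiLSTM feature vectors of three tokens.
tokens = {t: [random.uniform(-1, 1) for _ in range(DIM)]
          for t in ("the", "cat", "sleeps")}

# LEFT-ARC attaches "the" to "cat": composition folds the dependent
# into the head's representation.
tokens["cat"] = compose(tokens["cat"], tokens["the"])
# RIGHT-ARC attaches the "cat" subtree to "sleeps", so the subtree
# representation is built recursively, arc by arc.
tokens["sleeps"] = compose(tokens["sleeps"], tokens["cat"])

print(len(tokens["sleeps"]))  # prints 4: composed vector keeps dimension DIM
```

The paper's question is whether this extra recursive layer adds anything when the token vectors already come from a BiLSTM over the whole sentence.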

Place, publisher, year, edition, pages
Stroudsburg: Association for Computational Linguistics, 2019. p. 1566-1576
Keywords [en]
dependency parsing, recursive neural networks, recurrent neural networks, long short-term memory networks
National Category
General Language Studies and Linguistics
Research subject
Computational Linguistics
Identifiers
URN: urn:nbn:se:uu:diva-395676
ISI: 000900116901063
ISBN: 978-1-950737-13-0 (print)
OAI: oai:DiVA.org:uu-395676
DiVA, id: diva2:1364985
Conference
2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT), Minneapolis, June 2-7, 2019
Funder
Swedish Research Council, 2016-01817
Available from: 2019-10-23 Created: 2019-10-23 Last updated: 2023-05-29 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Paper in full-text

Authority records

de Lhoneux, Miryam; Nivre, Joakim
