Uppsala University Publications
Nivre, Joakim
Publications (10 of 189)
Kulmizev, A., de Lhoneux, M., Gontrum, J., Fano, E. & Nivre, J. (2019). Deep Contextualized Word Embeddings in Transition-Based and Graph-Based Dependency Parsing – A Tale of Two Parsers Revisited. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Paper presented at the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP) (pp. 2755-2768).
2019 (English). In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019, p. 2755-2768. Conference paper, Published paper (Refereed).
National Category
Language Technology (Computational Linguistics)
Research subject
Computational Linguistics
Identifiers
urn:nbn:se:uu:diva-406697 (URN)
Conference
2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
de Marneffe, M.-C. & Nivre, J. (2019). Dependency Grammar. Annual Review of Linguistics, 5, 197-218.
2019 (English). In: Annual Review of Linguistics, E-ISSN 2333-9691, Vol. 5, p. 197-218. Article in journal (Refereed), published.
Abstract [en]

Dependency grammar is a descriptive and theoretical tradition in linguistics that can be traced back to antiquity. It has long been influential in the European linguistics tradition and has more recently become a mainstream approach to representing syntactic and semantic structure in natural language processing. In this review, we introduce the basic theoretical assumptions of dependency grammar and review some key aspects in which different dependency frameworks agree or disagree. We also discuss advantages and disadvantages of dependency representations and introduce Universal Dependencies, a framework for multilingual dependency-based morphosyntactic annotation that has been applied to more than 60 languages.
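To make the notion of a dependency representation concrete, the following minimal Python sketch (constructed for this listing, not taken from the article) encodes a Universal Dependencies-style analysis of the sentence "She reads books" as one head index and one relation label per word, and prints it in a simple tabular form.

# Hypothetical illustration: a dependency analysis in the general style of
# Universal Dependencies, one (head, relation) pair per word.
sentence = ["She", "reads", "books"]

# Heads are 1-based word indices; 0 stands for the artificial root.
heads = [2, 0, 2]                     # "She" <- "reads", "reads" <- root, "books" <- "reads"
relations = ["nsubj", "root", "obj"]  # UD-style relation labels

for i, (word, head, rel) in enumerate(zip(sentence, heads, relations), start=1):
    head_word = "ROOT" if head == 0 else sentence[head - 1]
    print(f"{i}\t{word}\t{head}\t{rel}\t({word} --{rel}--> {head_word})")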

Place, publisher, year, edition, pages
Annual Reviews, 2019
Keywords
dependency grammar, dependency frameworks, dependency parsing, Universal Dependencies
National Category
General Language Studies and Linguistics
Identifiers
urn:nbn:se:uu:diva-381531 (URN); 10.1146/annurev-linguistics-011718-011842 (DOI); 000460289100010
Tang, G., Sennrich, R. & Nivre, J. (2019). Encoders Help You Disambiguate Word Senses in Neural Machine Translation. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Paper presented at the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP) (pp. 1429-1435).
2019 (English). In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019, p. 1429-1435. Conference paper, Published paper (Refereed).
National Category
Language Technology (Computational Linguistics)
Research subject
Computational Linguistics
Identifiers
urn:nbn:se:uu:diva-406696 (URN)
Conference
2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Meechan-Maddon, A. & Nivre, J. (2019). How to Parse Low-Resource Languages: Cross-Lingual Parsing, Target Language Annotation, or Both? In: Proceedings of the Fifth International Conference on Dependency Linguistics (Depling, SyntaxFest 2019). Paper presented at the Fifth International Conference on Dependency Linguistics (Depling, SyntaxFest 2019) (pp. 112-120).
2019 (English). In: Proceedings of the Fifth International Conference on Dependency Linguistics (Depling, SyntaxFest 2019), 2019, p. 112-120. Conference paper, Published paper (Refereed).
National Category
Language Technology (Computational Linguistics)
Research subject
Computational Linguistics
Identifiers
urn:nbn:se:uu:diva-406700 (URN)
Conference
Fifth International Conference on Dependency Linguistics (Depling, SyntaxFest 2019)
Basirat, A., de Lhoneux, M., Kulmizev, A., Kurfal, M., Nivre, J. & Östling, R. (2019). Polyglot Parsing for One Thousand and One Languages (And Then Some). Paper presented at the First Workshop on Typology for Polyglot NLP, Florence, Italy, August 1, 2019.
2019 (English). Conference paper, Poster (with or without abstract) (Other academic).
National Category
General Language Studies and Linguistics
Identifiers
urn:nbn:se:uu:diva-392156 (URN)
Conference
First Workshop on Typology for Polyglot NLP, Florence, Italy, August 1, 2019
Basirat, A. & Nivre, J. (2019). Real-valued syntactic word vectors. Journal of Experimental and Theoretical Artificial Intelligence.
2019 (English). In: Journal of Experimental and Theoretical Artificial Intelligence, ISSN 0952-813X (print), E-ISSN 1362-3079. Article in journal (Refereed), published.
Abstract [en]

We introduce a word embedding method that generates a set of real-valued word vectors from a distributional semantic space. The semantic space is built with a set of context units (words) which are selected by an entropy-based feature selection approach with respect to the certainty involved in their contextual environments. We show that the most predictive context of a target word is its preceding word. An adaptive transformation function is also introduced that reshapes the data distribution to make it suitable for dimensionality reduction techniques. The final low-dimensional word vectors are formed by the singular vectors of a matrix of transformed data. We show that the resulting word vectors are as good as other sets of word vectors generated with popular word embedding methods.
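The abstract describes a concrete pipeline: preceding-word contexts, entropy-based context selection, a transformation of the co-occurrence data, and singular value decomposition. The Python/NumPy sketch below follows those steps on a toy corpus; the raw-count weighting, the log1p transformation, and the entropy threshold are placeholder assumptions for illustration, not the authors' actual settings.

# Rough sketch of the pipeline described in the abstract; specific choices
# (weighting, transformation, threshold) are assumptions, not the paper's.
import numpy as np
from collections import Counter, defaultdict

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"], ["a", "cat", "ran"]]

# 1. Co-occurrence counts of each target word with its preceding word as context.
cooc = defaultdict(Counter)
for sent in corpus:
    for prev, word in zip(sent, sent[1:]):
        cooc[word][prev] += 1

vocab = sorted({w for s in corpus for w in s})
contexts = sorted({c for counts in cooc.values() for c in counts})

# 2. Entropy-based context selection: keep contexts whose distribution over
#    target words has low entropy, i.e. contexts with more certain environments.
def entropy(counts):
    p = np.array(list(counts), dtype=float)
    p /= p.sum()
    return -(p * np.log2(p)).sum()

ctx_entropy = {c: entropy([cooc[w][c] for w in vocab if cooc[w][c] > 0] or [1])
               for c in contexts}
selected = [c for c in contexts if ctx_entropy[c] <= 1.0]  # threshold is an assumption

# 3. Build the target-by-context matrix and reshape its distribution with a
#    simple transformation (log(1 + x) here; the article uses an adaptive one).
M = np.array([[cooc[w][c] for c in selected] for w in vocab], dtype=float)
M = np.log1p(M)

# 4. Low-dimensional word vectors from the singular vectors of the matrix.
k = 2
U, S, Vt = np.linalg.svd(M, full_matrices=False)
word_vectors = U[:, :k] * S[:k]
print(dict(zip(vocab, word_vectors.round(3).tolist())))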

Keywords
Word embeddings, context selection, transformation, dependency parsing, singular value decomposition, entropy
National Category
Languages and Literature; General Language Studies and Linguistics; Computer Systems
Identifiers
urn:nbn:se:uu:diva-392095 (URN); 10.1080/0952813X.2019.1653385 (DOI)
Tang, G., Sennrich, R. & Nivre, J. (2019). Understanding Neural Machine Translation by Simplification: The Case of Encoder-Free Models. In: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019). Paper presented at the International Conference on Recent Advances in Natural Language Processing (RANLP 2019) (pp. 1186-1193).
2019 (English). In: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), 2019, p. 1186-1193. Conference paper, Published paper (Refereed).
National Category
Language Technology (Computational Linguistics)
Research subject
Computational Linguistics
Identifiers
urn:nbn:se:uu:diva-406698 (URN)
Conference
International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Fano, E., Karlgren, J. & Nivre, J. (2019). Uppsala University and Gavagai at CLEF eRISK: Comparing Word Embedding Models. In: Working Notes of CLEF 2019 - Conference and Labs of the Evaluation Forum, Lugano, Switzerland, September 9-12, 2019. Paper presented at CLEF 2019 - Conference and Labs of the Evaluation Forum. CEUR-WS.org.
2019 (English). In: Working Notes of CLEF 2019 - Conference and Labs of the Evaluation Forum, Lugano, Switzerland, September 9-12, 2019, CEUR-WS.org, 2019. Conference paper, Published paper (Refereed).
Place, publisher, year, edition, pages
CEUR-WS.org, 2019
National Category
Language Technology (Computational Linguistics)
Research subject
Computational Linguistics
Identifiers
urn:nbn:se:uu:diva-406701 (URN)
Conference
CLEF 2019 - Conference and Labs of the Evaluation Forum
de Lhoneux, M., Stymne, S. & Nivre, J. (2019). What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions? CoRR, abs/1907.07950.
2019 (English). In: CoRR, Vol. abs/1907.07950. Article in journal (Other academic), published.
National Category
Language Technology (Computational Linguistics)
Research subject
Computational Linguistics
Identifiers
urn:nbn:se:uu:diva-406699 (URN)
Smith, A., Bohnet, B., de Lhoneux, M., Nivre, J., Shao, Y. & Stymne, S. (2018). 82 Treebanks, 34 Models: Universal Dependency Parsing with Multi-Treebank Models. In: Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies. Paper presented at the Conference on Computational Natural Language Learning (CoNLL), October 31 - November 1, 2018, Brussels, Belgium (pp. 113-123).
2018 (English). In: Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, 2018, p. 113-123. Conference paper, Published paper (Refereed).
National Category
Language Technology (Computational Linguistics)
Research subject
Computational Linguistics
Identifiers
urn:nbn:se:uu:diva-371246 (URN)
Conference
Conference on Computational Natural Language Learning (CoNLL), October 31 - November 1, 2018, Brussels, Belgium