Seminar in Computational Linguistics
- Date: –14:30
- Location: Engelska parken 9-3042
- Lecturer: Daniel Hershcovich
- Contact person: Miryam de Lhoneux
Universal Meaning Representation Parsing
Natural language understanding requires the ability to comprehend text, reason about it, and act upon it intelligently. While simplistic approaches such as end-to-end sequence-to-sequence models go a long way, symbolic meaning representation can provide an invaluable inductive bias. Along with coverage of semantic phenomena and ease of annotation, important design principles for meaning representations include cross-linguistic applicability and stability. I will present cross-lingual experiments with a transition-based meaning representation parser (which parses text to UCCA, AMR, DM and UD), and show that multi-task learning across meaning representation frameworks improves its performance by effectively learning shared generalizations. The results of two shared tasks on meaning representation parsing (in SemEval and CoNLL 2019) further highlight the contribution of cross-lingual learning, but raise questions about the comparability of, and the information shared between, meaning representation frameworks. I will also present an empirical comparison of the content of semantic and syntactic representations, revealing several aspects of divergence, which have a profound impact on the potential contribution of syntax to semantic parsing, and on the usefulness of each for natural language processing.
Daniel is a postdoctoral researcher at the University of Copenhagen, working on meaning representations and semantic parsing. He completed his Ph.D. at the Hebrew University of Jerusalem with Ari Rappoport and Omri Abend, and his B.Sc. in Mathematics and Computer Science at the Open University of Israel. From 2008 to 2019, Daniel was a software engineer at IBM Research, where he was part of Project Debater.