Invited talk, University of Washington, Seattle, WA
Semantic parsing, the task of mapping unstructured text to structured meaning representations, covers a wide variety of problems in natural language understanding and interaction. Common applications include translating human commands into executable programs, as well as question answering over databases or semi-structured tables. Real-world semantic parsing problems pose interesting yet difficult structured prediction challenges, such as the large space of legitimate semantic parses, weak or mixed supervision signals, and complex semantic and syntactic constraints. In this talk, I will give an overview of these technical challenges and present a case study on sequential question answering: answering sequences of simple but interrelated questions using semi-structured tables from Wikipedia. In particular, I will describe our dynamic neural semantic parsing framework, trained with weakly supervised, reward-guided search, which effectively leverages the sequential context to outperform state-of-the-art QA systems designed to answer highly complex questions. I will conclude with a discussion of open problems and promising directions for future research.
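To make the weak-supervision setting concrete, here is a minimal toy sketch (not the framework from the talk; the table, question, and all function names are illustrative assumptions). Candidate programs over a small table are enumerated and "rewarded" when their execution result matches the gold answer alone, with no gold program ever observed:

```python
# Toy sketch of weakly supervised semantic parsing over a table.
# Supervision is the answer (denotation), not the program itself:
# any candidate program whose execution matches the answer earns reward.
# All names and table contents here are hypothetical illustrations.
from itertools import product

table = [
    {"Year": "2014", "Champion": "Germany"},
    {"Year": "2016", "Champion": "Portugal"},
    {"Year": "2018", "Champion": "France"},
]
columns = ["Year", "Champion"]
values = sorted({row[c] for row in table for c in columns})

def execute(program, rows):
    """Run a (filter_col, filter_val, select_col) program on the table."""
    fcol, fval, scol = program
    return [r[scol] for r in rows if r[fcol] == fval]

def search(gold_answer):
    """Reward-guided search: keep every candidate program whose
    denotation matches the weak supervision signal (the answer)."""
    return [prog for prog in product(columns, values, columns)
            if execute(prog, table) == gold_answer]

# The answer "Portugal" alone supervises which programs are plausible
# parses of "Who was the champion in 2016?".
programs = search(["Portugal"])
print(programs)
```

Note that the search also surfaces a spurious program (filtering on `Champion == "Portugal"` rather than `Year == "2016"`) that produces the right answer for the wrong reason; distinguishing such spurious parses from correct ones is one of the structured prediction challenges the abstract alludes to.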