SLING: A framework for frame semantic parsing (1710.07032v1)

Published 19 Oct 2017 in cs.CL

Abstract: We describe SLING, a framework for parsing natural language into semantic frames. SLING supports general transition-based, neural-network parsing with bidirectional LSTM input encoding and a Transition Based Recurrent Unit (TBRU) for output decoding. The parsing model is trained end-to-end using only the text tokens as input. The transition system has been designed to output frame graphs directly without any intervening symbolic representation. The SLING framework includes an efficient and scalable frame store implementation as well as a neural network JIT compiler for fast inference during parsing. SLING is implemented in C++ and it is available for download on GitHub.

Citations (51)

Summary

  • The paper presents a novel transition-based parsing model that directly generates semantic frame graphs from text tokens.
  • It leverages a bidirectional LSTM encoder coupled with a Transition Based Recurrent Unit and a just-in-time neural compiler for efficient inference.
  • Experiments show promising F1 scores (Span F1: 93.69%, Combined F1: 87.55%), indicating strong potential for improving semantic parsing tasks.

SLING: A Framework for Frame Semantic Parsing

The paper "SLING: A framework for frame semantic parsing" presents a system for parsing natural language directly into semantic frames. The authors introduce a framework that uses neural network techniques to improve the efficiency and accuracy of frame-semantic parsing. SLING's focus is the direct derivation of frame graphs without any intermediary symbolic representation, a significant departure from traditional pipelined approaches.

Technical Overview

SLING is built upon a transition-based parsing model, employing a bidirectional LSTM for input encoding and a Transition Based Recurrent Unit (TBRU) for output decoding. This setup facilitates an end-to-end training approach utilizing only text tokens. The integration of a neural network just-in-time (JIT) compiler within SLING significantly expedites inference during parsing tasks. Additionally, a robust frame store implementation supports efficient and scalable operations within the system.
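The encode-then-decode loop described above can be sketched as follows. This is an illustrative skeleton, not the real SLING API: `encode`, `choose_action`, and the toy state class are hypothetical stand-ins. The key structural point is that the biLSTM runs once over the tokens, while the recurrent decoder (the TBRU) predicts one transition at a time, feeding each chosen action back into its own state.

```python
# Minimal sketch of a transition-based parse loop in the style of
# SLING (all names here are illustrative, not SLING's actual API).

def parse(tokens, encode, choose_action, initial_state):
    encodings = encode(tokens)        # one vector per token (biLSTM pass)
    state = initial_state(tokens)
    history = []
    while not state.done():           # transition loop (TBRU decoding)
        action = choose_action(state, encodings)
        state = state.apply(action)   # each action updates the parse state
        history.append(action)
    return history

# --- toy stand-ins so the sketch runs end to end --------------------
class ToyState:
    def __init__(self, tokens, pos=0):
        self.tokens, self.pos = tokens, pos
    def done(self):
        return self.pos >= len(self.tokens)
    def apply(self, action):
        # every toy action just advances the input cursor (SHIFT-like)
        return ToyState(self.tokens, self.pos + 1)

history = parse(
    ["John", "loves", "Mary"],
    encode=lambda toks: [hash(t) for t in toks],      # fake encoder
    choose_action=lambda st, enc: ("SHIFT", st.pos),  # fake policy
    initial_state=ToyState,
)
```

In the real system the action is chosen by the TBRU's learned scoring network rather than a fixed policy, but the control flow is the same shape.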

Design and Implementation

The framework is distinguished by a transition system designed explicitly for generating frame graphs. Unlike traditional transition systems that produce dependency trees, SLING's design handles the richer structure of semantic frame graphs. A notable feature is its attention mechanism: instead of attending over input tokens as in typical sequence-to-sequence models, the parser attends over the frames evoked so far during parsing.

The SLING parser's transition system uses actions such as EVOKE, CONNECT, and ASSIGN, each corresponding directly to an operation on the frame graph. These are complemented by lexical features that drive the bidirectional LSTM, giving the decoder access to both local and sentence-level context.
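The effect of these actions on a growing frame graph can be sketched as below. The frame types, role names, and the list-based attention buffer are invented for the example and simplified relative to SLING's actual frame store; the point is only that EVOKE creates a frame for a phrase, CONNECT links two attended frames with a role, and ASSIGN attaches a slot value.

```python
# Illustrative frame-graph builder with EVOKE / CONNECT / ASSIGN style
# operations (schema and buffer semantics are simplified, not SLING's).

class FrameGraph:
    def __init__(self):
        self.frames = []  # attention buffer: most recently evoked first

    def evoke(self, frame_type, span):
        # EVOKE: create a new frame for a token span, bring it into focus
        frame = {"type": frame_type, "span": span, "slots": []}
        self.frames.insert(0, frame)
        return frame

    def connect(self, source_idx, role, target_idx):
        # CONNECT: add a role-valued link between two attended frames
        src, tgt = self.frames[source_idx], self.frames[target_idx]
        src["slots"].append((role, tgt))

    def assign(self, frame_idx, role, value):
        # ASSIGN: attach a constant slot value to an attended frame
        self.frames[frame_idx]["slots"].append((role, value))

# "John loves Mary": evoke a frame per mention, then connect the
# event's argument roles to the two person frames.
g = FrameGraph()
g.evoke("PERSON", (0, 1))   # John
g.evoke("LOVES", (1, 2))    # loves (now at buffer index 0)
g.evoke("PERSON", (2, 3))   # Mary  (now at buffer index 0)
g.connect(1, "ARG0", 2)     # loves -> John
g.connect(1, "ARG1", 0)     # loves -> Mary
```

Because every action maps to a concrete graph mutation, the decoder's output sequence *is* the frame graph, with no intermediate symbolic representation to convert afterwards.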

Evaluation and Results

The authors conducted experiments on a corpus derived from OntoNotes, annotated with semantic frames. The model achieved strong F1 scores across several metrics, notably a Span F1 of 93.69% and a Combined F1 of 87.55%. Role labeling remains the main area for improvement, with a Role F1 of 69.39%.
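The F1 scores quoted above are the standard harmonic mean of precision and recall computed over sets of gold versus predicted items (spans, frames, or role slots, depending on the metric). A minimal, metric-agnostic version:

```python
# Standard F1 over gold vs. predicted item sets (not SLING-specific
# code): precision and recall from exact matches, then harmonic mean.

def f1(gold, predicted):
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)      # exact matches
    if tp == 0:
        return 0.0
    precision = tp / len(predicted)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)
```

For example, 2 correct predictions out of 3 made, against 4 gold items, gives precision 2/3, recall 1/2, and F1 = 4/7 ≈ 0.571.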

Action statistics show that EVOKE and CONNECT are the most frequently used transitions, underlining their centrality in frame generation. The small gap between development and test scores suggests the model generalizes well.

Runtime Efficiency

To address runtime efficiency, SLING incorporates Myelin, a JIT compiler for neural networks, which considerably speeds up parsing: about 2500 tokens per second on CPU, more than ten times faster than a TensorFlow-based implementation. Myelin specializes its generated code to the features of the host CPU, offering a lightweight and fast alternative to conventional frameworks.
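The tokens-per-second figure is simply total tokens processed divided by wall-clock time. A generic harness for measuring it (this is not SLING's benchmark code; `parse_fn` is any callable that parses one sentence) looks like:

```python
import time

# Generic throughput harness: time repeated parses and report tokens
# processed per wall-clock second (illustrative, not SLING's code).

def tokens_per_second(parse_fn, sentences, trials=3):
    n_tokens = sum(len(s) for s in sentences)
    start = time.perf_counter()
    for _ in range(trials):
        for sentence in sentences:
            parse_fn(sentence)
    elapsed = time.perf_counter() - start
    return trials * n_tokens / elapsed

rate = tokens_per_second(lambda s: None, [["a", "b", "c"], ["d", "e"]])
```

Comparing such rates across runtimes (Myelin vs. a TensorFlow graph) on the same hardware and sentences is what grounds the "over ten times" claim.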

Implications and Future Directions

The implications of SLING extend across natural language understanding applications where semantic frame parsing is critical. The demonstrated capability for direct frame graph generation offers potential enhancements in tasks such as entity recognition and semantic role labeling. Future developments could involve more complex frame structures, improved role-linking precision, and application to other linguistic tasks.

This paper provides a comprehensive framework with potential use cases in both academic research and practical applications, and its open-source release invites further innovation and optimization by the research community.
