arxivst · stuff from arxiv that you should probably bookmark

New State of the Art In Semantic Role Labeling

Post · Apr 4, 2017 18:41

state-of-the-art semantic-role-labeling CPB-dataset

Semantic Role Labeling took a big step forward today. The newly proposed Syntax Aware LSTM (SA-LSTM) model set a new state of the art on the challenging Chinese Proposition Bank dataset with an F1 score of 79.60%.

Highlights From The Paper

  • “Manual feature engineering costs a lot of time, and also, even though complex features are designed, long-distance relationships in a sentence are still hard to capture.”
  • “To take dependency relationship type into account, we introduce trainable weights for different types of dependency relationship.”
  • “The trained weights form a stable pattern, adding evidence that the SA-LSTM model is reliable.”

Arxiv Abstract

  • Feng Qian
  • Lei Sha
  • Baobao Chang
  • Lu-chen Liu
  • Ming Zhang

Traditional approaches to Semantic Role Labeling (SRL) depend heavily on manual feature engineering. A recurrent neural network (RNN) with long short-term memory (LSTM) only treats a sentence as sequence data and cannot utilize higher-level syntactic information. In this paper, we propose Syntax Aware LSTM (SA-LSTM), which gives RNN-LSTM the ability to utilize higher-level syntactic information gained from dependency relationships. SA-LSTM also automatically assigns different trainable weights to different types of dependency relationship. Experimental results on the Chinese Proposition Bank (CPB) show that, even without pre-training or introducing any other extra semantically annotated resources, our SA-LSTM model still outperforms the state of the art significantly based on Student's t-test (p < 0.05). The trained weights of the dependency relationship types form a stable and self-explanatory pattern.
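The core idea the abstract describes, one trainable weight per dependency relation type gating how much a dependent word's representation feeds into the current word's state, can be sketched in a few lines. This is a purely illustrative toy, not the paper's exact architecture: the relation names, the scalar sigmoid gate, and the function names below are all assumptions.

```python
import math

# Illustrative subset of dependency relation types (assumption, not from the paper)
DEP_TYPES = ["nsubj", "dobj", "amod"]

# One trainable scalar per relation type; zero-initialized here,
# learned by backprop in a real model
dep_weight = {t: 0.0 for t in DEP_TYPES}

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def syntactic_input(dependents, hidden):
    """Combine dependents' hidden vectors, gated by relation type.

    dependents: list of (word_index, dep_type) pairs from a dependency parse
    hidden: dict mapping word_index -> hidden vector (list of floats)
    Returns the syntax-aware extra input for the current word's LSTM step.
    """
    dim = len(next(iter(hidden.values())))
    s = [0.0] * dim
    for idx, dep_type in dependents:
        w = sigmoid(dep_weight[dep_type])  # gate in (0, 1), one per relation type
        for j in range(dim):
            s[j] += w * hidden[idx][j]
    return s
```

In the paper's model this syntactic summary would be added as an extra input to the LSTM cell; after training, inspecting `dep_weight` is what yields the "stable and self-explanatory pattern" the authors report.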

Read the paper (pdf) »