
Syntax Aware LSTM Model for Chinese Semantic Role Labeling

Abstract · Apr 3, 2017 02:10


Arxiv Abstract

  • Feng Qian
  • Lei Sha
  • Baobao Chang
  • Lu-chen Liu
  • Ming Zhang

Traditional approaches to Semantic Role Labeling (SRL) depend heavily on manual feature engineering. A recurrent neural network (RNN) with long short-term memory (LSTM) treats a sentence only as sequential data and cannot utilize higher-level syntactic information. In this paper, we propose the Syntax Aware LSTM (SA-LSTM), which gives the RNN-LSTM the ability to utilize higher-level syntactic information gained from dependency relations. SA-LSTM also automatically assigns different trainable weights to different types of dependency relations. Experimental results on the Chinese Proposition Bank (CPB) show that, even without pre-training or introducing any extra semantically annotated resources, our SA-LSTM model still significantly outperforms the state of the art based on Student's t-test ($p<0.05$). The trained weights for the types of dependency relations form a stable and self-explanatory pattern.
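The abstract's central mechanism, a trainable scalar weight per dependency relation type that scales syntactic connections feeding into an LSTM, can be sketched roughly as below. This is a minimal PyTorch illustration of that idea, not the authors' exact formulation: the `SALSTMCell` name, the gating scheme, and the way the syntactic signal is injected are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class SALSTMCell(nn.Module):
    """Sketch of a syntax-aware LSTM cell.

    On top of a standard LSTM cell, each step receives a "syntax"
    vector: a sum of hidden states h_j of words linked to the current
    word in the dependency parse, each scaled by a trainable scalar
    weight for its dependency relation type.
    """

    def __init__(self, input_size, hidden_size, num_dep_types):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # one trainable weight per dependency relation type
        self.dep_weight = nn.Parameter(torch.zeros(num_dep_types))
        # gate controlling how much syntactic information flows in
        self.syn_gate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x_t, state, dep_states, dep_types):
        # x_t:        (1, input_size) embedding of the current word
        # state:      (h_prev, c_prev), each (1, hidden_size)
        # dep_states: (k, hidden_size) hidden states of dependency-linked words
        # dep_types:  (k,) integer ids of their relation types
        h_prev, c_prev = state
        if dep_states.numel() > 0:
            # per-relation trainable weights, squashed to (0, 1)
            alpha = torch.sigmoid(self.dep_weight[dep_types])          # (k,)
            s_t = (alpha.unsqueeze(-1) * dep_states).sum(0, keepdim=True)
        else:
            s_t = torch.zeros_like(h_prev)
        # gate the syntactic signal on the current input and hidden state
        g = torch.sigmoid(self.syn_gate(torch.cat([x_t, h_prev], dim=-1)))
        h_t, c_t = self.cell(x_t, (h_prev, c_prev))
        h_t = h_t + g * s_t  # inject gated syntactic information
        return h_t, c_t
```

In a full SRL model one would run such a cell over the sentence, gathering for each word the hidden states of its already-processed dependency neighbors from a parser's output; after training, inspecting `dep_weight` would show which relation types the model relies on, which is the "self-explanatory pattern" the abstract refers to.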

Read the paper (pdf) »