
Optimizing Differentiable Relaxations of Coreference Evaluation Metrics

Abstract · Apr 14, 2017 15:22

metrics clark decisions coreference differentiable cs-cl cs-ai cs-lg

arXiv Abstract

  • Phong Le
  • Ivan Titov

Coreference evaluation metrics are hard to optimize directly as they are non-differentiable functions, not easily decomposable into elementary decisions. Consequently, most approaches optimize objectives only indirectly related to the end goal, resulting in suboptimal performance. Instead, we propose a differentiable relaxation that lends itself to gradient-based optimisation, thus bypassing the need for reinforcement learning or heuristic modification of cross-entropy. We show that by modifying the training objective of a competitive neural coreference system, we obtain a substantial gain in performance. This suggests that our approach can be regarded as a viable alternative to using reinforcement learning or more computationally expensive imitation learning.
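The abstract only names the technique, so here is a rough, hypothetical sketch of the general idea rather than the paper's own formulation: in a mention-ranking coreference model, each mention scores its candidate antecedents, and replacing the usual hard argmax decision with a softmax makes a simplified, link-counting surrogate of B³ recall a differentiable function of the model scores. The function `soft_b3_recall`, the toy scores, and the gold clustering below are invented for illustration; the paper derives exact relaxations of established metrics (B³ and LEA), which differ from this sketch.

```python
import torch

def soft_b3_recall(antecedent_scores, gold_clusters):
    """Soft, simplified surrogate for B^3 recall (illustrative, not the paper's).

    antecedent_scores: list where entry i is a 1-D tensor of scores for
    mention i over candidate antecedents 0..i-1 plus a trailing
    'no antecedent' dummy score. Replacing the hard argmax with a softmax
    makes the surrogate differentiable with respect to the scores.
    """
    n_mentions = len(antecedent_scores)
    # p_link[i][j]: probability that mention i links back to mention j
    p_link = [torch.softmax(s, dim=-1)[:-1] for s in antecedent_scores]

    recall = torch.zeros(())
    for cluster in gold_clusters:
        members = set(cluster)
        for i in members:
            # expected number of gold cluster-mates that mention i is
            # (softly) linked to; it always counts itself
            expected = torch.ones(())
            for j in range(i):
                if j in members:
                    expected = expected + p_link[i][j]
            recall = recall + expected / len(members)
    return recall / n_mentions

# Toy usage: 3 mentions, gold clustering {0, 2} and {1}; maximizing the
# surrogate pushes mention 2 toward choosing mention 0 as its antecedent.
scores = [torch.zeros(1, requires_grad=True),   # mention 0: only the dummy
          torch.zeros(2, requires_grad=True),   # mention 1: {0, dummy}
          torch.zeros(3, requires_grad=True)]   # mention 2: {0, 1, dummy}
loss = -soft_b3_recall(scores, gold_clusters=[[0, 2], [1]])
loss.backward()
print(scores[2].grad)  # gradients flow through the relaxed metric
```

Training can then maximize such a surrogate (or a combination of recall- and precision-style surrogates) directly by gradient descent, which is the role that reinforcement learning or heuristic modifications of cross-entropy would otherwise play.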

Read the paper (pdf) »