arxivst · stuff from arXiv that you should probably bookmark

Early Stopping without a Validation Set

Abstract · Mar 28, 2017 14:01

cs.LG · stat.ML

arXiv Abstract

  • Maren Mahsereci
  • Lukas Balles
  • Christoph Lassner
  • Philipp Hennig

Early stopping is a widely used technique to prevent poor generalization performance when training an over-expressive model by means of gradient-based optimization. To find a good point to halt the optimizer, a common practice is to split the dataset into a training and a smaller validation set to obtain an ongoing estimate of the generalization performance. In this paper we propose a novel early stopping criterion which is based on fast-to-compute, local statistics of the computed gradients and entirely removes the need for a held-out validation set. Our experiments show that this is a viable approach in the setting of least-squares and logistic regression as well as neural networks.
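To make the idea concrete, here is a minimal sketch of a gradient-statistics stopping test in the spirit the abstract describes. It is an illustrative assumption, not the paper's exact criterion: it compares the squared mini-batch gradient in each dimension to its sampling variance across the batch, and signals "stop" once the gradient is statistically indistinguishable from noise — no held-out validation set involved.

```python
import numpy as np

def eb_stop(per_example_grads):
    """Hedged sketch of a local-gradient-statistics stopping test.

    per_example_grads: array of shape (B, D) holding the gradient of each
    of the B examples' losses w.r.t. the D parameters. When the mini-batch
    gradient is dominated by sampling noise, the test returns True (stop).
    """
    B, _ = per_example_grads.shape
    g = per_example_grads.mean(axis=0)            # mini-batch gradient
    var = per_example_grads.var(axis=0, ddof=1)   # per-dimension gradient variance
    var = np.maximum(var, 1e-12)                  # numerical safety
    # Per-dimension signal-to-noise term: B * g_d^2 / var_d is large while the
    # gradient still carries signal, and near 1 once it is pure noise.
    stat = np.mean(1.0 - B * g**2 / var)
    return stat > 0.0

# Strong, consistent gradient across examples -> keep training.
strong = 5.0 * np.ones((4, 2))
print(eb_stop(strong))   # False: gradient far exceeds its noise level

# Per-example gradients that cancel to zero mean -> pure noise, stop.
noise = np.array([[1., 1.], [-1., -1.], [1., -1.], [-1., 1.]])
print(eb_stop(noise))    # True: mini-batch gradient is indistinguishable from noise
```

The key design point is that everything here is computable from quantities the optimizer already touches (the per-example gradients of a single mini-batch), which is what lets the criterion replace an ongoing validation-loss estimate.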

Read the paper (pdf) »