
New State-of-the-Art for Few-Shot Learning

Post · Mar 16, 2017 21:29

Tags: state-of-the-art · one-shot · zero-shot learning

Prototypical networks achieve a new state of the art on zero-shot classification (recognizing classes for which no labeled examples are given, only class-level metadata) on the CU-Birds dataset, and on one-shot classification (learning from a single labeled example per class) on the Omniglot dataset.

arXiv Abstract

  • Jake Snell
  • Kevin Swersky
  • Richard S. Zemel

We propose prototypical networks for the problem of few-shot classification, where a classifier must generalize to new classes not seen in the training set, given only a small number of examples of each new class. Prototypical networks learn a metric space in which classification can be performed by computing Euclidean distances to prototype representations of each class. Compared to recent approaches for few-shot learning, they reflect a simpler inductive bias that is beneficial in this limited-data regime, and achieve state-of-the-art results. We provide an analysis showing that some simple design decisions can yield substantial improvements over recent approaches involving complicated architectural choices and meta-learning. We further extend prototypical networks to the case of zero-shot learning and achieve state-of-the-art zero-shot results on the CU-Birds dataset.
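The classification rule described in the abstract is simple enough to sketch in a few lines. The following is a minimal NumPy illustration, not the authors' code: each class prototype is the mean of the embedded support examples for that class, and a query is assigned to the class whose prototype is nearest in Euclidean distance. The learned embedding network is stood in for by random vectors, and all names here are placeholders for illustration.

    import numpy as np

    def prototypes(support_embeddings, support_labels, num_classes):
        """Class prototype = mean of the embedded support points of that class."""
        return np.stack([
            support_embeddings[support_labels == c].mean(axis=0)
            for c in range(num_classes)
        ])

    def classify(query_embeddings, protos):
        """Assign each query to the class with the nearest prototype (squared Euclidean distance)."""
        dists = ((query_embeddings[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
        return dists.argmin(axis=1)

    # Toy usage: random "embeddings" stand in for the output of a trained embedding network.
    rng = np.random.default_rng(0)
    support = rng.normal(size=(3 * 5, 64))        # 3 classes, 5 support points each, 64-d embeddings
    labels = np.repeat(np.arange(3), 5)
    queries = rng.normal(size=(10, 64))
    preds = classify(queries, prototypes(support, labels, num_classes=3))

In the paper the embedding network is trained episodically so that this nearest-prototype rule works well on held-out classes; the snippet above only shows the inference-time geometry.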

Read the paper (pdf) »