

Oral

Anytime Continual Learning for Open Vocabulary Classification

Zhen Zhu · Yiming Gong · Derek Hoiem

Oral 7A: Learning Architectures, Transfer, Continual And Long-Tail
Fri 4 Oct midnight — 12:10 a.m. PDT

Abstract:

We propose an approach for anytime continual learning (AnytimeCL) for open vocabulary image classification. The AnytimeCL problem aims to break away from batch training and rigid models by requiring that a system can predict any set of labels at any time and efficiently update and improve when receiving one or more training samples at any time. Despite the challenging goal, we achieve substantial improvements over recent methods. We propose a dynamic weighting between predictions of a partially fine-tuned model and a fixed open vocabulary model that enables continual improvement when training samples are available for a subset of a task's labels. We also propose an attention-weighted PCA compression of training features that reduces storage and computation with little impact on model accuracy. Our methods are validated with experiments that test the flexibility of learning and inference.
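The two ingredients named in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names (`blend_predictions`, `attention_weighted_pca`), the simple convex-combination blending, and the use of a weighted SVD for the attention-weighted PCA are all assumptions made for illustration.

```python
import numpy as np

def blend_predictions(p_tuned, p_frozen, alpha):
    """Convex combination of class probabilities from a partially
    fine-tuned model and a fixed open-vocabulary model.
    alpha is a hypothetical per-label/per-sample dynamic weight."""
    return alpha * p_tuned + (1.0 - alpha) * p_frozen

def attention_weighted_pca(feats, attn, k):
    """Compress stored training features (n, d) to k dimensions,
    weighting each feature's contribution by its attention score."""
    w = attn / attn.sum()                       # normalized weights (n,)
    mean = w @ feats                            # attention-weighted mean (d,)
    centered = (feats - mean) * np.sqrt(w)[:, None]
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:k]                              # top-k principal directions (k, d)
    codes = (feats - mean) @ basis.T            # compressed features (n, k)
    return codes, basis, mean

def decompress(codes, basis, mean):
    """Reconstruct approximate features from their compressed codes."""
    return codes @ basis + mean
```

Storing `codes` (n x k) plus one basis and mean in place of the full n x d features is where the storage and compute savings come from; with k much smaller than d the reconstruction error concentrates in directions the attention weights deem unimportant.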
