

Poster

Dual-Consistency Model Inversion for Non-Exemplar Class Incremental Learning

Zihuan Qiu · Yi Xu · Fanman Meng · Hongliang Li · Linfeng Xu · Qingbo Wu

Arch 4A-E Poster #437
Fri 21 Jun 10:30 a.m. PDT — noon PDT

Abstract:

Non-exemplar class incremental learning (NECIL) aims to continuously assimilate new knowledge without forgetting previously acquired knowledge when historical data are unavailable. One line of generative NECIL methods inverts images of old classes for joint training. However, these synthetic images suffer from significant domain shifts relative to real data, hampering the recognition of old classes. In this paper, we present a novel method termed Dual-Consistency Model Inversion (DCMI), which generates better synthetic samples of old classes through two pivotal consistency alignments: (1) semantic consistency between the synthetic images and the corresponding class prototypes, and (2) domain consistency between synthetic and real images of new classes. Additionally, we introduce Prototypical Routing (PR) to provide task-prior information and produce unbiased, accurate predictions. Comprehensive experiments across diverse datasets consistently demonstrate the superiority of our method over previous state-of-the-art approaches. The code will be released.
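To make the two alignments concrete, the following is a minimal, hypothetical PyTorch sketch of how such consistency objectives could be formulated during model inversion. All names (model, prototypes, loss weights) are illustrative assumptions, not the paper's released implementation; in particular, the domain term here uses simple feature-statistic matching as a stand-in for whatever alignment DCMI actually employs.

    # Hypothetical sketch of dual-consistency inversion losses (not the authors' code).
    import torch
    import torch.nn.functional as F

    def semantic_consistency_loss(feats_syn, prototypes, labels):
        # Pull features of synthetic old-class images toward their class prototypes.
        target = prototypes[labels]  # (B, D): one stored prototype per sample
        return 1.0 - F.cosine_similarity(feats_syn, target, dim=1).mean()

    def domain_consistency_loss(feats_syn, feats_real):
        # Match first- and second-order feature statistics between synthetic
        # old-class images and real new-class images (a simple moment-matching
        # proxy for domain alignment; assumed, not from the paper).
        mean_gap = (feats_syn.mean(0) - feats_real.mean(0)).pow(2).sum()
        var_gap = (feats_syn.var(0) - feats_real.var(0)).pow(2).sum()
        return mean_gap + var_gap

    def inversion_step(model, x_syn, labels, prototypes, x_real_new, opt,
                       lam_sem=1.0, lam_dom=0.1):
        # One inversion step: the optimizer updates the synthetic images x_syn
        # themselves, while the old model's feature extractor stays frozen.
        opt.zero_grad()
        feats_syn = model(x_syn)
        with torch.no_grad():
            feats_real = model(x_real_new)
        loss = (lam_sem * semantic_consistency_loss(feats_syn, prototypes, labels)
                + lam_dom * domain_consistency_loss(feats_syn, feats_real))
        loss.backward()
        opt.step()
        return loss.item()

Here x_syn would be a learnable tensor of synthetic images (e.g., initialized from noise with requires_grad=True) passed to the optimizer, and the loss weights lam_sem and lam_dom are placeholders rather than values reported in the paper.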
