Liu, Xialei; Zhai, Jiang-Tian; Bagdanov, Andrew D.; Li, Ke; Cheng, Ming-Ming. Task-Adaptive Saliency Guidance for Exemplar-Free Class Incremental Learning. In Proceedings of the 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2024), USA, pp. 23954-23963. DOI: 10.1109/cvpr52733.2024.02261.

Task-Adaptive Saliency Guidance for Exemplar-Free Class Incremental Learning

Bagdanov, Andrew D.
2024

Abstract

Exemplar-free Class Incremental Learning (EFCIL) aims to sequentially learn tasks with access only to data from the current one. EFCIL is of interest because it mitigates concerns about privacy and long-term storage of data, while at the same time alleviating the problem of catastrophic forgetting in incremental learning. In this work, we introduce task-adaptive saliency for EFCIL and propose a new framework, which we call Task-Adaptive Saliency Supervision (TASS), for mitigating the negative effects of saliency drift between different tasks. We first apply boundary-guided saliency to maintain task adaptivity and plasticity of model attention. In addition, we introduce task-agnostic low-level signals as auxiliary supervision to increase the stability of model attention. Finally, we introduce a module for injecting and recovering saliency noise to increase the robustness of saliency preservation. Our experiments demonstrate that our method can better preserve saliency maps across tasks and achieve state-of-the-art results on the CIFAR-100, Tiny-ImageNet, and ImageNet-Subset EFCIL benchmarks. Code is available at https://github.com/scok30/tass.
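The abstract describes preserving saliency maps across tasks; the paper and the linked repository should be consulted for the actual formulation. As a loose, hypothetical illustration only (none of these names or details come from the paper), the general idea of penalizing saliency drift between the previous-task model and the current one can be sketched in plain Python:

```python
def saliency_distillation_loss(old_map, new_map):
    """Mean absolute difference between the saliency map of the
    previous-task model (old_map) and of the current model (new_map).
    Penalizing this difference during training discourages saliency
    drift across tasks. Maps are flat lists of attention values."""
    assert len(old_map) == len(new_map), "maps must be the same size"
    return sum(abs(o - n) for o, n in zip(old_map, new_map)) / len(old_map)

# Identical maps incur no penalty; drifted maps are penalized.
stable = saliency_distillation_loss([0.2, 0.8, 0.5], [0.2, 0.8, 0.5])  # → 0.0
drift = saliency_distillation_loss([0.2, 0.8, 0.5], [0.8, 0.2, 0.5])   # ≈ 0.4
```

This is only the generic distillation pattern common to attention-preserving continual learning; TASS itself adds boundary-guided and low-level supervision and a saliency-noise module on top of such a signal.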
2024
Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
USA
2024
Liu, Xialei; Zhai, Jiang-Tian; Bagdanov, Andrew D.; Li, Ke; Cheng, Ming-Ming
Files for this item:
There are no files associated with this item.

Documents in FLORE are protected by copyright, with all rights reserved unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1442923
Citations
  • PMC: ND
  • Scopus: 7
  • Web of Science: 3