Title
Continual Normalization: Rethinking Batch Normalization for Online Continual Learning
Abstract
Existing continual learning methods use Batch Normalization (BN) to facilitate training and improve generalization across tasks. However, the non-i.i.d. and non-stationary nature of continual learning data, especially in the online setting, amplifies the discrepancy between training and testing in BN and hinders performance on older tasks. In this work, we study the cross-task normalization effect of BN in online continual learning, where BN normalizes the testing data using moments biased towards the current task, resulting in higher catastrophic forgetting. This limitation motivates us to propose a simple yet effective method, Continual Normalization (CN), which facilitates training as BN does while mitigating its negative effect. Extensive experiments on different continual learning algorithms and online scenarios show that CN is a direct replacement for BN and can provide substantial performance improvements. Our implementation is available at \url{https://github.com/phquang/Continual-Normalization}.
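The abstract does not spell out how CN is computed. As a hedged illustration only, the sketch below assumes CN composes a per-sample group normalization (without affine parameters) with a standard batch normalization, consistent with the description of a drop-in BN replacement that removes task-biased moments; the class name ContinualNorm and the num_groups default are hypothetical, not the authors' API. See the linked repository for the reference implementation.

import torch
import torch.nn as nn

class ContinualNorm(nn.Module):
    """Illustrative sketch (not the authors' code): group normalization
    followed by batch normalization as a drop-in replacement for BN."""

    def __init__(self, num_channels: int, num_groups: int = 32):
        super().__init__()
        # GroupNorm normalizes each sample independently of the batch,
        # so its statistics carry no bias towards the current task.
        self.gn = nn.GroupNorm(num_groups, num_channels, affine=False)
        # BatchNorm then supplies BN's training benefits; its running
        # moments are computed on the already group-normalized features.
        self.bn = nn.BatchNorm2d(num_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.bn(self.gn(x))

# Usage: replace an nn.BatchNorm2d(64) layer with ContinualNorm(64), e.g.
#   layer = ContinualNorm(num_channels=64)
#   y = layer(torch.randn(8, 64, 32, 32))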
Year
2022
Venue
International Conference on Learning Representations (ICLR)
Keywords
Continual Learning, Batch Normalization
DocType
Conference
ISSN
International Conference on Learning Representations, 2022
Citations
0
PageRank
0.34
References
0
Authors
3
Name            Order  Citations  PageRank
Quang Pham      1      14         0.99
Chenghao Liu    2      3343       2.66
Steven Hoi      3      0          0.34