TY - CONF
T1 - Energy-based Latent Aligner for Incremental Learning
AU - Joseph, K. J.
AU - Khan, Salman
AU - Khan, Fahad Shahbaz
AU - Anwer, Rao Muhammad
AU - Balasubramanian, Vineeth N.
PY - 2022
AB - Deep learning models tend to forget their earlier knowledge while incrementally learning new tasks. This behavior emerges because the parameter updates optimized for the new tasks may not align well with the updates suitable for older tasks. The resulting latent representation mismatch causes forgetting. In this work, we propose ELI: Energy-based Latent Aligner for Incremental Learning, which first learns an energy manifold for the latent representations such that previous task latents have low energy and current task latents have high energy. This learned manifold is used to counter the representational shift that happens during incremental learning. The implicit regularization offered by our proposed methodology can be used as a plug-and-play module in existing incremental learning methodologies. We validate this through extensive evaluation on the CIFAR-100, ImageNet subset, ImageNet-1k, and Pascal VOC datasets. We observe consistent improvement when ELI is added to three prominent class-incremental learning methodologies, across multiple incremental settings. Further, when added to the state-of-the-art incremental object detector, ELI provides over 5% improvement in detection accuracy, corroborating its effectiveness and complementary advantage to the existing art. Code is available at: https://github.com/JosephKJ/ELI.
UR - http://www.scopus.com/inward/record.url?scp=85136019328&partnerID=8YFLogxK
DO - 10.1109/CVPR52688.2022.00730
M3 - Conference contribution
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 7442
EP - 7451
BT - 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
PB - IEEE
T2 - IEEE/CVF Conference on Computer Vision and Pattern Recognition
Y2 - 18 June 2022 through 24 June 2022
ER -
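
For readers skimming this record, the abstract's core mechanism (learn an energy manifold over latents, then descend it to realign drifted representations) can be sketched in a few lines of PyTorch. This is a minimal illustration only, not the authors' implementation (their released code is at https://github.com/JosephKJ/ELI); EnergyHead, train_energy, align, the logistic training surrogate, and every hyperparameter below are assumptions made for exposition.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EnergyHead(nn.Module):
    """Small MLP that assigns a scalar energy E_psi(z) to a latent z.
    Hypothetical module, not the paper's exact architecture."""
    def __init__(self, dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, z):
        return self.net(z).squeeze(-1)

def train_energy(energy, z_prev, z_curr, steps=200, lr=1e-3):
    # Shape the manifold: low energy for previous-task latents, high
    # energy for current-task latents (a logistic surrogate objective;
    # the paper's actual training loss may differ).
    opt = torch.optim.Adam(energy.parameters(), lr=lr)
    for _ in range(steps):
        loss = (F.softplus(energy(z_prev)).mean()
                + F.softplus(-energy(z_curr)).mean())
        opt.zero_grad()
        loss.backward()
        opt.step()

def align(energy, z, n_steps=5, step_size=0.1):
    # Counter representational shift: walk a latent downhill on the
    # learned energy, toward the low-energy (previous-task) region.
    z = z.detach().clone().requires_grad_(True)
    for _ in range(n_steps):
        grad, = torch.autograd.grad(energy(z).sum(), z)
        z = (z - step_size * grad).detach().requires_grad_(True)
    return z.detach()

# Toy usage with random latents standing in for backbone features.
dim = 64
head = EnergyHead(dim)
z_old = torch.randn(512, dim)        # latents from the previous-task model
z_new = torch.randn(512, dim) + 1.0  # drifted latents from the current model
train_energy(head, z_old, z_new)
z_aligned = align(head, z_new[:8])   # pulled toward the old-task manifold

Because the alignment step touches only the latents, a sketch like this can sit in front of any frozen incremental learner, which is the plug-and-play property the abstract claims.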