TY - GEN
T1 - Class Incremental Learning with Task-Selection
AU - Kim, Eun Sung
AU - Kim, Jung Uk
AU - Lee, Sangmin
AU - Moon, Sang Keun
AU - Ro, Yong Man
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10
Y1 - 2020/10
N2 - Despite the success of deep neural networks (DNNs), in the case of incremental learning, DNNs are known to suffer from the catastrophic forgetting problem, the phenomenon of entirely forgetting previously learned task information upon learning current task information. To alleviate this problem, we propose a novel knowledge distillation-based class incremental learning method with a task-selective autoencoder (TsAE). By learning the TsAE to reconstruct the feature map of each task, the proposed method effectively memorizes not only the classes of the current task but also the classes of previously learned tasks. Since the proposed TsAE has a simple but powerful architecture, it can be easily generalized to other knowledge distillation-based class incremental learning methods. Our experimental results on various datasets, including iCIFAR-100 and iILSVRC-small, demonstrate that the proposed method achieves higher classification accuracy and less forgetting compared to the state-of-the-art methods.
AB - Despite the success of deep neural networks (DNNs), in the case of incremental learning, DNNs are known to suffer from the catastrophic forgetting problem, the phenomenon of entirely forgetting previously learned task information upon learning current task information. To alleviate this problem, we propose a novel knowledge distillation-based class incremental learning method with a task-selective autoencoder (TsAE). By learning the TsAE to reconstruct the feature map of each task, the proposed method effectively memorizes not only the classes of the current task but also the classes of previously learned tasks. Since the proposed TsAE has a simple but powerful architecture, it can be easily generalized to other knowledge distillation-based class incremental learning methods. Our experimental results on various datasets, including iCIFAR-100 and iILSVRC-small, demonstrate that the proposed method achieves higher classification accuracy and less forgetting compared to the state-of-the-art methods.
KW - autoencoder
KW - catastrophic forgetting
KW - deep learning
KW - incremental learning
KW - knowledge distillation
UR - https://www.scopus.com/pages/publications/85098639793
U2 - 10.1109/ICIP40778.2020.9190703
DO - 10.1109/ICIP40778.2020.9190703
M3 - Conference contribution
AN - SCOPUS:85098639793
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 1846
EP - 1850
BT - 2020 IEEE International Conference on Image Processing, ICIP 2020 - Proceedings
PB - IEEE Computer Society
T2 - 2020 IEEE International Conference on Image Processing, ICIP 2020
Y2 - 25 September 2020 through 28 September 2020
ER -