TY - GEN
T1 - CAFTTA
T2 - Joint 13th International Conference on Soft Computing and Intelligent Systems and 25th International Symposium on Advanced Intelligent Systems, SCIS and ISIS 2024
AU - Lee, Byung Joon
AU - Lee, Jin Seop
AU - Lee, Jee Hyong
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - In real-world applications, deployed deep learning models often encounter test data from domains significantly different from their training data. This domain shift typically results in performance degradation for conventional deep learning models. Test-time adaptation (TTA) has emerged as a promising research direction to address this challenge by adapting models at test time to unpredictable domains. However, in real-world scenarios, data may exhibit different class distributions as well as different domains. Consequently, models are deployed in environments where they temporarily cannot observe data of a particular class. Under these conditions, test-time adaptation methods exhibit catastrophic forgetting of unseen classes, significantly compromising their ability to generalize. To address this critical issue, we propose Class Anti-Forgetting Test-Time Adaptation (CAFTTA). Our method shares domain knowledge through weight sharing between the original source model and the test-time adaptation model, and preserves knowledge of unseen classes through entropy-based knowledge fusion. Our method minimizes performance degradation on seen classes while preventing forgetting of unseen classes. Experimentally, we demonstrate that our method significantly mitigates the unseen-class forgetting problem faced by existing test-time adaptation methods in both static and continuous domain scenarios.
AB - In real-world applications, deployed deep learning models often encounter test data from domains significantly different from their training data. This domain shift typically results in performance degradation for conventional deep learning models. Test-time adaptation (TTA) has emerged as a promising research direction to address this challenge by adapting models at test time to unpredictable domains. However, in real-world scenarios, data may exhibit different class distributions as well as different domains. Consequently, models are deployed in environments where they temporarily cannot observe data of a particular class. Under these conditions, test-time adaptation methods exhibit catastrophic forgetting of unseen classes, significantly compromising their ability to generalize. To address this critical issue, we propose Class Anti-Forgetting Test-Time Adaptation (CAFTTA). Our method shares domain knowledge through weight sharing between the original source model and the test-time adaptation model, and preserves knowledge of unseen classes through entropy-based knowledge fusion. Our method minimizes performance degradation on seen classes while preventing forgetting of unseen classes. Experimentally, we demonstrate that our method significantly mitigates the unseen-class forgetting problem faced by existing test-time adaptation methods in both static and continuous domain scenarios.
KW - Catastrophic Forgetting
KW - Domain Shift
KW - Knowledge Fusion
KW - Test-Time Adaptation
UR - https://www.scopus.com/pages/publications/85214697801
U2 - 10.1109/SCISISIS61014.2024.10759970
DO - 10.1109/SCISISIS61014.2024.10759970
M3 - Conference contribution
AN - SCOPUS:85214697801
T3 - 2024 Joint 13th International Conference on Soft Computing and Intelligent Systems and 25th International Symposium on Advanced Intelligent Systems, SCIS and ISIS 2024
BT - 2024 Joint 13th International Conference on Soft Computing and Intelligent Systems and 25th International Symposium on Advanced Intelligent Systems, SCIS and ISIS 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 9 November 2024 through 12 November 2024
ER -