TY - GEN
T1 - Multi-Head CNN-Attention Prompt Pools
T2 - 2025 International Technical Conference on Circuits/Systems, Computers, and Communications, ITC-CSCC 2025
AU - Oh, Gyutae
AU - Shin, Jitae
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Modern AI models depend heavily on their training data, so when they encounter inputs not seen during training, their prediction performance can drop sharply. Furthermore, in real-world environments the data distribution changes constantly over time, which exacerbates this limitation. Continual learning techniques are therefore essential: they allow models to keep learning as new data arrives sequentially while effectively retaining existing knowledge. However, naive retraining can lead to catastrophic forgetting, where new information overwrites previously learned knowledge and significantly degrades performance on past tasks. In this paper, we propose a novel approach that improves the prompt pool, a key component of prompt-based learning, in order to mitigate such forgetting. The proposed method achieves performance improvements ranging from approximately 2% to over 8% compared with conventional prompt-based learning techniques across various benchmark experiments.
AB - Modern AI models depend heavily on their training data, so when they encounter inputs not seen during training, their prediction performance can drop sharply. Furthermore, in real-world environments the data distribution changes constantly over time, which exacerbates this limitation. Continual learning techniques are therefore essential: they allow models to keep learning as new data arrives sequentially while effectively retaining existing knowledge. However, naive retraining can lead to catastrophic forgetting, where new information overwrites previously learned knowledge and significantly degrades performance on past tasks. In this paper, we propose a novel approach that improves the prompt pool, a key component of prompt-based learning, in order to mitigate such forgetting. The proposed method achieves performance improvements ranging from approximately 2% to over 8% compared with conventional prompt-based learning techniques across various benchmark experiments.
KW - Catastrophic Forgetting
KW - Continual Learning
KW - Medical Artificial Intelligence
UR - https://www.scopus.com/pages/publications/105016337207
U2 - 10.1109/ITC-CSCC66376.2025.11137692
DO - 10.1109/ITC-CSCC66376.2025.11137692
M3 - Conference contribution
AN - SCOPUS:105016337207
T3 - 2025 International Technical Conference on Circuits/Systems, Computers, and Communications, ITC-CSCC 2025
BT - 2025 International Technical Conference on Circuits/Systems, Computers, and Communications, ITC-CSCC 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 7 July 2025 through 10 July 2025
ER -