TY - GEN
T1 - Efficient Multi-Scale Feature Generation Adaptive Network
AU - Lee, Gwanghan
AU - Kim, Minha
AU - Woo, Simon S.
N1 - Publisher Copyright:
© 2021 ACM.
PY - 2021/10/30
Y1 - 2021/10/30
N2 - Recently, early exit networks, which dynamically adjust model complexity at inference time, have achieved remarkable performance and neural network efficiency across various applications. So far, most research has focused on reducing the redundancy of input samples or the model architecture. However, these approaches fail to resolve the performance drop of early classifiers, which make predictions with insufficient high-level feature information. Consequently, the degradation of early classifiers has a devastating effect on the performance of the entire network sharing the backbone. Thus, in this paper, we propose an Efficient Multi-Scale Feature Generation Adaptive Network (EMGNet), which not only reduces the redundancy of the architecture but also generates multi-scale features to improve the performance of the early exit network. Our approach renders multi-scale feature generation highly efficient by sharing weights in the center of the convolution kernel. In addition, our gating network effectively learns to automatically determine the proper multi-scale feature ratio required for each convolution layer at different locations in the network. We demonstrate that our proposed model outperforms state-of-the-art adaptive networks on the CIFAR10, CIFAR100, and ImageNet datasets. The implementation code is available at https://github.com/lee-gwang/EMGNet
AB - Recently, early exit networks, which dynamically adjust model complexity at inference time, have achieved remarkable performance and neural network efficiency across various applications. So far, most research has focused on reducing the redundancy of input samples or the model architecture. However, these approaches fail to resolve the performance drop of early classifiers, which make predictions with insufficient high-level feature information. Consequently, the degradation of early classifiers has a devastating effect on the performance of the entire network sharing the backbone. Thus, in this paper, we propose an Efficient Multi-Scale Feature Generation Adaptive Network (EMGNet), which not only reduces the redundancy of the architecture but also generates multi-scale features to improve the performance of the early exit network. Our approach renders multi-scale feature generation highly efficient by sharing weights in the center of the convolution kernel. In addition, our gating network effectively learns to automatically determine the proper multi-scale feature ratio required for each convolution layer at different locations in the network. We demonstrate that our proposed model outperforms state-of-the-art adaptive networks on the CIFAR10, CIFAR100, and ImageNet datasets. The implementation code is available at https://github.com/lee-gwang/EMGNet
KW - adaptive network
KW - dynamic neural networks
KW - early exit networks
UR - https://www.scopus.com/pages/publications/85119180357
U2 - 10.1145/3459637.3482337
DO - 10.1145/3459637.3482337
M3 - Conference contribution
AN - SCOPUS:85119180357
T3 - International Conference on Information and Knowledge Management, Proceedings
SP - 883
EP - 892
BT - CIKM 2021 - Proceedings of the 30th ACM International Conference on Information and Knowledge Management
PB - Association for Computing Machinery
T2 - 30th ACM International Conference on Information and Knowledge Management, CIKM 2021
Y2 - 1 November 2021 through 5 November 2021
ER -