TY - GEN
T1 - Diversity-Aware Channel Pruning for StyleGAN Compression
AU - Chung, Jiwoo
AU - Hyun, Sangeek
AU - Shim, Sang Heon
AU - Heo, Jae Pil
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - StyleGAN has shown remarkable performance in unconditional image generation. However, its high computational cost poses a significant challenge for practical applications. Although recent efforts have been made to compress StyleGAN while preserving its performance, existing compressed models still lag behind the original model, particularly in terms of sample diversity. To overcome this, we propose a novel channel pruning method that leverages varying sensitivities of channels to latent vectors, which is a key factor in sample diversity. Specifically, by assessing channel importance based on their sensitivities to latent vector perturbations, our method enhances the diversity of samples in the compressed model. Since our method solely focuses on the channel pruning stage, it has complementary benefits with prior training schemes without additional training cost. Extensive experiments demonstrate that our method significantly enhances sample diversity across various datasets. Moreover, in terms of FID scores, our method not only surpasses state-of-the-art by a large margin but also achieves comparable scores with only half training iterations. Codes are available at github.com/jiwoogit/DCP-GAN.
AB - StyleGAN has shown remarkable performance in unconditional image generation. However, its high computational cost poses a significant challenge for practical applications. Although recent efforts have been made to compress StyleGAN while preserving its performance, existing compressed models still lag behind the original model, particularly in terms of sample diversity. To overcome this, we propose a novel channel pruning method that leverages varying sensitivities of channels to latent vectors, which is a key factor in sample diversity. Specifically, by assessing channel importance based on their sensitivities to latent vector perturbations, our method enhances the diversity of samples in the compressed model. Since our method solely focuses on the channel pruning stage, it has complementary benefits with prior training schemes without additional training cost. Extensive experiments demonstrate that our method significantly enhances sample diversity across various datasets. Moreover, in terms of FID scores, our method not only surpasses state-of-the-art by a large margin but also achieves comparable scores with only half training iterations. Codes are available at github.com/jiwoogit/DCP-GAN.
KW - Channel-Pruning
KW - Generative models
UR - https://www.scopus.com/pages/publications/85207274836
U2 - 10.1109/CVPR52733.2024.00755
DO - 10.1109/CVPR52733.2024.00755
M3 - Conference contribution
AN - SCOPUS:85207274836
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 7902
EP - 7911
BT - Proceedings - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
PB - IEEE Computer Society
T2 - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
Y2 - 16 June 2024 through 22 June 2024
ER -