Abstract
Neural networks suffer performance degradation when the source and target data lie in different distributions, hampering direct deployment of the model to diverse target domains. To this end, domain generalization (DG) aims to generalize the model well to an unknown target domain by utilizing multiple source domains. This paper proposes two simple swapping mechanisms, texture and channel swapping (TCX), for DG. Texture swapping augments the source dataset by replacing the textures of an image with other textures from the source dataset, alleviating the texture bias problem in convolutional neural networks (CNNs). Furthermore, channel swapping switches channels of the classifier's input feature vectors along with their labels, encouraging the model to utilize more channels when classifying an image. Together, we expect the model to learn fewer domain-specific features and more generalized class-specific features, resulting in better domain generalization performance. We demonstrate the effectiveness of our approach with state-of-the-art results on three domain generalization benchmarks.
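To make the channel-swapping idea concrete, the sketch below swaps a random subset of channels between two samples' feature vectors and mixes their one-hot labels in proportion to the fraction of channels swapped. This is only an illustrative reading of the abstract (the function name `channel_swap`, the `swap_ratio` parameter, and the label-mixing rule are assumptions, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_swap(feat_a, feat_b, label_a, label_b, swap_ratio=0.25):
    """Swap a random subset of channels between two feature vectors and
    mix their one-hot labels by the fraction of channels kept.
    Illustrative sketch based only on the abstract; the paper's exact
    rule may differ."""
    c = feat_a.shape[0]
    n_swap = int(round(swap_ratio * c))
    idx = rng.choice(c, size=n_swap, replace=False)
    swapped = feat_a.copy()
    swapped[idx] = feat_b[idx]          # take n_swap channels from sample b
    lam = 1.0 - n_swap / c              # fraction of channels kept from a
    mixed_label = lam * label_a + (1.0 - lam) * label_b
    return swapped, mixed_label

# Toy usage: 8-channel features, 3 classes.
fa, fb = np.ones(8), np.zeros(8)
la = np.array([1.0, 0.0, 0.0])
lb = np.array([0.0, 1.0, 0.0])
feat, lab = channel_swap(fa, fb, la, lb, swap_ratio=0.25)
# 2 of 8 channels come from fb, so the label mixes as 0.75/0.25.
```

Because the label is softened toward the sample whose channels were borrowed, the classifier cannot rely on a few dominant channels and is pushed to spread class evidence across more of them, which is the regularization effect the abstract describes.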
| Original language | English |
|---|---|
| Pages (from-to) | 74-80 |
| Number of pages | 7 |
| Journal | Pattern Recognition Letters |
| Volume | 175 |
| DOIs | |
| State | Published - Nov 2023 |
Keywords
- Deep learning
- Domain generalization
- Image classification
- Model regularization