TY - GEN
T1 - Robust Object Detection Across Diverse Environments Using Contrastive Learning-Based Domain Adaptation
AU - Kong, Hyunmin
AU - Shin, Jitae
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - The performance of object detection in road-driving scenarios is critically important but often hindered by challenges such as occlusion, time-of-day changes (e.g., day and night), and adverse weather conditions (e.g., rain, snow, fog), which lead to significant performance degradation. To overcome these challenges, this paper introduces a contrastive learning-based approach that integrates three specialized loss functions within a domain adaptation framework: Predicted Object Loss (POL), Missed Object Loss (MOL), and Distribution Alignment Loss (DAL). The proposed method generates noisy datasets from clean images, pairs the clean and noisy images, and uses these pairs as model input, focusing on rain and night-time conditions during training. POL aligns the feature distributions of predicted objects between clean and noisy images, while MOL adjusts the features of objects missed in the noisy images to align with those in the clean images. Additionally, the model is trained to enhance object-background separation by increasing the distance between object features and background features in noisy images. DAL further reduces the domain gap by minimizing feature distribution discrepancies between the noisy and clean datasets. This approach not only reduces domain discrepancies but also improves detection accuracy, robustness in noisy environments, and generalization across diverse conditions. Experiments on the KITTI dataset demonstrate the effectiveness of the method, with notable improvements in mAP across various noise conditions synthesized from clean images, including rain, night-time, fog, and snow.
AB - The performance of object detection in road-driving scenarios is critically important but often hindered by challenges such as occlusion, time-of-day changes (e.g., day and night), and adverse weather conditions (e.g., rain, snow, fog), which lead to significant performance degradation. To overcome these challenges, this paper introduces a contrastive learning-based approach that integrates three specialized loss functions within a domain adaptation framework: Predicted Object Loss (POL), Missed Object Loss (MOL), and Distribution Alignment Loss (DAL). The proposed method generates noisy datasets from clean images, pairs the clean and noisy images, and uses these pairs as model input, focusing on rain and night-time conditions during training. POL aligns the feature distributions of predicted objects between clean and noisy images, while MOL adjusts the features of objects missed in the noisy images to align with those in the clean images. Additionally, the model is trained to enhance object-background separation by increasing the distance between object features and background features in noisy images. DAL further reduces the domain gap by minimizing feature distribution discrepancies between the noisy and clean datasets. This approach not only reduces domain discrepancies but also improves detection accuracy, robustness in noisy environments, and generalization across diverse conditions. Experiments on the KITTI dataset demonstrate the effectiveness of the method, with notable improvements in mAP across various noise conditions synthesized from clean images, including rain, night-time, fog, and snow.
KW - common feature
KW - contrastive learning
KW - robust
UR - https://www.scopus.com/pages/publications/85214870051
U2 - 10.1109/ICCE-Asia63397.2024.10773828
DO - 10.1109/ICCE-Asia63397.2024.10773828
M3 - Conference contribution
AN - SCOPUS:85214870051
T3 - 2024 IEEE International Conference on Consumer Electronics-Asia, ICCE-Asia 2024
BT - 2024 IEEE International Conference on Consumer Electronics-Asia, ICCE-Asia 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE International Conference on Consumer Electronics-Asia, ICCE-Asia 2024
Y2 - 3 November 2024 through 6 November 2024
ER -