TY - GEN
T1 - Learning with Structural Labels for Learning with Noisy Labels
AU - Kim, Noo Ri
AU - Lee, Jin Seop
AU - Lee, Jee Hyong
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Deep Neural Networks (DNNs) have demonstrated remarkable performance across diverse domains and tasks with large-scale datasets. To reduce labeling costs for large-scale datasets, semi-automated and crowdsourcing labeling methods have been developed, but their labels are inevitably noisy. Learning with Noisy Labels (LNL) approaches aim to train DNNs despite the presence of noisy labels. These approaches utilize the memorization effect to select correct labels and refine noisy ones, which are then used for subsequent training. However, these methods encounter a significant decrease in the model's generalization performance due to the inevitably remaining noisy labels. To overcome this limitation, we propose a new approach to enhance learning with noisy labels by incorporating additional distribution information: structural labels. In order to leverage additional distribution information for generalization, we employ a reverse k-NN, which helps the model achieve a better feature manifold and mitigates over-fitting to noisy labels. The proposed method outperforms existing methods on multiple benchmark datasets with instance-dependent noise (IDN) and on real-world noisy datasets.
AB - Deep Neural Networks (DNNs) have demonstrated remarkable performance across diverse domains and tasks with large-scale datasets. To reduce labeling costs for large-scale datasets, semi-automated and crowdsourcing labeling methods have been developed, but their labels are inevitably noisy. Learning with Noisy Labels (LNL) approaches aim to train DNNs despite the presence of noisy labels. These approaches utilize the memorization effect to select correct labels and refine noisy ones, which are then used for subsequent training. However, these methods encounter a significant decrease in the model's generalization performance due to the inevitably remaining noisy labels. To overcome this limitation, we propose a new approach to enhance learning with noisy labels by incorporating additional distribution information: structural labels. In order to leverage additional distribution information for generalization, we employ a reverse k-NN, which helps the model achieve a better feature manifold and mitigates over-fitting to noisy labels. The proposed method outperforms existing methods on multiple benchmark datasets with instance-dependent noise (IDN) and on real-world noisy datasets.
KW - Deep Neural Networks
KW - Learning with Noisy Labels
KW - Structural Labels
UR - https://www.scopus.com/pages/publications/85211249553
U2 - 10.1109/CVPR52733.2024.02607
DO - 10.1109/CVPR52733.2024.02607
M3 - Conference contribution
AN - SCOPUS:85211249553
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 27600
EP - 27610
BT - Proceedings - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
PB - IEEE Computer Society
T2 - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
Y2 - 16 June 2024 through 22 June 2024
ER -