TY - GEN
T1 - CYFLOD
T2 - 2025 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2025
AU - Gilal, Nauman Ullah
AU - Al Thelaya, Khaled
AU - Majeed, Fahad
AU - Lu, Zhihe
AU - Boughorbel, Sabri
AU - Schneider, Jens
AU - Agus, Marco
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - We address the challenge of Learning with Noisy Labels (LNL) in fine-grained data sets, a domain exhibiting significant inter-class overlap. Conventional LNL methods fall short in this context. We propose a simple and effective dual-stage approach that can be integrated into any standard transfer learning framework: i) a cyclical iterative filtering scheme in the learning process, and ii) cyclical loss damping using a SmoothStep function that can be incorporated into any loss function. The proposed integrated scheme iteratively removes noisy labels, enhances data quality, and boosts model generalization. We evaluate our dual-stage solution across diverse data sets, including Stanford Cars and Aircraft for fine-grained categorization, CIFAR-10 as a generic benchmark, and the real-world noise-afflicted Food-101N data set. We conduct our experiments under various noise models, including both symmetric and asymmetric conditions. Our method demonstrates a marked improvement in performance, showcasing its potential in fine-grained classification tasks with noisy labels.
AB - We address the challenge of Learning with Noisy Labels (LNL) in fine-grained data sets, a domain exhibiting significant inter-class overlap. Conventional LNL methods fall short in this context. We propose a simple and effective dual-stage approach that can be integrated into any standard transfer learning framework: i) a cyclical iterative filtering scheme in the learning process, and ii) cyclical loss damping using a SmoothStep function that can be incorporated into any loss function. The proposed integrated scheme iteratively removes noisy labels, enhances data quality, and boosts model generalization. We evaluate our dual-stage solution across diverse data sets, including Stanford Cars and Aircraft for fine-grained categorization, CIFAR-10 as a generic benchmark, and the real-world noise-afflicted Food-101N data set. We conduct our experiments under various noise models, including both symmetric and asymmetric conditions. Our method demonstrates a marked improvement in performance, showcasing its potential in fine-grained classification tasks with noisy labels.
KW - fine-grained classification
KW - noisy labels learning
KW - cyclical schedules
KW - robust loss
UR - https://www.scopus.com/pages/publications/105017845052
U2 - 10.1109/CVPRW67362.2025.00194
DO - 10.1109/CVPRW67362.2025.00194
M3 - Conference contribution
AN - SCOPUS:105017845052
T3 - IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
SP - 2059
EP - 2069
BT - Proceedings - 2025 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2025
PB - IEEE Computer Society
Y2 - 11 June 2025 through 12 June 2025
ER -