Stochastic Classifiers for Unsupervised Domain Adaptation

  • Zhihe Lu
  • Yongxin Yang
  • Xiatian Zhu
  • Cong Liu
  • Yi Zhe Song
  • Tao Xiang

Research output: Contribution to journal › Conference article › peer-review

159 Citations (Scopus)

Abstract

A common strategy adopted by existing state-of-the-art unsupervised domain adaptation (UDA) methods is to employ two classifiers to identify the misaligned local regions between the source and target domains. Following the 'wisdom of the crowd' principle, one has to ask: why stop at two? Indeed, we find that using more classifiers leads to better performance, but also introduces more model parameters, therefore risking overfitting. In this paper, we introduce a novel method called STochastic clAssifieRs (STAR) for addressing this problem. Instead of representing one classifier as a weight vector, STAR models it as a Gaussian distribution with its variance representing the inter-classifier discrepancy. With STAR, we can now sample an arbitrary number of classifiers from the distribution, whilst keeping the model size the same as having two classifiers. Extensive experiments demonstrate that a variety of existing UDA methods can greatly benefit from STAR and achieve state-of-the-art performance on both image classification and semantic segmentation tasks.
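The core idea in the abstract — parameterising a classifier as a Gaussian over its weights, then sampling as many classifiers as desired at no extra parameter cost — can be sketched as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation; the class name, dimensions, and the diagonal-Gaussian-with-reparameterisation form are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

class StochasticClassifier:
    """Linear classifier whose weights follow a diagonal Gaussian.

    Stores a mean and a (log) std per weight, i.e. the same parameter
    footprint as two deterministic classifiers, yet any number of
    classifiers can be drawn from the distribution.
    """
    def __init__(self, in_dim, n_classes):
        self.mu = rng.standard_normal((in_dim, n_classes)) * 0.01
        self.log_sigma = np.full((in_dim, n_classes), -2.0)

    def sample_logits(self, x, n_samples):
        # Reparameterisation trick: w = mu + sigma * eps, eps ~ N(0, I),
        # so gradients could flow to mu and log_sigma during training.
        sigma = np.exp(self.log_sigma)
        eps = rng.standard_normal((n_samples,) + self.mu.shape)
        weights = self.mu + sigma * eps        # (n_samples, in_dim, n_classes)
        # One set of logits per sampled classifier.
        return np.einsum('bd,sdc->sbc', x, weights)

clf = StochasticClassifier(in_dim=8, n_classes=3)
x = rng.standard_normal((4, 8))            # a batch of 4 feature vectors
logits = clf.sample_logits(x, n_samples=10)  # 10 sampled classifiers
print(logits.shape)                          # (10, 4, 3)
```

In a UDA setting, the disagreement among the sampled logits on target-domain inputs would serve as the inter-classifier discrepancy signal, with the learned variance capturing how much the sampled classifiers differ.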

Original language: English
Article number: 9157742
Pages (from-to): 9108-9117
Number of pages: 10
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
DOIs
Publication status: Published - 19 Jun 2020
Externally published: Yes
Event: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2020 - Virtual, Online, United States
Duration: 14 Jun 2020 – 19 Jun 2020
