
Close Yet Distinctive Domain Adaptation

Abstract · Apr 13, 2017 08:30

Tags: domain, target, transfer, shanghai, discrepancy, label, source, adaptation, domains, cs-lg, cs-cv, stat-ml

arXiv Abstract

  • Lingkun Luo
  • Xiaofang Wang
  • Shiqiang Hu
  • Chao Wang
  • Yuxing Tang
  • Liming Chen

Domain adaptation is a form of transfer learning that aims to generalize a learning model across training and testing data with different distributions. Most previous research tackles this problem by seeking a shared feature representation between the source and target domains while reducing the mismatch of their data distributions. In this paper, we propose a close yet discriminative domain adaptation method, namely CDDA, which generates a latent feature representation with two interesting properties. First, the discrepancy between the source and target domains, measured in terms of both the marginal and conditional probability distributions via Maximum Mean Discrepancy, is minimized so as to draw the two domains close to each other. More importantly, we also design a repulsive force term, which maximizes the distance between each label-dependent sub-domain and all the others so as to push the different class-dependent sub-domains far away from each other and thereby increase the discriminative power of the adapted domain. Moreover, given that the underlying data manifold could have a complex geometric structure, we further propose constraints of label smoothness and geometric structure consistency for label propagation. Extensive experiments are conducted on 36 cross-domain image classification tasks over four public datasets. The comprehensive results show that the proposed method consistently outperforms state-of-the-art methods by significant margins.
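The Maximum Mean Discrepancy (MMD) at the heart of the abstract compares two sample sets via their mean embeddings in a kernel space. A minimal NumPy sketch of the (biased) squared MMD with an RBF kernel follows; the function names, the `gamma` bandwidth, and the toy Gaussian data are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel: k(a, b) = exp(-gamma * ||a - b||^2)
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def mmd2(Xs, Xt, gamma=1.0):
    # Biased estimate of squared MMD between source samples Xs and target samples Xt:
    # mean k(s, s') + mean k(t, t') - 2 * mean k(s, t)
    Kss = rbf_kernel(Xs, Xs, gamma)
    Ktt = rbf_kernel(Xt, Xt, gamma)
    Kst = rbf_kernel(Xs, Xt, gamma)
    return Kss.mean() + Ktt.mean() - 2 * Kst.mean()

rng = np.random.default_rng(0)
same = mmd2(rng.normal(0, 1, (100, 5)), rng.normal(0, 1, (100, 5)))
shifted = mmd2(rng.normal(0, 1, (100, 5)), rng.normal(3, 1, (100, 5)))
print(same < shifted)  # a larger distribution shift yields a larger MMD
```

In CDDA this quantity is minimized (for both marginal and class-conditional distributions) to pull the domains together, while the repulsive force term does the opposite between sub-domains of different labels.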

Read the paper (pdf) »