Turkish Journal of Electrical Engineering and Computer Sciences

DOI: 10.3906/elk-1503-245

Abstract

Learning invariant features across domains is of vital importance to unsupervised domain adaptation, where classifiers trained on labeled training examples (the source domain) must adapt to a different set of test examples (the target domain) for which no labels are available. In this paper, we propose a novel approach that finds invariant features in the original feature space and transfers knowledge across domains. We extract invariant features of the input data with a kernel-based feature weighting approach that exploits the distribution difference between domains together with instance clustering to find the desired features. The proposed method, called kernel-based feature weighting (KFW), uses the maximum mean discrepancy to measure the difference between domains. KFW exploits condensed clusters in the reduced domains, i.e. the domains stripped of variant features, to enhance classification performance. The simultaneous use of feature weighting and instance clustering improves both adaptation and classification performance. Our approach automatically discovers the invariant features across domains and employs them to bridge the source and target domains. We demonstrate the effectiveness of our approach on both artificial and real-world datasets. Empirical results show that the proposed method outperforms state-of-the-art methods on standard transfer learning benchmark datasets.
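The abstract measures the distribution difference between domains with the maximum mean discrepancy (MMD). As an illustration only (not the authors' KFW implementation), a minimal sketch of the squared empirical MMD with an RBF kernel, assuming NumPy arrays of source and target samples:

```python
import numpy as np

def mmd2(X, Y, gamma=1.0):
    """Squared empirical MMD between samples X (n, d) and Y (m, d),
    using an RBF kernel k(a, b) = exp(-gamma * ||a - b||^2).

    MMD^2 = mean k(x, x') + mean k(y, y') - 2 * mean k(x, y);
    it is near zero when the two samples come from the same distribution.
    """
    def rbf(A, B):
        # Pairwise squared Euclidean distances via the expansion
        # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
        sq = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        return np.exp(-gamma * sq)

    return rbf(X, X).mean() + rbf(Y, Y).mean() - 2.0 * rbf(X, Y).mean()
```

In a feature-weighting scheme such as the one described, a quantity like this would be evaluated on candidate (weighted) feature subsets, preferring those that make the source and target distributions close; the function name and `gamma` parameter here are illustrative assumptions.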

First Page: 292

Last Page: 307
