Learning invariant features across domains is of vital importance to unsupervised domain adaptation, where classifiers trained on labeled training examples (the source domain) must adapt to a different set of test examples (the target domain) for which no labels are available. In this paper, we propose a novel approach that finds invariant features in the original feature space and transfers knowledge across domains. We extract invariant features of the input data with a kernel-based feature weighting approach that exploits both the distribution difference between domains and instance clustering to find the desired features. The proposed method, called the kernel-based feature weighting (KFW) approach, benefits from the maximum mean discrepancy to measure the difference between domains. KFW uses condensed clusters in the reduced domains, i.e. the domains stripped of variant features, to enhance classification performance. The simultaneous use of feature weighting and instance clustering improves both adaptation and classification performance. Our approach automatically discovers the invariant features across domains and employs them to bridge the source and target domains. We demonstrate the effectiveness of our approach on both artificial and real-world datasets. Empirical results show that the proposed method outperforms other state-of-the-art methods on standard transfer learning benchmark datasets.
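The maximum mean discrepancy (MMD) mentioned above can be illustrated with a minimal sketch. This is a generic, biased empirical estimate of the squared MMD with an RBF kernel, not the authors' KFW implementation; the function names, the `gamma` value, and the synthetic source/target samples are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between the rows of A and the rows of B."""
    # Squared Euclidean distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def mmd2(X, Y, gamma=1.0):
    """Biased empirical estimate of the squared maximum mean discrepancy."""
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 2))       # source-domain sample
tgt_near = rng.normal(0.0, 1.0, size=(200, 2))  # same distribution as source
tgt_far = rng.normal(3.0, 1.0, size=(200, 2))   # shifted distribution

print(mmd2(src, tgt_near))  # near zero: distributions match
print(mmd2(src, tgt_far))   # clearly larger: distributions differ
```

In a feature-weighting setting such as the one the abstract describes, a criterion of this form would be evaluated on reweighted features, with weights chosen to shrink the cross-domain discrepancy.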
Transfer learning, unsupervised domain adaptation, feature weighting, instance clustering, maximum mean discrepancy
Tahmoresnezhad, Jafar and Hashemi, Sattar, "Exploiting kernel-based feature weighting and instance clustering to transfer knowledge across domains," Turkish Journal of Electrical Engineering and Computer Sciences: Vol. 25, Iss. 1, Article 24.
Available at: https://journals.tubitak.gov.tr/elektrik/vol25/iss1/24