Domain Adaptation Transfer Learning by Kernel Representation Adaptation

Abstract: Domain adaptation, where no labeled target data are available, is a challenging task. To solve this problem, we first propose a new SVM-based approach with a supplementary Maximum Mean Discrepancy (MMD)-like constraint. With this heuristic, source and target data are projected onto a common subspace of a Reproducing Kernel Hilbert Space (RKHS), where both data distributions are expected to become similar. Therefore, a classifier trained on source data might perform well on target data, provided the conditional probabilities of labels are similar for source and target data, which is the main assumption of this paper. We demonstrate that adding this constraint does not change the quadratic nature of the optimization problem, so standard quadratic optimization tools can be used. Secondly, following the same idea that making source and target data similar might ensure efficient transfer learning, and under the same assumption, a Kernel Principal Component Analysis (KPCA)-based transfer learning method is proposed. Unlike the first heuristic, this second method also aligns higher-order moments in the RKHS, which leads to better performance. Here again, we select MMD as the similarity measure. A linear transformation is then applied to further improve the alignment between source and target data. We finally compare both methods with other transfer learning methods from the literature to show their efficiency on synthetic and real datasets.
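The MMD criterion used as the similarity measure by both methods admits a simple empirical estimator: it compares the means of the two samples after mapping them into the RKHS induced by a kernel. The sketch below is illustrative only, not the authors' implementation; the Gaussian kernel choice, its bandwidth `sigma`, and the function names are assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    # Assumed kernel choice; the bandwidth sigma is a free parameter.
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2(Xs, Xt, sigma=1.0):
    # Biased empirical estimate of the squared MMD between the source sample Xs
    # and the target sample Xt in the RKHS of the chosen kernel:
    # ||mean_phi(Xs) - mean_phi(Xt)||^2 = E[k(s,s')] + E[k(t,t')] - 2 E[k(s,t)].
    Kss = gaussian_kernel(Xs, Xs, sigma)
    Ktt = gaussian_kernel(Xt, Xt, sigma)
    Kst = gaussian_kernel(Xs, Xt, sigma)
    return Kss.mean() + Ktt.mean() - 2.0 * Kst.mean()
```

Under this estimator, two samples drawn from the same distribution yield a value near zero, while a distribution shift between source and target inflates it; the constraint added to the SVM and the KPCA alignment both aim to drive this quantity down.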
Document type :
Book sections
Contributor: Jean-Baptiste VU VAN
Submitted on: Tuesday, June 9, 2020 - 12:05:33 PM
Last modification on: Wednesday, August 31, 2022 - 6:56:19 PM

Xiaoyi Chen, Régis Lengellé. Domain Adaptation Transfer Learning by Kernel Representation Adaptation. In: De Marsico M., di Baja G., Fred A. (eds.), Lecture Notes in Computer Science, vol. 10857, pp. 45-61, 2018. ⟨10.1007/978-3-319-93647-5_3⟩. ⟨hal-02862063⟩