
Domain Mixture: An Overlooked Scenario in Domain Adaptation

Sebastian Schrom and Stephan Hasler, "Domain Mixture: An Overlooked Scenario in Domain Adaptation", IEEE International Conference on Machine Learning and Applications (ICMLA), 2019.

Abstract

An image-based object classification system trained on one domain usually shows decreased performance when transferred to other domains at test time if the corresponding data distributions differ significantly. Various domain adaptation approaches exist that improve generalization from a source to a target domain. However, these approaches only consider the transfer case where supervised samples of all competing classes are available from at least one domain. In this paper we investigate a so far overlooked scenario in domain adaptation, where during training only a subset of all competing classes is shown in one domain and another subset of all competing classes in another domain. We show the tendency of a vanilla deep learning classifier to use the domain origin as an additional feature, which results in poor performance when testing on samples of all classes in both domains. To overcome this issue we show how an existing domain adaptation method can be used, and we extensively evaluate and discuss first results of this overlooked scenario on a modified MNIST benchmark.
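To make the training setup concrete, below is a minimal sketch of a domain-mixture split in the spirit the abstract describes, assuming PyTorch/torchvision and using intensity-inverted MNIST as a stand-in second domain. The class split (digits 0-4 vs. 5-9) and the inversion are illustrative assumptions, not the paper's exact benchmark modification.

```python
# Sketch only: a domain-mixture training split on MNIST.
# Domain A: plain MNIST restricted to digits 0-4.
# Domain B: a modified MNIST (assumed here: inverted intensities) restricted
# to digits 5-9. A vanilla classifier trained on this mixture can exploit
# the domain origin (plain vs. inverted) as a shortcut feature.
import torch
from torchvision import datasets, transforms

mnist = datasets.MNIST(root="./data", train=True, download=True,
                       transform=transforms.ToTensor())

def indices_with_labels(dataset, classes):
    """Indices of samples whose label falls in `classes`."""
    return [i for i, y in enumerate(dataset.targets.tolist()) if y in classes]

idx_a = indices_with_labels(mnist, classes={0, 1, 2, 3, 4})
idx_b = indices_with_labels(mnist, classes={5, 6, 7, 8, 9})

class DomainSubset(torch.utils.data.Dataset):
    """A subset of the base dataset, optionally intensity-inverted."""
    def __init__(self, base, indices, invert):
        self.base, self.indices, self.invert = base, indices, invert
    def __len__(self):
        return len(self.indices)
    def __getitem__(self, i):
        x, y = self.base[self.indices[i]]
        return (1.0 - x if self.invert else x), y

train_set = torch.utils.data.ConcatDataset([
    DomainSubset(mnist, idx_a, invert=False),  # domain A: digits 0-4
    DomainSubset(mnist, idx_b, invert=True),   # domain B: digits 5-9
])
# At test time all ten classes appear in both domains, so a model that has
# learned "inverted image => label in {5..9}" fails on inverted digits 0-4.
```

The final comment states the failure mode the paper highlights: because class membership and domain origin are perfectly correlated during training, nothing discourages the classifier from using the domain as a feature.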


