
Instance-based Deep Transfer Learning

Published 8 Sep 2018 in cs.CV (arXiv:1809.02776v2)

Abstract: Deep transfer learning has recently attracted significant research interest. It makes use of pre-trained models learned from a source domain and applies them to tasks in a target domain. Model-based deep transfer learning is probably the most frequently used method; however, very little work has been devoted to enhancing deep transfer learning by focusing on the influence of data. In this paper, we propose an instance-based approach to improve deep transfer learning in a target domain. Specifically, we choose a pre-trained model from a source domain and apply it to estimate the influence of the training samples in the target domain. We then optimize the target-domain training data by removing the training samples that would lower the performance of the pre-trained model. We subsequently either fine-tune the pre-trained model with the optimized training data, or build a new model that is partially initialized from the pre-trained model and fine-tune it with the optimized training data. Using this approach, transfer learning helps deep learning models capture more useful features. Extensive experiments demonstrate the effectiveness of our approach in boosting the quality of deep learning models on common computer vision tasks such as image classification.
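The abstract describes the pipeline only at a high level and does not say how sample influence is estimated, so the following is a minimal sketch of the two stages (filter the target-domain data with the pre-trained model, then fine-tune on what remains). It assumes PyTorch and uses each sample's loss under the pre-trained model as a stand-in for its influence; the names filter_target_data and fine_tune, the keep_ratio threshold, and all hyperparameters are illustrative assumptions, not the paper's actual estimator.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset

def filter_target_data(pretrained_model, target_dataset, keep_ratio=0.9, device="cpu"):
    """Score each target-domain sample with the pre-trained source model and
    keep the fraction with the lowest loss (assumed proxy for positive influence)."""
    pretrained_model.eval().to(device)
    criterion = nn.CrossEntropyLoss(reduction="none")
    losses = []
    loader = DataLoader(target_dataset, batch_size=64, shuffle=False)
    with torch.no_grad():
        for x, y in loader:
            logits = pretrained_model(x.to(device))
            losses.append(criterion(logits, y.to(device)).cpu())
    losses = torch.cat(losses)
    keep = int(keep_ratio * len(losses))
    kept_indices = torch.argsort(losses)[:keep].tolist()
    return Subset(target_dataset, kept_indices)

def fine_tune(model, filtered_dataset, epochs=5, lr=1e-4, device="cpu"):
    """Fine-tune a model (the pre-trained one, or a new model partially
    initialized from it) on the optimized target-domain training data."""
    model = model.to(device).train()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    loader = DataLoader(filtered_dataset, batch_size=64, shuffle=True)
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = criterion(model(x.to(device)), y.to(device))
            loss.backward()
            optimizer.step()
    return model

In use, one would call filter_target_data with the source-domain pre-trained model and the raw target-domain dataset, then pass the filtered subset to fine_tune; the loss-based proxy is only one plausible choice for the influence estimate the abstract refers to.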

Citations (34)


Authors (3)
