An Information-Theoretic Approach to Transferability in Task Transfer Learning (2212.10082v1)

Published 20 Dec 2022 in cs.LG and cs.CV

Abstract: Task transfer learning is a popular technique in image processing applications that uses pre-trained models to reduce the supervision cost of related tasks. An important question is to determine task transferability, i.e. given a common input domain, estimating to what extent representations learned from a source task can help in learning a target task. Typically, transferability is either measured experimentally or inferred through task relatedness, which is often defined without a clear operational meaning. In this paper, we present a novel metric, H-score, an easily-computable evaluation function that estimates the performance of transferred representations from one task to another in classification problems using statistical and information theoretic principles. Experiments on real image data show that our metric is not only consistent with the empirical transferability measurement, but also useful to practitioners in applications such as source model selection and task transfer curriculum learning.
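The abstract describes H-score only at a high level. As an illustration of how such a metric can be computed, the H-score of a feature representation f for labels Y is commonly written as H(f) = tr(cov(f)⁻¹ cov(E[f(X)|Y])). The sketch below is a plain empirical estimator of that quantity; the function name `h_score` and the small ridge term added for numerical invertibility are our own choices, not details taken from the paper.

```python
import numpy as np

def h_score(features, labels):
    """Empirical H-score of features f(X) for discrete labels Y.

    Estimates tr(cov(f)^{-1} cov(E[f(X) | Y])); a higher value
    suggests the representation is more useful for the target task.
    """
    features = np.asarray(features, dtype=float)
    labels = np.asarray(labels)
    n, d = features.shape

    # Feature covariance, with a small ridge so the matrix is invertible
    # even when features are nearly collinear (our own regularization).
    cov_f = np.cov(features, rowvar=False) + 1e-8 * np.eye(d)

    # Per-class conditional means E[f(X) | Y = y] and class probabilities.
    classes = np.unique(labels)
    cond_means = np.stack([features[labels == y].mean(axis=0) for y in classes])
    probs = np.array([(labels == y).mean() for y in classes])

    # Covariance of the conditional means: sum_y P(y) (mu_y - mu)(mu_y - mu)^T.
    centered = cond_means - features.mean(axis=0)
    cov_cond = (centered * probs[:, None]).T @ centered

    return float(np.trace(np.linalg.solve(cov_f, cov_cond)))
```

With informative labels the conditional means spread apart and the score grows; with labels unrelated to the features it stays near zero, which is what makes it usable for ranking candidate source models.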

Authors (7)
  1. Yajie Bao (14 papers)
  2. Yang Li (1142 papers)
  3. Shao-Lun Huang (48 papers)
  4. Lin Zhang (342 papers)
  5. Lizhong Zheng (44 papers)
  6. Amir Zamir (28 papers)
  7. Leonidas Guibas (177 papers)
Citations (107)
