Boosting Binary Masks for Multi-Domain Learning through Affine Transformations (2103.13894v1)

Published 25 Mar 2021 in cs.CV

Abstract: In this work, we present a new algorithm for multi-domain learning. Given a pretrained architecture and a set of visual domains received sequentially, the goal of multi-domain learning is to produce a single model performing a task in all the domains together. Recent works showed how we can address this problem by masking the internal weights of a given original conv-net through learned binary variables. In this work, we provide a general formulation of binary mask based models for multi-domain learning through affine transformations of the original network parameters. Our formulation obtains significantly higher levels of adaptation to new domains, achieving performance comparable to domain-specific models while requiring slightly more than 1 bit per network parameter per additional domain. Experiments on two popular benchmarks showcase the power of our approach, achieving performance close to that of state-of-the-art methods on the Visual Decathlon Challenge.
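
The core idea generalizes per-domain binary masking of a frozen backbone: instead of simply gating each pretrained weight with a learned binary variable, the domain-specific weights are an affine function of the originals driven by the mask. Below is a minimal PyTorch sketch assuming a transformation of the form W~ = k0*W + k1*(W * M) + k2*M, with a straight-through estimator to train the binary mask M. The names (AffineMaskedConv2d, BinarizeSTE, mask_logits, k) and the exact parameterization are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """Hard-threshold in the forward pass; identity gradient in the backward
    pass (straight-through estimator), so the real-valued mask logits remain
    trainable."""
    @staticmethod
    def forward(ctx, logits):
        return (logits >= 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

class AffineMaskedConv2d(nn.Module):
    """Conv layer whose frozen pretrained weight W is adapted per domain as
    W_tilde = k0 * W + k1 * (W * M) + k2 * M, with M a learned binary mask.
    Hypothetical parameterization illustrating affine-transformed masking."""
    def __init__(self, pretrained_conv: nn.Conv2d):
        super().__init__()
        # Freeze the original weights; only the mask and affine scalars train.
        self.weight = nn.Parameter(pretrained_conv.weight.data.clone(),
                                   requires_grad=False)
        self.bias = (nn.Parameter(pretrained_conv.bias.data.clone(),
                                  requires_grad=False)
                     if pretrained_conv.bias is not None else None)
        self.stride = pretrained_conv.stride
        self.padding = pretrained_conv.padding
        self.dilation = pretrained_conv.dilation
        self.groups = pretrained_conv.groups
        # Real-valued mask logits, binarized on the fly: roughly 1 bit per
        # parameter per domain at deployment, plus three scalars.
        self.mask_logits = nn.Parameter(1e-3 * torch.randn_like(self.weight))
        self.k = nn.Parameter(torch.tensor([1.0, 0.0, 0.0]))  # k0, k1, k2

    def forward(self, x):
        m = BinarizeSTE.apply(self.mask_logits)
        w = (self.k[0] * self.weight
             + self.k[1] * (self.weight * m)
             + self.k[2] * m)
        return F.conv2d(x, w, self.bias, self.stride, self.padding,
                        self.dilation, self.groups)

# Usage sketch: adapt one layer of a pretrained backbone for a new domain.
base = nn.Conv2d(3, 16, kernel_size=3, padding=1)
layer = AffineMaskedConv2d(base)
out = layer(torch.randn(2, 3, 32, 32))  # only mask_logits and k get gradients
```

Setting k = [0, 1, 0] recovers plain binary masking (W * M), which is why the affine form is a strict generalization: the extra scalars let each domain rescale and shift the frozen weights rather than only zeroing them out.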

Authors (4)
  1. Massimiliano Mancini (66 papers)
  2. Elisa Ricci (137 papers)
  3. Barbara Caputo (105 papers)
  4. Samuel Rota Buló (3 papers)
Citations (7)
