Decoupled Networks (1804.08071v1)

Published 22 Apr 2018 in cs.CV, cs.LG, and stat.ML

Abstract: Inner product-based convolution has been a central component of convolutional neural networks (CNNs) and the key to learning visual representations. Inspired by the observation that CNN-learned features are naturally decoupled with the norm of features corresponding to the intra-class variation and the angle corresponding to the semantic difference, we propose a generic decoupled learning framework which models the intra-class variation and semantic difference independently. Specifically, we first reparametrize the inner product to a decoupled form and then generalize it to the decoupled convolution operator which serves as the building block of our decoupled networks. We present several effective instances of the decoupled convolution operator. Each decoupled operator is well motivated and has an intuitive geometric interpretation. Based on these decoupled operators, we further propose to directly learn the operator from data. Extensive experiments show that such decoupled reparameterization renders significant performance gain with easier convergence and stronger robustness.
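The core reparameterization described in the abstract can be sketched as follows. This is an illustrative NumPy sketch, not the paper's exact operator definitions: the standard response `<w, x> = ||w|| ||x|| cos(θ)` is split into a magnitude function `h(||w||, ||x||)` (modeling intra-class variation) and an angular function `g(θ)` (modeling semantic difference), which are then chosen independently. The particular instances `spherical_h` and `cos_g` below are hypothetical examples, not the named operators from the paper.

```python
import numpy as np

def standard_inner_product(w, x):
    # Conventional convolution response: <w, x> = ||w|| ||x|| cos(theta)
    return np.dot(w, x)

def decoupled_operator(w, x, h, g):
    """Decoupled form f(w, x) = h(||w||, ||x||) * g(theta).

    h models intra-class variation via the feature/filter magnitudes;
    g models semantic difference via the angle between w and x.
    """
    norm_w = np.linalg.norm(w)
    norm_x = np.linalg.norm(x)
    cos_theta = np.dot(w, x) / (norm_w * norm_x + 1e-8)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    return h(norm_w, norm_x) * g(theta)

# Illustrative instances (assumed for this sketch):
# a magnitude function that ignores norms entirely (projects onto the
# unit hypersphere), and a cosine angular activation.
spherical_h = lambda nw, nx: 1.0
cos_g = lambda theta: np.cos(theta)

# Choosing h(||w||, ||x||) = ||w|| * ||x|| with g = cos recovers the
# ordinary inner product, showing the reparameterization is a strict
# generalization.
full_h = lambda nw, nx: nw * nx

w = np.array([1.0, 2.0, 3.0])
x = np.array([3.0, 2.0, 1.0])
standard = standard_inner_product(w, x)
decoupled = decoupled_operator(w, x, full_h, cos_g)
```

Because `h` and `g` are free to vary independently, different choices yield different operator instances, and the paper further proposes learning the operator itself from data.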

Authors (8)
  1. Weiyang Liu
  2. Zhen Liu
  3. Zhiding Yu
  4. Bo Dai
  5. Rongmei Lin
  6. Yisen Wang
  7. James M. Rehg
  8. Le Song
Citations (64)
