Improved Generalization of Weight Space Networks via Augmentations (2402.04081v2)

Published 6 Feb 2024 in cs.LG and cs.AI

Abstract: Learning in deep weight spaces (DWS), where neural networks process the weights of other neural networks, is an emerging research direction, with applications to 2D and 3D neural fields (INRs, NeRFs), as well as making inferences about other types of neural networks. Unfortunately, weight space models tend to suffer from substantial overfitting. We empirically analyze the reasons for this overfitting and find that a key reason is the lack of diversity in DWS datasets. While a given object can be represented by many different weight configurations, typical INR training sets fail to capture variability across INRs that represent the same object. To address this, we explore strategies for data augmentation in weight spaces and propose a MixUp method adapted for weight spaces. We demonstrate the effectiveness of these methods in two setups. In classification, they improve performance similarly to having up to 10 times more data. In self-supervised contrastive learning, they yield substantial 5-10% gains in downstream classification.
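To make the augmentation idea concrete, below is a minimal sketch of vanilla MixUp applied to flattened weight vectors and one-hot labels. This is a generic illustration under assumed names (`weight_space_mixup`, flattened `w1`/`w2` inputs), not the paper's exact method, which adapts MixUp to the structure and symmetries of weight spaces.

```python
import numpy as np

def weight_space_mixup(w1, w2, y1, y2, alpha=0.2, rng=None):
    """Vanilla MixUp on two flattened weight vectors and their labels.

    Draws a mixing coefficient lam ~ Beta(alpha, alpha) and returns the
    convex combinations lam*w1 + (1-lam)*w2 and lam*y1 + (1-lam)*y2.
    Returned lam is included for inspection. (Hypothetical helper;
    the paper's weight-space variants are more structured than this.)
    """
    rng = rng or np.random.default_rng()
    lam = float(rng.beta(alpha, alpha))
    w = lam * w1 + (1.0 - lam) * w2
    y = lam * y1 + (1.0 - lam) * y2
    return w, y, lam
```

Because many distinct weight configurations represent the same object, naively interpolating unaligned weights can land far from any "natural" network; this sketch only conveys the basic mixing step.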

Authors (7)
  1. Aviv Shamsian
  2. Aviv Navon
  3. Yan Zhang
  4. Ethan Fetaya
  5. Gal Chechik
  6. Haggai Maron
  7. David W. Zhang
Citations (5)
