ASNI: Adaptive Structured Noise Injection for shallow and deep neural networks (1909.09819v1)

Published 21 Sep 2019 in stat.ML and cs.LG

Abstract: Dropout is a regularisation technique in neural network training where unit activations are randomly set to zero with a given probability, *independently* of one another. In this work, we propose a generalisation of dropout and other multiplicative noise injection schemes for shallow and deep neural networks, where the random noise applied to different units is not independent but follows a joint distribution that is either fixed or estimated during training. We provide theoretical insights into why such adaptive structured noise injection (ASNI) may be relevant, and empirically confirm that it helps boost the accuracy of simple feedforward and convolutional neural networks, disentangles the hidden layer representations, and leads to sparser representations. Our proposed method is a straightforward modification of classical dropout and incurs no additional computational overhead.
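
The abstract does not spell out the exact noise distribution or the adaptation rule, so the following PyTorch sketch only illustrates the core idea under assumptions: mean-one Gaussian multiplicative noise on a fully connected layer, with correlations across units induced by a fixed, user-supplied correlation matrix. The class name `StructuredNoiseInjection` and the parameters `corr` and `std` are hypothetical, not taken from the paper.

```python
import torch


class StructuredNoiseInjection(torch.nn.Module):
    """Multiplicative noise with correlated components across units.

    Standard dropout draws an independent mask per unit; here the noise
    vector is drawn from a joint Gaussian whose correlation matrix
    couples the units (a sketch of the idea, not the paper's method).
    """

    def __init__(self, corr: torch.Tensor, std: float = 0.5):
        super().__init__()
        # Cholesky factor maps i.i.d. Gaussian samples to correlated ones.
        self.register_buffer("chol", torch.linalg.cholesky(corr))
        self.std = std

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # like dropout, noise is applied only during training
        z = torch.randn(x.shape[0], x.shape[1], device=x.device, dtype=x.dtype)
        # Mean-one multiplicative noise leaves the expected activation unchanged.
        noise = 1.0 + self.std * (z @ self.chol.T)
        return x * noise


# Example: 64 units sharing a mild positive correlation of 0.3.
n = 64
corr = 0.3 * torch.ones(n, n) + 0.7 * torch.eye(n)  # positive definite
layer = StructuredNoiseInjection(corr)
h = layer(torch.randn(8, n))  # noisy activations, shape (8, 64)
```

Setting `corr` to the identity recovers an independent (dropout-like) multiplicative noise, which is why a joint distribution over the noise vector can be seen as a generalisation of classical dropout.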

Authors (3)
  1. Beyrem Khalfaoui (2 papers)
  2. Joseph Boyd (5 papers)
  3. Jean-Philippe Vert (41 papers)
Citations (3)
