Introducing Noise in Decentralized Training of Neural Networks (1809.10678v1)

Published 27 Sep 2018 in cs.LG and stat.ML

Abstract: It has been shown that injecting noise into neural network weights during training leads to better generalization of the resulting model. Noise injection in the distributed setup is a straightforward technique and represents a promising approach to improving locally trained models. We investigate the effects of noise injection into neural networks during a decentralized training process. We show both theoretically and empirically that noise injection has no positive effect in expectation on linear models. However, for non-linear neural networks we show empirically that noise injection substantially improves model quality, helping local models reach a generalization ability close to the serial baseline.
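
As a concrete illustration of the technique the abstract describes, below is a minimal PyTorch sketch (not the authors' implementation) of Gaussian weight-noise injection in a decentralized training loop with periodic model averaging. The network architecture, noise scale `sigma`, averaging period, optimizer settings, and synthetic data are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (illustrative, not the paper's code): Gaussian weight-noise
# injection during decentralized training with periodic model averaging.
import copy
import torch
import torch.nn as nn

def inject_weight_noise(model: nn.Module, sigma: float) -> None:
    """Add zero-mean Gaussian noise to every parameter tensor in-place."""
    with torch.no_grad():
        for p in model.parameters():
            p.add_(torch.randn_like(p) * sigma)

def average_models(models: list[nn.Module]) -> None:
    """Overwrite every worker's parameters with their element-wise mean."""
    with torch.no_grad():
        for params in zip(*(m.parameters() for m in models)):
            mean = torch.stack([p.data for p in params]).mean(dim=0)
            for p in params:
                p.data.copy_(mean)

# Toy setup: each worker trains its own copy of a small non-linear network.
torch.manual_seed(0)
n_workers, sigma, avg_every = 4, 1e-3, 10  # assumed hyperparameters
base = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
workers = [copy.deepcopy(base) for _ in range(n_workers)]
opts = [torch.optim.SGD(w.parameters(), lr=0.05) for w in workers]
loss_fn = nn.MSELoss()

for step in range(100):
    for w, opt in zip(workers, opts):
        x = torch.randn(32, 8)            # each worker sees its own batch
        y = x.sum(dim=1, keepdim=True)    # synthetic regression target
        opt.zero_grad()
        loss_fn(w(x), y).backward()
        opt.step()
        inject_weight_noise(w, sigma)     # perturb weights after the update
    if (step + 1) % avg_every == 0:
        average_models(workers)           # periodic decentralized averaging
```

Injecting the noise after each gradient step keeps the perturbation zero-mean and independent of the current gradient, which matches the abstract's observation that for linear models the injected noise has no effect in expectation; any benefit shown here would come from the non-linearity of the network.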

Authors (3)
  1. Linara Adilova (16 papers)
  2. Nathalie Paul (3 papers)
  3. Peter Schlicht (22 papers)
Citations (5)
