LocalDrop: A Hybrid Regularization for Deep Neural Networks (2103.00719v1)

Published 1 Mar 2021 in cs.LG, cs.AI, and stat.ML

Abstract: Developing regularization algorithms to mitigate overfitting is a major research area in neural networks. We propose LocalDrop, a new approach to regularizing neural networks based on the local Rademacher complexity. A new regularization function for both fully-connected networks (FCNs) and convolutional neural networks (CNNs), involving drop rates and weight matrices, is developed from a rigorously derived upper bound on the local Rademacher complexity. The complexity analyses also cover dropout in FCNs and DropBlock in CNNs with keep-rate matrices in different layers. With the new regularization function, we establish a two-stage procedure that obtains the optimal keep-rate matrix and weight matrix for the whole training model. Extensive experiments demonstrate the effectiveness of LocalDrop across different models, comparing it with several algorithms and examining the effects of different hyperparameters on final performance.
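For context on the keep rates the abstract refers to, the sketch below shows standard inverted dropout with a per-unit keep-rate array. This is only an illustrative analogue, not the paper's LocalDrop regularizer (which additionally penalizes the network via a local Rademacher complexity bound); the function name and shapes are assumptions for the example.

```python
import numpy as np

def dropout_forward(x, keep_rate, rng):
    """Inverted dropout: each unit is kept with probability keep_rate.

    keep_rate may be a scalar or an array broadcastable to x.shape,
    loosely mirroring the layer-wise keep-rate matrices in the abstract.
    """
    mask = rng.random(x.shape) < keep_rate
    # Scale by 1/keep_rate so the expected activation is unchanged.
    return x * mask / keep_rate

rng = np.random.default_rng(0)
x = np.ones((4, 8))
out = dropout_forward(x, keep_rate=0.8, rng=rng)
print(out.shape)
```

At test time, dropout is disabled; the 1/keep_rate scaling during training is what keeps the expected activations matched between the two phases.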

Authors (6)
  1. Ziqing Lu (7 papers)
  2. Chang Xu (325 papers)
  3. Bo Du (264 papers)
  4. Takashi Ishida (11 papers)
  5. Lefei Zhang (64 papers)
  6. Masashi Sugiyama (286 papers)
Citations (14)
