Regularization for convolutional kernel tensors to avoid unstable gradient problem in convolutional neural networks (2102.04294v1)

Published 5 Feb 2021 in cs.LG, cs.NA, and math.NA

Abstract: Convolutional neural networks are very popular nowadays, but training them is not an easy task. Each convolution corresponds to a structured transformation matrix. To help avoid the exploding/vanishing gradient problem, it is desirable that the singular values of each transformation matrix are neither too large nor too small during training. We propose three new regularization terms for a convolutional kernel tensor to constrain the singular values of the associated transformation matrix. We show how to carry out gradient-type methods with these terms, which provides new insight into the training of convolutional neural networks.
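
The abstract does not spell out the three regularization terms, but the underlying objects are the transformation matrix of a convolution and its singular values. The NumPy sketch below is a minimal illustration, not the paper's method: it assumes circular (periodic) padding, under which the transformation matrix is doubly block circulant and its singular values are the union, over all spatial frequencies, of the singular values of the small per-frequency matrices obtained from the 2D FFT of the kernel. The penalty shown (squared deviation of every singular value from 1) is a generic stand-in for a singular-value regularizer; the function and variable names are hypothetical.

```python
import numpy as np

def conv_singular_values(kernel, input_shape):
    """Singular values of the linear map defined by a 2D circular convolution.

    kernel: array of shape (c_out, c_in, k, k)
    input_shape: (H, W), the spatial size of the input feature map

    With circular padding the transformation matrix is doubly block circulant,
    so its singular values are the union, over spatial frequencies, of the
    singular values of the (c_out x c_in) matrices given by the 2D DFT of the
    zero-padded kernel.
    """
    H, W = input_shape
    # 2D DFT of the kernel, zero-padded to the input size, per channel pair.
    transfer = np.fft.fft2(kernel, s=(H, W), axes=(2, 3))   # (c_out, c_in, H, W)
    # Rearrange so each spatial frequency yields a (c_out, c_in) matrix.
    transfer = transfer.transpose(2, 3, 0, 1)                # (H, W, c_out, c_in)
    # Batched SVD over all frequencies; flatten into one vector of singular values.
    return np.linalg.svd(transfer, compute_uv=False).ravel()

def spectral_regularizer(kernel, input_shape, target=1.0):
    """Illustrative penalty: squared deviation of every singular value from `target`.

    This is a generic example of constraining singular values, not one of the
    three regularization terms proposed in the paper.
    """
    sv = conv_singular_values(kernel, input_shape)
    return float(np.sum((sv - target) ** 2))

# Example: a 3x3 kernel mapping 4 input channels to 8 output channels on 16x16 maps.
rng = np.random.default_rng(0)
K = rng.standard_normal((8, 4, 3, 3)) * 0.1
print(spectral_regularizer(K, (16, 16)))
```

In a training loop, a penalty of this kind would be added (with a weight) to the task loss so that gradient-type methods discourage the convolution's singular values from growing too large or shrinking too small.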
