Structured and Balanced Multi-Component and Multi-Layer Neural Networks

Published 30 Jun 2024 in cs.LG, cs.NA, cs.NE, math.NA, and stat.ML | (arXiv:2407.00765v2)

Abstract: In this work, we propose a balanced multi-component and multi-layer neural network (MMNN) structure to accurately and efficiently approximate functions with complex features, in terms of both degrees of freedom and computational cost. The main idea is inspired by a multi-component approach, in which each component can be effectively approximated by a single-layer network, combined with a multi-layer decomposition strategy to capture the complexity of the target function. Although MMNNs can be viewed as a simple modification of fully connected neural networks (FCNNs) or multi-layer perceptrons (MLPs) by introducing balanced multi-component structures, they achieve a significant reduction in training parameters, a much more efficient training process, and improved accuracy compared to FCNNs or MLPs. Extensive numerical experiments demonstrate the effectiveness of MMNNs in approximating highly oscillatory functions and their ability to automatically adapt to localized features.
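The abstract describes the architecture only at a high level. The sketch below is one plausible reading, not the authors' exact construction: each layer is a bank of "components" formed as trainable linear combinations of a wide single-hidden-layer block, and several such layers are composed. The class names (MultiComponentLayer, MMNNSketch), the choice to freeze the hidden-layer weights, and all widths, component counts, and the toy oscillatory target are illustrative assumptions rather than values taken from the paper.

```python
# Hypothetical MMNN-style sketch. Hyperparameters and the frozen-random-feature
# choice are assumptions for illustration, not the paper's specification.
import torch
import torch.nn as nn


class MultiComponentLayer(nn.Module):
    """One layer: a wide single-hidden-layer block whose outputs feed a small
    number of trainable linear 'components'. The hidden weights are frozen
    here (an assumption), so only the component weights are trained."""

    def __init__(self, in_dim: int, hidden_width: int, n_components: int):
        super().__init__()
        self.hidden = nn.Linear(in_dim, hidden_width)
        # Assumption: keep the hidden (random-feature) parameters fixed.
        for p in self.hidden.parameters():
            p.requires_grad = False
        self.components = nn.Linear(hidden_width, n_components)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.components(torch.relu(self.hidden(x)))


class MMNNSketch(nn.Module):
    """Compose several multi-component layers, then a scalar readout."""

    def __init__(self, in_dim=1, hidden_width=256, n_components=16, n_layers=3):
        super().__init__()
        dims = [in_dim] + [n_components] * (n_layers - 1)
        self.layers = nn.ModuleList(
            [MultiComponentLayer(d, hidden_width, n_components) for d in dims]
        )
        self.readout = nn.Linear(n_components, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x)
        return self.readout(x)


if __name__ == "__main__":
    # Toy usage: fit a highly oscillatory 1D target, the setting the abstract mentions.
    model = MMNNSketch()
    opt = torch.optim.Adam(
        [p for p in model.parameters() if p.requires_grad], lr=1e-3
    )
    x = torch.linspace(-1.0, 1.0, 512).unsqueeze(1)
    y = torch.sin(50 * torch.pi * x)
    for step in range(200):
        opt.zero_grad()
        loss = torch.mean((model(x) - y) ** 2)
        loss.backward()
        opt.step()
    print(f"final MSE: {loss.item():.4f}")
```

Freezing the hidden block and training only the component combinations is one way to read the abstract's claim of a reduced trainable-parameter count relative to a fully trained MLP; the paper itself should be consulted for the actual balanced multi-component construction.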
