
A New Constructive Method to Optimize Neural Network Architecture and Generalization (1302.0324v1)

Published 2 Feb 2013 in cs.NE

Abstract: In this paper, after analyzing the causes of poor generalization and overfitting in neural networks, we treat certain noisy data points as singular values of a continuous function, namely jump discontinuity points. The continuous part can be approximated by the simplest neural networks, which have good generalization performance and an optimal network architecture, using traditional algorithms such as constructive algorithms for feed-forward neural networks with incremental training, the BP algorithm, the ELM algorithm, various other constructive algorithms, RBF approximation, and SVM. At the same time, we construct RBF neural networks to fit the singular values to within any prescribed error. We prove that a function with jump discontinuity points can be approximated, to within any prescribed error, by the simplest neural network combined with a decay RBF neural network, and that such a function can be constructively approximated by a decay RBF neural network. The constructed part has no influence on the generalization of the whole machine learning system, which optimizes the network architecture and generalization performance and reduces overfitting by avoiding fitting the noisy data.
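The decomposition described in the abstract can be illustrated informally: a simple smooth model handles the continuous part, while a rapidly decaying RBF term absorbs the jump so that the correction has negligible influence elsewhere. The NumPy sketch below is only a toy illustration under stated assumptions, not the paper's construction; the jump location, RBF width, and the use of a low-degree polynomial as a stand-in for the "simplest neural network" are all assumptions chosen for the example.

```python
import numpy as np

# Toy illustration (assumptions, not the paper's construction): approximate the
# smooth part of a target with a simple model, then absorb the jump with one
# narrow, decaying Gaussian RBF so the correction vanishes away from the jump.

# Target: a smooth sine curve with a single jump discontinuity at x = 0.5.
x = np.linspace(0.0, 1.0, 201)
jump_location, jump_height = 0.5, 1.5          # assumed for this example
y = np.sin(2 * np.pi * x) + jump_height * (x >= jump_location)

# 1) Fit the continuous part with a simple smooth model (a low-degree
#    polynomial stands in for the "simplest neural network" here).
coeffs = np.polyfit(x, y, deg=5)
smooth_fit = np.polyval(coeffs, x)

# 2) Fit the residual at the discontinuity with a decaying Gaussian RBF.
residual = y - smooth_fit
width = 0.02                                    # narrow width => rapid decay
rbf = np.exp(-((x - jump_location) ** 2) / (2 * width ** 2))
i_jump = np.argmin(np.abs(x - jump_location))   # grid point closest to the jump
correction = residual[i_jump] * rbf
combined_fit = smooth_fit + correction

# The RBF term is essentially zero away from the jump, so the correction does
# not disturb the smooth approximation (and hence generalization) elsewhere.
far = np.abs(x - jump_location) > 0.2
print("error at the jump, smooth fit only:", abs(residual[i_jump]))
print("error at the jump, with RBF term  :", abs(y[i_jump] - combined_fit[i_jump]))
print("max |RBF correction| far from jump:", np.abs(correction[far]).max())
```

Because the RBF unit decays rapidly, its contribution is essentially zero away from the discontinuity, which mirrors the abstract's claim that the constructed part does not affect the generalization of the rest of the system.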

Citations (3)

