
A Gegenbauer Neural Network with Regularized Weights Direct Determination for Classification (1910.11552v1)

Published 25 Oct 2019 in cs.LG and stat.ML

Abstract: Single-hidden-layer feedforward neural networks (SLFNs) are widely used in pattern classification, but the traditional iterative gradient-based learning algorithms that train them suffer from slow convergence and poor performance. Although the well-known extreme learning machine (ELM) successfully addresses the slow convergence, it still has computational robustness problems caused by its randomly assigned input weights and biases. To overcome these problems, this paper constructs and investigates a novel neural network based on Gegenbauer orthogonal polynomials, termed GNN. This model avoids the computational robustness problems of ELM while retaining comparable structural simplicity and approximation capability. Building on this, we propose a regularized weights direct determination (R-WDD) method, based on equality-constrained optimization, to determine the optimal output weights. R-WDD minimizes both the empirical risk and the structural risk of the network, thereby lowering the risk of overfitting and improving generalization. The resulting GNN with R-WDD is a unified learning mechanism for binary and multi-class classification problems. Finally, as verified in various comparison experiments, GNN with R-WDD achieves comparable (or even better) generalization performance, computational scalability and efficiency, and classification robustness than the least-squares support vector machine (LS-SVM) and ELM with a Gaussian kernel.
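The core idea in the abstract can be sketched in a few lines: expand the (rescaled) inputs through Gegenbauer polynomials to form the hidden-layer matrix, then determine the output weights in closed form via a ridge-regularized least-squares solve rather than iterative gradient descent. This is a minimal sketch, not the paper's exact formulation: it assumes the R-WDD step reduces to the familiar regularized solution W = (HᵀH + I/C)⁻¹HᵀY, uses an additive (per-feature) polynomial expansion, and encodes binary labels as ±1. The toy target `sign(x0**3 - x1)` and the regularization constant `C` are illustrative choices.

```python
import numpy as np
from scipy.special import eval_gegenbauer

def gegenbauer_features(X, degree=4, alpha=1.0):
    # Hidden-layer map: pass each input feature (assumed rescaled to
    # [-1, 1], the polynomials' natural domain) through the Gegenbauer
    # polynomials C_n^alpha for n = 0..degree, then stack columns.
    feats = [eval_gegenbauer(n, alpha, X) for n in range(degree + 1)]
    return np.concatenate(feats, axis=1)

def fit_output_weights(H, Y, C=1e3):
    # Direct determination of output weights with regularization:
    # W = (H^T H + I/C)^{-1} H^T Y, a closed-form trade-off between
    # empirical risk (fit) and structural risk (weight norm).
    d = H.shape[1]
    return np.linalg.solve(H.T @ H + np.eye(d) / C, H.T @ Y)

# Toy binary classification with labels encoded as +/-1.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.where(X[:, 0] ** 3 - X[:, 1] > 0, 1.0, -1.0).reshape(-1, 1)

H = gegenbauer_features(X, degree=4, alpha=1.0)  # shape (200, 10)
W = fit_output_weights(H, y)                     # no iterative training
pred = np.sign(H @ W)
acc = (pred == y).mean()
```

Because the target here is an additive polynomial of the inputs, it lies in the span of the expanded features, so a single linear solve classifies the training set almost perfectly; no learning rate, epochs, or random input weights are involved, which is the robustness and speed argument the abstract makes against iterative SLFN training and random-projection ELM.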
