
Bucketed PCA Neural Networks with Neurons Mirroring Signals (2108.00605v1)

Published 2 Aug 2021 in cs.LG, cs.AI, math.OC, and stat.ML

Abstract: The bucketed PCA neural network (PCA-NN) with transforms is developed here in an effort to benchmark deep neural networks (DNNs) on supervised classification problems. Most classical PCA models apply PCA to the entire training data set to establish a reductive representation and then employ non-network tools such as high-order polynomial classifiers. In contrast, the bucketed PCA-NN applies PCA to individual buckets, which are constructed in two consecutive phases, while retaining a genuine neural network architecture. This facilitates a fair apples-to-apples comparison to DNNs, especially to reveal that a major chunk of the accuracy achieved by many impressive DNNs could possibly be explained by the bucketed PCA-NN (e.g., 96% out of 98% for the MNIST data set). Compared with most DNNs, the three building blocks of the bucketed PCA-NN are easier to comprehend conceptually: PCA, transforms, and bucketing for error correction. Furthermore, unlike the somewhat quasi-random neurons ubiquitously observed in DNNs, the PCA neurons resemble or mirror the input signals and are more straightforward to decipher as a result.
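
To make the per-bucket PCA idea concrete, here is a minimal sketch in Python. It treats each class label as its own bucket, fits a separate PCA model per bucket, and classifies a sample by which bucket's principal subspace reconstructs it with the smallest error. The class name `BucketedPCAClassifier`, the class-as-bucket assignment, and the reconstruction-error decision rule are illustrative assumptions; the paper's two-phase bucket construction, transforms, and error-correction step are not reproduced here.

```python
# Hypothetical sketch: one PCA per "bucket" (here, per class label),
# classification by smallest reconstruction error. Not the paper's exact method.
import numpy as np
from sklearn.decomposition import PCA


class BucketedPCAClassifier:
    def __init__(self, n_components=20):
        self.n_components = n_components
        self.pcas = {}  # one fitted PCA model per bucket

    def fit(self, X, y):
        # Fit a separate PCA on the samples of each bucket (class).
        for label in np.unique(y):
            Xb = X[y == label]
            k = min(self.n_components, Xb.shape[0], Xb.shape[1])
            self.pcas[label] = PCA(n_components=k).fit(Xb)
        return self

    def predict(self, X):
        # Assign each sample to the bucket whose PCA subspace
        # reconstructs it with the smallest L2 error.
        labels = sorted(self.pcas)
        errors = np.empty((X.shape[0], len(labels)))
        for j, label in enumerate(labels):
            pca = self.pcas[label]
            recon = pca.inverse_transform(pca.transform(X))
            errors[:, j] = np.linalg.norm(X - recon, axis=1)
        return np.array(labels)[np.argmin(errors, axis=1)]


if __name__ == "__main__":
    # Tiny synthetic demo standing in for MNIST-style data.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, size=(200, 64)),
                   rng.normal(3.0, 1.0, size=(200, 64))])
    y = np.array([0] * 200 + [1] * 200)
    clf = BucketedPCAClassifier(n_components=10).fit(X, y)
    print("train accuracy:", (clf.predict(X) == y).mean())
```

Because the per-bucket principal components are linear combinations of the bucket's own samples, they tend to look like prototypical inputs, which is the sense in which the paper describes PCA neurons as mirroring the signals.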
