
All-In-One: Artificial Association Neural Networks

Published 31 Oct 2021 in cs.AI (arXiv:2111.00424v8)

Abstract: Most deep learning models are limited to specific datasets or tasks because their network structures use fixed layers. In this paper, we discuss the differences between existing neural networks and real human neurons, propose association networks that connect existing models, and describe multiple types of deep learning tasks performed using a single structure. Further, we propose a new neural data structure that can express all basic models of existing neural networks in a tree structure. We also propose an approach in which information propagates from the leaf nodes to the root node using the proposed recursive convolution approach (i.e., depth-first convolution), after which feed-forward propagation is performed. Thus, we design a "data-based," as opposed to a "model-based," neural network. In our experiments, we compared the learning performance of models specializing in specific domains with that of models simultaneously learning various domains using an association network. The association network learned well, without significant performance degradation relative to the models trained individually. In addition, its results were similar to those of the specialized models, and the output at the root contained all information from the tree. Finally, we developed a theory for using arbitrary input data and learning all data simultaneously.
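The depth-first convolution the abstract describes can be sketched as a post-order traversal of the tree: each leaf contributes its input features, and each internal node combines its children's outputs before applying a shared transform, so information flows from the leaves to the root. The class name, the mean-based combination rule, and the linear-plus-ReLU transform below are illustrative assumptions, not the paper's exact operator.

```python
import numpy as np

class AssocNode:
    """A node in the association tree; leaves hold input feature vectors."""
    def __init__(self, children=None, features=None):
        self.children = children or []
        self.features = features  # set only on leaf nodes

def depth_first_conv(node, weight):
    """Recursively propagate features from the leaves to the root.

    Each internal node averages its children's outputs, then applies a
    shared linear map followed by a ReLU (a stand-in for the recursive
    convolution step described in the abstract).
    """
    if not node.children:  # leaf: use raw input features
        h = node.features
    else:                  # internal node: combine children recursively
        h = np.mean([depth_first_conv(c, weight) for c in node.children],
                    axis=0)
    return np.maximum(0.0, weight @ h)  # shared transform + ReLU

# Example: two leaves feeding a single root node
rng = np.random.default_rng(0)
leaves = [AssocNode(features=rng.normal(size=4)) for _ in range(2)]
root = AssocNode(children=leaves)
W = rng.normal(size=(4, 4))
out = depth_first_conv(root, W)
print(out.shape)  # (4,)
```

Because the same weights are applied at every node, arbitrary tree shapes (and hence arbitrary combinations of inputs) can be processed by one set of parameters, which is the property the "data-based" framing relies on.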
