Brain experiments imply adaptation mechanisms which outperform common AI learning algorithms

Published 23 Apr 2020 in q-bio.NC and physics.bio-ph (arXiv:2005.04106v1)

Abstract: In attempts to imitate brain functionality, researchers have bridged neuroscience and artificial intelligence for decades; however, experimental neuroscience has not directly advanced the field of machine learning. Here, using neuronal cultures, we demonstrate that increased training frequency accelerates neuronal adaptation processes. We implemented this mechanism in artificial neural networks, where a local learning step size increases for coherent consecutive learning steps, and tested it on MNIST, a simple dataset of handwritten digits. In online learning with only a few handwriting examples, the brain-inspired algorithms achieve success rates that substantially outperform commonly used machine learning algorithms. We speculate that this emerging bridge from slow brain function to machine learning will promote ultrafast decision making under limited examples, which is the reality in many aspects of human activity, robotic control, and network optimization.
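The abstract does not give the paper's exact update rule, but the idea it describes — a per-parameter step size that grows when consecutive learning steps are "coherent" (same gradient direction) and shrinks otherwise — resembles classic sign-coherence schemes such as delta-bar-delta or Rprop. A minimal sketch under that assumption (the `up`/`down` factors and clipping bounds are illustrative choices, not values from the paper):

```python
import numpy as np

def adaptive_sign_update(w, grad, prev_grad, lr,
                         up=1.2, down=0.5, lr_min=1e-6, lr_max=1.0):
    """One step of a sign-coherence adaptive rule (Rprop-like sketch).

    Each parameter keeps its own local step size `lr`. When a gradient
    component keeps the sign it had on the previous step (a "coherent"
    consecutive step), its step size is increased; on a sign flip it is
    decreased. The weight update itself uses only the gradient sign.
    """
    coherent = np.sign(grad) == np.sign(prev_grad)
    lr = np.where(coherent, lr * up, lr * down)   # grow on agreement, shrink on flip
    lr = np.clip(lr, lr_min, lr_max)              # keep step sizes in a sane range
    w = w - lr * np.sign(grad)                    # sign-based step, as in Rprop
    return w, lr

# Toy usage: minimize f(w) = 0.5 * ||w||^2, whose gradient is simply w.
w = np.array([2.0, -3.0])
lr = np.full(2, 0.1)
prev_grad = np.zeros(2)
for _ in range(100):
    grad = w.copy()
    w, lr = adaptive_sign_update(w, grad, prev_grad, lr)
    prev_grad = grad
```

On the toy quadratic, the local step sizes grow while each component heads steadily toward the minimum, then collapse after the first overshoot, so the iterate settles near zero without any global learning-rate schedule.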

Citations (15)
