Noise-robust chemical reaction networks training artificial neural networks (2410.11919v1)

Published 15 Oct 2024 in q-bio.MN, cs.ET, and physics.chem-ph

Abstract: Artificial neural networks (NNs) can be implemented using chemical reaction networks (CRNs), where the concentrations of chemical species act as inputs and outputs. In such biochemical computing, noise-robust computation is crucial because of the intrinsic and extrinsic noise present in chemical reactions. Previously proposed CRNs for feed-forward networks often used the rectified linear unit (ReLU) or discrete activation functions. A concern with these choices is that the derivatives of such non-smooth functions are discontinuous, which can cause significant noise disruption during backpropagation. In this study, we propose a CRN that performs both the feed-forward and training processes using smooth activation functions, avoiding discontinuities during backpropagation. All reactions occur in a single pot, and the reactions for training are bimolecular. Our case studies on the XOR, Iris, and MNIST datasets and on a non-linear regression model demonstrate that computation via the CRN (i) maintains accuracy despite noise in the reaction rates and the concentrations of species and (ii) is insensitive to the choice of running time and the magnitude of the noise, in comparison to NNs with a non-smooth activation function. This work presents a noise-robust CRN for full NN computation, including backpropagation, paving the way for more stable and efficient biochemical computing systems.
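
The abstract's central technical point is that the derivative of a non-smooth activation such as ReLU is discontinuous at zero, so small perturbations of a near-zero pre-activation can flip its gradient between 0 and 1, while a smooth activation's gradient changes gradually under the same perturbation. The NumPy sketch below illustrates only that mathematical point, not the paper's chemical implementation; the function names, the additive noise model, and the 0.05 noise scale are illustrative assumptions, not values from the paper.

# Minimal sketch (not the paper's CRN): why a smooth activation's gradient is
# less sensitive to noise than ReLU's near the activation threshold.
# Noise model and scales here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def relu_grad(z):
    # Derivative of ReLU: a step function, discontinuous at z = 0.
    return (z > 0).astype(float)

def sigmoid_grad(z):
    # Derivative of the logistic sigmoid: smooth everywhere.
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

# Pre-activations clustered near zero, where the ReLU derivative jumps.
z = rng.normal(loc=0.0, scale=0.05, size=10_000)
noise = rng.normal(loc=0.0, scale=0.05, size=z.shape)  # additive noise on z

for name, grad in [("ReLU", relu_grad), ("sigmoid", sigmoid_grad)]:
    clean = grad(z)
    noisy = grad(z + noise)
    change = np.abs(noisy - clean)  # per-sample gradient shift caused by noise
    print(f"{name:8s} mean |dgrad| = {change.mean():.3f}, "
          f"max |dgrad| = {change.max():.3f}")

Running this shows the ReLU gradient flipping between 0 and 1 for a sizable fraction of samples, while the sigmoid gradient shifts only slightly. In the paper's setting, where forward and backward signals are carried by species concentrations, the claim is that such abrupt gradient flips make training noise-sensitive, which the smooth-activation CRN is designed to avoid.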
