Hyperparameter optimization for neural-network models in molecular communication
Develop systematic hyperparameter optimization methods for neural-network architectures applied to molecular communication, including tuning the number of hidden layers, the number of nodes per layer, the learning rate, and related design parameters, and quantify their impact on performance relative to ad hoc manual selection. A minimal search sketch is given below.
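The following sketch illustrates one way such a systematic search could be set up: a random search over the hyperparameters named above (number of hidden layers, nodes per layer, learning rate) for an MLP detector, scored by validation bit error rate. The synthetic Poisson-count CSK channel, the PyTorch MLP, the search ranges, and all function names are illustrative assumptions, not the paper's method.

```python
# Sketch: random hyperparameter search for an MLP-based molecular-communication
# detector. The channel model, search ranges, and trial budget are assumptions
# chosen for illustration only.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

# Assumed toy dataset: binary concentration-shift keying with ISI-free Poisson
# molecule counts (mean lam0 for bit 0, lam1 for bit 1).
def make_dataset(n, lam0=20.0, lam1=60.0):
    bits = rng.integers(0, 2, size=n)
    counts = rng.poisson(np.where(bits == 1, lam1, lam0)).astype(np.float32)
    return torch.tensor(counts).unsqueeze(1), torch.tensor(bits)

def build_mlp(n_hidden_layers, n_nodes):
    layers, in_dim = [], 1
    for _ in range(n_hidden_layers):
        layers += [nn.Linear(in_dim, n_nodes), nn.ReLU()]
        in_dim = n_nodes
    layers.append(nn.Linear(in_dim, 2))  # two output classes: bit 0 / bit 1
    return nn.Sequential(*layers)

def train_and_score(n_hidden_layers, n_nodes, lr, epochs=30):
    x_tr, y_tr = make_dataset(2000)
    x_va, y_va = make_dataset(1000)
    model = build_mlp(n_hidden_layers, n_nodes)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):          # full-batch training for brevity
        opt.zero_grad()
        loss_fn(model(x_tr), y_tr).backward()
        opt.step()
    with torch.no_grad():
        ber = (model(x_va).argmax(dim=1) != y_va).float().mean().item()
    return ber                        # validation bit error rate

# Random search over the hyperparameters named in the problem statement.
search_space = {
    "n_hidden_layers": [1, 2, 3],
    "n_nodes": [4, 8, 16, 32],
    "lr": [1e-1, 1e-2, 1e-3],
}
best = None
for _ in range(10):  # 10 random trials; grid or Bayesian search could replace this
    cfg = {
        "n_hidden_layers": int(rng.choice(search_space["n_hidden_layers"])),
        "n_nodes": int(rng.choice(search_space["n_nodes"])),
        "lr": float(rng.choice(search_space["lr"])),
    }
    ber = train_and_score(**cfg)
    if best is None or ber < best[1]:
        best = (cfg, ber)
print("best config:", best[0], "validation BER:", best[1])
```

The same loop structure extends to other design parameters (activation functions, batch size, sequence length for ISI-aware detectors) and to more sample-efficient search strategies such as Bayesian optimization; the point of the sketch is only to show how the tuning can be made systematic and reproducible rather than manual.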
References
Reported methods in the literature have simply selected these values without further discussion; the impact of such hyperparameters and their optimization (see Supplementary Material, Section X) remains an open topic.
— Communicating Smartly in the Molecular Domain: Neural Networks in the Internet of Bio-Nano Things
(2506.20589 - Gómez et al., 25 Jun 2025) in Subsection Hyperparameter Tuning, Section 3 (Neural Networks as Enablers of IoBNT Networks)