
Hyperparameter optimization for neural-network models in molecular communication

Develop systematic hyperparameter optimization methods for neural-network architectures applied to molecular communication, including tuning of the number of nodes, the number of hidden layers, the learning rate, and related design parameters, and quantify their impact on performance beyond ad hoc manual selection.


Background

The paper notes that current molecular communication studies employing neural networks typically select hyperparameters (e.g., architecture depth, neuron counts, learning rates) manually without principled optimization or analysis of their downstream effects. This undermines reproducibility and may yield suboptimal models, especially in channels with significant inter-symbol interference or time variability.

A dedicated optimization framework tailored to molecular communication (MC) tasks would support better model performance and generalization, enabling fair comparisons across architectures and guiding practitioners in resource-constrained nanoscale settings.
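
To make the scope concrete, the following minimal sketch (not from the paper) shows what a systematic search over network depth, width, learning rate, and weight regularization could look like for a neural-network symbol detector trained on synthetic MC data with inter-symbol interference. The channel model, feature windowing, search ranges, and use of scikit-learn's RandomizedSearchCV are illustrative assumptions, not the authors' method.

# Minimal sketch (illustrative assumptions throughout): randomized hyperparameter
# search for a neural-network symbol detector on synthetic MC data with ISI.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import RandomizedSearchCV, train_test_split

rng = np.random.default_rng(0)

# Synthetic MC channel: each received sample is the current symbol plus leftover
# molecules from previous symbols (simple exponential ISI tail) plus noise.
n_symbols, memory = 20000, 4
bits = rng.integers(0, 2, n_symbols)
isi_taps = 0.5 ** np.arange(1, memory + 1)          # assumed ISI profile
received = bits.astype(float).copy()
for k, tap in enumerate(isi_taps, start=1):
    received[k:] += tap * bits[:-k]
received += rng.normal(0.0, 0.3, n_symbols)         # reception/counting noise

# Feature vector per symbol: the current received sample plus the preceding
# `memory` samples; the label is the transmitted bit for that slot.
X = np.stack([received[i:i + memory + 1] for i in range(n_symbols - memory)])
y = bits[memory:]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Systematic search over depth, width, learning rate, and L2 regularization,
# in place of ad hoc manual selection of these hyperparameters.
search_space = {
    "hidden_layer_sizes": [(16,), (32,), (64,), (32, 16), (64, 32)],
    "learning_rate_init": np.logspace(-4, -1, 20),
    "alpha": np.logspace(-6, -2, 20),
}
search = RandomizedSearchCV(
    MLPClassifier(max_iter=300, early_stopping=True, random_state=0),
    search_space, n_iter=25, cv=3, scoring="accuracy", random_state=0,
)
search.fit(X_train, y_train)

print("best hyperparameters:", search.best_params_)
print("held-out bit accuracy:", search.best_estimator_.score(X_test, y_test))

Replacing the randomized search with Bayesian or multi-fidelity optimization, and reporting the sensitivity of the bit-error rate to each hyperparameter, would be natural steps toward the kind of dedicated MC-specific framework called for above.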

References

Reported methods in the literature simply selected these hyperparameters without further discussion; the impact of such hyperparameters and their optimization (see Supplementary Material, Section X) remains an open topic.

Communicating Smartly in the Molecular Domain: Neural Networks in the Internet of Bio-Nano Things (2506.20589 - Gómez et al., 25 Jun 2025), in Subsection "Hyperparameter Tuning", Section 3 (Neural Networks as Enablers of IoBNT Networks)