Biologically-Informed Excitatory and Inhibitory Balance for Robust Spiking Neural Network Training (2404.15627v1)

Published 24 Apr 2024 in cs.NE and cs.ET

Abstract: Spiking neural networks, drawing inspiration from the biological constraints of the brain, promise an energy-efficient paradigm for artificial intelligence. However, identifying guiding principles for training these networks robustly remains a challenge, and training becomes even more difficult when the biological constraints of excitatory and inhibitory connections are incorporated. In this work, we identify several key factors, such as low initial firing rates and diverse inhibitory spiking patterns, that determine the overall ability to train spiking networks with various ratios of excitatory to inhibitory neurons on AI-relevant datasets. The results indicate that networks with the biologically realistic 80:20 excitatory:inhibitory balance can reliably train at low activity levels and in noisy environments. Additionally, the van Rossum distance, a measure of spike train synchrony, provides insight into the importance of inhibitory neurons in increasing network robustness to noise. This work supports further biologically-informed, large-scale networks and energy-efficient hardware implementations.
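
The abstract relies on the van Rossum distance as its measure of spike-train synchrony. As a minimal, illustrative sketch (not code from the paper; the function name, kernel time constant `tau`, and discretization step `dt` are assumptions chosen for the example), each spike train is convolved with a causal exponential kernel and the distance is the L2 norm of the difference between the filtered traces; normalization conventions vary across implementations.

```python
import numpy as np

def van_rossum_distance(spikes_a, spikes_b, tau=20.0, dt=0.1):
    """Illustrative van Rossum (2001) spike-train distance.

    spikes_a, spikes_b: lists/arrays of spike times (ms).
    tau: exponential kernel time constant (ms); dt: integration step (ms).
    """
    spikes_a = np.asarray(spikes_a, dtype=float)
    spikes_b = np.asarray(spikes_b, dtype=float)
    # Simulate long enough for the kernel tails to decay.
    t_max = max(spikes_a.max(initial=0.0), spikes_b.max(initial=0.0)) + 5.0 * tau
    t = np.arange(0.0, t_max, dt)

    def filtered(spikes):
        # Convolve the spike train with a causal kernel exp(-t / tau).
        trace = np.zeros_like(t)
        for s in spikes:
            mask = t >= s
            trace[mask] += np.exp(-(t[mask] - s) / tau)
        return trace

    diff = filtered(spikes_a) - filtered(spikes_b)
    # Discretized sqrt( (1 / tau) * integral of diff(t)^2 dt ).
    return np.sqrt(np.sum(diff**2) * dt / tau)

# Identical trains give ~0; a jittered spike gives a small positive distance.
print(van_rossum_distance([10.0, 50.0], [10.0, 50.0]))  # ~0.0
print(van_rossum_distance([10.0, 50.0], [12.0, 50.0]))  # > 0.0
```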

Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364 Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. 
PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. 
Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. 
[2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. 
[2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. 
[2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. 
Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. 
[2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. 
[2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. 
[2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. 
Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. 
Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. 
Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. 
[2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. 
Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. 
Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. 
IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Xiao, H., Rasul, K., Vollgraf, R.: Fashion-MNIST: A Novel Image Dataset for Benchmarking Machine Learning Algorithms. arXiv (2017). https://doi.org/10.48550/arXiv.1708.07747 Cramer et al. [2022] Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364 Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. 
[2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364 Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. 
[2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. 
Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. 
[2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. 
[2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. 
[2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. 
[2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. 
[2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. 
Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. 
[2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. 
Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  3. Rodarie, D., Verasztó, C., Roussel, Y., Reimann, M., Keller, D., Ramaswamy, S., Markram, H., Gewaltig, M.-O.: A method to estimate the cellular composition of the mouse brain from heterogeneous datasets. PLOS Computational Biology 18(12), 1010739 (2022) https://doi.org/10.1371/journal.pcbi.1010739 Alreja et al. [2022] Alreja, A., Nemenman, I., Rozell, C.J.: Constrained brain volume in an efficient coding model explains the fraction of excitatory and inhibitory neurons in sensory cortices. PLOS Computational Biology 18(1), 1009642 (2022) https://doi.org/10.1371/journal.pcbi.1009642 Rossbroich et al. [2022] Rossbroich, J., Gygax, J., Zenke, F.: Fluctuation-driven initialization for spiking neural network training. Neuromorphic Computing and Engineering 2(4), 044016 (2022) https://doi.org/10.1088/2634-4386/ac97bb van Rossum [2001] van Rossum, M.C.W.: A Novel Spike Distance. Neural Computation 13(4), 751–763 (2001) https://doi.org/10.1162/089976601300014321 Xiao et al. [2017] Xiao, H., Rasul, K., Vollgraf, R.: Fashion-MNIST: A Novel Image Dataset for Benchmarking Machine Learning Algorithms. arXiv (2017). https://doi.org/10.48550/arXiv.1708.07747 Cramer et al. [2022] Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364 Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. 
Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alreja, A., Nemenman, I., Rozell, C.J.: Constrained brain volume in an efficient coding model explains the fraction of excitatory and inhibitory neurons in sensory cortices. PLOS Computational Biology 18(1), 1009642 (2022) https://doi.org/10.1371/journal.pcbi.1009642 Rossbroich et al. [2022] Rossbroich, J., Gygax, J., Zenke, F.: Fluctuation-driven initialization for spiking neural network training. Neuromorphic Computing and Engineering 2(4), 044016 (2022) https://doi.org/10.1088/2634-4386/ac97bb van Rossum [2001] van Rossum, M.C.W.: A Novel Spike Distance. Neural Computation 13(4), 751–763 (2001) https://doi.org/10.1162/089976601300014321 Xiao et al. [2017] Xiao, H., Rasul, K., Vollgraf, R.: Fashion-MNIST: A Novel Image Dataset for Benchmarking Machine Learning Algorithms. arXiv (2017). https://doi.org/10.48550/arXiv.1708.07747 Cramer et al. [2022] Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. 
IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364 Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. 
[2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w
Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30
Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007)
Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics (2014)
Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, Mass (2001)
Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. 
[2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. 
[2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. 
Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. 
[2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. 
[2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. 
Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. 
Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. 
[2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. 
[2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. 
[2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
[2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364 Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. 
Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. 
PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. 
In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. 
[2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. 
Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. 
[2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
[2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. 
[2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. 
[2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. 
[2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. 
[2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. 
Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  6. van Rossum, M.C.W.: A Novel Spike Distance. Neural Computation 13(4), 751–763 (2001) https://doi.org/10.1162/089976601300014321
  7. Xiao, H., Rasul, K., Vollgraf, R.: Fashion-MNIST: A Novel Image Dataset for Benchmarking Machine Learning Algorithms. arXiv (2017). https://doi.org/10.48550/arXiv.1708.07747
  8. Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364
  9. Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595
 10. Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213
 11. Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019
 12. Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4.7.1–4.7.4. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348
 13. Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015)
 14. Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43, 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421
 15. Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w
 16. Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30
 17. Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007)
 18. Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
 19. Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
 20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
 21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
 22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
 23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
 24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
 25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
 26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
 27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
 28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
 29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
 30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics. Springer (2014)
 31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
 32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
 33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
 34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
 35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
 36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, Mass (2001)
 37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
 38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
 39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
 40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. 
Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. 
[2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. 
[2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. 
Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. 
Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. 
[2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. 
[2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). 
https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 
2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. 
Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  7. Xiao, H., Rasul, K., Vollgraf, R.: Fashion-MNIST: A Novel Image Dataset for Benchmarking Machine Learning Algorithms. arXiv (2017). https://doi.org/10.48550/arXiv.1708.07747 Cramer et al. [2022] Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364 Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364 Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. 
Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. 
Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. 
IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
Cramer et al. [2022] Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (2022) https://doi.org/10.1109/TNNLS.2020.3044364
Neftci et al. [2019] Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595
Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213
Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019
Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM). IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348
Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015)
Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43, 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421
Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w
Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30
Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007)
Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics. Springer (2014)
Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, Mass (2001)
Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. 
[2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. 
[2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). 
https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 
2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. 
Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  9. Neftci, E.O., Mostafa, H., Zenke, F.: Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine 36(6), 51–63 (2019) https://doi.org/10.1109/MSP.2019.2931595 Sanchez-Aguilera et al. [2021] Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. 
[2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213 Harris et al. [2012] Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. 
[2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. 
Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019 Grossi et al. [2016] Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. 
[2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. 
Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. 
MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. 
[2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. 
PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. 
[2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. 
[2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. 
[2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  10. Sanchez-Aguilera, A., Wheeler, D.W., Jurado-Parras, T., Valero, M., Nokia, M.S., Cid, E., Fernandez-Lamo, I., Sutton, N., García-Rincón, D., Prida, L.M., Ascoli, G.A.: An update to Hippocampome.org by integrating single-cell phenotypes with circuit function in vivo. PLOS Biology 19(5), 3001213 (2021) https://doi.org/10.1371/journal.pbio.3001213
  11. Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019
  12. Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4.7.1–4.7.4. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348
  13. Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015)
  14. Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43, 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421
  15. Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w
  16. Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30
  17. Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007)
  18. Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
  19. Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
  20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
  21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
  22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
  23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
  24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
  25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
  26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
  27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
  28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
  29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
  30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics. Springer (2014)
  31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
  32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
  33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
  34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
  35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
  36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, Mass (2001)
  37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
  38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
  39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
  40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. 
Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. 
Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. 
Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. 
IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
  11. Harris, J.J., Jolivet, R., Attwell, D.: Synaptic Energy Use and Supply. Neuron 75(5), 762–777 (2012) https://doi.org/10.1016/j.neuron.2012.08.019
  12. Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4.7.1–4.7.4. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348
  13. Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015)
  14. Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43, 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421
  15. Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w
  16. Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30
  17. Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, MA (2007)
  18. Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
  19. Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
  20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
  21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
  22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
  23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
  24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
  25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
  26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
  27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
  28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
  29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
  30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics (2014)
  31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
  32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
  33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
  34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
  35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
  36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, MA (2001)
  37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
  38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
  39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
  40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. 
[2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. 
[2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. 
Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. 
Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. 
IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. 
Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  12. Grossi, A., Nowak, E., Zambelli, C., Pellissier, C., Bernasconi, S., Cibrario, G., El Hajjam, K., Crochemore, R., Nodin, J.F., Olivo, P., Perniola, L.: Fundamental variability limits of filament-based RRAM. In: 2016 IEEE International Electron Devices Meeting (IEDM), pp. 4–71474. IEEE, San Francisco, CA, USA (2016). https://doi.org/10.1109/IEDM.2016.7838348 Garbin [2015] Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
[2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Garbin, D.: A variability study of PCM and OxRAM technologies for use as synapses in neuromorphic systems. PhD thesis, Université Grenoble Alpes (2015) Fishell and Kepecs [2020] Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. 
[2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. 
Annual Review of Neuroscience 43(Volume 43, 2020), 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421 Komendantov et al. [2019] Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w Wamsley and Fishell [2017] Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. 
Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. 
[2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 
2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. 
Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  14. Fishell, G., Kepecs, A.: Interneuron Types as Attractors and Controllers. Annual Review of Neuroscience 43, 1–30 (2020) https://doi.org/10.1146/annurev-neuro-070918-050421
  15. Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w
  16. Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30
  17. Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007)
  18. Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
  19. Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
  20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
  21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
  22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
  23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
  24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
  25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
  26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
  27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
  28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
  29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016) https://doi.org/10.1109/ISCAS.2016.7538989
  30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics (2014)
  31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
  32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
  33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
  34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
  35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
  36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, Mass (2001)
  37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
  38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
  39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
  40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. 
Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. 
[2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. 
[2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. 
eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. 
Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. 
[2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. 
  15. Komendantov, A.O., Venkadesh, S., Rees, C.L., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Quantitative firing pattern phenotyping of hippocampal neuron types. Scientific Reports 9(1), 17915 (2019) https://doi.org/10.1038/s41598-019-52611-w
  16. Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30
  17. Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007)
  18. Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
  19. Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
  20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
  21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
  22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
  23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
  24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
  25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
  26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
  27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
  28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
  29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
  30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics. Springer, New York (2014)
  31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
  32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
  33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
  34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
  35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
  36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, Mass (2001)
  37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
  38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
  39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
  40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks.
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. 
Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  16. Wamsley, B., Fishell, G.: Genetic and activity-dependent mechanisms underlying interneuron diversity. Nature Reviews Neuroscience 18(5), 299–309 (2017) https://doi.org/10.1038/nrn.2017.30 Izhikevich [2007] Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007) Venkadesh et al. [2019] Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. 
Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462 Teeter et al. [2018] Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. 
17. Izhikevich, E.M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Computational Neuroscience. MIT Press, Cambridge, Mass (2007)
18. Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
19. Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016) https://doi.org/10.1109/ISCAS.2016.7538989
30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics (2014)
31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, Mass (2001)
37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4 Gast et al. [2024] Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. 
[2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121 Denève and Machens [2016] Denève, S., Machens, C.K.: Efficient codes and balanced networks. 
Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243 Ding et al. [2023] Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4 Wheeler et al. [2024] Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. 
Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597 Kopsick et al. [2023] Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. 
Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2 Wang et al. [2020] Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3 Adam et al. [2018] Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. 
[2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014)
18. Venkadesh, S., Komendantov, A.O., Wheeler, D.W., Hamilton, D.J., Ascoli, G.A.: Simple models of quantitative firing phenotypes in hippocampal neurons: Comprehensive coverage of intrinsic diversity. PLOS Computational Biology 15(10), 1007462 (2019) https://doi.org/10.1371/journal.pcbi.1007462
19. Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics. Springer (2014)
31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, MA (2001)
37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. 
Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  19. Teeter, C., Iyer, R., Menon, V., Gouwens, N., Feng, D., Berg, J., Szafer, A., Cain, N., Zeng, H., Hawrylycz, M., Koch, C., Mihalas, S.: Generalized leaky integrate-and-fire models classify multiple neuron types. Nature Communications 9(1), 709 (2018) https://doi.org/10.1038/s41467-017-02717-4
  20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
  21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
  22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
  23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
  24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
  25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
  26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
  27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
  28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
  29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016) https://doi.org/10.1109/ISCAS.2016.7538989
  30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics. Springer (2014)
  31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
  32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
  33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
  34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
  35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
  36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass. (2001)
  37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
  38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
  39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
  40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
20. Gast, R., Solla, S.A., Kennedy, A.: Neural heterogeneity controls computations in spiking neural networks. Proceedings of the National Academy of Sciences 121(3), 2311885121 (2024) https://doi.org/10.1073/pnas.2311885121
21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4
27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405
28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics (2014)
31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. MIT Press, Cambridge, MA (2001)
37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  21. Denève, S., Machens, C.K.: Efficient codes and balanced networks. Nature Neuroscience 19(3), 375–382 (2016) https://doi.org/10.1038/nn.4243
  22. Ding, N., Qin, Y., Yang, G., Wei, F., Yang, Z., Su, Y., Hu, S., Chen, Y., Chan, C.-M., Chen, W., Yi, J., Zhao, W., Wang, X., Liu, Z., Zheng, H.-T., Chen, J., Liu, Y., Tang, J., Li, J., Sun, M.: Parameter-efficient fine-tuning of large-scale pre-trained language models. Nature Machine Intelligence 5(3), 220–235 (2023) https://doi.org/10.1038/s42256-023-00626-4
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 
2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. 
University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. 
[2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. 
Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. 
Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  23. Wheeler, D.W., Kopsick, J.D., Sutton, N., Tecuatl, C., Komendantov, A.O., Nadella, K., Ascoli, G.A.: Hippocampome.org 2.0 is a knowledge base enabling data-driven spiking neural network simulations of rodent hippocampal circuits. eLife 12, 90597 (2024) https://doi.org/10.7554/eLife.90597
Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  24. Kopsick, J.D., Tecuatl, C., Moradi, K., Attili, S.M., Kashyap, H.J., Xing, J., Chen, K., Krichmar, J.L., Ascoli, G.A.: Robust Resting-State Dynamics in a Large-Scale Spiking Neural Network Model of Area CA3 in the Mouse Hippocampus. Cognitive Computation 15(4), 1190–1210 (2023) https://doi.org/10.1007/s12559-021-09954-2
  25. Wang, Z., Wu, H., Burr, G.W., Hwang, C.S., Wang, K.L., Xia, Q., Yang, J.J.: Resistive switching materials for information processing. Nature Reviews Materials 5(3), 173–195 (2020) https://doi.org/10.1038/s41578-019-0159-3
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. 
Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  26. Adam, G.C., Khiat, A., Prodromakis, T.: Challenges hindering memristive neuromorphic hardware from going mainstream. Nature Communications 9(1), 5267 (2018) https://doi.org/10.1038/s41467-018-07565-4 Song et al. [2024] Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. 
Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. 
[2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. 
IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. 
[2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. 
Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. 
Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
  27. Song, W., Rao, M., Li, Y., Li, C., Zhuo, Y., Cai, F., Wu, M., Yin, W., Li, Z., Wei, Q., Lee, S., Zhu, H., Gong, L., Barnell, M., Wu, Q., Beerel, P.A., Chen, M.S.-W., Ge, N., Hu, M., Xia, Q., Yang, J.J.: Programming memristor arrays with arbitrarily high precision for analog computing. Science 383(6685), 903–910 (2024) https://doi.org/10.1126/science.adi9405 Alibart et al. [2012] Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201 Wang et al. [2016] Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. 
In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086 Kreuz et al. [2007] Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031 Dayan and Abbott [2001] Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001) Wu et al. [2022] Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701 Bittar and Garner [2022] Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897 Yin et al. [2021] Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w Narkhede et al. [2022] Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989 Kass et al. [2014] Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics, (2014) Gerstner et al. [2014] Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. University Press, Cambridge (2014) Cariani [1999] Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147 Cariani [2004] Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305 Zenke and Ganguli [2018] Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. 
Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
  28. Alibart, F., Gao, L., Hoskins, B.D., Strukov, D.B.: High precision tuning of state for memristive devices by adaptable variation-tolerant algorithm. Nanotechnology 23(7), 075201 (2012) https://doi.org/10.1088/0957-4484/23/7/075201
  29. Wang, R., Thakur, C.S., Hamilton, T.J., Tapson, J., Van Schaik, A.: A stochastic approach to STDP. In: 2016 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 2082–2085. IEEE, Montréal, QC, Canada (2016). https://doi.org/10.1109/ISCAS.2016.7538989
  30. Kass, R.E., Eden, U.T., Brown, E.N.: Analysis of Neural Data, 1st edn. Springer Series in Statistics. Springer (2014)
  31. Gerstner, W., Kistler, W.M., Naud, R., Paninski, L.: Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, Cambridge (2014)
  32. Cariani, P.: Temporal Coding of Periodicity Pitch in the Auditory System: An Overview. Neural Plasticity 6(4), 147–172 (1999) https://doi.org/10.1155/NP.1999.147
  33. Cariani, P.A.: Temporal codes and computations for sensory representation and scene analysis. IEEE Transactions on Neural Networks 15(5), 1100–1111 (2004) https://doi.org/10.1109/TNN.2004.833305
  34. Zenke, F., Ganguli, S.: SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation 30(6), 1514–1541 (2018) https://doi.org/10.1162/neco_a_01086
  35. Kreuz, T., Haas, J.S., Morelli, A., Abarbanel, H.D.I., Politi, A.: Measuring spike train synchrony. Journal of Neuroscience Methods 165(1), 151–161 (2007) https://doi.org/10.1016/j.jneumeth.2007.05.031
  36. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Computational Neuroscience. Massachusetts Institute of Technology Press, Cambridge, Mass (2001)
  37. Wu, G., Liang, D., Luan, S., Wang, J.: Training Spiking Neural Networks for Reinforcement Learning Tasks With Temporal Coding Method. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.877701
  38. Bittar, A., Garner, P.N.: A surrogate gradient spiking baseline for speech command recognition. Frontiers in Neuroscience 16 (2022) https://doi.org/10.3389/fnins.2022.865897
  39. Yin, B., Corradi, F., Bohté, S.M.: Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nature Machine Intelligence 3(10), 905–913 (2021) https://doi.org/10.1038/s42256-021-00397-w
  40. Narkhede, M.V., Bartakke, P.P., Sutaone, M.S.: A review on weight initialization strategies for neural networks. Artificial Intelligence Review 55(1), 291–322 (2022) https://doi.org/10.1007/s10462-021-10033-z
