An image-computable model of speeded decision-making (2403.16382v2)

Published 25 Mar 2024 in q-bio.NC

Abstract: Evidence accumulation models (EAMs) are the dominant framework for modeling response time (RT) data from speeded decision-making tasks. While providing a good quantitative description of RT data in terms of abstract perceptual representations, EAMs do not explain how the visual system extracts these representations in the first place. To address this limitation, we introduce the visual accumulator model (VAM), in which convolutional neural network models of visual processing and traditional EAMs are jointly fitted to trial-level RTs and raw (pixel-space) visual stimuli from individual subjects in a unified Bayesian framework. Models fitted to large-scale cognitive training data from a stylized flanker task captured individual differences in congruency effects, RTs, and accuracy. We find evidence that the selection of task-relevant information occurs through the orthogonalization of relevant and irrelevant representations, demonstrating how our framework can be used to relate visual representations to behavioral outputs. Together, our work provides a probabilistic framework for both constraining neural network models of vision with behavioral data and studying how the visual system extracts representations that guide decisions.
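To make the abstract's core idea concrete — a trainable visual front end that maps raw pixels to evidence feeding an evidence accumulation model, with both parts scored jointly against choices and response times — the sketch below illustrates the structure under simplifying assumptions. A toy multilayer perceptron stands in for the paper's CNN, the EAM component is a standard linear ballistic accumulator (LBA) with fixed threshold, start-point range, drift SD, and non-decision time, and the fit is reduced to a single-trial negative log-likelihood and its gradient rather than the paper's hierarchical Bayesian procedure. All function names, architecture choices, and parameter values are illustrative, not the authors' implementation.

```python
# Minimal sketch: pixels -> drift rates -> LBA race likelihood, differentiable end to end.
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm


def lba_pdf(t, v, b, A, s):
    """Density of the time at which one LBA accumulator (drift mean v, drift sd s,
    threshold b, uniform start point on [0, A]) reaches threshold."""
    z1 = (b - A - t * v) / (t * s)
    z2 = (b - t * v) / (t * s)
    return (1.0 / A) * (-v * norm.cdf(z1) + s * norm.pdf(z1)
                        + v * norm.cdf(z2) - s * norm.pdf(z2))


def lba_cdf(t, v, b, A, s):
    """Probability that the accumulator has finished by time t."""
    z1 = (b - A - t * v) / (t * s)
    z2 = (b - t * v) / (t * s)
    return (1.0 + ((b - A - t * v) / A) * norm.cdf(z1)
            - ((b - t * v) / A) * norm.cdf(z2)
            + ((t * s) / A) * norm.pdf(z1)
            - ((t * s) / A) * norm.pdf(z2))


def drift_rates(params, image):
    """Stand-in for the CNN front end: a tiny MLP mapping pixels to one
    non-negative drift rate per response alternative."""
    h = jax.nn.relu(params["W1"] @ image.ravel() + params["b1"])
    return jax.nn.softplus(params["W2"] @ h + params["b2"])


def neg_log_likelihood(params, image, choice, rt, b=1.0, A=0.5, s=1.0, t0=0.2):
    """Race likelihood: the chosen accumulator finishes at rt - t0 while the
    other accumulators have not yet reached threshold."""
    v = drift_rates(params, image)                 # one drift per alternative
    t = rt - t0                                    # decision time
    dens = lba_pdf(t, v[choice], b, A, s)          # winner's finishing density
    survivors = 1.0 - lba_cdf(t, v, b, A, s)       # losers still racing
    survivors = survivors.at[choice].set(1.0)      # exclude the winner's term
    return -(jnp.log(dens + 1e-12) + jnp.sum(jnp.log(survivors + 1e-12)))


# Example: score one synthetic trial and take gradients with respect to the
# "visual" weights -- the quantity a gradient-based (e.g. variational) fit would use.
key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = {
    "W1": 0.01 * jax.random.normal(k1, (16, 32 * 32)),
    "b1": jnp.zeros(16),
    "W2": 0.01 * jax.random.normal(k2, (2, 16)),
    "b2": jnp.zeros(2),
}
image = jax.random.uniform(k3, (32, 32))           # fake pixel-space stimulus
nll, grads = jax.value_and_grad(neg_log_likelihood)(params, image, 0, 0.65)
print(float(nll))
```

Because the same negative log-likelihood depends on the network weights through the drift rates, gradients flow from the behavioral data back into the visual front end, which is the sense in which behavior constrains the vision model in this kind of framework.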

