Robust Bayesian framework for modeling psychophysical tasks

Peter W. Battaglia and Paul R. Schrater
University of Minnesota

Bayesian models of perception and action offer powerful tools for studying behavior and have achieved notable successes when combined with experimental investigation [1]. By representing sensory noise, motor imprecision, and internal processing strategies as parameters in a principled model, these factors can be quantified and empirically measured. One major challenge facing this approach is that even for simple behaviors, Bayesian models can be quite complex and have many parameters. A single experiment may not distinguish between ambiguous sets of parameter values, much as a single algebraic equation cannot determine two unknowns. Many studies make simplifying assumptions to reduce the effective number of unknowns, and/or conduct baseline experiments to increase the effective number of equations. To formalize such strategies, we propose a robust Bayesian framework in which perception and action are explicitly separated in the model structure. This renders many complex models experimentally tractable because multiple experiments can be integrated to jointly infer parameter values. We apply our methodology to a set of psychophysical experiments on human object size and distance perception to demonstrate how otherwise confounded parameters can be measured.

Classical decision-making models [2] treat observers' responses as functions of sensory measurements (decision rules). In contrast, our approach assumes the brain uses sensory measurements to compute general beliefs about the world state, which are then used to select responses. Assuming the perception and action (decision) components of behavior are distinct is a natural and common scientific practice; many experimental investigations implicitly make this assumption.
For example, measuring discrimination thresholds independently to predict a composite behavior (such as cue integration) assumes stimulus discrimination is constant regardless of task. Our framework specifies two brain processing components that lead to any psychophysical response: sensory data are used to infer a posterior distribution over world states, and then a loss function is applied to select the response that maximizes expected reward given the posterior and loss function. The relationships between sensory measurements and the posterior distribution are expressed as conditional distributions with perceptual parameters; the relationships between the posterior distribution and the action decision are conditional distributions with action parameters. The novelty of our proposal is the explicit assumption that perceptual parameters remain constant across tasks while action parameters vary. Data from multiple tasks can then be merged by controlling for specific action parameters while coupling the perceptual parameters, so that perceptual parameters can be estimated with standard methods such as maximum-likelihood estimation or a full Bayesian analysis. We apply our approach to a novel spatial perception psychophysical study to illustrate its utility in testing complex Bayesian models.

Acknowledgments

This work was supported by NIH grant R01EY015261 and an NSF Graduate Student Fellowship.

References

[1] Knill DC, Pouget A (2004) The Bayesian brain: The role of uncertainty in neural coding and computation for perception and action. Trends in Neurosciences 27(12): 712-719.
[2] Green DM, Swets JA (1966) Signal detection theory and psychophysics. New York: John Wiley.
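The joint-estimation idea described above can be illustrated with a toy simulation. The generative model below is a hypothetical stand-in (not the authors' actual spatial-perception model): in each task an observer's response equals the stimulus plus a task-specific bias (an "action" parameter) plus Gaussian sensory noise whose standard deviation (a "perceptual" parameter) is shared across tasks. Fitting both tasks jointly, with the biases free per task but the noise term pooled, recovers all three parameters, whereas either task alone would confound bias and noise with fewer constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical generative model for illustration only:
# response in task i:  r = s + b_i + eps,  eps ~ N(0, sigma_p^2)
# sigma_p  : shared perceptual parameter (constant across tasks)
# b_1, b_2 : task-specific action parameters
sigma_p, b1, b2 = 0.5, 1.0, -0.3
s1 = rng.uniform(0, 10, 2000)   # stimulus values, task 1
s2 = rng.uniform(0, 10, 2000)   # stimulus values, task 2
r1 = s1 + b1 + rng.normal(0, sigma_p, s1.size)
r2 = s2 + b2 + rng.normal(0, sigma_p, s2.size)

# Joint maximum-likelihood estimates: each bias is fit to its own task,
# but the residual noise is pooled across both tasks, coupling the
# shared perceptual parameter as the framework prescribes.
b1_hat = np.mean(r1 - s1)
b2_hat = np.mean(r2 - s2)
resid = np.concatenate([r1 - s1 - b1_hat, r2 - s2 - b2_hat])
sigma_hat = np.sqrt(np.mean(resid ** 2))

print(b1_hat, b2_hat, sigma_hat)
```

With 2000 trials per task, the pooled estimates land close to the generating values; the same coupling strategy applies when the perceptual and action parameters enter through a full posterior-plus-loss-function model rather than this additive shortcut.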