Nudge to Health: Harnessing Decision Research to Promote Health Behavior

Meng Li (University of Colorado Denver) and Gretchen B. Chapman (Rutgers University)

Abstract

We review selected literature that examines how biases in decision making can be utilized to encourage individual health behaviors, such as vaccination, and to promote policy decisions, such as resource allocation. These studies use simple interventions to nudge people toward more optimal health decisions without restricting decision makers' freedom of choice. Examples include framing effects, defaults, implementation intentions, position effects, social norms, incentives, and emotions. We suggest that nudges are an effective way to promote healthy behavior.

Research on health behavior indicates that, too often, individuals fail to do what is good for them, by engaging in risky behavior and neglecting to take preventive measures. Meanwhile, research on decision making indicates that decision makers are often irrational, or biased, in the sense that their behavior deviates systematically from normative principles that would maximize their goal satisfaction, or utility (Baron, 2000; Baron, 2004). It may seem that humanity is fated to be both unhealthy and irrational. However, in the current paper we explore whether decision biases can be exploited to make us healthier. That is, we investigate whether psychological research on decision making can be harnessed to improve health behavior and health outcomes. We review selected recent research that addresses this question.

Human beings are not perfect decision makers. Too often, they use mental shortcuts or rules of thumb instead of conducting a comprehensive cost-benefit analysis to arrive at a decision.
Our tendency to conserve cognitive effort means that the majority of our daily decisions are made using System 1 thinking (fast, automatic, based on intuition, gut feelings, rules, and heuristics), from which route to drive to work and what we put in our morning coffee to what to order for lunch or dinner. In contrast, complicated decisions usually elicit System 2 thinking (slow, systematic, deliberative, based on reason and calculation), such as which stock will perform better and what kind of mortgage loan offers the best financial outcome (Kahneman, 2003).

The field of judgment and decision making has traditionally examined biases and errors, which are common by-products of our reliance on System 1 thinking. A well-known example of such biases and errors is the framing effect: two normatively equivalent descriptions of the same decision often lead to systematically different choices. In Tversky and Kahneman's (1981) classic demonstration, participants were presented with a description of the outbreak of a new Asian disease that was expected to kill 600 people. Two strategies for combating the disease were described. For participants in the gain frame condition, the strategies were framed in terms of lives saved (save 200 for sure versus save 600 with a 1/3 probability and none with a 2/3 probability). For participants in the loss frame condition, the strategies were framed in terms of lives lost (400 die for

Social and Personality Psychology Compass 7/3 (2013): 187–198, 10.1111/spc3.12019 © 2013 Blackwell Publishing Ltd