This paper was published in the International Council on Systems Engineering (INCOSE) publication Insight in the December 2018 edition, Volume 21, Issue 4.

Cognitive Bias: A Game Changer for Decision Management?

Scott Jackson, PhD (jackson@burnhamsystems.net)

There are developments in the world of psychology that may affect the way decision management is viewed and practiced. Some of them are not intuitive. What set these developments in motion was the awarding of Nobel prizes to the psychologist Daniel Kahneman and the behavioral economist Richard Thaler. Kahneman’s surprising finding, described in (Kahneman 2011, 411), was that people in general have a limited ability to make good decisions. This finding raised a few eyebrows. (Thaler and Sunstein 2008) followed up with the conclusion that people can be influenced to make better decisions. So how does all of this fit with the (INCOSE 2015, 110) statement that decisions should be “objective”? Of course, it is a lofty goal that all decisions should be objective, but is that always possible, and if not, how can they then be made objective when our brains are not wired to do so?

Rationality and Cognitive Bias. According to (Kahneman 2011, 411), the test of whether a person is rational is not whether the person’s beliefs are reasonable but “whether they are internally consistent.” Thus even rational decisions may lead to undesirable results. The culprit in these decisions is a psychological phenomenon called cognitive bias, a term which is almost unknown in the systems engineering world. According to (Haselton, Nettle, and Andrews 2005, 2), a cognitive bias represents a situation in which “human cognition reliably produces representations that are systematically distorted compared to some aspect of objective reality.” Haselton et al.’s definition does not imply that people ignore facts; rather, they may misinterpret the facts due to stress or preconceived ideas. In popular culture, these preconceived ideas are often called mindsets.
Hence, cognitive bias is a mental state that prevents us from making better decisions. There are hundreds of cognitive biases in the literature. One of the better known is the confirmation bias, which according to (Kahneman 2011, 81) “favors the uncritical acceptance of suggestions and exaggeration of the likelihood of extreme and improbable events.”

The Systems Engineering View. So why should systems engineers be interested in cognitive bias, and how should they treat it if they find it? First of all, according to (INCOSE 2015, 110-112), decision management is one of the principal processes in systems engineering. Decisions are important at every stage of the system life cycle, including design, development, and operation. So if any of these decisions are influenced by cognitive bias, do we have a good, or even adequate, system? Decisions involve architecture and trade-offs. Finally, decisions occur at all stages of operation, for example, the decision to launch a spacecraft or to take off in an airplane. These decisions involve circumstances such as the environment and whether there are other aircraft on the runway. So if these decisions are made, for example, too early or too late, what might be the consequences? (Kahneman 2011) and (Thaler and Sunstein 2008) provide many examples of everyday decisions, such as whether to go on a diet or what to pay for a suit. But in the systems engineering world, decisions may be of greater consequence, such as launching a spacecraft. So can it be assumed that the same cognitive biases that apply to going on a diet also apply to launching a spacecraft? It is reasonable to assume that they do.