Submitted for presentation at the International Conference on Intelligent User Interfaces. Redondo Beach, CA: January 5-8, 1999.

"Tasking" Interfaces for Flexible Interaction with Automation: Keeping the Operator in Control

Christopher A. Miller, Michael Pelican and Robert Goldman
Honeywell Technology Center
3660 Technology Dr.
Minneapolis, MN 55418 U.S.A.
cmiller, pelican, goldman@htc.honeywell.com

[Figure 1a. Conceptual view of the relationship between system adaptiveness, human workload and unpredictability. A triangle with vertices Adaptiveness, Workload and Unpredictability: an increased human management role yields increased human workload; an increased automation management role yields increased unpredictability to the human.]

ABSTRACT
The ongoing debate in the HCI community between direct manipulation and intelligent, automated agents points to a fundamental problem in complex systems. Humans want to remain in charge even if they don't want to (or can't) make every action and decision themselves. We have been exploring a middle road through the development of "tasking interfaces"—interfaces which share a task model with a projective planning system to enable human operators to flexibly "call plays" (that is, stipulate plans) at various levels of abstraction, leaving the remainder of the plan to be fleshed out by the planning system. The result is akin to 'tasking' a knowledgeable subordinate to whom one can give more or less detailed instructions. We describe a prototype tasking interface for developing mission plans for Uninhabited Combat Air Vehicles (UCAVs).
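The tasking idea described above can be sketched in miniature: a task model shared by operator and planner, where the operator may stipulate a decomposition ("call a play") at any level of abstraction and the planner fills in the rest with defaults. This is a minimal illustrative sketch only; all task names and the `expand` function are hypothetical, not the authors' implementation.

```python
# Hypothetical shared task model: each abstract task maps to a
# default decomposition into subtasks (HTN-style methods).
TASK_MODEL = {
    "strike_mission": ["ingress", "attack", "egress"],
    "ingress": ["fly_route", "evade_threats"],
    "attack": ["acquire_target", "release_weapon"],
    "egress": ["fly_route_home"],
}

def expand(task, stipulations):
    """Expand a task into primitive steps. Where the operator has
    stipulated a decomposition, use it; otherwise fall back to the
    task model's default. Tasks with no decomposition are primitive."""
    subtasks = stipulations.get(task, TASK_MODEL.get(task))
    if subtasks is None:          # primitive task: emit as-is
        return [task]
    plan = []
    for sub in subtasks:
        plan.extend(expand(sub, stipulations))
    return plan

# The operator tasks the system at the top level, overriding only the
# "attack" step with a more detailed play; the planner fleshes out the rest.
plan = expand("strike_mission",
              {"attack": ["acquire_target", "lase_target", "release_weapon"]})
print(plan)
```

With no stipulations at all, `expand("strike_mission", {})` yields the planner's fully default plan; the more detail the operator supplies, the less the planner decides on its own, which is the workload/unpredictability trade the paper develops.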
Keywords
Tasking Interface, Mixed Initiative Planning and Control, Task Model, Hierarchical Task Network Planning, Uninhabited Combat Air Vehicles

INTRODUCTION
As systems become more complex, there is increasing temptation to control them via "automation" [e.g., 1]—either in the form of subsystems which fully perform the task, or as 'decision aids' which provide guidance but leave the final execution to the human. In spite of extensive technological achievements in automation technology (as well as some spectacular failures), experience has consistently shown that advanced automation suffers from a basic sociological problem: human operators of complex systems want to remain in charge. For example, in developing the Rotorcraft Pilot's Associate Cockpit Information Manager [10], we worked with multiple pilots and designers to develop a consensus list of prioritized goals for a "good" cockpit configuration manager. Two of the top three items on the list were "Pilot remains in charge of task allocation" and "Pilot remains in charge of information presented."

There are good reasons to design and use systems at the higher levels of automation. By definition [17], such systems share responsibility, authority and autonomy over many work behaviors with human operator(s) to accomplish their goals of reducing operator workload and information overload. While operators may wish to remain in charge, and it is critical that they do so, today's complex systems no longer permit them to be fully in charge of all system operations—at least not in the same way as in earlier cockpits and workstations.

Conceptually, the problem can be presented as in Figure 1a. This figure shows the adaptiveness of a human-machine system as a function of the workload or unpredictability it causes for the human operator.
This view implies that for any increase in adaptiveness (the ability of the human-machine system to perform in an appropriate, context-dependent manner across situations) there must be an accompanying increase in one or both of the other two legs of the triangle. Either human workload (the amount of physical, attentional or cognitive "energy" the human must exert to use the system) or unpredictability (the inability of the human to know what the automation will do at any given time) must increase.

Since adaptiveness is generally the goal of added complexity (though systems can be complex without achieving it), this is equivalent to saying that any increase in human-machine system complexity must affect the human operator in one of two ways—either (1) the added complexity must be fully controlled by the human, resulting in increases in workload,