Pergamon
Expert Systems With Applications, Vol. 9, No. 1, pp. 27-34, 1995
Copyright © 1995 Elsevier Science Ltd
Automated Knowledge Acquisition by Reasoning Failures
YOUNG-TACK PARK
School of Computing, SoongSil University, Seoul, South Korea
Abstract—Automated knowledge acquisition is viewed as a problem in modeling a knowledge
engineer's introspective capabilities. We formulate a computational model of automated knowledge
acquisition by modeling such introspective debugging actions. We propose that an automated
knowledge acquisition system should be provided with an explicit model of performance-failure
explanation mechanisms, and show that linking the expectations of the knowledge-based system to
the model enables the knowledge acquisition program to determine what parts of the domain
knowledge base are responsible for observed performance failures. The knowledge acquisition
process is failure driven and is guided by the explanations of failures. Generating explanations of
bugs in the knowledge base is perceived as an abductive as well as a model-based process. We use
an explanation apprentice that analyzes the faulty behavior of the knowledge-based system and
answers a broad range of questions from the knowledge acquisition system. A training-and-test
experiment using the knowledge acquisition system increases the performance of the knowledge-based
system on the training and testing sets by 31% and 51%, respectively.
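The failure-driven cycle described above (run the knowledge-based system on a case, abductively explain each performance failure, and repair the responsible pieces of domain knowledge) can be illustrated with a toy sketch. All function names, the rule representation, and the naive "specialize the offending rule" repair below are hypothetical illustrations for exposition, not the paper's actual system:

```python
# Toy sketch of a failure-driven knowledge-acquisition loop.
# Rules are (condition, conclusion) pairs; the repair strategy shown
# (specializing a rule so it no longer fires on the failing case) is a
# deliberately simple stand-in for the paper's explanation-guided repair.

def run_kbs(rules, case):
    """Toy knowledge-based system: fire every rule whose condition holds."""
    conclusions = set()
    for condition, conclusion in rules:
        if condition(case):
            conclusions.add(conclusion)
    return conclusions

def blame_rules(rules, case, wrong):
    """Abductive step: collect the rules whose firing explains each
    erroneous conclusion on this case."""
    return [(cond, concl) for cond, concl in rules
            if concl in wrong and cond(case)]

def acquire(rules, training):
    """Failure-driven loop: run each case, explain the failures, and
    repair the responsible rules."""
    rules = list(rules)
    for case, expected in training:
        wrong = run_kbs(rules, case) - expected      # performance failures
        for bad in blame_rules(rules, case, wrong):
            rules.remove(bad)
            cond, concl = bad
            # Naive repair: specialize the rule so it stops firing here.
            rules.append((lambda c, cond=cond, case=case:
                          cond(c) and c != case, concl))
    return rules

# Example: an over-general rule ("every bird flies") is specialized
# after the penguin case exposes the failure.
rules = [(lambda c: c["bird"], "flies")]
training = [({"bird": True, "name": "penguin"}, set())]
refined = acquire(rules, training)
print(run_kbs(refined, {"bird": True, "name": "penguin"}))  # set()
```

The point of the sketch is only the control structure: acquisition is triggered by observed failures, and the explanation of each failure determines which part of the knowledge base is edited.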
1. INTRODUCTION
KNOWLEDGE ACQUISITION is the process of eliciting the
expertise of domain experts and of formulating the
expertise into a representation that can be used by a
knowledge-based system. Knowledge engineers who
build the knowledge-based system, as middlemen, me-
diate the interaction between the domain experts and
the knowledge-based system to operationalize the ex-
pertise of the domain experts. A goal of automated
knowledge acquisition research, computer-mediated
knowledge acquisition, has been to replace the knowl-
edge engineer with an intelligent assistant so that do-
main experts can enter their knowledge directly into
knowledge-based systems (Davis, 1979; Eshelman &
McDermott, 1987; Marcus, 1988; Porter, Bareiss,
& Holte, 1990). Consequently, automated knowledge
acquisition is perceived as a problem in knowledge
transfer.
The view of knowledge acquisition as knowledge
transfer, however, does not account for the cre-
ative aspects of the knowledge acquisition process
(Clancey, 1989; Gruber, 1989; Musen, 1989). Knowl-
edge engineers have creative expertise in debugging
knowledge-based systems. Knowledge acquisition is a
knowledge-intensive task that requires integrating
knowledge about the problem-solving model with
generic debugging expertise.

(Requests for reprints should be sent to Young-Tack Park, School of
Computing, SoongSil University, Seoul, South Korea.)

Recent knowledge-based
systems model the reasoning of domain experts. We
extend this concept to automated knowledge acquisi-
tion. In this context, the automated knowledge acquisi-
tion system is viewed as a model of the knowledge
engineer's reasoning when debugging a knowledge-
based system.
When using the model of a knowledge engineer's
debugging expertise, the knowledge acquisition pro-
gram can insulate domain experts from details of the
structure and representation of the knowledge-based
system. Domain experts need not familiarize them-
selves with complex artificial intelligence concepts.
Knowledge acquisition programs without such a model
assume the domain experts can understand the details
of knowledge-based systems, such as backward chain-
ing (Davis, 1979) or constraint satisfaction (Marcus,
1988). These assumptions have constituted a major
stumbling block in knowledge acquisition.
Our apprenticeship learning research (Wilkins,
1988) has demonstrated that expert systems can model
the reasoning of a physician doing medical diagnosis.
ODYSSEUS (Wilkins, 1988) is a knowledge acquisi-
tion system that refines an incomplete knowledge base
by modeling an expert's problem solving and ex-
ploiting explicit strategy knowledge. We
extend this method to model the knowledge engineer's
reasoning when debugging knowledge-based systems.