Question Generation for Adaptive Assessment for Student Knowledge Modeling in Probabilistic Domains

Nabila Khodeir 1, Nayer Wanas 1, Nevin Darwish 2, and Nadia Hegazy 1

1 Informatics Dept., Electronics Research Institute, Tahrir St., Giza, Egypt
2 Dept. of Computer Engineering, Faculty of Engineering, Cairo University, Giza, Egypt
nabila khodier@yahoo.com, (nwanas, ndarwish)@ieee.org, nhegazy@mcit.gov.eg

Abstract. In this paper a question generation approach for adaptive assessment is proposed to estimate the student knowledge model in a probabilistic domain within an intelligent tutoring system. Assessing questions are generated adaptively according to the student's knowledge based on two factors: (i) the misconceptions entailed in the student knowledge model, and (ii) maximization of the information gain. The student model is updated and verified by matching the student's answers to the assessing questions against the model's answers. A comparison between using the adapted questions and random questions is investigated. Results suggest that utilizing adaptively generated questions increases the approximation accuracy of the student model by 40%, in addition to decreasing the number of required assessing questions by 50%, compared to using fixed questions.

Key words: Student Modeling, Adaptive Assessment, Item Response Theory

1 Introduction

Most adaptive tutoring systems that model student knowledge deal with deterministic domain models. However, in the real world a degree of uncertainty is inherent, which requires the use of probabilistic models to represent such domains. Bayesian networks (BN) are amongst the most widely used approaches to represent such domains [1]. Suebnukam et al. presented an example of modeling student knowledge in probabilistic domains [2].
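The two selection factors named in the abstract can be illustrated with a minimal sketch. The sketch below is hypothetical and not the paper's actual network: it models a single concept as a two-node Bayesian network (hidden mastery node, observed answer node), updates the mastery belief with Bayes' rule, and selects the next question by maximizing the expected information gain over that belief. All probabilities are assumed for illustration.

```python
import math

# Hedged sketch: a two-node Bayesian network for one concept.
# "knows" is hidden; "correct" is the observed answer. The conditional
# probabilities below are illustrative, not taken from the paper.

def posterior(prior, p_correct_knows, p_correct_not, answered_correctly):
    """Bayes' rule: updated belief that the student knows the concept."""
    lk = p_correct_knows if answered_correctly else 1.0 - p_correct_knows
    ln = p_correct_not if answered_correctly else 1.0 - p_correct_not
    return lk * prior / (lk * prior + ln * (1.0 - prior))

def entropy(p):
    """Binary entropy (bits) of the mastery belief."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def expected_info_gain(prior, p_correct_knows, p_correct_not):
    """Expected entropy reduction from asking one question."""
    p_correct = p_correct_knows * prior + p_correct_not * (1.0 - prior)
    h_after = (p_correct * entropy(posterior(prior, p_correct_knows,
                                             p_correct_not, True))
               + (1.0 - p_correct) * entropy(posterior(prior, p_correct_knows,
                                                       p_correct_not, False)))
    return entropy(prior) - h_after

# Candidate questions: (label, P(correct | knows), P(correct | doesn't know)).
questions = [("too easy", 0.99, 0.95),
             ("well matched", 0.90, 0.15),
             ("too hard", 0.10, 0.05)]

belief = 0.5  # uninformative prior over mastery
best = max(questions, key=lambda q: expected_info_gain(belief, q[1], q[2]))
# The most discriminative question carries the most information, so it is
# asked first; the belief is then updated with the observed answer.
belief = posterior(belief, best[1], best[2], answered_correctly=True)
```

Under these assumed numbers the "well matched" question is selected, since questions that nearly everyone (or no one) answers correctly reveal little about mastery, and a correct answer raises the belief from 0.5 to about 0.86. The paper's setting is richer (full domain networks and misconception modeling); this sketch only shows the information-gain criterion on a single node.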
In their study, a modeling algorithm is suggested that focuses on the skill of reasoning through domain-variable relations around practical patient problems in medical domains. This work suggests a student knowledge modeling algorithm that utilizes diagnostic skill. Using diagnostic questions allows inferring not only the student's knowledge but also the student's misconceptions, while requiring fewer questions.

Generally, the student knowledge modeling process depends on analyzing the student's responses to assessing questions. Recently, trends in Intelligent Tutoring Systems (ITS) utilize Computer Adaptive Testing (CAT) technology in building the student model [3,4]. CAT technology considers the student's knowledge and the question difficulty level to adapt the selection of the next assessing question. This aims to decrease the number of required assessing questions in addition to increasing