Revisiting and Extending the Item Difficulty Effect Model
Abstract
Data collected by learning environments and online courses contain many potentially useful features, but traditionally most of these are ignored when modeling students. One feature that merits further examination is item difficulty. In their KT-IDEM model, Pardos and Heffernan proposed using question templates to differentiate guess and slip rates in knowledge tracing based on the difficulty of the template; here, we examine extensions and variations of that model. We propose two new models that differentiate by template: one in which the learn rate is differentiated, and another in which the learn, guess, and slip parameters all depend on template. We compare these two new models to knowledge tracing and KT-IDEM. We also propose a generalization of KT-IDEM in which, rather than differentiating by individual template, we differentiate between multiple choice and short answer questions, and we compare this model to traditional knowledge tracing and KT-IDEM. We evaluate these models using data from ASSISTments, an open online learning environment used in many middle and high school classrooms throughout the United States.
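The per-template idea described above can be sketched as a small variation on the standard Bayesian Knowledge Tracing update. The sketch below is illustrative only: the parameter values and template names are hypothetical assumptions, not fitted values from the paper or from ASSISTments data, and the update equations are the standard BKT ones rather than the authors' exact implementation.

```python
# Hedged sketch: Bayesian Knowledge Tracing with per-template guess/slip,
# in the spirit of KT-IDEM. Values below are illustrative, not fitted.

def bkt_update(p_know, correct, guess, slip, learn):
    """One BKT step: Bayesian posterior on mastery given the observed
    response, followed by the learning transition."""
    if correct:
        num = p_know * (1.0 - slip)
        den = num + (1.0 - p_know) * guess
    else:
        num = p_know * slip
        den = num + (1.0 - p_know) * (1.0 - guess)
    posterior = num / den
    return posterior + (1.0 - posterior) * learn

# KT-IDEM-style differentiation: each question template carries its own
# guess/slip pair (here, a coarse multiple-choice vs. short-answer split,
# as in the generalization discussed in the abstract).
templates = {
    "multiple_choice": {"guess": 0.25, "slip": 0.05},  # higher guess rate
    "short_answer":    {"guess": 0.05, "slip": 0.10},  # lower guess rate
}

p = 0.3  # hypothetical prior probability that the skill is known
for template, correct in [("multiple_choice", True),
                          ("short_answer", True),
                          ("short_answer", False)]:
    g = templates[template]["guess"]
    s = templates[template]["slip"]
    p = bkt_update(p, correct, g, s, learn=0.1)
```

A model that also differentiates the learn rate, as in the first proposed extension, would simply look up `learn` from the same per-template table.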
BibTeX
@inproceedings{Schultz-2013-126570,
  author    = {Sarah Schultz and Trenton Tabor},
  title     = {Revisiting and Extending the Item Difficulty Effect Model},
  booktitle = {Proceedings of AIED '13 1st Workshop on Massive Open Online Courses (MOOC '13)},
  year      = {2013},
  month     = jun,
  pages     = {33--40},
}