Educational Assessment, Evaluation, and Research

James Madison University

Keyword: IRT

Articles 1 - 3 of 3

Full-Text Articles in Education

Considerations in S-χ2: Rest Score or Summed Score, Priors, and Violations of Normality, Christine E. DeMars, Derek Sauder (Apr 2019)

Department of Graduate Psychology - Faculty Scholarship

The S-χ2 item fit index is one of the few item fit indices that appears to maintain accurate Type I error rates. This study explored grouping examinees by the rest score or summed score, prior distributions for the item parameters, and the shape of the ability distribution. Type I error was slightly closer to the nominal level for the total-score S-χ2 for the longest tests, but power was higher for the rest-score S-χ2 in every condition where power was < 1. Prior distributions reduced the proportion of estimates with extreme standard errors but slightly inflated the Type I error rates in some conditions. When the ability distribution was not normal, integrating over an empirically estimated distribution yielded Type I error rates closer to the nominal value than integrating over a normal distribution.
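
For readers unfamiliar with the statistic, the sketch below illustrates the core of the Orlando–Thissen S-χ2 computation for a dichotomous 2PL item: group examinees by rest score, compute the model-expected proportion correct per score group by integrating the Lord–Wingersky summed-score recursion over the ability distribution, and form a Pearson-type statistic. This is a minimal sketch, not the study's code: the helper names are hypothetical, sparse score groups are not collapsed as in the operational procedure, and the prior-distribution variations the abstract examines are omitted.

```python
import numpy as np
from scipy.stats import norm

def lord_wingersky(probs):
    """P(summed score = s) for independent dichotomous items with success
    probabilities `probs`, all evaluated at one fixed ability value."""
    f = np.array([1.0])
    for p in probs:
        g = np.zeros(f.size + 1)
        g[:-1] += f * (1.0 - p)   # item answered incorrectly: score unchanged
        g[1:] += f * p            # item answered correctly: score + 1
        f = g
    return f

def s_x2_rest(item, a, b, responses, n_quad=61):
    """Rest-score S-X2 for one 2PL item (hypothetical helper; simplified:
    no collapsing of sparse score groups, standard-normal ability assumed)."""
    theta = np.linspace(-4.0, 4.0, n_quad)
    w = norm.pdf(theta)
    w /= w.sum()                                   # normal quadrature weights
    P = 1.0 / (1.0 + np.exp(-a[:, None] * (theta - b[:, None])))
    rest = np.delete(np.arange(a.size), item)
    # f_rest[s, q] = P(rest score = s | theta_q), via the recursion above
    f_rest = np.column_stack([lord_wingersky(P[rest, q]) for q in range(n_quad)])
    # model-expected P(correct on the studied item | rest score = s)
    E = (f_rest * P[item] * w).sum(axis=1) / (f_rest * w).sum(axis=1)
    r = responses[:, rest].sum(axis=1)             # observed rest scores
    stat = 0.0
    for s in range(E.size):
        n_s = np.sum(r == s)
        if n_s == 0:
            continue
        O = responses[r == s, item].mean()         # observed proportion correct
        stat += n_s * (O - E[s]) ** 2 / (E[s] * (1.0 - E[s]))
    return stat  # refer to chi-square with (#nonempty groups - 2) df for a 2PL

# Usage with simulated 2PL data (hypothetical parameter values)
rng = np.random.default_rng(0)
a_true, b_true = rng.lognormal(0, 0.3, 20), rng.normal(0, 1, 20)
theta_true = rng.normal(0, 1, 2000)
p = 1 / (1 + np.exp(-a_true * (theta_true[:, None] - b_true)))
X = (rng.random(p.shape) < p).astype(int)
print(s_x2_rest(0, a_true, b_true, X))
```

Integrating over an empirically estimated ability distribution, as the abstract recommends for nonnormal populations, amounts to replacing the normal quadrature weights `w` with estimated ones.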


Multilevel IRT: When Is Local Independence Violated?, Christine E. DeMars, Jessica Jacovidis (Apr 2016)

Department of Graduate Psychology - Faculty Scholarship

Calibration data are often collected within schools. This illustration shows that random school effects on ability do not bias IRT parameter estimates or their standard errors. However, random school effects on item difficulty bias the item discrimination estimates and inflate the standard errors for difficulty and ability.
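
As a concrete illustration of the two conditions the abstract contrasts, the hypothetical simulation below generates clustered 2PL responses with either a school-level random effect on ability (which leaves local independence given ability intact) or school-specific shifts in item difficulty (which induce within-school item dependence that a single-level calibration cannot absorb). All names and parameter values are illustrative, not the study's design.

```python
import numpy as np

rng = np.random.default_rng(7)
n_schools, per_school, n_items = 50, 40, 30
a = rng.lognormal(0.0, 0.3, n_items)   # true discriminations
b = rng.normal(0.0, 1.0, n_items)      # true difficulties

def simulate(school_sd_theta=0.0, school_sd_b=0.0):
    """2PL responses with optional school-level random effects.

    school_sd_theta: SD of a school intercept added to every student's
        ability (the condition the abstract reports as harmless).
    school_sd_b: SD of school-specific shifts in each item's difficulty
        (the condition the abstract reports as biasing discrimination).
    Returns a (n_schools, per_school, n_items) array of 0/1 responses.
    """
    u_theta = rng.normal(0.0, school_sd_theta, n_schools)
    u_b = rng.normal(0.0, school_sd_b, (n_schools, n_items))
    theta = rng.normal(0.0, 1.0, (n_schools, per_school)) + u_theta[:, None]
    logits = a * (theta[..., None] - (b + u_b[:, None, :]))
    return (rng.random(logits.shape) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

X_ability_effects = simulate(school_sd_theta=0.5)   # local independence holds given theta
X_difficulty_effects = simulate(school_sd_b=0.5)    # local independence violated within schools
```

Calibrating each data set with an ordinary single-level 2PL (for example, with the R package mirt) and comparing the recovered parameters to `a` and `b` reproduces the pattern the abstract describes.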


Equating Multiple Forms of a Competency Test: An Item Response Theory Approach, Christine E. DeMars (Jun 2002)

Department of Graduate Psychology - Faculty Scholarship

A competency test was developed to assess students' skills in using electronic library resources. Because all students were required to pass the test and had multiple opportunities to do so, multiple test forms were needed. Standards had been set on the original form, and minor differences in form difficulty needed to be taken into account. Each student was randomly administered one of six new test forms; each form contained the original items plus 12 pilot items that differed across forms. The pilot items were then calibrated to the metric of the original items and incorporated in two additional operational …
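
The visible portion of the abstract does not say which linking method was used, but mean/sigma linking on the common (original) items is one standard way to place pilot-item estimates on the base-form metric. The sketch below, with made-up anchor values, shows the idea.

```python
import numpy as np

def mean_sigma_link(b_new, b_old):
    """Mean/sigma linking constants for placing a new 2PL calibration on the
    old metric: theta_old = A * theta_new + B. The constants come from the
    anchor items' difficulty estimates; a-parameters transform as a / A."""
    A = np.std(b_old, ddof=1) / np.std(b_new, ddof=1)
    B = np.mean(b_old) - A * np.mean(b_new)
    return A, B

# Hypothetical difficulty estimates for the common items from each calibration
b_old = np.array([-1.2, -0.4, 0.1, 0.8, 1.5])   # original-form calibration
b_new = np.array([-1.5, -0.6, -0.1, 0.7, 1.3])  # new-form calibration
A, B = mean_sigma_link(b_new, b_old)

# Transform the pilot items' estimates onto the original metric
b_pilot = A * np.array([-0.9, 0.3, 1.1]) + B
a_pilot = np.array([1.1, 0.8, 1.4]) / A
```

Fixed-anchor calibration, in which the original items' parameters are held at their established values while only the pilot items are estimated, is an equally common alternative that yields the target metric directly.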