Education Commons

Educational Assessment, Evaluation, and Research

Dr Luc Tu Le

Rasch model

Articles 1 - 3 of 3

Full-Text Articles in Education

Reduce Randomly Guessing Effects In A University Generic Skills Test, Luc T. Le Jun 2018

There is frequent concern about guessing on multiple-choice questions (MCQs) in item response theory (IRT), particularly with the Rasch model, where a guessing parameter is not included. A practical solution is applied post hoc by removing responses to items that are too hard for a low-ability examinee (see Andrich et al., 2011; RUMM 2030, Andrich et al., 2012; Winsteps, Linacre, 2012). This study uses data from the Special Tertiary Admissions Test (STAT) to explore whether such responses from lower-ability candidates to a hard item in a real test can be considered random guessing, and how Rasch item difficulty estimates improve …
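
As a rough sketch of the tailoring idea described in this abstract (not the authors' actual procedure; the 2-logit cut-off, variable names, and simulated data are assumptions for illustration), the snippet below recodes a response as missing when an item's Rasch difficulty exceeds the candidate's ability estimate by more than a chosen threshold, so that such likely guesses no longer contribute to difficulty estimation.

import numpy as np

def rasch_prob(theta, b):
    """Probability of a correct response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def tailor_responses(resp, theta, b, cutoff=2.0):
    """Recode responses as missing (np.nan) when the item is far too hard for the
    candidate (difficulty exceeds ability by more than `cutoff` logits); at a
    2-logit gap the Rasch model success probability is only about 0.12."""
    resp = resp.astype(float).copy()
    too_hard = (b[np.newaxis, :] - theta[:, np.newaxis]) > cutoff  # persons x items
    resp[too_hard] = np.nan
    return resp

# Illustrative data: 1000 candidates, 30 items (assumed values, not STAT data)
rng = np.random.default_rng(0)
theta = rng.normal(0.0, 1.0, size=1000)        # candidate abilities
b = np.linspace(-2.0, 3.0, 30)                 # item difficulties
p = rasch_prob(theta[:, None], b[None, :])
resp = (rng.random(p.shape) < p).astype(int)   # simulated 0/1 responses

tailored = tailor_responses(resp, theta, b, cutoff=2.0)
print("responses recoded as missing:", int(np.isnan(tailored).sum()))

Item difficulties would then be re-estimated from the tailored data and compared with the original estimates.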


Evaluation Of Item Parameter Recovery Estimation By Acer Conquest Software, Luc T. Le, Ray Adams Aug 2014

ACER ConQuest (Adams, Wu, and Wilson, 2012) has been widely used for analysing testing and assessment data. Two of the most common estimation methods for Rasch measurement models (Rasch, 1960/1980) are available in this software: marginal maximum likelihood estimation (MML) and joint maximum likelihood estimation (JML). This study is concerned with item parameter recovery for the dichotomous Rasch model. Our primary focus is on comparing JML and MML when the assumptions of MML are violated, that is, when the abilities are not sampled from the distribution assumed in the estimation.
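
A minimal sketch of the kind of setup the abstract describes (sample size, item count, and the skewed ability distribution are assumptions for illustration, not the study's design): generate dichotomous Rasch data whose abilities violate the normality assumption typically made by MML, then calibrate the file with JML and MML and compare the estimates against the generating difficulties.

import numpy as np

rng = np.random.default_rng(42)

n_persons, n_items = 2000, 40                      # assumed simulation sizes
b_true = np.linspace(-2.5, 2.5, n_items)           # generating item difficulties

# Violate the MML assumption: abilities from a standardised chi-square (skewed)
# distribution rather than the normal distribution assumed in estimation.
theta = rng.chisquare(df=3, size=n_persons)
theta = (theta - theta.mean()) / theta.std()

# Dichotomous Rasch responses: P(X = 1) = exp(theta - b) / (1 + exp(theta - b))
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b_true[None, :])))
data = (rng.random(p.shape) < p).astype(int)

# Write a fixed-width 0/1 response file that calibration software can read.
np.savetxt("rasch_sim.dat", data, fmt="%d", delimiter="")

The resulting file would then be analysed with JML and MML (for example in ACER ConQuest), and the recovered difficulties compared with b_true to assess parameter recovery under the violated assumption.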


Investigating Item Difficulty Change By Item Positions Under The Rasch Model, Luc T. Le, Van Nguyen Apr 2011

In operational testing programs using item response theory (IRT), item parameter invariance is one of the essential requirements in common-item equating designs. In practice, however, the stability of item parameters can be affected by many factors. In particular, this study used data from the large-scale Graduate Skills Assessment (GSA) in 2010 to investigate changes in Rasch item difficulty by item position. The test included 78 multiple-choice items and was presented in eight test forms by arranging the items in different orders. The items addressed three components of generic skills: Critical Thinking, Problem Solving and Interpersonal Understandings. Each test form was randomly …
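
As a simple illustration of checking difficulty change by position (a toy setup with assumed values, not the GSA analysis), the sketch below compares a crude logit-difficulty estimate for the same item when it appears early versus late in a form; a systematic shift across positions would signal a threat to parameter invariance.

import numpy as np

def logit_difficulty(scores):
    """Crude item difficulty as the negative logit of the proportion correct
    (a rough stand-in for a full Rasch calibration)."""
    p = np.clip(np.mean(scores), 0.01, 0.99)
    return -np.log(p / (1.0 - p))

# Assumed layout: 0/1 scores on one item from candidates who saw it early in
# their form and from candidates who saw it late (values are illustrative).
rng = np.random.default_rng(1)
scores_early = (rng.random(500) < 0.70).astype(int)   # item placed early
scores_late = (rng.random(500) < 0.62).astype(int)    # same item placed late

d_early = logit_difficulty(scores_early)
d_late = logit_difficulty(scores_late)
print(f"difficulty early: {d_early:.2f} logits, late: {d_late:.2f} logits, "
      f"shift: {d_late - d_early:+.2f}")

In a full analysis the per-position difficulty estimates would come from Rasch calibrations of each form, with the positional drift examined across all 78 items.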