Education Commons

Educational Assessment, Evaluation, and Research
Selected Works: PISA, 2012

Full-Text Articles in Education
Articles 1 - 8 of 8

Factors That Influence The Difficulty Of Problem Solving Items, Dara Ramalingam, Ray Philpot Sep 2015

Computer-based assessment of problem solving allows both static and interactive problems to be posed. Examples of static problems are scheduling and logic puzzles, in which all relevant information is available to the solver at the outset. Interactive problems, on the other hand, require exploration of the situation to acquire additional knowledge needed to solve the problem. Examples include discovering how to use an unfamiliar mobile telephone or automatic vending machine. This study used data from the 2011 Field Trial of the PISA 2012 computer-based assessment of problem solving, which comprised 34 static and 45 interactive …


An Exploratory Analysis Of The Talis And Pisa Link Data: An Investigation Of The Possible Relationships, Frances Eveleigh, Chris Freeman Aug 2012

This paper reports a preliminary investigation of the PISA field trial data combined with the TALIS data from the same pool of schools. Exploratory analyses of the data, using correlation, ANOVA and MANOVA, and multi-level modelling techniques, are proposed to identify plausible relationships and explained variation within the data. This investigation will inform the types of analyses that may be performed on the main study data being collected in mid to late 2012.
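As a rough sketch of the kind of exploratory analysis described above, the Python fragment below runs a correlation, a one-way ANOVA and a simple two-level model on a hypothetical linked file. The file and variable names (talis_pisa_link.csv, pisa_students.csv, talis_self_efficacy, pisa_math_mean, school_sector, school_id) are invented for illustration and are not the actual TALIS-PISA link data or the authors' analysis code.

    # Illustrative only: hypothetical file and variable names, not the real
    # TALIS-PISA link dataset or the analyses reported in the paper.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical school-level file linking a TALIS teacher-survey index
    # to mean PISA mathematics performance for the same schools.
    schools = pd.read_csv("talis_pisa_link.csv")

    # Correlation between the TALIS index and mean PISA performance.
    print(schools[["talis_self_efficacy", "pisa_math_mean"]].corr())

    # One-way ANOVA: does mean performance differ across school sectors?
    anova_fit = smf.ols("pisa_math_mean ~ C(school_sector)", data=schools).fit()
    print(sm.stats.anova_lm(anova_fit, typ=2))

    # Two-level model (students nested in schools) with the TALIS index
    # as a school-level predictor, using a hypothetical student-level file.
    students = pd.read_csv("pisa_students.csv").merge(schools, on="school_id")
    mlm_fit = smf.mixedlm("pisa_math ~ talis_self_efficacy",
                          data=students, groups=students["school_id"]).fit()
    print(mlm_fit.summary())

A MANOVA could be run in the same spirit (for example with statsmodels' MANOVA class) when several PISA domains are treated as joint outcomes.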


A Framework For Predicting Item Difficulty In Reading Tests, Tom Lumley, Alla Routitsky, Juliette Mendelovits, Dara Ramalingam Aug 2012

Results on reading tests are typically reported on scales composed of levels, each giving a statement of student achievement or proficiency. The PISA reading scales provide broad descriptions of skill levels associated with reading items, intended to communicate to policy makers and teachers about the reading proficiency of students at different levels. However, the described scales are not explicitly tied to features that predict difficulty. Difficulty is thus treated as an empirical issue, using a post hoc solution, while a priori estimates of item difficulty have tended to be unreliable. Understanding features influencing the difficulty of reading tasks has the …
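The framework itself is not reproduced in the abstract, but the general approach of tying described scales to difficulty-predicting features can be pictured as a regression of empirical item difficulty on coded task features. The sketch below is a generic illustration using invented feature names (text_length, inference_level, text_format); it is not the authors' framework or coding scheme.

    # Generic illustration only: invented item features, not the framework
    # or coding scheme proposed in the paper.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical table: one row per reading item, with its empirical
    # (scaled) difficulty and a set of a priori coded task features.
    items = pd.read_csv("reading_item_features.csv")

    # Regress empirical difficulty on the coded features to see how much
    # of the variation in difficulty the features account for.
    fit = smf.ols("difficulty ~ text_length + inference_level + C(text_format)",
                  data=items).fit()
    print(fit.summary())                 # one coefficient per feature
    print("R-squared:", fit.rsquared)    # share of difficulty variance explained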


Print And Digital Reading In Pisa 2009: Comparison And Contrast, Juliette Mendelovits, Dara Ramalingam, Tom Lumley Aug 2012

PISA was administered for the fourth time in 2009. Since in each administration, one of reading, maths or science is chosen as the major domain, the 2009 survey marked the first time that a domain (in this case, reading) was revisited as the major focus of the assessment. This allowed a full review of the framework for reading literacy and the inclusion of new elements to reflect the way that reading has changed since 2000 (OECD, 2009). One such change is the increasing prevalence of digital texts. The assessment of digital reading in the PISA 2009 cycle, undertaken by 19 …


Some Drivers Of Test Item Difficulty In Mathematics, Ross Turner Aug 2012

This paper is one of four contributions to the symposium session at the AERA’s 2012 Annual Meeting titled Exploring Reading and Mathematics Item Difficulty: Teaching and Learning Implications of PISA Survey Data. The author presents a rubric used to analyse mathematics test items developed for use in the OECD’s PISA survey. The rubric focuses on a set of mathematical competencies that are components of mathematical literacy. The work on which this report is based suggests that demand for activation of these competencies functions as a significant driver of item difficulty, which potentially has implications for the teaching and learning of …


How Well Do Young People Deal With Contradictory And Unreliable Information On Line? What The Pisa Digital Reading Assessment Tells Us, Tom Lumley, Juliette Mendelovits Jul 2012

There is sometimes an assumption that young people, as ‘digital natives’, are able to use online information effectively, including selecting and negotiating digital texts that are not only relevant to their needs but also likely to provide reliable information. This paper examines how well young people are in fact able to recognise whether information is likely to be trustworthy. While some small-scale work has been done in this area, this paper draws on data from the first large-scale international assessment of online reading, the Digital Reading Assessment (DRA) that was part of the Organisation for …


Evaluation Of Booklet Effect In Pisa Mathematics Across Countries, Luc Le Jun 2012

In PISA 2003, 13 main linked test booklets were constructed using an item cluster rotation design. Each student was randomly assigned one of the test booklets. Each of the selected clusters was located in four of the test booklets, in different positions. There were seven mathematics clusters with a total of 85 items. A booklet effect was identified and used to adjust the measure of student performance in PISA (see the PISA 2003 Technical Report; OECD, 2004). This study was designed to explore which factors could account for this booklet effect in mathematics across 41 PISA countries. A three-faceted partial credit model …
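To make the rotation idea concrete, the sketch below builds a simplified cyclic rotation in which 13 clusters are placed into 13 four-cluster booklets so that each cluster appears in four booklets, once in each position, and each student is randomly assigned one booklet. This is a generic illustration, not the actual PISA 2003 booklet layout or the three-faceted model used in the study.

    # Simplified illustration of a cluster rotation design; not the actual
    # PISA 2003 booklet layout or the scaling model used in the study.
    import random

    clusters = [f"C{i}" for i in range(1, 14)]  # 13 item clusters, generic labels
    n_booklets = 13
    clusters_per_booklet = 4

    # Cyclic rotation: booklet b contains clusters b, b+1, b+2, b+3 (mod 13),
    # so every cluster appears in four booklets, once in each position.
    booklets = {
        b + 1: [clusters[(b + p) % len(clusters)]
                for p in range(clusters_per_booklet)]
        for b in range(n_booklets)
    }
    for booklet, content in booklets.items():
        print(f"Booklet {booklet}: {content}")

    # As in PISA, each sampled student is randomly assigned one booklet.
    students = [f"student_{i}" for i in range(1, 6)]
    assignment = {s: random.randrange(1, n_booklets + 1) for s in students}
    print(assignment)

A booklet (or position) effect can then be examined by comparing a cluster's difficulty estimates across the booklets and positions in which it appears.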


A Framework For Predicting Item Difficulty In Reading Tests, Tom Lumley, Alla Routitsky, Juliette Mendelovits, Dara Ramalingam Mar 2012

Results on reading tests are typically reported on scales composed of levels, each giving a statement of student achievement or proficiency. The PISA reading scales provide broad descriptions of skill levels associated with reading items, intended to communicate to policy makers and teachers about the reading proficiency of students at different levels. However, the described scales are not explicitly tied to features that predict difficulty. Difficulty is thus treated as an empirical issue, using a post hoc solution, while a priori estimates of item difficulty have tended to be unreliable. Understanding features influencing the difficulty of reading tasks has the …