Open Access. Powered by Scholars. Published by Universities.®

Education Commons

Articles 1 - 8 of 8

Full-Text Articles in Education

PIRLS 2016 Reporting Australia's Results Data Tables [Excel File], Sue Thomson, Kylie Hillman, Marina Schmid, Sima Rodrigues, Jessica Fullarton May 2018

Kylie Hillman

This Excel file contains data pertaining to the report PIRLS 2016: Reporting Australia's results (2017). This report provides the Australian results from PIRLS 2016.


PIRLS 2016 Australian Year 4 Data [SAS], Sue Thomson, Kylie Hillman, Marina Schmid, Sima Rodrigues, Jessica Fullarton May 2018

Kylie Hillman

This dataset (zipped SAS files) is a data source for the report PIRLS 2016: Reporting Australia's results. Refer to the readme.txt file for details.


PIRLS 2016 Australian Year 4 Data [SAS], Sue Thomson, Kylie Hillman, Marina Schmid, Sima Rodrigues, Jessica Fullarton May 2018

Dr Sue Thomson

This dataset (zipped SAS files) is a data source for the report PIRLS 2016: Reporting Australia's results. Refer to the readme.txt file for details.


PIRLS 2016 Australian Year 4 Data [SPSS], Sue Thomson, Kylie Hillman, Marina Schmid, Sima Rodrigues, Jessica Fullarton May 2018

Dr Sue Thomson

This dataset (zipped SPSS files) is a data source for the report PIRLS 2016: Reporting Australia's results. Refer to the readme.txt file for details.


A Framework For Predicting Item Difficulty In Reading Tests, Tom Lumley, Alla Routitsky, Juliette Mendelovits, Dara Ramalingam Jul 2013

Dr Alla Routitsky

Results on reading tests are typically reported on scales composed of levels, each giving a statement of student achievement or proficiency. The PISA reading scales provide broad descriptions of skill levels associated with reading items, intended to communicate to policy makers and teachers about the reading proficiency of students at different levels. However, the described scales are not explicitly tied to features that predict difficulty. Difficulty is thus treated as an empirical issue, using a post hoc solution, while a priori estimates of item difficulty have tended to be unreliable. Understanding features influencing the difficulty of reading tasks has the …


A Framework For Predicting Item Difficulty In Reading Tests, Tom Lumley, Alla Routitsky, Juliette Mendelovits, Dara Ramalingam Aug 2012

Juliette Mendelovits

Results on reading tests are typically reported on scales composed of levels, each giving a statement of student achievement or proficiency. The PISA reading scales provide broad descriptions of skill levels associated with reading items, intended to communicate to policy makers and teachers about the reading proficiency of students at different levels. However, the described scales are not explicitly tied to features that predict difficulty. Difficulty is thus treated as an empirical issue, using a post hoc solution, while a priori estimates of item difficulty have tended to be unreliable. Understanding features influencing the difficulty of reading tasks has the …


A Framework For Predicting Item Difficulty In Reading Tests, Tom Lumley, Alla Routitsky, Juliette Mendelovits, Dara Ramalingam Mar 2012

Dr Tom Lumley

Results on reading tests are typically reported on scales composed of levels, each giving a statement of student achievement or proficiency. The PISA reading scales provide broad descriptions of skill levels associated with reading items, intended to communicate to policy makers and teachers about the reading proficiency of students at different levels. However, the described scales are not explicitly tied to features that predict difficulty. Difficulty is thus treated as an empirical issue, using a post hoc solution, while a priori estimates of item difficulty have tended to be unreliable. Understanding features influencing the difficulty of reading tasks has the …


Common Person Equating With The Rasch Model, Geoff Masters Dec 1984

Prof Geoff Masters AO

Two procedures, one based on item difficulties, the other based on person abilities, were used to equate 14 forms of a reading comprehension test using the Rasch model. These forms had no items in common. For practical purposes, the two procedures produced equivalent results. An advantage of common person equating for testing the unidimensionality assumption is pointed out, and the need for caution in interpreting tests of common item invariance is stressed.
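The abstract describes the person-based procedure only in general terms. As a minimal sketch of the underlying idea (not the paper's actual method), the Python fragment below shows how a common person equating constant can be obtained from the same persons' ability estimates on two separately calibrated forms; all names and numbers are illustrative assumptions.

import numpy as np

def common_person_equating_constant(theta_form_a, theta_form_b):
    # Under the Rasch model, two separately calibrated forms differ only
    # by a translation of the logit scale, so the mean difference between
    # the same persons' ability estimates serves as the equating constant.
    return float(np.mean(np.asarray(theta_form_a) - np.asarray(theta_form_b)))

# Illustrative ability estimates (logits) for the same five persons,
# obtained from two independently calibrated forms (hypothetical values).
theta_a = np.array([-1.2, -0.3, 0.1, 0.8, 1.5])
theta_b = np.array([-0.9, 0.0, 0.5, 1.1, 1.9])

shift = common_person_equating_constant(theta_a, theta_b)

# Item difficulties calibrated on Form B, translated onto Form A's scale.
b_form_b = np.array([-0.5, 0.2, 1.0])
b_form_b_on_a = b_form_b + shift

print(f"Equating constant (Form B -> Form A): {shift:.2f}")
print("Form B difficulties on Form A's scale:", b_form_b_on_a)

The same shift can be applied to person abilities estimated on Form B, placing all parameters from both forms on a single common scale.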