Open Access. Powered by Scholars. Published by Universities.®
Articles 1 - 8 of 8
Full-Text Articles in Education
Cross-Cultural Adaptation Of The Inventory Of Reading Occupations-Adult Into Filipino And Its Content Validation, Peñafrancia E. Ching, Treisha Naedine H. Santos, Lenin Grajo, Maria Concepcion Cabatan, Anna Liza Y. Tan Pascual
The Open Journal of Occupational Therapy
Background: Adult functional literacy ensures adequate and safe engagement in daily activities. It is assessed through the Inventory of Reading Occupations-Adult (IRO-A). The instrument underwent translation with cultural adaptation and content validation to ensure relevance to the Filipino context.
Method: The translation and cultural adaptation of the IRO-A to Filipino (Fil IRO-A) was guided by the processes proposed in two international guidelines for cross-cultural adaptation, which involve (a) forward translation and synthesis, (b) back translation, and (c) pre-panel review of the adaptation to the Filipino context. The Fil IRO-A also underwent content validation by seven experts. Item and …
Assessing Differential Item Functioning And Differential Test Functioning In An Academic Motivation Scale Using Item Response Theory Methods, Gerald J. Bean
International Journal of School Social Work
Social work researchers and practitioners who use measurement instruments to make data-informed decisions need to ensure those decisions are based on items and scales that are free from possible bias or undesirable differential functioning. In this study, we provide an example of how a set of Item Response Theory (IRT) statistical methods and tools can be used by social work measurement researchers to assess differential item functioning (DIF) and differential test functioning (DTF). For the example, we explored the possible race, gender, and family composition differential functioning of a scale—the Academic Motivation Scale (AMS)—developed for use by school social workers. The data …
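The abstract above uses IRT-based methods; a simpler, non-IRT procedure often used as a first screen for uniform DIF is the Mantel-Haenszel common odds ratio, computed across strata of examinees matched on total score. The sketch below uses entirely made-up counts (not data from the study) to show the arithmetic:

```python
# Mantel-Haenszel common odds ratio for uniform DIF screening.
# Hypothetical counts; each stratum is a matched total-score level,
# given as (ref_correct, ref_incorrect, focal_correct, focal_incorrect).
strata = [
    (8, 2, 4, 6),
    (6, 4, 3, 7),
]

def mantel_haenszel_or(strata):
    """MH common odds ratio pooled across 2x2 tables."""
    num = sum(rc * fi / (rc + ri + fc + fi) for rc, ri, fc, fi in strata)
    den = sum(ri * fc / (rc + ri + fc + fi) for rc, ri, fc, fi in strata)
    return num / den

odds_ratio = mantel_haenszel_or(strata)
print(odds_ratio)  # values far from 1.0 suggest uniform DIF on this item
```

An odds ratio near 1.0 indicates comparable item performance for matched reference and focal examinees; here the hypothetical item clearly favors the reference group.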
An Evaluation Of The Factor Structure And Internal Consistency Of The ‘Conceptions Of Learning’ And ‘Preferences For Teaching’ Measures In American Occupational Therapy Students, Tore Bonsaksen, Adele Breen-Franklin
Journal of Occupational Therapy Education
When planning to use measurement scales in new samples and contexts, examining the scales’ psychometric properties is an important initial step. This study examined the factor structure and internal consistency of two measures that are part of the Approaches and Study Skills Inventory for Students (ASSIST) – the Conceptions of learning and Preferences for teaching and courses – in a sample of American occupational therapy students. The students (n = 115) completed the measures and provided basic sociodemographic information. Scale structure was examined with Principal Components Analysis (PCA), while consistency between scale items was assessed with mean inter-item correlations. …
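A components analysis of the kind described above typically decomposes the inter-item correlation matrix and retains components by an eigenvalue rule. A minimal sketch with hypothetical item scores (not the ASSIST data) using the Kaiser criterion:

```python
import numpy as np

# Hypothetical item scores (4 respondents x 3 items) for illustration.
scores = np.array([
    [2, 1, 3],
    [4, 3, 3],
    [4, 5, 5],
    [6, 7, 5],
], dtype=float)

corr = np.corrcoef(scores, rowvar=False)           # 3x3 inter-item correlations
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # eigenvalues, descending

# Kaiser criterion: retain components with eigenvalue > 1.
n_components = int(np.sum(eigvals > 1.0))
print(eigvals, n_components)
```

With these strongly correlated toy items, a single component dominates; real scale analyses also inspect loadings and scree plots before settling on a structure.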
Assessing Online Viewing Practices Among College Students, Elizabeth J. Threadgill, Larry R. Price
Journal of Media Literacy Education
This article focuses on media literacy education for college students. First, we conducted psychometric analyses to verify the properties of the Critical Evaluation and Analysis of Media (CEAM) scale. CEAM measures college students’ self-reported practices for critically evaluating and analyzing the credibility, audience, and technical design elements of online media, such as news, advertisement, and entertainment media. Using CEAM, our second goal was to identify trends in critical viewing practices among first-year students enrolled in college. Results of confirmatory factor analysis (CFA) and item response theory (IRT) supported a three-factor structure for the CEAM scale. Composite score reliability for all …
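The IRT analyses mentioned above model the probability of endorsing an item as a function of a latent trait. A minimal sketch of a two-parameter logistic (2PL) item response function, with hypothetical parameter values:

```python
import math

def irf_2pl(theta, a, b):
    """2PL item response function: P(endorse | theta) for an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the probability is exactly 0.5, regardless of a;
# higher a makes the curve steeper around b.
p = irf_2pl(theta=0.0, a=1.2, b=0.0)
print(p)  # 0.5
```

Fitting such a model to real scale data estimates a and b per item; this sketch only shows the functional form the estimation targets.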
The Short Assist Scales: Measurement Properties In A Sample Of Occupational Therapy Students In The Usa, Tore Bonsaksen, Adele Breen-Franklin
Journal of Occupational Therapy Education
Shortening measurement scales can improve the scales’ feasibility, but at the same time, their measurement properties can be affected. This study investigated psychometric properties of the short Approaches and Study Skills Inventory for Students (ASSIST) among occupational therapy students in the United States. The students (n = 120) completed the ASSIST and provided basic socio-demographic and education-related information. Scale structure was examined with Principal Components Analysis (PCA), while consistency between scale items was assessed with Cronbach’s α and inter-item correlations. Three factors were confirmed, but three items showed poor or ambiguous fit with the proposed scales. These items were …
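Cronbach's α and mean inter-item correlation, the consistency indices named above, are straightforward to compute. A self-contained sketch with hypothetical item scores (not the study's data):

```python
from itertools import combinations
from statistics import variance, mean

# Hypothetical item scores (4 respondents x 3 items), for illustration only.
items = [
    [2, 4, 4, 6],   # item 1: one score per respondent
    [1, 3, 5, 7],   # item 2
    [3, 3, 5, 5],   # item 3
]

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(items)
    item_vars = sum(variance(it) for it in items)
    totals = [sum(vals) for vals in zip(*items)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

alpha = cronbach_alpha(items)
mean_r = mean(pearson(x, y) for x, y in combinations(items, 2))
print(round(alpha, 3), round(mean_r, 3))
```

Both statistics rise when items covary strongly; shortening a scale removes items and so tends to lower α even when mean inter-item correlation holds steady, which is why studies like the one above check both.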
The Trouble With Test Banks, Harvey Richman, Molly Hrezo
Perspectives In Learning
We compared the psychometrics of quiz questions randomly selected from a test bank with the psychometrics of quiz questions the instructor had selected from the bank for quality and modified (if necessary). On multiple psychometric indices, the instructor-selected/modified questions were superior to questions randomly selected from the test bank. Most notably, when compared with instructor-selected/modified questions, randomly selected bank questions were nearly 6.5 times more likely to contain a distractor that drew more responses than the correct answer. Details and implications are discussed.
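The flaw highlighted above, a distractor outdrawing the keyed answer, is easy to screen for in a distractor analysis. A minimal sketch over hypothetical response data (option chosen per student, per item):

```python
from collections import Counter

# Hypothetical answer key and responses, for illustration only.
answer_key = {"Q1": "A", "Q2": "B", "Q3": "C"}
responses = {
    "Q1": ["A", "A", "A", "A", "A", "B"],
    "Q2": ["D", "D", "D", "D", "B", "B"],
    "Q3": ["C", "C", "C", "A", "A", "A"],
}

def flawed_items(answer_key, responses):
    """Flag items where any distractor draws strictly more responses
    than the keyed answer."""
    flagged = []
    for item, key in answer_key.items():
        counts = Counter(responses[item])
        correct = counts[key]
        if any(n > correct for opt, n in counts.items() if opt != key):
            flagged.append(item)
    return flagged

flagged = flawed_items(answer_key, responses)
print(flagged)  # ['Q2']
```

Here Q2's distractor D outdraws the keyed answer B, exactly the pathology the study found to be far more common among unvetted bank questions.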
Development And Validation Of The Motivation For Tutoring Questionnaire In Problem-Based Learning Programs, Salah Eldin Kassab, Nahla Hassan, Shimaa El-Araby, Abdel Halim Salem, Saleh Ali Alrebish, Ahmed S. Al-Amro, Hani A. Al-Shobaili, Hossam Hamdy
Health Professions Education
Purpose: There are no published instruments that measure tutor motivation for conducting small-group tutorials in problem-based learning programs. Therefore, we aimed to develop a motivation for tutoring questionnaire in problem-based learning (MTQ-PBL) and evaluate its construct validity.
Methods: The questionnaire included 28 items representing four constructs: tutoring self-efficacy (15 items), tutoring interest (6 items), tutoring value (4 items), and tutoring effort (3 items). Tutors (n = 158) from three problem-based medical schools in Egypt, Saudi Arabia and Bahrain rated their perceptions for each item on a 7-point Likert scale. Statistical analyses included examining the factor structure of the questionnaire, the differences …
Quantitative Literacy Assessments: An Introduction To Testing Tests, Dorothy Wallace, Kim Rheinlander, Steven Woloshin, Lisa Schwartz
Numeracy
This paper describes how professional evaluators construct assessment instruments that work properly to measure the right thing. Constructing an assessment tool begins with getting feedback from relevant experts on the content of questions. The tool is developed and refined through comparison with existing instruments, focus groups and cognitive interviews. The final instrument is formally tested for content validity, usability, reliability and construct validity through a variety of statistical measures. This process of construction is illustrated by two examples relevant to quantitative literacy: the Medical Data Interpretation Test and the Math Attitudes Survey.