Journal of Modern Applied Statistical Methods

Meta-analysis

Articles 1 - 10 of 10

Full-Text Articles in Social and Behavioral Sciences

Longitudinal Stability Of Effect Sizes In Education Research, Joshua Stephens Nov 2016

Journal of Modern Applied Statistical Methods

Educators use meta-analyses to identify best practices. It has been suggested that effect sizes have declined over time due to various biases. This study applies an established methodological framework to educational meta-analyses and finds that effect sizes have increased from 1970 to the present. Potential causes for this phenomenon are discussed.


Graphing Effects As Fuzzy Numbers In Meta-Analysis, Christopher G. Thompson May 2016

Journal of Modern Applied Statistical Methods

Prior to quantitative analyses, meta-analysts often explore descriptive characteristics of effect sizes. A graphic is proposed that treats effect sizes as fuzzy numbers. This plot can provide meta-analysts with information such as the heterogeneity of effects, the precision of estimates, possible clusters, and the existence of outliers.
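
As a rough illustration of the idea (not necessarily Thompson's exact construction), the sketch below treats each effect size as a triangular fuzzy number whose peak is the point estimate and whose support spans the 95% confidence interval, then plots the membership functions with matplotlib. The effect sizes and standard errors are invented example values.

    # Hypothetical sketch: plot effect sizes as triangular fuzzy numbers.
    # Peak = point estimate; support = 95% CI. Example data are invented.
    import numpy as np
    import matplotlib.pyplot as plt

    effects = np.array([0.20, 0.35, 0.10, 0.55, 0.30])   # point estimates (example values)
    ses = np.array([0.08, 0.12, 0.05, 0.20, 0.10])       # standard errors (example values)

    fig, ax = plt.subplots()
    for d, se in zip(effects, ses):
        lo, hi = d - 1.96 * se, d + 1.96 * se             # 95% CI as the fuzzy support
        ax.plot([lo, d, hi], [0.0, 1.0, 0.0], alpha=0.7)  # triangular membership function

    ax.set_xlabel("Effect size")
    ax.set_ylabel("Membership")
    ax.set_title("Effect sizes as triangular fuzzy numbers (illustrative)")
    plt.show()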


An Evaluation Of Multiple Imputation For Meta-Analytic Structural Equation Modeling, Carolyn F. Furlow, S. Natasha Beretvas May 2010

Journal of Modern Applied Statistical Methods

A simulation study was used to evaluate multiple imputation (MI) for handling correlations that are missing completely at random (MCAR) in the first step of meta-analytic structural equation modeling: the synthesis of the correlation matrix and the test of homogeneity. No substantial parameter bias resulted from using MI. Although some SE bias was found for meta-analyses involving smaller numbers of studies, the homogeneity test was never rejected when using MI.
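
The following sketch shows one generic way such a setup might look: study-level correlations with MCAR gaps are multiply imputed, and each completed data set is then reduced to a synthesized correlation vector. It uses scikit-learn's IterativeImputer as a stand-in and a simple unweighted average in place of the usual sample-size-weighted synthesis; it is not the specific MI procedure evaluated by Furlow and Beretvas, and all data are simulated for illustration.

    # Illustrative sketch only: multiply impute MCAR study-level correlations,
    # then pool each imputed data set into a synthesized (mean) correlation vector.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)
    n_studies, n_corrs = 20, 3                   # e.g. 3 unique correlations per study
    corrs = rng.normal([0.3, 0.4, 0.2], 0.05, size=(n_studies, n_corrs))

    mcar_mask = rng.random(corrs.shape) < 0.2    # delete ~20% of entries completely at random
    observed = np.where(mcar_mask, np.nan, corrs)

    m = 5                                        # number of imputations
    pooled = []
    for seed in range(m):
        imputer = IterativeImputer(sample_posterior=True, random_state=seed)
        completed = imputer.fit_transform(observed)
        # unweighted average across studies stands in for the n-weighted synthesis
        pooled.append(completed.mean(axis=0))

    print("Synthesized correlation vector:", np.mean(pooled, axis=0))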


Estimation Of The Standardized Mean Difference For Repeated Measures Designs, Lindsey J. Wolff Smith, S. Natasha Beretvas Nov 2009

Journal of Modern Applied Statistical Methods

This simulation study modified the repeated measures standardized mean difference effect size, d_RM, for scenarios with unequal pre- and post-test score variances. Relative parameter and SE bias were calculated for the modified estimator, d_RM≠, versus the conventional d_RM=, which assumes equal variances. Results consistently favored d_RM≠ over d_RM=, with worse positive parameter bias and negative SE bias identified for d_RM= under increasingly heterogeneous variance conditions.
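
To make the contrast concrete, here is a hedged sketch of two repeated measures standardized mean differences: one that pools the pre- and post-test variances (and so assumes they are equal) and one that standardizes by the pre-test SD only. These are generic textbook-style estimators used for illustration, not necessarily the exact d_RM= and d_RM≠ formulas studied in the article.

    # Illustrative contrast between an equal-variance and an unequal-variance
    # standardized mean difference for a pre/post (repeated measures) design.
    import numpy as np

    def d_pooled(pre, post):
        """Standardize by the pooled pre/post SD (assumes equal variances)."""
        s_pooled = np.sqrt((np.var(pre, ddof=1) + np.var(post, ddof=1)) / 2)
        return (np.mean(post) - np.mean(pre)) / s_pooled

    def d_pre(pre, post):
        """Standardize by the pre-test SD only (no equal-variance assumption)."""
        return (np.mean(post) - np.mean(pre)) / np.std(pre, ddof=1)

    rng = np.random.default_rng(1)
    pre = rng.normal(50, 10, size=200)
    post = pre + rng.normal(5, 20, size=200)     # post-test variance is much larger

    print("pooled-SD d:", round(d_pooled(pre, post), 3))
    print("pre-SD d:   ", round(d_pre(pre, post), 3))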


Measuring Overall Heterogeneity In Meta-Analyses: Application To CSF Biomarker Studies In Alzheimer's Disease, Chengjie Xiong, Feng Gao, Yan Yan, Jingqin Luo, Yunju Sung, Gang Shi May 2008

Journal of Modern Applied Statistical Methods

The interpretation of statistical inferences from meta-analyses depends on the degree of heterogeneity in the meta-analyses. Several new indices of heterogeneity in meta-analyses are proposed, and their variation is assessed through a large simulation study. The proposed methods are applied to CSF biomarkers of Alzheimer's disease.
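
For reference, the most familiar overall heterogeneity measures against which new indices are typically compared are Cochran's Q and Higgins and Thompson's I^2; the sketch below computes both from effect estimates and their variances using invented numbers. The article's newly proposed indices are not reproduced here.

    # Standard overall heterogeneity measures (Cochran's Q and I^2) as a baseline.
    import numpy as np

    def q_and_i2(effects, variances):
        effects = np.asarray(effects, dtype=float)
        w = 1.0 / np.asarray(variances, dtype=float)     # inverse-variance weights
        pooled = np.sum(w * effects) / np.sum(w)         # fixed-effect pooled estimate
        q = np.sum(w * (effects - pooled) ** 2)          # Cochran's Q
        df = len(effects) - 1
        i2 = max(0.0, (q - df) / q) if q > 0 else 0.0    # share of variation beyond chance
        return q, i2

    # Example values (invented): standardized biomarker differences and their variances
    q, i2 = q_and_i2([0.40, 0.55, 0.20, 0.80, 0.35], [0.02, 0.04, 0.03, 0.05, 0.02])
    print(f"Q = {q:.2f}, I^2 = {100 * i2:.1f}%")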


Meta-Analysis Of Results And Individual Patient Data In Epidemiological Studies, Aurelio Tobías, Marc Saez, Manolis Kogevinas May 2004

Journal of Modern Applied Statistical Methods

Epidemiological information can be aggregated by combining results through a meta-analysis technique, or by pooling and analyzing primary data. Common approaches to analyzing pooled studies are described through an example of the effect of occupational exposure to wood dust on sinonasal cancer. Results were combined by applying a meta-analysis technique. Alternatively, primary data from all studies were pooled and re-analyzed using mixed-effects models. Combining individual information rather than results is desirable because it facilitates the interpretation of epidemiological findings and also leads to more precise estimates and more powerful statistical tests for study heterogeneity.
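
The "combining results" arm of that comparison usually amounts to an inverse-variance weighted average of study-level log odds ratios or relative risks; a minimal sketch is below, using invented numbers rather than the wood dust data. The pooled-data arm would instead fit a mixed-effects model to the individual records (e.g. with statsmodels) and is not shown.

    # Minimal fixed-effect meta-analysis of study-level results
    # (inverse-variance weighting of log odds ratios); numbers are invented.
    import numpy as np

    log_or = np.log([1.8, 2.4, 1.3, 3.1])        # study odds ratios (example values)
    se = np.array([0.30, 0.25, 0.40, 0.35])      # standard errors of the log odds ratios

    w = 1.0 / se**2
    pooled_log_or = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))

    lo, hi = pooled_log_or - 1.96 * pooled_se, pooled_log_or + 1.96 * pooled_se
    print(f"Pooled OR = {np.exp(pooled_log_or):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")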


JMASM9: Converting Kendall's Tau For Correlational Or Meta-Analytic Analyses, David A. Walker Nov 2003

Journal of Modern Applied Statistical Methods

Expanding on past research, this study provides researchers with a detailed table for use in meta-analytic applications involving various r-related statistics, such as Kendall's tau (τ) and Cohen's d, that estimate the magnitude of an experimental or observational effect. A program to convert the lesser-used tau coefficient to other effect size indices for correlational or meta-analytic analyses is presented.
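
Walker's table is not reproduced here, but one commonly cited route for this kind of conversion is Greiner's relation, r = sin(πτ/2), followed by the usual r-to-d formula, d = 2r/√(1 − r²). The sketch below implements that route as an illustration and should not be taken as the article's exact procedure.

    # Illustrative tau -> r -> d conversion using Greiner's relation and the
    # standard r-to-d formula; not necessarily the article's exact procedure.
    import math

    def tau_to_r(tau):
        """Approximate Pearson r from Kendall's tau (Greiner's relation)."""
        return math.sin(math.pi * tau / 2)

    def r_to_d(r):
        """Convert a correlation to Cohen's d."""
        return 2 * r / math.sqrt(1 - r**2)

    for tau in (0.1, 0.2, 0.3, 0.4):
        r = tau_to_r(tau)
        print(f"tau = {tau:.1f} -> r = {r:.3f} -> d = {r_to_d(r):.3f}")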


Correcting Publication Bias In Meta-Analysis: A Truncation Approach, Guillermo Montes, Bohdan S. Lotyczewski Nov 2003

Journal of Modern Applied Statistical Methods

Meta-analyses are increasingly used to support national policy decision making. The practical implications of publication bias in meta-analysis are discussed. Standard approaches to correcting for publication bias require knowledge of the selection mechanism that leads to publication. In this study, an alternative approach is proposed based on Cohen's corrections for a truncated normal distribution. The approach makes fewer assumptions, is easy to implement, and performs well in simulations with small samples. The approach is illustrated with two published meta-analyses.
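
In broad strokes, a truncation-based correction treats the published effects as draws from a normal distribution truncated below at some publication threshold and recovers the untruncated mean and SD. The sketch below does this by maximum likelihood with scipy, using simulated effects and an assumed, known truncation point; it is a generic illustration, not Montes and Lotyczewski's exact correction.

    # Generic illustration of a truncated-normal correction for publication bias:
    # published effects are modeled as normal draws truncated below at a threshold,
    # and the untruncated mean/SD are recovered by maximum likelihood.
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(2)
    true_mu, true_sigma, cutoff = 0.10, 0.30, 0.15        # assumed values for the demo
    effects = rng.normal(true_mu, true_sigma, size=5000)
    published = effects[effects > cutoff]                  # only "large" effects get published

    def neg_loglik(params):
        mu, sigma = params
        if sigma <= 0:
            return np.inf
        a = (cutoff - mu) / sigma                          # standardized lower truncation point
        return -np.sum(stats.truncnorm.logpdf(published, a, np.inf, loc=mu, scale=sigma))

    result = optimize.minimize(neg_loglik, x0=[np.mean(published), np.std(published)],
                               method="Nelder-Mead")
    mu_hat, sigma_hat = result.x
    print(f"naive mean of published effects: {np.mean(published):.3f}")
    print(f"truncation-corrected mean: {mu_hat:.3f} (true {true_mu})")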


You Think You've Got Trivials?, Shlomo S. Sawilowsky May 2003

Journal of Modern Applied Statistical Methods

Effect sizes are important for power analysis and meta-analysis. This has led to a debate on reporting effect sizes for studies that are not statistically significant. Contrary and supportive evidence has been offered on the basis of Monte Carlo methods. In this article, clarifications are given regarding what should be simulated to determine the possible effects of piecemeal publishing of trivial effect sizes.


Trivials: The Birth, Sale, And Final Production Of Meta-Analysis, Shlomo S. Sawilowsky May 2003

Journal of Modern Applied Statistical Methods

The structure of the first invited debate in JMASM is to present a target article (Sawilowsky, 2003), provide an opportunity for a response (Roberts & Henson, 2003), and follow with independent comments from noted scholars in the field (Knapp, 2003; Levin & Robinson, 2003). In this rejoinder, I provide a correction and a clarification in an effort to bring some closure to the debate. The intention, however, is not to rehash previously made points, even where I disagree with the response of Roberts & Henson (2003).