Open Access. Powered by Scholars. Published by Universities.®

Statistical Methodology Commons

1,101 Full-Text Articles · 1,196 Authors · 200,801 Downloads · 53 Institutions

All Articles in Statistical Methodology

1,101 full-text articles. Page 1 of 28.

Penalized Nonparametric Scalar-On-Function Regression Via Principal Coordinates, Philip T. Reiss, David L. Miller, Pei-Shien Wu, Wen-Yu Hua 2016 New York University School of Medicine

Philip T. Reiss

A number of classical approaches to nonparametric regression have recently been extended to the case of functional predictors. This paper introduces a new method of this type, which extends intermediate-rank penalized smoothing to scalar-on-function regression. The core idea is to regress the response on leading principal coordinates defined by a relevant distance among the functional predictors, while applying a ridge penalty. Our publicly available implementation, based on generalized additive modeling software, allows for fast optimal tuning parameter selection and for extensions to multiple functional predictors, exponential family-valued responses, and mixed-effects models. In an application to signature verification data, the proposed ...
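
As a rough illustration of the core idea (not the authors' implementation, which builds on generalized additive modeling software with automatic tuning), the sketch below computes principal coordinates from an L2 distance among simulated curves and ridge-regresses a scalar response on the leading coordinates; all data and settings are hypothetical.

```python
# Minimal sketch: principal coordinates + ridge for scalar-on-function regression.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n, grid = 100, np.linspace(0, 1, 50)
basis = np.vstack([np.sin(np.pi * grid), np.cos(np.pi * grid), grid])
X = rng.normal(size=(n, 3)) @ basis                       # simulated curves
y = X[:, :25].mean(axis=1) + rng.normal(scale=0.1, size=n)  # scalar response

# Distance among functional predictors (here: L2 distance on the grid).
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)

# Classical multidimensional scaling -> principal coordinates.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D**2) @ J
vals, vecs = np.linalg.eigh(B)
idx = np.argsort(vals)[::-1][:10]                         # leading 10 coordinates
coords = vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0, None))

# Ridge-penalized regression of the response on the leading coordinates.
fit = Ridge(alpha=1.0).fit(coords, y)
print(fit.score(coords, y))    # in-sample R^2 of the penalized fit
```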


Estimating Consumers' Valuation Of Organic And Cosmetically Damaged Apples, Chenyan Yue, Helen H. Jensen, Daren S. Mueller, Gail R. Nonnecke, Douglas Bonnet, Mark L. Gleason 2016 Iowa State University

Helen Jensen

The sooty blotch and flyspeck (SBFS) disease complex causes cosmetic damage but does not affect the safety or eating quality of apples. Treatment for the disease is more difficult and costly for organic producers, and consumers' willingness to pay for organic apples needs to be considered in growers' choice of production technologies. A mixed probit model was applied to survey data to evaluate consumers' willingness to buy apples. The results show consumers will pay a premium for organic production methods and for apples with low amounts of SBFS damage. Behavioral variables such as experience growing fruit significantly affect the willingness to ...
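
For readers unfamiliar with the model class, here is a minimal sketch of a plain probit fit for a binary "would buy" response, a simplified stand-in for the mixed probit the authors apply; the variable names and data are hypothetical.

```python
# Illustrative only: a plain probit model for a binary purchase decision.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
price = rng.uniform(1, 4, n)
organic = rng.integers(0, 2, n)
sbfs_damage = rng.uniform(0, 1, n)        # hypothetical fraction of surface blemished
latent = 1.0 - 0.8 * price + 0.6 * organic - 1.2 * sbfs_damage + rng.normal(size=n)
buy = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([price, organic, sbfs_damage]))
res = sm.Probit(buy, X).fit(disp=0)
print(res.params)   # a negative damage coefficient means lower willingness to buy
```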


Rao-Lovric And The Triwizard Point Null Hypothesis Tournament, Shlomo Sawilowsky 2016 Wayne State University

Journal of Modern Applied Statistical Methods

The debate over whether the point null hypothesis is ever literally true cannot be resolved, because there are three competing statistical systems claiming ownership of the construct. The local resolution depends on personal acclimatization to a Fisherian, frequentist, or Bayesian orientation (or an unexpected fourth champion if decision theory is allowed to compete). Implications of Rao and Lovric’s proposed Hodges-Lehmann paradigm are discussed in the Appendix.


Censoring Unbiased Regression Trees And Ensembles, Jon Arni Steingrimsson, Liqun Diao, Robert L. Strawderman 2016 Department of Biostatistics, Johns Hopkins Bloomberg School of Public Health

Johns Hopkins University, Dept. of Biostatistics Working Papers

This paper proposes a novel approach to building regression trees and ensemble learning in survival analysis. By first extending the theory of censoring unbiased transformations, we construct observed data estimators of full data loss functions in cases where responses can be right censored. This theory is used to construct two specific classes of methods for building regression trees and regression ensembles that respectively make use of Buckley-James and doubly robust estimating equations for a given full data risk function. For the particular case of squared error loss, we further show how to implement these algorithms using existing software (e.g ...
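
A minimal sketch of one censoring unbiased transformation, simple inverse-probability-of-censoring weighting (IPCW), which turns squared-error loss on right-censored responses into an observed-data loss that a standard regression tree can minimize through sample weights; this is not the doubly robust construction developed in the paper, and all data below are simulated.

```python
# IPCW squared-error loss fed to an off-the-shelf regression tree.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
n = 400
X = rng.uniform(size=(n, 2))
T = np.exp(1 + X[:, 0] + 0.25 * rng.normal(size=n))   # true survival times
C = rng.exponential(scale=8.0, size=n)                # censoring times
Y = np.minimum(T, C)
delta = (T <= C).astype(float)                        # 1 = event observed

def km_censoring_survival(y, d):
    """Kaplan-Meier estimate of P(C > t-), evaluated at each subject's time."""
    order = np.argsort(y)
    surv, out, at_risk = 1.0, np.empty(len(y)), len(y)
    for i in order:
        out[i] = surv                  # left limit G(y_i-) just before the time
        if d[i] == 0:                  # a censoring "event" for the C-distribution
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1
    return out

G = km_censoring_survival(Y, delta)
w = delta / np.clip(G, 0.05, None)     # IPCW weights; zero for censored rows
tree = DecisionTreeRegressor(max_depth=3).fit(X, np.log(Y), sample_weight=w)
print(tree.predict(X[:5]))
```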


Matching The Efficiency Gains Of The Logistic Regression Estimator While Avoiding Its Interpretability Problems, In Randomized Trials, Michael Rosenblum, Jon Arni Steingrimsson 2016 Johns Hopkins Bloomberg School of Public Health, Department of Biostatistics

Johns Hopkins University, Dept. of Biostatistics Working Papers

Adjusting for prognostic baseline variables can lead to improved power in randomized trials. For binary outcomes, a logistic regression estimator is commonly used for such adjustment. This has resulted in substantial efficiency gains in practice, e.g., gains equivalent to reducing the required sample size by 20-28% were observed in a recent survey of traumatic brain injury trials. Robinson and Jewell (1991) proved that the logistic regression estimator is guaranteed to have equal or better asymptotic efficiency compared to the unadjusted estimator (which ignores baseline variables). Unfortunately, the logistic regression estimator has the following dangerous vulnerabilities: it is only interpretable ...
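
The interpretability problem alluded to is non-collapsibility: even in a randomized trial with no confounding, the covariate-conditional odds ratio from a logistic model generally differs from the marginal odds ratio. A small simulated demonstration (hypothetical data, not from the paper):

```python
# Conditional vs marginal log odds ratio under randomization.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 100_000
A = rng.integers(0, 2, n)                        # randomized treatment
W = rng.normal(size=n)                           # prognostic baseline variable
p = 1 / (1 + np.exp(-(-0.5 + 1.0 * A + 2.0 * W)))  # true conditional model
Y = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([A, W]))
cond_logOR = sm.Logit(Y, X).fit(disp=0).params[1]

p1, p0 = Y[A == 1].mean(), Y[A == 0].mean()
marg_logOR = np.log(p1 / (1 - p1)) - np.log(p0 / (1 - p0))
print(cond_logOR, marg_logOR)   # ~1.0 conditionally vs a noticeably smaller marginal value
```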


A Synthesis Of Current Surveillance Planning Methods For The Sequential Monitoring Of Drug And Vaccine Adverse Effects Using Electronic Health Care Data, Jennifer C. Nelson, Robert Wellman, Onchee Yu, Andrea J. Cook, Judith C. Maro, Rita Ouellet-Hellstrom, Denise Boudreau, James S. Floyd, Susan R. Heckbert, Simone Pinheiro, Marsha Reichman, Azadeh Shoaibi 2016 Group Health Research Institute; University of Washington

eGEMs (Generating Evidence & Methods to improve patient outcomes)

Introduction: The large-scale assembly of electronic health care data combined with the use of sequential monitoring has made proactive postmarket drug- and vaccine-safety surveillance possible. Although sequential designs have been used extensively in randomized trials, less attention has been given to methods for applying them in observational electronic health care database settings.

Existing Methods: We review current sequential-surveillance planning methods from randomized trials, and the Vaccine Safety Datalink (VSD) and Mini-Sentinel Pilot projects—two national observational electronic health care database safety monitoring programs.

Future Surveillance Planning: Based on this examination, we suggest three steps for future surveillance planning in health ...
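
As a toy illustration of why surveillance plans need preset boundaries, the simulation below shows how repeatedly testing accruing data at z = 1.96 inflates the false-signal rate, while a stricter constant boundary (in the spirit of a Pocock-type plan) brings it back toward the nominal level; this is generic, not the VSD or Mini-Sentinel methodology itself.

```python
# False-signal rate of naive vs adjusted sequential monitoring boundaries.
import numpy as np

rng = np.random.default_rng(4)
looks, n_per_look, sims = 10, 100, 5_000

def hit_rate(zcrit):
    hits = 0
    for _ in range(sims):
        x = rng.normal(size=looks * n_per_look)       # no true effect
        cum = np.cumsum(x)
        ns = np.arange(1, looks + 1) * n_per_look
        z = cum[ns - 1] / np.sqrt(ns)                 # z-statistic at each look
        hits += np.any(np.abs(z) > zcrit)
    return hits / sims

print(hit_rate(1.96))   # well above the nominal 0.05
print(hit_rate(2.56))   # close to 0.05 with a stricter constant boundary
```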


Advances In Portmanteau Diagnostic Tests, Jinkun Xiao 2016 The University of Western Ontario

Electronic Thesis and Dissertation Repository

The portmanteau test serves an important role in model diagnostics for Box-Jenkins modelling procedures. A large number of portmanteau tests based on the autocorrelation function have been proposed as general-purpose goodness-of-fit tests. Since the asymptotic distributions of the statistics have a complicated form that makes it hard to obtain the p-value directly, a gamma approximation is introduced to obtain the p-value. But the approximation inevitably introduces approximation errors and needs a large number of observations to yield a good approximation. To avoid some pitfalls in the approximation, the Lin-McLeod test is further proposed to obtain a numeric solution to ...
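
A sketch of the two routes to a portmanteau p-value discussed above: the Ljung-Box statistic with its large-sample chi-squared approximation versus a Monte Carlo p-value obtained by simulating the null, in the spirit of the Lin-McLeod test. A plain white-noise null is used for simplicity; residuals of a fitted ARMA model would reduce the degrees of freedom.

```python
# Ljung-Box portmanteau statistic: chi-squared vs Monte Carlo p-values.
import numpy as np
from scipy import stats

def ljung_box(x, m):
    n = len(x)
    x = x - x.mean()
    acf = np.array([np.dot(x[:-k], x[k:]) for k in range(1, m + 1)]) / np.dot(x, x)
    return n * (n + 2) * np.sum(acf**2 / (n - np.arange(1, m + 1)))

rng = np.random.default_rng(5)
resid = rng.normal(size=200)          # stand-in for model residuals
m = 10
q = ljung_box(resid, m)

p_chi2 = stats.chi2.sf(q, df=m)       # large-sample approximation
null = np.array([ljung_box(rng.normal(size=len(resid)), m) for _ in range(2000)])
p_mc = (1 + np.sum(null >= q)) / (1 + len(null))   # Monte Carlo p-value
print(p_chi2, p_mc)
```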


Improving Precision By Adjusting For Baseline Variables In Randomized Trials With Binary Outcomes, Without Regression Model Assumptions, Jon Arni Steingrimsson, Daniel F. Hanley, Michael Rosenblum 2016 Johns Hopkins Bloomberg School of Public Health

Johns Hopkins University, Dept. of Biostatistics Working Papers

In randomized clinical trials with baseline variables that are prognostic for the primary outcome, there is potential to improve precision and reduce sample size by appropriately adjusting for these variables. A major challenge is that there are multiple statistical methods to adjust for baseline variables, but little guidance on which is best to use in a given context. The choice of method can have important consequences. For example, one commonly used method leads to uninterpretable estimates if there is any treatment effect heterogeneity, which would jeopardize the validity of trial conclusions. We give practical guidance on how to avoid this ...
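
One model-robust option of the kind at issue is standardization (g-computation): fit a working logistic model, then average its predictions with everyone set to treatment and with everyone set to control; in a randomized trial the resulting risk-difference estimator remains consistent even if the working model is misspecified. A sketch on simulated data, not the authors' exact procedure:

```python
# Standardized (marginal) risk difference from a working logistic model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 10_000
A = rng.integers(0, 2, n)                       # randomized treatment
W = rng.normal(size=n)                          # prognostic baseline variable
Y = rng.binomial(1, 1 / (1 + np.exp(-(-1 + 0.5 * A + W))))

X = sm.add_constant(np.column_stack([A, W]))
fit = sm.Logit(Y, X).fit(disp=0)

X1 = sm.add_constant(np.column_stack([np.ones(n), W]))   # everyone treated
X0 = sm.add_constant(np.column_stack([np.zeros(n), W]))  # everyone control
risk_diff = fit.predict(X1).mean() - fit.predict(X0).mean()
print(risk_diff)    # marginal risk difference, directly interpretable
```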


After Halliburton: Event Studies And Their Role In Federal Securities Fraud Litigation, Jill E. Fisch, Jonah B. Gelbach, Jonathan Klick 2016 University of Pennsylvania Law School

Jill Fisch

Event studies have become increasingly important in securities fraud litigation after the Supreme Court’s decision in Halliburton II. Litigants have used event study methodology, which empirically analyzes the relationship between the disclosure of corporate information and the issuer’s stock price, to provide evidence in the evaluation of key elements of federal securities fraud, including materiality, reliance, causation, and damages. As the use of event studies grows and they increasingly serve a gatekeeping function in determining whether litigation will proceed beyond a preliminary stage, it will be critical for courts to use them correctly.

This Article explores an array ...
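
For concreteness, here is a minimal market-model event study of the kind described: estimate the stock's relation to the market over a clean estimation window, then test whether the disclosure-date abnormal return is unusual. The returns below are simulated; real studies add careful window choice and robust inference.

```python
# Market-model event study on simulated daily returns.
import numpy as np

rng = np.random.default_rng(7)
mkt = rng.normal(0, 0.01, 251)                      # daily market returns
stock = 0.0002 + 1.2 * mkt + rng.normal(0, 0.015, 251)
stock[-1] -= 0.06                                   # a corrective disclosure on the last day

est_m, est_s = mkt[:250], stock[:250]               # estimation window
beta, alpha = np.polyfit(est_m, est_s, 1)           # market model fit
resid = est_s - (alpha + beta * est_m)

ar = stock[-1] - (alpha + beta * mkt[-1])           # event-day abnormal return
t_stat = ar / resid.std(ddof=2)
print(ar, t_stat)                                   # large negative t suggests price impact
```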


Newsvendor Models With Monte Carlo Sampling, Ijeoma W. Ekwegh 2016 East Tennessee State University

Electronic Theses and Dissertations

The newsvendor model is used in solving inventory problems in which demand is random. In this thesis, we focus on a method of using Monte Carlo sampling to estimate the order quantity that will either maximize revenue or minimize cost given that demand is uncertain. Given data, the Monte Carlo approach is used to sample over scenarios and to estimate the probability density function. A bootstrapping process yields an empirical distribution for the order quantity that maximizes the expected profit. Finally, this method will be used ...
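
A compact sketch of the Monte Carlo approach described: simulate demand scenarios, evaluate expected profit across candidate order quantities, and pick the maximizer. The prices and demand distribution here are hypothetical.

```python
# Monte Carlo newsvendor: pick the order quantity maximizing simulated profit.
import numpy as np

rng = np.random.default_rng(8)
price, cost, salvage = 5.0, 3.0, 1.0
demand = rng.gamma(shape=4.0, scale=25.0, size=10_000)   # simulated demand scenarios

def expected_profit(q, d):
    sold = np.minimum(q, d)
    return (price * sold + salvage * np.maximum(q - d, 0) - cost * q).mean()

qs = np.arange(50, 201)
profits = [expected_profit(q, demand) for q in qs]
q_star = qs[int(np.argmax(profits))]
print(q_star)   # near the critical-fractile solution F^{-1}((p - c) / (p - s))
```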


Sensitivity Of Trial Performance To Delay Outcomes, Accrual Rates, And Prognostic Variables Based On A Simulated Randomized Trial With Adaptive Enrichment, Tianchen Qian, Elizabeth Colantuoni, Aaron Fisher, Michael Rosenblum 2016 Johns Hopkins Bloomberg School of Public Health, Department of Biostatistics

Johns Hopkins University, Dept. of Biostatistics Working Papers

Adaptive enrichment designs involve rules for restricting enrollment to a subset of the population during the course of an ongoing trial. This can be used to target those who benefit from the experimental treatment. To leverage prognostic information in baseline variables and short-term outcomes, we use a semiparametric, locally efficient estimator, and investigate its strengths and limitations compared to standard estimators. Through simulation studies, we assess how sensitive the trial performance (Type I error, power, expected sample size, trial duration) is to different design characteristics. Our simulation distributions mimic features of data from the Alzheimer’s Disease Neuroimaging Initiative, and ...
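
As a toy version of this kind of performance assessment (not the paper's design or its ADNI-based simulation distributions), the sketch below simulates a two-subpopulation trial with one interim look that stops enrolling a subpopulation whose interim z-statistic is low, and tracks empirical power, type I error, and expected sample size; all design constants are hypothetical.

```python
# Toy adaptive enrichment simulation: power, type I error, expected sample size.
import numpy as np

rng = np.random.default_rng(9)
sims, n_stage = 5_000, 100

def run(effects):
    wins, n_total = 0, 0
    for _ in range(sims):
        n_tot, z = 0, []
        for e in effects:                          # stage 1: enroll both subpopulations
            x = rng.normal(e, 1, n_stage)
            z.append(x.mean() * np.sqrt(n_stage))
            n_tot += n_stage
        for k, e in enumerate(effects):            # stage 2: enrich
            if z[k] > 0.5:                         # keep enrolling this subpopulation
                x = rng.normal(e, 1, n_stage)
                z[k] = (z[k] + x.mean() * np.sqrt(n_stage)) / np.sqrt(2)
                n_tot += n_stage
        wins += max(z) > 2.4                       # multiplicity-adjusted final boundary
        n_total += n_tot
    return wins / sims, n_total / sims

print(run((0.3, 0.0)))    # (power-ish rejection rate, expected sample size)
print(run((0.0, 0.0)))    # empirical type I error under no effect anywhere
```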


Variable Selection For Estimating The Optimal Treatment Regimes In The Presence Of A Large Number Of Covariates, Baqun Zhang, Min Zhang 2016 School of Statistics, Renmin University

The University of Michigan Department of Biostatistics Working Paper Series

Most existing methods for optimal treatment regimes, with few exceptions, focus on estimation and are not designed for variable selection with the objective of optimizing treatment decisions. In clinical trials and observational studies, numerous baseline variables are often collected, and variable selection is essential for deriving reliable optimal treatment regimes. Although many variable selection methods exist, they mostly focus on selecting variables that are important for prediction (predictive variables) instead of variables that have a qualitative interaction with treatment (prescriptive variables) and hence are important for making treatment decisions. We propose a variable selection method within a general classification ...
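
To make the predictive/prescriptive distinction concrete, here is a generic screening device, not the classification-framework method proposed in the paper: penalize a model with treatment-by-covariate interactions and keep the covariates whose interaction terms survive; the data and tuning value are hypothetical.

```python
# Lasso on treatment-covariate interactions to screen for prescriptive variables.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(10)
n, p = 1_000, 20
X = rng.normal(size=(n, p))
A = rng.integers(0, 2, n)
# X[:, 0] is purely predictive; X[:, 1] interacts with treatment (prescriptive).
Y = 2 * X[:, 0] + 1.5 * A * X[:, 1] + rng.normal(size=n)

design = np.column_stack([X, A[:, None] * X])       # main effects + interactions
coef = Lasso(alpha=0.05).fit(design, Y).coef_
prescriptive = np.where(np.abs(coef[p:]) > 1e-8)[0]
print(prescriptive)   # ideally picks index 1 only; real use would tune alpha by CV
```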


Using A Data Quality Framework To Clean Data Extracted From The Electronic Health Record: A Case Study., Oliwier Dziadkowiec, Tiffany Callahan, Mustafa Ozkaynak, Blaine Reeder, John Welton 2016 University of Colorado, College of Nursing, Anschutz Medical Campus

eGEMs (Generating Evidence & Methods to improve patient outcomes)

Objectives: Examine (1) the appropriateness of using a data quality (DQ) framework developed for relational databases as a data-cleaning tool for a dataset extracted from two EPIC databases; and (2) the differences in statistical parameter estimates on a dataset cleaned with the DQ framework and dataset not cleaned with the DQ framework.

Background: The use of data contained within electronic health records (EHRs) has the potential to open doors for a new wave of innovative research. Without adequate preparation of such large datasets for analysis, the results might be erroneous, which might affect clinical decision making or results of Comparative ...


Testing Homogeneity In Semiparametric Mixture Case-Control Models, C. Z. Di, G. K. C. Chan, C. Zheng, K. Y. Liang 2016 Fred Hutchinson Cancer Research Center

Chongzhi Di

Recently, Qin and Liang (Biometrics, 2011) considered a semiparametric mixture case-control model and proposed a score test for homogeneity. The mixture model is semiparametric in the sense that the density ratio of the two distributions is assumed to be of exponential form, while the baseline density is unspecified. In a family of parametric admixture models, Di and Liang (Biometrics, 2011) showed that the likelihood ratio test statistic, which is equivalent to a supremum statistic, could improve power over score tests. We generalize the likelihood ratio or supremum statistic to the semiparametric mixture model and demonstrate the power gain over the score ...


Combined Computational-Experimental Design Of High-Temperature, High-Intensity Permanent Magnetic Alloys With Minimal Addition Of Rare-Earth Elements, Rajesh Jha 2016 Florida International University

FIU Electronic Theses and Dissertations

AlNiCo magnets are known for high-temperature stability and superior corrosion resistance and have been widely used for various applications. The reported magnetic energy density, (BH)max, for these magnets is around 10 MGOe. Theoretical calculations show that a (BH)max of 20 MGOe is achievable, which will help close the gap between AlNiCo and rare-earth-element (REE) based magnets. An extended family of AlNiCo alloys consisting of eight elements was studied in this dissertation, and hence it is important to determine the composition-property relationship between each of the alloying elements and their influence on the bulk properties.

In the present ...


Propensity Score Methods: A Simulation And Case Study Involving Breast Cancer Patients., John Craycroft 2016 University of Louisville

Electronic Theses and Dissertations

Observational data presents unique challenges for analysis that are not encountered with experimental data resulting from carefully designed randomized controlled trials. Selection bias and unbalanced treatment assignments can obscure estimations of treatment effects, making the process of causal inference from observational data highly problematic. In 1983, Paul Rosenbaum and Donald Rubin formalized an approach for analyzing observational data that adjusts treatment effect estimates for the set of non-treatment variables that are measured at baseline. The propensity score is the conditional probability of assignment to a treatment group given the covariates. Using this score, one may balance the covariates across treatment ...
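
A compact sketch of the Rosenbaum-Rubin idea on simulated observational data: estimate the propensity score with logistic regression, then use inverse-probability weighting to compare treated and untreated outcomes. This is one of several ways to use the score (matching and stratification are others), and all data and coefficients below are hypothetical.

```python
# Propensity scores via logistic regression, used for inverse-probability weighting.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 20_000
X = rng.normal(size=(n, 3))                              # baseline covariates
b = np.array([1.0, -0.5, 0.25])
A = rng.binomial(1, 1 / (1 + np.exp(-(X @ b))))          # treatment depends on X
Y = 1.0 * A + X @ np.array([2.0, 1.0, 0.0]) + rng.normal(size=n)  # true effect = 1.0

naive = Y[A == 1].mean() - Y[A == 0].mean()              # confounded comparison

e = LogisticRegression().fit(X, A).predict_proba(X)[:, 1]   # estimated propensity score
ipw = np.average(Y, weights=A / e) - np.average(Y, weights=(1 - A) / (1 - e))
print(naive, ipw)    # naive is biased; the IPW estimate is close to 1.0
```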


Stochastic Optimization Of Adaptive Enrichment Designs For Two Subpopulations, Aaron Fisher, Michael Rosenblum 2016 Johns Hopkins University Bloomberg School of Public Health

Johns Hopkins University, Dept. of Biostatistics Working Papers

An adaptive enrichment design is a randomized trial that allows enrollment criteria to be modified at interim analyses, based on preset decision rules. When there is prior uncertainty regarding treatment effect heterogeneity, these trials can provide improved power for detecting treatment effects in subpopulations. An obstacle to using these designs is that there is no general approach to determine what decision rules and other design parameters will lead to good performance for a given research problem. To address this, we present a simulated annealing approach for optimizing the parameters of an adaptive enrichment design for a given scientific application. Optimization ...
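
A bare-bones simulated annealing loop of the kind described, shown here tuning a single interim futility boundary to minimize expected sample size subject to a power constraint, with both criteria estimated by simulation; the design, constants, and objective below are hypothetical, not the paper's.

```python
# Simulated annealing over one design parameter of a two-stage futility design.
import numpy as np

rng = np.random.default_rng(12)

def performance(boundary, effect=0.3, n_stage=100, sims=2_000):
    """Return (power, expected sample size) of a two-stage design with futility stop."""
    z1 = rng.normal(effect * np.sqrt(n_stage), 1, sims)
    go = z1 > boundary                                   # continue past the interim
    z2 = (z1 + rng.normal(effect * np.sqrt(n_stage), 1, sims)) / np.sqrt(2)
    power = np.mean(go & (z2 > 1.96))
    exp_n = n_stage + n_stage * go.mean()
    return power, exp_n

def objective(b):
    power, exp_n = performance(b)
    return exp_n + 1_000 * max(0.0, 0.80 - power)        # penalize power below 80%

b, best = 0.0, objective(0.0)
for t in range(500):
    cand = b + rng.normal(scale=0.3)                     # propose a nearby boundary
    val = objective(cand)
    temp = 1.0 * (0.99 ** t)                             # cooling schedule
    if val < best or rng.random() < np.exp((best - val) / temp):
        b, best = cand, val                              # accept move
print(b, performance(b))
```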


A Weighted Instrumental Variable Estimator To Control For Instrument-Outcome Confounders, Douglas Lehmann, Yun Li, Rajiv Saran, Yi Li 2016 The University Of Michigan

The University of Michigan Department of Biostatistics Working Paper Series

No abstract provided.

