Open Access. Powered by Scholars. Published by Universities.®

Statistical Methodology Commons

1,115 Full-Text Articles 1,508 Authors 603,045 Downloads 125 Institutions

All Articles in Statistical Methodology

1,115 full-text articles. Page 33 of 38.

Depicting Estimates Using The Intercept In Meta-Regression Models: The Moving Constant Technique, Blair T. Johnson Dr., Tania B. Huedo-Medina Dr. 2011 University of Connecticut - Storrs

CHIP Documents

In any scientific discipline, the ability to portray research patterns graphically often aids greatly in interpreting a phenomenon. In part to depict phenomena, the statistics and capabilities of meta-analytic models have grown increasingly sophisticated. Accordingly, this article details how to move the constant in weighted meta-analysis regression models (viz. “meta-regression”) to illuminate the patterns in such models across a range of complexities. Although it is commonly ignored in practice, the constant (or intercept) in such models can be indispensable when it is not relegated to its usual static role. The moving constant technique makes possible estimates and confidence intervals at …
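
As a rough sketch of the moving constant idea (illustrative only, not code from the article): re-centering the moderator at a chosen value x0 makes the intercept of the weighted meta-regression equal the predicted effect size at x0, so an estimate and confidence interval at x0 can be read directly off the intercept. The simulated data, variable names, and the use of statsmodels WLS are assumptions for illustration.

```python
# Minimal sketch of the "moving constant" idea for a weighted meta-regression:
# re-center the moderator at x0 so the intercept becomes the predicted effect
# size at x0. Data and names are illustrative; note that WLS estimates a
# residual scale, whereas a dedicated fixed-effect meta-analysis routine would
# fix it, so the intervals here are only illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
k = 30                                            # number of studies
x = rng.uniform(0, 10, size=k)                    # study-level moderator
v = rng.uniform(0.02, 0.2, size=k)                # within-study variances
d = 0.2 + 0.05 * x + rng.normal(0, np.sqrt(v))    # observed effect sizes

def effect_at(x0):
    """Predicted effect and 95% CI at moderator value x0, via the intercept."""
    X = sm.add_constant(x - x0)                   # intercept now refers to x = x0
    fit = sm.WLS(d, X, weights=1.0 / v).fit()
    est = fit.params[0]
    lo, hi = fit.conf_int()[0]
    return est, lo, hi

for x0 in (0.0, 5.0, 10.0):
    print(x0, effect_at(x0))
```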


Estimation Of A Non-Parametric Variable Importance Measure Of A Continuous Exposure, Antoine Chambaz, Pierre Neuvial, Mark J. van der Laan 2011 MAPS, Université Paris Descartes and CNRS

U.C. Berkeley Division of Biostatistics Working Paper Series

We define a new measure of variable importance of an exposure on a continuous outcome, accounting for potential confounders. The exposure features a reference level x0 with positive mass and a continuum of other levels. For the purpose of estimating it, we fully develop the semi-parametric estimation methodology called targeted minimum loss estimation (TMLE) [van der Laan & Rubin, 2006; van der Laan & Rose, 2011]. We cover the whole spectrum of its theoretical study (convergence of the iterative procedure which is at the core of the TMLE methodology; consistency and asymptotic normality of the estimator), practical implementation, simulation …


A Regularization Corrected Score Method For Nonlinear Regression Models With Covariate Error, David M. Zucker, Malka Gorfine, Yi Li, Donna Spiegelman 2011 Hebrew University

Harvard University Biostatistics Working Paper Series

No abstract provided.


Social Networks Enabled Coordination Model For Cost Management Of Patient Hospital Admissions, Shahadat Uddin, Liaquat Hossain 2011 The University of Sydney

Shahadat Uddin

In this study, we introduce a social-networks-enabled coordination model for exploring how the network positions of “patient,” “physician,” and “hospital” actors, within a patient-centered care network that evolves during the patient’s hospitalization period, affect the total cost of coordination. An actor is a node in a social network representing an entity such as an individual or an organization. In our analysis of actor networks and coordination in the healthcare literature, we identified a significant gap: a number of promising hospital coordination models have been developed (e.g., Guided Care Model, Chronic Care Model) for the current healthcare …
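
The abstract is truncated and does not say which network-position measures the model uses; as an assumption, here is a minimal networkx sketch of two standard measures (degree and betweenness centrality) over a toy patient–physician–hospital network.

```python
# Hypothetical sketch: standard network-position measures for "patient",
# "physician", and "hospital" actors in a small care network. The edges and
# the choice of measures are illustrative assumptions, not from the article.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("patient_1", "physician_A"),
    ("patient_1", "hospital_X"),
    ("physician_A", "hospital_X"),
    ("physician_B", "hospital_X"),
    ("patient_2", "physician_B"),
])

degree = nx.degree_centrality(G)            # normalized number of direct ties
betweenness = nx.betweenness_centrality(G)  # brokerage between other actors

for node in G.nodes:
    print(f"{node}: degree={degree[node]:.2f}, betweenness={betweenness[node]:.2f}")
```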


A Proof Of Bell's Inequality In Quantum Mechanics Using Causal Interactions, James M. Robins, Tyler J. VanderWeele, Richard D. Gill 2011 Harvard School of Public Health

COBRA Preprint Series

We give a simple proof of Bell's inequality in quantum mechanics which, in conjunction with experiments, demonstrates that the local hidden variables assumption is false. The proof sheds light on relationships between the notion of causal interaction and interference between particles.
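
The article's proof proceeds through causal interactions; as a complementary, purely illustrative check (an assumption on my part, using the CHSH form of Bell's inequality rather than the article's formulation), one can enumerate all deterministic local hidden variable strategies and confirm the bound |S| ≤ 2, which any mixture of such strategies inherits by convexity.

```python
# Illustration (CHSH form, not the article's proof): under a local hidden
# variable model each particle's outcome is a deterministic function of its
# own setting and the shared hidden variable, so it suffices to enumerate the
# 16 deterministic strategies and check |S| <= 2 for each; mixtures inherit
# the bound by convexity.
from itertools import product

max_S = 0.0
for a1, a2, b1, b2 in product([-1, 1], repeat=4):
    # a1, a2: Alice's outcomes under her two settings; b1, b2: Bob's.
    S = a1 * b1 + a1 * b2 + a2 * b1 - a2 * b2
    max_S = max(max_S, abs(S))

print(max_S)  # prints 2 -- the CHSH bound for local hidden variable models
```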


Effectively Selecting A Target Population For A Future Comparative Study, Lihui Zhao, Lu Tian, Tianxi Cai, Brian Claggett, L. J. Wei 2011 Northwestern University

Harvard University Biostatistics Working Paper Series

When comparing a new treatment with a control in a randomized clinical study, the treatment effect is generally assessed by evaluating a summary measure over a specific study population. The success of the trial heavily depends on the choice of such a population. In this paper, we show a systematic, effective way to identify a promising population, for which the new treatment is expected to have a desired benefit, using the data from a current study involving similar comparator treatments. Specifically, with the existing data we first create a parametric scoring system using multiple covariates to estimate subject-specific treatment differences. …
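
A minimal sketch of a parametric scoring system of this kind, under assumed data and model choices (and omitting the cross-validation issues the paper addresses): fit outcome models separately in the two arms of the existing trial, score each subject by the difference of the predictions, and take the candidate target population to be those whose predicted benefit clears a threshold.

```python
# Sketch of a parametric scoring system for subject-specific treatment
# differences; the data, model form, and threshold are illustrative
# assumptions, and the paper's cross-validation steps are omitted.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))                      # baseline covariates
trt = rng.integers(0, 2, size=n)                 # randomized treatment indicator
y = X @ np.array([1.0, -0.5, 0.2]) + trt * (0.3 + 0.8 * X[:, 0]) + rng.normal(size=n)

model_trt = LinearRegression().fit(X[trt == 1], y[trt == 1])
model_ctl = LinearRegression().fit(X[trt == 0], y[trt == 0])

score = model_trt.predict(X) - model_ctl.predict(X)   # estimated subject-specific benefit
target = score > 0.5                                  # candidate target population
print(f"selected {target.mean():.1%} of subjects")
```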


Multiple Testing Of Local Maxima For Detection Of Peaks In Chip-Seq Data, Armin Schwartzman, Andrew Jaffe, Yulia Gavrilov, Clifford A. Meyer 2011 Harvard School of Public Health and Dana Farber Cancer Institute

Harvard University Biostatistics Working Paper Series

No abstract provided.


Manova: Type I Error Rate Analysis, Christopher Dau Wei Ling 2011 California Polytechnic State University, San Luis Obispo

Statistics

No abstract provided.


A Study Of Missing Data Imputation And Predictive Modeling Of Strength Properties Of Wood Composites, Yan Zeng 2011 University of Tennessee, Knoxville

Masters Theses

Problem: Real-time process and destructive test data were collected from a wood composite manufacturer in the U.S. to develop real-time predictive models of two key strength properties (Modulus of Rupture (MOR) and Internal Bond (IB)) of a wood composite manufacturing process. Sensor malfunction and data “send/retrieval” problems led to null fields in the company’s data warehouse, which resulted in information loss. Many manufacturers attempt to build accurate predictive models by excluding entire records with null fields or by using summary statistics such as the mean or median in place of the null field. However, predictive model errors in validation may be higher …
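
The excerpt does not list the imputation methods the thesis studies; as an illustrative assumption, the following compares simple mean imputation with a multivariate iterative imputer from scikit-learn on a toy sensor-style matrix with missing cells.

```python
# Illustrative comparison of mean imputation vs. multivariate (iterative)
# imputation on a toy matrix with missing fields; the thesis's own methods
# are not specified in this excerpt.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import SimpleImputer, IterativeImputer

rng = np.random.default_rng(2)
X_true = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated columns
mask = rng.random(X_true.shape) < 0.15                        # ~15% missing at random
X_obs = np.where(mask, np.nan, X_true)

for name, imputer in [("mean", SimpleImputer(strategy="mean")),
                      ("iterative", IterativeImputer(random_state=0))]:
    X_imp = imputer.fit_transform(X_obs)
    rmse = np.sqrt(np.mean((X_imp[mask] - X_true[mask]) ** 2))
    print(f"{name}: RMSE on imputed cells = {rmse:.3f}")
```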


Asymptotic Theory For Cross-Validated Targeted Maximum Likelihood Estimation, Wenjing Zheng, Mark J. van der Laan 2011 University of California, Berkeley, Division of Biostatistics

Wenjing Zheng

We consider a targeted maximum likelihood estimator of a path-wise differentiable parameter of the data generating distribution in a semi-parametric model based on observing n independent and identically distributed observations. The targeted maximum likelihood estimator (TMLE) uses V-fold sample splitting for the initial estimator in order to make the TMLE maximally robust in its bias reduction step. We prove a general theorem that states asymptotic efficiency (and thereby regularity) of the targeted maximum likelihood estimator when the initial estimator is consistent and a second order term converges to zero in probability at a rate faster than the square root of …


On The Covariate-Adjusted Estimation For An Overall Treatment Difference With Data From A Randomized Comparative Clinical Trial, Lu Tian, Tianxi Cai, Lihui Zhao, L. J. Wei 2011 Stanford University School of Medicine

Harvard University Biostatistics Working Paper Series

No abstract provided.


Assessing Medicare Beneficiaries’ Strength‐Of‐Preference Scores For Health Care Options: How Engaging Does The Elicitation Technique Need To Be?, Trafford Crump, Hilary A. Llewellyn-Thomas 2011 Dartmouth College

Dartmouth Scholarship

The objective was to determine if participants’ strength‐of‐preference scores for elective health care interventions at the end‐of‐life (EOL) elicited using a non‐engaging technique are affected by their prior use of an engaging elicitation technique.


Variable Importance Analysis With The Multipim R Package, Stephan J. Ritter, Nicholas P. Jewell, Alan E. Hubbard 2011 Division of Biostatistics, University of California, Berkeley

U.C. Berkeley Division of Biostatistics Working Paper Series

We describe the R package multiPIM, including statistical background, functionality and user options. The package is for variable importance analysis, and is meant primarily for analyzing data from exploratory epidemiological studies, though it could certainly be applied in other areas as well. The approach taken to variable importance comes from the causal inference field, and is different from approaches taken in other R packages. By default, multiPIM uses a double robust targeted maximum likelihood estimator (TMLE) of a parameter akin to the attributable risk. Several regression methods/machine learning algorithms are available for estimating the nuisance parameters of the models, including …


A Unified Approach To Non-Negative Matrix Factorization And Probabilistic Latent Semantic Indexing, Karthik Devarajan, Guoli Wang, Nader Ebrahimi 2011 Fox Chase Cancer Center

COBRA Preprint Series

Non-negative matrix factorization (NMF) by the multiplicative updates algorithm is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into two matrices, W and H, each with nonnegative entries, V ~ WH. NMF has been shown to have a unique parts-based, sparse representation of the data. The nonnegativity constraints in NMF allow only additive combinations of the data which enables it to learn parts that have distinct physical representations in reality. In the last few years, NMF has been successfully applied in a variety of areas such as natural language processing, information retrieval, image processing, speech recognition …
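
A minimal numpy sketch of the multiplicative updates algorithm mentioned above, for the Frobenius-norm objective ||V − WH||²; the matrix sizes and rank are illustrative choices, not the article's.

```python
# Minimal numpy sketch of NMF via Lee-Seung multiplicative updates for the
# Frobenius objective ||V - WH||^2; rank and data are illustrative.
import numpy as np

rng = np.random.default_rng(3)
V = rng.random((100, 40))        # nonnegative data matrix
r = 5                            # factorization rank
W = rng.random((100, r))
H = rng.random((r, 40))
eps = 1e-10                      # guards against division by zero

for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H with W fixed
    W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed

print("reconstruction error:", np.linalg.norm(V - W @ H))
```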


Multiple Testing Of Local Maxima For Detection Of Unimodal Peaks In 1d, Armin Schwartzman, Yulia Gavrilov, Robert J. Adler 2011 Harvard School of Public Health and Dana Farber Cancer Institute

Harvard University Biostatistics Working Paper Series

No abstract provided.


Component Extraction Of Complex Biomedical Signal And Performance Analysis Based On Different Algorithm, Hemant Pasusangai Kasturiwale 2011 University of Mumbai, India

Johns Hopkins University, Dept. of Biostatistics Working Papers

Biomedical signals can arise from one or many sources, including the heart, brain, and endocrine systems. Multiple sources pose a challenge to researchers, since the recorded signals may be contaminated with artifacts and noise. Biomedical time series signals include the electroencephalogram (EEG), the electrocardiogram (ECG), and others. The morphology of the cardiac signal is very important in most ECG-based diagnostics. Diagnosis based on visual observation of recorded ECG, EEG, etc., may not be accurate. To achieve better understanding, PCA (Principal Component Analysis) and ICA (Independent Component Analysis) algorithms help in analyzing ECG signals. The immense scope in the field of biomedical-signal processing of Independent Component Analysis ( …
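
As a rough illustration of the PCA/ICA decomposition described above (synthetic sources stand in for ECG/EEG channels; the mixing model and component counts are assumptions):

```python
# Sketch: separating mixed multichannel signals with PCA and ICA
# (scikit-learn); synthetic sources stand in for ECG/EEG channels.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(4)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * np.pi * 1.2 * t),           # rhythmic "cardiac-like" source
                np.sign(np.sin(2 * np.pi * 0.3 * t)),   # slow square-wave artifact
                rng.normal(size=t.size) * 0.3]          # noise source
A = rng.random((3, 3)) + 0.5                            # unknown mixing matrix
X = sources @ A.T                                       # observed channel mixtures

pca_components = PCA(n_components=3).fit_transform(X)   # decorrelated components
ica_components = FastICA(n_components=3, random_state=0).fit_transform(X)

print(pca_components.shape, ica_components.shape)       # (2000, 3) each
```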


An Exploration Of Non-Detects In Environmental Data, Juliana Fajardo 2011 California Polytechnic State University, San Luis Obispo

Statistics

No abstract provided.


Management And Support Of Shared Integrated Library Systems, Jason Vaughan, Kristen Costello 2011 University of Nevada, Las Vegas

Library Faculty Publications

The University of Nevada, Las Vegas (UNLV) University Libraries has hosted and managed a shared integrated library system (ILS) since 1989. The system and the number of partner libraries sharing the system has grown significantly over the past two decades. Spurred by the level of involvement and support contributed by the host institution, the authors administered a comprehensive survey to current Innovative Interfaces libraries. Research findings are combined with a description of UNLV’s local practices to provide substantial insights into shared funding, support, and management activities associated with shared systems.


Propensity Score Analysis With Matching Weights, Liang Li 2011 Cleveland Clinic

COBRA Preprint Series

Propensity score analysis is one of the most widely used methods for studying causal treatment effects in observational studies. This paper studies treatment effect estimation with the method of matching weights. This method resembles propensity score matching but offers a number of new features, including efficient estimation, rigorous variance calculation, simple asymptotics, statistical tests of balance, a clearly identified target population with an optimal sampling property, and no need to choose a matching algorithm or caliper size. In addition, we propose the mirror histogram as a useful tool for graphically displaying balance. The method also shares some features of the inverse …
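
A sketch of the weighting idea as matching weights are commonly described (propensity score estimated by logistic regression, each subject weighted by min(e, 1 − e) divided by the probability of the treatment actually received); the paper's variance calculation, balance tests, and mirror histogram are not reproduced here, and the simulated data are illustrative.

```python
# Sketch of the matching-weights idea: estimate the propensity score e(X)
# with logistic regression, weight each subject by min(e, 1 - e) divided by
# the probability of the treatment actually received, and compare weighted
# outcome means. Data are simulated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 2000
X = rng.normal(size=(n, 3))                                   # confounders
e_true = 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.4 * X[:, 1])))   # true propensity
Z = rng.binomial(1, e_true)                                   # treatment received
y = X @ np.array([1.0, 0.5, -0.3]) + 0.7 * Z + rng.normal(size=n)

e_hat = LogisticRegression().fit(X, Z).predict_proba(X)[:, 1]
mw = np.minimum(e_hat, 1 - e_hat) / np.where(Z == 1, e_hat, 1 - e_hat)

effect_mw = (np.sum(mw * Z * y) / np.sum(mw * Z)
             - np.sum(mw * (1 - Z) * y) / np.sum(mw * (1 - Z)))
print(f"matching-weight estimate of the treatment effect: {effect_mw:.3f}")
```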


Power Analysis For Alternative Tests For The Equality Of Means., Haiyin Li 2011 East Tennessee State University

Electronic Theses and Dissertations

The two-sample t-test is the test usually taught in introductory statistics courses to test for the equality of means of two populations. However, the t-test is not the only test available to compare the means of two populations. The randomization test is being incorporated into some introductory courses. There is also the bootstrap test. It is also not uncommon to decide the equality of the means based on confidence intervals for the means of these two populations. Are all those methods equally powerful? Can the idea of non-overlapping t confidence intervals be extended to bootstrap confidence intervals? The powers …
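
A Monte Carlo sketch comparing the empirical power of the two-sample t-test and a randomization (permutation) test under one illustrative alternative; the effect size, sample sizes, and permutation count are assumptions, not the thesis's settings.

```python
# Monte Carlo sketch: empirical power of the two-sample t-test vs. a
# randomization (permutation) test of the mean difference. Effect size,
# sample sizes, and permutation count are illustrative choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, delta, n_sims, n_perms = 20, 0.8, 500, 499

def perm_pvalue(x, y):
    """Two-sided permutation p-value for the difference in means."""
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perms):
        perm = rng.permutation(pooled)
        if abs(perm[:n].mean() - perm[n:].mean()) >= abs(observed):
            count += 1
    return (count + 1) / (n_perms + 1)

reject_t = reject_perm = 0
for _ in range(n_sims):
    x = rng.normal(delta, 1, n)   # sample from the shifted population
    y = rng.normal(0, 1, n)       # sample from the reference population
    if stats.ttest_ind(x, y).pvalue < 0.05:
        reject_t += 1
    if perm_pvalue(x, y) < 0.05:
        reject_perm += 1

print(f"t-test power ~ {reject_t / n_sims:.2f}, permutation power ~ {reject_perm / n_sims:.2f}")
```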


Digital Commons powered by bepress