Open Access. Powered by Scholars. Published by Universities.®


Articles 1 - 30 of 1119

Full-Text Articles in Physical Sciences and Mathematics

Marginal Proportional Hazards Models For Clustered Interval-Censored Data With Time-Dependent Covariates, Kaitlyn Cook, Wenbin Lu, Rui Wang Feb 2022


Harvard University Biostatistics Working Paper Series

The Botswana Combination Prevention Project was a cluster-randomized HIV prevention trial whose follow-up period coincided with Botswana’s national adoption of a universal test-and-treat strategy for HIV management. Of interest is whether, and to what extent, this change in policy (i) modified the observed preventative effects of the study intervention and (ii) was associated with a reduction in the population-level incidence of HIV in Botswana. To address these questions, we propose a stratified proportional hazards model for clustered interval-censored data with time-dependent covariates and develop a composite expectation maximization algorithm that facilitates estimation of model parameters without placing parametric assumptions on …


On Assessing Survival Benefit Of Immunotherapy Using Long-Term Restricted Mean Survival Time, Miki Horiguchi, Lu Tian, Hajime Uno Jan 2022


Harvard University Biostatistics Working Paper Series

The pattern of the difference between two survival curves that we often observe in randomized clinical trials evaluating immunotherapy is not proportional hazards; the treatment effect typically appears several months after the initiation of treatment (i.e., a delayed-difference pattern). The commonly used logrank test and hazard ratio estimation approach are suboptimal for testing and estimation in such trials. The long-term restricted mean survival time (LT-RMST) approach is a promising alternative for detecting a treatment effect that potentially appears later in the study. A challenge in employing the LT-RMST approach is that one must specify a lower end of …
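As a concrete sketch: the restricted mean survival time over a window is the area under the survival curve between the two endpoints of that window, so a long-term treatment contrast is the difference of two such areas. The helper below integrates a step (Kaplan-Meier-style) survival curve; the function name and the toy survival values are illustrative, not taken from the paper.

```python
def rmst_window(times, surv, t_start, t_end):
    """Area under a right-continuous step survival curve over
    [t_start, t_end].  times: increasing event times; surv[i] is
    S(t) just after times[i]; S(t) = 1 before the first event."""
    area, prev_t, prev_s = 0.0, t_start, 1.0
    for t, s in zip(times, surv):
        if t <= t_start:
            prev_s = s          # curve level already below 1 at t_start
            continue
        if t >= t_end:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    return area + prev_s * (t_end - prev_t)

# LT-RMST contrast: rmst_window(times_trt, surv_trt, eta, tau)
#                 - rmst_window(times_ctl, surv_ctl, eta, tau)
```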


Nonlinear Mixed-Effects Models For Hiv Viral Load Trajectories Before And After Antiretroviral Therapy Interruption, Incorporating Left Censoring, Sihaoyu Gao, Lang Wu, Tingting Yu, Roger Kouyos, Huldrych F. Gunthard, Rui Wang Jan 2022


Harvard University Biostatistics Working Paper Series

Characterizing features of the viral rebound trajectories and identifying host, virological, and immunological factors that are predictive of the viral rebound trajectories are central to HIV cure research. In this paper, we investigate if key features of HIV viral decay and CD4 trajectories during antiretroviral therapy (ART) are associated with characteristics of HIV viral rebound following ART interruption. Nonlinear mixed effect (NLME) models are used to model viral load trajectories before and following ART interruption, incorporating left censoring due to lower detection limits of viral load assays. A stochastic approximation EM (SAEM) algorithm is used for parameter estimation and inference. …


A Simple And Robust Alternative To Bland-Altman Method Of Assessing Clinical Agreement, Abhaya Indrayan Prof Jan 2022


COBRA Preprint Series

Clinical agreement between two quantitative measurements on a group of subjects is generally assessed with the help of the Bland-Altman (B-A) limits. These limits only describe the dispersion of disagreements in 95% of cases and do not measure the degree of agreement. The interpretation regarding the presence or absence of agreement by this method is based on whether the B-A limits lie within pre-specified, externally determined clinical tolerance limits; thus, clinical tolerance limits are necessary for this method. We argue in this communication that the direct use of clinical tolerance limits for assessing agreement without the B-A limits is more effective …
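For reference, the conventional B-A limits that this proposal would bypass are the mean paired difference plus or minus 1.96 standard deviations of the differences. A minimal sketch (hypothetical data; the z = 1.96 multiplier assumes approximate normality of the differences):

```python
import statistics

def bland_altman_limits(method_a, method_b, z=1.96):
    """Mean difference +/- z * SD of the paired differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias - z * sd, bias + z * sd

# hypothetical paired readings from two measurement methods
lo, hi = bland_altman_limits([10.1, 9.8, 10.5, 10.0, 9.7, 10.3],
                             [10.0, 9.9, 10.2, 10.1, 9.5, 10.4])
# agreement is then judged by whether (lo, hi) lies inside the
# externally specified clinical tolerance limits
```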


Ratio And Difference Of Average Hazard With Survival Weight: New Measures To Quantify Survival Benefit Of New Therapy, Hajime Uno, Miki Horiguchi Sep 2021


Harvard University Biostatistics Working Paper Series

The hazard ratio (HR) has been the most popular measure to quantify the magnitude of treatment effect on time-to-event outcomes in clinical research. However, the HR estimated by Cox's method has several drawbacks. One major issue is that there is no clear interpretation when the proportional hazards (PH) assumption does not hold, because it is affected by study-specific censoring time distribution in non-PH cases. Another major issue is that the lack of a group-specific absolute hazard value in each group obscures the clinical significance of the magnitude of the treatment effect. Given these, we propose average hazard with survival weight …
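One common formulation of an average hazard with survival weight — hedged here, since the abstract does not state the formula — is the cumulative incidence at a horizon tau divided by the restricted mean survival time up to tau, i.e., F(tau) / integral of S(t) on [0, tau]. A sketch on a step survival curve (toy values, illustrative only):

```python
def average_hazard(times, surv, tau):
    """F(tau) / RMST(tau) for a step survival curve: cumulative
    incidence at tau divided by the area under S(t) on [0, tau].
    times: increasing event times; surv[i] = S(t) just after times[i]."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in zip(times, surv):
        if t >= tau:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    area += prev_s * (tau - prev_t)
    return (1.0 - prev_s) / area

# The ratio of two average hazards (treatment vs. control) is then
# a censoring-free alternative to the Cox hazard ratio.
```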


Causal Mediation Analysis With Multiple Time-Varying Mediators, An-Shun Tai, Sheng-Hsuan Lin, Yu-Cheng Chu, Tsung Yu, Milo A. Puhan, Tyler Vanderweele Jul 2021


Harvard University Biostatistics Working Paper Series

In longitudinal studies with time-varying exposures and mediators, the mediational g-formula is an important method for the assessment of direct and indirect effects. However, current methodologies based on the mediational g-formula can deal with only one mediator. This limitation makes these methodologies inapplicable to many scenarios. Hence, we develop a novel methodology by extending the mediational g-formula to cover cases with multiple time-varying mediators. We formulate two variants of our approach, each suited to a distinct set of assumptions and effect definitions, and present nonparametric identification results for each variant. We further show how complex causal mechanisms (whose …


Identification And Robust Estimation Of Swapped Direct And Indirect Effects: Mediation Analysis With Unmeasured Mediator–Outcome Confounding And Intermediate Confounding, An-Shun Tai, Sheng-Hsuan Lin Jan 2021


Harvard University Biostatistics Working Paper Series

Counterfactual-model-based mediation analysis can yield substantial insight into the causal mechanism through the assessment of natural direct effects (NDEs) and natural indirect effects (NIEs). However, the assumptions regarding unmeasured mediator–outcome confounding and intermediate mediator–outcome confounding that are required for the determination of NDEs and NIEs present practical challenges. To address this problem, we introduce an instrumental blocker, a novel quasi-instrumental variable, to relax both of these assumptions, and we define a swapped direct effect (SDE) and a swapped indirect effect (SIE) to assess the mediation. We show that the SDE and SIE are identical to the NDE and NIE, respectively, …


Causal Mediation Analysis For Difference-In-Difference Design And Panel Data, Pei-Hsuan Hsia, An-Shun Tai, Chu-Lan Michael Kao, Yu-Hsuan Lin, Sheng-Hsuan Lin Jan 2021


Harvard University Biostatistics Working Paper Series

The advantages of panel data, i.e., data from a difference-in-differences (DID) design, are a large sample size and easy availability; panel data are therefore widely used in epidemiology and throughout the social sciences. The literature on causal inference in panel data or DID settings is growing, but no theory or mediation analysis method has been proposed for such settings. In this study, we propose a methodology for conducting causal mediation analysis in the DID design and panel data setting. We provide formal counterfactual definitions for the controlled direct effect and the natural direct and indirect effects in the panel data setting and DID …


Robust Inference On Effects Attributable To Mediators: A Controlled-Direct-Effect-Based Approach For Causal Effect Decomposition With Multiple Mediators, An-Shun Tai, Yi-Juan Du, Sheng-Hsuan Lin Aug 2020


Harvard University Biostatistics Working Paper Series

Effect decomposition is a critical technique for mechanism investigation in settings with multiple causally ordered mediators. Causal mediation analysis is a standard method for effect decomposition, but the assumptions required for the identification process are extremely strong. By extending the framework of controlled direct effects, this study proposes the effect attributable to mediators (EAM) as a novel measure for effect decomposition. For policy making, EAM represents how much an effect can be eliminated by setting mediators to certain values. From the perspective of mechanism investigation, EAM contains information about how much a particular mediator or set of mediators is involved …


Integrated Multiple Mediation Analysis: A Robustness–Specificity Trade-Off In Causal Structure, An-Shun Tai, Sheng-Hsuan Lin May 2020


Harvard University Biostatistics Working Paper Series

Recent methodological developments in causal mediation analysis have addressed several issues regarding multiple mediators. However, these developed methods differ in their definitions of causal parameters, assumptions for identification, and interpretations of causal effects, making it unclear which method ought to be selected when investigating a given causal effect. Thus, in this study, we construct an integrated framework, which unifies all existing methodologies, as a standard for mediation analysis with multiple mediators. To clarify the relationship between existing methods, we propose four strategies for effect decomposition: two-way, partially forward, partially backward, and complete decompositions. This study reveals how the direct and …


Survival Mediation Analysis With The Death-Truncated Mediator: The Completeness Of The Survival Mediation Parameter, An-Shun Tai, Chun-An Tsai, Sheng-Hsuan Lin Apr 2020


Harvard University Biostatistics Working Paper Series

In medical research, the development of mediation analysis with a survival outcome has facilitated investigation into causal mechanisms. However, studies have not discussed the death-truncation problem for mediators, the problem being that conventional mediation parameters cannot be well-defined in the presence of a truncated mediator. In the present study, we systematically defined the completeness of causal effects to uncover the gap, in conventional causal definitions, between the survival and nonsurvival settings. We proposed three approaches to redefining the natural direct and indirect effects, which are generalized forms of the conventional causal effects for survival outcomes. Furthermore, we developed three statistical …


Estimating Marginal Hazard Ratios By Simultaneously Using A Set Of Propensity Score Models: A Multiply Robust Approach, Di Shu, Peisong Han, Rui Wang, Sengwee Toh Jan 2020


Harvard University Biostatistics Working Paper Series

The inverse probability weighted Cox model is frequently used to estimate marginal hazard ratios. Its validity requires a crucial condition that the propensity score model is correctly specified. To provide protection against misspecification of the propensity score model, we propose a weighted estimation method rooted in empirical likelihood theory. The proposed estimator is multiply robust in that it is guaranteed to be consistent when a set of postulated propensity score models contains a correctly specified model. Our simulation studies demonstrate satisfactory finite sample performance of the proposed method in terms of consistency and efficiency. We apply the proposed method to …
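For context, the single-model estimator this line of work builds on weights each subject by the inverse of the probability of the treatment actually received, given a fitted propensity score. A minimal sketch (the stabilization by the marginal treatment probability is a standard variant, not specific to this paper):

```python
import numpy as np

def ipw_weights(treated, ps, stabilize=True):
    """Inverse probability of treatment weights from fitted
    propensity scores ps = P(Z = 1 | X)."""
    z = np.asarray(treated, dtype=float)
    ps = np.asarray(ps, dtype=float)
    w = z / ps + (1.0 - z) / (1.0 - ps)
    if stabilize:
        # multiply by the marginal treatment probability (standard variant)
        w *= np.where(z == 1.0, z.mean(), 1.0 - z.mean())
    return w
```

The multiply robust proposal replaces the single fitted `ps` with a set of candidate propensity score models combined via empirical likelihood, so consistency holds if any one model is correct.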


Estimation Of Conditional Power For Cluster-Randomized Trials With Interval-Censored Endpoints, Kaitlyn Cook, Rui Wang Jan 2020


Harvard University Biostatistics Working Paper Series

Cluster-randomized trials (CRTs) of infectious disease preventions often yield correlated, interval-censored data: dependencies may exist between observations from the same cluster, and event occurrence may be assessed only at intermittent clinic visits. This data structure must be accounted for when conducting interim monitoring and futility assessment for CRTs. In this article, we propose a flexible framework for conditional power estimation when outcomes are correlated and interval-censored. Under the assumption that the survival times follow a shared frailty model, we first characterize the correspondence between the marginal and cluster-conditional survival functions, and then use this relationship to semiparametrically estimate the cluster-specific …


Randomization-Based Confidence Intervals For Cluster Randomized Trials, Dustin J. Rabideau, Rui Wang Jan 2020


Harvard University Biostatistics Working Paper Series

In a cluster randomized trial (CRT), groups of people are randomly assigned to different interventions. Existing parametric and semiparametric methods for CRTs rely on distributional assumptions or a large number of clusters to maintain nominal confidence interval (CI) coverage. Randomization-based inference is an alternative approach that is distribution-free and does not require a large number of clusters to be valid. Although it is well-known that a CI can be obtained by inverting a randomization test, this requires randomization testing a non-zero null hypothesis, which is challenging with non-continuous and survival outcomes. In this paper, we propose a general method for …
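The test-inversion idea can be sketched for a simple continuous, individual-level outcome: shift the treated outcomes by a candidate effect delta, compute a permutation p-value for the zero-effect null on the shifted data, and collect the delta values that are not rejected. (Illustrative only; the paper's contribution concerns clustered, non-continuous, and survival outcomes.)

```python
import random

def perm_pvalue(y_treat, y_ctrl, delta, n_perm=2000, seed=0):
    """Permutation p-value for H0: additive effect = delta.
    Treated outcomes are shifted by -delta, then group labels
    are re-randomized."""
    rng = random.Random(seed)
    shifted = [y - delta for y in y_treat]
    pooled = shifted + list(y_ctrl)
    n_t, n_c = len(y_treat), len(y_ctrl)
    obs = abs(sum(shifted) / n_t - sum(y_ctrl) / n_c)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = abs(sum(pooled[:n_t]) / n_t - sum(pooled[n_t:]) / n_c)
        if stat >= obs:
            count += 1
    return count / n_perm

# A 95% randomization-based CI is the set of delta values with
# perm_pvalue(...) >= 0.05, found by scanning a grid of candidates.
```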


Shrinkage Priors For Isotonic Probability Vectors And Binary Data Modeling, Philip S. Boonstra, Daniel R. Owen, Jian Kang Jan 2020


The University of Michigan Department of Biostatistics Working Paper Series

This paper outlines a new class of shrinkage priors for Bayesian isotonic regression modeling a binary outcome against a predictor, where the probability of the outcome is assumed to be monotonically non-decreasing with the predictor. The predictor is categorized into a large number of groups, and the set of differences between outcome probabilities in consecutive categories is equipped with a multivariate prior having support over the set of simplexes. The Dirichlet distribution, which can be derived from a normalized cumulative sum of gamma-distributed random variables, is a natural choice of prior, but using mathematical and simulation-based arguments, we show that …
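The gamma-normalization construction mentioned above can be sketched directly: independent Gamma(alpha_k, 1) draws, normalized by their sum, give a Dirichlet draw on the simplex, and its cumulative sums form the monotone non-decreasing probability vector that an isotonic prior targets.

```python
import random

def dirichlet_sample(alphas, seed=7):
    """Draw from Dirichlet(alphas) by normalizing independent
    Gamma(alpha_k, 1) random variables."""
    rng = random.Random(seed)
    g = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(g)
    return [x / total for x in g]

# A draw lies on the simplex; its cumulative sums give a monotone
# non-decreasing probability vector over the predictor categories.
probs = dirichlet_sample([0.5] * 10)
cum = [sum(probs[:k + 1]) for k in range(len(probs))]
```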


A Modular Framework For Early-Phase Seamless Oncology Trials, Philip S. Boonstra, Thomas M. Braun, Elizabeth C. Chase Jan 2020


The University of Michigan Department of Biostatistics Working Paper Series

Background: As our understanding of the etiology and mechanisms of cancer becomes more sophisticated and the number of therapeutic options increases, phase I oncology trials today have multiple primary objectives. Many such designs are now 'seamless', meaning that the trial estimates both the maximum tolerated dose and the efficacy at this dose level. Sponsors often proceed with further study only with this additional efficacy evidence. However, with this increasing complexity in trial design, it becomes challenging to articulate fundamental operating characteristics of these trials, such as (i) what is the probability that the design will identify an acceptable, i.e. safe …


Power Calculation For Cross-Sectional Stepped-Wedge Cluster Randomized Trials With Binary Outcomes, Linda J. Harrison, Rui Wang Jan 2020


Harvard University Biostatistics Working Paper Series

Power calculation for stepped-wedge cluster randomized trials (SW-CRTs) presents unique challenges, beyond those of standard cluster randomized trials (CRTs), due to the need to consider temporal within cluster correlations and background period effects. To date, power calculation methods specific to SW-CRTs have primarily been developed under a linear model. When the outcome is binary, the use of a linear model corresponds to assessing a prevalence difference; yet trial analysis often employs a non-linear link function. We assess power for cross-sectional SW-CRTs under a logistic model fitted by generalized estimating equations. Firstly, under an exchangeable correlation structure, we show the power …


Generalized Matrix Decomposition Regression: Estimation And Inference For Two-Way Structured Data, Yue Wang, Ali Shojaie, Tim Randolph, Jing Ma Dec 2019


UW Biostatistics Working Paper Series

Analysis of two-way structured data, i.e., data with structures among both variables and samples, is becoming increasingly common in ecology, biology and neuroscience. Classical dimension-reduction tools, such as the singular value decomposition (SVD), may perform poorly for two-way structured data. The generalized matrix decomposition (GMD, Allen et al., 2014) extends the SVD to two-way structured data and thus constructs singular vectors that account for both structures. While the GMD is a useful dimension-reduction tool for exploratory analysis of two-way structured data, it is unsupervised and cannot be used to assess the association between such data and an outcome of interest. …


Statistical Inference For Networks Of High-Dimensional Point Processes, Xu Wang, Mladen Kolar, Ali Shojaie Dec 2019


UW Biostatistics Working Paper Series

Fueled in part by recent applications in neuroscience, high-dimensional Hawkes processes have become a popular tool for modeling the network of interactions among multivariate point process data. While evaluating the uncertainty of network estimates is critical in scientific applications, existing methodological and theoretical work has focused only on estimation. To bridge this gap, this paper proposes a high-dimensional statistical inference procedure with theoretical guarantees for multivariate Hawkes processes. Key to this inference procedure is a new concentration inequality on the first- and second-order statistics for integrated stochastic processes, which summarize the entire history of the process. We apply this …
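As background, a univariate Hawkes process with exponential kernel has intensity lambda(t) = mu + sum over past events t_i of alpha * exp(-beta * (t - t_i)), and can be simulated by Ogata's thinning algorithm because the intensity only decays between events. A minimal sketch (parameter values illustrative; alpha < beta keeps the process subcritical):

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=1):
    """Ogata thinning for a univariate Hawkes process with intensity
    mu + sum_i alpha * exp(-beta * (t - t_i))."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        # current intensity upper-bounds the intensity until the next event
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        t += rng.expovariate(lam_bar)
        if t >= horizon:
            return events
        lam_t = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        if rng.random() * lam_bar <= lam_t:
            events.append(t)
```

The multivariate case replaces the scalar kernel with a matrix of excitation kernels, whose support encodes the interaction network the paper performs inference on.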


Inferring A Consensus Problem List Using Penalized Multistage Models For Ordered Data, Philip S. Boonstra, John C. Krauss Oct 2019


The University of Michigan Department of Biostatistics Working Paper Series

A patient's medical problem list describes his or her current health status and aids in the coordination and transfer of care between providers, among other things. Because a problem list is generated once and then subsequently modified or updated, what is not usually observable is the provider-effect. That is, to what extent does a patient's problem in the electronic medical record actually reflect a consensus communication of that patient's current health status? To that end, we report on and analyze a unique interview-based design in which multiple medical providers independently generate problem lists for each of three patient case abstracts …


Generalized Interventional Approach For Causal Mediation Analysis With Causally Ordered Multiple Mediators, Sheng-Hsuan Lin Jun 2019


Harvard University Biostatistics Working Paper Series

Causal mediation analysis has demonstrated the advantage of mechanism investigation. In settings with causally ordered mediators, path-specific effects (PSEs) are introduced to specify the effect operating through a certain combination of mediators. However, most PSEs are unidentifiable. To address this, an alternative approach, termed the interventional analogue of PSE (iPSE), is widely applied for effect decomposition. Previous studies that have considered multiple mediators have mainly focused on two-mediator cases due to the complexity of the mediation formula. This study proposes a generalized interventional approach for settings with an arbitrary number of causally ordered mediators to study the causal parameter identification …


Unified Methods For Feature Selection In Large-Scale Genomic Studies With Censored Survival Outcomes, Lauren Spirko-Burns, Karthik Devarajan Mar 2019


COBRA Preprint Series

One of the major goals in large-scale genomic studies is to identify genes with a prognostic impact on time-to-event outcomes, which provides insight into the disease process. With rapid developments in high-throughput genomic technologies over the past two decades, the scientific community is able to monitor the expression levels of tens of thousands of genes and proteins, resulting in enormous data sets in which the number of genomic features is far greater than the number of subjects. Methods based on univariate Cox regression are often used to select genomic features related to survival outcome; however, the Cox model assumes proportional hazards …


A Simulation Study Of Diagnostics For Bias In Non-Probability Samples, Philip S. Boonstra, Roderick Ja Little, Brady T. West, Rebecca R. Andridge, Fernanda Alvarado-Leiton Mar 2019


The University of Michigan Department of Biostatistics Working Paper Series

A non-probability sampling mechanism is likely to bias estimates of parameters with respect to a target population of interest. This bias poses a unique challenge when selection is 'non-ignorable', i.e. dependent upon the unobserved outcome of interest, since it is then undetectable and thus cannot be ameliorated. We extend a simulation study by Nishimura et al. [International Statistical Review, 84, 43--62 (2016)], adding a recently published statistic, the so-called 'standardized measure of unadjusted bias', which explicitly quantifies the extent of bias under the assumption that a specified amount of non-ignorable selection exists. Our findings suggest that this new …


Variance Estimation In Inverse Probability Weighted Cox Models, Di Shu, Jessica G. Young, Sengwee Toh, Rui Wang Jan 2019


Harvard University Biostatistics Working Paper Series

Inverse probability weighted Cox models can be used to estimate marginal hazard ratios under different treatment interventions in observational studies. To obtain variance estimates, the robust sandwich variance estimator is often recommended to account for the induced correlation among weighted observations. However, this estimator does not incorporate the uncertainty in estimating the weights and tends to overestimate the variance, leading to inefficient inference. Here we propose a new variance estimator that combines the estimation procedures for the hazard ratio and the weights using stacked estimating equations, with additional adjustments for the sum of non-independent and identically distributed terms in a Cox …


General Approach Of Causal Mediation Analysis With Causally Ordered Multiple Mediators And Survival Outcome, An-Shun Tai, Pei-Hsuan Lin, Yen-Tsung Huang, Sheng-Hsuan Lin Jan 2019


Harvard University Biostatistics Working Paper Series

Causal mediation analysis with multiple mediators (causal multi-mediation analysis) is critical in understanding why an intervention works, especially in medical research. Deriving the path-specific effects (PSEs) of exposure on the outcome through a certain set of mediators can detail the causal mechanism of interest. However, the existing models of causal multi-mediation analysis are usually restricted to partial decomposition, which can only evaluate the cumulative effect of several paths. Moreover, the general form of PSEs for an arbitrary number of mediators has not been proposed. In this study, we provide a generalized definition of PSE for partial decomposition (partPSE) and for …


Concentrations Of Criteria Pollutants In The Contiguous U.S., 1979 – 2015: Role Of Model Parsimony In Integrated Empirical Geographic Regression, Sun-Young Kim, Matthew Bechle, Steve Hankey, Elizabeth (Lianne) A. Sheppard, Adam A. Szpiro, Julian D. Marshall Nov 2018


UW Biostatistics Working Paper Series

BACKGROUND: National- or regional-scale prediction models that estimate individual-level air pollution concentrations commonly include hundreds of geographic variables. However, this many variables may not be necessary, and a parsimonious approach including a small number of variables may achieve sufficient prediction ability. Such a parsimonious approach can also be applied to most criteria pollutants, and would be powerful when generating publicly available datasets of model predictions that support research in environmental health and other fields. OBJECTIVES: We aim to (1) build annual-average integrated empirical geographic (IEG) regression models for the contiguous U.S. for six criteria pollutants, for all years with regulatory monitoring data …


Analysis Of Covariance (Ancova) In Randomized Trials: More Precision, Less Conditional Bias, And Valid Confidence Intervals, Without Model Assumptions, Bingkai Wang, Elizabeth Ogburn, Michael Rosenblum Oct 2018


Johns Hopkins University, Dept. of Biostatistics Working Papers

"Covariate adjustment" in the randomized trial context refers to an estimator of the average treatment effect that adjusts for chance imbalances between study arms in baseline variables (called "covariates"). The baseline variables could include, e.g., age, sex, disease severity, and biomarkers. According to two surveys of clinical trial reports, there is confusion about the statistical properties of covariate adjustment. We focus on the ANCOVA estimator, which involves fitting a linear model for the outcome given the treatment arm and baseline variables, and trials with equal probability of assignment to treatment and control. We prove the following new (to the best …
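The ANCOVA estimator described above is simply the treatment coefficient from an ordinary least-squares fit of the outcome on an intercept, the treatment indicator, and the baseline variables. A minimal sketch with a single baseline covariate (function name hypothetical):

```python
import numpy as np

def ancova_ate(y, z, x):
    """ANCOVA estimate of the average treatment effect: the
    coefficient on the treatment indicator z in an OLS fit of the
    outcome y on an intercept, z, and a baseline covariate x."""
    y = np.asarray(y, dtype=float)
    z = np.asarray(z, dtype=float)
    x = np.asarray(x, dtype=float)
    design = np.column_stack([np.ones_like(z), z, x])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta[1]
```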


Cross-Sectional Hiv Incidence Estimation Accounting For Heterogeneity Across Communities, Yuejia Xu, Oliver B. Laeyendecker, Rui Wang Sep 2018


Harvard University Biostatistics Working Paper Series

No abstract provided.


Robust Inference For The Stepped Wedge Design, James P. Hughes, Patrick J. Heagerty, Fan Xia, Yuqi Ren Aug 2018


UW Biostatistics Working Paper Series

Based on a permutation argument, we derive a closed form expression for an estimate of the treatment effect, along with its standard error, in a stepped wedge design. We show that these estimates are robust to misspecification of both the mean and covariance structure of the underlying data-generating mechanism, thereby providing a robust approach to inference for the treatment effect in stepped wedge designs. We use simulations to evaluate the type I error and power of the proposed estimate and to compare the performance of the proposed estimate to the optimal estimate when the correct model specification is known. The …


Robust Estimation Of The Average Treatment Effect In Alzheimer's Disease Clinical Trials, Michael Rosenblum, Aidan Mcdermont, Elizabeth Colantuoni Mar 2018


Johns Hopkins University, Dept. of Biostatistics Working Papers

The primary analysis of Alzheimer's disease clinical trials often involves a mixed-model repeated measures (MMRM) approach. We consider another estimator of the average treatment effect, called targeted minimum loss based estimation (TMLE). This estimator is more robust to violations of assumptions about missing data than MMRM.

We compare TMLE versus MMRM by analyzing data from a completed Alzheimer's disease trial data set and by simulation studies. The simulations involved different missing data distributions, where loss to followup at a given visit could depend on baseline variables, treatment assignment, and the outcome measured at previous visits. The TMLE generally has improved …