Open Access. Powered by Scholars. Published by Universities.®

Medicine and Health Sciences Commons

Articles 1 - 30 of 138

Full-Text Articles in Medicine and Health Sciences

On The Conventional Definition Of Path-Specific Effects - Fully Mediated Interaction With Multiple Ordered Mediators, An-Shun Tai, Le-Hsuan Liao, Sheng-Hsuan Lin Jul 2021

Harvard University Biostatistics Working Paper Series

Path-specific effects (PSEs) are a critical measure for assessing mediation in the presence of multiple mediators. However, the conventional definition of PSEs has generated controversy because it often causes misinterpretation of the results of multiple mediator analysis. For in-depth analysis of this issue, we propose the concept of decomposing fully mediated interaction (FMI) from the average causal effect. We show that FMI misclassification is the main cause of PSE misinterpretation. Two strategies for specifying FMI are proposed: isolating FMI and reclassifying FMI. The choice of strategy depends on the objective. Isolating FMI is the superior strategy when the main objective …


Integrated Multiple Mediation Analysis: A Robustness–Specificity Trade-Off In Causal Structure, An-Shun Tai, Sheng-Hsuan Lin May 2020

Harvard University Biostatistics Working Paper Series

Recent methodological developments in causal mediation analysis have addressed several issues regarding multiple mediators. However, these developed methods differ in their definitions of causal parameters, assumptions for identification, and interpretations of causal effects, making it unclear which method ought to be selected when investigating a given causal effect. Thus, in this study, we construct an integrated framework, which unifies all existing methodologies, as a standard for mediation analysis with multiple mediators. To clarify the relationship between existing methods, we propose four strategies for effect decomposition: two-way, partially forward, partially backward, and complete decompositions. This study reveals how the direct and …


Technical Considerations In The Use Of The E-Value, Tyler J. Vanderweele, Peng Ding, Maya Mathur Feb 2018

Harvard University Biostatistics Working Paper Series

The E-value is defined as the minimum strength of association, on the risk ratio scale, that an unmeasured confounder would have to have with both the exposure and the outcome, conditional on the measured covariates, to explain away the observed exposure-outcome association. We have elsewhere proposed that the reporting of E-values for estimates and for the limit of the confidence interval closest to the null become routine whenever causal effects are of interest. A number of questions have arisen about the use of the E-value, including questions concerning the interpretation of the relevant confounding association parameters, the nature of the transformation …
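For reference, the E-value for an observed risk ratio RR ≥ 1 has the closed form below (a risk ratio below one is first inverted); this is given as a minimal sketch with a worked example.

\[
  \text{E-value} \;=\; \mathrm{RR} \;+\; \sqrt{\mathrm{RR}\,(\mathrm{RR}-1)},
\]

so, for example, an observed risk ratio of 2 gives an E-value of $2 + \sqrt{2 \times 1} \approx 3.41$: an unmeasured confounder would need risk-ratio associations of about 3.4 with both exposure and outcome to fully explain away the estimate.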


Evaluation Of Progress Towards The UNAIDS 90-90-90 HIV Care Cascade: A Description Of Statistical Methods Used In An Interim Analysis Of The Intervention Communities In The SEARCH Study, Laura Balzer, Joshua Schwab, Mark J. Van Der Laan, Maya L. Petersen Feb 2017

U.C. Berkeley Division of Biostatistics Working Paper Series

WHO guidelines call for universal antiretroviral treatment, and UNAIDS has set a global target to virally suppress most HIV-positive individuals. Accurate estimates of population-level coverage at each step of the HIV care cascade (testing, treatment, and viral suppression) are needed to assess the effectiveness of "test and treat" strategies implemented to achieve this goal. The data available to inform such estimates, however, are susceptible to informative missingness: the number of HIV-positive individuals in a population is unknown; individuals tested for HIV may not be representative of those whom a testing intervention fails to reach, and HIV-positive individuals with a viral …


Estimation Of Long-Term Area-Average PM2.5 Concentrations For Area-Level Health Analyses, Sun-Young Kim, Casey Olives, Neal Fann, Joel Kaufman, Sverre Vedal, Lianne Sheppard Jul 2016

UW Biostatistics Working Paper Series

Introduction: There is increasing evidence of an association between individual long-term PM2.5 exposure and human health. Mortality and morbidity data collected at the area level are valuable resources for investigating corresponding population-level health effects. However, PM2.5 monitoring data are available only for limited time periods and locations, and are not adequate for estimating area-level concentrations. We developed a general approach to estimate county-average concentrations representative of population exposures for 1980-2010 in the continental U.S.

Methods: We predicted annual average PM2.5 concentrations at about 70,000 census tract centroids, using a point prediction model previously developed for estimating annual average …
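The Methods paragraph is truncated here. As a purely illustrative sketch of the aggregation step implied above, tract-centroid predictions can be rolled up to population-weighted county averages; the column names below are hypothetical, not the paper's.

import pandas as pd

# Hypothetical tract-level input: predicted annual-average PM2.5 at each
# census tract centroid, plus the tract population used for weighting.
tracts = pd.DataFrame({
    "county_fips": ["06001", "06001", "06075"],
    "pm25_pred":   [11.2, 9.8, 10.4],   # ug/m^3, model prediction at centroid
    "population":  [4500, 3200, 6100],
})

# Population-weighted county average: sum(pop * prediction) / sum(pop).
county_avg = (
    tracts.assign(wx=tracts["pm25_pred"] * tracts["population"])
          .groupby("county_fips")
          .apply(lambda g: g["wx"].sum() / g["population"].sum())
          .rename("pm25_county_avg")
)
print(county_avg)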


Models For HSV Shedding Must Account For Two Levels Of Overdispersion, Amalia Magaret Jan 2016

UW Biostatistics Working Paper Series

We have frequently implemented crossover studies to evaluate new therapeutic interventions for genital herpes simplex virus infection. The outcome measured to assess the efficacy of interventions on herpes disease severity is the viral shedding rate, defined as the frequency of detection of HSV on the genital skin and mucosa. We performed a simulation study to ascertain whether our standard model, which we have used previously, appropriately accounted for all the necessary features of the shedding data and provided correct inference. We simulated shedding data under our standard, validated assumptions and assessed the ability of 5 different models to reproduce the …


Nested Partially-Latent Class Models For Dependent Binary Data, Estimating Disease Etiology, Zhenke Wu, Maria Deloria-Knoll, Scott L. Zeger Nov 2015

Johns Hopkins University, Dept. of Biostatistics Working Papers

The Pneumonia Etiology Research for Child Health (PERCH) study seeks to use modern measurement technology to infer the causes of pneumonia for which gold-standard evidence is unavailable. The paper describes a latent variable model designed to infer from case-control data the etiology distribution for the population of cases, and for an individual case given his or her measurements. We assume each observation is drawn from a mixture model for which each component represents one cause or disease class. The model addresses a major limitation of the traditional latent class approach by taking account of residual dependence among multivariate binary outcome …


The Statistics Of Sensitivity Analyses, Alexander R. Luedtke, Ivan Diaz, Mark J. Van Der Laan Oct 2015

U.C. Berkeley Division of Biostatistics Working Paper Series

Suppose one wishes to estimate a causal parameter given a sample of observations. This requires making unidentifiable assumptions about an underlying causal mechanism. Sensitivity analyses help investigators understand what impact violations of these assumptions could have on the causal conclusions drawn from a study, though they themselves rely on untestable (but hopefully more interpretable) assumptions. Díaz and van der Laan (2013) advocate the use of a sequence (or continuum) of interpretable untestable assumptions of increasing plausibility for the sensitivity analysis, so that experts can have informed opinions about which are true. In this work, we argue that using appropriate statistical procedures …


A General Framework For Diagnosing Confounding Of Time-Varying And Other Joint Exposures, John W. Jackson May 2015

Harvard University Biostatistics Working Paper Series

No abstract provided.


A Unification Of Mediation And Interaction: A Four-Way Decomposition, Tyler J. Vanderweele Mar 2014

Harvard University Biostatistics Working Paper Series

It is shown that the overall effect of an exposure on an outcome, in the presence of a mediator with which the exposure may interact, can be decomposed into four components: (i) the effect of the exposure in the absence of the mediator, (ii) the interactive effect when the mediator is left to what it would be in the absence of exposure, (iii) a mediated interaction, and (iv) a pure mediated effect. These four components, respectively, correspond to the portion of the effect that is due to neither mediation nor interaction, to just interaction (but not mediation), to both mediation …
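In the standard counterfactual notation ($Y_{am}$ for the outcome under exposure $a$ and mediator $m$, $M_a$ for the mediator under exposure $a$, binary mediator, reference levels $a^{*}=0$ and $m^{*}=0$), the four components described above can be sketched as:

\begin{align*}
E[Y_{1M_1} - Y_{0M_0}]
  &= E[Y_{10} - Y_{00}] && \text{(i) neither mediation nor interaction}\\
  &\; + E[(Y_{11} - Y_{10} - Y_{01} + Y_{00})\,M_0] && \text{(ii) reference interaction}\\
  &\; + E[(Y_{11} - Y_{10} - Y_{01} + Y_{00})\,(M_1 - M_0)] && \text{(iii) mediated interaction}\\
  &\; + E[(Y_{01} - Y_{00})\,(M_1 - M_0)] && \text{(iv) pure indirect effect}
\end{align*}

Summing the four right-hand terms recovers the total effect $E[Y_{1M_1} - Y_{0M_0}]$ exactly, which is the unification of mediation and interaction the abstract refers to.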


Computational Model For Survey And Trend Analysis Of Patients With Endometriosis: A Decision Aid Tool For EBM, Salvo Reina, Vito Reina, Franco Ameglio, Mauro Costa, Alessandro Fasciani Feb 2014

COBRA Preprint Series

Endometriosis is attracting increasing worldwide attention due to its medical complexity and social impact. The European community has identified it as a “social disease”. A large amount of information comes from scientists, yet several aspects of this pathology and its staging criteria need to be clearly defined on a suitable number of individuals. In fact, available studies on endometriosis are not easily comparable due to a lack of standardized criteria for collecting patients’ information and imprecise definitions of symptoms. Currently, only retrospective surgical staging is used to measure the intensity of the pathology, while Evidence-Based Medicine (EBM) requires shareable methods and correct …


Adaptive Pair-Matching In The SEARCH Trial And Estimation Of The Intervention Effect, Laura Balzer, Maya L. Petersen, Mark J. Van Der Laan Jan 2014

U.C. Berkeley Division of Biostatistics Working Paper Series

In randomized trials, pair-matching is an intuitive design strategy to protect study validity and to potentially increase study power. In a common design, candidate units are identified, and their baseline characteristics are used to create the best n/2 matched pairs. Within the resulting pairs, the intervention is randomized, and the outcomes are measured at the end of follow-up. We consider this design to be adaptive, because the construction of the matched pairs depends on the baseline covariates of all candidate units. As a consequence, the observed data cannot be considered as n/2 independent, identically distributed (i.i.d.) pairs of units, as current practice assumes. …


Estimating Population Treatment Effects From A Survey Sub-Sample, Kara E. Rudolph, Ivan Diaz, Michael Rosenblum, Elizabeth A. Stuart Jan 2014

Johns Hopkins University, Dept. of Biostatistics Working Papers

We consider the problem of estimating an average treatment effect for a target population from a survey sub-sample. Our motivating example is generalizing a treatment effect estimated in a sub-sample of the National Comorbidity Survey Replication Adolescent Supplement to the population of U.S. adolescents. To address this problem, we evaluate easy-to-implement methods that account for both non-random treatment assignment and a non-random two-stage selection mechanism. We compare the performance of a Horvitz-Thompson estimator using inverse probability weighting (IPW) and two double robust estimators in a variety of scenarios. We demonstrate that the two double robust estimators generally outperform IPW in …
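A minimal sketch of the inverse-probability-weighted estimator referred to above, combining an estimated treatment propensity with the survey selection probability; all names and the normalisation choice are illustrative assumptions, not the paper's implementation.

import numpy as np

def ipw_population_ate(y, a, p_treat, p_select):
    """IPW estimate of a population average treatment effect from a sub-sample.

    y        : outcomes in the analysed sub-sample
    a        : binary treatment indicator
    p_treat  : estimated P(A = 1 | covariates) for each unit
    p_select : estimated probability of selection into the sub-sample
               (e.g. the product of two-stage survey selection probabilities)
    """
    y, a, p_treat, p_select = map(np.asarray, (y, a, p_treat, p_select))
    w1 = a / (p_treat * p_select)               # weights for treated units
    w0 = (1 - a) / ((1 - p_treat) * p_select)   # weights for control units
    # Normalising the weights (Hajek-style) tempers the influence of extreme weights.
    return np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)

# Toy usage with made-up inputs.
rng = np.random.default_rng(0)
n = 500
a = rng.integers(0, 2, n)
y = a + rng.normal(size=n)
print(ipw_population_ate(y, a, np.full(n, 0.5), np.full(n, 0.3)))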


Attributing Effects To Interactions, Tyler J. Vanderweele, Eric J. Tchetgen Tchetgen Jul 2013

Harvard University Biostatistics Working Paper Series

A framework is presented which allows an investigator to estimate the portion of the effect of one exposure that is attributable to an interaction with a second exposure. We show that when the two exposures are independent, the total effect of one exposure can be decomposed into a conditional effect of that exposure and a component due to interaction. The decomposition applies on difference or ratio scales. We discuss how the components can be estimated using standard regression models, and how these components can be used to evaluate the proportion of the total effect of the primary exposure attributable to …
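On the additive scale, with two independent binary exposures $G$ and $E$ and counterfactual risks $p_{ge} = E[Y_{ge}]$, the decomposition described above can be written roughly as:

\[
  E[Y_{1E}] - E[Y_{0E}]
  \;=\;
  \underbrace{(p_{10} - p_{00})}_{\text{conditional effect of } G \text{ when } E = 0}
  \;+\;
  \underbrace{(p_{11} - p_{10} - p_{01} + p_{00})\,P(E = 1)}_{\text{component attributable to interaction}},
\]

so the proportion of the total effect of $G$ attributable to interaction with $E$ is the second term divided by the total effect.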


Estimating Effects On Rare Outcomes: Knowledge Is Power, Laura B. Balzer, Mark J. Van Der Laan May 2013

U.C. Berkeley Division of Biostatistics Working Paper Series

Many of the secondary outcomes in observational studies and randomized trials are rare. Methods for estimating causal effects and associations with rare outcomes, however, are limited, and this represents a missed opportunity for investigation. In this article, we construct a new targeted minimum loss-based estimator (TMLE) for the effect of an exposure or treatment on a rare outcome. We focus on the causal risk difference and statistical models incorporating bounds on the conditional risk of the outcome, given the exposure and covariates. By construction, the proposed estimator constrains the predicted outcomes to respect this model knowledge. Theoretically, this bounding provides …
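One standard device for encoding known bounds $l \le P(Y=1 \mid A, W) \le u$ — given here only to convey the idea, not necessarily the paper's exact construction — is to rescale the outcome and fluctuate on the logit scale, so every updated prediction automatically respects the bounds:

\[
  Y^{*} = \frac{Y - l}{u - l}, \qquad
  \operatorname{logit} \bar{Q}^{*}_{\epsilon}(A, W)
    = \operatorname{logit} \bar{Q}^{*}(A, W) + \epsilon\, H(A, W), \qquad
  \bar{Q}_{\epsilon}(A, W) = l + (u - l)\,\bar{Q}^{*}_{\epsilon}(A, W),
\]

where $H(A, W)$ is the usual inverse-probability “clever covariate” of TMLE and $\epsilon$ is the fluctuation parameter.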


A Regionalized National Universal Kriging Model Using Partial Least Squares Regression For Estimating Annual PM2.5 Concentrations In Epidemiology, Paul D. Sampson, Mark Richards, Adam A. Szpiro, Silas Bergen, Lianne Sheppard, Timothy V. Larson, Joel Kaufman Dec 2012

UW Biostatistics Working Paper Series

Many cohort studies in environmental epidemiology require accurate modeling and prediction of fine scale spatial variation in ambient air quality across the U.S. This modeling requires the use of small spatial scale geographic or “land use” regression covariates and some degree of spatial smoothing. Furthermore, the details of the prediction of air quality by land use regression and the spatial variation in ambient air quality not explained by this regression should be allowed to vary across the continent due to the large scale heterogeneity in topography, climate, and sources of air pollution. This paper introduces a regionalized national universal kriging …
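In generic notation (a schematic form, not the paper's exact regionalized specification), a universal kriging model of this kind combines a land-use regression mean, built from partial-least-squares summaries of the geographic covariates, with a spatially correlated residual:

\[
  y(s) \;=\; X(s)^{\top}\beta \;+\; \eta(s) \;+\; \varepsilon(s),
  \qquad
  \eta(\cdot) \sim \mathcal{GP}\!\left(0,\; \sigma^{2}\rho_{\phi}(\cdot,\cdot)\right),
  \qquad
  \varepsilon(s) \overset{iid}{\sim} N(0, \tau^{2}),
\]

where $X(s)$ collects the PLS-derived land-use covariates at location $s$, and allowing the covariance parameters $(\sigma^{2}, \phi, \tau^{2})$ to vary by region is what makes the model "regionalized" in the sense described above.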


Flexible Distributed Lag Models Using Random Functions With Application To Estimating Mortality Displacement From Heat-Related Deaths, Roger D. Peng Dec 2011

Johns Hopkins University, Dept. of Biostatistics Working Papers

No abstract provided.


On The Definition Of A Confounder, Tyler J. Vanderweele, Ilya Shpitser Dec 2011

COBRA Preprint Series

The causal inference literature has provided a clear formal definition of confounding expressed in terms of counterfactual independence. The causal inference literature has not, however, produced a clear formal definition of a confounder, as it has given priority to the concept of confounding over that of a confounder. We consider a number of candidate definitions arising from various more informal statements made in the literature. We consider the properties satisfied by each candidate definition, principally focusing on (i) whether under the candidate definition control for all "confounders" suffices to control for "confounding" and (ii) whether each confounder in some context …


Components Of The Indirect Effect In Vaccine Trials: Identification Of Contagion And Infectiousness Effects, Tyler J. Vanderweele, Eric J. Tchetgen, M. Elizabeth Halloran Dec 2011

COBRA Preprint Series

Vaccination of one person may prevent the infection of another either because (i) the vaccine prevents the first from being infected and from infecting the second or because (ii) even if the first person is infected, the vaccine may render the infection less infectious. We might refer to the first of these mechanisms as a contagion effect and the second as an infectiousness effect. In this paper, for the simple setting of a randomized vaccine trial with households of size two, we use counterfactual theory under interference to provide formal definitions of a contagion effect and an infectiousness effect. Using …


Assessing Association For Bivariate Survival Data With Interval Sampling: A Copula Model Approach With Application To AIDS Study, Hong Zhu, Mei-Cheng Wang Nov 2011

Johns Hopkins University, Dept. of Biostatistics Working Papers

In disease surveillance systems or registries, bivariate survival data are typically collected under interval sampling. This refers to a situation in which entry into a registry occurs at the time of the first failure event (e.g., HIV infection) within a calendar time interval, the time of the initiating event (e.g., birth) is retrospectively identified for all cases in the registry, and the second failure event (e.g., death) is subsequently observed during follow-up. Sampling bias is induced by the selection process, because the data are collected conditional on the first failure event occurring within the calendar time interval. Consequently, the …
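As background on the copula formulation named in the title (the specific family below is only an illustration, not necessarily the one used in the paper), a copula ties the marginal survivor functions to the joint survivor function; the Clayton family is a common choice:

\[
  S(t_1, t_2) \;=\; C_{\theta}\bigl(S_1(t_1),\, S_2(t_2)\bigr),
  \qquad
  C_{\theta}(u, v) \;=\; \bigl(u^{-\theta} + v^{-\theta} - 1\bigr)^{-1/\theta},
  \quad \theta > 0,
\]

with Kendall's $\tau = \theta/(\theta + 2)$ summarising the association between the two failure times.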


A Regularization Corrected Score Method For Nonlinear Regression Models With Covariate Error, David M. Zucker, Malka Gorfine, Yi Li, Donna Spiegelman Sep 2011

Harvard University Biostatistics Working Paper Series

No abstract provided.


Estimation Of Risk Ratios In Cohort Studies With Common Outcomes: A Simple And Efficient Two-Stage Approach, Eric J. Tchetgen Sep 2011

Harvard University Biostatistics Working Paper Series

No abstract provided.


On The Nondifferential Misclassification Of A Binary Confounder, Elizabeth L. Ogburn, Tyler J. Vanderweele Sep 2011

COBRA Preprint Series

Consider a study with binary exposure, outcome, and confounder, where the confounder is nondifferentially misclassified. Epidemiologists have long accepted the unproven but oft-cited result that, if the confounder is binary, odds ratios, risk ratios, and risk differences that control for the mismeasured confounder will lie between the crude and the true measures. In this paper the authors provide an analytic proof of the result in the absence of a qualitative interaction between treatment and confounder, and demonstrate via counterexample that the result need not hold when there is a qualitative interaction between treatment and confounder. They also present an …


Modification By Frailty Status Of Ambient Air Pollution Effects On Lung Function In Older Adults In The Cardiovascular Health Study, Sandrah P. Eckel, Thomas A. Louis, Paulo H.M. Chaves, Linda P. Fried, Helene G. Margolis Aug 2011

Johns Hopkins University, Dept. of Biostatistics Working Papers

The susceptibility of older adults to air pollution health effects is well recognized. Advanced age may act as a partial surrogate for conditions associated with aging. The authors investigated whether gerontologic frailty (a clinical health status metric) modified the effects of ambient ozone or particulate matter (PM10) air pollution on lung function in 3382 older adults, using 7 years of follow-up data from the Cardiovascular Health Study (CHS) and the CHS Environmental Factors Ancillary Study. Monthly average pollution and annual frailty assessments were related to up to 3 repeated measurements of lung function using novel cumulative summaries of pollution and frailty histories that …


Variable Importance Analysis With The multiPIM R Package, Stephan J. Ritter, Nicholas P. Jewell, Alan E. Hubbard Jul 2011

U.C. Berkeley Division of Biostatistics Working Paper Series

We describe the R package multiPIM, including statistical background, functionality and user options. The package is for variable importance analysis, and is meant primarily for analyzing data from exploratory epidemiological studies, though it could certainly be applied in other areas as well. The approach taken to variable importance comes from the causal inference field, and is different from approaches taken in other R packages. By default, multiPIM uses a double robust targeted maximum likelihood estimator (TMLE) of a parameter akin to the attributable risk. Several regression methods/machine learning algorithms are available for estimating the nuisance parameters of the models, including …


Reduced Bayesian Hierarchical Models: Estimating Health Effects Of Simultaneous Exposure To Multiple Pollutants, Jennifer F. Bobb, Francesca Dominici, Roger D. Peng Jul 2011

Johns Hopkins University, Dept. of Biostatistics Working Papers

Quantifying the health effects associated with simultaneous exposure to many air pollutants is now a research priority of the US EPA. Bayesian hierarchical models (BHM) have been extensively used in multisite time series studies of air pollution and health to estimate health effects of a single pollutant adjusted for potential confounding of other pollutants and other time-varying factors. However, when the scientific goal is to estimate the impacts of many pollutants jointly, a straightforward application of BHM is challenged by the need to specify a random-effect distribution on a high-dimensional vector of nuisance parameters, which often do not have an …


Bayesian Effect Estimation Accounting For Adjustment Uncertainty, Chi Wang, Giovanni Parmigiani, Francesca Dominici Apr 2011

Harvard University Biostatistics Working Paper Series

No abstract provided.


Threshold Regression Models Adapted To Case-Control Studies, And The Risk Of Lung Cancer Due To Occupational Exposure To Asbestos In France, Antoine Chambaz, Dominique Choudat, Catherine Huber, Jean-Claude Pairon, Mark J. Van Der Laan Mar 2011

U.C. Berkeley Division of Biostatistics Working Paper Series

Asbestos has been known for many years to be a powerful carcinogen. Our purpose is to quantify the relationship between occupational exposure to asbestos and an increased risk of lung cancer. Furthermore, we wish to tackle the delicate question of evaluating, in subjects suffering from lung cancer, how much their exposure to asbestos explains the occurrence of the cancer. For this purpose, we rely on a recent French case-control study. We build a large collection of threshold regression models, data-adaptively select a better model within it by multi-fold likelihood-based cross-validation, and then fit the …


Minimum Description Length And Empirical Bayes Methods Of Identifying SNPs Associated With Disease, Ye Yang, David R. Bickel Nov 2010

COBRA Preprint Series

The goal of determining which of hundreds of thousands of SNPs are associated with disease poses one of the most challenging multiple testing problems. Using the empirical Bayes approach, the local false discovery rate (LFDR) estimated using popular semiparametric models has enjoyed success in simultaneous inference. However, the estimated LFDR can be biased because the semiparametric approach tends to overestimate the proportion of the non-associated single nucleotide polymorphisms (SNPs). One of the negative consequences is that, like conventional p-values, such LFDR estimates cannot quantify the amount of information in the data that favors the null hypothesis of no disease-association.

We …
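For context, the local false discovery rate in the two-group empirical Bayes model takes the standard form below, which makes clear why overestimating the null proportion $\pi_0$ inflates the estimated LFDR (a textbook sketch, not the paper's specific estimator):

\[
  \mathrm{lfdr}(z) \;=\; P(\text{null} \mid Z = z)
  \;=\; \frac{\pi_0\, f_0(z)}{\pi_0\, f_0(z) + (1 - \pi_0)\, f_1(z)}
  \;=\; \frac{\pi_0\, f_0(z)}{f(z)},
\]

where $f_0$ is the density of the test statistic for non-associated SNPs, $f_1$ the density for associated SNPs, and $f$ the marginal mixture density.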


The Handling Of Missing Data In Molecular Epidemiologic Studies, Manisha Desai, Jessica Kubo, Denise Esserman, Mary Beth Terry Nov 2010

COBRA Preprint Series

Background: Molecular epidemiologic studies face a missing data problem as biospecimen data are often collected on only a proportion of subjects eligible for study.

Methods: We investigated all molecular epidemiologic studies published in Cancer Epidemiology, Biomarkers & Prevention (CEBP) in 2009 to characterize the prevalence of missing data and to elucidate how the issue was addressed. We considered multiple imputation (MI), a missing data technique that is readily available and easy to implement, as a possible solution.
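A minimal sketch of the MI workflow referred to above — impute several completed datasets, analyse each, and pool with Rubin's rules; the imputation engine, analysis model, and variable names are illustrative choices, not those of the paper.

import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
import statsmodels.api as sm

def mi_pooled_coefficient(X, y, coef_index, m=20):
    """Multiple imputation for missing covariates with Rubin's-rules pooling."""
    estimates, variances = [], []
    for seed in range(m):
        imputer = IterativeImputer(sample_posterior=True, random_state=seed)
        X_imp = imputer.fit_transform(X)                 # one completed dataset
        fit = sm.OLS(y, sm.add_constant(X_imp)).fit()    # analysis model
        estimates.append(fit.params[coef_index])
        variances.append(fit.bse[coef_index] ** 2)
    q_bar = np.mean(estimates)                           # pooled point estimate
    u_bar = np.mean(variances)                           # within-imputation variance
    b = np.var(estimates, ddof=1)                        # between-imputation variance
    total_var = u_bar + (1 + 1 / m) * b                  # Rubin's total variance
    return q_bar, np.sqrt(total_var)

Analysing each completed dataset and then pooling in this way keeps the extra uncertainty introduced by the imputation in the reported standard error, which is the main advantage over single imputation or complete-case analysis.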

Results: While the majority of studies had missing data, only 16% compared subjects with and without missing data. Furthermore, 95% of the studies with missing data performed a …