Open Access. Powered by Scholars. Published by Universities.®

Articles 1 - 17 of 17

Full-Text Articles in Physical Sciences and Mathematics

Uncertainty And The Value Of Diagnostic Information With Application To Axillary Lymph Node Dissection In Breast Cancer, Giovanni Parmigiani Dec 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

In clinical decision making, it is common to ask whether, and how much, a diagnostic procedure is contributing to subsequent treatment decisions. Statistically, quantification of the value of the information provided by a diagnostic procedure can be carried out using decision trees with multiple decision points, representing both the diagnostic test and the subsequent treatments that may depend on the test's results. This article investigates probabilistic sensitivity analysis approaches for exploring and communicating parameter uncertainty in such decision trees. Complexities arise because uncertainty about a model's inputs determines uncertainty about optimal decisions at all decision nodes of a tree. We …
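The value-of-information calculation described in this abstract can be illustrated on a minimal two-action decision tree. This is a hedged sketch of the general idea only, not the paper's model; the function name, utilities, and parameters below are invented for illustration.

```python
def value_of_diagnostic_test(p, sens, spec, u):
    """Expected gain from acting on a test result versus acting without it,
    in a two-action (treat / no_treat) decision tree.
    p: prior disease probability; sens/spec: test operating characteristics;
    u[(action, disease_state)]: utility of each action-state pair (hypothetical)."""
    # Expected utility of each fixed action taken without testing
    eu_fixed = {a: p * u[(a, 1)] + (1 - p) * u[(a, 0)]
                for a in ('treat', 'no_treat')}
    best_fixed = max(eu_fixed.values())
    # Probability of a positive result and posterior disease probabilities
    p_pos = p * sens + (1 - p) * (1 - spec)
    p_neg = 1.0 - p_pos
    post_pos = p * sens / p_pos
    post_neg = p * (1 - sens) / p_neg
    def best_eu(q):
        # Optimal action at a downstream decision node with disease probability q
        return max(q * u[(a, 1)] + (1 - q) * u[(a, 0)]
                   for a in ('treat', 'no_treat'))
    eu_test = p_pos * best_eu(post_pos) + p_neg * best_eu(post_neg)
    return eu_test - best_fixed
```

With a perfect test (sens = spec = 1) the returned value reduces to the classical expected value of perfect information for this tree; parameter uncertainty, the paper's focus, would enter by propagating distributions on `p`, `sens`, and `spec` through this calculation.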


Kernel Estimation Of Rate Function For Recurrent Event Data, Chin-Tsang Chiang, Mei-Cheng Wang, Chiung-Yu Huang Dec 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Recurrent event data are largely characterized by the rate function, but smoothing techniques for estimating the rate function have never been rigorously developed or studied in the statistical literature. This paper considers the moment and least squares methods for estimating the rate function from recurrent event data. With an independent censoring assumption on the recurrent event process, we study statistical properties of the proposed estimators and propose bootstrap procedures for the bandwidth selection and for the approximation of confidence intervals in the estimation of the occurrence rate function. It is identified that the moment method without resmoothing via a smaller bandwidth …
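The moment-type idea can be sketched as follows, assuming for simplicity that every subject is observed over the same interval with no censoring: pool all event times and average a kernel-smoothed event count over subjects. This is an illustrative sketch, not the authors' estimator; names and the Gaussian kernel choice are assumptions.

```python
import numpy as np

def kernel_rate(event_times, n_subjects, grid, bandwidth):
    """Moment-type kernel estimate of the occurrence rate lambda(t):
    event_times is a list of per-subject arrays of event times;
    the estimate averages kernel-smoothed counts over subjects."""
    t = np.concatenate(event_times)  # pooled event times across subjects
    # Gaussian kernel evaluated at every (grid point, event time) pair
    diffs = (grid[:, None] - t[None, :]) / bandwidth
    k = np.exp(-0.5 * diffs**2) / np.sqrt(2.0 * np.pi)
    return k.sum(axis=1) / (n_subjects * bandwidth)
```

For a homogeneous Poisson process the estimate should hover near the true constant rate away from the interval boundaries; the bandwidth controls the usual bias-variance trade-off that the paper's bootstrap procedures are designed to tune.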


Optimization Of Breast Cancer Screening Modalities, Yu Shen, Giovanni Parmigiani Dec 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Mathematical models and decision analyses based on microsimulations have been shown to be useful in evaluating relative merits of various screening strategies in terms of cost and mortality reduction. Most investigations regarding the balance between mortality reduction and costs have focused on a single modality, mammography. A systematic evaluation of the relative expenses and projected benefit of combining clinical breast examination and mammography is not at present available. The purpose of this report is to provide methodologic details including assumptions and data used in the process of modeling for complex decision analyses, when searching for optimal breast cancer screening strategies …


Modeling The Incubation Period Of Anthrax, Ron Brookmeyer, Elizabeth Johnson, Sarah Barry Dec 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Models of the incubation period of anthrax are important to public health planners because they can be used to predict the delay before outbreaks are detected, the size of an outbreak, and the duration of time that persons should remain on antibiotics to prevent disease. The difficulty is that there is little direct data about the incubation period in humans. The objective of this paper is to develop and apply models for the incubation period of anthrax. Mechanistic models that account for the biology of spore clearance and germination are developed based on a competing risks formulation. The models predict …
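A competing-risks formulation of this kind can be sketched with a small Monte Carlo: each inhaled spore is independently either cleared or germinates (exponential competing hazards), and the incubation period is the time of the first germination. This is a hedged illustration under assumed rates, not the paper's fitted model.

```python
import numpy as np

def simulate_incubation(n_spores, theta, lam, n_sims, rng):
    """Competing-risks sketch: each spore germinates at rate theta or is
    cleared at rate lam. Incubation time = first germination time;
    np.inf is returned when every spore is cleared (no disease)."""
    # Time of each spore's first event (germination or clearance)
    times = rng.exponential(1.0 / (theta + lam), size=(n_sims, n_spores))
    # Each event is a germination with probability theta / (theta + lam)
    germinates = rng.random((n_sims, n_spores)) < theta / (theta + lam)
    times = np.where(germinates, times, np.inf)
    return times.min(axis=1)
```

Under this formulation the probability of disease is 1 - (lam / (theta + lam))**n_spores, so dose (spore count) shifts both attack rate and incubation distribution, which is the qualitative behavior such models are built to capture.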


Underestimation Of Standard Errors In Multi-Site Time Series Studies, Michael Daniels, Francesca Dominici, Scott L. Zeger Nov 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Multi-site time series studies of air pollution and mortality and morbidity have figured prominently in the literature as comprehensive approaches for estimating acute effects of air pollution on health. Hierarchical models are generally used to combine site-specific information and estimate pooled air pollution effects taking into account both within-site statistical uncertainty and across-site heterogeneity.

Within a site, characteristics of time series data of air pollution and health (small pollution effects, missing data, highly correlated predictors, nonlinear confounding, etc.) make modelling all sources of uncertainty challenging. One potential consequence is underestimation of the statistical variance of the site-specific effects to …
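The two-stage pooling this abstract refers to can be sketched with a standard random-effects combination of site-specific estimates. As a stand-in for the hierarchical model I use the DerSimonian-Laird moment estimator (an assumption, not the paper's method); it makes the mechanism visible: if the within-site variances are understated, the pooled standard error is too small unless the heterogeneity term absorbs the shortfall.

```python
import numpy as np

def pool_estimates(beta, var):
    """Random-effects pooling of site-specific effects beta with
    within-site variances var (DerSimonian-Laird moment estimator)."""
    w = 1.0 / var
    beta_fixed = np.sum(w * beta) / np.sum(w)
    q = np.sum(w * (beta - beta_fixed) ** 2)            # Cochran's Q
    tau2 = max(0.0, (q - (len(beta) - 1)) /
               (np.sum(w) - np.sum(w**2) / np.sum(w)))  # across-site heterogeneity
    w_star = 1.0 / (var + tau2)
    pooled = np.sum(w_star * beta) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2
```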


Time-Series Studies Of Particulate Matter, Michelle L. Bell, Jonathan M. Samet, Francesca Dominici Nov 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Studies of air pollution and human health have evolved from descriptive studies of the early phenomena of large increases in adverse health effects following extreme air pollution episodes, to time-series analyses and the development of sophisticated regression models. In fact, advanced statistical methods are necessary to address the many challenges inherent in the detection of a small pollution risk in the presence of many confounders. This paper reviews the history, methods, and findings of the time-series studies estimating health risks associated with short-term exposure to particulate matter, though much of the discussion is applicable to epidemiological studies of air pollution …


Smooth Quantile Ratio Estimation With Regression: Estimating Medical Expenditures For Smoking Attributable Diseases, Francesca Dominici, Scott L. Zeger Nov 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

In this paper we introduce a semi-parametric regression model for estimating the difference in the expected value of two positive and highly skewed random variables as a function of covariates. Our method extends Smooth Quantile Ratio Estimation (SQUARE), a novel estimator of the mean difference of two positive random variables, to a regression model.

The methodological development of this paper is motivated by a common problem in econometrics where we are interested in estimating the difference in the average expenditures between two populations, say with and without a disease, taking covariates into account. Let Y1 and Y2 be two positive …


Loss Function Based Ranking In Two-Stage, Hierarchical Models, Rongheng Lin, Thomas A. Louis, Susan M. Paddock, Greg Ridgeway Nov 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Several authors have studied the performance of optimal, squared error loss (SEL) estimated ranks. Though these are effective, in many applications interest focuses on identifying the relatively good (e.g., in the upper 10%) or relatively poor performers. We construct loss functions that address this goal and evaluate candidate rank estimates, some of which optimize specific loss functions. We study performance for a fully parametric hierarchical model with a Gaussian prior and Gaussian sampling distributions, evaluating performance for several loss functions. Results show that though SEL-optimal ranks and percentiles do not specifically focus on classifying with respect to a percentile cut …
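The contrast between SEL-optimal ranks and percentile-classification goals can be sketched from posterior draws. This is a minimal illustration of the two targets, assuming a matrix of posterior samples; it is not the authors' loss functions or estimators.

```python
import numpy as np

def sel_optimal_ranks(samples):
    """SEL-optimal rank estimates: for posterior draws (n_draws x n_units),
    rank the units within every draw, average ranks over draws, then
    convert the posterior mean ranks back to integer ranks."""
    ranks = samples.argsort(axis=1).argsort(axis=1) + 1   # within-draw ranks
    mean_ranks = ranks.mean(axis=0)                       # posterior mean rank
    return mean_ranks.argsort().argsort() + 1

def above_cutoff_flags(samples, cutoff=0.9):
    """Percentile-classification alternative: flag units whose posterior
    probability of lying above the cutoff percentile exceeds one half."""
    k = samples.shape[1]
    ranks = samples.argsort(axis=1).argsort(axis=1) + 1
    prob_top = (ranks > cutoff * k).mean(axis=0)
    return prob_top > 0.5
```

With well-separated units the two targets agree; the paper's point is that they can diverge when shrinkage and unequal sampling variances come into play.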


Joint Modeling And Estimation For Recurrent Event Processes And Failure Time Data, Chiung-Yu Huang, Mei-Cheng Wang Nov 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Recurrent event data are commonly encountered in longitudinal follow-up studies related to biomedical science, econometrics, reliability, and demography. In many studies, recurrent events serve as important measurements for evaluating disease progression, health deterioration, or insurance risk. When analyzing recurrent event data, an independent censoring condition is typically required for the construction of statistical methods. Nevertheless, in some situations, the terminating time for observing recurrent events could be correlated with the recurrent event process and, as a result, the assumption of independent censoring is violated. In this paper, we consider joint modeling of a recurrent event process and a failure time …


Unification Of Variance Components And Haseman-Elston Regression For Quantitative Trait Linkage Analysis, Wei-Min Chen, Karl W. Broman, Kung-Yee Liang Oct 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Two of the major approaches for linkage analysis with quantitative traits in humans include variance components and Haseman-Elston regression. Previously, these have been viewed as quite separate methods. We describe a general model, fit by use of generalized estimating equations (GEE), for which the variance components and Haseman-Elston methods (including many of the extensions to the original Haseman-Elston method) are special cases, corresponding to different choices for a working covariance matrix. We also show that the regression-based test of Sham et al. (2002) is equivalent to a robust score statistic derived from our GEE approach. These results have several important implications. …
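The original Haseman-Elston regression, one of the special cases unified here, can be sketched directly: regress squared sib-pair trait differences on the proportion of alleles shared identical-by-descent (IBD) at a marker; a negative slope suggests linkage. This is a sketch of the classical method only, with simulated data, not the paper's GEE formulation.

```python
import numpy as np

def haseman_elston(trait1, trait2, ibd):
    """Classic Haseman-Elston regression: OLS of squared sib-pair trait
    differences on marker IBD sharing; returns the slope (negative under
    linkage)."""
    y = (trait1 - trait2) ** 2
    X = np.column_stack([np.ones_like(ibd), ibd])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]
```

The GEE view in the paper recovers this regression (and the variance-components score test) from particular working covariance choices.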


Smooth Quantile Ratio Estimation, Francesca Dominici, Leslie Cope, Daniel Q. Naiman, Scott L. Zeger Oct 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

In a study of health care expenditures attributable to smoking, we seek to compare the distribution of medical costs for persons with lung cancer or chronic obstructive pulmonary disease (cases) to those without (controls) using a national survey which includes hundreds of cases and thousands of controls. The distribution of costs is highly skewed toward larger values, making estimates of the mean from the smaller sample dependent on a small fraction of the biggest values. One approach to dealing with the smaller sample is to rely on a simple parametric model such as the log-normal, but this makes the undesirable …
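The core SQUARE idea can be sketched in a few lines: model the log ratio of the two quantile functions with a low-order polynomial, then integrate the smoothed quantiles to estimate the mean difference, borrowing strength from the larger sample. This is a hedged sketch of the idea under simplifying assumptions (fixed grid, plain polynomial smoother), not the estimator as published.

```python
import numpy as np

def square_mean_difference(y1, y2, degree=2, n_grid=99):
    """Sketch of SQUARE: smooth log Q1(p) - log Q2(p) with a low-order
    polynomial in p, then approximate E[Y1] - E[Y2] by integrating the
    smoothed quantile function against the empirical quantiles of y2."""
    p = (np.arange(n_grid) + 0.5) / n_grid     # midpoint quantile grid
    q1 = np.quantile(y1, p)
    q2 = np.quantile(y2, p)
    log_ratio = np.log(q1) - np.log(q2)
    coef = np.polyfit(p, log_ratio, degree)    # smooth the log quantile ratio
    smooth_ratio = np.exp(np.polyval(coef, p))
    # E[Y] = integral of Q(p) dp over (0, 1), approximated on the grid
    return np.mean(q2 * smooth_ratio) - np.mean(q2)
```

Because the log-quantile ratio is smoothed, the heavy upper tail of the smaller (case) sample influences the estimate through a few polynomial coefficients rather than through a handful of extreme observations.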


Hierarchical Bivariate Time Series Models: A Combined Analysis Of The Effects Of Particulate Matter On Morbidity And Mortality, Francesca Dominici, Antonella Zanobetti, Scott L. Zeger, Joel Schwartz, Jonathan M. Samet Oct 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

In this paper we develop a hierarchical bivariate time series model to characterize the relationship between particulate matter less than 10 microns in aerodynamic diameter (PM10) and both mortality and hospital admissions for cardiovascular diseases. The model is applied to time series data on mortality and morbidity for 10 metropolitan areas in the United States from 1986 to 1993. We postulate that these time series should be related through a shared relationship with PM10.

At the first stage of the hierarchy, we fit two seemingly unrelated Poisson regression models to produce city-specific estimates of the log relative rates of mortality …


Nonparametric Estimation Of The Bivariate Recurrence Time Distribution, Chiung-Yu Huang, Mei-Cheng Wang Oct 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

This paper considers statistical models in which two different types of events, such as the diagnosis of a disease and the remission of the disease, occur alternately over time and are observed subject to right censoring. We propose nonparametric estimators for the joint distribution of bivariate recurrence times and the marginal distribution of the first recurrence time. In general, the marginal distribution of the second recurrence time cannot be estimated due to an identifiability problem, but a conditional distribution of the second recurrence time can be estimated nonparametrically. In the literature, statistical methods have been developed to estimate the joint distribution …


A Nested Unsupervised Approach To Identifying Novel Molecular Subtypes, Elizabeth Garrett, Giovanni Parmigiani Oct 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

In classification problems arising in genomics research it is common to study populations for which a broad class assignment is known (say, normal versus diseased) and one seeks to find undiscovered subclasses within one or both of the known classes. Formally, this problem can be thought of as an unsupervised analysis nested within a supervised one. Here we take the view that the nested unsupervised analysis can successfully utilize information from the entire data set for constructing and/or selecting useful predictors. Specifically, we propose a mixture model approach to the nested unsupervised problem, where the supervised information is used to …
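The unsupervised half of this nested problem, searching for subclasses within one known class, can be sketched with EM for a two-component Gaussian mixture on a single feature. This is a minimal one-dimensional illustration under assumed Gaussian components, not the authors' mixture model for genomic data.

```python
import numpy as np

def em_two_gaussians(x, n_iter=200):
    """EM for a two-component univariate Gaussian mixture: a minimal
    sketch of looking for undiscovered subclasses within one known class.
    Returns (mixing weights, means, standard deviations)."""
    mu = np.array([x.min(), x.max()])        # spread-out initialization
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each observation
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and standard deviations
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma
```

In the nested setting described above, the supervised class labels would additionally inform which features are worth feeding into such a mixture.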


Stochastic Models Based On Molecular Hybridization Theory For Short Oligonucleotide Microarrays, Zhijin Wu, Richard Leblanc, Rafael A. Irizarry Sep 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

High density oligonucleotide expression arrays are a widely used tool for the measurement of gene expression on a large scale. Affymetrix GeneChip arrays appear to dominate this market. These arrays use short oligonucleotides to probe for genes in an RNA sample. Due to optical noise, non-specific hybridization, probe-specific effects, and measurement error, ad hoc measures of expression that summarize probe intensities can lead to imprecise and inaccurate results. Various researchers have demonstrated that expression measures based on simple statistical models can provide great improvements over the ad hoc procedure offered by Affymetrix. Recently, physical models based on molecular hybridization theory have been …


Cross-Calibration Of Stroke Disability Measures: Bayesian Analysis Of Longitudinal Ordinal Categorical Data Using Negative Dependence, Giovanni Parmigiani, Heidi W. Ashih, Gregory P. Samsa, Pamela W. Duncan, Sue Min Lai, David B. Matchar Aug 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

It is common to assess disability of stroke patients using standardized scales, such as the Rankin Stroke Outcome Scale (RS) and the Barthel Index (BI). The Rankin Scale, which was designed for applications to stroke, is based on assessing directly the global conditions of a patient. The Barthel Index, which was designed for general applications, is based on a series of questions about the patient’s ability to carry out 10 basic activities of daily living. As both scales are commonly used, but few studies use both, translating between scales is important in gaining an overall understanding of the efficacy of …


Checking Assumptions In Latent Class Regression Models Via A Markov Chain Monte Carlo Estimation Approach: An Application To Depression And Socio-Economic Status, Elizabeth Garrett, Richard Miech, Pamela Owens, William W. Eaton, Scott L. Zeger Jan 2003

Johns Hopkins University, Dept. of Biostatistics Working Papers

Latent class regression models are useful tools for assessing associations between covariates and latent variables. However, evaluation of key model assumptions cannot be performed using methods from standard regression models due to the unobserved nature of latent outcome variables. This paper presents graphical diagnostic tools to evaluate whether or not latent class regression models adhere to standard assumptions of the model: conditional independence and non-differential measurement. An integral part of these methods is the use of a Markov Chain Monte Carlo estimation procedure. Unlike standard maximum likelihood implementations for latent class regression model estimation, the MCMC approach allows us to …