Penalized Nonparametric Scalar-On-Function Regression Via Principal Coordinates, 2016 New York University School of Medicine

#### Penalized Nonparametric Scalar-On-Function Regression Via Principal Coordinates, Philip T. Reiss, David L. Miller, Pei-Shien Wu, Wen-Yu Hua

*Philip T. Reiss*

What Will Happen Here?, 2016 bepress university

#### What Will Happen Here?, Sidney Fourteen

*Sidney Fourteen*

A Multi-Indexed Logistic Model For Time Series, 2016 East Tennessee State University

#### A Multi-Indexed Logistic Model For Time Series, Xiang Liu

*Electronic Theses and Dissertations*

In this thesis, we explore a multi-indexed logistic regression (MILR) model, with particular emphasis given to its application to time series. MILR includes simple logistic regression (SLR) as a special case, and the hope is that it will in some instances also produce significantly better results. To motivate the development of MILR, we consider its application to the analysis of both simulated sine-wave data and stock data. We first review the well-studied SLR model and its application to the analysis of time series data. Using a more sophisticated representation of sequential data, we then detail the implementation of MILR. We compare ...
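
The thesis's MILR specification is not reproduced in the abstract, but its SLR baseline on sine-wave data can be sketched with a from-scratch logistic fit. Everything below (the series, the lagged feature, the learning rate) is an illustrative assumption, not the thesis's actual setup:

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=200):
    # Plain SLR fit by batch gradient descent on the log-likelihood.
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (y - p) * x
            gb += (y - p)
        w += lr * gw / n
        b += lr * gb / n
    return w, b

# Simulated sine wave; feature = last move, label = whether the next value rises.
series = [math.sin(0.1 * t) for t in range(400)]
xs = [series[t] - series[t - 1] for t in range(1, len(series) - 1)]
ys = [1 if series[t + 1] > series[t] else 0 for t in range(1, len(series) - 1)]
w, b = fit_logistic(xs, ys)
# In-sample accuracy; errors cluster at the turning points of the sine wave.
accuracy = sum((w * x + b > 0) == (y == 1) for x, y in zip(xs, ys)) / len(ys)
```

A multi-indexed model would replace the single lagged feature with a richer indexing of the sequence; the SLR fit above is the special case the abstract mentions.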

Human Exposure Modeling Using Sheds, 2016 Alion Science & Technology Inc

#### Human Exposure Modeling Using Sheds, Luther Smith, William Graham Glen

*Annual Symposium on Biomathematics and Ecology: Education and Research*

No abstract provided.

Risk Objectivism And Risk Subjectivism: When Are Risks Real, 2016 University of New Hampshire

#### Risk Objectivism And Risk Subjectivism: When Are Risks Real, Paul B. Thompson

*RISK: Health, Safety & Environment*

Typically, those who discuss risk management envision a two-step process wherein, first, risk is more or less objectively appraised and, second, the acceptability of those risks is subjectively evaluated. This paper questions the philosophical foundations of that approach.

The Reduced Form Of Litigation Models And The Plaintiff's Win Rate, 2016 University of Pennsylvania Law School

#### The Reduced Form Of Litigation Models And The Plaintiff's Win Rate, Jonah B. Gelbach

*Faculty Scholarship*

In this paper I introduce what I call the reduced form approach to studying the plaintiff's win rate in litigation selection models. A reduced form comprises a joint distribution of plaintiff's and defendant's beliefs concerning the probability that the plaintiff would win in the event a dispute were litigated; a conditional win rate function that tells us the actual probability of a plaintiff win in the event of litigation, given the parties' subjective beliefs; and a litigation rule that provides the probability that a case will be litigated given the two parties' beliefs. I show how models ...
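
The three reduced-form ingredients can be made concrete with a Monte Carlo sketch. Every distributional choice below (uniform case strength, Gaussian belief noise, the threshold litigation rule) is invented for illustration and is not the paper's model:

```python
import random

random.seed(0)

def simulate_win_rate(n=100_000, threshold=0.2):
    # Three reduced-form ingredients (each choice here is illustrative):
    #   joint beliefs: common case strength plus party-specific noise;
    #   litigation rule: litigate when the optimism gap blocks settlement;
    #   conditional win rate: plaintiff wins with probability = case strength.
    litigated = wins = 0
    for _ in range(n):
        strength = random.random()
        p_belief = min(1.0, max(0.0, strength + random.gauss(0, 0.15)))
        d_belief = min(1.0, max(0.0, strength + random.gauss(0, 0.15)))
        if p_belief - d_belief > threshold:
            litigated += 1
            if random.random() < strength:
                wins += 1
    return wins / litigated

win_rate = simulate_win_rate()  # plaintiff win rate among litigated cases
```

The point of the sketch is that the observed win rate is a selected quantity: it depends jointly on all three ingredients, not on case strength alone.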

A Synthesis Of Current Surveillance Planning Methods For The Sequential Monitoring Of Drug And Vaccine Adverse Effects Using Electronic Health Care Data, 2016 Group Health Research Institute; University of Washington

#### A Synthesis Of Current Surveillance Planning Methods For The Sequential Monitoring Of Drug And Vaccine Adverse Effects Using Electronic Health Care Data, Jennifer C. Nelson, Robert Wellman, Onchee Yu, Andrea J. Cook, Judith C. Maro, Rita Ouellet-Hellstrom, Denise Boudreau, James S. Floyd, Susan R. Heckbert, Simone Pinheiro, Marsha Reichman, Azadeh Shoaibi

*eGEMs (Generating Evidence & Methods to improve patient outcomes)*

**Introduction:** The large-scale assembly of electronic health care data combined with the use of sequential monitoring has made proactive postmarket drug- and vaccine-safety surveillance possible. Although sequential designs have been used extensively in randomized trials, less attention has been given to methods for applying them in observational electronic health care database settings.

**Existing Methods:** We review current sequential-surveillance planning methods from randomized trials, and the Vaccine Safety Datalink (VSD) and Mini-Sentinel Pilot projects—two national observational electronic health care database safety monitoring programs.

**Future Surveillance Planning:** Based on this examination, we suggest three steps for future surveillance planning in health ...

Advances In Portmanteau Diagnostic Tests, 2016 The University of Western Ontario

#### Advances In Portmanteau Diagnostic Tests, Jinkun Xiao

*Electronic Thesis and Dissertation Repository*

The portmanteau test serves an important role in model diagnostics for Box-Jenkins modelling procedures. A large number of portmanteau tests based on the autocorrelation function have been proposed as general-purpose goodness-of-fit tests. Since the asymptotic distributions of the statistics have a complicated form that makes it hard to obtain the p-value directly, the gamma approximation is introduced to obtain the p-value. But the approximation inevitably introduces approximation errors and needs a large number of observations to yield a good approximation. To avoid some pitfalls in the approximation, the Lin-McLeod test is further proposed to obtain a numeric solution to ...
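
The best-known portmanteau statistic is Ljung-Box, which the variants discussed here build on. A minimal stdlib sketch (series and lag choices are illustrative):

```python
import random

random.seed(1)

def acf(x, k):
    # Lag-k sample autocorrelation.
    n, m = len(x), sum(x) / len(x)
    denom = sum((v - m) ** 2 for v in x)
    return sum((x[t] - m) * (x[t - k] - m) for t in range(k, n)) / denom

def ljung_box(x, m=10):
    # Q = n(n+2) * sum_{k=1}^{m} r_k^2 / (n-k); approximately chi-square(m)
    # under the white-noise null.
    n = len(x)
    return n * (n + 2) * sum(acf(x, k) ** 2 / (n - k) for k in range(1, m + 1))

noise = [random.gauss(0, 1) for _ in range(500)]
ar = [0.0]
for _ in range(499):
    ar.append(0.8 * ar[-1] + random.gauss(0, 1))  # strongly autocorrelated AR(1)

q_noise = ljung_box(noise)  # should be chi-square(10)-sized
q_ar = ljung_box(ar)        # autocorrelation inflates Q far past 18.31,
                            # the chi-square(10) 95th percentile
```

The chi-square reference distribution is itself an approximation when applied to residuals, which is exactly the gap the gamma approximation and the Lin-McLeod test address.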

Model Averaged Double Robust Estimation, 2016 Harvard School of Public Health

#### Model Averaged Double Robust Estimation, Matthew Cefalu, Francesca Dominici, Nils D. Arvold Md, Giovanni Parmigiani

*Harvard University Biostatistics Working Paper Series*

Existing methods in causal inference do not account for the uncertainty in the selection of confounders. We propose a new class of estimators for the average causal effect, the model averaged double robust estimators, that formally account for model uncertainty in both the propensity score and outcome model through the use of Bayesian model averaging. These estimators build on the desirable double robustness property by requiring only that the true propensity score model or the true outcome model be within a specified class of models to maintain consistency. We provide asymptotic results and conduct a large scale simulation study that indicates ...
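
The double robustness the abstract builds on is that of the augmented IPW (AIPW) estimator. A sketch without the paper's model-averaging layer; the data-generating process is invented, and the true propensity and outcome models are plugged in so the example is self-contained (in practice both are estimated):

```python
import random

random.seed(2)

def aipw(data, e, m1, m0):
    # Augmented IPW estimator of the average causal effect:
    # consistent if either the propensity model e(x) or the outcome
    # models m1(x), m0(x) are correctly specified.
    est = 0.0
    for x, a, y in data:
        ex = e(x)
        est += (m1(x) - m0(x)
                + a * (y - m1(x)) / ex
                - (1 - a) * (y - m0(x)) / (1 - ex))
    return est / len(data)

# Toy data with true average causal effect 2.
data = []
for _ in range(20_000):
    x = random.random()
    a = 1 if random.random() < 0.3 + 0.4 * x else 0
    y = 2 * a + 3 * x + random.gauss(0, 1)
    data.append((x, a, y))

effect = aipw(data,
              e=lambda x: 0.3 + 0.4 * x,   # true propensity score
              m1=lambda x: 2 + 3 * x,      # true outcome model, treated
              m0=lambda x: 3 * x)          # true outcome model, untreated
```

The model-averaged version replaces the single `e`, `m1`, `m0` with posterior-weighted mixtures over a class of candidate models.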

Probabilistic Methods In Information Theory, 2016 Cal State University-San Bernardino

#### Probabilistic Methods In Information Theory, Erik W. Pachas

*Electronic Theses, Projects, and Dissertations*

Given a probability space, we analyze the uncertainty, that is, the amount of information of a finite system, by studying the entropy of the system. We also extend the concept of entropy to a dynamical system by introducing a measure preserving transformation on a probability space. After showing some theorems and applications of entropy theory, we study the concept of ergodicity, which helps us to further analyze the information of the system.
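
The entropy of a finite system is a short computation. A sketch with illustrative distributions:

```python
import math

def entropy(probs):
    # Shannon entropy H(p) = -sum_i p_i log2 p_i of a finite system;
    # the convention 0 * log 0 = 0 handles zero-probability outcomes.
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = entropy([0.25] * 4)           # maximal uncertainty: log2(4) = 2 bits
skewed = entropy([0.7, 0.1, 0.1, 0.1])  # less uncertain, lower entropy
certain = entropy([1.0, 0.0, 0.0, 0.0]) # no uncertainty at all
```

The dynamical extension mentioned in the abstract replaces the single partition with refinements under iterates of a measure-preserving transformation and takes a limit of such per-partition entropies.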

Prevalence Estimation At The Cluster Level For Correlated Binary Data Using Random Partial-Cluster Sampling, 2016 University of North Carolina at Chapel Hill

#### Prevalence Estimation At The Cluster Level For Correlated Binary Data Using Random Partial-Cluster Sampling, Rujin Wang, John S. Preisser

*The University of North Carolina at Chapel Hill Department of Biostatistics Technical Report Series*

For clustered data in the medical sciences, disease is present when one or more of the observations in the cluster has the disease condition. This paper focuses on estimation of periodontal disease prevalence, defined as the probability that one or more tooth sites have disease in a randomly selected subject. The prohibitive exam time and monetary cost of the full-mouth examination make partial-mouth recording protocols attractive alternatives for assessing chronic periodontitis. In particular, Beck et al. (2006) proposed the random site selection method (RSSM), which pre-specifies a fixed number of tooth sites to be selected randomly from each subject ...
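
A toy simulation (not Beck et al.'s RSSM estimator itself) shows why naively applying the "one or more diseased sites" definition to a random partial-mouth sample understates subject-level prevalence. The beta-binomial-style site model and all constants below are assumptions:

```python
import random

random.seed(3)

N_SITES, N_SAMPLED, N_SUBJECTS = 28, 6, 50_000

def subject_sites():
    # Correlated binary site data: a subject-level risk induces within-mouth
    # correlation (a simple illustrative mechanism, not the paper's model).
    risk = random.betavariate(0.5, 4.0)
    return [random.random() < risk for _ in range(N_SITES)]

full = partial = 0
for _ in range(N_SUBJECTS):
    sites = subject_sites()
    full += any(sites)                              # full-mouth examination
    partial += any(random.sample(sites, N_SAMPLED))  # random partial sample

full_rate = full / N_SUBJECTS     # true subject-level prevalence
partial_rate = partial / N_SUBJECTS  # naive partial-mouth estimate, biased low
```

Sampling only 6 of 28 sites can miss every diseased site in a mildly diseased mouth, which is the bias RSSM-based estimators are designed to correct.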

Distance-Based Analysis Of Variance For Brain Connectivity, 2016 Department of Biostatistics and Epidemiology, Perelman School of Medicine, University of Pennsylvania

#### Distance-Based Analysis Of Variance For Brain Connectivity, Russell T. Shinohara, Haochang Shou, Marco Carone, Robert Schultz, Birkan Tunc, Drew Parker, Ragini Verma

*UPenn Biostatistics Working Papers*

The field of neuroimaging dedicated to mapping connections in the brain is increasingly being recognized as key for understanding neurodevelopment and pathology. Networks of these connections are quantitatively represented using complex structures including matrices, functions, and graphs, which require specialized statistical techniques for estimation and inference about developmental and disorder-related changes. Unfortunately, classical statistical testing procedures are not well suited to high-dimensional testing problems. In the context of global or regional tests for differences in neuroimaging data, traditional analysis of variance (ANOVA) is not directly applicable without first summarizing the data into univariate or low-dimensional features, a process that may ...
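
Distance-based analysis of variance in the PERMANOVA style works directly from a pairwise distance matrix, partitioning squared distances into between- and within-group parts and calibrating a pseudo-F by permutation. The one-dimensional "connectivity" features below are purely illustrative:

```python
import random

def permanova_f(dist, labels):
    # Pseudo-F from pairwise distances: SS_total = sum_{i<j} d_ij^2 / n,
    # SS_within accumulated per group, F = ratio of between/within mean squares.
    n = len(labels)
    groups = sorted(set(labels))
    ss_total = sum(dist[i][j] ** 2 for i in range(n) for j in range(i + 1, n)) / n
    ss_within = 0.0
    for g in groups:
        idx = [i for i in range(n) if labels[i] == g]
        ss_within += sum(dist[i][j] ** 2 for i in idx for j in idx if i < j) / len(idx)
    ss_between = ss_total - ss_within
    return (ss_between / (len(groups) - 1)) / (ss_within / (n - len(groups)))

random.seed(4)
# Toy scalar features standing in for connectivity summaries; group B is shifted.
xs = [random.gauss(0, 1) for _ in range(20)] + [random.gauss(2, 1) for _ in range(20)]
labels = ["A"] * 20 + ["B"] * 20
dist = [[abs(a - b) for b in xs] for a in xs]
f_obs = permanova_f(dist, labels)

# Calibrate by permuting labels, avoiding any Gaussian or low-dimension assumption.
perm_f = [permanova_f(dist, random.sample(labels, len(labels))) for _ in range(199)]
p_value = (1 + sum(f >= f_obs for f in perm_f)) / 200
```

The same machinery applies unchanged when `dist` comes from matrices, functions, or graphs, which is what makes the distance-based formulation attractive for connectivity data.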

Addition To Pglr Chap 6, 2016 Arizona State University

#### Addition To Pglr Chap 6, Joseph M. Hilbe

*Joseph M Hilbe*

Matching The Efficiency Gains Of The Logistic Regression Estimator While Avoiding Its Interpretability Problems, In Randomized Trials With Binary Outcomes, 2016 Johns Hopkins Bloomberg School of Public Health, Department of Biostatistics

#### Matching The Efficiency Gains Of The Logistic Regression Estimator While Avoiding Its Interpretability Problems, In Randomized Trials With Binary Outcomes, Michael Rosenblum, Jon Arni Steingrimsson

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

Adjusting for prognostic baseline covariates can improve precision in analyzing randomized trials, leading to greater power to detect a treatment effect. For binary outcomes, a logistic regression estimator is commonly used for such adjustment. This has led to substantial efficiency gains in practice; for example, gains equivalent to reducing the required sample size by 20-28% were observed in a recent survey of traumatic brain injury trials. Robinson and Jewell (1991) proved that the logistic regression estimator is guaranteed to have equal or better asymptotic efficiency compared to the unadjusted estimator (which ignores baseline variables). Unfortunately, the logistic regression estimator has ...
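
One interpretable way to use a fitted logistic model in a randomized trial is standardization: average the model's predictions with treatment set to 1 and then to 0, and take the difference, which is a marginal risk difference rather than a conditional odds ratio. A sketch on simulated data (this is a standard construction, not necessarily the estimator the paper analyzes; all numbers are illustrative):

```python
import math, random

random.seed(5)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(rows, lr=0.5, epochs=300):
    # logit P(Y=1) = b0 + b1*a + b2*w, fit by batch gradient descent.
    b = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        g = [0.0, 0.0, 0.0]
        for a, w, y in rows:
            p = sigmoid(b[0] + b[1] * a + b[2] * w)
            for j, xj in enumerate((1.0, a, w)):
                g[j] += (y - p) * xj
        b = [bj + lr * gj / len(rows) for bj, gj in zip(b, g)]
    return b

rows = []
for _ in range(5000):
    a = random.randint(0, 1)          # randomized treatment assignment
    w = random.gauss(0, 1)            # prognostic baseline covariate
    y = int(random.random() < sigmoid(-0.5 + 0.8 * a + w))
    rows.append((a, w, y))

b0, b1, b2 = fit_logistic(rows)
# Standardization: average predictions with everyone treated, then untreated.
risk1 = sum(sigmoid(b0 + b1 + b2 * w) for _, w, _ in rows) / len(rows)
risk0 = sum(sigmoid(b0 + b2 * w) for _, w, _ in rows) / len(rows)
risk_difference = risk1 - risk0       # marginal, directly interpretable
```

The conditional coefficient `b1` is an odds ratio given `w`; the standardized `risk_difference` is the population-level quantity trials usually want to report.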

The Use Of Permutation Tests For The Analysis Of Parallel And Stepped-Wedge Cluster Randomized Trials, 2016 Harvard University

#### The Use Of Permutation Tests For The Analysis Of Parallel And Stepped-Wedge Cluster Randomized Trials, Rui Wang, Victor Degruttola

*Harvard University Biostatistics Working Paper Series*

We investigate the use of permutation tests for the analysis of parallel and stepped-wedge cluster randomized trials. Permutation tests for parallel designs with exponential family endpoints have been extensively studied. The optimal permutation tests developed for exponential family alternatives require information on intraclass correlation, a quantity not yet defined for time-to-event endpoints. Therefore, it is unclear how efficient permutation tests can be constructed for cluster-randomized trials with such endpoints. We consider a class of test statistics formed by a weighted average of pair-specific treatment effect estimates and offer practical guidance on the choice of weights to improve efficiency. We apply ...
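
The permutation logic for a parallel cluster-randomized design can be sketched on cluster-level summaries (a simple unweighted mean difference; the efficiency-improving weights the abstract discusses are omitted, and the data are simulated):

```python
import random, statistics

def perm_test(treat, control, n_perm=1000, seed=6):
    # Under the null of no treatment effect, cluster labels are exchangeable:
    # reshuffle them and compare the observed mean difference with its
    # permutation distribution.
    rng = random.Random(seed)
    pooled = treat + control
    k = len(treat)
    obs = statistics.mean(treat) - statistics.mean(control)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = statistics.mean(pooled[:k]) - statistics.mean(pooled[k:])
        if abs(diff) >= abs(obs):
            extreme += 1
    return extreme / n_perm

rng = random.Random(7)
treat = [rng.gauss(1.0, 0.3) for _ in range(10)]    # cluster-level summaries
control = [rng.gauss(0.0, 0.3) for _ in range(10)]  # true effect = 1.0
p_value = perm_test(treat, control)
```

Because validity comes from the randomization itself, the same test applies to time-to-event summaries where no intraclass correlation is defined; the choice of summary and weights affects only power.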

Improving Precision By Adjusting For Baseline Variables In Randomized Trials With Binary Outcomes, Without Regression Model Assumptions, 2016 Johns Hopkins Bloomberg School of Public Health

#### Improving Precision By Adjusting For Baseline Variables In Randomized Trials With Binary Outcomes, Without Regression Model Assumptions, Jon Arni Steingrimsson, Daniel F. Hanley, Michael Rosenblum

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

In randomized clinical trials with baseline variables that are prognostic for the primary outcome, there is potential to improve precision and reduce sample size by appropriately adjusting for these variables. A major challenge is that there are multiple statistical methods to adjust for baseline variables, but little guidance on which is best to use in a given context. The choice of method can have important consequences. For example, one commonly used method leads to uninterpretable estimates if there is any treatment effect heterogeneity, which would jeopardize the validity of trial conclusions. We give practical guidance on how to avoid this ...

After Halliburton: Event Studies And Their Role In Federal Securities Fraud Litigation, 2016 University of Pennsylvania Law School

#### After Halliburton: Event Studies And Their Role In Federal Securities Fraud Litigation, Jill E. Fisch, Jonah B. Gelbach, Jonathan Klick

*Jill Fisch*

Event studies have become increasingly important in securities fraud litigation after the Supreme Court’s decision in *Halliburton II*. Litigants have used event study methodology, which empirically analyzes the relationship between the disclosure of corporate information and the issuer’s stock price, to provide evidence in the evaluation of key elements of federal securities fraud, including materiality, reliance, causation, and damages. As the use of event studies grows and they increasingly serve a gatekeeping function in determining whether litigation will proceed beyond a preliminary stage, it will be critical for courts to use them correctly.

This Article explores an array ...
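
The market-model machinery behind most securities-litigation event studies fits in a few lines: regress the stock's returns on the market's over an estimation window, then measure the event-day return against the model's prediction. All returns below are simulated and the 5% event-day drop is invented:

```python
import random

random.seed(8)

def market_model(stock, market):
    # OLS of stock returns on market returns over the estimation window.
    mx = sum(market) / len(market)
    my = sum(stock) / len(stock)
    beta = (sum((m - mx) * (s - my) for m, s in zip(market, stock))
            / sum((m - mx) ** 2 for m in market))
    return my - beta * mx, beta          # (alpha, beta)

# 250-day estimation window; the stock tracks the market with beta = 1.2.
market = [random.gauss(0.0005, 0.01) for _ in range(250)]
stock = [0.0001 + 1.2 * m + random.gauss(0, 0.005) for m in market]
alpha, beta = market_model(stock, market)

# Event day: a disclosure knocks 5% off the price beyond the market move.
event_market = 0.002
event_stock = 0.0001 + 1.2 * event_market - 0.05
abnormal_return = event_stock - (alpha + beta * event_market)
```

Whether an abnormal return of this size is statistically distinguishable from the estimation-window noise is precisely the question courts must weigh when event studies serve their gatekeeping role.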

Actuarial Modelling With Mixtures Of Markov Chains, 2016 The University of Western Ontario

#### Actuarial Modelling With Mixtures Of Markov Chains, Yuzhou Zhang

*Electronic Thesis and Dissertation Repository*

Multi-state models are widely used in actuarial science because they provide a convenient way of representing changes in people's statuses. Calculations are easy if one assumes that the model is a Markov chain. However, the memoryless property of a Markov chain is rarely appropriate.

This thesis considers several mixtures of Markov chains to capture the heterogeneity of people's mortality rates, morbidity rates, recovery rates, and ageing speeds. This heterogeneity may be the result of unobservable factors that affect individuals' health. The focus of this thesis is on investigating the behaviours of intensities of the observable transitions in ...
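
A minimal simulation of a two-point mixture of Markov chains shows the effect the thesis studies: each component is memoryless, yet marginally the observed death intensity declines with age because survivors are selected toward the robust component. All rates below are illustrative:

```python
import random

random.seed(9)

def observed_hazard(step, n=100_000):
    # Probability of dying at `step` among those still alive, marginally over
    # a 50/50 mixture of frail (q = 0.3) and robust (q = 0.05) lives.
    at_risk = deaths = 0
    for _ in range(n):
        q = 0.3 if random.random() < 0.5 else 0.05  # draw the mixture component
        t = 0
        while t < step and random.random() >= q:    # survive earlier steps
            t += 1
        if t == step:                               # reached `step` alive
            at_risk += 1
            if random.random() < q:
                deaths += 1
    return deaths / at_risk

early, late = observed_hazard(0), observed_hazard(10)
# `late` is well below `early`: pure frailty selection, with no change in
# either component's intensity.
```

The marginal process is therefore not Markov: the observable transition intensities depend on time in state even though each mixture component's do not.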

Survival Analysis In A Clinical Setting, 2016 Washington University in St. Louis

#### Survival Analysis In A Clinical Setting, Yunzhao Liu

*Arts & Sciences Electronic Theses and Dissertations*

With the fast-paced advancement of modern medicine, cancer treatments have improved greatly over the past few decades; however, the overall survival rate has not improved for head and neck squamous cell carcinoma (HNSCC). Traditionally, the population affected by HNSCC consisted of males over 50-60 years of age with a history of alcohol and tobacco use. In recent decades, however, HNSCC has exhibited a significant rise among younger patients, largely due to the increase in human papillomavirus (HPV) infection among young adults.

Generally, HPV, the most prevalent sexually transmitted disease, consists of strains that do not cause harm to ...
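
The workhorse of survival analysis in a clinical setting is the Kaplan-Meier product-limit estimator, which handles the censored follow-up typical of HNSCC cohorts. A self-contained sketch with invented follow-up data:

```python
def kaplan_meier(times, events):
    # Kaplan-Meier estimate of the survival function S(t);
    # events[i] is 1 for an observed death, 0 for a censored follow-up time.
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        while i < len(data) and data[i][0] == t:    # gather ties at time t
            at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:                                  # survival drops only at deaths
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_t                           # censored subjects leave risk set
    return curve

# Invented follow-up times in months; 0 marks a censored observation.
times  = [3, 5, 5, 8, 10, 12, 15, 15, 18, 20]
events = [1, 1, 0, 1,  0,  1,  1,  0,  1,  0]
curve = kaplan_meier(times, events)
```

Comparing such curves between, say, HPV-positive and HPV-negative patients (via a log-rank test or Cox model) is the standard next step in this kind of clinical analysis.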