Statistical Methods On Risk Management Of Extreme Events, 2017 University of Massachusetts Amherst

#### Statistical Methods On Risk Management Of Extreme Events, Zijing Zhang

*Doctoral Dissertations*

The goal of the dissertation is to investigate financial risk analysis methodologies, using schemes for extreme value modeling as well as techniques from copula modeling.

Extreme value theory is concerned with probabilistic and statistical questions related to unusual behavior or rare events. The subject has a rich mathematical theory and also a long tradition of applications in a variety of areas. We are interested in its application in risk management, with a focus on estimating and forecasting the Value-at-Risk of financial time series data. Extremal data are inherently scarce, thus making inference challenging. In order to obtain ...
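As an illustration of the kind of extreme-value machinery the abstract refers to, the sketch below estimates Value-at-Risk with a standard peaks-over-threshold recipe using a Hill tail-index estimate. This is a textbook EVT approach, not the dissertation's specific models; the threshold choice and sample are illustrative.

```python
import numpy as np

def hill_var(losses, alpha=0.99, threshold_quantile=0.95):
    """Peaks-over-threshold VaR at level `alpha` using the Hill estimator
    of the tail index and the Weissman extrapolation formula."""
    x = np.sort(np.asarray(losses, dtype=float))
    n = len(x)
    k = int(n * (1 - threshold_quantile))   # number of upper order statistics
    u = x[n - k - 1]                        # empirical threshold
    tail = x[n - k:]                        # the k largest observations
    xi = np.mean(np.log(tail / u))          # Hill estimate of the tail index
    # Weissman-type VaR extrapolation beyond the threshold
    return u * (k / (n * (1 - alpha))) ** xi

rng = np.random.default_rng(0)
sample = rng.standard_t(df=4, size=5000)    # heavy-tailed "losses"
var99 = hill_var(sample, alpha=0.99)
```

For the Student-t(4) sample the 99% VaR estimate lands near the true quantile (about 3.7); the same function extrapolates further into the tail as `alpha` grows.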

Informational Index And Its Applications In High Dimensional Data, 2017 University of Kentucky

#### Informational Index And Its Applications In High Dimensional Data, Qingcong Yuan

*Theses and Dissertations--Statistics*

We introduce a new class of measures for testing independence between two random vectors, which uses expected difference of conditional and marginal characteristic functions. By choosing a particular weight function in the class, we propose a new index for measuring independence and study its property. Two empirical versions are developed, their properties, asymptotics, connection with existing measures and applications are discussed. Implementation and Monte Carlo results are also presented.
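A related, well-known member of this family of characteristic-function-based independence measures is Székely's distance covariance; the paper's new index differs (it contrasts conditional and marginal characteristic functions), but the sketch below shows the flavor of the computation.

```python
import numpy as np

def dcov(x, y):
    """Empirical distance covariance between two 1-D samples (Szekely et al.):
    zero in the population exactly under independence."""
    x = np.asarray(x, dtype=float)[:, None]
    y = np.asarray(y, dtype=float)[:, None]
    a = np.abs(x - x.T)                     # pairwise distance matrices
    b = np.abs(y - y.T)
    # double-center each distance matrix
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    return np.sqrt(max((A * B).mean(), 0.0))

rng = np.random.default_rng(1)
x = rng.normal(size=500)
dep = dcov(x, x**2)                  # dependent but uncorrelated pair
indep = dcov(x, rng.normal(size=500))
```

The dependent-but-uncorrelated pair `(x, x**2)` yields a clearly larger statistic than the independent pair, which is exactly what Pearson correlation would miss.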

We propose a two-stage sufficient variable selections method based on the new index to deal with large p small n data. The method does not require model specification and especially focuses ...

Nonparametric Compound Estimation, Derivative Estimation, And Change Point Detection, 2017 University of Kentucky

#### Nonparametric Compound Estimation, Derivative Estimation, And Change Point Detection, Sisheng Liu

*Theses and Dissertations--Statistics*

Firstly, we reviewed some popular nonparametric regression methods from the past several decades. Then we extended compound estimation (Charnigo and Srinivasan [2011]) to accommodate random design points and heteroskedasticity, and proposed a modified Cp criterion for tuning parameter selection. Moreover, we developed a *DCp* criterion for the tuning parameter selection problem in general nonparametric derivative estimation, which extends the *GCp* criterion of Charnigo, Hall and Srinivasan [2011] to random design points and heteroskedasticity. Next, we proposed a change point detection method via compound estimation for both the fixed and random design cases; adaptation to heteroskedasticity was considered for the method ...

Inference Using Bhattacharyya Distance To Model Interaction Effects When The Number Of Predictors Far Exceeds The Sample Size, 2017 University of Kentucky

#### Inference Using Bhattacharyya Distance To Model Interaction Effects When The Number Of Predictors Far Exceeds The Sample Size, Sarah A. Janse

*Theses and Dissertations--Statistics*

In recent years, statistical analyses, algorithms, and modeling of big data have been constrained due to computational complexity. Further, the added complexity of relationships among response and explanatory variables, such as higher-order interaction effects, make identifying predictors using standard statistical techniques difficult. These difficulties are only exacerbated in the case of small sample sizes in some studies. Recent analyses have targeted the identification of interaction effects in big data, but the development of methods to identify higher-order interaction effects has been limited by computational concerns. One recently studied method is the Feasible Solutions Algorithm (FSA), a fast, flexible method that ...

A Semiparametric Inference To Regression Analysis With Missing Covariates In Survey Data, 2017 North Carolina State University

#### A Semiparametric Inference To Regression Analysis With Missing Covariates In Survey Data, Shu Yang, Jae Kwang Kim

*Statistics Publications*

Parameter estimation in parametric regression models with missing covariates is considered under a survey sampling setup. Under missingness at random, a semiparametric maximum likelihood approach is proposed which requires no parametric specification of the marginal covariate distribution. By drawing from the von Mises calculus and V-Statistics theory, we obtain an asymptotic linear representation of the semiparametric maximum likelihood estimator (SMLE) of the regression parameters, which allows for a consistent estimator of asymptotic variance. An EM algorithm for computation is then developed to implement the proposed method using fractional imputation. Simulation results suggest that the SMLE method is robust, whereas the ...

Penalized Nonparametric Scalar-On-Function Regression Via Principal Coordinates, 2016 New York University School of Medicine

#### Penalized Nonparametric Scalar-On-Function Regression Via Principal Coordinates, Philip T. Reiss, David L. Miller, Pei-Shien Wu, Wen-Yu Hua

*Philip T. Reiss*

Semiparametric Adaptive Estimation With Nonignorable Nonresponse Data, 2016 Osaka University

#### Semiparametric Adaptive Estimation With Nonignorable Nonresponse Data, Kosuke Morikawa, Jae Kwang Kim

*Statistics Preprints*

When the response mechanism is believed to be nonignorable or not missing at random (NMAR), a valid analysis requires stronger assumptions about the data than do standard statistical methods. Semiparametric estimators have been developed under the correct model specification assumption for the response mechanism. In this paper, we consider a scheme for obtaining the optimal estimation for the parameters such as the mean and propose two semiparametric adaptive estimators that do not require any model assumptions except for the response mechanism. Asymptotic properties of proposed estimators are discussed, and we present an application to Korean Labor and Income ...

Improving Power In Group Sequential, Randomized Trials By Adjusting For Prognostic Baseline Variables And Short-Term Outcomes, 2016 Department of Biostatistics, Johns Hopkins Bloomberg School of Public Health

#### Improving Power In Group Sequential, Randomized Trials By Adjusting For Prognostic Baseline Variables And Short-Term Outcomes, Tianchen Qian, Michael Rosenblum, Huitong Qiu

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

In group sequential designs, adjusting for baseline variables and short-term outcomes can lead to increased power and reduced sample size. We derive formulas for the precision gain from such variable adjustment using semiparametric estimators for the average treatment effect, and give new results on what conditions lead to substantial power gains and sample size reductions. The formulas reveal how the impact of prognostic variables on the precision gain is modified by the number of pipeline participants, analysis timing, enrollment rate, and treatment effect heterogeneity, when the semiparametric estimator uses correctly specified models. Given set prognostic value of baseline variables and ...

Stochastic Optimization Of Adaptive Enrichment Designs For Two Subpopulations, 2016 Harvard T.H. Chan School of Public Health

#### Stochastic Optimization Of Adaptive Enrichment Designs For Two Subpopulations, Aaron Fisher, Michael Rosenblum

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

An adaptive enrichment design is a randomized trial that allows enrollment criteria to be modified at interim analyses, based on a preset decision rule. When there is prior uncertainty regarding treatment effect heterogeneity, these trial designs can provide improved power for detecting treatment effects in subpopulations. We present a simulated annealing approach to search over the space of decision rules and other parameters for an adaptive enrichment design. The goal is to minimize the expected number enrolled or expected duration, while preserving the appropriate power and Type I error rate. We also explore the benefits of parallel computation in the ...
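The core search tool named in the abstract, simulated annealing, can be sketched in a few lines. The toy objective below stands in for the real one (expected sample size or duration subject to power and Type I error constraints), which is far more involved.

```python
import math
import random

def anneal(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000, seed=0):
    """Minimize `objective` over a 1-D parameter by simulated annealing."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)       # propose a nearby design
        fc = objective(cand)
        # accept downhill moves always; uphill moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                              # cool the temperature
    return best, fbest

# hypothetical "expected sample size" objective with a known minimum at x = 2
best_x, best_f = anneal(lambda x: (x - 2.0) ** 2 + 1.0, x0=-5.0)
```

Early high temperatures let the search escape local minima; the geometric cooling schedule then concentrates it near the global optimum.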

Monte Carlo Methods In Bayesian Inference: Theory, Methods And Applications, 2016 University of Arkansas, Fayetteville

#### Monte Carlo Methods In Bayesian Inference: Theory, Methods And Applications, Huarui Zhang

*Theses and Dissertations*

Monte Carlo methods are becoming more and more popular in statistics due to the fast development of efficient computing technologies. One of the major beneficiaries of this advent is the field of Bayesian inference. The aim of this thesis is two-fold: (i) to explain the theory justifying the validity of the simulation-based schemes in a Bayesian setting (why they should work) and (ii) to apply them in several different types of data analysis that a statistician has to routinely encounter. In Chapter 1, I introduce key concepts in Bayesian statistics. Then we discuss Monte Carlo Simulation methods in detail. Our ...
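The simplest simulation-based scheme of the kind the thesis justifies is random-walk Metropolis. The sketch below targets the posterior of a normal mean under a normal prior, a conjugate setup chosen so the exact posterior, N(1.5, 0.5), is known and can be checked.

```python
import math
import random

def metropolis(logpost, x0, n=20000, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a 1-D log-posterior."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    draws = []
    for _ in range(n):
        cand = x + rng.gauss(0.0, step)           # symmetric proposal
        lc = logpost(cand)
        if math.log(rng.random()) < lc - lp:      # accept/reject step
            x, lp = cand, lc
        draws.append(x)
    return draws[n // 4:]                         # discard burn-in

# One observation y = 3 with unit variance; prior mu ~ N(0, 1).
# Conjugacy gives the exact posterior mu | y ~ N(1.5, 0.5).
y = 3.0
logpost = lambda mu: -0.5 * (y - mu) ** 2 - 0.5 * mu ** 2
draws = metropolis(logpost, x0=0.0)
post_mean = sum(draws) / len(draws)
```

The Monte Carlo estimate of the posterior mean converges to 1.5, the value the conjugate algebra predicts, which is the "why it should work" guarantee in miniature.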

On The Comparison Of The Strength Of Morphological Integration Across Morphometric Datasets, 2016 Iowa State University

#### On The Comparison Of The Strength Of Morphological Integration Across Morphometric Datasets, Dean C. Adams, Michael L. Collyer

*Ecology, Evolution and Organismal Biology Publications*

Evolutionary morphologists frequently wish to understand the extent to which organisms are integrated, and whether the strength of morphological integration among subsets of phenotypic variables differ among taxa or other groups. However, comparisons of the strength of integration across datasets are difficult, in part because the summary measures that characterize these patterns (RV and rPLS) are dependent both on sample size and on the number of variables. As a solution to this issue we propose a standardized test statistic (a z-score) for measuring the degree of morphological integration between sets of variables. The approach is based on a partial least ...

Rao-Lovric And The Triwizard Point Null Hypothesis Tournament, 2016 Wayne State University

#### Rao-Lovric And The Triwizard Point Null Hypothesis Tournament, Shlomo Sawilowsky

*Journal of Modern Applied Statistical Methods*

The debate over whether the point null hypothesis is ever literally true cannot be resolved, because there are three competing statistical systems claiming ownership of the construct. The local resolution depends on personal acclimatization to a Fisherian, Frequentist, or Bayesian orientation (or an unexpected fourth champion if decision theory is allowed to compete). Implications of Rao and Lovric’s proposed Hodges-Lehmann paradigm are discussed in the Appendix.

Censoring Unbiased Regression Trees And Ensembles, 2016 Department of Biostatistics, Johns Hopkins Bloomberg School of Public Health

#### Censoring Unbiased Regression Trees And Ensembles, Jon Arni Steingrimsson, Liqun Diao, Robert L. Strawderman

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

This paper proposes a novel approach to building regression trees and ensemble learning in survival analysis. By first extending the theory of censoring unbiased transformations, we construct observed data estimators of full data loss functions in cases where responses can be right censored. This theory is used to construct two specific classes of methods for building regression trees and regression ensembles that respectively make use of Buckley-James and doubly robust estimating equations for a given full data risk function. For the particular case of squared error loss, we further show how to implement these algorithms using existing software (e.g ...

Matching The Efficiency Gains Of The Logistic Regression Estimator While Avoiding Its Interpretability Problems, In Randomized Trials, 2016 Johns Hopkins Bloomberg School of Public Health, Department of Biostatistics

#### Matching The Efficiency Gains Of The Logistic Regression Estimator While Avoiding Its Interpretability Problems, In Randomized Trials, Michael Rosenblum, Jon Arni Steingrimsson

*Johns Hopkins University, Dept. of Biostatistics Working Papers*

Adjusting for prognostic baseline variables can lead to improved power in randomized trials. For binary outcomes, a logistic regression estimator is commonly used for such adjustment. This has resulted in substantial efficiency gains in practice, e.g., gains equivalent to reducing the required sample size by 20-28% were observed in a recent survey of traumatic brain injury trials. Robinson and Jewell (1991) proved that the logistic regression estimator is guaranteed to have equal or better asymptotic efficiency compared to the unadjusted estimator (which ignores baseline variables). Unfortunately, the logistic regression estimator has the following dangerous vulnerabilities: it is only interpretable ...

Advanced Data Analysis - Lecture Notes, 2016 University of New Mexico

#### Advanced Data Analysis - Lecture Notes, Erik B. Erhardt, Edward J. Bedrick, Ronald M. Schrader

*Open Educational Resources*

Lecture notes for Advanced Data Analysis (ADA1 Stat 427/527 and ADA2 Stat 428/528), Department of Mathematics and Statistics, University of New Mexico, Fall 2016-Spring 2017. Additional material, including RMarkdown templates for in-class and homework exercises, datasets, R code, and video lectures, is available on the course websites: https://statacumen.com/teaching/ada1 and https://statacumen.com/teaching/ada2 .

**Contents**

I ADA1: Software

- 0 Introduction to R, Rstudio, and ggplot

II ADA1: Summaries and displays, and one-, two-, and many-way tests of means

- 1 Summarizing and Displaying Data
- 2 Estimation in One-Sample Problems
- 3 Two-Sample Inferences
- 4 Checking Assumptions ...

Nonparametric Methods For Doubly Robust Estimation Of Continuous Treatment Effects, 2016 University of Pennsylvania

#### Nonparametric Methods For Doubly Robust Estimation Of Continuous Treatment Effects, Edward H. Kennedy, Zongming Ma, Matthew D. Mchugh, Dylan S. Small

*Statistics Papers*

Continuous treatments (e.g. doses) arise often in practice, but many available causal effect estimators are limited by either requiring parametric models for the effect curve, or by not allowing doubly robust covariate adjustment. We develop a novel kernel smoothing approach that requires only mild smoothness assumptions on the effect curve and still allows for misspecification of either the treatment density or outcome regression. We derive asymptotic properties and give a procedure for data‐driven bandwidth selection. The methods are illustrated via simulation and in a study of the effect of nurse staffing on hospital readmissions penalties.
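The basic smoothing primitive behind such a dose-response estimator is a kernel regression. The sketch below is a plain Nadaraya-Watson smoother on a hypothetical dose/outcome sample; the paper's doubly robust pseudo-outcome construction is not reproduced here.

```python
import numpy as np

def nw_smooth(x_grid, x, y, h):
    """Gaussian-kernel Nadaraya-Watson estimate of E[y | x] on x_grid."""
    x_grid = np.asarray(x_grid, dtype=float)[:, None]
    w = np.exp(-0.5 * ((x_grid - x) / h) ** 2)   # kernel weights, shape (grid, n)
    return (w * y).sum(axis=1) / w.sum(axis=1)   # locally weighted averages

rng = np.random.default_rng(4)
dose = rng.uniform(0.0, 3.0, size=600)                    # continuous treatment
outcome = np.sin(dose) + rng.normal(scale=0.2, size=600)  # noisy effect curve
grid = np.linspace(0.5, 2.5, 11)
fit = nw_smooth(grid, dose, outcome, h=0.25)
```

The bandwidth `h` plays the role the paper's data-driven selection procedure automates: too small and the curve is noisy, too large and it is oversmoothed.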

Introduction To Quantitative Methods, 2016 RMIT University

#### Introduction To Quantitative Methods, Siddhi Pittayachawan

*Siddhi Pittayachawan*

No abstract provided.

Advances In Portmanteau Diagnostic Tests, 2016 The University of Western Ontario

#### Advances In Portmanteau Diagnostic Tests, Jinkun Xiao

*Electronic Thesis and Dissertation Repository*

The portmanteau test serves an important role in model diagnostics for Box-Jenkins modelling procedures. A large number of portmanteau tests based on the autocorrelation function have been proposed as general-purpose goodness-of-fit tests. Since the asymptotic distributions of the statistics have a complicated form that makes it hard to obtain the p-value directly, a gamma approximation is introduced to obtain the p-value. But the approximation inevitably introduces approximation errors and needs a large number of observations to yield a good approximation. To avoid some pitfalls in the approximation, the Lin-McLeod test is further proposed to obtain a numeric solution to ...
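The classic statistic in this family is the Ljung-Box Q on residual autocorrelations, sketched below; the thesis concerns refinements of its null distribution (the Lin-McLeod approach is not reproduced here).

```python
import numpy as np

def ljung_box(resid, m=10):
    """Ljung-Box Q statistic from the first m residual autocorrelations:
    Q = n(n+2) * sum_{k=1}^{m} acf_k^2 / (n - k)."""
    r = np.asarray(resid, dtype=float)
    r = r - r.mean()
    n = len(r)
    denom = (r ** 2).sum()
    acf = np.array([(r[:-k] * r[k:]).sum() / denom for k in range(1, m + 1)])
    return n * (n + 2) * np.sum(acf ** 2 / (n - np.arange(1, m + 1)))

rng = np.random.default_rng(2)
white = rng.normal(size=500)       # white-noise residuals: Q near chi-square(m)
ar = np.empty(500)
ar[0] = 0.0
for t in range(1, 500):            # AR(1) residuals: Q blows up
    ar[t] = 0.7 * ar[t - 1] + rng.normal()
q_white, q_ar = ljung_box(white), ljung_box(ar)
```

Under an adequate model Q is approximately chi-square with m degrees of freedom; leftover serial correlation, as in the AR(1) series, inflates it by orders of magnitude.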

Trends In The Sand: Directional Evolution In The Shell Shape Of Recessing Scallops (Bivalvia: Pectinidae), 2016 Iowa State University

#### Trends In The Sand: Directional Evolution In The Shell Shape Of Recessing Scallops (Bivalvia: Pectinidae), Emma Sherratt, Alvin Alejandrino, Andrew C. Kraemer, Jeanne M. Serb, Dean C. Adams

*Ecology, Evolution and Organismal Biology Publications*

Directional evolution is one of the most compelling evolutionary patterns observed in macroevolution. Yet, despite its importance, detecting such trends in multivariate data remains a challenge. In this study, we evaluate multivariate evolution of shell shape in 93 bivalved scallop species, combining geometric morphometrics and phylogenetic comparative methods. Phylomorphospace visualization described the history of morphological diversification in the group, revealing that taxa with a recessing life habit were the most distinctive in shell shape, and appeared to display a directional trend. To evaluate this hypothesis empirically, we extended existing methods by characterizing the mean directional evolution in phylomorphospace for recessing ...

On Some Test Statistics For Testing The Population Skewness And Kurtosis: An Empirical Study, 2016 Florida International University

#### On Some Test Statistics For Testing The Population Skewness And Kurtosis: An Empirical Study, Yawen Guo

*FIU Electronic Theses and Dissertations*

The purpose of this thesis is to propose some test statistics for testing the skewness and kurtosis parameters of a distribution, not limited to a normal distribution. Since a theoretical comparison is not possible, a simulation study has been conducted to compare the performance of the test statistics. We have compared both parametric methods (classical method with normality assumption) and non-parametric methods (bootstrap in Bias Corrected Standard Method, Efron’s Percentile Method, Hall’s Percentile Method and Bias Corrected Percentile Method). Our simulation results for testing the skewness parameter indicate that the power of the tests differs significantly across sample ...
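A percentile bootstrap test of the skewness parameter, one of the non-parametric methods compared in the thesis, can be sketched as follows (this is the plain percentile method; the bias-corrected variants are omitted).

```python
import numpy as np

def skewness(x):
    """Moment estimator of the skewness parameter."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    return ((x - m) ** 3).mean() / (((x - m) ** 2).mean() ** 1.5)

def bootstrap_skew_ci(x, n_boot=2000, level=0.95, seed=0):
    """Percentile bootstrap confidence interval for skewness; reject
    H0: skewness = 0 when the interval excludes zero."""
    rng = np.random.default_rng(seed)
    stats = np.array([skewness(rng.choice(x, size=len(x), replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.quantile(stats, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi

rng = np.random.default_rng(3)
symm = rng.normal(size=400)              # true skewness 0
skewed = rng.exponential(size=400)       # true skewness 2
ci_symm = bootstrap_skew_ci(symm)        # should straddle 0
ci_skew = bootstrap_skew_ci(skewed)      # should exclude 0
```

Inverting the interval gives the test: the symmetric sample's interval straddles zero while the exponential sample's interval sits well above it, so only the latter rejects.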