Open Access. Powered by Scholars. Published by Universities.®

Applied Statistics Commons

3,524 Full-Text Articles · 4,908 Authors · 2,834,925 Downloads · 168 Institutions

All Articles in Applied Statistics

3,524 full-text articles. Page 69 of 108.

Modeling Contagion In The Eurozone Crisis Via Dynamical Systems, Giuseppe Castellacci, Youngna Choi 2015 New York University

Department of Applied Mathematics and Statistics Faculty Scholarship and Creative Works

We recently (Castellacci and Choi, 2013) formulated a theoretical framework for the modeling of financial instability contagion using the theory of dynamical systems. Here, our main goal is to model the Eurozone financial crisis within that framework. The underlying system comprises many economic agents that belong to several subsystems. In each instantiation of this framework, the hierarchy and nesting of the subsystems are dictated by the nature of the problem at hand. We describe in great detail how a suitable model can be set up for the Eurozone crisis. The dynamical system is defined by the evolution of the wealths …


The Coupled Within- And Between-Host Dynamics In The Evolution Of Hiv/Aids In China, Jie Lou, Hongna Zhou, Dong Liang, Zhen Jin, Baojun Song 2015 Shanghai University

Department of Applied Mathematics and Statistics Faculty Scholarship and Creative Works

In this work, we develop and analyze mathematical models for the coupled within-host and between-host dynamics caricaturing the evolution of HIV/AIDS. The host population is divided into the susceptible, the infected not receiving treatment, and the infected receiving ART treatment in accordance with China’s Four-Free-One-Care Policy. The within-host model is a typical ODE model adapted from the literature. The between-host model incorporates age-since-infection and is described by a system of integrodifferential equations. The two models are coupled via the viral load and the number of CD4+ T cells within the hosts. For the between-host model with an arbitrarily selected HIV-infected individual, we focus …
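
The within-host component described above is, in broad strokes, a target-cell-limited ODE system; a minimal sketch of one standard such model (the variables T, I, V and all parameter values are illustrative assumptions, not taken from the paper):

```python
# Minimal within-host HIV ODE sketch: target cells T, infected cells I, virus V.
# Parameter values are illustrative only.
import numpy as np
from scipy.integrate import odeint

def within_host(y, t, lam=1e4, d=0.01, beta=2e-7, delta=0.5, p=100.0, c=5.0):
    T, I, V = y
    dT = lam - d * T - beta * T * V   # production, death, infection of target cells
    dI = beta * T * V - delta * I     # infected-cell gain and clearance
    dV = p * I - c * V                # virion production and clearance
    return [dT, dI, dV]

t = np.linspace(0, 100, 1000)
sol = odeint(within_host, y0=[1e6, 0.0, 1e-3], t=t)  # columns: T, I, V
print(sol[-1])  # state at the end of the simulation window
```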


Robust Optimization Of Biological Protocols, Patrick Flaherty, Ronald W. Davis 2015 University of Massachusetts - Amherst

Mathematics and Statistics Department Faculty Publication Series

When conducting high-throughput biological experiments, it is often necessary to develop a protocol that is both inexpensive and robust. Standard approaches are either not cost-effective or arrive at an optimized protocol that is sensitive to experimental variations. Here, we describe a novel approach that directly minimizes the cost of the protocol while ensuring the protocol is robust to experimental variation. Our approach uses a risk-averse conditional value-at-risk criterion in a robust parameter design framework. We demonstrate this approach on a polymerase chain reaction protocol and show that our improved protocol is less expensive than the standard protocol and more robust …
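
Conditional value-at-risk (CVaR) at level α is the expected cost over the worst α fraction of outcomes. A minimal Monte Carlo sketch of a CVaR-based robust objective, with a hypothetical toy cost function standing in for the paper's PCR protocol model:

```python
import numpy as np

def cvar(costs, alpha=0.05):
    """CVaR_alpha: mean of the worst alpha-fraction of sampled costs."""
    cutoff = np.quantile(costs, 1 - alpha)  # value-at-risk threshold
    return costs[costs >= cutoff].mean()

def robust_objective(theta, cost_fn, n_samples=10_000, noise_sd=0.1, alpha=0.05):
    """Evaluate cost under perturbed protocol settings and return its CVaR."""
    rng = np.random.default_rng(0)
    perturbed = theta + rng.normal(0.0, noise_sd, size=(n_samples, len(theta)))
    costs = np.array([cost_fn(p) for p in perturbed])
    return cvar(costs, alpha)

# Toy quadratic cost in place of a real protocol cost model.
obj = robust_objective(np.array([1.0, 2.0]), lambda p: np.sum((p - 1.5) ** 2))
print(obj)
```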


Tracking Faith: A Statistical Analysis Of The Spiritual Profiles Of Chicago And Dallas-Ft. Worth Over The Last 15 Years, Kaitlyn Fitzgerald 2015 Olivet Nazarene University

Honors Program Projects

This project explored whether the spiritual profiles of the Chicago and Dallas-Ft. Worth (DFW) markets have changed significantly over the last 15 years and whether those profiles differ significantly from each other. The Barna Group is a market research firm that has tracked the role of faith in America for over 30 years. This project extended work done during an internship with the Barna Group in Summer 2014, and their extensive database was made available for this research. Regression analysis and hypothesis tests comparing population proportions were performed on the responses …
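
A comparison of population proportions of the kind described can be run as a two-proportion z-test; a sketch with invented counts (the Barna responses are not reproduced here):

```python
# Two-proportion z-test sketch with illustrative counts, not Barna data.
from statsmodels.stats.proportion import proportions_ztest

successes = [412, 385]   # e.g., respondents affirming a belief in each market
samples = [1000, 1000]   # respondents surveyed in Chicago and DFW
z, p_value = proportions_ztest(count=successes, nobs=samples)
print(f"z = {z:.3f}, p = {p_value:.4f}")  # reject H0: p1 == p2 if p is small
```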


Equate: Observed-Score Linking And Equating In R, Anthony D. Albano 2015 University of Nebraska-Lincoln

Department of Educational Psychology: Faculty Publications

Linking and equating are statistical procedures used to convert scores from one measurement scale to another. These procedures are most often used in testing programs that involve multiple test forms, where adjustments are made for form difficulty differences when creating a measurement scale that is common across forms. Linking and equating methods are traditionally distinguished by the type of scores they are applied to, whether observed scores or scores from an item response theory model. Methods are also distinguished by the study design under which measurements are taken. The R package equate (Albano, 2014) is free, open-source software for conducting …
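
The equate package itself is R software; as a language-neutral illustration of one of the simplest observed-score methods, linear equating places a form-X score on the form-Y scale by matching means and standard deviations:

```python
import numpy as np

def linear_equate(x, form_x_scores, form_y_scores):
    """Map a form-X score to the form-Y scale: y = mu_Y + (sd_Y/sd_X)*(x - mu_X)."""
    mu_x, sd_x = np.mean(form_x_scores), np.std(form_x_scores, ddof=1)
    mu_y, sd_y = np.mean(form_y_scores), np.std(form_y_scores, ddof=1)
    return mu_y + (sd_y / sd_x) * (x - mu_x)

# Illustrative score vectors for two forms of the same test.
fx = np.array([10, 12, 15, 18, 20, 22, 25])
fy = np.array([12, 14, 16, 20, 23, 26, 28])
print(linear_equate(18, fx, fy))  # the form-Y equivalent of a form-X score of 18
```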


New Results In ℓ1 Penalized Regression, Edward A. Roualdes 2015 University of Kentucky

Theses and Dissertations--Statistics

Here we consider penalized regression methods and extend the results surrounding the ℓ1 norm penalty. We address a more recent development that generalizes previous methods by penalizing a linear transformation of the coefficients of interest instead of penalizing just the coefficients themselves. We introduce an approximate algorithm to fit this generalization and a fully Bayesian hierarchical model that is a direct analogue of the frequentist version. A number of benefits are derived from the Bayesian perspective; most notably, choice of the tuning parameter and natural means to estimate the variation of estimates – a notoriously difficult task for the …
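
For reference, the plain ℓ1 (lasso) problem that this work generalizes can be solved by coordinate descent with soft-thresholding; a minimal sketch assuming standardized predictors (this is the classical algorithm, not the thesis's generalized or Bayesian version):

```python
import numpy as np

def soft_threshold(z, gamma):
    """Solution of the one-dimensional lasso subproblem."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1, X standardized."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual excluding j
            beta[j] = soft_threshold(X[:, j] @ r / n, lam)
    return beta
```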


Taming The Hurricane Of Acquisition Cost Growth – Or At Least Predicting It, Allen J. DeNeve, Erin T. Ryan, Jonathan D. Ritschel, Christine M. Schubert Kabban 2015 Air Force Institute of Technology

Faculty Publications

Cost growth is a persistent adversary to efficient budgeting in the Department of Defense. Despite myriad studies to uncover causes of this cost growth, few of the proposed remedies have made a meaningful impact. A key reason may be that DoD cost estimates are formulated using the highly unrealistic assumption that a program’s current baseline characteristics will not change in the future. Using a weather forecasting analogy, the authors demonstrate how a statistical approach may be used to account for these inevitable baseline changes and identify related cost growth trends. These trends are then used to reduce the error in …


Evaluating The Long-Term Effects Of Logging Residue Removals In Great Lakes Aspen Forests, Michael I. Premer 2015 Michigan Technological University

Dissertations, Master's Theses and Master's Reports

Commercial aspen (Populus spp.) forests of the Great Lakes region are primarily managed for timber products such as pulp fiber and panel board, but logging residues (topwood and non-merchantable bolewood) are potentially important for utilization in the bioenergy market. In some regions, pulp and paper mills already utilize residues as fuel in combustion for heat and electricity, and progressive energy policies will likely cause an increase in biomass feedstock demand. The effects of removing residues, which have a comparatively high concentration of macronutrients, are poorly understood when evaluating long-term site productivity, future timber yields, plant diversity, stand dynamics, and …


Developments In Nonparametric Regression Methods With Application To Raman Spectroscopy Analysis, Jing Guo 2015 University of Kentucky

Theses and Dissertations--Epidemiology and Biostatistics

Raman spectroscopy has been successfully employed in the classification of breast pathologies involving basis spectra for chemical constituents of breast tissue, and has resulted in high sensitivity (94%) and specificity (96%) (Haka et al., 2005). Motivated by recent developments in nonparametric regression, in this work, we adapt stacking, boosting, and dynamic ensemble learning into a nonparametric regression framework with application to Raman spectroscopy analysis for breast cancer diagnosis. In Chapter 2, we apply compound estimation (Charnigo and Srinivasan, 2011) in Raman spectra analysis to classify normal, benign, and malignant breast tissue. We explore both the spectra profiles and their derivatives to …
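
Stacking combines heterogeneous base learners through a second-stage model; a generic scikit-learn sketch of the idea (synthetic features stand in for Raman spectra, and this is not the dissertation's compound-estimation code):

```python
import numpy as np
from sklearn.ensemble import StackingRegressor, GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))           # stand-in for spectral features
y = X[:, 0] ** 2 + X[:, 1] + rng.normal(scale=0.1, size=200)

stack = StackingRegressor(
    estimators=[("gbm", GradientBoostingRegressor()),
                ("knn", KNeighborsRegressor())],
    final_estimator=Ridge(),             # second-stage combiner
)
stack.fit(X, y)
print(stack.predict(X[:3]))
```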


Compositions, Logratios And Geostatistics: An Application To Iron Ore, Clint Ward 2015 Edith Cowan University

Theses: Doctorates and Masters

Common implementations of geostatistical methods, kriging and simulation, ignore the fact that geochemical data are usually reported in weight percent, sum to a constant, and are thus compositional in nature. The constant sum implies that rescaling has occurred and this can be shown to produce spurious correlations. Compositional geostatistics is an approach developed to ensure that the constant sum constraint is respected in estimation while removing dependencies on the spurious correlations. This study tests the applicability of this method against the commonly implemented ordinary cokriging method. The sample data are production blast cuttings analyses drawn from a producing iron ore …
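
The logratio approach referenced here replaces constrained weight-percent parts with unconstrained coordinates before kriging; a sketch of Aitchison's centered logratio (clr) transform and its inverse (the composition values are invented for illustration):

```python
import numpy as np

def clr(parts):
    """Centered logratio: log of each part over the geometric mean of the row."""
    parts = np.asarray(parts, dtype=float)
    gmean = np.exp(np.mean(np.log(parts), axis=-1, keepdims=True))
    return np.log(parts / gmean)

def clr_inverse(coords, total=100.0):
    """Back-transform clr coordinates to a composition summing to `total`."""
    expc = np.exp(coords)
    return total * expc / expc.sum(axis=-1, keepdims=True)

sample = np.array([62.1, 4.3, 2.9, 30.7])  # illustrative Fe/SiO2/Al2O3/other wt%
z = clr(sample)                            # krige these coordinates instead
print(clr_inverse(z))                      # recovers the original composition
```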


Bayesian Inference Of The Weibull-Pareto Distribution, James Dow 2015 Georgia Southern University

Electronic Theses and Dissertations

The Weibull distribution has applications in many areas, including survival analysis, reliability engineering, general insurance, electrical engineering, and industrial engineering. The Weibull-Pareto distribution further extends the Weibull distribution. A desirable property of this distribution is that its shape can be skewed, allowing it to better model left- or right-skewed data; examples of skewed data include human longevity and actuarial data. In this work, a hierarchical Bayesian model was developed using the Weibull-Pareto distribution.


The Sensitivity Of A Test Based On Spearman's Rho In Cross-Correlation Change Point Problems, Congjian Liu 2015 Georgia Southern University

Electronic Theses and Dissertations

In change point problems, there are three main questions that researchers are interested in. First of all, is there a change point or not? Second, when does the change point occur in a time series? Third, how quickly can we detect the change point? In this thesis, we first explain what a change point is, and what a cross-correlation is. We then discuss prior research in this area. Then we discuss and examine a test based on Spearman's rho, introduced by Wied and Dehling (2011), which tests the null hypothesis of no change point, and compare the change point we …
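
The Wied and Dehling (2011) statistic has its own standardization, which is not reproduced here; as a simpler diagnostic in the same spirit, Spearman's rho computed over sliding windows makes a change in cross-correlation visible (the window length and data below are illustrative):

```python
import numpy as np
from scipy.stats import spearmanr

def rolling_spearman(x, y, window=50):
    """Spearman's rho over sliding windows; a shift suggests a correlation change."""
    rhos = []
    for start in range(len(x) - window + 1):
        rho, _ = spearmanr(x[start:start + window], y[start:start + window])
        rhos.append(rho)
    return np.array(rhos)

rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = np.concatenate([x[:150], -x[150:]]) + rng.normal(scale=0.5, size=300)
print(rolling_spearman(x, y).round(2)[::50])  # sign flip reveals the change point
```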


On The Interpretation Of Multi-Year Estimates Of The American Community Survey As Period Estimates, Chaitra Nagaraja, Tucker McElroy 2014 Fordham University

Chaitra H Nagaraja

The rolling sample methodology of the American Community Survey introduces temporal distortions, resulting in Multi-Year Estimates that measure aggregate activity over three or five years. This paper introduces a novel, nonparametric method for quantifying the impact of viewing multi-year estimates as functions of single-year estimates belonging to the same time span. The method is based on examining the changes to confidence interval coverage. As an application of primary interest, the interpretation of a multi-year estimate as the simple average of single-year estimates is a viewpoint that underpins the published estimates of sampling variability. Therefore it is vital to ascertain the …


Financial Statement Fraud Detection Using Supervised Learning Methods (Ph.D. Dissertation), Adrian Gepp 2014 Bond University

Adrian Gepp

No abstract provided.


Promoting Similarity Of Model Sparsity Structures In Integrative Analysis Of Cancer Genetic Data, Shuangge Ma 2014 Yale University

Shuangge Ma

In profiling studies, the analysis of a single dataset often leads to unsatisfactory results because of the small sample size. Multi-dataset analysis utilizes information across multiple independent datasets and outperforms single-dataset analysis. Among the available multi-dataset analysis methods, integrative analysis methods aggregate and analyze raw data and outperform meta-analysis methods, which analyze multiple datasets separately and then pool summary statistics. In this study, we conduct integrative analysis and marker selection under the heterogeneity structure, which allows different datasets to have overlapping but not necessarily identical sets of markers. Under certain scenarios, it is reasonable to expect some similarity of identified …


Predicting Financial Distress: A Comparison Of Survival Analysis And Decision Tree Techniques, Adrian Gepp, Kuldeep Kumar 2014 Bond University

Adrian Gepp

Financial distress, and the consequent failure of a business, is usually an extremely costly and disruptive event. Statistical financial distress prediction models attempt to predict whether a business will experience financial distress in the future. Discriminant analysis and logistic regression have been the most popular approaches, but there are also a large number of alternative cutting-edge data mining techniques that can be used. In this paper, a semi-parametric Cox survival analysis model and non-parametric CART decision trees have been applied to financial distress prediction and compared with each other as well as with the most popular approaches. This …
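
A Cox model for time-to-distress can be fit in a few lines; a sketch using the lifelines library on simulated firm data (the covariate and hazard specification are invented, not the paper's):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
debt_ratio = rng.uniform(0.1, 1.0, n)
# Hazard increases with leverage; exponential event times, random censoring.
event_time = rng.exponential(1.0 / (0.2 * np.exp(1.5 * debt_ratio)))
censor_time = rng.uniform(0, 10, n)
df = pd.DataFrame({
    "duration": np.minimum(event_time, censor_time),
    "distressed": (event_time <= censor_time).astype(int),
    "debt_ratio": debt_ratio,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="distressed")
cph.print_summary()  # hazard ratio for debt_ratio should exceed 1
```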


Case Studies In Evaluating Time Series Prediction Models Using The Relative Mean Absolute Error, Nicholas G. Reich, Justin Lessler, Krzysztof Sakrejda, Stephen A. Lauer, Sopon Iamsirithaworn, Derek A T Cummings 2014 University of Massachusetts - Amherst

Nicholas G Reich

Statistical prediction models inform decision-making processes in many real-world settings. Prior to deployment, one must rigorously test and validate candidate models to ensure that the proposed predictions are sufficiently accurate to be used in practice. In this paper, we present a framework for evaluating time series predictions that emphasizes computational simplicity and an intuitive interpretation using the relative mean absolute error metric. For a single time series, this metric enables comparisons of candidate model predictions against naive reference models, a method that can provide useful and standardized performance benchmarks. Additionally, in applications with multiple time series, this …
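
The metric itself is a one-line computation: the candidate's mean absolute error divided by that of a naive reference. A minimal sketch, with a persistence (previous-value) forecast assumed as the reference:

```python
import numpy as np

def relative_mae(y_true, y_pred, y_naive):
    """relMAE = MAE(candidate) / MAE(naive); values < 1 beat the reference."""
    mae_model = np.mean(np.abs(y_true - y_pred))
    mae_naive = np.mean(np.abs(y_true - y_naive))
    return mae_model / mae_naive

y = np.array([12.0, 15.0, 14.0, 18.0, 21.0])
pred = np.array([11.0, 14.5, 15.0, 17.0, 20.0])   # candidate forecasts
naive = np.array([10.0, 12.0, 15.0, 14.0, 18.0])  # persistence: previous value
print(relative_mae(y, pred, naive))
```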


Simulating Univariate And Multivariate Nonnormal Distributions Through The Method Of Percentiles, Jennifer Koran, Todd C. Headrick, Tzu Chun Kuo 2014 Southern Illinois University Carbondale

Todd Christopher Headrick

This article derives a standard normal-based power method polynomial transformation for Monte Carlo simulation studies, approximating distributions, and fitting distributions to data based on the method of percentiles. The proposed method is used primarily when (1) conventional (or L) moment-based estimators such as skew (or L-skew) and kurtosis (or L-kurtosis) are unknown or (2) data are unavailable but percentiles are known (e.g., standardized test score reports). The proposed transformation also has the advantage that solutions to polynomial coefficients are available in simple closed form and thus obviate numerical equation solving. A procedure is also described for simulating power method …
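
The transformation in question maps a standard normal Z through a polynomial, classically Y = c0 + c1·Z + c2·Z² + c3·Z³; a sketch that simulates from such a polynomial (the coefficients below are illustrative and are not solved from percentiles as the article proposes):

```python
import numpy as np
from scipy.stats import skew

def power_method_sample(coefs, n, rng=None):
    """Draw Y = c0 + c1*Z + c2*Z^2 + c3*Z^3 with Z standard normal."""
    rng = rng or np.random.default_rng(0)
    z = rng.standard_normal(n)
    return np.polyval(coefs[::-1], z)  # polyval wants highest degree first

# Illustrative coefficients for a right-skewed variate (c0 = -c2 centers it);
# they are not solved from percentiles as in the article.
y = power_method_sample(np.array([-0.16, 0.92, 0.16, 0.02]), n=100_000)
print(round(float(np.mean(y)), 3), round(float(skew(y)), 3))
```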


Optimal Full Matching For Survival Outcomes: A Method That Merits More Widespread Use, Peter Austin, Elizabeth Stuart 2014 Institute for Clinical Evaluative Sciences

Peter Austin

Matching on the propensity score is a commonly used analytic method for estimating the effects of treatments on outcomes. Commonly used propensity score matching methods include nearest neighbor matching and nearest neighbor caliper matching. Rosenbaum (1991) proposed an optimal full matching approach, in which matched strata are formed consisting of either one treated subject and at least one control subject or one control subject and at least one treated subject. Full matching has been used rarely in the applied literature. Furthermore, its performance for use with survival outcomes has not been rigorously evaluated. We propose a method to use full …


Reconciling Experimental Incoherence With Real-World Coherence In Punitive Damages, Theodore Eisenberg, Jeffrey J. Rachlinski, Martin T. Wells 2014 Cornell Law School

Jeffrey J. Rachlinski

Experimental evidence generated in controlled laboratory studies suggests that the legal system in general, and punitive damages awards in particular, should display an incoherent pattern. According to the prediction, inexperienced decisionmakers, such as juries, should fail to convert their qualitative judgments of defendants' conduct into consistent, meaningful dollar amounts. This Article tests this prediction and finds modest support for the thesis that experience across different types of cases will lead to greater consistency in awards. Despite this support, numerous studies of damage awards in real cases detect a generally sensible pattern of damage awards. This Article tries to reconcile the …

