Open Access. Powered by Scholars. Published by Universities.®

Statistics and Probability Commons


11,760 Full-Text Articles 17,254 Authors 3,366,482 Downloads 238 Institutions

All Articles in Statistics and Probability



Investigating The Ironwood Tree (Casuarina Equisetifolia) Decline On Guam Using Applied Multinomial Modeling, Karl Anthony Schlub 2010 Louisiana State University and Agricultural and Mechanical College


LSU Master's Theses

The ironwood tree (Casuarina equisetifolia), a protector of coastlines of the sub-tropical and tropical Western Pacific, is in decline on the island of Guam where aggressive data collection and efforts to mitigate the problem are underway. For each sampled tree the level of decline was measured on an ordinal scale consisting of five categories ranging from healthy to near dead. Several predictors were also measured including tree diameter, fire damage, typhoon damage, presence or absence of termites, presence or absence of basidiocarps, and various geographical or cultural factors. The five decline response levels can be viewed as categories of a ...
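An ordinal five-category response like this is often handled with a cumulative-logit (proportional-odds) model, one member of the multinomial family. The sketch below shows how such a model turns a linear predictor into category probabilities; the cutpoints and coefficients are invented for illustration, not values from the thesis.

```python
import math

def cumulative_logit_probs(eta, cutpoints):
    """Category probabilities under a proportional-odds model:
    P(Y <= j) = logistic(c_j - eta), with increasing cutpoints c_j.
    Returns len(cutpoints) + 1 probabilities for the ordered categories
    (here: healthy, ..., near dead)."""
    logistic = lambda v: 1.0 / (1.0 + math.exp(-v))
    cdf = [logistic(c - eta) for c in cutpoints] + [1.0]
    return [cdf[0]] + [cdf[j] - cdf[j - 1] for j in range(1, len(cdf))]

# Hypothetical linear predictor from tree covariates (coefficients invented):
eta = 0.8 * 1 + 0.5 * 0          # e.g. termites present, no fire damage
p = cumulative_logit_probs(eta, cutpoints=[-1.0, 0.0, 1.0, 2.0])
```

A larger `eta` shifts probability mass toward the more severe decline categories, which is how covariate effects are read off such a model.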


Encryption Using Deterministic Chaos, Jonathan Blackledge, Nikolai Ptitsyn 2010 Dublin Institute of Technology


Articles

The concepts of randomness, unpredictability, complexity and entropy form the basis of modern cryptography, and a cryptosystem can be interpreted as the design of a key-dependent bijective transformation that is unpredictable to an observer for a given computational resource. For any cryptosystem, including a Pseudo-Random Number Generator (PRNG), an encryption algorithm or a key exchange scheme, a cryptanalyst has access to the time series of a dynamic system and knows the PRNG function (the algorithm that is assumed to be based on some iterative process), which is taken to be in the public domain by virtue of the Kerckhoff-Shannon ...
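As a toy illustration of the idea (not the authors' construction), a chaotic map such as the logistic map can serve as a key-dependent keystream generator, with the seed acting as the secret key. Real chaos-based ciphers need far more care (floating-point reproducibility, cycle detection, output whitening); this sketch only shows the shape of the scheme.

```python
def logistic_keystream(x0, n, r=3.99, burn_in=100):
    """Bytes from iterating the logistic map x -> r*x*(1-x) in its
    chaotic regime (r near 4); the seed x0 in (0, 1) is the key."""
    x = x0
    for _ in range(burn_in):              # discard the transient
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)    # quantize the state to a byte
    return bytes(out)

def xor_cipher(data, key_seed):
    """Encrypt/decrypt by XOR with the chaotic keystream (symmetric)."""
    ks = logistic_keystream(key_seed, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

ct = xor_cipher(b"deterministic chaos", 0.4123)
pt = xor_cipher(ct, 0.4123)               # the same call decrypts
```

Because XOR is its own inverse, encryption and decryption are the same operation given the same seed; sensitivity to `x0` is what makes the keystream key-dependent.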


Economic Risk Assessment Using The Fractal Market Hypothesis, Jonathan Blackledge, Marek Rebow 2010 Technological University Dublin


Conference papers

This paper considers the Fractal Market Hypothesis (FMH) for assessing the risk(s) in developing a financial portfolio based on data that is available through the Internet from an increasing number of sources. Most financial risk management systems are still based on the Efficient Market Hypothesis, which often fails due to the inaccuracies of the statistical models that underpin the hypothesis, in particular the assumption that financial data are based on stationary Gaussian processes. The FMH considered in this paper assumes that financial data are non-stationary and statistically self-affine so that a risk analysis can, in principle, be applied at any time ...


Dynamic Model Pooling Methodology For Improving Aberration Detection Algorithms, Brenton J. Sellati 2010 University of Massachusetts Amherst


Masters Theses 1911 - February 2014

Syndromic surveillance is defined generally as the collection and statistical analysis of data which are believed to be leading indicators for the presence of deleterious activities developing within a system. Conceptually, syndromic surveillance can be applied to any discipline in which it is important to know when external influences manifest themselves in a system by forcing it to depart from its baseline. Comparisons of syndromic surveillance systems have led to mixed results, where models that dominate in one performance metric are often sorely deficient in another. This results in a zero-sum trade-off where one performance metric must be afforded greater ...
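A common baseline-departure detector in this setting is a control chart such as a one-sided CUSUM. The sketch below is a generic version (not the thesis's pooled models); the allowance `k` and threshold `h` are arbitrary illustrative values that embody exactly the sensitivity-versus-false-alarm trade-off the abstract describes.

```python
import statistics

def cusum_detector(counts, baseline_mean, k=0.5, h=4.0):
    """One-sided CUSUM on standardized daily counts: flag day i when the
    running sum of (z_i - k) exceeds h, i.e. when counts drift
    persistently above baseline. k (allowance) and h (threshold) trade
    sensitivity against false alarms."""
    sd = max(statistics.pstdev(counts), 1e-9)   # crude scale estimate
    s, alarms = 0.0, []
    for c in counts:
        z = (c - baseline_mean) / sd
        s = max(0.0, s + z - k)                 # resets toward 0 at baseline
        alarms.append(s > h)
    return alarms

daily = [10, 9, 11, 10, 10, 12, 9, 10, 30, 32, 31, 33]
flags = cusum_detector(daily, baseline_mean=10)
```

Only the sustained elevation at the end of the series accumulates enough signal to alarm; ordinary day-to-day noise decays back toward zero.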


Route Choice Behavior In Risky Networks With Real-Time Information, Michael D. Razo 2010 University of Massachusetts Amherst


Masters Theses 1911 - February 2014

This research investigates route choice behavior in networks with risky travel times and real-time information. A stated preference survey is conducted in which subjects use a PC-based interactive map to choose routes link-by-link in various scenarios. The scenarios include two types of maps: the first presenting a choice between one stochastic route and one deterministic route, and the second with real-time information and an available detour. The first type measures the basic risk attitude of the subject. The second type allows for strategic planning, and measures the effect of this opportunity on subjects' choice behavior.

Results from each subject are ...


Census 2010 And Human Services And Community Development, Mark Salling, Jenita McGowan 2010 Cleveland State University


Urban Publications

Census 2010 and Human Services and Community Development, Planning & Action, The Center for Community Solutions, Vol. 63, No. 2 (March), 2010, pp 1-4.


Numerical Analysis Of Non-Constant Pure Rate Of Time Preference: A Model Of Climate Policy, Tomoki Fujii, Larry Karp 2010 Singapore Management University


Research Collection School Of Economics

When current decisions affect welfare in the far-distant future, as with climate change, the use of a declining pure rate of time preference (PRTP) provides potentially important modeling flexibility. The difficulty of analyzing models with non-constant PRTP limits their application. We describe and provide software (available online) to implement an algorithm to numerically obtain a Markov perfect equilibrium for an optimal control problem with non-constant PRTP. We apply this software to a simplified version of the numerical climate change model used in the Stern Review. For our calibration, the policy recommendations are less sensitive to the PRTP than widely believed.
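A declining PRTP is often specified as a rate that falls from an initial to a long-run value. The closed form below uses one common illustrative parameterization (exponential decay of the rate); it is not necessarily the specification used in the paper, and the parameter values are invented.

```python
import math

def discount_factor(t, delta0=0.03, delta_inf=0.01, g=0.1):
    """Discount factor D(t) = exp(-integral_0^t r(s) ds) for a declining
    pure rate of time preference
        r(t) = delta_inf + (delta0 - delta_inf) * exp(-g * t),
    whose integral has a closed form."""
    integral = delta_inf * t + (delta0 - delta_inf) * (1.0 - math.exp(-g * t)) / g
    return math.exp(-integral)
```

Because the rate declines, far-future payoffs are discounted less heavily than under a constant rate `delta0`, which is why the choice matters so much for climate policy; the resulting time inconsistency is also what makes a Markov perfect equilibrium the natural solution concept.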


A Small Area Procedure For Estimating Population Counts, Emily J. Berg 2010 Iowa State University


Graduate Theses and Dissertations

Many large scale surveys are designed to achieve acceptable reliability for large domains. Direct estimators for more detailed levels of aggregation are often judged to be unreliable due to small sample sizes. Estimation for small domains, often defined by geographic and demographic characteristics, is known as small area estimation. A common approach to small area estimation is to derive predictors under a specified mixed model for the direct estimators. A procedure of this type is developed for small areas defined by the cells of a two-way table.

Construction of small domain estimators using the Canadian Labour Force Survey (LFS) motivates ...
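Predictors derived under such mixed models typically behave like composite (shrinkage) estimators: a variance-weighted compromise between a noisy direct estimate and a model-based synthetic one. A stripped-down sketch of that idea, in the spirit of Fay-Herriot area-level models, with invented numbers:

```python
def composite_estimate(direct, synthetic, var_direct, var_model):
    """Weight the direct and synthetic estimates inversely to their
    variances, shrinking cells with unreliable direct estimates
    toward the model-based value."""
    w = var_model / (var_model + var_direct)
    return w * direct + (1.0 - w) * synthetic

# A cell with a noisy direct count is pulled toward the synthetic value:
est = composite_estimate(direct=100.0, synthetic=80.0,
                         var_direct=4.0, var_model=4.0)
```

When the direct estimator's variance is zero the composite returns it unchanged; as that variance grows, the estimate moves toward the synthetic value, which is the essential small-area trade-off.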


Statistical Methods For Analyzing Physical Activity Data, Nicholas Koenen Beyler 2010 Iowa State University


Graduate Theses and Dissertations

Physical activity is any bodily movement that results in caloric expenditure. One important aspect of physical activity research is the assessment of usual (i.e., long-term average) physical activity in the population, in order to better understand the links between physical activity and health outcomes. Daily or weekly measurements of physical activity taken from a sample of individuals are prone to measurement errors and nuisance effects, which can lead to biased estimates of usual physical activity parameters. Fortunately, statistical models can be used to account and adjust for these errors in order to give more accurate estimates of usual physical ...


Detecting Recombination And Its Mechanistic Association With Genomic Features Via Statistical Models, Misha Rajaram 2010 Iowa State University


Graduate Theses and Dissertations

Recombination is a powerful weapon in the evolutionary arsenal of retroviruses such as HIV. It enables the production of chimeric variants or recombinants that may confer a selective advantage to the pathogen over the host immune response. Recombinants further accentuate differences in virulence, disease progression and drug resistance mutation patterns already observed in non-recombinant variants of HIV. This thesis describes the development of a rapid genotyper for HIV sequences employing supervised learning algorithms and its application to complex HIV recombinant data, the application of a hierarchical model for detection of recombination hotspots in the HIV-1 genome and the extension of ...


Improving Statistical Inference For Gene Expression Profiling Data By Borrowing Information, Long Qu 2010 Iowa State University


Graduate Theses and Dissertations

Gene expression profiling experiments, in particular, microarray experiments, are popular in genomics research. However, in addition to the great opportunities provided by such experiments, statistical challenges also arise in the analysis of expression profiling data. The current thesis discusses statistical issues associated with gene expression profiling experiments and develops new statistical methods to tackle some of these problems.

In Chapter 2, we consider the insufficient sample size problem in detecting differential gene expression. We address the problem by developing and evaluating methods for variance model selection. The idea is that information about error variances might be learned from related datasets ...


New Methods For Statistical Modeling And Analysis Of Nondestructive Evaluation Data, Ming Li 2010 Iowa State University


Graduate Theses and Dissertations

Statistical methods have a long history of applications in physical sciences and engineering for design of experiments and data analysis. In nondestructive evaluation (NDE) studies, standard statistical methods are described in Military Handbook 1823A as guidelines for analyzing experimental NDE data in both carefully controlled laboratory setups and field studies. However, complicated data structures often demand non-traditional statistical approaches. In this dissertation, with the inspiration and needs from actual NDE data applications, we introduce several statistical methods for better description of the problem and more appropriate modeling of the data. We also discuss the potential applications of those statistical ...


Probabilistic Studies Of Different Investment Strategies, Ling Huang 2010 Iowa State University


Graduate Theses and Dissertations

Lump Sum (LS), Dollar Cost Averaging (DCA), and Value Averaging (VA) are among the most popular investment strategies. However, conflicting conclusions on their relative performances have been given in the literature due to the use of different time periods of data and simulations. We propose an alternative investment strategy called Threshold Control (TC) based on statistical process monitoring. The idea is that the investor only makes portfolio moves when its market value is far above or below the target value set by the investor. TC can be viewed as a generalization of both LS and VA, and provides more flexibility ...
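The TC rule can be sketched directly from that description: hold a target value path and trade only when the portfolio strays outside a band around it. Everything below (function name, band parameterization, numbers) is illustrative and not the authors' exact formulation.

```python
def threshold_control(prices, target_growth, band, initial=1000.0):
    """Rebalance toward a target value path only when portfolio value
    leaves +/- band (a fraction of the target). A very wide band
    recovers buy-and-hold (Lump Sum); band = 0 rebalances every period,
    which is Value Averaging."""
    shares = initial / prices[0]
    cashflows = []                        # + invest, - withdraw, per period
    for t in range(1, len(prices)):
        target = initial * (1.0 + target_growth) ** t
        value = shares * prices[t]
        if abs(value - target) > band * target:
            shares += (target - value) / prices[t]   # trade back to target
            cashflows.append(target - value)
        else:
            cashflows.append(0.0)
    return shares, cashflows

prices = [100.0, 110.0, 90.0, 120.0]
sh_hold, cf_hold = threshold_control(prices, 0.0, band=10.0)   # wide band
sh_va, cf_va = threshold_control(prices, 0.0, band=0.0)        # always trade
```

With a wide band no trades occur (pure buy-and-hold), while a zero band forces the portfolio back to the target every period, showing how TC interpolates between the two classical strategies.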


Statistical Method And Simulation On Detecting Cracks In Vibrothermography Inspection, Chunwang Gao 2010 Iowa State University


Graduate Theses and Dissertations

Vibrothermography is a nondestructive evaluation method that can be used to detect cracks in specimens, and it is the main engineering technique we focus on in this thesis. This study can be separated into three parts. In the first part, we develop a systematic statistical method to provide a detection algorithm to automatically analyze the data generated in Sonic IR inspections. Principal components analysis (PCA) is used for dimension reduction. Robust regression and cluster analysis are used to find the maximum studentized residual (MSD) for crack detection. The procedure proved to be both more efficient and more accurate than human inspection ...
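A simplified analogue of that pipeline, PCA for dimension reduction followed by a robustly studentized residual score for flagging anomalies, can be sketched as follows. The function and scaling choices are illustrative and are not the thesis's actual procedure (which uses robust regression and cluster analysis for the robust fit).

```python
import numpy as np

def pca_residual_scores(X, n_components=2):
    """Project rows of X onto the top principal components and robustly
    studentize the norm of what each row leaves unexplained (median/MAD
    in place of mean/sd). Large scores flag candidate anomalies, e.g.
    crack-like signals in inspection data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)   # principal directions
    V = Vt[:n_components].T
    resid = Xc - Xc @ V @ V.T            # part not captured by the PCs
    r = np.linalg.norm(resid, axis=1)
    mad = np.median(np.abs(r - np.median(r)))
    return (r - np.median(r)) / (1.4826 * mad + 1e-12)
```

Rows lying near the fitted low-dimensional structure score near zero, while a row that departs from it stands out with a large score, which is the essence of residual-based crack detection.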


A Novel Principal Component Analysis Method For Identifying Differentially Expressed Gene Signatures, Ai-Ling Teh 2010 Iowa State University


Graduate Theses and Dissertations

Microarray data sets contain a wealth of information on the gene expression levels for thousands of genes across a small number of different conditions called assays. However, the information is hidden by high noise levels and low signal levels. Data mining techniques are used to extract the information on genes related to the assays. This work proposes a powerful principal component analysis (PCA) based method that extends the PCA approach of Rollins et al. (2006). The proposed method is able to generate gene signatures that are expressed most differently between two assay groups in a microarray data set.

This work develops ...


Model Selection For Good Estimation Or Prediction Over A User-Specified Covariate Distribution, Adam Lee Pintar 2010 Iowa State University


Graduate Theses and Dissertations

In many applications it is common to observe a response with corresponding potential explanatory variables or covariates. Regression models using either the frequentist or Bayesian paradigm for inference are often employed to model such data. To perform model selection in the frequentist paradigm, step-wise or all-subsets selection based on the Cp criterion, the Akaike information criterion (AIC), or the Bayesian information criterion (BIC) are often used. Also, strategies based on cross-validation are available. In the Bayesian paradigm, the deviance information criterion (DIC) or posterior model probabilities are the primary tools for model selection. One theme central to these methods ...


Contributions To Accelerated Destructive Degradation Test Planning, Ying Shi 2010 Iowa State University


Graduate Theses and Dissertations

Many failure mechanisms can be traced to underlying degradation processes. Degradation eventually leads to a weakness that can cause a failure for products. When it is possible to measure degradation, such data often provide more information than traditional failure-time data for purposes of assessing and improving product reliability. For some products, however, degradation rates at use conditions are so low that appreciable degradation will not be observed in a test of practical time length. In such cases, it might be possible to use some accelerating variables (e.g., temperature, voltage, or pressure) to accelerate the degradation processes. In today's ...


Imputation Procedures For American Community Survey Group Quarters Small Area Estimation, Chandra Erdman, Chaitra Nagaraja 2009 Fordham University


Chaitra H Nagaraja

No abstract provided.


Measuring The HIV/AIDS Epidemic: Approaches And Challenges, Ron Brookmeyer 2009 University of California, Los Angeles


Ron Brookmeyer

In this article, the author reviews current approaches and methods for measuring the scope of the human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) epidemic and their strengths and weaknesses. In recent years, various public health agencies have revised statistical estimates of the scope of the HIV/AIDS pandemic. The author considers the reasons underlying these revisions. New sources of data for estimating HIV prevalence have become available, such as nationally representative probability-based surveys. New technologies such as biomarkers that indicate when persons became infected are now used to determine HIV incidence rates. The author summarizes the main sources of ...


The Effect Of Salvage Therapy On Survival In A Longitudinal Study With Treatment By Indication, Edward Kennedy, Jeremy Taylor, Douglas Schaubel, Scott Williams 2009 University of Pennsylvania


Edward H. Kennedy

We consider using observational data to estimate the effect of a treatment on disease recurrence, when the decision to initiate treatment is based on longitudinal factors associated with the risk of recurrence. The effect of salvage androgen deprivation therapy (SADT) on the risk of recurrence of prostate cancer is inadequately described by the existing literature. Furthermore, standard Cox regression yields biased estimates of the effect of SADT, since it is necessary to adjust for prostate-specific antigen (PSA), which is a time-dependent confounder and an intermediate variable. In this paper, we describe and compare two methods which appropriately adjust for PSA ...


Digital Commons powered by bepress