Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

Articles 1 - 11 of 11

Full-Text Articles in Physical Sciences and Mathematics

Exploring Improvements To The Convergence Of Reconstructing Historical Destructive Earthquakes, Kameron Lightheart Nov 2021

Theses and Dissertations

Determining risk to human populations due to natural disasters has been a topic of interest in the STEM fields for centuries. Earthquakes and the tsunamis they cause are of particular interest due to their repetition cycles. These cycles can last hundreds of years, but we have only had modern measuring instruments for the last century or so, which makes analysis difficult. In this document, we explore ways to improve upon an existing method for reconstructing earthquakes from historical accounts of tsunamis. This method was designed and implemented by Jared P. Whitehead's research group over the last five years. The issue …


Trustworthy, Useful Languages For Probabilistic Modeling And Inference, Neil B. Toronto Jun 2014

Theses and Dissertations

The ideals of exact modeling, and of putting off approximations as long as possible, make Bayesian practice both successful and difficult. Languages for modeling probabilistic processes, whose implementations answer questions about them under asserted conditions, promise to ease much of the difficulty. Unfortunately, very few of these languages have mathematical specifications. This makes them difficult to trust: there is no way to distinguish between an implementation error and a feature, and there is no standard by which to prove optimizations correct. Further, because the languages are based on the incomplete theories of probability typically used in Bayesian practice, they place …


Cluster Expansion Models Via Bayesian Compressive Sensing, Lance Jacob Nelson May 2013

Theses and Dissertations

The steady march of new technology depends crucially on our ability to discover and design new, advanced materials. Partially due to increases in computing power, computational methods now play an increasing role in this discovery process. Advances in this area speed the discovery and development of advanced materials by guiding experimental work down fruitful paths. Density functional theory (DFT) has proven to be a highly accurate tool for computing material properties. However, due to its computational cost and complexity, DFT is unsuited to performing exhaustive searches over many candidate materials or to extracting thermodynamic information. To perform these types of …


Hierarchical Probit Models For Ordinal Ratings Data, Allison M. Butler Jun 2011

Theses and Dissertations

University students often complete evaluations of their courses and instructors. The evaluation tool typically contains questions about the course and the instructor on an ordinal Likert scale. We assess instructor effectiveness while adjusting for known confounders. We present a probit regression model with a latent variable to measure instructor effectiveness while accounting for student-specific covariates, such as the student's grade in the course, high school and university GPA, and ACT score.
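The latent-variable ordinal probit idea described above can be sketched as a small simulation. Every numeric value below (covariate effects, cutpoints, and the instructor effect) is a hypothetical choice for illustration, not an estimate from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation of the latent-variable ordinal probit:
# each rating arises from a continuous latent "effectiveness" score
# that fixed cutpoints slice into ordinal Likert categories 1-5.
n = 500
gpa = rng.normal(3.2, 0.4, n)     # hypothetical student covariate
act = rng.normal(26.0, 3.0, n)    # hypothetical student covariate
instructor_effect = 0.8           # the quantity of interest (assumed value)

# Latent score: covariate adjustment + instructor effect + N(0, 1) noise
z = (0.5 * (gpa - 3.2) + 0.05 * (act - 26.0)
     + instructor_effect + rng.normal(size=n))

# Cutpoints map the latent score to the observed 1-5 rating
cutpoints = np.array([-1.5, -0.5, 0.5, 1.5])
rating = 1 + np.searchsorted(cutpoints, z)
```

In the Bayesian version, the latent scores, cutpoints, and instructor effect would all be sampled jointly rather than fixed as here.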


Adaptive Threat Detector Testing Using Bayesian Gaussian Process Models, Bradley Thomas Ferguson May 2011

Theses and Dissertations

Detection of biological and chemical threats is an important consideration in modern national defense policy. Much of the testing and evaluation of threat detection technologies is performed without appropriate uncertainty quantification. This paper proposes an approach to analyzing the effect of threat concentration on the probability of detecting chemical and biological threats. The approach uses a semi-parametric probit formulation relating threat concentration level to the probability of instrument detection. It also utilizes a Bayesian adaptive design to determine the threat concentrations at which tests should be performed. The approach offers unique advantages, namely, the flexibility to model non-monotone curves …
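A minimal sketch of the probit link between concentration and detection probability follows; the intercept and slope are assumed values for illustration, and the thesis's actual model adds a semi-parametric component on top of this parametric core:

```python
import numpy as np
from scipy.stats import norm

# Assumed parameters for illustration only
beta0, beta1 = -2.0, 1.5

def p_detect(concentration):
    """Probability of instrument detection: probit link on log concentration."""
    return norm.cdf(beta0 + beta1 * np.log(concentration))

# Detection probability rises with threat concentration
conc = np.array([0.5, 1.0, 5.0, 20.0])
probs = p_detect(conc)
```

An adaptive design would place the next test near the concentration where the posterior for this curve is most uncertain, rather than on a fixed grid.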


A Bayesian Decision Theoretical Approach To Supervised Learning, Selective Sampling, And Empirical Function Optimization, James Lamond Carroll Mar 2010

Theses and Dissertations

Many have used the principles of statistics and Bayesian decision theory to model specific learning problems. It is less common to see models of the processes of learning in general. One exception is the model of the supervised learning process known as the "Extended Bayesian Formalism" or EBF. This model is descriptive, in that it can describe and compare learning algorithms. Thus the EBF is capable of modeling both effective and ineffective learning algorithms. We extend the EBF to model unsupervised learning, semi-supervised learning, supervised learning, and empirical function optimization. We also generalize the utility model of the EBF to …


Super-Resolution Via Image Recapture And Bayesian Effect Modeling, Neil B. Toronto Mar 2009

Theses and Dissertations

The goal of super-resolution is to increase not only the size of an image, but also its apparent resolution, making the result more plausible to human viewers. Many super-resolution methods do well at modest magnification factors, but even the best suffer from boundary and gradient artifacts at high magnification factors. This thesis presents Bayesian edge inference (BEI), a novel method grounded in Bayesian inference that does not suffer from these artifacts and remains competitive in published objective quality measures. BEI works by modeling the image capture process explicitly, including any downsampling, and modeling a fictional recapture process, which together allow …


An Adaptive Bayesian Approach To Bernoulli-Response Clinical Trials, Andrew W. Stacey Aug 2007

Theses and Dissertations

Traditional clinical trials have been inefficient in their methods of dose finding and dose allocation. In this paper a four-parameter logistic equation is used to model the outcome of Bernoulli-response clinical trials. A Bayesian adaptive design is used to fit the logistic equation to the dose-response curve of Phase II and Phase III clinical trials. Because of inherent restrictions in the logistic model, symmetric candidate densities cannot be used, thereby creating asymmetric jumping rules inside the Markov chain Monte Carlo algorithm. An order restricted Metropolis-Hastings algorithm is implemented to account for these limitations. Modeling clinical trials in a Bayesian framework …
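The Hastings correction for an asymmetric candidate density, which the abstract alludes to, can be sketched as below. The positivity-constrained target here is a stand-in example, not the four-parameter logistic posterior from the thesis:

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(1)

def log_post(theta):
    # Hypothetical log-posterior on theta > 0 (Gamma(2, 1) up to a constant),
    # standing in for a constrained model posterior.
    return np.log(theta) - theta

def q_logpdf(x, center, scale=0.5):
    # Truncated-normal proposal density, truncated at 0: NOT symmetric,
    # so q(new | old) != q(old | new) in general.
    a = (0.0 - center) / scale
    return truncnorm.logpdf(x, a, np.inf, loc=center, scale=scale)

def mh_step(theta, scale=0.5):
    a = (0.0 - theta) / scale
    cand = truncnorm.rvs(a, np.inf, loc=theta, scale=scale, random_state=rng)
    # The Hastings ratio must include the proposal densities both ways;
    # omitting them would bias the chain when the proposal is asymmetric.
    log_alpha = (log_post(cand) - log_post(theta)
                 + q_logpdf(theta, cand) - q_logpdf(cand, theta))
    return cand if np.log(rng.uniform()) < log_alpha else theta

theta, draws = 1.0, []
for _ in range(2000):
    theta = mh_step(theta)
    draws.append(theta)
```

The same correction applies whatever asymmetric jumping rule the parameter restrictions force, which is the situation the abstract describes.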


Development Of Informative Priors In Microarray Studies, Kassandra M. Fronczyk Jul 2007

Theses and Dissertations

Microarrays measure the abundance of DNA transcripts for thousands of gene sequences simultaneously, facilitating genomic comparisons across tissue types or disease status. These experiments are used to understand fundamental aspects of growth and development and to explore the underlying genetic causes of many diseases. The data from most microarray studies are found in open-access online databases. Bayesian models are ideal for the analysis of microarray data because of their ability to integrate prior information; however, most current Bayesian analyses use empirical or flat priors. We present a Perl script to build an informative prior by mining online databases for similar …


Graphical And Bayesian Analysis Of Unbalanced Patient Management Data, Emily Stewart Righter Mar 2007

Theses and Dissertations

The International Normalized Ratio (INR) measures the speed at which blood clots. Healthy people have an INR of about one. Some people are at greater risk of blood clots, and their physician prescribes a target INR range, generally 2-3. The farther a patient is above or below their prescribed range, the more dangerous their situation. A variety of point-of-care (POC) devices have been developed to monitor patients. The purpose of this research was to develop innovative graphics to help describe a highly unbalanced dataset and to carry out Bayesian analyses to determine which of five devices best manages patients. An …


Modeling Distributions Of Test Scores With Mixtures Of Beta Distributions, Jingyu Feng Nov 2005

Theses and Dissertations

Test score distributions are used to make important instructional decisions about students. The test scores usually do not follow a normal distribution. In some cases, the scores appear to follow a bimodal distribution that can be modeled with a mixture of beta distributions. This bimodality may be due to differing levels of student ability. The purpose of this study was to develop and apply statistical techniques for fitting beta mixtures and detecting bimodality in test score distributions. Maximum likelihood and Bayesian methods were used to estimate the five parameters of the beta mixture distribution for scores in four quizzes in a …
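The five parameters mentioned are two shape parameters per beta component plus a mixing weight. A sketch with assumed values (not the thesis's estimates) shows how such a mixture produces a bimodal score density:

```python
import numpy as np
from scipy.stats import beta

# Assumed parameter values for illustration: five parameters total
w = 0.4            # mixing weight
a1, b1 = 2.0, 8.0  # shapes of the low-scoring component
a2, b2 = 9.0, 2.5  # shapes of the high-scoring component

def mixture_pdf(x):
    """Density of the five-parameter two-component beta mixture on (0, 1)."""
    return w * beta.pdf(x, a1, b1) + (1.0 - w) * beta.pdf(x, a2, b2)

x = np.linspace(0.01, 0.99, 99)
d = mixture_pdf(x)

# Bimodality shows up as two interior local maxima of the density
peaks = np.flatnonzero((d[1:-1] > d[:-2]) & (d[1:-1] > d[2:])) + 1
```

Fitting would replace the assumed values with maximum likelihood or posterior estimates; the peak-counting step is one simple way to flag bimodality in the fitted density.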