Open Access. Powered by Scholars. Published by Universities.®

Probability Commons


Statistical Theory


Articles 1 - 30 of 45

Full-Text Articles in Probability

Sensitivity Analysis Of Prior Distributions In Regression Model Estimation, Ayoade I Adewole, Oluwatoyin K. Bodunwa Jan 2024


Al-Bahir Journal for Engineering and Pure Sciences

Bayesian inference depends solely on the specification and accuracy of the likelihood and the prior distributions of the observed data. This research delves into Bayesian estimation of regression models to reduce the impact of some of the problems posed by conventional methods of estimating regression models, such as handling complex models, small sample sizes, and the inclusion of background information in the estimation procedure. Posterior distributions are based on the prior distributions and the accuracy of the data, which is a fundamental principle of Bayesian statistics for producing accurate final model estimates. Sensitivity analysis is an essential part of mathematical model validation in obtaining …
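As a hedged illustration of the kind of prior sensitivity check the abstract describes (a minimal sketch, not the authors' procedure), the snippet below fits a Bayesian simple linear regression with a conjugate normal prior and known noise variance, then sweeps the prior variance to show how the posterior slope estimate shifts; the data and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y = 2x + noise (hypothetical, not the paper's data)
n = 20
x = rng.uniform(0, 1, n)
y = 2.0 * x + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), x])   # design matrix with intercept
sigma2 = 0.25                          # assumed known noise variance

def posterior_mean(prior_var):
    """Posterior mean of the coefficients under a N(0, prior_var * I) prior."""
    prior_prec = np.eye(2) / prior_var
    post_cov = np.linalg.inv(X.T @ X / sigma2 + prior_prec)
    return post_cov @ (X.T @ y / sigma2)

# Sensitivity analysis: sweep the prior variance and watch the slope estimate move
for prior_var in [0.01, 0.1, 1.0, 10.0, 100.0]:
    beta = posterior_mean(prior_var)
    print(f"prior variance {prior_var:7.2f} -> slope estimate {beta[1]: .3f}")
```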


Machine Learning Approaches For Cyberbullying Detection, Roland Fiagbe Jan 2024


Data Science and Data Mining

Cyberbullying refers to the act of bullying using electronic means and the internet. In recent years, this act has been identified as a major problem among young people and even adults. It can negatively impact one's emotions and lead to adverse outcomes such as depression, anxiety, harassment, and suicide. This has led to the need to employ machine learning techniques to automatically detect cyberbullying and prevent it on various social media platforms. In this study, we analyze the combination of some Natural Language Processing (NLP) algorithms (such as Bag-of-Words and TF-IDF) with some popular machine learning …
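A rough sketch of the kind of pipeline described, assuming scikit-learn and a tiny, hypothetical labeled data set; the study's actual features, models, and data differ.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.metrics import classification_report

# Hypothetical toy data: texts and labels (1 = cyberbullying, 0 = benign)
texts = ["you are awesome", "nobody likes you, loser", "great game today",
         "you should just disappear", "see you at practice", "everyone hates you"]
labels = [0, 1, 0, 1, 0, 1]

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.33, random_state=42, stratify=labels)

# TF-IDF features (swap in CountVectorizer for a Bag-of-Words baseline)
clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
    ("model", LogisticRegression(max_iter=1000)),
])
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test), zero_division=0))
```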


Predicting Superconducting Critical Temperature Using Regression Analysis, Roland Fiagbe Jan 2024


Data Science and Data Mining

This project estimates a regression model to predict the superconducting critical temperature from variables extracted from the superconductor's chemical formula. The regression model, combined with stepwise variable selection, gives a reasonable predictive model with low prediction error (MSE). Variables based on atomic radius, valence, atomic mass, and thermal conductivity appear to contribute the most to the predictive model.
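As a hedged sketch of stepwise-style variable selection for such a regression (assuming scikit-learn; the thesis's exact procedure and the real superconductivity feature set are not reproduced here), greedy forward selection scored by cross-validated MSE might look like this:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Stand-in for the superconductivity features (atomic radius, valence, mass, ...)
X, y = make_regression(n_samples=300, n_features=20, n_informative=6,
                       noise=10.0, random_state=1)

# Greedy forward selection, scored by cross-validated MSE
selector = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=6, direction="forward",
    scoring="neg_mean_squared_error", cv=5)
selector.fit(X, y)

X_sel = selector.transform(X)
mse = -cross_val_score(LinearRegression(), X_sel, y,
                       scoring="neg_mean_squared_error", cv=5).mean()
print("selected columns:", np.flatnonzero(selector.get_support()))
print(f"cross-validated MSE with selected features: {mse:.1f}")
```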


The Distribution Of The Significance Level, Paul O. Monnu Jan 2024


Electronic Theses and Dissertations

Reporting the p-value is customary when conducting a test of hypothesis or significance. The p-value is the probability, assuming the null hypothesis is true, of obtaining a hypothetical second sample at least as extreme as the one observed. The significance level is a statistic that interests us to investigate. Being a statistic, it has a distribution. For the F-test in a one-way ANOVA and the t-tests for population means, we define the significance level, its observed value, and the observed significance level. It is possible to derive the distribution of the significance level. The t-test and the F-test are not without controversy. Specifically, we demonstrate that as sample size …
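One concrete way to see that a significance level, viewed as a statistic, has a distribution is to simulate p-values from a two-sample t-test when the null hypothesis holds: they are approximately Uniform(0, 1). A small sketch assuming SciPy, offered as an illustration rather than the thesis's derivation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pvals = []
for _ in range(10_000):
    # Both samples come from the same normal population, so H0 holds
    a = rng.normal(0, 1, 30)
    b = rng.normal(0, 1, 30)
    pvals.append(stats.ttest_ind(a, b).pvalue)

pvals = np.array(pvals)
# Under H0 the p-value is Uniform(0,1): about 5% fall below 0.05, etc.
print("P(p <= 0.05) ≈", (pvals <= 0.05).mean())
print("P(p <= 0.50) ≈", (pvals <= 0.50).mean())
```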


Exploration And Statistical Modeling Of Profit, Caleb Gibson Dec 2023


Undergraduate Honors Theses

For any company involved in sales, maximization of profit is the driving force that guides all decision-making. Many factors can influence how profitable a company can be, including external factors such as changes in inflation or consumer demand, and internal factors such as pricing and product cost. By understanding specific trends in its own internal data, a company can readily identify problem areas or potential growth opportunities to help increase profitability.

In this discussion, we use an extensive data set to examine how a company might analyze its own data to identify potential changes it could investigate to drive better performance. Based …


Statistical Roles Of The G-Expectation Framework In Model Uncertainty: The Semi-G-Structure As A Stepping Stone, Yifan Li Oct 2022


Electronic Thesis and Dissertation Repository

The G-expectation framework is a generalization of the classical probability system, built on the sublinear expectation, that deals with phenomena which cannot be described by a single probabilistic model. These phenomena are closely related to the long-standing concern about model uncertainty in statistics. However, the notions of distribution and independence in the G-framework are quite different from the classical setup. These distinctions make it difficult to apply the ideas of the framework to general statistical practice. Therefore, a fundamental and unavoidable problem is how to better understand G-version concepts from a statistical perspective.

To explore this problem, this thesis establishes a new substructure …
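For reference, the sublinear expectation at the heart of the G-framework is usually axiomatized as in Peng's work; the following is the standard definition rather than anything specific to this thesis.

```latex
% A sublinear expectation \hat{\mathbb{E}} on a space of random variables \mathcal{H}
% satisfies, for all X, Y \in \mathcal{H}:
\begin{align*}
&\text{Monotonicity:}          && X \ge Y \implies \hat{\mathbb{E}}[X] \ge \hat{\mathbb{E}}[Y],\\
&\text{Constant preservation:} && \hat{\mathbb{E}}[c] = c \quad \text{for all } c \in \mathbb{R},\\
&\text{Sub-additivity:}        && \hat{\mathbb{E}}[X + Y] \le \hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y],\\
&\text{Positive homogeneity:}  && \hat{\mathbb{E}}[\lambda X] = \lambda\,\hat{\mathbb{E}}[X] \quad \text{for all } \lambda \ge 0.
\end{align*}
```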


New Developments On The Estimability And The Estimation Of Phase-Type Actuarial Models, Cong Nie Jul 2022


Electronic Thesis and Dissertation Repository

This thesis studies the estimability and the estimation methods for two models based on Markov processes: the phase-type aging model (PTAM), which models the human aging process, and the discrete multivariate phase-type model (DMPTM), which can be used to model multivariate insurance claim processes.

The principal contributions of this thesis can be categorized into two areas. First, an objective measure of estimability is proposed to quantify estimability in the context of statistical models. Existing methods for assessing estimability require the subjective specification of thresholds, which potentially limits their usefulness. Unlike these methods, the proposed measure of estimability is objective. In …


Advancements In Gaussian Process Learning For Uncertainty Quantification, John C. Nicholson May 2022


All Dissertations

Gaussian processes are among the most useful tools for modeling continuous processes in machine learning and statistics. The research presented provides advancements in uncertainty quantification using Gaussian processes from two distinct perspectives. The first provides a more fundamental means of constructing Gaussian processes that satisfy arbitrary linear operator constraints in a much more general framework than its predecessors; the second concerns the calibration of state-aware parameters in computer models. If the value of a process is known at a finite collection of points, one may use Gaussian processes to construct a surface which interpolates these values to …
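As a minimal sketch of the interpolation idea in the last sentence (not the dissertation's constrained construction), a Gaussian process with an RBF kernel conditioned on noise-free observations yields a posterior mean that passes through the known values; NumPy only, with hypothetical data.

```python
import numpy as np

def rbf(a, b, length=0.3):
    """Squared-exponential kernel k(a, b) = exp(-|a - b|^2 / (2 * length^2))."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# Known values of the process at a few points (hypothetical)
x_obs = np.array([0.0, 0.3, 0.5, 0.9])
y_obs = np.sin(2 * np.pi * x_obs)

x_new = np.linspace(0, 1, 101)
K = rbf(x_obs, x_obs) + 1e-10 * np.eye(len(x_obs))  # jitter for stability
K_star = rbf(x_new, x_obs)

# GP posterior mean and variance with a zero prior mean
alpha = np.linalg.solve(K, y_obs)
mean = K_star @ alpha
var = 1.0 - np.einsum("ij,ji->i", K_star, np.linalg.solve(K, K_star.T))

# The posterior mean interpolates: it reproduces y_obs at x_obs (up to jitter)
print("interpolates observed values:",
      np.allclose(mean[[0, 30, 50, 90]], y_obs, atol=1e-4))
print("largest posterior variance away from the data:", round(var.max(), 3))
```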


Early-Warning Alert Systems For Financial-Instability Detection: An Hmm-Driven Approach, Xing Gu Apr 2022


Electronic Thesis and Dissertation Repository

Regulators' early intervention is crucial when the financial system is experiencing difficulties. Financial stability must be preserved to avert bank bailouts, which hugely drain a government's financial resources. Detecting periods of financial crisis in advance entails the development and customisation of accurate and robust quantitative techniques. The goal of this thesis is to construct automated systems, via the interplay of various mathematical and statistical methodologies, to signal financial-instability episodes over the near-term horizon. These signal alerts could provide regulatory bodies with the capacity to initiate an appropriate response that would thwart, or at least minimise, the occurrence of a financial crisis. …
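A toy sketch of the HMM idea (assuming the hmmlearn package and simulated returns with a calm and a turbulent regime; the thesis's models and data sources are far more elaborate): the state with the larger estimated variance can serve as a crude instability flag.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Simulated daily returns: a calm regime followed by a turbulent one (hypothetical)
calm = rng.normal(0.0005, 0.005, 400)
turbulent = rng.normal(-0.001, 0.03, 100)
returns = np.concatenate([calm, turbulent]).reshape(-1, 1)

# Two-state Gaussian HMM; the high-variance state flags potential instability
model = GaussianHMM(n_components=2, covariance_type="full",
                    n_iter=200, random_state=0)
model.fit(returns)
states = model.predict(returns)

risky = int(np.argmax(model.covars_.ravel()))   # state with the larger variance
print("estimated state variances:", model.covars_.ravel())
print("fraction of the last 100 days flagged risky:",
      (states[-100:] == risky).mean())
```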


Applying The Data: Predictive Analytics In Sport, Anthony Teeter, Margo Bergman Nov 2020


Access*: Interdisciplinary Journal of Student Research and Scholarship

The history of wagering predictions and their impact on wide-reaching disciplines such as statistics and economics dates to at least the 1700s, if not before. Predicting the outcomes of sports is a multibillion-dollar business that capitalizes on these tools but is in constant development with the addition of big-data analytics methods. Sportsline.com, a popular website for fantasy sports leagues, provides odds predictions in multiple sports, produces proprietary computer models of both winning and losing teams, and provides specific point estimates. To test likely candidates for inclusion in these prediction algorithms, the authors developed a computer model, and test …


Task Interrupted By A Poisson Process, Jarrett Christopher Nantais Oct 2020


Major Papers

We consider a task with a completion time T (if not interrupted), a random variable with probability density function (pdf) f(t), t > 0. Before it is complete, the task may be interrupted by a Poisson process with rate lambda. If that happens, the task must begin again, with the same completion-time random variable T but with a potentially different realization. These interruptions can recur until the task is eventually finished, after a total time W. In this paper, we find the Laplace transform of W in several special cases.
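As a hedged illustration (not the paper's derivation), the total time W is easy to simulate; for a deterministic completion time tau, this restart model has the known mean E[W] = (e^(lambda*tau) - 1)/lambda, which the Monte Carlo estimate below should approximately reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

def total_time(tau, lam):
    """Total time W to finish a task of length tau that restarts whenever a
    Poisson(rate=lam) interruption arrives before completion."""
    w = 0.0
    while True:
        interrupt = rng.exponential(1.0 / lam)   # time until the next interruption
        if interrupt >= tau:                     # the task finishes first
            return w + tau
        w += interrupt                           # work lost, start over

tau, lam = 2.0, 0.7
samples = np.array([total_time(tau, lam) for _ in range(100_000)])
print("Monte Carlo E[W]:        ", samples.mean())
print("(exp(lam*tau) - 1)/lam:  ", (np.exp(lam * tau) - 1) / lam)
```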


Uniform Random Variate Generation With The Linear Congruential Method, Joseph Free Jul 2020


PANDION: The Osprey Journal of Research and Ideas

This report considers the issue of using a specific linear congruential generator (LCG) to create random variates from the uniform(0,1) distribution. The LCG is used to generate multiple samples of pseudo-random numbers, and statistical computation techniques are used to assess whether those samples could have resulted from a uniform(0,1) distribution. Source code is included in the appendix, along with annotations.
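A minimal sketch of the idea, assuming a textbook set of LCG constants (the "minimal standard" multiplier; the report's specific generator and tests may differ) and SciPy's Kolmogorov-Smirnov test.

```python
import numpy as np
from scipy import stats

def lcg(seed, n, a=16807, c=0, m=2**31 - 1):
    """Linear congruential generator x_{k+1} = (a*x_k + c) mod m,
    with the output scaled into (0, 1)."""
    x = seed
    out = np.empty(n)
    for i in range(n):
        x = (a * x + c) % m
        out[i] = x / m
    return out

u = lcg(seed=12345, n=10_000)

# One-sample Kolmogorov-Smirnov test against the uniform(0,1) distribution
ks_stat, p_value = stats.kstest(u, "uniform")
print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.3f}")
```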


On Arnold–Villasenor Conjectures For Characterizing Exponential Distribution Based On Sample Of Size Three, George Yanev May 2020


School of Mathematical and Statistical Sciences Faculty Publications and Presentations

Arnold and Villasenor [4] obtain a series of characterizations of the exponential distribution based on random samples of size two. These results were already applied in constructing goodness-of-fit tests. Extending the techniques from [4], we prove some of Arnold and Villasenor’s conjectures for samples of size three. An example with simulated data is discussed.


Personal Foul: How Head Trauma And The Insurance Industry Are Threatening Sports, Zachary Cooler Apr 2020


Senior Honors Theses

This thesis investigates the growing problem of head trauma in contact sports such as football, hockey, and soccer through medical studies, implications for the insurance industry, and ongoing litigation. It examines medical studies that are finding more evidence to support the claim that contact-sport players are more likely to suffer head-trauma symptoms such as memory loss, mood swings, and, in extreme cases, even Lou Gehrig's disease. The thesis also demonstrates that these medical symptoms and the monetary losses from medical claims are convincing insurance companies to withdraw insurance coverage for sports leagues, which they are justifying …


Inferences For Weibull-Gamma Distribution In Presence Of Partially Accelerated Life Test, Mahmoud Mansour, M A W Mahmoud Prof., Rashad El-Sagheer Mar 2020


Basic Science Engineering

In this paper, the aim is to develop point and interval estimation for the parameters of the Weibull-Gamma distribution (WGD) using a progressively Type-II censored (PROG-II-C) sample under a step-stress partially accelerated life test (SSPALT) model. The maximum likelihood (ML), Bayes, and four parametric bootstrap methods are used to obtain point estimates of the distribution parameters and the acceleration factor. Furthermore, the approximate confidence intervals (ACIs), four bootstrap confidence intervals, and credible intervals of the estimators are obtained. The Bayes estimates are computed under the squared error loss (SEL) function using Markov Chain Monte Carlo (MCMC) …


How Machine Learning And Probability Concepts Can Improve Nba Player Evaluation, Harrison Miller Jan 2020


CMC Senior Theses

In this paper I break down a scholarly article, written by Sameer K. Deshpande and Shane T. Jensen, that proposed a new method to evaluate NBA players. The NBA, which stands for the National Basketball Association, is the highest-level professional basketball league in America. Deshpande and Jensen proposed a model of how NBA players impact their team's chances of winning a game, using machine learning and probability concepts. I preface that by diving into these concepts and their mathematical backgrounds. These concepts include building a linear model using the ordinary least squares method, the bias …


Statistical Inference For Networks Of High-Dimensional Point Processes, Xu Wang, Mladen Kolar, Ali Shojaie Dec 2019


UW Biostatistics Working Paper Series

Fueled in part by recent applications in neuroscience, high-dimensional Hawkes processes have become a popular tool for modeling the network of interactions among multivariate point process data. While evaluating the uncertainty of network estimates is critical in scientific applications, existing methodological and theoretical work has focused only on estimation. To bridge this gap, this paper proposes a high-dimensional statistical inference procedure with theoretical guarantees for multivariate Hawkes processes. Key to this inference procedure is a new concentration inequality for the first- and second-order statistics of integrated stochastic processes, which summarize the entire history of the process. We apply this …
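For intuition only (the paper concerns high-dimensional inference, not simulation), a univariate Hawkes process with an exponential kernel can be generated by Ogata's thinning algorithm; a rough sketch, assuming NumPy and made-up parameter values.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Ogata thinning for a univariate Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0

    def intensity(s):
        past = np.asarray(events)
        return mu + alpha * np.exp(-beta * (s - past)).sum() if events else mu

    while t < horizon:
        lam_bar = intensity(t)                 # valid upper bound until the next event
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            break
        if rng.uniform() <= intensity(t) / lam_bar:   # accept with prob lambda(t)/lam_bar
            events.append(t)
    return np.asarray(events)

ts = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=200.0)
print(f"{len(ts)} events; empirical rate {len(ts) / 200.0:.2f}, "
      f"stationary rate mu/(1 - alpha/beta) = {0.5 / (1 - 0.8 / 1.2):.2f}")
```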


Generalizations Of The Arcsine Distribution, Rebecca Rasnick May 2019


Electronic Theses and Dissertations

The arcsine distribution describes the fraction of time one player is winning in a fair coin-toss game and has been studied for over a hundred years. There has been little further work on how the distribution changes when the coin tosses are not fair, or when a player has already won the initial coin tosses or, equivalently, starts with a lead. This thesis first covers a proof of the arcsine distribution. Then, we explore how the distribution changes when the coin is unfair. Finally, we explore the distribution when one person has won the first …
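A quick simulation of the classical fair-coin case, assuming NumPy: the fraction of time a player is ahead concentrates near 0 and 1, matching the arcsine density 1/(pi*sqrt(x(1-x))). A rough sketch for intuition, not part of the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_games = 1000, 10_000

# +1/-1 steps of a fair coin-toss game; partial sums are player 1's lead
steps = rng.choice([-1, 1], size=(n_games, n_steps))
lead = np.cumsum(steps, axis=1)

# Fraction of time player 1 is (weakly) ahead in each game
frac_ahead = (lead >= 0).mean(axis=1)

# The arcsine law says the extreme fractions are the most likely outcomes
hist, edges = np.histogram(frac_ahead, bins=10, range=(0, 1), density=True)
for lo, hi, h in zip(edges[:-1], edges[1:], hist):
    print(f"[{lo:.1f}, {hi:.1f}): {'#' * int(10 * h)}")
```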


Unified Methods For Feature Selection In Large-Scale Genomic Studies With Censored Survival Outcomes, Lauren Spirko-Burns, Karthik Devarajan Mar 2019


COBRA Preprint Series

One of the major goals in large-scale genomic studies is to identify genes with a prognostic impact on time-to-event outcomes, which provides insight into the disease process. With rapid developments in high-throughput genomic technologies over the past two decades, the scientific community is able to monitor the expression levels of tens of thousands of genes and proteins, resulting in enormous data sets where the number of genomic features is far greater than the number of subjects. Methods based on univariate Cox regression are often used to select genomic features related to survival outcomes; however, the Cox model assumes proportional hazards …
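A hedged sketch of the univariate Cox screening step mentioned above, assuming the lifelines package and a hypothetical expression matrix; the paper itself develops alternatives to this approach.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n, p = 120, 50                      # far fewer "genes" than a real study

# Hypothetical expression matrix; only feature 0 truly affects the hazard
expr = rng.normal(size=(n, p))
hazard = np.exp(0.8 * expr[:, 0])
time = rng.exponential(1.0 / hazard)
event = rng.uniform(size=n) < 0.8   # roughly 20% censoring

pvals = []
for j in range(p):                  # one Cox model per genomic feature
    df = pd.DataFrame({"x": expr[:, j], "time": time, "event": event})
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    pvals.append(cph.summary.loc["x", "p"])

order = np.argsort(pvals)
print("top 5 features by univariate Cox p-value:", order[:5].tolist())
```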


Non Parametric Test For Testing Exponentiality Against Exponential Better Than Used In Laplace Transform Order, Mahmoud Mansour, M A W Mahmoud Prof. Mar 2019


Basic Science Engineering

In this paper, a test statistic for testing exponentiality against the exponential better than used in Laplace transform order (EBUL) class, based on the Laplace transform technique, is proposed. Pitman's asymptotic efficiency of the test is calculated and compared with that of other tests. The percentiles of this test are tabulated. The powers of the test are estimated for distributions commonly used in aging problems. For censored data, the test is applied and the percentiles are also calculated and tabulated. Finally, real examples from different areas are used as practical applications of the proposed test.


Modeling Stochastically Intransitive Relationships In Paired Comparison Data, Ryan Patrick Alexander Mcshane Jan 2019


Statistical Science Theses and Dissertations

If the Warriors beat the Rockets and the Rockets beat the Spurs, does that mean that the Warriors are better than the Spurs? Sophisticated fans would argue that the Warriors are better by the transitive property, but could Spurs fans make a legitimate argument that their team is better despite this chain of evidence?

We first explore the nature of intransitive (rock-scissors-paper) relationships with a graph theoretic approach to the method of paired comparisons framework popularized by Kendall and Smith (1940). Then, we focus on the setting where all pairs of items, teams, players, or objects have been compared to …
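To make the rock-scissors-paper idea concrete, here is a small, hypothetical sketch (not taken from the dissertation) that encodes pairwise "beats" results as a directed graph and looks for directed 3-cycles, which are exactly the intransitive triples; assumes the networkx package.

```python
import networkx as nx

# Hypothetical head-to-head winners: an edge u -> v means "u beat v"
results = [("Warriors", "Rockets"), ("Rockets", "Spurs"), ("Spurs", "Warriors"),
           ("Warriors", "Lakers"), ("Rockets", "Lakers"), ("Spurs", "Lakers")]

G = nx.DiGraph(results)

# Directed 3-cycles are intransitive triples (rock-scissors-paper patterns)
intransitive = [c for c in nx.simple_cycles(G) if len(c) == 3]
print("intransitive triples:", intransitive)
```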


Counting And Coloring Sudoku Graphs, Kyle Oddson Jan 2019


Mathematics and Statistics Dissertations, Theses, and Final Project Papers

A sudoku puzzle is most commonly a 9 × 9 grid of 3 × 3 boxes wherein the puzzle player writes the numbers 1–9 with no repetition in any row, column, or box. We generalize the notion of the n² × n² sudoku grid to all integers n ≥ 2 and codify the empty sudoku board as a graph. In the main section of this paper we prove that sudoku boards and sudoku graphs exist for all such n; we prove the equivalence of [3]'s construction using unions and products of graphs to the definition of …
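To illustrate the "empty sudoku board as a graph" idea (a sketch under my own construction, not necessarily the construction in [3]), the code below builds the 4 × 4 (n = 2) sudoku graph, where cells are vertices and edges join cells that share a row, column, or box, and then greedily colors it; assumes networkx.

```python
import itertools
import networkx as nx

def sudoku_graph(n=2):
    """Graph on the n^2 x n^2 board: vertices are cells, edges join cells that
    share a row, a column, or an n x n box."""
    size = n * n
    G = nx.Graph()
    cells = list(itertools.product(range(size), repeat=2))
    G.add_nodes_from(cells)
    for (r1, c1), (r2, c2) in itertools.combinations(cells, 2):
        same_box = (r1 // n == r2 // n) and (c1 // n == c2 // n)
        if r1 == r2 or c1 == c2 or same_box:
            G.add_edge((r1, c1), (r2, c2))
    return G

G = sudoku_graph(2)
coloring = nx.greedy_color(G, strategy="largest_first")
print("vertices:", G.number_of_nodes(), "edges:", G.number_of_edges())
print("colors used by the greedy coloring:", len(set(coloring.values())))  # >= 4
```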


Asymptotic Behavior Of The Random Logistic Model And Of Parallel Bayesian Logspline Density Estimators, Konstandinos Kotsiopoulos Jul 2018


Doctoral Dissertations

This dissertation is comprised of two separate projects. The first concerns a Markov chain called the Random Logistic Model. For r in (0, 4] and x in [0, 1], the logistic map f_r(x) = rx(1 − x) defines, for positive integer t, the dynamical system x_r(t + 1) = f_r(x_r(t)) on [0, 1], where x_r(1) = x. The interplay between this dynamical system and the Markov chain x_{r,N}(t), defined by perturbing the logistic map by truncated Gaussian noise scaled by N^(−1/2), is studied as N → ∞. A natural question is …
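A rough numerical sketch of the perturbed logistic map (illustrative only; clipping to [0, 1] is used here as a crude stand-in for the truncated Gaussian noise described in the abstract):

```python
import numpy as np

def random_logistic(r, x0, N, steps, seed=0):
    """Logistic map x -> r*x*(1 - x) perturbed by Gaussian noise scaled by N**-0.5,
    with the state clipped to [0, 1] (a crude stand-in for truncation)."""
    rng = np.random.default_rng(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        x = r * x * (1 - x) + rng.normal(0, 1) / np.sqrt(N)
        x = min(max(x, 0.0), 1.0)
        path.append(x)
    return np.array(path)

# Larger N means smaller noise, so the chain hugs the deterministic orbit
for N in (10**2, 10**4, 10**6):
    path = random_logistic(r=3.2, x0=0.3, N=N, steps=500)
    print(f"N = {N:>7}: mean of the last 100 states = {path[-100:].mean():.4f}")
```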


Deep Learning Analysis Of Limit Order Book, Xin Xu May 2018


Arts & Sciences Electronic Theses and Dissertations

In this paper, we build a deep neural network for modeling the spatial structure of the limit order book and predicting the future best ask or best bid price, based on the ideas of (Sirignano 2016). We propose an intuitive data-processing method to approximate data that is not available to us, using only level I data, which is more widely available. The model is based on the idea that there is local dependence between the best ask or best bid price and the sizes of related orders. First we use logistic regression to show that this approach is reasonable. To show the advantages …


Evaluation Of Using The Bootstrap Procedure To Estimate The Population Variance, Nghia Trong Nguyen May 2018


Electronic Theses and Dissertations

The bootstrap procedure is widely used in nonparametric statistics to generate an empirical sampling distribution from a given sample data set for a statistic of interest. Generally, the results are good for location parameters such as population mean, median, and even for estimating a population correlation. However, the results for a population variance, which is a spread parameter, are not as good due to the resampling nature of the bootstrap method. Bootstrap samples are constructed using sampling with replacement; consequently, groups of observations with zero variance manifest in these samples. As a result, a bootstrap variance estimator will carry a …
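A small numerical check of the issue described above, assuming NumPy (an illustrative sketch, not the thesis's study design): averaging sample variances across bootstrap resamples tends to underestimate the population variance for small n.

```python
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0                      # population variance of N(0, 2^2)
n, B, reps = 15, 500, 2000

bias_accum = 0.0
for _ in range(reps):
    sample = rng.normal(0, 2, n)
    # Bootstrap: resample with replacement and average the resample variances
    idx = rng.integers(0, n, size=(B, n))
    boot_vars = sample[idx].var(axis=1, ddof=1)
    bias_accum += boot_vars.mean() - true_var

print(f"average bias of the bootstrap variance estimator: {bias_accum / reps:.3f}")
```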


Effect Of Neuromodulation Of Short-Term Plasticity On Information Processing In Hippocampal Interneuron Synapses, Elham Bayat Mokhtari Jan 2018


Graduate Student Theses, Dissertations, & Professional Papers

Neurons convey information about the complex dynamic environment in the form of signals. Computational neuroscience provides a theoretical foundation for enhancing our understanding of the nervous system. The aim of this dissertation is to present techniques for studying the brain and how it processes information, in particular in neurons of the hippocampus.

We begin with a brief review of the history of neuroscience and the biological background of basic neurons. To appreciate the importance of information theory, familiarity with its basics is required; these basics are presented in Chapter 2. In Chapter 3, we use information theory to estimate the amount of …


Gilmore Girls And Instagram: A Statistical Look At The Popularity Of The Television Show Through The Lens Of An Instagram Page, Brittany Simmons May 2017


Student Scholar Symposium Abstracts and Posters

After going on the Warner Brothers Tour in December of 2015, I created a Gilmore Girls Instagram account. This account, which started off as a way for me to create edits of the show and post my photos from the tour, turned into something bigger than I ever could have imagined. In just over a year I have gained over 55,000 followers. I post content including revival news, merchandise, and edits of the show that have been featured in Entertainment Weekly, Bustle, E! News, People Magazine, Yahoo News, & GilmoreNews.

I created a dataset of qualitative and quantitative outcomes from my …


Inference On The Stress-Strength Model From Weibull Gamma Distribution, Mahmoud Mansour, Rashad El-Sagheer, M. A. W. Mahmoud Prof. May 2017


Basic Science Engineering

No abstract provided.


Advanced Sequential Monte Carlo Methods And Their Applications To Sparse Sensor Network For Detection And Estimation, Kai Kang Aug 2016


Doctoral Dissertations

General state space models present a flexible framework for modeling dynamic systems and therefore have vast applications in many disciplines such as engineering, economics, and biology. However, optimal estimation problems for non-linear, non-Gaussian state space models are analytically intractable in general. Sequential Monte Carlo (SMC) methods have become a very popular class of simulation-based methods for solving optimal estimation problems. The advantage of SMC methods over classical filtering methods such as the Kalman Filter and the Extended Kalman Filter is that they can handle non-linear, non-Gaussian scenarios without relying on any local linearization techniques. In this …
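A compact sketch of a bootstrap particle filter on a classic non-linear, non-Gaussian benchmark model (illustrative only; the dissertation's advanced SMC methods go well beyond this), assuming NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 50, 2000                      # time steps and number of particles
q_var, r_var = 10.0, 1.0             # process and observation noise variances

def f(x, t):
    """Non-linear state transition of the benchmark model."""
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t)

# Simulate one trajectory and its observations y_t = x_t^2 / 20 + noise
x_true, y = np.zeros(T), np.zeros(T)
x = 0.1
for t in range(T):
    x = f(x, t) + rng.normal(0, np.sqrt(q_var))
    x_true[t] = x
    y[t] = x**2 / 20 + rng.normal(0, np.sqrt(r_var))

# Bootstrap particle filter: propagate, weight by the likelihood, resample
particles = rng.normal(0, 2, N)
estimates = np.zeros(T)
for t in range(T):
    particles = f(particles, t) + rng.normal(0, np.sqrt(q_var), N)
    log_w = -0.5 * (y[t] - particles**2 / 20) ** 2 / r_var
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    estimates[t] = np.sum(w * particles)                 # posterior-mean estimate
    particles = particles[rng.choice(N, size=N, p=w)]    # multinomial resampling

rmse = np.sqrt(np.mean((estimates - x_true) ** 2))
print(f"particle filter RMSE over {T} steps: {rmse:.2f}")
```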


Models For Hsv Shedding Must Account For Two Levels Of Overdispersion, Amalia Magaret Jan 2016


UW Biostatistics Working Paper Series

We have frequently implemented crossover studies to evaluate new therapeutic interventions for genital herpes simplex virus infection. The outcome measured to assess the efficacy of interventions on herpes disease severity is the viral shedding rate, defined as the frequency of detection of HSV on the genital skin and mucosa. We performed a simulation study to ascertain whether our standard model, which we have used previously, was appropriately considering all the necessary features of the shedding data to provide correct inference. We simulated shedding data under our standard, validated assumptions and assessed the ability of 5 different models to reproduce the …