Open Access. Powered by Scholars. Published by Universities.®

Mathematics Commons

Articles 1 - 4 of 4

Full-Text Articles in Mathematics

A Uniformly Most Powerful Test For The Mean Of A Beta Distribution, Richard Ntiamoah Kyei Aug 2024

Electronic Theses and Dissertations

The beta distribution is used in numerous real-world applications, including areas such as manufacturing (quality control) and analyzing patient outcomes in health care. It also plays a key role in statistical theory, including multivariate analysis of variance (MANOVA) and Bayesian statistics. It is a flexible distribution that can account for many different characteristics of real data. To our surprise, there has been very little work or discussion on performing statistical hypothesis testing for the mean when it is reasonable to assume that the population is beta distributed. Many analysts conduct traditional analyses using a t-test or nonparametric approach, try transformations, …
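
As a rough illustration only (not the uniformly most powerful test developed in this thesis), the sketch below shows the beta mean parameterisation a/(a+b) and the kind of traditional one-sample t-test the abstract notes analysts often fall back on; all parameter values are hypothetical.

# A minimal sketch, not the thesis's UMP test: it only illustrates the beta
# mean and the "traditional" t-test mentioned in the abstract. Hypothetical values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Beta(a, b) has mean a / (a + b); here mu = 2 / (2 + 5) ≈ 0.286.
a, b = 2.0, 5.0
x = rng.beta(a, b, size=50)

# Traditional approach: a one-sample t-test of H0: mu = 0.30 vs H1: mu != 0.30,
# ignoring the beta assumption.
t_stat, p_val = stats.ttest_1samp(x, popmean=0.30)
print(f"sample mean = {x.mean():.3f}, t = {t_stat:.3f}, p = {p_val:.3f}")

# Fitting a beta distribution on the unit interval gives a model-based
# estimate of the mean for comparison.
a_hat, b_hat, loc, scale = stats.beta.fit(x, floc=0, fscale=1)
print(f"fitted beta mean = {a_hat / (a_hat + b_hat):.3f}")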


Classification In Supervised Statistical Learning With The New Weighted Newton-Raphson Method, Toma Debnath Jan 2024

Electronic Theses and Dissertations

In this thesis, the Weighted Newton-Raphson Method (WNRM), an innovative optimization technique, is introduced for classification in statistical supervised learning and applied to a diabetes prediction model to find maximum likelihood estimates. This iterative optimization method, a modification of the ordinary Newton-Raphson algorithm, solves nonlinear systems of equations whose Jacobian matrices are singular. The quadratic convergence of the WNRM and its high efficiency for optimizing nonlinear likelihood functions, even when the Jacobians are singular, allow it to be incorporated easily into classical classification methods and generalized linear models such as the logistic regression model in supervised learning. The WNRM is thoroughly investigated …
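
For context, the sketch below implements the ordinary Newton-Raphson update for logistic regression maximum likelihood, the baseline algorithm the WNRM modifies; the small ridge term eps is only a placeholder for coping with a near-singular Hessian and is not the thesis's weighting scheme. The toy data are hypothetical.

# Ordinary Newton-Raphson (IRLS) for logistic regression MLE; the ridge term
# `eps` is a generic stand-in for handling a (near-)singular Hessian, NOT the WNRM.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_newton(X, y, n_iter=25, eps=1e-8):
    """Maximum-likelihood fit of logistic regression by Newton-Raphson."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = sigmoid(X @ beta)                # fitted probabilities
        W = mu * (1.0 - mu)                   # diagonal of the weight matrix
        grad = X.T @ (y - mu)                 # score vector
        hess = X.T @ (X * W[:, None])         # observed information matrix
        beta += np.linalg.solve(hess + eps * np.eye(p), grad)
    return beta

# Hypothetical toy data: intercept plus one predictor.
rng = np.random.default_rng(1)
x = rng.normal(size=200)
X = np.column_stack([np.ones_like(x), x])
y = rng.binomial(1, sigmoid(-0.5 + 1.5 * x))
print(logistic_newton(X, y))                  # roughly recovers (-0.5, 1.5)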


Refining The Inverse Lipschitz Constant For Injective Relu Networks, Cole Rausch Jan 2024

Electronic Theses and Dissertations

In this thesis, we study the Inverse Lipschitz Constant (ILC) of injective ReLU layers, focusing on the tightness of the ILC lower bound established in Puthawala et al. Our approach has three components. First, we find that the conditions for injectivity on lines yield a weaker condition than the general condition given in Puthawala et al. Second, we perform numerical experiments to judge the tightness of the existing ILC lower bound and find that the bound is overly conservative. Third, we identify the source of the potential slack in the proof of the existing ILC bound, and perform further numerical experiments …
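
As a simple illustration of the quantity being bounded (not the lower bound from Puthawala et al. or the thesis's experiments), the sketch below empirically probes the inverse Lipschitz constant of a single ReLU layer by random sampling; the weight matrix is hypothetical.

# Empirical probe of the inverse Lipschitz constant of x -> relu(W x):
# sample random pairs and take the smallest ratio ||f(x) - f(y)|| / ||x - y||.
# Illustration only; not the ILC lower bound studied in the thesis.
import numpy as np

def relu_layer(W, x):
    return np.maximum(W @ x, 0.0)

def empirical_ilc(W, n_pairs=10_000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = W.shape
    ratios = []
    for _ in range(n_pairs):
        x, y = rng.normal(size=n), rng.normal(size=n)
        num = np.linalg.norm(relu_layer(W, x) - relu_layer(W, y))
        den = np.linalg.norm(x - y)
        ratios.append(num / den)
    return min(ratios)   # an upper bound on the true inverse Lipschitz constant

# Hypothetical expansive layer (m = 2n); a random Gaussian W is not guaranteed
# to give an injective ReLU layer, this only exercises the probe.
rng = np.random.default_rng(42)
W = rng.normal(size=(16, 8))
print(f"empirical ILC estimate: {empirical_ilc(W):.4f}")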


Contrastive Learning, With Application To Forensic Identification Of Source, Cole Ryan Patten Jan 2024

Electronic Theses and Dissertations

Forensic identification-of-source problems often fall under the category of verification problems, an area where recent advances in deep learning have been driven by contrastive learning methods. Many forensic identification-of-source problems also contend with a scarcity of data, an issue addressed by few-shot learning. In this work, we make precise what makes a neural network a contrastive network. We then consider the use of contrastive neural networks for few-shot classification problems and compare them to other statistical and deep learning methods. Our findings indicate similar performance between models trained with a contrastive loss and models trained with a cross-entropy loss. We …
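
To illustrate the kind of objective being compared with cross-entropy (not the thesis's architecture or data), the sketch below computes a classic pairwise contrastive loss on hypothetical embeddings.

# Classic pairwise contrastive loss (Hadsell-Chopra-LeCun style) on toy
# embeddings; shown only to illustrate the objective, not the thesis's network.
import numpy as np

def contrastive_loss(emb_a, emb_b, same_source, margin=1.0):
    """emb_a, emb_b: (n, d) embeddings; same_source: (n,) 1 if the pair shares a source."""
    d = np.linalg.norm(emb_a - emb_b, axis=1)                   # pairwise distances
    pos = same_source * d**2                                    # pull same-source pairs together
    neg = (1 - same_source) * np.maximum(margin - d, 0.0)**2    # push different-source pairs apart
    return 0.5 * np.mean(pos + neg)

# Hypothetical embeddings for three pairs: two same-source, one different-source.
rng = np.random.default_rng(0)
a = rng.normal(size=(3, 4))
b = a + 0.1 * rng.normal(size=(3, 4))
labels = np.array([1, 1, 0])
print(f"loss = {contrastive_loss(a, b, labels):.4f}")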