Open Access. Powered by Scholars. Published by Universities.®

Statistics and Probability

University of Kentucky, 2020

Articles 31 - 35 of 35

Full-Text Articles in Physical Sciences and Mathematics

Measuring Change: Prediction Of Early Onset Sepsis, Aric Schadler Jan 2020

Theses and Dissertations--Statistics

Sepsis occurs when an infection enters the bloodstream and spreads throughout the body, triggering a cascading response from the immune system. Sepsis is one of the leading causes of morbidity and mortality in today’s hospitals, despite published and accepted guidelines for timely and appropriate interventions for septic patients. The largest barrier to applying these interventions is the early identification of septic patients. Early identification and treatment lead to better outcomes, shorter lengths of stay, and financial savings for healthcare institutions. In order to increase the lead time in recognizing patients trending towards septicemia …


Moment Kernels For T-Central Subspace, Weihang Ren Jan 2020

Theses and Dissertations--Statistics

The T-central subspace allows one to perform sufficient dimension reduction for any statistical functional of interest. We propose a general estimator using a third-moment kernel to estimate the T-central subspace. In particular, in this dissertation we develop sufficient dimension reduction methods for the central mean subspace via the regression mean function, the central subspace via the Fourier transform, the central quantile subspace via a quantile estimator, and the central expectile subspace via an expectile estimator. Theoretical results are established, and simulation studies show the advantages of our proposed methods.


Simultaneous Tolerance Intervals For Response Surface And Mixture Designs Using The Adjusted Product Set Method, Aisaku Nakamura Jan 2020

Theses and Dissertations--Statistics

Various methods for constructing simultaneous tolerance intervals for regression models have been developed over the years, but all of them can be shown to be conservative. In this thesis, extensive simulations are conducted to evaluate the degree of conservatism with respect to their coverage probabilities. A new strategy for constructing simultaneous tolerance intervals for linear models is proposed by modifying an existing method; we call it the adjusted product set (APS) method. The APS method is also used to construct simultaneous tolerance bands for response surface and mixture designs.
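The kind of coverage-probability simulation the abstract describes can be sketched in miniature. The example below is not from the thesis: it Monte Carlo-checks the achieved confidence of a classical one-sample two-sided normal tolerance interval, using Howe's k-factor with a Wilson-Hilferty chi-square quantile approximation to stay dependency-light; all names and settings are illustrative.

```python
# Illustrative sketch (not the thesis's method): estimate by simulation the
# achieved confidence of a two-sided normal tolerance interval xbar +/- k*s.
import numpy as np
from statistics import NormalDist

def chi2_ppf(p, df):
    # Wilson-Hilferty approximation to the chi-square quantile
    z = NormalDist().inv_cdf(p)
    return df * (1 - 2 / (9 * df) + z * (2 / (9 * df)) ** 0.5) ** 3

def howe_k(n, coverage=0.90, conf=0.95):
    # Howe's (1969) k-factor: the interval should contain >= `coverage`
    # of the population with probability `conf`
    z = NormalDist().inv_cdf((1 + coverage) / 2)
    return z * np.sqrt((n - 1) * (1 + 1 / n) / chi2_ppf(1 - conf, n - 1))

rng = np.random.default_rng(0)
n, coverage, conf, reps = 20, 0.90, 0.95, 5000
k = howe_k(n, coverage, conf)
norm = NormalDist()

hits = 0
for _ in range(reps):
    x = rng.standard_normal(n)
    lo, hi = x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)
    # true content of the interval under the N(0, 1) data-generating model
    hits += (norm.cdf(hi) - norm.cdf(lo)) >= coverage

print(f"achieved confidence: {hits / reps:.3f} (nominal {conf})")
```

Comparing the achieved confidence to the nominal level over many repetitions is exactly how the degree of conservatism of an interval procedure is quantified.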


Orthogonal Recurrent Neural Networks And Batch Normalization In Deep Neural Networks, Kyle Eric Helfrich Jan 2020

Theses and Dissertations--Mathematics

Despite the recent success of various machine learning techniques, numerous obstacles remain. One obstacle is the vanishing/exploding gradient problem, in which gradients either shrink to zero or grow without bound; it is a well-known problem that commonly occurs in recurrent neural networks (RNNs). In this work we describe how this problem can be mitigated, establish three different architectures designed to avoid this issue, and derive update schemes for each architecture. Another portion of this work focuses on the often-used technique of batch normalization. Although found to be successful …
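The vanishing/exploding behavior the abstract refers to can be seen directly: backpropagation through T recurrent steps multiplies the gradient by the recurrent Jacobian T times, so its norm scales with the weight matrix's spectral radius raised to T. A minimal numerical illustration (not code from the dissertation; matrices and sizes are made up):

```python
# Illustrative sketch: repeated Jacobian products shrink or blow up
# a backpropagated gradient depending on the spectral radius of W.
import numpy as np

rng = np.random.default_rng(1)
g = rng.standard_normal(8)                 # some upstream gradient vector
norms = {}
for scale, label in [(0.9, "vanishing"), (1.1, "exploding")]:
    # recurrent weight with spectral radius `scale`: rescaled orthogonal base
    Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
    W = scale * Q
    h = g.copy()
    for _ in range(50):                    # backprop through 50 time steps
        h = W.T @ h                        # norm scales by exactly `scale` each step
    norms[label] = np.linalg.norm(h)

print(norms)
```

With an orthogonal base the norm changes by exactly `scale` per step, so after 50 steps the two gradients differ by a factor of (1.1/0.9)^50; keeping the recurrent matrix exactly orthogonal (scale 1) is what the architectures discussed here aim for.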


Unitary And Symmetric Structure In Deep Neural Networks, Kehelwala Dewage Gayan Maduranga Jan 2020

Theses and Dissertations--Mathematics

Recurrent neural networks (RNNs) have been successfully used on a wide range of sequential data problems. A well-known difficulty in using RNNs is the vanishing or exploding gradient problem. Recently, several RNN architectures have tried to mitigate this issue by maintaining an orthogonal or unitary recurrent weight matrix. One such architecture is the scaled Cayley orthogonal recurrent neural network (scoRNN), which parameterizes the orthogonal recurrent weight matrix through a scaled Cayley transform. This parametrization contains a diagonal scaling matrix with entries of +1 or −1 that cannot be optimized by gradient descent. Thus the …
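The scaled Cayley transform mentioned above maps a skew-symmetric matrix A and a diagonal ±1 scaling matrix D to an orthogonal matrix, which is why gradient updates on A keep the recurrent weight exactly orthogonal. A hedged sketch of the construction (names and sizes are illustrative, not the dissertation's code):

```python
# Illustrative sketch of a scaled Cayley parametrization:
# W = (I + A)^{-1} (I - A) D, with A skew-symmetric and D = diag(+/-1),
# is orthogonal for any such A and D.
import numpy as np

rng = np.random.default_rng(2)
n = 6
M = rng.standard_normal((n, n))
A = (M - M.T) / 2                              # skew-symmetric: A.T == -A
D = np.diag(rng.choice([-1.0, 1.0], size=n))   # fixed +/-1 scaling matrix
I = np.eye(n)

W = np.linalg.solve(I + A, I - A) @ D          # scaled Cayley transform

# orthogonality check: W.T @ W recovers the identity
print(np.allclose(W.T @ W, I))
```

Because D has fixed ±1 entries, it is not reachable by gradient descent on A, which is exactly the limitation the abstract raises.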