Mathematics Commons

Articles 1 - 4 of 4

Full-Text Articles in Mathematics

Why Rectified Linear Neurons: A Possible Interval-Based Explanation, Jonathan Contreras, Martine Ceberio, Vladik Kreinovich Nov 2021

Departmental Technical Reports (CS)

At present, the most efficient machine learning techniques are deep neural networks. In these networks, a signal repeatedly undergoes two types of transformations: a linear combination of the inputs, followed by a non-linear transformation s(v) applied to each value v. Empirically, the function s(v) = max(v, 0) -- known as the rectified linear function -- works best. There are some partial explanations for this empirical success; however, none of these explanations is fully convincing. In this paper, we analyze this why-question from the viewpoint of uncertainty propagation. We show that reasonable uncertainty-related arguments lead to another possible explanation of why rectified linear functions …
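
To make the two transformations described above concrete, here is a minimal NumPy sketch of a single rectified linear layer. The function name dense_relu_layer and the example weights are illustrative, not taken from the paper; only the formula s(v) = max(v, 0) comes from the abstract.

import numpy as np

def dense_relu_layer(x, W, b):
    # Linear combination of the inputs, then the element-wise
    # rectified linear function s(v) = max(v, 0).
    v = W @ x + b
    return np.maximum(v, 0)

# Illustrative 3-input, 2-neuron layer (weights chosen arbitrarily).
x = np.array([1.0, -2.0, 0.5])
W = np.array([[0.2, -0.1, 0.4],
              [-0.3, 0.5, 0.1]])
b = np.array([0.1, -0.2])
print(dense_relu_layer(x, W, b))  # [0.7, 0.0] -- the negative value is clipped to 0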


Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition And Interval Fields, Michael Beer, Olga Kosheleva, Vladik Kreinovich Oct 2021

Departmental Technical Reports (CS)

In many practical situations, we know that there is a functional dependence between a quantity q and quantities a1, ..., an, but the exact form of this dependence is only known with uncertainty. In some cases, we only know the class of possible functions describing this dependence. In other cases, we also know the probabilities of different functions from this class -- i.e., we know the corresponding random field or random process. To solve problems related to such a dependence, it is desirable to be able to simulate the corresponding functions, i.e., to have algorithms that transform simple intervals or …
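
One standard way to transform simple random variables into sample functions -- the KL-decomposition named in the title -- is a truncated Karhunen-Loeve expansion. Below is a hedged Python sketch for the classical textbook case of Brownian motion on [0, 1], whose eigenfunctions and eigenvalues are known in closed form; the function name and truncation level are illustrative choices, not code from the report.

import numpy as np

def kl_sample(t, n_terms=50, rng=None):
    # Truncated Karhunen-Loeve expansion of Brownian motion on [0, 1]:
    #   f(t) = sum_k sqrt(lambda_k) * xi_k * phi_k(t),
    # with eigenfunctions phi_k(t) = sqrt(2) sin((k - 1/2) pi t),
    # eigenvalues lambda_k = 1 / ((k - 1/2) pi)^2, and i.i.d.
    # standard-normal coefficients xi_k (the "simple random variables").
    rng = rng or np.random.default_rng()
    k = np.arange(1, n_terms + 1)
    freq = (k - 0.5) * np.pi                        # sqrt(lambda_k) = 1 / freq
    xi = rng.standard_normal(n_terms)
    phi = np.sqrt(2.0) * np.sin(np.outer(t, freq))  # eigenfunctions at grid points
    return phi @ (xi / freq)

t = np.linspace(0.0, 1.0, 101)
path = kl_sample(t)   # one simulated sample function of the random process
print(path[:3])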


Limit Theorems As Blessing Of Dimensionality: Neural-Oriented Overview, Olga Kosheleva, Vladik Kreinovich Apr 2021

Departmental Technical Reports (CS)

As a system becomes more complex, at first, its description and analysis become more complicated. However, a further increase in the system's complexity often makes this analysis simpler. A classical example is the Central Limit Theorem: when we have a few independent sources of uncertainty, the resulting uncertainty is very difficult to describe, but as the number of such sources increases, the resulting distribution gets close to an easy-to-analyze normal one -- and indeed, normal distributions are ubiquitous. We show that such limit theorems often make the analysis of complex systems easier -- i.e., lead to the blessing-of-dimensionality phenomenon -- for …
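
The Central Limit Theorem effect described above is easy to observe numerically: as we sum more and more independent non-normal uncertainty sources, the standardized sum behaves more and more like a standard normal variable. The following Python sketch uses uniform sources and sample sizes chosen purely for illustration, not from the report.

import numpy as np

def standardized_sum(n_sources, n_samples=100_000, rng=None):
    # Sum n_sources i.i.d. uniform(0, 1) uncertainty sources and
    # standardize the sum to mean 0 and variance 1.
    rng = rng or np.random.default_rng(0)
    s = rng.uniform(size=(n_samples, n_sources)).sum(axis=1)
    return (s - s.mean()) / s.std()

# For a standard normal, P(|Z| <= 1) is about 0.6827; the estimate
# below approaches this value as the number of sources grows
# (for a single uniform source it is only about 0.577).
for n in (1, 2, 10, 50):
    z = standardized_sum(n)
    print(n, np.mean(np.abs(z) <= 1.0))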


Forecasting Crashes, Credit Card Default, And Imputation Analysis On Missing Values By The Use Of Neural Networks, Jazmin Quezada Jan 2019

Open Access Theses & Dissertations

A neural network is a system of hardware and/or software patterned after the operation of neurons in the human brain. Neural networks -- also called artificial neural networks -- are a variety of deep learning technology, which also falls under the umbrella of artificial intelligence, or AI. Recent studies show that artificial neural networks have the highest coefficient of determination (i.e., a measure of how well a model explains and predicts future outcomes) in comparison to k-nearest-neighbor classifiers, logistic regression, discriminant analysis, the naive Bayesian classifier, and classification trees. In this work, the theoretical description of the neural network methodology …
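
As a small illustration of the evaluation metric the abstract relies on, here is a Python sketch of the standard coefficient of determination, R^2 = 1 - SS_res / SS_tot. The data values are made up for illustration; this is the textbook definition, not code from the thesis.

import numpy as np

def r_squared(y_true, y_pred):
    # Coefficient of determination R^2 = 1 - SS_res / SS_tot:
    # the fraction of the variance in the observations that the
    # model's predictions explain (1 indicates a perfect fit).
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

# Made-up observations and model predictions, purely for illustration.
y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.8, 5.1, 7.2, 8.7])
print(r_squared(y_true, y_pred))  # 0.991: close to 1, i.e. a good fit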