
Mathematics Commons


University of Texas at El Paso

2021

Neural networks

Articles 1 - 3 of 3

Full-Text Articles in Mathematics

Why Rectified Linear Neurons: A Possible Interval-Based Explanation, Jonathan Contreras, Martine Ceberio, Vladik Kreinovich Nov 2021

Departmental Technical Reports (CS)

At present, the most efficient machine learning techniques are deep neural networks. In these networks, a signal repeatedly undergoes two types of transformations: a linear combination of inputs, and a non-linear transformation of each value v -> s(v). Empirically, the function s(v) = max(v, 0) -- known as the rectified linear function -- works the best. There are some partial explanations for this empirical success; however, none of these explanations is fully convincing. In this paper, we analyze this why-question from the viewpoint of uncertainty propagation. We show that reasonable uncertainty-related arguments lead to another possible explanation of why rectified linear functions …
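
As a rough illustration of the two alternating transformations described above -- a linear combination of inputs followed by the rectified linear function s(v) = max(v, 0) -- here is a minimal NumPy sketch; the layer sizes and random weights are illustrative assumptions, not part of the paper:

import numpy as np

def relu(v):
    # Rectified linear function s(v) = max(v, 0), applied elementwise.
    return np.maximum(v, 0.0)

def dense_relu_layer(x, W, b):
    # One layer of a deep network: a linear combination of the inputs,
    # followed by the non-linear transformation v -> s(v).
    return relu(W @ x + b)

# Hypothetical example: a tiny two-layer network on a 3-dimensional input.
rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)
output = dense_relu_layer(dense_relu_layer(x, W1, b1), W2, b2)
print(output)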


Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition And Interval Fields, Michael Beer, Olga Kosheleva, Vladik Kreinovich Oct 2021

Departmental Technical Reports (CS)

In many practical situations, we know that there is a functional dependence between a quantity q and quantities a1, ..., an, but the exact form of this dependence is only known with uncertainty. In some cases, we only know the class of possible functions describing this dependence. In other cases, we also know the probabilities of different functions from this class -- i.e., we know the corresponding random field or random process. To solve problems related to such a dependence, it is desirable to be able to simulate the corresponding functions, i.e., to have algorithms that transform simple intervals or …
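
As a rough illustration of what such a simulation can look like, here is a minimal NumPy sketch of the Karhunen-Loeve (KL) decomposition mentioned in the title: it transforms simple independent standard normal variables into sample functions of a random process. The squared-exponential covariance, grid size, and number of retained modes are illustrative assumptions, not taken from the paper:

import numpy as np

# Covariance of a Gaussian random process on a grid over [0, 1]
# (squared-exponential kernel, chosen here purely for illustration).
t = np.linspace(0.0, 1.0, 100)
cov = np.exp(-(t[:, None] - t[None, :])**2 / (2 * 0.1**2))

# KL decomposition: eigen-decompose the covariance and keep the leading modes.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
k = 10  # number of retained KL modes

# Simulate one sample function: independent standard normal coefficients,
# scaled by the square roots of the corresponding eigenvalues.
rng = np.random.default_rng(1)
xi = rng.standard_normal(k)
sample = eigvecs[:, :k] @ (np.sqrt(np.maximum(eigvals[:k], 0.0)) * xi)
print(sample[:5])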


Limit Theorems As Blessing Of Dimensionality: Neural-Oriented Overview, Olga Kosheleva, Vladik Kreinovich Apr 2021

Departmental Technical Reports (CS)

As a system becomes more complex, at first, its description and analysis become more complicated. However, a further increase in the system's complexity often makes this analysis simpler. A classical example is the Central Limit Theorem: when we have a few independent sources of uncertainty, the resulting uncertainty is very difficult to describe, but as the number of such sources increases, the resulting distribution gets close to an easy-to-analyze normal one -- and indeed, normal distributions are ubiquitous. We show that such limit theorems often make analysis of complex systems easier -- i.e., lead to the blessing-of-dimensionality phenomenon -- for …
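
As a rough numerical illustration of the Central Limit Theorem effect described above, here is a minimal NumPy sketch: each source of uncertainty is uniform (far from normal), yet the standardized sum's skewness and excess kurtosis shrink toward the normal-distribution value 0 as the number of sources grows. The uniform distribution and sample sizes are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(2)

def total_uncertainty(n_sources, n_samples=100_000):
    # Sum of n_sources independent uniform(-1, 1) terms, each with variance 1/3,
    # standardized to unit variance.
    total = rng.uniform(-1.0, 1.0, size=(n_samples, n_sources)).sum(axis=1)
    return total / np.sqrt(n_sources / 3.0)

for n in (1, 2, 30):
    z = total_uncertainty(n)
    z = (z - z.mean()) / z.std()
    skewness = float((z**3).mean())
    excess_kurtosis = float((z**4).mean() - 3.0)
    print(n, round(skewness, 3), round(excess_kurtosis, 3))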