Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

University of Texas at El Paso

Departmental Technical Reports (CS)

Probabilistic uncertainty

Articles 1 - 12 of 12

Full-Text Articles in Entire DC Network

Need For Techniques Intermediate Between Interval And Probabilistic Ones, Olga Kosheleva, Vladik Kreinovich Feb 2022

In high performance computing, when we process a large amount of data, we do not have much information about the dependence between measurement errors corresponding to different inputs. To gauge the uncertainty of the result of data processing, the two usual approaches are: the interval approach, when we consider the worst-case scenario in which all measurement errors are strongly correlated, and the probabilistic approach, when we assume that all these errors are independent. The problem is that usually, the interval approach leads to too pessimistic, too large uncertainty estimates, while the probabilistic approach often underestimates the resulting uncertainty. To get …
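
The contrast drawn in this abstract can already be seen for the sum of many measured inputs. The following minimal Python sketch (illustrative numbers, not taken from the report) compares the worst-case interval bound, where all errors may push in the same direction, with the probabilistic bound for independent errors, which grows only like the square root of the number of inputs.

```python
# Minimal sketch (not from the report): how the two standard approaches gauge the
# error of a sum y = x1 + ... + xn when each input has measurement error at most delta.
import math

n = 10_000        # hypothetical number of inputs
delta = 0.01      # hypothetical bound on each measurement error

# Interval (worst-case) approach: errors may all add up in the same direction.
interval_bound = n * delta

# Probabilistic approach: errors are assumed independent with standard deviation
# sigma ~ delta, so they add in quadrature.
probabilistic_bound = math.sqrt(n) * delta

print(f"worst-case (interval) bound:       {interval_bound:.2f}")
print(f"independent (probabilistic) bound: {probabilistic_bound:.2f}")
# The interval bound is typically too pessimistic, the probabilistic one often too
# optimistic -- hence the need for intermediate techniques.
```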


Need To Combine Interval And Probabilistic Uncertainty: What Needs To Be Computed, What Can Be Computed, What Can Be Feasibly Computed, And How Physics Can Help, Julio Urenda, Vladik Kreinovich, Olga Kosheleva Jan 2022

In many practical situations, the quantity of interest is difficult to measure directly. In such situations, to estimate this quantity, we measure easier-to-measure quantities which are related to the desired one by a known relation, and we use the results of these measurements to estimate the desired quantity. How accurate is this estimate?

The traditional engineering approach assumes that we know the probability distributions of measurement errors; however, in practice, we often only have partial information about these distributions. In some cases, we only know the upper bounds on the measurement errors; in such cases, the only thing we know about …
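
As a rough illustration of how the accuracy of such an indirect estimate can be gauged when only upper bounds on the measurement errors are known, the sketch below propagates hypothetical per-input error bounds through a hypothetical known relation f via linearization (numerical partial derivatives). It is an assumption-laden illustration, not the method developed in the report.

```python
# Sketch under assumptions: a hypothetical relation f, hypothetical measurement
# results X and error bounds D. Indirect measurement: y = f(x1, ..., xn); a
# linearized worst-case accuracy estimate is D_y ~= sum_i |df/dx_i| * D_i.

def f(x1, x2):
    """Hypothetical known relation between the measured and desired quantities."""
    return x1 * x2 + x1 ** 2

def accuracy_bound(f, X, D, h=1e-6):
    """Linearized worst-case bound on the error of f(X) given per-input bounds D."""
    bound = 0.0
    for i in range(len(X)):
        Xp = list(X); Xp[i] += h
        Xm = list(X); Xm[i] -= h
        dfdxi = (f(*Xp) - f(*Xm)) / (2 * h)   # numerical partial derivative
        bound += abs(dfdxi) * D[i]
    return bound

X = [2.0, 3.0]       # hypothetical measurement results
D = [0.05, 0.02]     # hypothetical upper bounds on measurement errors
print("estimate Y =", f(*X), " accuracy bound ~", accuracy_bound(f, X, D))
```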


Need To Combine Interval And Probabilistic Uncertainty: What Needs To Be Computed, What Can Be Computed, What Can Be Feasibly Computed, And How Physics Can Help, Songsak Sriboonchitta, Thach N. Nguyen, Vladik Kreinovich, Hung T. Nguyen Sep 2018

In many practical situations, the quantity of interest is difficult to measure directly. In such situations, to estimate this quantity, we measure easier-to-measure quantities which are related to the desired one by a known relation, and we use the results of these measurements to estimate the desired quantity. How accurate is this estimate?

The traditional engineering approach assumes that we know the probability distributions of measurement errors; however, in practice, we often only have partial information about these distributions. In some cases, we only know the upper bounds on the measurement errors; in such cases, the only thing we know about …


How To Deal With Uncertainties In Computing: From Probabilistic And Interval Uncertainty To Combination Of Different Approaches, With Applications To Engineering And Bioinformatics, Vladik Kreinovich Mar 2017

Most data processing techniques traditionally used in scientific and engineering practice are statistical. These techniques are based on the assumption that we know the probability distributions of measurement errors etc.

In practice, we often do not know the distributions; we only know the bound D on the measurement accuracy -- hence, after we get the measurement result X, the only information that we have about the actual (unknown) value x of the measured quantity is that x belongs to the interval [X - D, X + D]. Techniques for data processing under such interval uncertainty are called interval computations; these …
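
The following short Python sketch illustrates the idea of interval computations mentioned above: each measurement result X with accuracy bound D is replaced by the interval [X - D, X + D], and arithmetic is carried out on intervals so that the result encloses all possible actual values. The class and the numbers are illustrative, not code from the report.

```python
# Minimal interval-computations sketch (illustrative; not from the report).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of intervals: endpoints add.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product of intervals: take the extreme endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Hypothetical measurement results X = 2.0 and 3.0, each with accuracy bound D = 0.1:
x = Interval(2.0 - 0.1, 2.0 + 0.1)
y = Interval(3.0 - 0.1, 3.0 + 0.1)
print("x + y =", x + y)   # encloses all possible values of the sum
print("x * y =", x * y)   # encloses all possible values of the product
```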


Towards A Fast, Practical Alternative To Joint Inversion Of Multiple Datasets: Model Fusion, Omar Ochoa, Aaron A. Velasco, Christian Servin Oct 2010

Estimating Information Amount Under Uncertainty: Algorithmic Solvability And Computational Complexity, Vladik Kreinovich, Gang Xiang Jan 2010

Measurement results (and, more generally, estimates) are never absolutely accurate: there is always an uncertainty; the actual value x is, in general, different from the estimate X. Sometimes we know the probabilities of different values of the estimation error dx = X - x; sometimes we only know the interval of possible values of dx; sometimes we have interval bounds on the cdf of dx. To compare different measuring instruments, it is desirable to know which of them brings more information - i.e., it is desirable to gauge the amount of information. For probabilistic uncertainty, this amount of information is described by Shannon's entropy; …
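
For the probabilistic case mentioned in the abstract, Shannon's entropy can be computed directly; the sketch below does so for two hypothetical discrete distributions (illustrative values only, not data from the report), showing that a narrower distribution leaves less remaining uncertainty.

```python
# Sketch: Shannon's entropy as the amount-of-information measure for the
# probabilistic case (illustrative values; not from the report).
import math

def shannon_entropy(probabilities):
    """H = -sum p * log2(p), in bits, over outcomes with nonzero probability."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A concentrated distribution vs. a uniform one over the same 4 outcomes:
print(shannon_entropy([0.85, 0.05, 0.05, 0.05]))  # about 0.85 bits of remaining uncertainty
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits -- maximal uncertainty
```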


Application-Motivated Combinations Of Fuzzy, Interval, And Probability Approaches, And Their Use In Geoinformatics, Bioinformatics, And Engineering, Vladik Kreinovich May 2008

Most data processing techniques traditionally used in scientific and engineering practice are statistical. These techniques are based on the assumption that we know the probability distributions of measurement errors, etc. In practice, we often do not know the distributions; we only know the bound D on the measurement accuracy - hence, after we get the measurement result X, the only information that we have about the actual (unknown) value x of the measured quantity is that x belongs to the interval [X - D, X + D]. Techniques for data processing under such interval uncertainty are called interval computations; these …


Propagation And Provenance Of Probabilistic And Interval Uncertainty In Cyberinfrastructure-Related Data Processing And Data Fusion, Paulo Pinheiro Da Silva, Aaron A. Velasco, Martine Ceberio, Christian Servin, Matthew G. Averill, Nicholas Ricky Del Rio, Luc Longpre, Vladik Kreinovich Nov 2007

In the past, communications were much slower than computations. As a result, researchers and practitioners collected different data into huge databases kept at single locations such as NASA and the US Geological Survey. At present, communications are so much faster that it is possible to keep different databases at different locations, and to automatically select, transform, and collect relevant data when necessary. The corresponding cyberinfrastructure is actively used in many applications. It drastically enhances scientists' ability to discover, reuse, and combine a large number of resources, e.g., data and services.

Because of this importance, it is desirable to be able to …


Combining Interval, Probabilistic, And Fuzzy Uncertainty: Foundations, Algorithms, Challenges -- An Overview, Vladik Kreinovich, David J. Berleant, Scott Ferson, Weldon A. Lodwick Nov 2005

Since the 1960s, many algorithms have been designed to deal with interval uncertainty. In the last decade, there has been a lot of progress in extending these algorithms to the case when we have a combination of interval and probabilistic uncertainty. We provide an overview of related algorithms, results, and remaining open problems.


Towards Combining Probabilistic And Interval Uncertainty In Engineering Calculations: Algorithms For Computing Statistics Under Interval Uncertainty, And Their Computational Complexity, Vladik Kreinovich, Gang Xiang, Scott A. Starks, Luc Longpre, Martine Ceberio, Roberto Araiza, J. Beck, R. Kandathi, A. Nayak, R. Torres, J. Hajagos Jun 2005

In many engineering applications, we have to combine probabilistic and interval uncertainty. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, correlation with other measurements. In environmental measurements, we often only measure the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data.

In this paper, we provide a survey of algorithms for computing various statistics under interval uncertainty and their computational complexity. The survey includes both known …
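
The simplest of the statistics discussed above is the mean, for which the exact range under interval uncertainty is easy to compute: it runs from the mean of the lower endpoints to the mean of the upper endpoints. The sketch below illustrates this baseline case with hypothetical pollution readings; the harder statistics (variance, correlation) require the algorithms surveyed in the paper.

```python
# Sketch: exact range of the sample mean when each x_i is only known to lie in
# [lo_i, hi_i]. Illustrative data; not the paper's harder algorithms.

def mean_range(intervals):
    n = len(intervals)
    lower = sum(lo for lo, _ in intervals) / n   # smallest when every x_i is at its lower endpoint
    upper = sum(hi for _, hi in intervals) / n   # largest when every x_i is at its upper endpoint
    return lower, upper

pollution_readings = [(1.0, 1.2), (0.8, 1.1), (1.3, 1.4)]   # hypothetical interval measurements x(t)
print(mean_range(pollution_readings))   # (1.0333..., 1.2333...)
```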


Towards Combining Probabilistic And Interval Uncertainty In Engineering Calculations, Scott A. Starks, Vladik Kreinovich, Luc Longpre, Martine Ceberio, Gang Xiang, Roberto Araiza, J. Beck, R. Kandathi, A. Nayak, R. Torres Jul 2004

In many engineering applications, we have to combine probabilistic and interval errors. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, and correlation with other measurements. In environmental measurements, we often only know the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data. Such modifications are described in this paper.


Real-Time Algorithms For Statistical Analysis Of Interval Data, Berlin Wu, Hung T. Nguyen, Vladik Kreinovich Oct 2003

When we have only interval ranges [xi] of sample values x1,...,xn, what is the interval [V] of possible values for the variance V of these values? There are quadratic-time algorithms for computing the exact lower bound V- on the variance of interval data, and for computing the exact upper bound V+ under reasonable, easily verifiable conditions. The problem is that in real life, we often make additional measurements. In traditional statistics, if we get a new measurement result, we can update the value of the variance in constant time. In contrast, previously known algorithms for processing interval data required that, once a new data …
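
The constant-time update for traditional (point-valued) statistics that the abstract contrasts with the interval case can be illustrated with Welford's running-variance algorithm; the sketch below shows that standard algorithm, not the real-time interval algorithm developed in the report.

```python
# Sketch of the contrast drawn above: in traditional statistics, a new measurement
# updates the variance in O(1) time (Welford's algorithm shown here); the report's
# contribution is analogous real-time updates for *interval* data.

class RunningVariance:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def add(self, x):                      # O(1) per new measurement
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):                    # population variance V of the data seen so far
        return self.m2 / self.n if self.n else 0.0

rv = RunningVariance()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    rv.add(x)
print(rv.variance)   # 4.0 for this classic example
```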