
Departmental Technical Reports (CS)

Interval uncertainty

Articles 1 - 27 of 27


Why It Is Important To Precisiate Goals, Olga Kosheleva, Vladik Kreinovich, Hung T. Nguyen Mar 2015


After Zadeh and Bellman explained how to optimize a function under fuzzy constraints, there have been many successful applications of this optimization. However, in many practical situations, it turns out to be more efficient to precisiate the objective function before performing optimization. In this paper, we provide a possible explanation for this empirical fact.


Minimax Portfolio Optimization Under Interval Uncertainty, Meng Yuan, Xu Lin, Junzo Watada, Vladik Kreinovich Jan 2015


In the 1950s, Markowitz proposed to combine different investment instruments to design a portfolio that either maximizes the expected return under constraints on volatility (risk) or minimizes the risk for a given expected return. Markowitz's formulas are still widely used in financial practice. However, these formulas assume that we know the exact values of the expected return and variance for each instrument, and that we know the exact covariance of every two instruments. In practice, we only know these values with some uncertainty. Often, we only know bounds on these values -- i.e., we only know the intervals …
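
As a toy illustration of the minimax idea (not the paper's actual algorithm; the numbers and the grid search below are made up for this sketch), the following Python snippet maximizes the guaranteed expected return of a two-asset long-only portfolio when returns, volatilities, and correlation are only known within bounds:

```python
import numpy as np

r_lo = np.array([0.04, 0.07])      # lower bounds on expected returns
sig_hi = np.array([0.10, 0.20])    # upper bounds on standard deviations
rho_hi = 0.3                       # upper bound on the correlation
risk_cap = 0.12                    # maximal acceptable portfolio st. dev.

best_w, best_ret = None, -np.inf
for w1 in np.linspace(0.0, 1.0, 101):
    w = np.array([w1, 1.0 - w1])
    worst_ret = w @ r_lo           # long-only: worst case at lower endpoints
    worst_var = ((w * sig_hi) ** 2).sum() \
        + 2.0 * rho_hi * w[0] * w[1] * sig_hi[0] * sig_hi[1]
    if np.sqrt(worst_var) <= risk_cap and worst_ret > best_ret:
        best_w, best_ret = w, worst_ret

print("weights:", best_w, "guaranteed expected return:", best_ret)
```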


Interval Computations And Interval-Related Statistical Techniques: Estimating Uncertainty Of The Results Of Data Processing And Indirect Measurements, Vladik Kreinovich Dec 2014


In many practical situations, we only know the upper bound Δ on the measurement error Δx: |Δx| ≤ Δ. In other words, we only know that the measurement error lies in the interval [−Δ, Δ]. The traditional approach is to assume that Δx is uniformly distributed on [−Δ, Δ]. In some situations, however, this approach underestimates the error of indirect measurements. It is therefore desirable to process this interval uncertainty directly. Such "interval computations" methods have been developed since the 1950s. In this paper, we provide a brief overview of related algorithms and results.
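
For readers unfamiliar with interval computations, here is a minimal sketch (not a specific algorithm from the overview) of how guaranteed bounds propagate through data processing; note that such naive evaluation always encloses the true range but may overestimate it:

```python
class Interval:
    """A value known only to lie in [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))
    def __repr__(self):
        return f"[{self.lo:g}, {self.hi:g}]"

# Indirect measurement y = a*b - c, each input measured to within 0.1:
a = Interval(1.9, 2.1)   # 2.0 +/- 0.1
b = Interval(2.9, 3.1)   # 3.0 +/- 0.1
c = Interval(0.9, 1.1)   # 1.0 +/- 0.1
print(a * b - c)         # guaranteed enclosure of the true value of y
```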


Optimizing Computer Representation And Computer Processing Of Epistemic Uncertainty For Risk-Informed Decision Making: Finances Etc., Vladik Kreinovich, Nitaya Buntao, Olga Kosheleva Apr 2012


Uncertainty is usually gauged by using standard statistical characteristics: mean, variance, correlation, etc. Then, we use the known values of these characteristics (or the known bounds on these values) to select a decision. Sometimes, it becomes clear that the selected characteristics do not always describe a situation well; then, other known (or new) characteristics are proposed. A good example is the description of volatility in finance: it started with variance, and now many competing descriptions exist, each with its own advantages and limitations.

In such situations, a natural idea is to come up with characteristics tailored to specific application areas: e.g., …


Estimating Probability Of Failure Of A Complex System Based On Inexact Information About Subsystems And Components, With Potential Applications To Aircraft Maintenance, Vladik Kreinovich, Christelle Jacob, Didier Dubois, Janette Cardoso, Martine Ceberio, Ildar Batyrshin Jun 2011

Estimating Probability Of Failure Of A Complex System Based On Partial Information About Subsystems And Components, With Potential Applications To Aircraft Maintenance, Christelle Jacob, Didier Dubois, Janette Cardoso, Martine Ceberio, Vladik Kreinovich May 2011

Towards Faster Estimation Of Statistics And ODEs Under Interval, P-Box, And Fuzzy Uncertainty: From Interval Computations To Rough Set-Related Computations, Vladik Kreinovich Mar 2011

Towards A Fast, Practical Alternative To Joint Inversion Of Multiple Datasets: Model Fusion, Omar Ochoa, Aaron A. Velasco, Christian Servin Oct 2010

Estimating Information Amount Under Uncertainty: Algorithmic Solvability And Computational Complexity, Vladik Kreinovich, Gang Xiang Jan 2010


Measurement results (and, more generally, estimates) are never absolutely accurate: there is always uncertainty, and the actual value x is, in general, different from the estimate X. Sometimes we know the probabilities of different values of the estimation error dx = X - x; sometimes we only know the interval of possible values of dx; sometimes we have interval bounds on the cdf of dx. To compare different measuring instruments, it is desirable to know which of them brings more information -- i.e., it is desirable to gauge the amount of information. For probabilistic uncertainty, this amount of information is described by Shannon's entropy; …
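
As a rough illustration of "gauging the amount of information" (hedged: the log2-of-distinguishable-values measure for intervals below is a common textbook convention, not necessarily the definition used in the paper):

```python
import math

def shannon_entropy(probs):
    """Entropy (bits) of a discrete distribution -- probabilistic case."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def interval_bits(lo, hi, eps):
    """Bits needed to localize a value in [lo, hi] to accuracy eps."""
    return math.log2((hi - lo) / eps)

print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5 bits
print(interval_bits(-0.1, 0.1, 0.01))      # ~4.32 bits
```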


A Broad Prospective On Fuzzy Transforms: From Gauging Accuracy Of Quantity Estimates To Gauging Accuracy And Resolution Of Measuring Physical Fields, Vladik Kreinovich, Irina Perfilieva Nov 2009


Fuzzy transform is a new type of function transform that has been successfully used in different applications. In this paper, we provide a broad perspective on fuzzy transforms. Specifically, we show that the fuzzy transform naturally appears when, in addition to measurement uncertainty, we also encounter another type of uncertainty -- localization uncertainty: the measured value may come not only from the desired location x, but also from nearby locations.
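
A minimal sketch of the direct fuzzy transform, assuming the standard triangular fuzzy partition (as in Perfilieva's formulation); the grid and the sample signal are arbitrary choices for illustration:

```python
import numpy as np

def ftransform(x, y, nodes):
    """Direct F-transform: F[k] is the A_k-weighted mean of y over x,
    where the A_k are triangular membership functions centered at the
    nodes and forming a fuzzy partition of [nodes[0], nodes[-1]]."""
    F = []
    for k, c in enumerate(nodes):
        left = nodes[k - 1] if k > 0 else c
        right = nodes[k + 1] if k < len(nodes) - 1 else c
        A = np.zeros_like(x)
        if c > left:                       # rising edge of A_k
            m = (x >= left) & (x <= c)
            A[m] = (x[m] - left) / (c - left)
        if right > c:                      # falling edge of A_k
            m = (x >= c) & (x <= right)
            A[m] = (right - x[m]) / (right - c)
        F.append((A * y).sum() / A.sum())
    return np.array(F)

x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x)                  # arbitrary sample signal
print(ftransform(x, y, np.linspace(0.0, 1.0, 6)))
```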


Model Fusion Under Probabilistic And Interval Uncertainty, With Application To Earth Sciences, Omar Ochoa, Aaron A. Velasco, Christian Servin, Vladik Kreinovich Nov 2009


One of the most important studies in the earth sciences is that of the Earth's interior structure. There are many sources of data for Earth tomography models: first-arrival passive seismic data (from actual earthquakes), first-arrival active seismic data (from seismic experiments), gravity data, and surface waves. Currently, each of these datasets is processed separately, resulting in several different Earth models that have specific coverage areas, different spatial resolutions, and varying degrees of accuracy. These models often provide complementary geophysical information on the Earth's structure (P and S wave velocity structure).

Combining the information derived from each requires a joint …


Quantum Computations Techniques For Gauging Reliability Of Interval And Fuzzy Data, Luc Longpre, Christian Servin, Vladik Kreinovich Jul 2009


In traditional interval computations, we assume that the interval data correspond to guaranteed interval bounds, and that fuzzy estimates provided by experts are correct. In practice, neither measuring instruments nor experts are 100% reliable: we may have estimates which are "way off", i.e., intervals which do not contain the actual values at all. Usually, we know the percentage of such unreliable outlier measurements. However, it is desirable to check that the reliability of the actual data is indeed within the given percentage. The problem of checking (gauging) this reliability is, in general, NP-hard; in reasonable cases, there …


Towards Neural-Based Understanding Of The Cauchy Deviate Method For Processing Interval And Fuzzy Uncertainty, Vladik Kreinovich, Hung T. Nguyen Jan 2009


One of the most efficient techniques for processing interval and fuzzy data is a Monte-Carlo type technique of Cauchy deviates that uses Cauchy distributions. This technique is mathematically valid, but somewhat counterintuitive. In this paper, following the ideas of Paul Werbos, we provide a natural neural network explanation for this technique.
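
A sketch of the Cauchy-deviate technique as it is commonly described in the interval-computations literature (the rescaling by the largest deviate, which keeps the simulated points inside the interval box, follows that common description; the toy function and bounds are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

def cauchy_halfwidth(f, X, Delta, n_sim=2000):
    """Estimate Delta_y = sum_i |df/dx_i| * Delta_i by Cauchy deviates."""
    fX = f(X)
    c = np.empty(n_sim)
    for k in range(n_sim):
        r = np.tan(np.pi * (rng.random(len(X)) - 0.5))  # standard Cauchy
        M = np.max(np.abs(r))              # rescale to stay inside the box
        c[k] = M * (f(X + Delta * r / M) - fX)   # ~ Cauchy(0, Delta_y)
    # Maximum-likelihood scale of a centered Cauchy sample: solve
    # sum_k 1 / (1 + (c_k / s)^2) = n_sim / 2 by bisection on s.
    lo, hi = 1e-12, float(np.max(np.abs(c)))
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if np.sum(1.0 / (1.0 + (c / mid) ** 2)) < n_sim / 2:
            lo = mid                       # s too small: increase it
        else:
            hi = mid
    return 0.5 * (lo + hi)

f = lambda x: x[0] * x[1] + np.sin(x[2])   # toy indirect measurement
X = np.array([1.0, 2.0, 0.5])              # measured values
Delta = np.array([0.05, 0.05, 0.02])       # error bounds |dx_i| <= Delta_i
print(cauchy_halfwidth(f, X, Delta))
# linearized ground truth: 2*0.05 + 1*0.05 + cos(0.5)*0.02 ~ 0.168
```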


Trade-Off Between Sample Size And Accuracy: Case Of Measurements Under Interval Uncertainty, Hung T. Nguyen, Olga Kosheleva, Vladik Kreinovich, Scott Ferson Jun 2008


In many practical situations, we are not satisfied with the accuracy of the existing measurements. There are two possible ways to improve the measurement accuracy:

first, instead of a single measurement, we can make repeated measurements; the additional information coming from these additional measurements can improve the accuracy of the result of this series of measurements;

second, we can replace the current measuring instrument with a more accurate one; correspondingly, we can use a more accurate (and more expensive) measurement procedure provided by a measuring lab -- e.g., a procedure that includes the use of a higher quality reagent.

In …
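
A small numerical illustration of this trade-off (made-up numbers, not from the paper): averaging n readings shrinks the random error roughly as 1/sqrt(n), but a systematic error within the instrument's interval bound does not average out, so beyond some n only a better instrument helps:

```python
import numpy as np

rng = np.random.default_rng(1)
true_value, sigma, bias = 10.0, 0.5, 0.3   # |bias| <= Delta = 0.4, unknown

for n in (1, 10, 100, 10000):
    readings = true_value + bias + sigma * rng.standard_normal(n)
    print(f"n={n:6d}  |mean - true| = {abs(readings.mean() - true_value):.3f}")
# The error stalls near |bias| = 0.3: past some n, repetitions stop
# helping and only a more accurate instrument reduces the error further.
```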


Application-Motivated Combinations Of Fuzzy, Interval, And Probability Approaches, And Their Use In Geoinformatics, Bioinformatics, And Engineering, Vladik Kreinovich May 2008


Most data processing techniques traditionally used in scientific and engineering practice are statistical. These techniques are based on the assumption that we know the probability distributions of measurement errors, etc. In practice, we often do not know the distributions; we only know the bound D on the measurement accuracy -- hence, after we get the measurement result X, the only information that we have about the actual (unknown) value x of the measured quantity is that x belongs to the interval [X - D, X + D]. Techniques for data processing under such interval uncertainty are called interval computations; these …


Propagation And Provenance Of Probabilistic And Interval Uncertainty In Cyberinfrastructure-Related Data Processing And Data Fusion, Paulo Pinheiro Da Silva, Aaron A. Velasco, Martine Ceberio, Christian Servin, Matthew G. Averill, Nicholas Ricky Del Rio, Luc Longpre, Vladik Kreinovich Nov 2007


In the past, communications were much slower than computations. As a result, researchers and practitioners collected different data into huge databases kept at single locations such as NASA and the US Geological Survey. At present, communications are so much faster that it is possible to keep different databases at different locations, and to automatically select, transform, and collect relevant data when necessary. The corresponding cyberinfrastructure is actively used in many applications. It drastically enhances scientists' ability to discover, reuse, and combine a large number of resources, e.g., data and services.

Because of this importance, it is desirable to be able to …


Why Intervals? Why Fuzzy Numbers? Towards A New Justification, Vladik Kreinovich Apr 2007


The purpose of this paper is to present a new characterization of the set of all intervals (and of the corresponding set of fuzzy numbers). This characterization is based on several natural properties useful in mathematical modeling; the main such property is the need to be able to combine (fuse) several pieces of knowledge.


Interval Approach To Preserving Privacy In Statistical Databases: Related Challenges And Algorithms Of Computational Statistics, Luc Longpre, Gang Xiang, Vladik Kreinovich, Eric Freudenthal Mar 2007


In many practical situations, it is important to store large amounts of data and to be able to statistically process the data. A large part of the data is confidential, so while we welcome statistical data processing, we do not want to reveal sensitive individual data. If we allow researchers to ask all kinds of statistical queries, this can lead to violations of people's privacy. A surefire way to avoid these privacy violations is to store ranges of values (e.g., between 40 and 50 for age) instead of the actual values. This idea solves the privacy problem, but it leads …


For Piecewise Smooth Signals, L1 Method Is The Best Among Lp: An Interval-Based Justification Of An Empirical Fact, Vladik Kreinovich, Arnold Neumaier Dec 2006


Traditional engineering techniques use the Least Squares method (i.e., in mathematical terms, the l2-norm) to process data. It is known that in many practical situations, lp-methods with p ≠ 2 lead to better results, and in different practical situations, different values of p are optimal. It is known that in several situations where we need to reconstruct a piecewise smooth signal, the empirically optimal value of p is close to 1. In this paper, we provide a new interval-based theoretical explanation for this empirical fact.
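
A toy illustration of the effect the paper explains (not its actual argument): when a window of samples straddles a jump of a piecewise-constant signal, the l1-optimal constant (the median) keeps the majority level, while the l2-optimal constant (the mean) smears the edge:

```python
import numpy as np

window = np.array([1.0, 1.0, 1.0, 1.0, 5.0, 5.0])   # jump inside the window
print("l2 fit (mean):  ", window.mean())      # ~2.33: the edge is smeared
print("l1 fit (median):", np.median(window))  # 1.0: the edge is preserved
```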


Interval-Based Robust Statistical Techniques For Non-Negative Convex Functions With Application To Timing Analysis Of Computer Chips, Michael Orshansky, Wei-Shen Wang, Gang Xiang, Vladik Kreinovich Jan 2006


In chip design, one of the main objectives is to decrease the clock cycle; however, the existing approaches to timing analysis under uncertainty are based on fundamentally restrictive assumptions. Statistical timing analysis techniques assume that the full probability distribution of timing uncertainty is available; in reality, complete distribution information is often unavailable. Additionally, the existing alternative of treating uncertainty as interval-based, or affine, is limited, since it cannot handle probabilistic information in principle. In this paper, a fundamentally new paradigm for describing timing uncertainty is proposed as a way to consistently and rigorously handle partially available descriptions of …


Monte-Carlo-Type Techniques For Processing Interval Uncertainty, And Their Potential Engineering Applications, Vladik Kreinovich, J. Beck, Carlos M. Ferregut, A. Sanchez, George R. Keller, Matthew G. Averill, Scott A. Starks Dec 2005


In engineering applications, we need to make decisions under uncertainty. Traditionally, engineering uses statistical methods -- methods that assume that we know the probability distributions of the uncertain parameters. Usually, we can safely linearize the dependence of the desired quantities y (e.g., stress at different structural points) on the uncertain parameters x(i), thus enabling sensitivity analysis. Often, the number n of uncertain parameters is huge, so sensitivity analysis requires a lot of computation time. To speed up the processing, we propose to use special Monte-Carlo-type simulations.


Combining Interval, Probabilistic, And Fuzzy Uncertainty: Foundations, Algorithms, Challenges -- An Overview, Vladik Kreinovich, David J. Berleant, Scott Ferson, Weldon A. Lodwick Nov 2005


Since the 1960s, many algorithms have been designed to deal with interval uncertainty. In the last decade, there has been a lot of progress in extending these algorithms to the case when we have a combination of interval and probabilistic uncertainty. We provide an overview of related algorithms, results, and remaining open problems.


Towards Combining Probabilistic And Interval Uncertainty In Engineering Calculations: Algorithms For Computing Statistics Under Interval Uncertainty, And Their Computational Complexity, Vladik Kreinovich, Gang Xiang, Scott A. Starks, Luc Longpre, Martine Ceberio, Roberto Araiza, J. Beck, R. Kandathi, A. Nayak, R. Torres, J. Hajagos Jun 2005


In many engineering applications, we have to combine probabilistic and interval uncertainty. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, and correlation with other measurements. In environmental measurements, we often only know the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data.

In this paper, we provide a survey of algorithms for computing various statistics under interval uncertainty and their computational complexity. The survey includes both known …
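
To make the underlying computational problem concrete, here is a brute-force sketch (illustrative only): the range of the sample mean follows from the interval endpoints, and since the sample variance is convex in each x(i), its maximum is attained at a vertex of the box -- findable for tiny n by enumerating all 2^n endpoint combinations. This exponential blow-up is exactly what polynomial-time algorithms and NP-hardness results for such problems address; the minimum variance needs a different method and is not computed here.

```python
import itertools
import numpy as np

data = [(2.0, 2.2), (2.9, 3.1), (3.9, 4.3)]   # interval measurements
lo = np.array([a for a, _ in data])
hi = np.array([b for _, b in data])

print("range of the mean:", (lo.mean(), hi.mean()))
# Variance is convex in each x(i), so its maximum over the box of
# intervals is attained at some combination of endpoints:
print("max variance:",
      max(np.var(np.array(v)) for v in itertools.product(*data)))
```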


Monte-Carlo-Type Techniques For Processing Interval Uncertainty, And Their Geophysical And Engineering Applications, Matthew G. Averill, Kate C. Miller, George R. Keller, Vladik Kreinovich, Jan Beck, Roberto Araiza, Roberto Torres, Scott A. Starks Dec 2004


To determine the geophysical structure of a region, we measure seismic travel times and reconstruct velocities at different depths from this data. There are several algorithms for solving this inverse problem, but these algorithms do not tell us how accurate these reconstructions are.

The traditional approach to accuracy estimation assumes that the measurement errors are independent and normally distributed. Problem: the resulting accuracies are not in line with geophysical intuition. Reason: a typical error occurs when we miss the first arrival of the seismic wave; this error is not normally distributed (it is bounded by the wave period T) and not independent.

Typically, all we know …


Towards Combining Probabilistic And Interval Uncertainty In Engineering Calculations, Scott A. Starks, Vladik Kreinovich, Luc Longpre, Martine Ceberio, Gang Xiang, Roberto Araiza, J. Beck, R. Kandathi, A. Nayak, R. Torres Jul 2004


In many engineering applications, we have to combine probabilistic and interval errors. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, and correlation with other measurements. In environmental measurements, we often only know the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data. Such modifications are described in this paper.


Sensitivity Analysis Of Neural Control, Chin-Wang Tao, Hung T. Nguyen, J. T. Yao, Vladik Kreinovich Oct 2003


We provide explicit formulas that describe how sensitive the resulting signal of a neural network is to the measurement errors with which we measure the inputs.
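
The explicit formulas themselves are in the paper; the sketch below only illustrates the kind of bound such formulas provide, |Delta y| <= sum_i |dy/dx(i)| * Delta(i) under linearization, for a small network with arbitrary made-up weights and a numerical gradient:

```python
import numpy as np

W1 = np.array([[0.5, -1.2], [0.8, 0.3]])   # made-up hidden-layer weights
b1 = np.array([0.1, -0.2])
w2 = np.array([1.5, -0.7])                 # made-up output weights

def net(x):
    return w2 @ np.tanh(W1 @ x + b1)

x0 = np.array([0.4, -0.1])                 # measured inputs
Delta = np.array([0.05, 0.02])             # bounds on input errors

eps = 1e-6                                 # numerical gradient of the output
grad = np.array([(net(x0 + eps * e) - net(x0 - eps * e)) / (2 * eps)
                 for e in np.eye(2)])
print("linearized output error bound:", np.abs(grad) @ Delta)
```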


Fair Division Under Interval Uncertainty, Ronald R. Yager, Vladik Kreinovich Jun 1998


It is often necessary to divide a certain amount of money between n participants, i.e., to assign, to each participant, a certain portion w(i) >= 0 of the whole sum (so that w(1)+...+w(n) = 1). In some situations, the fairness requirements uniquely determine these "weights" w(i). However, in other situations, general considerations do not allow us to uniquely determine these weights; we only know the intervals [w-(i), w+(i)] of possible fair weights. We show that natural fairness requirements enable us to choose unique weights from these intervals; as a result, we present an algorithm for fair division under interval uncertainty.
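
A sketch consistent with the abstract (hedged: the paper's own algorithm may differ in details): pick the same relative position alpha inside every weight interval, choosing alpha so that the selected weights sum to 1:

```python
def fair_weights(w_lo, w_hi):
    """Pick w(i) = w_lo(i) + alpha*(w_hi(i) - w_lo(i)) with one common
    alpha in [0, 1] chosen so that the weights sum to 1."""
    s_lo, s_hi = sum(w_lo), sum(w_hi)
    assert s_lo <= 1.0 <= s_hi, "no selection from the intervals sums to 1"
    alpha = 0.0 if s_hi == s_lo else (1.0 - s_lo) / (s_hi - s_lo)
    return [lo + alpha * (hi - lo) for lo, hi in zip(w_lo, w_hi)]

w = fair_weights([0.2, 0.3, 0.1], [0.5, 0.6, 0.4])
print(w, sum(w))   # each w(i) inside its interval, total exactly 1
```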