Open Access. Powered by Scholars. Published by Universities.®

Engineering Commons

Articles 1 - 12 of 12

Full-Text Articles in Engineering

Addressing The Challenges Of DCOP-Based Decision-Making Algorithms In Modern Power Systems, Luis Daniel Ramirez Burgueno May 2023

Open Access Theses & Dissertations

Natural disasters have been determined to be the leading cause of power outages, causing not only huge economic losses, but also the interruption of crucial welfare activities and the rise of security concerns. Because of the latter, decision-making considering grid modernization, power system economics, and system resiliency has been a crucial theme in power systems research. The need to better withstand catastrophic events and to reduce dependency on bulky generating units has propelled the development and better management of behind-the-meter generation or distributed energy resources (DERs). DERs can assist the grid in different ways, not only by meeting energy demand …


A Probabilistic Creep Constitutive Model For Creep Deformation, Damage, And Rupture, Md Abir Hossain Jan 2020

Open Access Theses & Dissertations

Industrial Gas Turbine (IGT), aeroengine, and Gen IV nuclear components under in-service conditions at various stresses and temperatures are susceptible to time-dependent creep deformation and creep-induced failure. Such failure phenomena are exacerbated by randomness in material properties, oscillating loading conditions, and other sources of uncertainty. Physically based probabilistic creep modeling is highly sought after by alloy designers. The objective of this study is to develop and validate a probabilistic creep-damage model incorporating multiple sources of uncertainty to replace traditional deterministic and empirical decision-based modeling. In this study, the deterministic Sine-hyperbolic (Sinh) creep-damage model …
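As a rough sketch of the Monte Carlo idea behind such probabilistic creep modeling: sample the scattered material parameters of a sine-hyperbolic creep law and propagate them through the rate equation. The parameter values, their scatter, and the law's exact form below are illustrative assumptions, not the thesis's calibrated model.

```python
import math
import random

def sinh_creep_rate(stress, A, sigma_s):
    """Minimum creep strain rate from a sine-hyperbolic law (illustrative form)."""
    return A * math.sinh(stress / sigma_s)

def monte_carlo_creep_rates(stress, n_samples=10_000, seed=0):
    """Propagate randomness in the material parameters A and sigma_s
    through the creep law, returning the sampled rate distribution."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_samples):
        A = rng.lognormvariate(math.log(1e-7), 0.2)  # hypothetical scatter in A
        sigma_s = rng.gauss(100.0, 5.0)              # MPa, hypothetical scatter
        rates.append(sinh_creep_rate(stress, A, sigma_s))
    return rates

rates = monte_carlo_creep_rates(stress=150.0)
mean_rate = sum(rates) / len(rates)
```

The resulting sample of creep rates (rather than a single deterministic value) is what allows probabilistic statements about deformation and rupture life.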


Decision Making For Dynamic Systems Under Uncertainty: Predictions And Parameter Recomputations, Leobardo Valera Jan 2018

Open Access Theses & Dissertations

In this Thesis, we are interested in making decisions based on a model of a dynamic system. On one hand, we want to know how the corresponding dynamic phenomenon unfolds under different input parameters (simulations). These simulations might help researchers design devices with better performance than the current ones. On the other hand, we are also interested in predicting the behavior of the dynamic system based on knowledge of the phenomenon, in order to prevent undesired outcomes. Finally, this Thesis is concerned with the identification of parameters of dynamic systems that ensure a specific performance or behavior.

Understanding the …


Combining Interval And Probabilistic Uncertainty In Engineering Applications, Andrew Martin Pownuk Jan 2016

Open Access Theses & Dissertations

In many practical applications, we process measurement results and expert estimates. Measurements and expert estimates are never absolutely accurate: their results are slightly different from the actual (unknown) values of the corresponding quantities. It is therefore desirable to analyze how this measurement and estimation inaccuracy affects the results of data processing. There exist numerous methods for estimating the accuracy of the results of data processing under different models of measurement and estimation inaccuracy: probabilistic, interval, and fuzzy. To be useful in engineering applications, these methods should provide accurate estimates of the resulting uncertainty, should not take too much computation time, …
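A minimal sketch of the interval model mentioned above: each measured quantity is replaced by a guaranteed enclosure, and the enclosure is propagated through the computation. The `Interval` class and the measurement values are illustrative, not from the dissertation.

```python
class Interval:
    """Closed interval [lo, hi] with the basic arithmetic needed to
    propagate measurement uncertainty through a computation."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        # The product's range is attained at the endpoint combinations.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# A measured length of 2.0 +/- 0.1 and width of 3.0 +/- 0.1:
length = Interval(1.9, 2.1)
width = Interval(2.9, 3.1)
area = length * width  # guaranteed enclosure of the true area
```

Unlike a probabilistic estimate, the resulting interval is a worst-case bound: the true area is certain to lie in `area` whenever the inputs lie in their intervals.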


Optimizing Computer Representation And Computer Processing Of Epistemic Uncertainty For Risk-Informed Decision Making: Finances Etc., Vladik Kreinovich, Nitaya Buntao, Olga Kosheleva Apr 2012

Departmental Technical Reports (CS)

Uncertainty is usually gauged by using standard statistical characteristics: mean, variance, correlation, etc. Then, we use the known values of these characteristics (or the known bounds on these values) to select a decision. Sometimes, it becomes clear that the selected characteristics do not always describe a situation well; then other known (or new) characteristics are proposed. A good example is description of volatility in finance: it started with variance, and now many descriptions are competing, all with their own advantages and limitations.

In such situations, a natural idea is to come up with characteristics tailored to specific application areas: e.g., …


Estimating Information Amount Under Uncertainty: Algorithmic Solvability And Computational Complexity, Vladik Kreinovich, Gang Xiang Jan 2010

Departmental Technical Reports (CS)

Measurement results (and, more generally, estimates) are never absolutely accurate: there is always an uncertainty, the actual value x is, in general, different from the estimate X. Sometimes, we know the probability of different values of the estimation error dx=X-x, sometimes, we only know the interval of possible values of dx, sometimes, we have interval bounds on the cdf of dx. To compare different measuring instruments, it is desirable to know which of them brings more information - i.e., it is desirable to gauge the amount of information. For probabilistic uncertainty, this amount of information is described by Shannon's entropy; …
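A small illustration of the two uncertainty models mentioned above. For probabilistic uncertainty, the information amount is Shannon's entropy; for pure interval uncertainty, the worst case is a uniform distribution over the interval, whose entropy (after discretizing to a given resolution) is the log of the number of cells. The function names and the resolution value are assumptions for the sketch.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (bits) of a discrete distribution of estimation errors."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def interval_entropy(width, resolution):
    """Worst-case entropy of interval uncertainty: uniform over the
    interval, discretized into cells of the given resolution."""
    return math.log2(width / resolution)

# An error known only to lie in [-0.5, 0.5], measured at 0.01 resolution:
h_interval = interval_entropy(1.0, 0.01)   # log2(100) bits
h_fair_coin = shannon_entropy([0.5, 0.5])  # 1 bit
```

The instrument whose error distribution has the smaller entropy leaves less remaining uncertainty, i.e., brings more information.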


Optimal Sensor Placement In Environmental Research: Designing A Sensor Network Under Uncertainty, Aline James, Craig Tweedie, Tanja Magoc, Vladik Kreinovich, Martine Ceberio Dec 2009

Departmental Technical Reports (CS)

One of our main challenges in meteorology and environmental research is that in many important remote areas, sensor coverage is sparse, leaving us with numerous blind spots. Placement and maintenance of sensors in these areas are expensive. It is therefore desirable to find out how, within a given budget, we can design a sensor network that provides the largest amount of useful information while minimizing the size of the "blind spot" areas not covered by the sensors.

This problem is very difficult even to formulate in …
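One common way to attack such budget-constrained coverage problems, sketched here purely for illustration (the report's own formulation may differ), is a greedy heuristic: repeatedly place a sensor at the candidate site that covers the most still-uncovered area.

```python
def greedy_placement(candidate_sites, budget):
    """Greedy heuristic for sensor placement under a budget.
    candidate_sites maps a site name to the set of area cells it covers;
    each placed sensor costs one unit of budget."""
    uncovered = set().union(*candidate_sites.values())
    chosen = []
    sites = dict(candidate_sites)
    while budget > 0 and uncovered and sites:
        # Pick the site covering the most currently uncovered cells.
        best = max(sites, key=lambda s: len(sites[s] & uncovered))
        if not sites[best] & uncovered:
            break
        chosen.append(best)
        uncovered -= sites.pop(best)
        budget -= 1
    return chosen, uncovered  # uncovered = remaining "blind spot"

# Toy example with three candidate sites and a budget of two sensors:
sites = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6, 7}}
chosen, blind = greedy_placement(sites, budget=2)
```

For coverage-style objectives this greedy rule is a standard baseline; it does not account for the measurement uncertainty that the report addresses, which is where the harder formulation comes in.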


Model Fusion Under Probabilistic And Interval Uncertainty, With Application To Earth Sciences, Omar Ochoa, Aaron A. Velasco, Christian Servin, Vladik Kreinovich Nov 2009

Departmental Technical Reports (CS)

One of the most important studies in the earth sciences is that of the Earth's interior structure. There are many sources of data for Earth tomography models: first-arrival passive seismic data (from actual earthquakes), first-arrival active seismic data (from seismic experiments), gravity data, and surface waves. Currently, each of these datasets is processed separately, resulting in several different Earth models that have specific coverage areas, different spatial resolutions, and varying degrees of accuracy. These models often provide complementary geophysical information on earth structure (P and S wave velocity structure).

Combining the information derived from each requires a joint …
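For the probabilistic part of such fusion, a standard building block is inverse-variance (least-squares) weighting of independent estimates of the same quantity; the sketch below is this generic technique, not the paper's specific algorithm, and the velocity values are hypothetical.

```python
def fuse(estimates):
    """Least-squares fusion of independent estimates of one quantity.
    estimates is a list of (value, standard deviation) pairs; each value
    is weighted by the inverse of its variance."""
    weights = [1.0 / s**2 for _, s in estimates]
    x = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    s = (1.0 / sum(weights)) ** 0.5  # std. dev. of the fused estimate
    return x, s

# Two models' P-wave velocity estimates at the same point (hypothetical):
fused_v, fused_s = fuse([(6.1, 0.2), (6.4, 0.1)])
```

The fused standard deviation is always smaller than that of the most accurate input, which is the statistical payoff of combining the datasets; extending this to interval (non-probabilistic) uncertainty is the paper's harder case.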


Maximum Entropy In Support Of Semantically Annotated Datasets, Paulo Pinheiro Da Silva, Vladik Kreinovich, Christian Servin Sep 2008

Departmental Technical Reports (CS)

One of the important problems of the semantic web is checking whether two datasets describe the same quantity. The existing solution to this problem is to use these datasets' ontologies to deduce that the datasets indeed represent the same quantity. However, even when ontologies seem to confirm the identity of the two corresponding quantities, it is still possible that in reality, we deal with somewhat different quantities. A natural way to check the identity is to compare the numerical values of the measurement results: if they are close (within measurement errors), then most probably we deal with the same quantity; else …
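The closeness test described above can be sketched as a simple k-sigma consistency check on two measurement results; the threshold k, the independence assumption, and the sample numbers are illustrative.

```python
def same_quantity(x1, s1, x2, s2, k=2.0):
    """Heuristic check that two measurement results describe the same
    quantity: their difference should be within k standard deviations
    of the combined error (assuming independent measurement errors)."""
    return abs(x1 - x2) <= k * (s1**2 + s2**2) ** 0.5

# 9.81 +/- 0.02 vs 9.79 +/- 0.03: difference within combined error.
ok = same_quantity(9.81, 0.02, 9.79, 0.03)
# 9.81 +/- 0.02 vs 9.60 +/- 0.03: difference far exceeds combined error.
suspicious = same_quantity(9.81, 0.02, 9.60, 0.03)
```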


Propagation And Provenance Of Probabilistic And Interval Uncertainty In Cyberinfrastructure-Related Data Processing And Data Fusion, Paulo Pinheiro Da Silva, Aaron A. Velasco, Martine Ceberio, Christian Servin, Matthew G. Averill, Nicholas Ricky Del Rio, Luc Longpre, Vladik Kreinovich Nov 2007

Departmental Technical Reports (CS)

In the past, communications were much slower than computations. As a result, researchers and practitioners collected different data into huge databases located at a single location such as NASA and US Geological Survey. At present, communications are so much faster that it is possible to keep different databases at different locations, and automatically select, transform, and collect relevant data when necessary. The corresponding cyberinfrastructure is actively used in many applications. It drastically enhances scientists' ability to discover, reuse and combine a large number of resources, e.g., data and services.

Because of this importance, it is desirable to be able to …


A New Cauchy-Based Black-Box Technique For Uncertainty In Risk Analysis, Vladik Kreinovich, Scott Ferson Feb 2003

Departmental Technical Reports (CS)

Uncertainty is very important in risk analysis. A natural way to describe this uncertainty is to describe a set of possible values of each unknown quantity (this set is usually an interval), plus any additional information that we may have about the probability of different values within this set. Traditional statistical techniques deal with situations in which we have complete information about the probabilities; in real life, however, we often have only partial information about them. We therefore need methods for handling such partial information in risk analysis. Several such techniques have been presented, often on …
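The Cauchy-based technique of the title can be sketched as follows: perturb the inputs of a black-box function f by Cauchy deviates scaled to the input intervals, and recover the half-width of the output interval from the scale of the (again Cauchy-distributed) output deviations. The test function and interval widths below are illustrative, and the sketch assumes f is roughly linear over the input box.

```python
import math
import random

def cauchy_halfwidth(f, x, deltas, n=2000, seed=1):
    """Monte Carlo estimate of the output half-width of f(x) when each
    input x_i is known only to within +/- deltas_i."""
    rng = random.Random(seed)
    y0 = f(x)
    d = []
    for _ in range(n):
        # tan of a uniform angle is a standard Cauchy deviate
        pert = [xi + di * math.tan(math.pi * (rng.random() - 0.5))
                for xi, di in zip(x, deltas)]
        d.append(f(pert) - y0)
    # For linearizable f, the d_k are Cauchy with scale
    # Delta = sum_i |df/dx_i| * deltas_i; estimate Delta by maximum
    # likelihood, solving sum 1/(1+(d_k/Delta)^2) = n/2 by bisection.
    lo, hi = 1e-12, max(abs(v) for v in d)
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if sum(1.0 / (1.0 + (v / mid) ** 2) for v in d) < n / 2:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# For f(x) = 2*x1 + 3*x2, the exact half-width is 2*0.1 + 3*0.2 = 0.8:
est = cauchy_halfwidth(lambda v: 2 * v[0] + 3 * v[1], [1.0, 1.0], [0.1, 0.2])
```

The appeal of the approach is that the number of calls to f does not grow with the number of inputs, unlike exhaustive endpoint enumeration.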


Why 95% And Two Sigma? A Theoretical Justification For An Empirical Measurement Practice, Hung T. Nguyen, Vladik Kreinovich, Chin-Wang Tao Jul 2000

Departmental Technical Reports (CS)

The probability p(k) that the value of a random variable is far away from the mean (e.g., further than k standard deviations away) is so small that this possibility can often be safely ignored. It is desirable to select a k for which the dependence of the probability p(k) on the distribution is the smallest possible. Empirically, this dependence is smallest for k between 1.5 and 2.5. In this paper, we give a theoretical explanation for this empirical result.
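The distribution dependence of p(k) can be tabulated directly. The sketch below computes P(|X - mu| > k*sigma) for three unit-variance distributions; the closed forms are standard, but the choice of distributions and the function names are illustrative.

```python
import math

def p_normal(k):
    """Two-sided tail P(|X - mu| > k*sigma) for a normal distribution."""
    return math.erfc(k / math.sqrt(2))

def p_laplace(k):
    """Laplace distribution with unit variance (scale b = 1/sqrt(2))."""
    return math.exp(-k * math.sqrt(2))

def p_uniform(k):
    """Uniform on [-sqrt(3), sqrt(3)], which has unit variance."""
    return max(0.0, 1.0 - k / math.sqrt(3))

# Tabulate p(k) for each distribution at a few values of k:
for k in (1.0, 1.5, 2.0, 2.5):
    print(f"k={k}: normal={p_normal(k):.4f}, "
          f"laplace={p_laplace(k):.4f}, uniform={p_uniform(k):.4f}")
```

At k = 2, for instance, the normal tail is about 4.6% and the Laplace tail about 5.9%, the familiar neighborhood of the "95%, two sigma" rule that the paper sets out to justify.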