Open Access. Powered by Scholars. Published by Universities.®
 Keyword

 Interval uncertainty (2)
 Interval Uncertainty (2)
 Grid Integration Issues (1)
 hp-FEM (1)
 Distributed Computing (1)

 Discrete maximum principles (1)
 3D images (1)
 Ellipsoids (1)
 Hypothesis testing (1)
 Feasible Algorithms (1)
 Expert knowledge (1)
 Formal Specification (1)
 Autonomous agents (1)
 Data processing (1)
 Bioinformatics (1)
 Dependence between the inputs (1)
 Computational complexity (1)
 Application Programming Interface (1)
 Fast Fourier transform (1)
 Dependence (1)
 Data Communication (1)
 Copula (1)
 Correlation (1)
 Expert systems (1)
 Constraints (1)
 Geoinformatics (1)
 Entropy (1)
 Help systems (1)
 Data Access and Management (1)
 Fuzzy sets (1)
Articles 1 - 30 of 52
Full-Text Articles in Computer Engineering
Estimating Variance Under Interval And Fuzzy Uncertainty: Case Of Hierarchical Estimation, Gang Xiang, Vladik Kreinovich
Departmental Technical Reports (CS)
No abstract provided.
Fast Algorithms For Computing Statistics Under Interval And Fuzzy Uncertainty, And Their Applications, Gang Xiang, Vladik Kreinovich
Departmental Technical Reports (CS)
In many engineering applications, we have to combine probabilistic, interval, and fuzzy uncertainty. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, and correlation with other measurements. In environmental measurements, we can often only measure the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data.
In this paper, we provide a brief survey of algorithms for computing various statistics under interval (and fuzzy) uncertainty and of their applications ...
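The survey's hierarchical algorithms are not reproduced here, but the core problem — bounding the range of the population variance when each input is only known to lie in an interval — can be illustrated with a small brute-force sketch. All names and the grid resolution below are illustrative; the vertex enumeration for the upper bound is justified because variance is convex in (x1,...,xn), while the gridded lower bound is only an approximation.

```python
import itertools

def variance(xs):
    """Population variance V = (1/n) * sum((x - E)^2)."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

def variance_range(intervals, grid=25):
    """Enclose the range of V over the box of intervals.

    Upper bound: V is convex in (x1,...,xn), so its maximum over a box is
    attained at a vertex -- enumerate all 2^n endpoint combinations (exact).
    Lower bound: scan a grid inside the box (approximate only).
    """
    v_max = max(variance(v) for v in itertools.product(*intervals))
    axes = [[lo + (hi - lo) * k / (grid - 1) for k in range(grid)]
            for lo, hi in intervals]
    v_min = min(variance(v) for v in itertools.product(*axes))
    return v_min, v_max
```

For three inputs each in [0, 1], this reports a minimum of 0 (all inputs equal) and an exact maximum of 2/9; the point of the surveyed algorithms is to avoid the exponential vertex enumeration used here.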
For Piecewise Smooth Signals, L1 Method Is The Best Among Lp: An Interval-Based Justification Of An Empirical Fact, Vladik Kreinovich, Arnold Neumaier
Departmental Technical Reports (CS)
Traditional engineering techniques use the Least Squares method (i.e., in mathematical terms, the l2-norm) to process data. It is known that in many practical situations, lp-methods with p ≠ 2 lead to better results. In different practical situations, different values of p are optimal. It is known that in several situations when we need to reconstruct a piecewise smooth signal, the empirically optimal value of p is close to 1. In this paper, we provide a new interval-based theoretical explanation for this empirical fact.
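The paper's interval-based justification is theoretical, but the practical difference between l2 and l1 fitting is easy to see on a toy location-estimation problem: the l2 estimate is the mean, which a sample past a jump drags away, while the l1 estimate is the median, which stays on the dominant flat piece. The brute-force grid search below is purely illustrative.

```python
def lp_estimate(data, p, grid=2001):
    """Brute-force argmin over c of sum(|x - c| ** p): the lp 'center'
    of the data (p=2 gives the mean, p=1 the median)."""
    lo, hi = min(data), max(data)
    best_c, best_cost = lo, float("inf")
    for k in range(grid):
        c = lo + (hi - lo) * k / (grid - 1)
        cost = sum(abs(x - c) ** p for x in data)
        if cost < best_cost:
            best_c, best_cost = c, cost
    return best_c

# Four samples on a flat piece of a piecewise-constant signal, one past a jump:
data = [0.0, 0.0, 0.0, 0.1, 10.0]
```

Here `lp_estimate(data, 2)` returns the mean 2.02, far off the flat piece, while `lp_estimate(data, 1)` returns the median 0.0 — the robustness that makes l1 attractive for piecewise smooth signals.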
How To Take Into Account Dependence Between The Inputs: From Interval Computations To Constraint-Related Set Computations, With Potential Applications To Nuclear Safety, Bio And Geosciences, Martine Ceberio, Scott Ferson, Vladik Kreinovich, Sanjeev Chopra, Gang Xiang, Adrian Murguia, Jorge Santillan
Departmental Technical Reports (CS)
In many real-life situations, in addition to knowing the intervals Xi of possible values of each variable xi, we also know additional restrictions on the possible combinations of xi; in this case, the set X of possible values of x=(x1,...,xn) is a proper subset of the original box X1 x ... x Xn. In this paper, we show how to take into account this dependence between the inputs when computing the range of a function f(x1,...,xn).
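The paper's constraint-related set computations are more sophisticated than this, but the basic effect — a known dependence between the inputs shrinking the range of f below what the box X1 x ... x Xn alone would give — can be sketched by gridding the box and discarding points that violate the constraint. All names and the grid resolution are illustrative, and gridding only approximates the true range.

```python
import itertools

def constrained_range(f, box, constraint, grid=101):
    """Approximate the range of f over a box of intervals, keeping only
    grid points that satisfy an extra constraint relating the inputs."""
    axes = [[lo + (hi - lo) * k / (grid - 1) for k in range(grid)]
            for lo, hi in box]
    vals = [f(*pt) for pt in itertools.product(*axes) if constraint(*pt)]
    return min(vals), max(vals)
```

For f(x, y) = x*y on the box [0,1] x [0,1], the unconstrained maximum is 1, but adding the dependence x + y <= 1 shrinks the maximum to 0.25 (attained at x = y = 0.5).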
What Users Say They Want In Documentation, David G. Novick, Karen Ward
Departmental Papers (CS)
While earlier work provided a partial view of users’ preferences about manuals, for most users in most work contexts the important question remains open: What do users want in documentation? This paper presents the results of a study in which a diverse cross-section of 25 users was interviewed in depth about their needs and preferences with respect to software help systems, whether printed or online, that they use at work. The study’s participants indicated that they preferred documentation, whether online or printed, that is easy to navigate, provides explanations at an appropriate level of technical detail, enables finding as ...
Why Don't People Read The Manual?, David G. Novick, Karen Ward
Departmental Papers (CS)
Few users of computer applications seek help from the documentation. This paper reports the results of an empirical study of why this is so and examines how, in real work, users solve their usability problems. Based on in-depth interviews with 25 subjects representing a varied cross-section of users, we find that users do avoid using both paper and online help systems. Few users have paper manuals for the most heavily used applications, but none complained about their lack. Online help is more likely to be consulted than paper manuals, but users are equally likely to report that they solve their ...
How To Efficiently Process Uncertainty Within A Cyberinfrastructure Without Sacrificing Privacy And Confidentiality, Luc Longpre, Vladik Kreinovich
Departmental Technical Reports (CS)
In this paper, we propose a simple solution to the problem of estimating the uncertainty of the results of applying a black-box algorithm, without sacrificing the privacy and confidentiality of the algorithm.
Entropy Conserving Probability Transforms And The Entailment Principle, Ronald R. Yager, Vladik Kreinovich
Departmental Technical Reports (CS)
Our main result here is the development of a general procedure for transforming an initial probability distribution into a new probability distribution in such a way that the resulting distribution has entropy at least as great as the original distribution. A significant aspect of our approach is that it makes use of Zadeh's entailment principle, which is itself a general procedure for going from an initial possibility distribution to a new possibility distribution so that the resulting possibility distribution has an uncertainty at least as great as that of the original.
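The paper develops a general procedure; as a minimal illustration of an entropy-nondecreasing probability transform, mixing a distribution with the uniform one always works: entropy is concave and maximized by the uniform distribution, so H(lam*p + (1-lam)*u) >= lam*H(p) + (1-lam)*H(u) >= H(p). The sketch below is this simple special case, not the authors' construction.

```python
import math

def entropy(p):
    """Shannon entropy in bits (0 * log 0 taken as 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mix_with_uniform(p, lam):
    """Entropy-nondecreasing transform: lam * p + (1 - lam) * uniform.
    By concavity of entropy, the result satisfies H(result) >= H(p)."""
    n = len(p)
    return [lam * pi + (1 - lam) / n for pi in p]
```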
Two Etudes On Combining Probabilistic And Interval Uncertainty: Processing Correlations And Measuring Loss Of Privacy, Martine Ceberio, Gang Xiang, Luc Longpre, Vladik Kreinovich, Hung T. Nguyen, Daniel Berleant
Departmental Technical Reports (CS)
In many practical situations, there is a need to combine interval and probabilistic uncertainty. The need for such a combination leads to two types of problems: (1) how to process the given combined uncertainty, and (2) how to gauge the amount of uncertainty and, as a related question, how best to decrease this uncertainty. In our research, we concentrate on these two types of problems. In this paper, we present two examples that illustrate how the corresponding problems can be solved.
Comppknots: A Framework For Parallel Prediction And Comparison Of Rna Secondary Structures With Pseudoknots, Trilce Estrada, Abel Licon, Michela Taufer
Departmental Technical Reports (CS)
Codes for RNA structure prediction based on energy minimization are usually very time- and resource-intensive. For this reason, several codes have been significantly simplified: some are unable to predict complex secondary structures such as pseudoknots, others can only handle structures of reduced length, and still others can only predict some elementary and simple pseudoknots. Each of the existing codes has its strengths and weaknesses. Providing scientists with tools that combine the strengths of the several codes is a worthwhile objective.
To address this need, we present compPknots ...
Canica: An Ide For The Java Modeling Language, Angelica B. Perez, Yoonsik Cheon, Ann Q. Gates
Departmental Technical Reports (CS)
Canica is an integrated development environment for the Java Modeling Language (JML), a formal behavioral interface specification language for Java. The JML distribution includes several support tools, such as a syntax checker, a compiler, and a document generator, and there are several third-party tools available for JML. However, most of these tools are command-line-based and work in isolation. Canica glues and streamlines these tools to provide a GUI-based, integrated environment for JML; for example, it automates unit testing completely, from test data generation to test execution and test result determination. In this paper, we describe the key features of Canica ...
Finding Least Expensive Tolerance Solutions And Least Expensive Tolerance Revisions: Algorithms And Computational Complexity, Inna Pivkina, Vladik Kreinovich
Departmental Technical Reports (CS)
For an engineering design, tolerances in design parameters are selected so that within these tolerances, we guarantee the desired functionality. Feasible algorithms are known for solving the corresponding computational problems: the problem of finding tolerances that guarantee the given functionality, and the problem of checking whether given tolerances guarantee this functionality.
In this paper, we show that in many practical problems, the problem of choosing the optimal tolerances can also be solved by a feasible algorithm. We prove that a slightly different problem of finding the optimal tolerance revision is, in contrast, computationally difficult (namely, NP-hard). We also show that ...
A Model-Based Workflow Approach For Scientific Applications, Leonardo Salayandia, Paulo Pinheiro Da Silva, Ann Q. Gates, Alvaro Rebellon
Departmental Technical Reports (CS)
Productive design of scientific workflows often depends on the effectiveness of the communication between the discipline domain experts and computer scientists, including their ability to share their specific needs in the design of the workflow. Discipline domain experts and computer scientists, however, tend to have distinct needs for designing workflows, including terminology, level of abstraction, and the workflow aspects that should be included in the design. This paper discusses the use of a Model-Based Workflow (MBW) approach as an abstract way to specify workflows that reconciles the needs of domain and computer scientists. Within the context of GEON, an NSF cyberinfrastructure for ...
Wavesurfer: A Tool For Sound Analysis, Ernesto Medina, Thamar Solorio
Departmental Technical Reports (CS)
Researchers in the Interactive Systems Group at UTEP have been using a research tool called Didi for some time now. It was originally designed to be easily adaptable. This tool has proven to be adaptable as it has been changed by different researchers to suit particular needs. As a result, multiple versions of the program exist. In addition to this, the tool only works in Linux and has grown quite a bit. To solve these problems, the different versions could have been consolidated into one program and modified to produce a version that worked on other platforms, or another program ...
Workflow-Driven Ontologies: An Earth Sciences Case Study, Leonardo Salayandia, Paulo Pinheiro Da Silva, Ann Q. Gates, Flor Salcedo
Departmental Technical Reports (CS)
A goal of the Geosciences Network (GEON) is to develop cyberinfrastructure that will allow earth scientists to discover, access, integrate, and disseminate knowledge in distributed environments such as the Web, changing the way in which research is conducted. The earth sciences community has begun the complex task of creating ontologies to support this effort. A challenge is to coalesce the needs of the earth scientists, who wish to capture knowledge in a particular discipline through the ontology, with the need to leverage the knowledge to support technology that will facilitate computation, for example, by helping the composition of services. This ...
For Complex Intervals, Exact Range Computation Is NP-Hard Even For Single Use Expressions (Even For The Product), Martine Ceberio, Vladik Kreinovich, Guenter Mayer
Departmental Technical Reports (CS)
One of the main problems of interval computations is to compute the range Y of the given function f(x1,...,xn) under interval uncertainty. Interval computations started with the invention of straightforward interval computations, when we simply replace each elementary arithmetic operation in the code for f with the corresponding operation from interval arithmetic. In general, this technique only leads to an enclosure for the desired range, but in the important case of single use expressions (SUE), in which each variable occurs only once, we get the exact range. Thus, for SUE expressions, there exists a feasible (polynomial-time) algorithm for ...
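The straightforward interval computations described above are easy to sketch. The toy class below (names illustrative) implements interval addition, subtraction, and multiplication; on a single-use expression such as x*y it returns the exact range, while on an expression with a repeated variable such as x - x it returns only an enclosure — the classic dependency problem.

```python
class Interval:
    """Toy interval arithmetic for straightforward interval computations."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # subtract the other interval's UPPER end from our LOWER end, etc.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))
```

With x = Interval(1, 2), the single-use product x * Interval(3, 4) gives the exact range [3, 8], but x - x gives the enclosure [-1, 1] even though the true range is [0, 0].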
Fast Computation Of Exact Ranges Of Symmetric Convex And Concave Functions Under Interval Uncertainty, Gang Xiang
Departmental Technical Reports (CS)
Many statistical characteristics y=f(x1,...,xn) are continuous, symmetric, and either concave or convex; examples include the population variance V=(1/n)*(x1^2+...+xn^2) - E^2 (where E=(1/n)*(x1+...+xn)), Shannon's entropy S=-p1*log(p1)-...-pn*log(pn), and many other characteristics. In practice, we often only know the intervals Xi=[xi-,xi+] that contain the (unknown) actual inputs xi. Since different values xi from Xi lead, in general, to different values of f(x1,...,xn), we need to find the range Y={f(x1,...,xn): x1 in X1, ..., xn in Xn}, i ...
Detecting Filled Pauses In Tutorial Dialogs, Gaurav Garg, Nigel Ward
Departmental Technical Reports (CS)
As dialog systems become more capable, users tend to talk more spontaneously and less formally. Spontaneous speech includes features which convey information about the user's state. In particular, filled pauses, such as `um' and `uh', can indicate that the user is having trouble, wants more time, wants to hold the floor, or is uncertain. In this paper we present a first study of the acoustic characteristics of filled pauses in tutorial dialogs. We show that in this domain, as in other domains, filled pauses typically have flat pitch and fairly constant energy. We present a simple algorithm based on ...
The Effectiveness Of Threshold-Based Scheduling Policies On Boinc Projects, Trilce Estrada, David A. Flores, Michela Taufer, Patricia J. Teller, Andre Kerstens, David P. Anderson
Departmental Technical Reports (CS)
Several scientific projects use BOINC (Berkeley Open Infrastructure for Network Computing) to perform large-scale simulations using volunteers' computers (workers) across the Internet. In general, the scheduling of tasks in BOINC uses a First-Come-First-Serve policy, and no attention is paid to workers' past performance, such as whether they have tended to perform tasks promptly and correctly. In this paper we use SimBA, a discrete-event simulator of BOINC applications, to study new threshold-based scheduling strategies for BOINC projects that use availability and reliability metrics to classify workers and distribute tasks according to this classification. We show that if availability and reliability thresholds ...
How To Measure Loss Of Privacy, Luc Longpre, Vladik Kreinovich
Departmental Technical Reports (CS)
To compare different schemes for preserving privacy, it is important to be able to gauge loss of privacy. Since loss of privacy means that we gain new information about a person, it seems natural to measure the loss of privacy by the amount of information that we gained. However, this seemingly natural definition is not perfect: when we originally know that a person's salary is between $10,000 and $20,000 and later learn that the salary is between $10,000 and $15,000, we gained exactly as much information (one bit) as when we learn that the salary ...
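The information-based measure discussed in the abstract is simple to write down: if an outsider's set of possible values for a quantity shrinks, the information gained is the log2 of the reduction factor. The sketch below (names illustrative) reproduces the salary example — and, as the abstract points out, this count of bits is the same one bit regardless of which half of the range was ruled out, which is exactly why the definition is imperfect.

```python
import math

def privacy_loss_bits(size_before, size_after):
    """Bits of information gained when the set of values an outsider
    considers possible shrinks from size_before to size_after."""
    return math.log2(size_before / size_after)
```

Learning that a salary known to be in [$10,000, $20,000] is actually in [$10,000, $15,000] gives privacy_loss_bits(10000, 5000) = 1.0 bit.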
Measuring Privacy Loss In Statistical Databases, Vinod Chirayath, Luc Longpre, Vladik Kreinovich
Departmental Technical Reports (CS)
Protection of privacy in databases has become increasingly important. While a number of techniques have been proposed to query databases while preserving the privacy of individual records, very little has been done to define a measure of how much privacy is lost after a statistical release. We suggest a definition based on information theory. Intuitively, the privacy loss is proportional to how much the descriptional complexity of a record decreases given the statistical release. There are some problems with this basic definition, and we suggest ways to address them.
Automatic Labeling Of Back Channels, Udit Sajjanhar, Nigel Ward
Departmental Technical Reports (CS)
In dialog, the proper production of backchannels is an important way for listeners to cooperate with speakers. Developing quantitative models of this process is important both for improving spoken dialog systems and for teaching second language learners. An essential step for the development of such models is labeling all backchannels in corpora of human-human dialogs. Currently this is done by hand. This report describes a method for automatically identifying backchannels in conversation corpora, using only the patterns of speech and silence by the speaker and the listener in the local context. Tested on Arabic, Spanish, and English, this method identifies ...
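The report's method uses only speech/silence patterns in the local context; its exact features and thresholds are not reproduced here. The sketch below is a hypothetical rule of thumb in the same spirit: flag a listener utterance as a likely backchannel if it is short and overlaps, or closely follows, the other party's speech. The 1.0 s and 0.5 s thresholds are invented for illustration.

```python
def label_backchannels(utterances, max_dur=1.0, gap=0.5):
    """Hypothetical backchannel labeler. utterances is a list of
    (speaker, start, end) tuples, times in seconds; an utterance is
    flagged when it is short and overlaps (or starts within `gap`
    seconds after) speech by the other party."""
    labels = []
    for who, start, end in utterances:
        short = (end - start) <= max_dur
        near_other = any(w != who and s <= end and e >= start - gap
                         for w, s, e in utterances)
        labels.append(short and near_other)
    return labels
```

A short "uh-huh" produced while the other party holds the floor is flagged; long turns are not.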
Statistical Data Processing Under Interval Uncertainty: Algorithms And Computational Complexity, Vladik Kreinovich
Departmental Technical Reports (CS)
No abstract provided.
Unimodality, Independence Lead To NP-Hardness Of Interval Probability Problems, Daniel J. Berleant, Olga Kosheleva, Vladik Kreinovich, Hung T. Nguyen
Departmental Technical Reports (CS)
In many real-life situations, we only have partial information about probabilities. This information is usually described by bounds on moments, on probabilities of certain events, etc., i.e., by characteristics c(p) which are linear in terms of the unknown probabilities pj. If we know interval bounds on some such characteristics ai <= ci(p) <= Ai, and we are interested in a characteristic c(p), then we can find the bounds on c(p) by solving a linear programming problem.
In some situations, we also have additional conditions on the probability distribution, e.g., we may know that the two variables x1 and x2 are independent, or that the distribution of x1 and x2 is unimodal. We show that adding each of these conditions makes the corresponding interval ...
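Before the extra conditions make the problem NP-hard, the basic computation described above is optimization of a linear characteristic subject to linear interval bounds. The paper formulates this as linear programming; the sketch below instead brute-forces a coarse grid on the probability simplex for a 3-point support, which is enough to see the bounds emerge. All names and the example numbers are illustrative.

```python
import itertools

def bounds_on(objective, constraint, n_outcomes, n_steps=20):
    """Bound a linear characteristic c(p) over all probability vectors p
    (restricted to a coarse simplex grid) satisfying an interval
    constraint. Brute-force stand-in for the LP formulation."""
    best_lo, best_hi = float("inf"), float("-inf")
    for ks in itertools.product(range(n_steps + 1), repeat=n_outcomes):
        if sum(ks) != n_steps:
            continue  # keep only grids summing to probability 1
        p = [k / n_steps for k in ks]
        if constraint(p):
            v = objective(p)
            best_lo, best_hi = min(best_lo, v), max(best_hi, v)
    return best_lo, best_hi

# Support {0, 1, 2}: the mean is known to lie in [0.8, 1.2]; bound P(X >= 1).
lo, hi = bounds_on(lambda p: p[1] + p[2],
                   lambda p: 0.8 <= p[1] + 2 * p[2] <= 1.2,
                   n_outcomes=3)
```

Here the bounds come out as [0.4, 1.0]: the minimum puts the required mass at the outcome 2, while the maximum puts all mass at 1.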
Helping Students To Become Researchers: What We Can Gain From Russian Experience, Vladik Kreinovich, Ann Q. Gates, Olga Kosheleva
Departmental Technical Reports (CS)
The fact that many internationally renowned scientists have been educated in the former Soviet Union shows that many features of its education system were good. In this session, we briefly describe the features that we believe to have been good. Some of these features have already been successfully implemented (with appropriate adjustments) in affinity research groups at the Department of Computer Science of the University of Texas at El Paso (UTEP).
Growth Rates Under Interval Uncertainty, Janos Hajagos, Vladik Kreinovich
Departmental Technical Reports (CS)
For many real-life systems, ranging from financial to population-related to medical, the dynamics is described by a system of linear equations. For such systems, the growth rate lambda can be determined as the largest eigenvalue of the corresponding matrix A. In many practical situations, we only know the components of the matrix A with interval (or fuzzy) uncertainty. In such situations, it is desirable to find the range of possible values of lambda. In this paper, we propose an efficient algorithm for computing lambda for a practically important case when all the components of the matrix A are non-negative.
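For the non-negative case highlighted in the abstract, a key Perron-Frobenius fact is that the largest eigenvalue of a non-negative matrix is monotone in its entries, so over an interval matrix [A-, A+] the range of lambda is simply [lambda(A-), lambda(A+)]. The sketch below uses this fact with plain power iteration; it is an illustration of the idea, not necessarily the paper's algorithm, and it assumes non-negative matrices with a dominant eigenvalue.

```python
def spectral_radius(A, iters=200):
    """Largest eigenvalue of an entrywise non-negative matrix, by power
    iteration with max-norm scaling (assumes a dominant eigenvalue)."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return lam

def growth_rate_range(A_lower, A_upper):
    """Range of the growth rate over an interval matrix with non-negative
    entries: the Perron eigenvalue is monotone in the entries, so the
    range endpoints come from the two endpoint matrices."""
    return spectral_radius(A_lower), spectral_radius(A_upper)
```

For the all-ones matrix versus the all-twos matrix, this returns the eigenvalue range [2, 4].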
Interval And Fuzzy Techniques In BusinessRelated Computer Security: Intrusion Detection, Privacy Protection, Mohsen Beheshti, Jianchao Han, Luc Longpre, Scott A. Starks, J. Ivan Vargas, Gang Xiang
Departmental Technical Reports (CS)
E-commerce plays an increasingly large role in business. As a result, business-related computer security becomes more and more important. In this talk, we describe how interval and fuzzy techniques can help in solving related computer security problems.
Bilinear Models From System Approach Justified For Classification, With Potential Applications To Bioinformatics, Richard Aló, Francois Modave, Vladik Kreinovich, David Herrera, Xiaojing Wang
Departmental Technical Reports (CS)
When we do not know the dynamics of a complex system, it is natural to use common sense to get a reasonable first approximation, which turns out to be a bilinear dynamics. Surprisingly, for classification problems, a similar bilinear approximation turns out to be unexpectedly accurate. In this paper, we provide an explanation for this accuracy.
Economics Of Engineering Design Under Interval (And Fuzzy) Uncertainty: Case Study Of Building Design, Carlos M. Ferregut, Jan Beck, Araceli Sanchez, Vladik Kreinovich
Departmental Technical Reports (CS)
One of the main objectives of engineering design is to find a design that is the cheapest among all designs that satisfy given constraints. Most of the constraints must be satisfied under all possible values within certain ranges. Checking all possible combinations of values is often very time-consuming. In this paper, we propose a faster algorithm for checking such constraints.
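One standard way to avoid checking all combinations of parameter values, in the spirit of the interval techniques used throughout these reports, is to evaluate the constraint once in interval arithmetic: if the upper endpoint of the enclosure is already non-positive, the constraint holds for every combination at once. The constraint g(x, y) = x + y - 10 <= 0 below is a made-up example, not one from the paper, and this sketch is not the paper's faster algorithm.

```python
def iadd(a, b):
    """Interval addition: [a0, a1] + [b0, b1]."""
    return (a[0] + b[0], a[1] + b[1])

def certified_nonpositive(x, y):
    """Certify g = x + y - 10 <= 0 for ALL values in the ranges x and y
    with a single interval evaluation, instead of enumerating the
    combinations of endpoint values."""
    g = iadd(x, y)
    return g[1] - 10 <= 0  # check only the upper endpoint of the enclosure
```

With x in [1, 2] and y in [3, 4] the enclosure's upper endpoint is 6 - 10 = -4, so the constraint is certified for every combination in one evaluation.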
Topaz: A Firefox Protocol Extension For Gridftp Based On Data Flow Diagrams, Richard Zamudio, Daniel Catarino, Michela Taufer, Brent Stearn, Karan Bhatia
Departmental Technical Reports (CS)
As grid infrastructures mature, an increasing challenge is to provide end-user scientists with intuitive interfaces to computational services, data management capabilities, and visualization tools. The current approach used in a number of cyberinfrastructure projects is to leverage the capabilities of the Mozilla framework to provide rich end-user tools that seamlessly integrate with remote resources such as web/grid services and data repositories.
In this paper we apply rigorous software engineering tools, Data Flow Diagrams or DFDs, to guide the design, implementation, and performance analysis of Topaz, a GridFTP protocol extension to the Firefox browser. GridFTP servers, similar to FTP servers ...