Open Access. Powered by Scholars. Published by Universities.®
 Keyword

 Runtime assertion checking (3)
 Interval uncertainty (3)
 Java Modeling Language (JML) (2)
 Java Language (2)
 Design by contract (2)

 Usability (2)
 JML Language (2)
 Genetic algorithms (2)
 Probabilistic uncertainty (2)
 Assertion (1)
 Discrete nonnegative conservation (1)
 Dempster-Shafer approach (1)
 Branch coverage (1)
 Commonsense reasoning (1)
 JML language (1)
 Dialog system (1)
 Bioinformatics (1)
 Aviation (1)
 Call sequences (1)
 Computational complexity (1)
 Engineering calculations (1)
 Engineering applications (1)
 Elliptic problems (1)
 Domains (1)
 Environmental analysis (1)
 Benchmarking (1)
 Einstein-Bergmann approach to 5D models (1)
 5D geometry (1)
 Best-possible bounds (1)
 Compactification of extra dimensions (1)
Articles 1 - 30 of 50
Full-Text Articles in Computer Engineering
Monte-Carlo-Type Techniques For Processing Interval Uncertainty, And Their Potential Engineering Applications, Vladik Kreinovich, J. Beck, Carlos M. Ferregut, A. Sanchez, George R. Keller, Matthew G. Averill, Scott A. Starks
Departmental Technical Reports (CS)
In engineering applications, we need to make decisions under uncertainty. Traditionally, engineering uses statistical methods, which assume that we know the probability distribution of the different uncertain parameters. Usually, we can safely linearize the dependence of the desired quantities y (e.g., stress at different structural points) on the uncertain parameters xi, thus enabling sensitivity analysis. Often, the number n of uncertain parameters is huge, so sensitivity analysis requires a lot of computation time. To speed up the processing, we propose to use special Monte-Carlo-type simulations.
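The flavor of a Monte-Carlo-type method for interval uncertainty can be illustrated with the Cauchy-deviate technique described in related work by some of the same authors; the function name, sample count, and seed below are illustrative assumptions, not the paper's own code. For a linearized dependence dy = c1*dx1 + ... + cn*dxn, feeding Cauchy-distributed deviates with scales delta_i through f produces a dy that is Cauchy with scale Delta_y = |c1|*delta_1 + ... + |cn|*delta_n, so Delta_y can be read off as the median of |dy|:

```python
import math
import random

def interval_halfwidth_mc(f, x0, deltas, samples=20001, seed=0):
    """Estimate Delta_y, the half-width of the range of y = f(x), when each
    x_i is only known to lie in [x0_i - delta_i, x0_i + delta_i] and f is
    close to linear on that box.  Cauchy deviates are stable under linear
    combinations, so for linear f the simulated dy is Cauchy with scale
    sum(|c_i| * delta_i), and the median of |dy| recovers that scale.
    The cost is `samples` calls to f, independent of the dimension n."""
    rng = random.Random(seed)
    y0 = f(x0)
    abs_dy = []
    for _ in range(samples):
        # delta_i * tan(pi*(r - 1/2)) is a Cauchy deviate with scale delta_i
        dx = [d * math.tan(math.pi * (rng.random() - 0.5)) for d in deltas]
        abs_dy.append(abs(f([a + b for a, b in zip(x0, dx)]) - y0))
    abs_dy.sort()
    return abs_dy[len(abs_dy) // 2]   # sample median of |dy|
```

For the linear test function f(x) = 2*x[0] - 3*x[1] with unit half-widths, the exact half-width is |2| + |-3| = 5, and the estimate lands near 5 regardless of how many parameters there are.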
Population Variance Under Interval Uncertainty: A New Algorithm, Evgeny Dantsin, Vladik Kreinovich, Alexander Wolpert, Gang Xiang
Departmental Technical Reports (CS)
In statistical analysis of measurement results, it is often beneficial to compute the range [V] of the population variance V when we only know the intervals [Xi - Di, Xi + Di] of possible values of xi. In general, this problem is NP-hard; a polynomial-time algorithm is known for the case when the measurements are sufficiently accurate, i.e., when |Xi - Xj| >= (Di + Dj)/n for all i ≠ j. In this paper, we show that we can efficiently compute [V] under a weaker (and more general) condition |Xi - Xj| >= |Di - Dj|/n.
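To make the object of study concrete, here is a brute-force sketch of the variance range (not the paper's algorithm, which is polynomial-time under the stated condition; the function names and grid resolution are illustrative). The population variance is a convex quadratic, so its maximum over a box is attained at a vertex, while its minimum pulls all the points toward a common value m:

```python
from itertools import product

def population_variance(xs):
    """Mean of squared deviations from the mean."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

def variance_range(intervals, grid=201):
    """Enclose the range of the population variance V when each x_i is only
    known to lie in [lo_i, hi_i].  V is convex, so its maximum over the box
    is attained at one of the 2^n vertices (enumerated here: exponential,
    for illustration only).  At the minimum, each x_i is the clamp of a
    common value m to [lo_i, hi_i]; we scan m over a grid."""
    v_hi = max(population_variance(v) for v in product(*intervals))
    m_lo = min(lo for lo, _ in intervals)
    m_hi = max(hi for _, hi in intervals)
    v_lo = min(
        population_variance([min(max(m, lo), hi) for lo, hi in intervals])
        for m in (m_lo + (m_hi - m_lo) * k / (grid - 1) for k in range(grid))
    )
    return v_lo, v_hi
```

For intervals [0, 2] and [1, 3] the two values can coincide (e.g., both equal to 1.5), so the lower bound is 0, while the vertex (0, 3) yields the upper bound 2.25.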
Specifying And Checking Method Call Sequences Of Java Programs, Yoonsik Cheon, Ashaveena Perumandla
Departmental Technical Reports (CS)
In a pre- and postconditions-style specification, it is difficult to specify the allowed sequences of method calls, referred to as protocols. Protocols are essential properties of reusable object-oriented classes and application frameworks, and approaches based on pre- and postconditions, such as design by contract (DBC) and formal behavioral interface specification languages (BISLs), are being accepted as a practical and effective tool for describing precise interfaces of (reusable) program modules. We propose a simple extension to the Java Modeling Language (JML), a BISL for Java, to specify protocol properties in an intuitive and concise manner. The key idea ...
Optimal Choice Of Granularity In Commonsense Estimation: Why Half-Orders Of Magnitude, Jerry R. Hobbs, Vladik Kreinovich
Departmental Technical Reports (CS)
It has been observed that when people make crude estimates, they feel comfortable choosing between alternatives which differ by a half-order of magnitude (e.g., were there 100, 300, or 1,000 people in the crowd?), and less comfortable making a choice on a more detailed scale, with finer granules, or on a coarser scale (like 100 or 1,000). In this paper, we describe two models of choosing granularity in commonsense estimates, and we show that for both models, in the optimal granularity, the next estimate is 3-4 times larger than the previous one. Thus, these two optimization results ...
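The 3-4 factor matches the half-order-of-magnitude scale itself, since consecutive granules on such a scale differ by a factor of 10^(1/2). A one-line check:

```python
# Consecutive estimates on a half-order-of-magnitude scale (100, 300,
# 1000, ...) differ by a factor of 10 ** 0.5 ≈ 3.16, which is indeed
# between 3 and 4, matching the optimality result described above.
ratio = 10 ** 0.5
scale = [round(100 * ratio ** k) for k in range(4)]  # ≈ [100, 316, 1000, 3162]
```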
Combining Interval, Probabilistic, And Fuzzy Uncertainty: Foundations, Algorithms, Challenges - An Overview, Vladik Kreinovich, David J. Berleant, Scott Ferson, Weldon A. Lodwick
Departmental Technical Reports (CS)
Since the 1960s, many algorithms have been designed to deal with interval uncertainty. In the last decade, there has been a lot of progress in extending these algorithms to the case when we have a combination of interval and probabilistic uncertainty. We provide an overview of related algorithms, results, and remaining open problems.
Decision Making Beyond Arrow's "Impossibility Theorem", With The Analysis Of Effects Of Collusion And Mutual Attraction, Hung T. Nguyen, Olga Kosheleva, Vladik Kreinovich
Departmental Technical Reports (CS)
In 1951, K. J. Arrow proved that, under certain assumptions, it is impossible to have group decision making rules which satisfy reasonable conditions like symmetry. This Impossibility Theorem is often cited as a proof that reasonable group decision making is impossible.
We start our paper by remarking that Arrow's result only covers the situations when the only information we have about individual preferences is their binary preferences between the alternatives. If we follow the main ideas of modern decision making and game theory and also collect information about the preferences between lotteries (i.e., collect the utility values of ...
Interval-Based Robust Statistical Techniques For Non-Negative Convex Functions, With Application To Timing Analysis Of Computer Chips, Michael Orshansky, Wei-Shen Wang, Martine Ceberio, Gang Xiang
Departmental Technical Reports (CS)
In chip design, one of the main objectives is to decrease the clock cycle. At the design stage, this time is usually estimated by using worst-case (interval) techniques, in which we only use the bounds on the parameters that lead to delays. This analysis does not take into account that the probability of the worst-case values is usually very small; thus, the resulting estimates are overconservative, leading to unnecessary overdesign and underperformance of circuits. If we knew the exact probability distributions of the corresponding parameters, then we could use Monte-Carlo simulations (or the corresponding analytical techniques) to get the desired ...
A Fitness Function For Modular Evolutionary Testing Of Object-Oriented Programs, Yoonsik Cheon, Kim Myoung
Departmental Technical Reports (CS)
We show that encapsulation of states in object-oriented programs hinders the search for test data using evolutionary testing. In a well-designed object-oriented program the encapsulated or hidden state is accessible only through exported or public methods. As client code is oblivious to the internal state of a server object, no guidance is available to test the client code using evolutionary testing. In particular, it is difficult to determine the fitness or goodness of test data, as it may depend on the hidden internal state. However, evolutionary testing is a promising new approach whose effectiveness has been shown by several researchers ...
Quantum Versions Of K-CSP Algorithms: A First Step Towards Quantum Algorithms For Interval-Related Constraint Satisfaction Problems, Evgeny Dantsin, Alexander Wolpert, Vladik Kreinovich
Departmental Technical Reports (CS)
In many industrial engineering problems, we must select a design, select parameters of a process, or, in general, make a decision. Informally, this decision must be optimal, the best for the users. In traditional operations research, we assume that we know the objective function f(x) whose values describe the consequence of a decision x for the user. Optimization of well-defined functions is what started calculus in the first place: once we know the objective function f(x), we can use differentiation to find its maximum, e.g., as the point x at which the derivative of f with respect ...
Computing Mean And Variance Under Dempster-Shafer Uncertainty: Towards Faster Algorithms, Vladik Kreinovich, Gang Xiang, Scott Ferson
Departmental Technical Reports (CS)
In many real-life situations, we only have partial information about the actual probability distribution. For example, under Dempster-Shafer uncertainty, we only know the masses m1,...,mn assigned to different sets S1,...,Sn, but we do not know the distribution within each set Si. Because of this uncertainty, there are many possible probability distributions consistent with our knowledge; different distributions have, in general, different values of standard statistical characteristics such as mean and variance. It is therefore desirable, given a Dempster-Shafer knowledge base, to compute the ranges of possible values of mean E and of variance V.
In their recent paper ...
Usability Over Time, Valerie Mendoza, David G. Novick
Departmental Papers (CS)
Testing of usability could perhaps be more accurately described as testing of learnability. We know more about the problems of novice users than we know of the problems of experienced users. To understand how these problems differ, and to understand how usability problems change as users progress from novice to experienced, we conducted a longitudinal study of usability among middle-school teachers creating Web sites. The study looked at both the use of documentation and the use of the underlying software, tracking the causes and extent of user frustration over eight weeks. We validated a categorization scheme for frustration episodes. We found ...
Co-Generation Of Text And Graphics, David G. Novick, Brian Lowe
Departmental Papers (CS)
To reduce potential discrepancies between textual and graphical content in documentation, it is possible to produce both text and graphics from a single common source. One approach to co-generation of text and graphics uses a single logical specification; a second approach starts with a CAD-based representation and produces a corresponding textual account. This paper explores these two different approaches, reports the results of using prototypes embodying the approaches to represent simple figures, and discusses issues that were identified through use of the prototypes. While it appears feasible to co-generate text and graphics automatically, the process raises deep issues of design of ...
Root Causes Of Lost Time And User Stress In A Simple Dialog System, Nigel G. Ward, Anais G. Rivera, Karen Ward, David G. Novick
Departmental Papers (CS)
As a priority-setting exercise, we compared interactions between users and a simple spoken dialog system to interactions between users and a human operator. We observed usability events, places in which system behavior differed from human behavior, and for each we noted the impact, root causes, and prospects for improvement. We suggest some priority issues for research, involving not only such core areas as speech recognition and synthesis and language understanding and generation, but also less-studied topics such as adaptive or flexible timeouts, turn-taking, and speaking rate.
Towards A Cross-Platform Microbenchmark Suite For Evaluating Hardware Performance Counter Data, Maria G. Aguilera, Roberto Araiza, Thientam Pham, Patricia J. Teller
Departmental Technical Reports (CS)
As useful as performance counters are, the meaning of reported aggregate event counts is sometimes questionable. Questions arise due to unanticipated processor behavior, overhead associated with the interface, the granularity of the monitored code, hardware errors, and lack of standards with respect to event definitions. To explore these issues, we are conducting a sequence of studies using carefully crafted microbenchmarks that permit the accurate prediction of event counts and investigation of the differences between hardware-reported and predicted event counts. This paper presents the methodology employed, some of the microbenchmarks developed, and some of the information uncovered to date. The information ...
Optimization And Decision Making Under Interval And Fuzzy Uncertainty: Towards New Mathematical Foundations, Hung T. Nguyen, Vladik Kreinovich
Departmental Technical Reports (CS)
In many industrial engineering problems, we must select a design, select parameters of a process, or, in general, make a decision. Informally, this decision must be optimal, the best for the users. In traditional operations research, we assume that we know the objective function f(x) whose values describe the consequence of a decision x for the user. Optimization of well-defined functions is what started calculus in the first place: once we know the objective function f(x), we can use differentiation to find its maximum, e.g., as the point x at which the derivative of f with respect ...
On Quantum Versions Of Record-Breaking Algorithms For SAT, Evgeny Dantsin, Vladik Kreinovich, Alexander Wolpert
Departmental Technical Reports (CS)
It is well known that a straightforward application of Grover's quantum search algorithm makes it possible to solve SAT in O(2^(n/2)) steps. Ambainis (SIGACT News, 2004) observed that it is possible to use Grover's technique to similarly speed up a sophisticated algorithm for solving 3-SAT. In this note, we show that a similar speedup can be obtained for all major record-breaking algorithms for satisfiability. We also show that if we use Grover's technique only, then we cannot do better than a quadratic speedup.
From Intervals To Domains: Towards A General Description Of Validated Uncertainty, With Potential Applications To Geospatial And Meteorological Data, Vladik Kreinovich, Olga Kosheleva, Scott A. Starks, Kavitha Tupelly, Gracaliz P. Dimuro, Antonio C. Da Costa Rocha, Karen Villaverde
Departmental Technical Reports (CS)
When physical quantities xi are numbers, then the corresponding measurement accuracy can be usually represented in interval terms, and interval computations can be used to estimate the resulting uncertainty in y=f(x1,...,xn).
In some practical problems, we are interested in more complex structures such as functions, operators, etc. Examples: we may be interested in how the material strain depends on the applied stress, or in how a physical quantity such as temperature or velocity of sound depends on a 3D point.
For many such structures, there are ways to represent uncertainty, but usually, for each new structure, we ...
Interval-Type And Affine Arithmetic-Type Techniques For Handling Uncertainty In Expert Systems, With Applications To Geoinformatics And Computer Security, Martine Ceberio, Vladik Kreinovich, Sanjeev Chopra, Luc Longpre, Hung T. Nguyen, Bertram Ludaescher, Chitta Baral
Departmental Technical Reports (CS)
Expert knowledge consists of statements Sj (facts and rules). The expert's degree of confidence in each statement Sj can be described as a (subjective) probability (some probabilities are known to be independent). Examples: if we are interested in oil, we should look at seismic data (confidence 90%); a bank A trusts a client B, so if we trust A, we should trust B too (confidence 99%). If a query Q is deducible from facts and rules, what is our confidence p(Q) in Q? We can describe Q as a propositional formula F in terms of Sj; computing p ...
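With independent statements, the confidence p(Q) in a derivable query can in principle be computed by summing over the truth assignments of the propositional formula F; here is a minimal exponential-time sketch (the function name and the two-statement example are illustrative, and the brute-force enumeration is exactly what makes the general problem hard):

```python
from itertools import product

def query_probability(probs, formula):
    """p(Q) for a query expressed as a propositional formula over
    independent statements S_1..S_n, where probs[j] = p(S_j) and `formula`
    maps a tuple of truth values to True/False.  Enumerates all 2^n truth
    assignments, so this is exponential in n (illustration only)."""
    total = 0.0
    for bits in product([False, True], repeat=len(probs)):
        weight = 1.0
        for p, b in zip(probs, bits):
            weight *= p if b else 1.0 - p
        if formula(bits):
            total += weight
    return total

# Example: Q holds when both statements hold (confidences 90% and 99%).
p_q = query_probability([0.9, 0.99], lambda s: s[0] and s[1])  # ≈ 0.891
```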
Discrete Conservation Of Nonnegativity For Elliptic Problems Solved By The Hp-FEM, Pavel Solin, Thomas Vejchodsky, Roberto Araiza
Departmental Technical Reports (CS)
Most results related to discrete nonnegativity conservation principles (DNCP) for elliptic problems are limited to finite differences (FDM) and lowest-order finite element methods (FEM). In this paper we confirm that a straightforward extension to higher-order finite element methods (hp-FEM) in the classical sense is not possible. We formulate a weaker DNCP for the Poisson equation in one spatial dimension and prove it using an interval computing technique. Numerical experiments related to the extension of this result to 2D are presented.
Processing Educational Data: From Traditional Statistical Techniques To An Appropriate Combination Of Probabilistic, Interval, And Fuzzy Approaches, Olga Kosheleva, Martine Ceberio
Departmental Technical Reports (CS)
There are many papers that experimentally compare the effectiveness of different teaching techniques. Most of these papers use the traditional statistical approach to process the experimental results. The traditional statistical approach is well suited to numerical data, but often, what we are processing is either intervals (e.g., a grade of A means anything from 90 to 100) or fuzzy-type perceptions, words from natural language like "understood well" or "understood reasonably well". We show that the use of interval and fuzzy techniques leads to more adequate processing of educational data.
Interval Versions Of Statistical Techniques With Applications To Environmental Analysis, Bioinformatics, And Privacy In Statistical Databases, Vladik Kreinovich, Luc Longpre, Scott A. Starks, Gang Xiang, Jan Beck, Raj Kandathi, Asis Nayak, Scott Ferson, Janos Hajagos
Departmental Technical Reports (CS)
In many areas of science and engineering, it is desirable to estimate statistical characteristics (mean, variance, covariance, etc.) under interval uncertainty. For example, we may want to use the measured values x(t) of a pollution level in a lake at different moments of time to estimate the average pollution level; however, we do not know the exact values x(t): e.g., if one of the measurement results is 0, this simply means that the actual (unknown) value of x(t) can be anywhere between 0 and the detection limit DL. We must therefore modify the existing statistical algorithms ...
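For the simplest characteristic, the mean, the required modification can be stated exactly, because the mean is nondecreasing in every argument; a sketch under the detection-limit model just described (the readings and DL value are made up for illustration):

```python
def interval_mean(intervals):
    """Exact range of the arithmetic mean when each x_i is only known to
    lie in [lo_i, hi_i]: since the mean is nondecreasing in every x_i, the
    endpoints of its range are the means of the interval endpoints."""
    n = len(intervals)
    return (sum(lo for lo, _ in intervals) / n,
            sum(hi for _, hi in intervals) / n)

# A reading of 0 means "below the detection limit DL": model it as [0, DL].
DL = 0.5
readings = [0.8, 0.0, 1.2, 0.0]          # 0.0 = below detection limit
ivals = [(x, x) if x > 0 else (0.0, DL) for x in readings]
mean_lo, mean_hi = interval_mean(ivals)  # ≈ (0.5, 0.75)
```

Other characteristics, such as the variance, are not monotone in the x_i, which is where the harder algorithms surveyed in this line of work come in.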
Which Fuzzy Logic Is The Best: Pragmatic Approach (And Its Theoretical Analysis), Vladik Kreinovich, Hung T. Nguyen
Departmental Technical Reports (CS)
In this position paper, we argue that when we are looking for the best fuzzy logic, we should specify in what sense it is "the best", and that we get different fuzzy logics as "the best" depending on which optimality criterion we use.
Kaluza-Klein 5d Ideas Made Fully Geometric, Scott A. Starks, Olga Kosheleva, Vladik Kreinovich
Departmental Technical Reports (CS)
After the 1916 success of general relativity, which explained gravity by adding time as a fourth dimension, physicists have been trying to explain other physical fields by adding extra dimensions. In 1921, Kaluza and Klein showed that under certain conditions like cylindricity (dg_{ij}/dx^5 = 0), the addition of the 5th dimension can explain the electromagnetic field. The problem with this approach is that while the model itself is geometric, conditions like cylindricity are not geometric. This problem was partly solved by Einstein and Bergmann, who proposed, in their 1938 paper, that the 5th dimension is compactified into ...
Kolmogorov Complexity Leads To A Representation Theorem For Idempotent Probabilities (Sigma-Maxitive Measures), Vladik Kreinovich, Luc Longpre
Departmental Technical Reports (CS)
In many application areas, it is important to consider maxitive measures (idempotent probabilities), i.e., mappings m for which m(A U B) = max(m(A), m(B)). In his papers, J. H. Lutz has used Kolmogorov complexity to show that for constructively defined sets A, one maxitive measure, fractal dimension, can be represented as m(A) = sup{f(x): x in A}. We show that a similar representation is possible for an arbitrary maxitive measure.
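The representation in question is easy to illustrate on a finite universe, where the sup is a max (the density f below is an arbitrary made-up example; the theorem itself concerns sigma-maxitivity, which this sketch illustrates but does not prove):

```python
from itertools import combinations

def maxitive_from_density(f):
    """m(A) = sup{f(x): x in A}; on finite sets the sup is a max, and the
    empty set gets m({}) = 0."""
    return lambda A: max((f(x) for x in A), default=0.0)

f = {0: 0.2, 1: 0.7, 2: 0.5}.get   # an arbitrary example density
m = maxitive_from_density(f)

# Maxitivity m(A u B) = max(m(A), m(B)) holds for all subsets of {0, 1, 2}.
subsets = [set(s) for r in range(4) for s in combinations([0, 1, 2], r)]
is_maxitive = all(m(A | B) == max(m(A), m(B)) for A in subsets for B in subsets)
```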
If An Exact Interval Computation Problem Is NP-Hard, Then The Approximate Problem Is Also NP-Hard: A Meta-Result, Aline B. Loreto, Laira V. Toscani, Leila Robeiro, Dalcidio M. Claudio, Liara S. Leal, Luc Longpre, Vladik Kreinovich
Departmental Technical Reports (CS)
In interval computations, usually, once we prove that a problem of computing the exact range is NPhard, then it later turns out that the problem of computing this range with a given accuracy is also NPhard. In this paper, we provide a general explanation for this phenomenon.
Consortium Of CISE-MII Funded Institutions: Initial Recommendations On Broadening Participation Of Hispanics, Ann Q. Gates
Departmental Technical Reports (CS)
No abstract provided.
Towards Combining Probabilistic And Interval Uncertainty In Engineering Calculations: Algorithms For Computing Statistics Under Interval Uncertainty, And Their Computational Complexity, Vladik Kreinovich, Gang Xiang, Scott A. Starks, Luc Longpre, Martine Ceberio, Roberto Araiza, J. Beck, R. Kandathi, A. Nayak, R. Torres, J. Hajagos
Departmental Technical Reports (CS)
In many engineering applications, we have to combine probabilistic and interval uncertainty. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, and correlation with other measurements. In environmental measurements, we often only measure the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data.
In this paper, we provide a survey of algorithms for computing various statistics under interval uncertainty and their computational complexity. The survey includes both ...
Why Product Of Probabilities (Masses) For Independent Events? A Remark, Vladik Kreinovich, Scott Ferson
Departmental Technical Reports (CS)
For independent events A and B, the probability P(A&B) is equal to the product of the corresponding probabilities: P(A&B)=P(A)*P(B). It is well known that the product f(a,b)=a*b has the following property: once P(A1)+...+P(An)=1 and P(B1)+...+P(Bm)=1, the probabilities P(Ai&Bj)=f(P(Ai),P(Bj)) also add to 1: f(P(A1),P(B1))+...+f(P(An),P(Bm))=1. We prove that the product is the only function that satisfies this property, i.e., that if, vice versa, this ...
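The normalization property is easy to check numerically, and a non-product combination rule such as min shows that it can fail, which is the direction the uniqueness claim makes precise (the probability values below are arbitrary illustrations):

```python
def combined_total(f, ps, qs):
    """Sum of f(P(Ai), P(Bj)) over all pairs (i, j), for two normalized
    probability lists ps and qs."""
    return sum(f(p, q) for p in ps for q in qs)

ps = [0.2, 0.3, 0.5]  # P(A1) + P(A2) + P(A3) = 1
qs = [0.4, 0.6]       # P(B1) + P(B2) = 1

product_total = combined_total(lambda a, b: a * b, ps, qs)  # ≈ 1.0
min_total = combined_total(min, ps, qs)   # ≈ 1.9: min fails to normalize
```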
Computing Best-Possible Bounds For The Distribution Of A Sum Of Several Variables Is NP-Hard, Vladik Kreinovich, Scott Ferson
Departmental Technical Reports (CS)
In many real-life situations, we know the probability distribution of two random variables x1 and x2, but we have no information about the correlation between x1 and x2; what are the possible probability distributions for the sum x1+x2? This question was originally raised by A. N. Kolmogorov. Algorithms exist that provide best-possible bounds for the distribution of x1+x2; these algorithms have been implemented as part of the efficient software for handling probabilistic uncertainty. A natural question is: what if we have several (n>2) variables with known distribution, we have no information about their correlation, and we ...
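For n = 2, the best-possible bounds referred to here are the classical Makarov (Frechet-type) bounds on the CDF of the sum; a discretized sketch, ignoring the left-limit subtleties of the exact formulas, with the grid and the test CDFs as illustrative assumptions:

```python
def sum_cdf_bounds(F1, F2, z, grid):
    """Bounds on P(x1 + x2 <= z) given only the marginal CDFs F1 and F2,
    with no assumption about their dependence:
        lower = sup_u max(F1(u) + F2(z - u) - 1, 0)
        upper = inf_u min(F1(u) + F2(z - u), 1)
    The sup/inf are approximated over the finite `grid` of split points u;
    a finer grid tightens the discretization."""
    values = [F1(u) + F2(z - u) for u in grid]
    lower = max(max(v - 1.0, 0.0) for v in values)
    upper = min(min(v, 1.0) for v in values)
    return lower, upper
```

As a sanity check, for point masses at 1 and 2 (so the sum is identically 3), the two bounds collapse onto the exact step-function CDF of the sum.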
Some Usability Issues And Research Priorities In Spoken Dialog Applications, Nigel Ward, Anais G. Rivera, Karen Ward, David G. Novick
Departmental Technical Reports (CS)
As a priority-setting exercise, we examined interactions between users and a simple spoken dialog system in comparison to interactions with a human operator. Based on an analysis of the observed usability differences and their root causes, we propose seven priority issues for spoken dialog systems research.