Open Access. Powered by Scholars. Published by Universities.®

Computer Engineering Commons


Articles 1 - 30 of 50

Full-Text Articles in Computer Engineering

Monte-Carlo-Type Techniques For Processing Interval Uncertainty, And Their Potential Engineering Applications, Vladik Kreinovich, J. Beck, Carlos M. Ferregut, A. Sanchez, George R. Keller, Matthew G. Averill, Scott A. Starks Dec 2005


Departmental Technical Reports (CS)

In engineering applications, we need to make decisions under uncertainty. Traditionally, in engineering, statistical methods are used, methods assuming that we know the probability distribution of different uncertain parameters. Usually, we can safely linearize the dependence of the desired quantities y (e.g., stress at different structural points) on the uncertain parameters xi - thus enabling sensitivity analysis. Often, the number n of uncertain parameters is huge, so sensitivity analysis leads to a lot of computation time. To speed up the processing, we propose to use special Monte-Carlo-type simulations.
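One Monte-Carlo-type technique of this kind, described in related work by some of the same authors, replaces n sensitivity runs with a fixed number of simulations using Cauchy-distributed deviates: for a linearized f, the simulated differences are Cauchy-distributed with scale equal to the half-width D = |df/dx1|*D1 + ... + |df/dxn|*Dn of the range of y. A minimal sketch, assuming this Cauchy-deviate setup (function and parameter names are mine, not from the paper):

```python
import math
import random

def cauchy_deviate_halfwidth(f, x0, deltas, n_samples=2000, seed=1):
    # Estimates D = sum_i |df/dx_i| * delta_i, the half-width of the
    # range of a (nearly) linear f over the box [x0_i +- delta_i].
    rng = random.Random(seed)
    y0 = f(x0)
    diffs = []
    for _ in range(n_samples):
        # A Cauchy deviate with scale delta_i: delta_i * tan(pi*(u - 1/2))
        d = [di * math.tan(math.pi * (rng.random() - 0.5)) for di in deltas]
        diffs.append(f([xi + dxi for xi, dxi in zip(x0, d)]) - y0)
    # For linear f, diffs are Cauchy-distributed with scale D; recover D
    # by maximum likelihood: solve sum_k 1/(1 + (c_k/D)^2) = n/2
    # (the left side grows monotonically with D), using bisection.
    lo, hi = 1e-12, max(abs(c) for c in diffs) + 1e-12
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if sum(1.0 / (1.0 + (c / mid) ** 2) for c in diffs) < n_samples / 2:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Note that the cost depends only on the sample count, not on n. For f(x) = 2x1 - 3x2 with D1 = 1 and D2 = 0.5, the true half-width is 2*1 + 3*0.5 = 3.5, and the estimate approaches it as the sample count grows.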


Population Variance Under Interval Uncertainty: A New Algorithm, Evgeny Dantsin, Vladik Kreinovich, Alexander Wolpert, Gang Xiang Nov 2005


Departmental Technical Reports (CS)

In statistical analysis of measurement results, it is often beneficial to compute the range [V] of the population variance V when we only know the intervals [Xi-Di, Xi+Di] of possible values of xi. In general, this problem is NP-hard; a polynomial-time algorithm is known for the case when the measurements are sufficiently accurate, i.e., when |Xi-Xj| >= (Di+Dj)/n for all i =/= j. In this paper, we show that we can efficiently compute [V] under a weaker (and more general) condition |Xi-Xj| >= |Di-Dj|/n.
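For intuition, here is a baseline that efficient algorithms improve on: the maximum of V (a convex function) over the box of possible values is attained at one of the 2^n vertices, while the minimum can be found by convex minimization over the box. A small sketch, exponential in n for the upper bound and so usable only for tiny n (all names are mine, not from the paper):

```python
from itertools import product

def variance(xs):
    # Population variance V = (1/n) * sum (x_i - mean)^2
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

def variance_range(los, his, sweeps=100):
    n = len(los)
    # Upper bound: V is convex, so its maximum over the box is attained
    # at one of the 2^n vertices (exponential -- the cited paper is about
    # avoiding this cost under suitable accuracy conditions).
    v_hi = max(variance(corner) for corner in product(*zip(los, his)))
    # Lower bound: cyclic coordinate descent on the smooth convex V.
    # With the other coordinates fixed, the best x_i is the mean of the
    # others, clipped into [los[i], his[i]].
    xs = [(lo + hi) / 2 for lo, hi in zip(los, his)]
    for _ in range(sweeps):
        for i in range(n):
            mean_others = (sum(xs) - xs[i]) / (n - 1)
            xs[i] = min(max(mean_others, los[i]), his[i])
    return variance(xs), v_hi
```

For intervals [0,2], [1,3], [5,5], the minimum is reached at (2, 3, 5), giving V = 14/9, and the maximum at the vertex (0, 1, 5), giving V = 14/3.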


Specifying And Checking Method Call Sequences Of Java Programs, Yoonsik Cheon, Ashaveena Perumandla Nov 2005


Departmental Technical Reports (CS)

In a pre- and postconditions-style specification, it is difficult to specify the allowed sequences of method calls, referred to as protocols. Protocols are essential properties of reusable object-oriented classes and application frameworks, and pre- and postcondition-based approaches, such as design by contract (DBC) and formal behavioral interface specification languages (BISLs), are being accepted as practical and effective tools for describing precise interfaces of (reusable) program modules. We propose a simple extension to the Java Modeling Language (JML), a BISL for Java, to specify protocol properties in an intuitive and concise manner. The key idea ...


Optimal Choice Of Granularity In Commonsense Estimation: Why Half-Orders Of Magnitude, Jerry R. Hobbs, Vladik Kreinovich Nov 2005


Departmental Technical Reports (CS)

It has been observed that when people make crude estimates, they feel comfortable choosing between alternatives which differ by a half-order of magnitude (e.g., were there 100, 300, or 1,000 people in the crowd), and less comfortable making a choice on a more detailed scale, with finer granules, or on a coarser scale (like 100 or 1,000). In this paper, we describe two models of choosing granularity in commonsense estimates, and we show that for both models, in the optimal granularity, the next estimate is 3-4 times larger than the previous one. Thus, these two optimization results ...
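The half-order-of-magnitude granules are powers of sqrt(10) ~ 3.16, which falls inside the 3-4 range that both optimization models produce. A toy helper illustrating this granularity (the function is mine, for illustration only):

```python
import math

def half_order_granule(x):
    """Round a positive estimate x to the nearest half-order-of-magnitude
    granule 10^(k/2): ..., 10, 31.6, 100, 316, 1000, ..."""
    k = round(2 * math.log10(x))
    return 10 ** (k / 2)

# Consecutive granules differ by a factor of sqrt(10) ~ 3.16,
# consistent with the "3-4 times larger" result in the abstract.
RATIO = 10 ** 0.5
```

So a crowd estimate of 250 people snaps to the granule 316, while 120 snaps to 100.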


Combining Interval, Probabilistic, And Fuzzy Uncertainty: Foundations, Algorithms, Challenges -- An Overview, Vladik Kreinovich, David J. Berleant, Scott Ferson, Weldon A. Lodwick Nov 2005


Departmental Technical Reports (CS)

Since the 1960s, many algorithms have been designed to deal with interval uncertainty. In the last decade, there has been a lot of progress in extending these algorithms to the case when we have a combination of interval and probabilistic uncertainty. We provide an overview of related algorithms, results, and remaining open problems.


Decision Making Beyond Arrow's "Impossibility Theorem", With The Analysis Of Effects Of Collusion And Mutual Attraction, Hung T. Nguyen, Olga Kosheleva, Vladik Kreinovich Oct 2005


Departmental Technical Reports (CS)

In 1951, K. J. Arrow proved that, under certain assumptions, it is impossible to have group decision making rules which satisfy reasonable conditions like symmetry. This Impossibility Theorem is often cited as a proof that reasonable group decision making is impossible.

We start our paper by remarking that Arrow's result only covers the situations when the only information we have about individual preferences is their binary preferences between the alternatives. If we follow the main ideas of modern decision making and game theory and also collect information about the preferences between lotteries (i.e., collect the utility values of ...


Interval-Based Robust Statistical Techniques For Non-Negative Convex Functions, With Application To Timing Analysis Of Computer Chips, Michael Orshansky, Wei-Shen Wang, Martine Ceberio, Gang Xiang Oct 2005


Departmental Technical Reports (CS)

In chip design, one of the main objectives is to decrease the clock cycle. At the design stage, this time is usually estimated by using worst-case (interval) techniques, in which we only use the bounds on the parameters that lead to delays. This analysis does not take into account that the probability of the worst-case values is usually very small; thus, the resulting estimates are over-conservative, leading to unnecessary over-design and under-performance of circuits. If we knew the exact probability distributions of the corresponding parameters, then we could use Monte-Carlo simulations (or the corresponding analytical techniques) to get the desired ...


A Fitness Function For Modular Evolutionary Testing Of Object-Oriented Programs, Yoonsik Cheon, Kim Myoung Oct 2005


Departmental Technical Reports (CS)

We show that encapsulation of states in object-oriented programs hinders the search for test data using evolutionary testing. In a well-designed object-oriented program the encapsulated or hidden state is accessible only through exported or public methods. As client code is oblivious to the internal state of a server object, no guidance is available to test the client code using evolutionary testing. In particular, it is difficult to determine the fitness or goodness of test data, as it may depend on the hidden internal state. However, evolutionary testing is a promising new approach whose effectiveness has been shown by several researchers ...


Quantum Versions Of K-Csp Algorithms: A First Step Towards Quantum Algorithms For Interval-Related Constraint Satisfaction Problems, Evgeny Dantsin, Alexander Wolpert, Vladik Kreinovich Oct 2005


Departmental Technical Reports (CS)

In many industrial engineering problems, we must select a design, select parameters of a process, or, in general, make a decision. Informally, this decision must be optimal, the best for the users. In traditional operations research, we assume that we know the objective function f(x) whose values describe the consequence of a decision x for the user. Optimization of well-defined functions is what started calculus in the first place: once we know the objective function f(x), we can use differentiation to find its maximum, e.g., as the point x at which the derivative of f with respect ...


Computing Mean And Variance Under Dempster-Shafer Uncertainty: Towards Faster Algorithms, Vladik Kreinovich, Gang Xiang, Scott Ferson Oct 2005


Departmental Technical Reports (CS)

In many real-life situations, we only have partial information about the actual probability distribution. For example, under Dempster-Shafer uncertainty, we only know the masses m1,...,mn assigned to different sets S1,...,Sn, but we do not know the distribution within each set Si. Because of this uncertainty, there are many possible probability distributions consistent with our knowledge; different distributions have, in general, different values of standard statistical characteristics such as mean and variance. It is therefore desirable, given a Dempster-Shafer knowledge base, to compute the ranges of possible values of mean E and of variance V.

In their recent paper ...
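The mean is the easy half of this problem: since each mass m_i can sit anywhere inside its set S_i, the extreme distributions push every mass to an endpoint, so for interval-valued focal sets the range of E follows directly; the variance range is where the algorithmic difficulty lies. A sketch (names are mine, not from the paper):

```python
def ds_mean_bounds(masses, intervals):
    """Bounds on the mean E over all distributions consistent with a
    Dempster-Shafer structure: masses m_i on focal sets S_i = [lo_i, hi_i].
    The extreme distributions put each mass at an endpoint of its set,
    so E ranges over [sum m_i*lo_i, sum m_i*hi_i]."""
    e_lo = sum(m * lo for m, (lo, hi) in zip(masses, intervals))
    e_hi = sum(m * hi for m, (lo, hi) in zip(masses, intervals))
    return e_lo, e_hi
```

For masses 0.5 on [0,1] and 0.5 on [2,4], the mean can be anywhere in [1.0, 2.5].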


Usability Over Time, Valerie Mendoza, David G. Novick Sep 2005


Departmental Papers (CS)

Testing of usability could perhaps be more accurately described as testing of learnability. We know more about the problems of novice users than we know of the problems of experienced users. To understand how these problems differ, and to understand how usability problems change as users change from novice to experienced, we conducted a longitudinal study of usability among middle-school teachers creating Web sites. The study looked at both the use of documentation and the use of the underlying software, tracking the causes and extent of user frustration over eight weeks. We validated a categorization scheme for frustration episodes. We found ...


Co-Generation Of Text And Graphics, David G. Novick, Brian Lowe Sep 2005


Departmental Papers (CS)

To reduce potential discrepancies between textual and graphical content in documentation, it is possible to produce both text and graphics from a single common source. One approach to co-generation of text and graphics uses a single logical specification; a second approach starts with CAD-based representation and produces a corresponding textual account. This paper explores these two different approaches, reports the results of using prototypes embodying the approaches to represent simple figures, and discusses issues that were identified through use of the prototypes. While it appears feasible to co-generate text and graphics automatically, the process raises deep issues of design of ...


Root Causes Of Lost Time And User Stress In A Simple Dialog System, Nigel G. Ward, Anais G. Rivera, Karen Ward, David G. Novick Sep 2005


Departmental Papers (CS)

As a priority-setting exercise, we compared interactions between users and a simple spoken dialog system to interactions between users and a human operator. We observed usability events, places in which system behavior differed from human behavior, and for each we noted the impact, root causes, and prospects for improvement. We suggest some priority issues for research, involving not only such core areas as speech recognition and synthesis and language understanding and generation, but also less-studied topics such as adaptive or flexible timeouts, turn-taking and speaking rate.


Towards A Cross-Platform Microbenchmark Suite For Evaluating Hardware Performance Counter Data, Maria G. Aguilera, Roberto Araiza, Thientam Pham, Patricia J. Teller Sep 2005


Departmental Technical Reports (CS)

As useful as performance counters are, the meaning of reported aggregate event counts is sometimes questionable. Questions arise due to unanticipated processor behavior, overhead associated with the interface, the granularity of the monitored code, hardware errors, and lack of standards with respect to event definitions. To explore these issues, we are conducting a sequence of studies using carefully crafted microbenchmarks that permit the accurate prediction of event counts and investigation of the differences between hardware-reported and predicted event counts. This paper presents the methodology employed, some of the microbenchmarks developed, and some of the information uncovered to date. The information ...


Optimization And Decision Making Under Interval And Fuzzy Uncertainty: Towards New Mathematical Foundations, Hung T. Nguyen, Vladik Kreinovich Sep 2005


Departmental Technical Reports (CS)

In many industrial engineering problems, we must select a design, select parameters of a process, or, in general, make a decision. Informally, this decision must be optimal, the best for the users. In traditional operations research, we assume that we know the objective function f(x) whose values describe the consequence of a decision x for the user. Optimization of well-defined functions is what started calculus in the first place: once we know the objective function f(x), we can use differentiation to find its maximum, e.g., as the point x at which the derivative of f with respect ...


On Quantum Versions Of Record-Breaking Algorithms For Sat, Evgeny Dantsin, Vladik Kreinovich, Alexander Wolpert Aug 2005


Departmental Technical Reports (CS)

It is well known that a straightforward application of Grover's quantum search algorithm makes it possible to solve SAT in O(2^(n/2)) steps. Ambainis (SIGACT News, 2004) observed that Grover's technique can similarly speed up a sophisticated algorithm for solving 3-SAT. In this note, we show that a similar speedup can be obtained for all major record-breaking algorithms for satisfiability. We also show that if we use Grover's technique only, then we cannot do better than a quadratic speedup.


From Intervals To Domains: Towards A General Description Of Validated Uncertainty, With Potential Applications To Geospatial And Meteorological Data, Vladik Kreinovich, Olga Kosheleva, Scott A. Starks, Kavitha Tupelly, Gracaliz P. Dimuro, Antonio C. Da Costa Rocha, Karen Villaverde Aug 2005


Departmental Technical Reports (CS)

When physical quantities xi are numbers, the corresponding measurement accuracy can usually be represented in interval terms, and interval computations can be used to estimate the resulting uncertainty in y=f(x1,...,xn).
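For this numeric case, the standard machinery is interval arithmetic: each operation on intervals returns an interval guaranteed to contain every possible result. A minimal sketch of the two basic operations:

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a,b] * [c,d]: the range is spanned by the endpoint products
        ps = (self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi)
        return Interval(min(ps), max(ps))
```

With these two operations one can already propagate uncertainty through polynomial expressions such as y = x1*x2 + x1.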

In some practical problems, we are interested in more complex structures such as functions, operators, etc. Examples: we may be interested in how the material strain depends on the applied stress, or in how a physical quantity such as temperature or velocity of sound depends on a 3-D point.

For many such structures, there are ways to represent uncertainty, but usually, for each new structure, we ...


Interval-Type And Affine Arithmetic-Type Techniques For Handling Uncertainty In Expert Systems, With Applications To Geoinformatics And Computer Security, Martine Ceberio, Vladik Kreinovich, Sanjeev Chopra, Luc Longpre, Hung T. Nguyen, Bertram Ludaescher, Chitta Baral Aug 2005


Departmental Technical Reports (CS)

Expert knowledge consists of statements Sj (facts and rules). The expert's degree of confidence in each statement Sj can be described as a (subjective) probability (some probabilities are known to be independent). Examples: if we are interested in oil, we should look at seismic data (confidence 90%); a bank A trusts a client B, so if we trust A, we should trust B too (confidence 99%). If a query Q is deducible from facts and rules, what is our confidence p(Q) in Q? We can describe Q as a propositional formula F in terms of Sj; computing p ...


Discrete Conservation Of Nonnegativity For Elliptic Problems Solved By The Hp-Fem, Pavel Solin, Thomas Vejchodsky, Roberto Araiza Aug 2005


Departmental Technical Reports (CS)

Most results related to discrete nonnegativity conservation principles (DNCP) for elliptic problems are limited to finite differences (FDM) and lowest-order finite element methods (FEM). In this paper we confirm that a straightforward extension to higher-order finite element methods (hp-FEM) in the classical sense is not possible. We formulate a weaker DNCP for the Poisson equation in one spatial dimension and prove it using an interval computing technique. Numerical experiments related to the extension of this result to 2D are presented.


Processing Educational Data: From Traditional Statistical Techniques To An Appropriate Combination Of Probabilistic, Interval, And Fuzzy Approaches, Olga Kosheleva, Martine Ceberio Jul 2005


Departmental Technical Reports (CS)

There are many papers that experimentally compare the effectiveness of different teaching techniques. Most of these papers use the traditional statistical approach to process the experimental results. The traditional statistical approach is well suited to numerical data, but often what we are processing is either intervals (e.g., A means anything from 90 to 100) or fuzzy-type perceptions -- words from natural language like "understood well" or "understood reasonably well". We show that the use of interval and fuzzy techniques leads to more adequate processing of educational data.


Interval Versions Of Statistical Techniques With Applications To Environmental Analysis, Bioinformatics, And Privacy In Statistical Databases, Vladik Kreinovich, Luc Longpre, Scott A. Starks, Gang Xiang, Jan Beck, Raj Kandathi, Asis Nayak, Scott Ferson, Janos Hajagos Jul 2005


Departmental Technical Reports (CS)

In many areas of science and engineering, it is desirable to estimate statistical characteristics (mean, variance, covariance, etc.) under interval uncertainty. For example, we may want to use the measured values x(t) of a pollution level in a lake at different moments of time to estimate the average pollution level; however, we do not know the exact values x(t) -- e.g., if one of the measurement results is 0, this simply means that the actual (unknown) value of x(t) can be anywhere between 0 and the detection limit DL. We must therefore modify the existing statistical algorithms ...
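The below-detection-limit convention mentioned here turns each reading into an interval, and the sample mean is the one statistic whose interval range is trivial to compute (it is monotone in each x(t)); the harder cases are variance and other nonlinear statistics. A sketch (names are mine, not from the paper):

```python
def interval_mean(readings, dl):
    # A reported 0 means "below detection limit": the true value is
    # somewhere in [0, DL]; other readings are taken as exact.
    boxes = [(0.0, dl) if x == 0 else (x, x) for x in readings]
    n = len(boxes)
    # The mean is monotone in each x(t), so its range is just
    # [mean of lower endpoints, mean of upper endpoints].
    return (sum(lo for lo, _ in boxes) / n,
            sum(hi for _, hi in boxes) / n)
```

For readings (0, 2, 4) with DL = 1, the average pollution level can be anywhere in [2, 7/3].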


Which Fuzzy Logic Is The Best: Pragmatic Approach (And Its Theoretical Analysis), Vladik Kreinovich, Hung T. Nguyen Jun 2005


Departmental Technical Reports (CS)

In this position paper, we argue that when we are looking for the best fuzzy logic, we should specify in what sense the best, and that we get different fuzzy logics as ``the best'' depending on what optimality criterion we use.


Kaluza-Klein 5d Ideas Made Fully Geometric, Scott A. Starks, Olga Kosheleva, Vladik Kreinovich Jun 2005


Departmental Technical Reports (CS)

After the 1916 success of General Relativity, which explained gravity by adding time as a fourth dimension, physicists have been trying to explain other physical fields by adding extra dimensions. In 1921, Kaluza and Klein showed that under certain conditions like cylindricity (dg_{ij}/dx^5=0), the addition of a 5th dimension can explain the electromagnetic field. The problem with this approach is that while the model itself is geometric, conditions like cylindricity are not geometric. This problem was partly solved by Einstein and Bergmann, who proposed, in their 1938 paper, that the 5th dimension is compactified into ...


Kolmogorov Complexity Leads To A Representation Theorem For Idempotent Probabilities (Sigma-Maxitive Measures), Vladik Kreinovich, Luc Longpre Jun 2005


Departmental Technical Reports (CS)

In many application areas, it is important to consider maxitive measures (idempotent probabilities), i.e., mappings m for which m(A U B)=max(m(A),m(B)). In his papers, J. H. Lutz has used Kolmogorov complexity to show that for constructively defined sets A, one maxitive measure - fractal dimension - can be represented as m(A)= sup{f(x): x in A}. We show that a similar representation is possible for an arbitrary maxitive measure.
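The easy direction of this representation is immediate: any m defined as a sup of a "density" f is maxitive. A finite-set illustration (names are mine; the paper's theorem is the hard converse, that every maxitive measure arises this way):

```python
def sup_measure(f, A):
    """m(A) = sup{f(x): x in A}; over a finite set the sup is a max."""
    return max(f[x] for x in A)
```

The union rule m(A U B) = max(m(A), m(B)) then holds automatically, since a max over a union is the max of the two maxima.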


If An Exact Interval Computation Problem Is Np-Hard, Then The Approximate Problem Is Also Np-Hard: A Meta-Result, Aline B. Loreto, Laira V. Toscani, Leila Robeiro, Dalcidio M. Claudio, Liara S. Leal, Luc Longpre, Vladik Kreinovich Jun 2005


Departmental Technical Reports (CS)

In interval computations, usually, once we prove that a problem of computing the exact range is NP-hard, then it later turns out that the problem of computing this range with a given accuracy is also NP-hard. In this paper, we provide a general explanation for this phenomenon.


Consortium Of Cise-Mii Funded Institutions: Initial Recommendations On Broadening Participation Of Hispanics, Ann Q. Gates Jun 2005


Departmental Technical Reports (CS)

No abstract provided.


Towards Combining Probabilistic And Interval Uncertainty In Engineering Calculations: Algorithms For Computing Statistics Under Interval Uncertainty, And Their Computational Complexity, Vladik Kreinovich, Gang Xiang, Scott A. Starks, Luc Longpre, Martine Ceberio, Roberto Araiza, J. Beck, R. Kandathi, A. Nayak, R. Torres, J. Hajagos Jun 2005


Departmental Technical Reports (CS)

In many engineering applications, we have to combine probabilistic and interval uncertainty. For example, in environmental analysis, we observe a pollution level x(t) in a lake at different moments of time t, and we would like to estimate standard statistical characteristics such as mean, variance, autocorrelation, correlation with other measurements. In environmental measurements, we often only measure the values with interval uncertainty. We must therefore modify the existing statistical algorithms to process such interval data.

In this paper, we provide a survey of algorithms for computing various statistics under interval uncertainty and their computational complexity. The survey includes both ...


Why Product Of Probabilities (Masses) For Independent Events? A Remark, Vladik Kreinovich, Scott Ferson Jun 2005


Departmental Technical Reports (CS)

For independent events A and B, the probability P(A&B) is equal to the product of the corresponding probabilities: P(A&B)=P(A)*P(B). It is well known that the product f(a,b)=a*b has the following property: once P(A1)+...+P(An)=1 and P(B1)+...+P(Bm)=1, the probabilities P(Ai&Bj)=f(P(Ai),P(Bj)) also add to 1: f(P(A1),P(B1))+...+f(P(An),P(Bm))=1. We prove that the product is the only function that satisfies this property, i.e., that if, vice versa, this ...
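The stated direction is easy to check numerically: for f(a,b) = a*b, the pairwise values always sum to 1, because sum_i sum_j p_i*q_j = (sum_i p_i) * (sum_j q_j) = 1 * 1. The paper's contribution is the converse, that only the product has this property. A quick check (names are mine):

```python
def joint_total(f, ps, qs):
    # Sum of f(p_i, q_j) over all pairs; for f(a,b) = a*b and
    # probability vectors ps, qs, this total is exactly 1.
    return sum(f(p, q) for p in ps for q in qs)
```

Other candidate combination rules, e.g. f(a,b) = min(a,b), fail this normalization in general.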


Computing Best-Possible Bounds For The Distribution Of A Sum Of Several Variables Is Np-Hard, Vladik Kreinovich, Scott Ferson Jun 2005


Departmental Technical Reports (CS)

In many real-life situations, we know the probability distribution of two random variables x1 and x2, but we have no information about the correlation between x1 and x2; what are the possible probability distributions for the sum x1+x2? This question was originally raised by A. N. Kolmogorov. Algorithms exist that provide best-possible bounds for the distribution of x1+x2; these algorithms have been implemented as a part of the efficient software for handling probabilistic uncertainty. A natural question is: what if we have several (n>2) variables with known distribution, we have no information about their correlation, and we ...


Some Usability Issues And Research Priorities In Spoken Dialog Applications, Nigel Ward, Anais G. Rivera, Karen Ward, David G. Novick Jun 2005


Departmental Technical Reports (CS)

As a priority-setting exercise, we examined interactions between users and a simple spoken dialog system in comparison to interactions with a human operator. Based on analysis of the observed usability differences and their root causes we propose seven priority issues for spoken dialog systems research.