Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons


Articles 1 - 30 of 44

Full-Text Articles in Physical Sciences and Mathematics

Real-Time Divisible Load Scheduling For Cluster Computing, Xuan Lin, Ying Lu, Jitender S. Deogun, Steve Goddard Oct 2006


CSE Technical Reports

Cluster computing has emerged as a new paradigm for solving large-scale problems. To enhance QoS and provide performance guarantees in cluster computing environments, various real-time scheduling algorithms and workload models have been investigated. Computational loads that can be arbitrarily divided into independent pieces represent many real-world applications. Divisible load theory (DLT) provides insight into distribution strategies for such computations. However, the problem of providing performance guarantees to divisible load applications has not yet been systematically studied. This paper investigates such algorithms for a cluster environment. Design parameters that affect the performance of these algorithms and scenarios when the choice of …
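The even-finish-time distribution at the heart of DLT can be sketched in a few lines. This is a minimal Python illustration, not the paper's algorithm: it assumes homogeneous workers, single-round sequential distribution, and linear communication and computation costs `z` and `w` per load unit, all of which are invented parameters for illustration.

```python
def dlt_fractions(n, w, z):
    """Load fractions alpha_i for n identical workers that receive data
    sequentially (cost z per load unit) and then compute (cost w per unit),
    chosen so that every worker finishes at the same instant."""
    r = w / (w + z)                       # ratio of consecutive fractions
    raw = [r ** i for i in range(n)]
    total = sum(raw)
    return [a / total for a in raw]

def makespan(fracs, w, z):
    """Finish time of the last worker under the sequential-send model."""
    t = sent = 0.0
    for a in fracs:
        sent += z * a                     # master transmits this chunk
        t = max(t, sent + w * a)          # worker finishes after computing
    return t
```

With four workers, `w = 1.0`, and `z = 0.2`, the fractions decrease geometrically and all four workers finish simultaneously, which is the defining property of a DLT-optimal split.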


Dynamic Characterization Of Web Application Interfaces, Marc Randall Fisher II, Sebastian Elbaum, Gregg Rothermel Oct 2006


CSE Technical Reports

Web applications are increasingly prominent in society, serving a wide variety of user needs. Engineers seeking to enhance, test, and maintain these applications and third-party programmers wishing to utilize these applications need to understand their interfaces. In this paper, therefore, we present methodologies for characterizing the interfaces of web applications through a form of dynamic analysis, in which directed requests are sent to the application, and responses are analyzed to draw inferences about its interface. We also provide mechanisms to increase the scalability of the approach. Finally, we evaluate the approach’s performance on six non-trivial web applications.


Parallel Randomized State-Space Search, Matthew B. Dwyer, Sebastian Elbaum, Suzette Person, Rahul Purandare Oct 2006


CSE Technical Reports

Model checkers search the space of possible program behaviors to detect errors and to demonstrate their absence. Despite major advances in reduction and optimization techniques, state-space search can still become cost-prohibitive as program size and complexity increase. In this paper, we present a technique for dramatically improving the cost-effectiveness of state-space search techniques for error detection using parallelism. Our approach can be composed with all of the reduction and optimization techniques we are aware of to amplify their benefits. It was developed based on insights gained from performing a large empirical study of the cost-effectiveness of randomization techniques in state-space …


Exploiting Geographical And Temporal Locality To Boost Search Efficiency In Peer-To-Peer Systems, Hailong Cai, Jun Wang Oct 2006


School of Computing: Faculty Publications

Search in unstructured peer-to-peer (P2P) systems has been a hot research topic, and many search algorithms have been presented and studied during the past few years. Unfortunately, current approaches either cannot yield good lookup performance or incur high search cost and system maintenance overhead. The poor search efficiency of these approaches may seriously limit the scalability of current unstructured P2P systems. In this paper, we propose to exploit two-dimensional locality to improve P2P system search efficiency. We present a locality-aware P2P system architecture called Foreseer, which explicitly exploits geographical locality and temporal locality by constructing a neighbor overlay and a friend overlay, respectively. …


Adaptive Online Program Analysis: Concepts, Infrastructure, And Applications, Matthew B. Dwyer, Alex Kinneer, Sebastian Elbaum Sep 2006


CSE Technical Reports

Dynamic analysis of state-based properties is being applied to problems such as validation, intrusion detection, and program steering and reconfiguration. Dynamic analyses of such properties, however, are rarely used in practice due to the associated run-time overhead, which can slow program execution by multiple orders of magnitude. In this paper, we present an approach for exploiting the statefulness of specifications to reduce the cost of dynamic program analysis. With our approach, the results of the analysis are guaranteed to be identical to those of the traditional, expensive dynamic analyses, yet with overheads between 23% and 33% relative to the un-instrumented …


On The Use Of Mutation Faults In Empirical Assessments Of Test Case Prioritization Techniques, Hyunsook Do, Gregg Rothermel Sep 2006


School of Computing: Faculty Publications

Regression testing is an important activity in the software life cycle, but it can also be very expensive. To reduce the cost of regression testing, software testers may prioritize their test cases so that those which are more important, by some measure, are run earlier in the regression testing process. One potential goal of test case prioritization techniques is to increase a test suite’s rate of fault detection (how quickly, in a run of its test cases, that test suite can detect faults). Previous work has shown that prioritization can improve a test suite’s rate of fault detection, but the …
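The rate of fault detection described here is commonly quantified with the APFD metric (Average Percentage of Faults Detected). A minimal sketch of that computation, assuming every fault is revealed by at least one test in the ordering:

```python
def apfd(ordering, detects):
    """Average Percentage of Faults Detected for a test-case ordering.
    ordering: test ids in execution order.
    detects:  dict mapping each test id to the set of faults it reveals.
    Assumes every fault is revealed by at least one test in the ordering."""
    faults = set().union(*detects.values())
    n, m = len(ordering), len(faults)
    first = {}                        # fault -> position of first detection
    for pos, t in enumerate(ordering, start=1):
        for f in detects.get(t, ()):
            first.setdefault(f, pos)
    return 1 - sum(first[f] for f in faults) / (n * m) + 1 / (2 * n)
```

Orderings that reveal faults earlier in the run score closer to 1, so prioritization techniques are compared by the APFD values their orderings achieve.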


Image Interpolation By Two-Dimensional Parametric Cubic Convolution, Jiazheng Shi, Stephen E. Reichenbach Jul 2006


School of Computing: Faculty Publications

Cubic convolution is a popular method for image interpolation. Traditionally, the piecewise-cubic kernel has been derived in one dimension with one parameter and applied to two-dimensional (2-D) images in a separable fashion. However, images typically are statistically nonseparable, which motivates this investigation of nonseparable cubic convolution. This paper derives two new nonseparable, 2-D cubic-convolution kernels. The first kernel, with three parameters (designated 2D-3PCC), is the most general 2-D, piecewise-cubic interpolator defined on [-2, 2] x [-2, 2] with constraints for biaxial symmetry, diagonal (or 90° rotational) symmetry, continuity, and smoothness. The second kernel, with five parameters (designated 2D-5PCC), relaxes the …
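For reference, the traditional one-parameter, one-dimensional kernel that the paper generalizes is Keys' cubic convolution; the default parameter and sampling positions below are the standard textbook choices, not values from the paper.

```python
import math

def cubic_kernel(s, a=-0.5):
    """Keys' one-dimensional parametric cubic-convolution kernel.
    The free parameter a (commonly -0.5) tunes the interpolator."""
    s = abs(s)
    if s < 1:
        return (a + 2) * s**3 - (a + 3) * s**2 + 1
    if s < 2:
        return a * s**3 - 5 * a * s**2 + 8 * a * s - 4 * a
    return 0.0

def interpolate(samples, x, a=-0.5):
    """Interpolate uniformly spaced samples at a real-valued position x."""
    i0 = math.floor(x)
    val = 0.0
    for i in range(i0 - 1, i0 + 3):       # the four nearest samples
        if 0 <= i < len(samples):
            val += samples[i] * cubic_kernel(x - i, a)
    return val
```

Applying `cubic_kernel` along rows and then columns gives the separable 2-D baseline; the paper's 2D-3PCC and 2D-5PCC kernels replace that product with genuinely nonseparable piecewise-cubic functions.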


Adaptive Interpolation Algorithms For Temporal-Oriented Datasets, Jun Gao Jun 2006


Department of Computer Science and Engineering: Dissertations, Theses, and Student Research

Spatiotemporal datasets can be classified into two categories, temporal-oriented and spatial-oriented, depending on whether missing spatiotemporal values are closer to the values of their temporal or their spatial neighbors. We present an adaptive spatiotemporal interpolation model that can estimate the missing values in both categories of spatiotemporal datasets. The key parameters of the model can be adjusted based on experience.
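One simple way such an adjustable model can be realized is to blend inverse-distance estimates from temporal and spatial neighbors under a tunable weight. The function below is an illustrative sketch under that assumption, not the thesis's exact formulation; all names and parameters are invented here.

```python
def adaptive_interpolate(spatial_neighbors, temporal_neighbors, lam=0.5, p=2):
    """Blend inverse-distance-weighted (IDW) estimates from spatial and
    temporal neighbors.  Each neighbor is a (distance, value) pair.
    lam in [0, 1] leans toward the temporal estimate; a good value depends
    on whether the dataset is temporal- or spatial-oriented."""
    def idw(neighbors):
        weights = [(1.0 / d**p, v) for d, v in neighbors]
        return sum(w * v for w, v in weights) / sum(w for w, _ in weights)
    return lam * idw(temporal_neighbors) + (1 - lam) * idw(spatial_neighbors)
```

Tuning `lam` toward 1 suits temporal-oriented data and toward 0 suits spatial-oriented data, which mirrors the adjustable-parameter idea in the abstract.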


Controlling Factors In Evaluating Path-Sensitive Error Detection Techniques, Matthew B. Dwyer, Suzette Person, Sebastian Elbaum Apr 2006


CSE Technical Reports

Recent advances in static program analysis have made it possible to detect errors in applications that have been thoroughly tested and are in wide-spread use. The ability to find errors that have eluded traditional validation methods is due to the development and combination of sophisticated algorithmic techniques that are embedded in the implementations of analysis tools. Evaluating new analysis techniques is typically performed by running an analysis tool on a collection of subject programs, perhaps enabling and disabling a given technique in different runs. While seemingly sensible, this approach runs the risk of attributing improvements in the cost-effectiveness of the …


Sofya: A Flexible Framework For Development Of Dynamic Program Analyses For Java Software, Alex Kinneer, Matthew B. Dwyer, Gregg Rothermel Apr 2006


CSE Technical Reports

Dynamic analysis techniques are well established in the software engineering community as methods for validating, understanding, maintaining, and improving programs. Generally, this class of techniques requires developers to instrument programs to generate events that capture, or observe, relevant features of program execution. Streams of these events are then processed to achieve the goals of the dynamic analysis. The lack of high-level tools for defining program observations, automating their mapping to efficient low-level implementations, and supporting the flexible combination of different event-stream-based processing components hampers the development and evaluation of new dynamic analysis techniques. For example, mapping non-trivial program observations to …


Carving Differential Unit Test Cases From System Test Cases, Sebastian Elbaum, Hui Nee Chin, Matthew B. Dwyer, Jonathan Dokulil Apr 2006


CSE Technical Reports

Unit test cases are focused and efficient. System tests are effective at exercising complex usage patterns. Differential unit tests (DUTs) are a hybrid of unit and system tests. They are generated by carving, while a system test executes, the system components that influence the behavior of the target unit, and then re-assembling those components so that the unit can be exercised as it was by the system test. We conjecture that DUTs retain some of the advantages of unit tests, can be automatically and inexpensively generated, and have the potential for revealing faults related to intricate system executions. In …


Interactive Fault Localization Techniques In A Spreadsheet Environment, Joseph R. Ruthruff, Margaret Burnett, Gregg Rothermel Apr 2006


School of Computing: Faculty Publications

End-user programmers develop more software than any other group of programmers, using software authoring devices such as multimedia simulation builders, e-mail filtering editors, by-demonstration macro builders, and spreadsheet environments. Despite this, there has been little research on ways to help these programmers improve the dependability of the software they create. We have been working to address this problem in several ways, one of which includes supporting end-user debugging activities through interactive fault localization techniques. This paper investigates fault localization techniques in the spreadsheet domain, the most common type of end-user programming environment. We investigate a technique previously …


IDF: An Inconsistency Detection Framework – Performance Modeling And Guide To Its Design, Yijun Lu, Xueming Li, Hong Jiang Mar 2006


CSE Technical Reports

With the increased popularity of replica-based services in distributed systems such as the Grid, consistency control among replicas becomes more and more important. To this end, IDF (Inconsistency Detection Framework), a two-layered overlay-based architecture, has been proposed as a new way to solve this problem—instead of enforcing a predefined protocol, IDF detects inconsistency in a timely manner when it occurs and resolves it based on applications’ semantics.
This paper presents a comprehensive analytical study of IDF to assess its performance and provide insight into its design. More specifically, it develops an analytical model to characterize IDF. Based on this model, …


Implementing CS1 With Embedded Instructional Research Design In Laboratories, Jeff Lang, Gwen C. Nugent, Ashok Samal, Leen-Kiat Soh Feb 2006


School of Computing: Faculty Publications

Closed laboratories are becoming an increasingly popular approach to teaching introductory computer science courses. Unlike open laboratories, which tend to be informal environments where students practice their skills and attendance is optional, closed laboratories are structured meeting times that support the lecture component of the course, and attendance is required. This paper reports on an integrated approach to designing, implementing, and assessing laboratories with an embedded instructional research design. The activities reported here are parts of a department-wide effort not only to improve student learning in computer science and computer engineering (CE) but also to improve the agility …


Allocating Non-Real-Time And Soft Real-Time Jobs In Multiclusters, Ligang He, Stephen A. Jarvis, Daniel P. Spooner, Hong Jiang, Donna N. Dillenberger, Graham R. Nudd Feb 2006


School of Computing: Faculty Publications

This paper addresses workload allocation techniques for two types of sequential jobs that might be found in multicluster systems, namely, non-real-time jobs and soft real-time jobs. Two workload allocation strategies, the Optimized mean Response Time (ORT) and the Optimized mean Miss Rate (OMR), are developed by establishing and numerically solving two optimization equation sets. The ORT strategy achieves an optimized mean response time for non-real-time jobs, while the OMR strategy obtains an optimized mean miss rate for soft real-time jobs over multiple clusters. Both strategies take into account average system behaviors (such as the mean arrival rate of jobs) in …


CEFT: A Cost-Effective, Fault-Tolerant Parallel Virtual File System, Yifeng Zhu, Hong Jiang Feb 2006


School of Computing: Faculty Publications

The vulnerability of computer nodes due to component failures is a critical issue for cluster-based file systems. This paper studies the development and deployment of mirroring in cluster-based parallel virtual file systems to provide fault tolerance and analyzes the tradeoffs between the performance and the reliability in the mirroring scheme. It presents the design and implementation of CEFT, a scalable RAID-10 style file system based on PVFS, and proposes four novel mirroring protocols depending on whether the mirroring operations are server-driven or client-driven, and whether they are asynchronous or synchronous. The comparisons of their write performances, measured in a real cluster, …


Evaluating The Effectiveness Of Slicing For Model Reduction Of Concurrent Object-Oriented Programs, Matthew B. Dwyer, John Hatcliff, Matthew Hoosier, Venkatesh Ranganath, Robby, Todd Wallentine Jan 2006


CSE Book Chapters

Model checking techniques have proven effective for checking a number of non-trivial concurrent object-oriented software systems. However, due to the high computational and memory costs, a variety of model reduction techniques are needed to overcome current limitations on applicability and scalability. Conventional wisdom holds that static program slicing can be an effective model reduction technique, yet anecdotal evidence is mixed, and there has been no work that has systematically studied the costs/benefits of slicing for model reduction in the context of model checking source code for realistic systems. In this paper, we present an overview of the sophisticated Indus program …


Scaling A Dataflow Testing Methodology To The Multiparadigm World Of Commercial Spreadsheets, Marc Fisher II, Gregg Rothermel, Tyler Creelan, Margaret Burnett Jan 2006


CSE Conference and Workshop Papers

Spreadsheets are widely used but often contain faults. Thus, in prior work we presented a data-flow testing methodology for use with spreadsheets, which studies have shown can be used cost-effectively by end-user programmers. To date, however, the methodology has been investigated across a limited set of spreadsheet language features. Commercial spreadsheet environments are multiparadigm languages, utilizing features not accommodated by our prior approaches. In addition, most spreadsheets contain large numbers of replicated formulas that severely limit the efficiency of data-flow testing approaches. We show how to handle these two issues with a new data-flow adequacy criterion and automated detection of …


Helping End-User Programmers “Engineer” Dependable Software, Gregg Rothermel Jan 2006


CSE Conference and Workshop Papers

Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage …


eRAID: Conserving Energy In High Performance RAID Systems With Conventional Disks, Dong Li, Jun Wang Jan 2006


CSE Technical Reports

Recently, energy consumption has become an ever more critical concern for storage servers and data centers, both low-end and high-end. A majority of existing energy conservation solutions resort to multi-speed disks. However, current server systems are still built with conventional disks.
In this paper, we propose an energy saving policy, eRAID, for conventional-disk-based RAID-1 systems. eRAID saves energy by spinning down part or all of the mirror disk group with predictable performance degradation. The core of eRAID is an accurate dynamic performance control (including disk power management) scheme. To guarantee service quality, the dynamic performance control works for two …


The Performance Of Elliptic Curve Based Group Diffie-Hellman Protocols For Secure Group Communication Over Ad Hoc Networks, Yong Wang, Byrav Ramamurthy, Xukai Zou Jan 2006


CSE Conference and Workshop Papers

The security of the two party Diffie-Hellman key exchange protocol is currently based on the discrete logarithm problem (DLP). However, it can also be built upon the elliptic curve discrete logarithm problem (ECDLP). Most proposed secure group communication schemes employ the DLP-based Diffie-Hellman protocol. This paper proposes the ECDLP-based Diffie-Hellman protocols for secure group communication and evaluates their performance on wireless ad hoc networks. The proposed schemes are compared at the same security level with DLP-based group protocols under different channel conditions. Our experiments and analysis show that the Tree-based Group Elliptic Curve Diffie-Hellman (TGECDH) protocol is the best in …
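The key-tree structure behind tree-based group Diffie-Hellman can be sketched with a toy four-member example. For readability this sketch uses modular exponentiation (the DLP setting); TGECDH performs the same key-tree steps with elliptic-curve point multiplication. The prime, generator, and private keys below are illustrative only and far too small for real use.

```python
# Toy tree-based group Diffie-Hellman over Z_p*.  TGECDH runs the same
# key-tree protocol with elliptic-curve scalar multiplication instead of
# modular exponentiation; all constants here are illustrative.
P = 2**64 - 59          # a prime modulus (too small for real security)
G = 5                   # public base

def dh_pub(priv):
    return pow(G, priv, P)

def dh_shared(priv, other_pub):
    return pow(other_pub, priv, P)

# Four members sit at the leaves of a binary key tree:
#         root
#        /    \
#      k01    k23
#     /  \   /  \
#    m0  m1 m2  m3
privs = [1234567, 7654321, 1111111, 9999999]
pubs = [dh_pub(x) for x in privs]

# Each sibling pair agrees on an intermediate key, which then acts as
# the "private key" of the internal node one level up.
k01 = dh_shared(privs[0], pubs[1])     # m1 computes the identical value
k23 = dh_shared(privs[2], pubs[3])
root_from_left = dh_shared(k01, dh_pub(k23))
root_from_right = dh_shared(k23, dh_pub(k01))
assert root_from_left == root_from_right   # the shared group key
```

Because each level is just another Diffie-Hellman exchange, joins and leaves only rekey the path from the affected leaf to the root, which is the efficiency argument for tree-based schemes such as TGECDH.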


Integrated Intermediate Waveband And Wavelength Switching For Optical WDM Mesh Networks, Mengke Li, Byrav Ramamurthy Jan 2006


CSE Conference and Workshop Papers

As wavelength-division multiplexing (WDM) evolves towards practical applications in optical transport networks, waveband switching (WBS) has been introduced to cut down the operational costs and to reduce the complexities and sizes of network components, e.g., optical cross-connects (OXCs). This paper considers the routing, wavelength assignment and waveband assignment (RWWBA) problem in a WDM network supporting mixed waveband and wavelength switching. First, the techniques supporting waveband switching are studied, where a node architecture enabling mixed waveband and wavelength switching is proposed. Second, to solve the RWWBA problem with reduced switching costs and improved network throughput, the cost savings and call blocking …


A Maximum-Likelihood Approach To Symbolic Indirect Correlation, Ashutosh Joshi, George Nagy, Daniel Lopresti, Sharad C. Seth Jan 2006


CSE Conference and Workshop Papers

Symbolic Indirect Correlation (SIC) is a nonparametric method that offers significant advantages for recognition of ordered unsegmented signals. A previously introduced formulation of SIC based on subgraph-isomorphism requires very large reference sets in the presence of noise. In this paper, we seek to address this issue by formulating SIC classification as a maximum likelihood problem. We present experimental evidence that demonstrates that this new approach is more robust for the problem of online handwriting recognition using noisy input.


On Reoptimizing Multi-Class Classifiers, Kun Deng, Chris Bourke, Stephen Scott, Robert E. Schapire, N. V. Vinodchandran Jan 2006


CSE Technical Reports

Significant changes in the instance distribution or associated cost function of a learning problem require one to reoptimize a previously learned classifier to work under new conditions. We study the problem of reoptimizing a multi-class classifier based on its ROC hypersurface and a matrix describing the costs of each type of prediction error. For a binary classifier, it is straightforward to find an optimal operating point based on its ROC curve and the relative cost of true positive to false positive error. However, the corresponding multi-class problem (finding an optimal operating point based on a ROC hypersurface and cost matrix) …
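The straightforward binary case the abstract mentions can be sketched directly: minimize expected cost over the points of the ROC curve, given the class prior and the two error costs. A hedged sketch; the function and cost model are illustrative, not the report's formulation of the multi-class problem.

```python
def best_operating_point(roc_points, p_pos, c_fp, c_fn):
    """Choose the (fpr, tpr) point on a binary classifier's ROC curve
    that minimizes expected cost, given the positive-class prior p_pos
    and the costs of false positives and false negatives."""
    def expected_cost(fpr, tpr):
        return (1 - p_pos) * c_fp * fpr + p_pos * c_fn * (1 - tpr)
    return min(roc_points, key=lambda pt: expected_cost(*pt))
```

Raising `c_fn` pushes the chosen point toward higher true-positive rates. The multi-class problem studied in the report replaces the curve with a ROC hypersurface and the two costs with a full cost matrix, which is what makes it hard.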


A Cross-Layer Protocol For Wireless Sensor Networks, Ian F. Akyildiz, Mehmet C. Vuran, Özgür B. Akan Jan 2006


CSE Conference and Workshop Papers

Severe energy constraints of battery-powered sensor nodes necessitate energy-efficient communication protocols in order to fulfill the application objectives of wireless sensor networks (WSN). However, the vast majority of existing solutions are based on the classical layered-protocols approach. It is much more resource-efficient to have a unified scheme that merges common protocol-layer functionalities into a cross-layer module for resource-constrained sensor nodes. To the best of our knowledge, to date, there is no unified cross-layer communication protocol for efficient and reliable event communication that considers transport, routing, and medium access functionalities together with physical layer (wireless channel) effects for WSNs.
In this paper, …


An Interactive Constraint-Based Approach To Minesweeper, Ken Bayer, Josh Snyder, Berthe Y. Choueiry Jan 2006


CSE Conference and Workshop Papers

We present a Java applet that uses Constraint Processing (CP) to assist a human in playing the popular game Minesweeper. Our goal is to illustrate the power of CP techniques to model and solve combinatorial problems in a context accessible to the general public.

Minesweeper is a video game that has been included with Microsoft Windows since 1989. In this game, the player is presented with a grid of squares. Each of these squares may conceal a mine. When the player clicks on a square, it is revealed. If the square is a mine, the game is over. If the …
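The constraint model is direct: each revealed number constrains the sum of mine indicators over its unrevealed neighbors. A brute-force sketch over a small frontier (a real CP solver would propagate constraints instead of enumerating; all names here are illustrative):

```python
from itertools import product

def solve_frontier(cells, constraints):
    """cells: unrevealed frontier squares.  constraints: (count, squares)
    pairs, one per revealed number, requiring the mines among `squares`
    to sum to `count`.  Returns, per cell, the set of values it takes
    across all satisfying assignments (0 = safe, 1 = mine)."""
    feasible = {c: set() for c in cells}
    for bits in product((0, 1), repeat=len(cells)):
        assign = dict(zip(cells, bits))
        if all(sum(assign[c] for c in squares) == count
               for count, squares in constraints):
            for c, v in assign.items():
                feasible[c].add(v)
    return feasible
```

A cell whose feasible set is {0} is provably safe, and one whose set is {1} is provably a mine; for example, the classic 1-2-1 row pattern resolves to mine, safe, mine.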


Automated Generation Of Context-Aware Tests, Zhimin Wang, Sebastian Elbaum, David Rosenblum Jan 2006


CSE Technical Reports

The incorporation of context-awareness capabilities into pervasive applications allows them to leverage contextual information to provide additional services while maintaining an acceptable quality of service. These added capabilities, however, introduce a distinct input space that can affect the behavior of these applications at any point during their execution, making their validation quite challenging. In this paper, we introduce an approach to improve the test suite of a context-aware application by identifying context-aware program points where context changes may affect the application’s behavior, and by systematically manipulating the context data fed into the application to increase its exposure to potentially valuable …


Web Application Characterization Through Directed Requests, Sebastian Elbaum, Kalyanram Chilakamarri, Marc Randall Fisher II, Gregg Rothermel Jan 2006


CSE Technical Reports

Web applications are increasingly prominent in society, serving a wide variety of user needs. Engineers seeking to enhance, test, and maintain these applications must be able to understand and characterize their interfaces. Third-party programmers (professional or end user) wishing to incorporate the data provided by such services into their own applications would also benefit from such characterization when the target site does not provide adequate programmatic interfaces. In this paper, therefore, we present methodologies for characterizing the interfaces to web applications through a form of dynamic analysis, in which directed requests are sent to the application, and responses are analyzed …


An Ontology-Based Metamodel For Software Patterns, Scott Henninger, Padmapriya Ashokkumar Jan 2006


CSE Technical Reports

Patterns have been successfully used in software design to reuse proven solutions. But the complex interconnections among patterns and the growing number of pattern collections are becoming a barrier to identifying relevant patterns and pattern combinations for a given design context. More formal representations of patterns are needed that allow machine processing and the creation of systematic pattern languages that guide composition of patterns into coherent design solutions. In this paper, we present a technique based on Description Logic and Semantic Web technologies to address these problems. A metamodel is presented for developing pattern languages using this technology. Usability patterns are used to …


Spatio-Temporal Characteristics Of Point And Field Sources In Wireless Sensor Networks, Mehmet C. Vuran, Özgür B. Akan Jan 2006


CSE Conference and Workshop Papers

Wireless Sensor Networks (WSN) are comprised of densely deployed sensor nodes collaboratively observing and communicating extracted information about a physical phenomenon. Dense deployment of sensor nodes makes the sensor observations highly correlated in the space domain. In addition, consecutive samples obtained by a sensor node are also temporally correlated for the applications involving the observation of the variation of a physical phenomenon. Based on the physical characteristics and dispersion pattern over the area, the phenomenon to be observed can be modeled as point source or field source. Clearly, understanding the spatio-temporal correlation characteristics of the point and field sources brings …