Open Access. Powered by Scholars. Published by Universities.®

Computer Sciences

Articles 1 - 30 of 891

Full-Text Articles in Physical Sciences and Mathematics

The 4 X 4 Semantic Model: Exploiting Data, Functional, Non-Functional And Execution Semantics Across Business Process, Workflow, Partner Services And Middleware Services Tiers, Amit P. Sheth, Karthik Gomadam Dec 2008

Kno.e.sis Publications

Business processes in the global environment increasingly encompass multiple partners and complex, rapidly changing requirements. In this context it is critical that strategic business objectives align with and map accurately to systems that support flexible and dynamic business processes. To support the demanding requirements of global business processes, we propose a comprehensive, unifying 4 X 4 Semantic Model that uses Semantic Templates to link four tiers of implementation with four types of semantics. The four tiers are the Business Process Tier, the Workflow Enactment Tier, the Partner Services Tier, and the Middleware Services Tier. The four types of semantics are …


Topological Structures In The Equities Market Network, Gregory Leibon, Scott Pauls, Daniel Rockmore, Robert Savell Dec 2008

Dartmouth Scholarship

We present a new method for articulating scale-dependent topological descriptions of the network structure inherent in many complex systems. The technique is based on “partition decoupled null models,” a new class of null models that incorporate the interaction of clustered partitions into a random model and generalize the Gaussian ensemble. As an application, we analyze a correlation matrix derived from 4 years of close prices of equities in the New York Stock Exchange (NYSE) and National Association of Securities Dealers Automated Quotation (NASDAQ). In this example, we expose (i) a natural structure composed of 2 interacting partitions of …


Energy-Efficient Peer-To-Peer Caching And Mobility Management In 4g Hybrid Networks, Mehdi Azami, Bharat Bhargava Dec 2008

Department of Computer Science Technical Reports

No abstract provided.


Text Mining In Radiology Reports, Tianxia Gong, Chew Lim Tan, Tze-Yun Leong, Cheng Kiang Lee, Boon Chuan Pang, C. C. Tchoyoson Lim, Qi Tian, Suisheng Tang, Zhuo Zhang Dec 2008

Research Collection School Of Computing and Information Systems

Medical text mining has gained increasing interest in recent years. Radiology reports contain rich information describing radiologists' observations of the patient's medical conditions in the associated medical images. However, as most reports are in free-text format, the valuable information contained in those reports cannot be easily accessed and used unless proper text mining has been applied. In this paper we propose a text mining system to extract and use the information in radiology reports. The system consists of three main modules: a medical finding extractor, a report and image retriever, and a text-assisted image feature extractor. In evaluation, the …


A Trust-Based Secure Service Discovery (Tssd) Model For Pervasive Computing, Sheikh Iqbal Ahamed, Moushumi Sharmin Dec 2008

Mathematics, Statistics and Computer Science Faculty Research and Publications

To cope with the challenges posed by device capacity and capability, and also the nature of ad hoc networks, a service discovery model is needed that can resolve security and privacy issues with simple solutions. The use of complex algorithms and powerful fixed infrastructure is infeasible due to the volatile nature of pervasive environments and tiny pervasive devices. In this paper, we present a trust-based secure service discovery model, TSSD (trust-based secure service discovery), for a truly pervasive environment. Our model is a hybrid one that allows both secure and non-secure discovery of services. This model allows service discovery and …


Archaeology Via Underwater Robots: Mapping And Localization Within Maltese Cistern Systems, Christopher M. Clark, Christopher S. Olstad, Keith Buhagiar, Timmy Gambin Dec 2008

Computer Science and Software Engineering

This paper documents the application of several underwater robot mapping and localization techniques used during an archaeological expedition. The goal of this project was to explore and map ancient cisterns located on the islands of Malta and Gozo. The cisterns of interest acted as water storage systems for fortresses, private homes, and churches. They often consisted of several connected chambers, still containing water. A sonar-equipped Remotely Operated Vehicle (ROV) was deployed into these cisterns to obtain both video footage and sonar range measurements. Four different mapping and localization techniques were employed including 1) Sonar image mosaics using stationary sonar scans, …


Semantic Sensor Web, Amit P. Sheth, Cory Henson, Krishnaprasad Thirunarayan Dec 2008

Kno.e.sis Publications

No abstract provided.


Aesthetic Journeys, Johanna Brewer, Scott Mainwaring, Paul Dourish Dec 2008

Computer Science: Faculty Publications

Researchers and designers are increasingly creating technologies intended to support urban mobility. However, the question of what mobility is remains largely under-examined. In this paper we will use the notion of aesthetic journeys to reconsider the relationship between urban spaces, people and technologies. Fieldwork on the Orange County bus system and in the London Underground leads to a discussion of how we might begin to design for multiple mobilities.


An Online Learning Approach To Community Building Among Asian Journalists, Ma. Mercedes T. Rodrigo, Violet B. Valdez Dec 2008

Department of Information Systems & Computer Science Faculty Publications

This chapter describes a master's program in journalism designed for professional Asian journalists which has drawn students from 13 Asian countries and is run by faculty members from five countries. The program uses blended learning methods combining synchronous, asynchronous, and classroom-based approaches. An exploratory study was conducted to describe the strategies used by the students and teachers to build a community of learners (Garrison, Anderson, & Archer, 2000) and hence achieve the program's learning goals. The study took into consideration cultural differences, in particular, those referring to educational experiences. Results show that the respondents tended to use the strategies of …


Analyzing Rigidity With Pebble Games, Audrey Lee, Ileana Streinu, Louis Theran Dec 2008

Computer Science: Faculty Publications

How many pair-wise distances must be prescribed between an unknown set of points, and how should they be distributed, to determine only a discrete set of possible solutions? These questions, and related generalizations, are central in a variety of applications. Combinatorial rigidity shows that in two-dimensions one can get the answer, generically, via an efficiently testable sparse graph property. We present a video and a web site illustrating algorithmic results for a variety of rigidity-related problems, as well as abstract generalizations. Our accompanying interactive software is based on a comprehensive implementation of the pebble game paradigm.


Combinatorial Genericity And Minimal Rigidity, Ileana Streinu, Louis Theran Dec 2008

Computer Science: Faculty Publications

A well-studied geometric problem, with applications ranging from molecular structure determination to sensor networks, asks for the reconstruction of a set P of n unknown points from a finite set of pairwise distances (up to Euclidean isometries). We are concerned here with a related problem: which sets of distances are minimal with the property that they allow for the reconstruction of P, up to a finite set of possibilities? In the planar case, the answer is known generically via the landmark Maxwell-Laman Theorem from Rigidity Theory, and it leads to a combinatorial answer: the underlying structure of such a …


Unfolding Convex Polyhedra Via Quasigeodesic Star Unfoldings, Jin-Ichi Itoh, Joseph O'Rourke, Costin Vîlcu Dec 2008

Computer Science: Faculty Publications

We extend the notion of a star unfolding to be based on a simple quasigeodesic loop Q rather than on a point. This gives a new general method to unfold the surface of any convex polyhedron P to a simple, planar polygon: shortest paths from all vertices of P to Q are cut, and all but one segment of Q is cut.


A Survey Of Transfer Learning Methods For Reinforcement Learning, Nicholas Bone Dec 2008

Computer Science Graduate and Undergraduate Student Scholarship

Transfer Learning (TL) is the branch of Machine Learning concerned with improving performance on a target task by leveraging knowledge from a related (and usually already learned) source task. TL is potentially applicable to any learning task, but in this survey we consider TL in a Reinforcement Learning (RL) context. TL is inspired by psychology; humans constantly apply previous knowledge to new tasks, but such transfer has traditionally been very difficult for—or ignored by—machine learning applications. The goals of TL are to facilitate faster and better learning of new tasks by applying past experience where appropriate, and to enable autonomous …


Decision Tree Ensemble: Small Heterogeneous Is Better Than Large Homogeneous, Mike Gashler, Christophe G. Giraud-Carrier, Tony R. Martinez Dec 2008

Faculty Publications

Using decision trees that split on randomly selected attributes is one way to increase the diversity within an ensemble of decision trees. Another approach increases diversity by combining multiple tree algorithms. The random forest approach has become popular because it is simple and yields good results with common datasets. We present a technique that combines heterogeneous tree algorithms and contrast it with homogeneous forest algorithms. Our results indicate that random forests do poorly when faced with irrelevant attributes, while our heterogeneous technique handles them robustly. Further, we show that large ensembles of random trees are more susceptible to diminishing returns …
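
The abstract does not spell out how predictions from heterogeneous trees are combined; as a minimal sketch, assuming simple majority voting over hand-built decision stumps (all stump rules and data below are hypothetical, for illustration only), the combination step might look like:

```python
from collections import Counter

def vote(classifiers, x):
    """Majority vote over a heterogeneous set of classifiers.

    Each classifier is any callable mapping a feature vector to a
    class label; mixing different tree algorithms (or any models)
    is what gives a heterogeneous ensemble its diversity.
    """
    predictions = [clf(x) for clf in classifiers]
    return Counter(predictions).most_common(1)[0][0]

# Toy "heterogeneous" ensemble: three hand-built stumps splitting
# on different attributes (hypothetical rules, not from the paper).
stump_a = lambda x: "pos" if x[0] > 0.5 else "neg"
stump_b = lambda x: "pos" if x[1] > 0.3 else "neg"
stump_c = lambda x: "neg" if x[2] > 0.9 else "pos"

print(vote([stump_a, stump_b, stump_c], (0.7, 0.1, 0.2)))  # 'pos' (2 of 3 agree)
```

A robustness check against irrelevant attributes, as in the paper, would then compare such a mixed ensemble against a same-size forest built from a single randomized algorithm.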


Learning-Based Fusion For Data Deduplication, Sabra Dinerstein, Parris K. Egbert, Stephen W. Clyde, Jared Dinerstein Dec 2008

Faculty Publications

Rule-based deduplication utilizes expert domain knowledge to identify and remove duplicate data records. Achieving high accuracy in a rule-based system requires the creation of rules containing a good combination of discriminatory clues. Unfortunately, accurate rule-based deduplication often requires significant manual tuning of both the rules and the corresponding thresholds. This need for manual tuning reduces the efficacy of rule-based deduplication and its applicability to real-world data sets. No adequate solution exists for this problem. We propose a novel technique for rule-based deduplication. We apply individual deduplication rules, and combine the resultant match scores via learning-based information fusion. We show empirically …


Capturing Workflow Event Data For Monitoring, Performance Analysis, And Management Of Scientific Workflows, Matthew Valerio, Satya S. Sahoo, Roger Barga, Jared Jackson Dec 2008

Kno.e.sis Publications

To effectively support real-time monitoring and performance analysis of scientific workflow execution, varying levels of event data must be captured and made available to interested parties. This paper discusses the creation of an ontology-aware workflow monitoring system for use in the Trident system which utilizes a distributed publish/subscribe event model. The implementation of the publish/subscribe system is discussed and performance results are presented.


Nowhere To Hide: Finding Plagiarized Documents Based On Sentence Similarity, Nathaniel Gustafson, Yiu-Kai D. Ng, Maria Soledad Pera Dec 2008

Faculty Publications

Plagiarism is a serious problem that infringes on copyrighted documents and materials; it is an unethical practice that decreases the economic incentive received by the authors (owners) of the original copies. Unfortunately, plagiarism is getting worse due to the increasing number of online publications on the Web, which facilitates locating and paraphrasing information. To solve this problem, we propose a novel plagiarism-detection method, called SimPaD, which (i) establishes the degree of resemblance between any two documents D1 and D2 based on their sentence-to-sentence similarity computed using pre-defined word-correlation factors, and (ii) generates a graphical view of sentences that are similar (or the same) …
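
The word-correlation factors themselves are not given in this abstract; as an illustrative sketch that substitutes plain Jaccard word overlap for those factors (names and thresholds below are assumptions, not SimPaD's actual values), the sentence-to-sentence resemblance idea can be expressed as:

```python
def sentence_similarity(s1, s2):
    """Jaccard word overlap between two sentences: a simplified
    stand-in for SimPaD's pre-defined word-correlation factors."""
    w1, w2 = set(s1.lower().split()), set(s2.lower().split())
    if not (w1 or w2):
        return 0.0
    return len(w1 & w2) / len(w1 | w2)

def resemblance(doc1, doc2, threshold=0.5):
    """Fraction of sentences in doc1 with a close match in doc2
    (documents represented as lists of sentences)."""
    matched = sum(
        1 for s1 in doc1
        if any(sentence_similarity(s1, s2) >= threshold for s2 in doc2)
    )
    return matched / len(doc1) if doc1 else 0.0

d1 = ["the quick brown fox", "an unrelated sentence here"]
d2 = ["the quick brown fox jumps", "something else entirely"]
print(resemblance(d1, d2))  # 0.5: one of the two sentences matches closely
```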


Nymble: Blocking Misbehaving Users In Anonymizing Networks, Patrick P. Tsang, Apu Kapadia, Cory Cornelius, Sean W. Smith Dec 2008

Computer Science Technical Reports

Anonymizing networks such as Tor allow users to access Internet services privately by using a series of routers to hide the client's IP address from the server. The success of such networks, however, has been limited by users employing this anonymity for abusive purposes such as defacing popular websites. Website administrators routinely rely on IP-address blocking for disabling access to misbehaving users, but blocking IP addresses is not practical if the abuser routes through an anonymizing network. As a result, administrators block \emph{all} known exit nodes of anonymizing networks, denying anonymous access to misbehaving and behaving users alike. To address …


Direct Extraction Of Normal Mapped Meshes From Volume Data, Mark Barry, Zoë J. Wood Dec 2008

Computer Science and Software Engineering

We describe a method of directly extracting a simplified contour surface along with detailed normal maps from volume data in one fast and integrated process. A robust dual contouring algorithm is used for efficiently extracting a high-quality "crack-free" simplified surface from volume data. As each polygon is generated, the normal map is simultaneously generated. An underlying octree data structure reduces the search space required for high to low resolution surface normal mapping. The process quickly yields simplified meshes fitted with normal maps that accurately resemble their complex equivalents.


Calibrating Function Point Backfiring Conversion Ratios Using Neuro-Fuzzy Technique, Justin Wong, Luiz Fernando Capretz, Danny Ho Dec 2008

Electrical and Computer Engineering Publications

Software estimation is an important aspect in software development projects because poor estimations can lead to late delivery, cost overruns, and possibly project failure. Backfiring is a popular technique for sizing and predicting the volume of source code by converting the function point metric into source lines of code mathematically using conversion ratios. While this technique is popular and useful, there is a high margin of error in backfiring. This research introduces a new method to reduce that margin of error. Neural networks and fuzzy logic in software prediction models have been demonstrated in the past to have improved performance …


An Analysis Of Entries In The First Tac Market Design Competition, Jinzhong Niu, Kai Cai, Peter Mcburney, Simon Parsons Dec 2008

Publications and Research

This paper presents an analysis of entries in the first TAC Market Design Competition final that compares the entries across several scenarios. The analysis complements previous work analyzing the 2007 competition, demonstrating some vulnerabilities of entries that placed highly in the competition. The paper also suggests a simple strategy that would have performed well.


Functional Monitoring Without Monotonicity, Chrisil Arackaparambil, Joshua Brody, Amit Chakrabarti Dec 2008

Computer Science Technical Reports

The notion of distributed functional monitoring was recently introduced by Cormode, Muthukrishnan and Yi to initiate a formal study of the communication cost of certain fundamental problems arising in distributed systems, especially sensor networks. In this model, each of k sites reads a stream of tokens and is in communication with a central coordinator, who wishes to continuously monitor some function f of \sigma, the union of the k streams. The goal is to minimize the number of bits communicated by a protocol that correctly monitors f(\sigma), to within some small error. As in previous work, we focus on a …


Digital Image Ballistics From Jpeg Quantization: A Followup Study, Hany Farid Dec 2008

Computer Science Technical Reports

The lossy JPEG compression scheme employs a quantization table that controls the amount of compression achieved. Because different cameras typically employ different tables, a comparison of an image's quantization scheme to a database of known cameras affords a simple technique for confirming or denying an image's source. This report describes the analysis of quantization tables extracted from 1,000,000 images downloaded from Flickr.com.
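
As a toy sketch of the database-matching step (the tables and camera names below are hypothetical placeholders, not data from the report), source identification by quantization table amounts to a lookup:

```python
# Compare an image's JPEG quantization table against a database of
# tables known to be used by specific cameras. Real tables are 8x8
# (64 entries); these truncated 4-entry tuples are placeholders.
KNOWN_TABLES = {
    (16, 11, 10, 16): "CameraBrand A (hypothetical)",
    (8, 6, 5, 8): "CameraBrand B (hypothetical)",
}

def identify_source(quant_table):
    """Return camera models whose known table matches exactly;
    an empty list neither confirms nor denies any source."""
    return [cam for table, cam in KNOWN_TABLES.items() if table == quant_table]

print(identify_source((16, 11, 10, 16)))  # ['CameraBrand A (hypothetical)']
print(identify_source((99, 99, 99, 99)))  # [] -- no match in the database
```

Since distinct camera models can share a table, a match confirms consistency with a claimed source rather than proving it, which is why the report surveys table frequencies across a large image corpus.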


Group-Aware Stream Filtering For Bandwidth-Efficient Data Dissemination, Ming Li, David Kotz Dec 2008

Dartmouth Scholarship

In this paper we are concerned with disseminating high-volume data streams to many simultaneous applications over a low-bandwidth wireless mesh network. For bandwidth efficiency, we propose a group-aware stream filtering approach, used in conjunction with multicasting, that exploits two overlooked, yet important, properties of these applications: 1) many applications can tolerate some degree of “slack” in their data quality requirements, and 2) there may exist multiple subsets of the source data satisfying the quality needs of an application. We can thus choose the “best alternative” subset for each application to maximize the data overlap within the group to best benefit …


A Performance And Productivity Study Using Mpi, Titanium, And Fortress, Amy Apon, Chris Bryan, Wesley Emeneker Dec 2008

Publications

The popularity of cluster computing has increased focus on usability, especially in the area of programmability. Languages and libraries that require explicit message passing have been the standard. New languages, designed for cluster computing, are coming to the forefront as a way to simplify parallel programming. Titanium and Fortress are examples of this new class of programming paradigms. This paper presents results from a productivity study of these two newcomers with MPI, the de facto standard for parallel programming.


Head-Pose Tracking With A Time-Of-Flight Camera, Simon Meers, Koren Ward Dec 2008

Faculty of Informatics - Papers (Archive)

Intelligent interfaces that make use of the user's head pose or facial features to interpret the user's identity or point of attention are finding increasing application in numerous fields. Although various techniques exist to passively track the user's gaze or head pose using monocular or stereo cameras, these systems generally cannot perceive in detail the characteristic three-dimensional (3D) profile of the user's head or face. Time-of-flight cameras, such as the Swiss Ranger SR-3000, are a recent innovation capable of providing three-dimensional image data from a single sensor. The advent of such sensors opens up new possibilities in the …


Protecting Critical Infrastructure With Games Technology, Adrian Boeing, Martin Masek, Bill Bailey Dec 2008

Australian Information Warfare and Security Conference

It is widely recognised that there is a considerable gap in the protection of the national infrastructure. Identifying what is in fact ‘critical' is proving to be very difficult as threats constantly evolve. An interactive prototyping tool is useful for playing out scenarios and simulating the effect of change; however, existing simulators in the critical infrastructure area are typically limited in their visual representation and interactivity. To remedy this we propose the use of games technology. Through its use, critical infrastructure scenarios can be rapidly constructed, tested, and refined. In this paper, we highlight the features of games …


A Holistic Scada Security Standard For The Australian Context, Christopher Beggs Dec 2008

Australian Information Warfare and Security Conference

Supervisory Control and Data Acquisition (SCADA) systems which control Australia’s critical infrastructure are currently demonstrating signs of vulnerabilities as they are being interconnected to corporate networks, essentially exposing them to malicious threats. This paper discusses the vulnerabilities associated with SCADA systems, as well as discussing various SCADA standards and initiatives that have been developed in recent years to mitigate such threats. The paper presents the requirement for a holistic SCADA security standard that is practical and feasible for each SCADA industry sector.


Visualisation Of Critical Infrastructure Failure, W D. Wilde, M J. Warren Dec 2008

Australian Information Warfare and Security Conference

The paper explores the complexity of critical infrastructure and critical infrastructure failure (CIF); real-life examples are used to discuss the complexity involved. The paper then discusses what visualisation is and how visualisation can be applied to a security situation, in particular to critical infrastructure. The paper concludes by discussing the future direction of the research.


Media, Government And Manipulation: The Cases Of The Two Gulf Wars, William Hutchinson Dec 2008

Australian Information Warfare and Security Conference

This paper explores the bias and manipulation of the Western mass media during the Gulf wars of 1991 and 2003. The tactics of compliance and the ethics of the press and journalists are examined. The need for a pluralist press is extolled.