
Physical Sciences and Mathematics Commons


2001

Computer Sciences

Articles 1 - 30 of 283

Full-Text Articles in Physical Sciences and Mathematics

Open Source Software: A History, David Bretthauer Dec 2001

Published Works

In the 30 years from 1970-2000, open source software began as an assumption without a name or a clear alternative. It has evolved into a sophisticated movement which has produced some of the most stable and widely used software packages ever produced. This paper traces the evolution of three operating systems: GNU, BSD, and Linux, as well as the communities which have evolved with these systems and some of the commonly-used software packages developed using the open source model. It also discusses some of the major figures in open source software, and defines both “free software” and “open source software.”


Deploying Integrated Web-Based Spatial Applications Within An Oracle Database Environment, James Carswell Dec 2001

Conference papers

In this paper, we describe the architectural and functional characteristics of e-Spatial™ technology, comprising an innovative software package that represents a timely alternative to traditional and complex proprietary GIS application packages. The two main components of the package, developed by e-Spatial Solutions, are the iSMART™ database development technology and the i-Spatial™ Information Server (iSIS), both implemented within an Oracle 9i Spatial database environment. This technology allows users to build and deploy spatially enabled or standard Internet applications without requiring any application-specific source code. It can be deployed on any Oracle-supported hardware platform and on any device that supports …


Mobile-Agent Versus Client/Server Performance: Scalability In An Information-Retrieval Task, Robert S. Gray, David Kotz, Ronald A. Peterson, Joyce Barton, Daria Chacon, Peter Gerken, Martin Hofmann, Jeffrey Bradshaw, Maggie Breedy, Renia Jeffers, Niranjan Suri Dec 2001

Dartmouth Scholarship

Building applications with mobile agents often reduces the bandwidth required for the application, and improves performance. The cost is increased server workload. There are, however, few studies of the scalability of mobile-agent systems. We present scalability experiments that compare four mobile-agent platforms with a traditional client/server approach. The four mobile-agent platforms have similar behavior, but their absolute performance varies with underlying implementation choices. Our experiments demonstrate the complex interaction between environmental, application, and system parameters.


Fast Implementation Of Depth Contours Using Topological Sweep, Kim Miller, Suneeta Ramaswami, Peter Rousseeuw, Toni Sellarès, Diane Souvaine, Ileana Streinu, Anja Struyf Dec 2001

Computer Science: Faculty Publications

The concept of location depth was introduced in statistics as a way to extend the univariate notion of ranking to a bivariate configuration of data points. It has been used successfully for robust estimation, hypothesis testing, and graphical display. These require the computation of depth regions, which form a collection of nested polygons. The center of the deepest region is called the Tukey median. The only available implemented algorithms for the depth contours and the Tukey median are slow, which limits their usefulness. In this paper we describe an optimal algorithm which computes all depth contours in O(n²) time …
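
As a pointer to the underlying notion, the sketch below (illustrative only, and not the paper's topological-sweep algorithm) approximates the location depth of a single point by brute force: the depth is the smallest number of data points in any closed halfplane whose boundary passes through the point, so sampling many boundary directions and taking the minimum count gives a usable estimate. The function and parameter names are our own.

    import numpy as np

    def location_depth(theta, points, n_dirs=3600):
        """Approximate the halfspace (location) depth of `theta` w.r.t. `points`:
        the smallest number of points in any closed halfplane whose boundary
        passes through `theta`.  Brute-force direction sampling; the paper's
        topological sweep computes all depth contours exactly in O(n^2) time."""
        pts = np.asarray(points, dtype=float) - np.asarray(theta, dtype=float)
        angles = np.linspace(0.0, 2 * np.pi, n_dirs, endpoint=False)
        dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # candidate halfplane normals
        counts = (pts @ dirs.T >= 0).sum(axis=0)                    # points per closed halfplane
        return int(counts.min())

    rng = np.random.default_rng(0)
    cloud = rng.normal(size=(200, 2))
    print(location_depth(cloud.mean(axis=0), cloud))   # deep point, near the Tukey median
    print(location_depth([5.0, 5.0], cloud))           # an outlier has depth 0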


On Detecting Service Violations And Bandwidth Theft In Qos Network Domains, Ahsan Habib, Sonia Fahmy, Srinivas R. Avasarala, Venkatesh Prabhakar, Bharat Bhargava Dec 2001

Department of Computer Science Technical Reports

No abstract provided.


Dual Heuristic Programming For Fuzzy Control, George G. Lendaris, Thaddeus T. Shannon, Larry J. Schultz, Steven Hutsell, Alec Rogers Dec 2001

Systems Science Faculty Publications and Presentations

Overview material for the Special Session (Tuning Fuzzy Controllers Using Adaptive Critic Based Approximate Dynamic Programming) is provided. The Dual Heuristic Programming (DHP) method of Approximate Dynamic Programming is described and used to design a fuzzy control system. DHP and related techniques have been developed in the neurocontrol context but can be equally productive when used with fuzzy controllers or neuro-fuzzy hybrids. This technique is demonstrated by designing a temperature controller for a simple water bath system. In this example, we take advantage of the TSK model framework to initialize the tunable parameters of our plant model with reasonable …


Meeting Medical Terminology Needs: The Ontology-Enhanced Medical Concept Mapper, Gondy Leroy, Hsinchun Chen Dec 2001

CGU Faculty Publications and Research

This paper describes the development and testing of the Medical Concept Mapper, a tool designed to facilitate access to online medical information sources by providing users with appropriate medical search terms for their personal queries. Our system is valuable for patients whose knowledge of medical vocabularies is inadequate to find the desired information, and for medical experts who search for information outside their field of expertise. The Medical Concept Mapper maps synonyms and semantically related concepts to a user's query. The system is unique because it integrates our natural language processing tool, i.e., the Arizona (AZ) Noun Phraser, with human-created …


Image Magnification Using Level-Set Reconstruction, Bryan S. Morse, Duane Schwartzwald Dec 2001

Faculty Publications

Image magnification is a common problem in imaging applications, requiring interpolation to “read between the pixels”. Although many magnification/interpolation algorithms have been proposed in the literature, all methods must suffer to some degree the effects of imperfect reconstruction: false high-frequency content introduced by the underlying original sampling. Most often, these effects manifest themselves as jagged contours in the image. This paper presents a method for constrained smoothing of such artifacts that attempts to produce smooth reconstructions of the image’s level curves while still maintaining image fidelity. This is similar to other iterative reconstruction algorithms and to Bayesian restoration techniques, but instead …
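
A minimal, unconstrained version of the level-curve smoothing ingredient is mean-curvature flow on the magnified intensity image, I_t = κ|∇I|. The sketch below is our own illustration (not the paper's constrained reconstruction, which additionally enforces fidelity to the original samples): upsample first, then run a few explicit finite-difference iterations. Boundaries simply wrap.

    import numpy as np

    def curvature_smooth(img, n_iter=50, dt=0.1, eps=1e-8):
        """Smooth an image's level curves with mean-curvature flow,
        I_t = kappa * |grad I|, via explicit finite differences."""
        I = img.astype(float).copy()
        for _ in range(n_iter):
            Ix  = (np.roll(I, -1, 1) - np.roll(I, 1, 1)) / 2.0
            Iy  = (np.roll(I, -1, 0) - np.roll(I, 1, 0)) / 2.0
            Ixx = np.roll(I, -1, 1) - 2 * I + np.roll(I, 1, 1)
            Iyy = np.roll(I, -1, 0) - 2 * I + np.roll(I, 1, 0)
            Ixy = (np.roll(np.roll(I, -1, 0), -1, 1) - np.roll(np.roll(I, -1, 0), 1, 1)
                   - np.roll(np.roll(I, 1, 0), -1, 1) + np.roll(np.roll(I, 1, 0), 1, 1)) / 4.0
            # kappa * |grad I| expressed in image derivatives
            num = Ixx * Iy**2 - 2 * Ix * Iy * Ixy + Iyy * Ix**2
            I += dt * num / (Ix**2 + Iy**2 + eps)
        return I

    # Typical use: magnify naively first, then smooth the jagged level curves.
    small = np.random.rand(32, 32)
    big = np.kron(small, np.ones((4, 4)))      # naive 4x pixel-replication magnification
    smooth = curvature_smooth(big, n_iter=20)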


Fast Focal Length Solution In Partial Panoramic Image Stitching, William A. Barrett, Kirk L. Duffin Dec 2001

Faculty Publications

Accurate estimation of effective camera focal length is crucial to the success of panoramic image stitching. Fast techniques for estimating the focal length exist, but are dependent upon a close initial approximation or the existence of a full circle panoramic image sequence. Numerical solutions of the focal length demonstrate strong coupling between the focal length and the angles used to position each component image about the common spherical center. This paper demonstrates that parameterizing panoramic image positions using spherical arc length instead of angles effectively decouples the focal length from the image position. This new parameterization does not require an …
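
The coupling the abstract refers to can be seen from the basic pinhole relations: for a camera rotating about its optical centre, a feature near the image centre shifts by roughly f·tan(Δθ) pixels between two views, so any error in the assumed inter-image angle feeds directly into the focal-length estimate. The tiny sketch below is our own illustration of those two standard relations, not the paper's solver.

    import math

    def focal_from_rotation(dx_pixels, dtheta_radians):
        """Focal length (in pixels) from the pixel shift of a near-centre feature
        between two views of a camera rotated by dtheta: f ~ dx / tan(dtheta)."""
        return dx_pixels / math.tan(dtheta_radians)

    def focal_from_fov(image_width_pixels, hfov_radians):
        """Equivalent relation between focal length and horizontal field of view."""
        return image_width_pixels / (2.0 * math.tan(hfov_radians / 2.0))

    # e.g. a 1024-pixel-wide image spanning a 40-degree horizontal field of view
    print(focal_from_fov(1024, math.radians(40)))   # ~1407 pixels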


Houghing The Hough: Peak Collection For Detection Of Corners, Junctions And Line Intersections, William A. Barrett, Kevin D. Petersen Dec 2001

Faculty Publications

We exploit the Accumulator Array of the Hough Transform by finding collections of (2 or more) peaks through which a given sinusoid will pass. Such sinusoids identify points in the original image where lines intersect. Peak collection (or line aggregation) is performed by making a second pass through the edge map, but instead of laying points down in the accumulator array (as with the original Hough Transform), we compute the line integral over each sinusoid that corresponds to the current edge point. If a sinusoid passes through two or more peaks, we deposit that sum/integral into a …
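
A hedged sketch of the two-pass idea as we read it: build the usual (θ, ρ) accumulator, mark peak cells, then re-walk each edge point's sinusoid, summing the accumulator over the peak cells it crosses; points whose sinusoids cross at least two peaks are candidate corners, junctions, or line intersections. The array layout, peak threshold, and min_peaks parameter are our own choices.

    import numpy as np

    def hough_peak_collection(edge_points, img_shape, n_theta=180,
                              peak_frac=0.5, min_peaks=2):
        h, w = img_shape
        diag = int(np.ceil(np.hypot(h, w)))
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        cos_t, sin_t = np.cos(thetas), np.sin(thetas)

        # Pass 1: standard Hough accumulation over (theta, rho).
        acc = np.zeros((n_theta, 2 * diag + 1), dtype=np.int32)
        for (x, y) in edge_points:
            rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
            acc[np.arange(n_theta), rhos] += 1

        # Crude peak map: cells above a fraction of the global maximum.
        peaks = acc >= peak_frac * acc.max()

        # Pass 2: integrate the accumulator along each edge point's sinusoid,
        # restricted to peak cells, and keep points hitting enough peaks.
        corners = []
        for (x, y) in edge_points:
            rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
            hit = peaks[np.arange(n_theta), rhos]
            if hit.sum() >= min_peaks:
                corners.append((x, y, int(acc[np.arange(n_theta), rhos][hit].sum())))
        return corners

    # Tiny example: two synthetic lines crossing at (50, 50).
    edges = [(x, 50) for x in range(100)] + [(50, y) for y in range(100)]
    print(hough_peak_collection(edges, (100, 100)))   # should report the crossing point (50, 50)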


Automated Online News Classification With Personalization, Chee-Hong Chan, Aixin Sun, Ee Peng Lim Dec 2001

Research Collection School Of Computing and Information Systems

In the past, classification of online news has often been done manually. In our proposed Categorizor system, we have experimented with an automated approach to classify online news using the Support Vector Machine (SVM). SVM has been shown to deliver good classification results when ample training documents are given. In our research, we have applied SVM to personalized classification of online news.
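
For readers who want the flavour of the approach with current tooling: the sketch below uses scikit-learn (which postdates the paper and is not the Categorizor system) to train a linear SVM on TF-IDF features for a toy news-category task. The personalization layer described in the paper is not modelled here.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Toy training set: (news text, category label).
    train_docs = [
        "Stocks rally as central bank holds interest rates",
        "Midfielder scores twice in cup final victory",
        "New graphics card doubles performance per watt",
    ]
    train_labels = ["business", "sports", "technology"]

    # TF-IDF features plus a linear SVM is the standard text-classification recipe.
    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
    clf.fit(train_docs, train_labels)

    print(clf.predict(["Chip maker posts record quarterly earnings"]))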


A Confidence Measure For Boundary Detection And Object Selection, William A. Barrett, Eric N. Mortensen Dec 2001

Faculty Publications

We introduce a confidence measure that estimates the assurance that a graph arc (or edge) corresponds to an object boundary in an image. A weighted, planar graph is imposed onto the watershed lines of a gradient magnitude image and the confidence measure is a function of the cost of fixed-length paths emanating from and extending to each end of a graph arc. The confidence measure is applied to automate the detection of object boundaries and thereby reduces (often greatly) the time and effort required for object boundary definition within a user-guided image segmentation environment.


Development Of A Systems Engineering Model Of The Chemical Separations Process: Quarterly Progress Report 8/16/01- 11/15/01, Yitung Chen, Randy Clarksean, Darrell Pepper Nov 2001

Separations Campaign (TRP)

The AAA program is developing technology for the transmutation of nuclear waste to address many of the long-term disposal issues. An integral part of this program is the proposed chemical separations scheme.

Two activities are proposed in this Phase I task: the development of a systems engineering model and the refinement of the Argonne code AMUSE (Argonne Model for Universal Solvent Extraction). The detailed systems engineering model is the start of an integrated approach to the analysis of the materials separations associated with the AAA Program. A second portion of the project is to streamline and improve an integral part …


The Vacuum Buffer, Voicu Popescu Nov 2001

Link Foundation Modeling, Simulation and Training Fellowship Reports

Image-based rendering (IBR) techniques have the potential of alleviating some of the bottlenecks of traditional geometry-based rendering such as modeling difficulty and prohibitive cost of photorealism. One of the most appealing IBR approaches uses images enhanced with per-pixel depth and creates new views by 3D warping (IBRW). Modeling a scene with depth images lets one automatically capture intricate details, which are hard to model conventionally. Also, rendering from such representations has the potential of being efficient since it seems that the number of samples that need to be warped is independent of the scene complexity and is just a fraction …
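
The core IBRW operation mentioned here, warping samples with per-pixel depth into a new view, can be sketched as a generic forward 3D warp with a z-buffer. This is our own illustration (assuming a pinhole intrinsic matrix K and relative pose R, t), and it omits the reconstruction, splat sizing, and occlusion-compatible ordering that a real system needs.

    import numpy as np

    def warp_depth_image(color, depth, K, R, t):
        """Forward-warp an (H, W, 3) colour image with per-pixel depth into the
        view of a second camera with intrinsics K and relative pose (R, t).
        Minimal sketch: nearest-pixel splatting with a z-buffer, no hole filling."""
        h, w = depth.shape
        v, u = np.mgrid[0:h, 0:w]
        pix = np.stack([u.ravel(), v.ravel(), np.ones(h * w)])   # homogeneous pixel coords
        rays = np.linalg.inv(K) @ pix                            # back-project to rays
        X = rays * depth.ravel()                                 # 3D points in source camera frame
        Xc = R @ X + t.reshape(3, 1)                             # points in target camera frame
        p = K @ Xc
        z = p[2]
        valid = z > 1e-6
        uu = np.round(p[0, valid] / z[valid]).astype(int)
        vv = np.round(p[1, valid] / z[valid]).astype(int)
        inside = (uu >= 0) & (uu < w) & (vv >= 0) & (vv < h)
        uu, vv, zz = uu[inside], vv[inside], z[valid][inside]
        src = color.reshape(-1, color.shape[-1])[valid.nonzero()[0][inside]]

        out = np.zeros_like(color)
        zbuf = np.full((h, w), np.inf)
        for i in range(len(zz)):
            if zz[i] < zbuf[vv[i], uu[i]]:       # keep the nearest sample per pixel
                zbuf[vv[i], uu[i]] = zz[i]
                out[vv[i], uu[i]] = src[i]
        return out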


Ecdis Development Laboratory And Navigation Technology Demonstration Center, Lee Alexander, Maxim F. Van Norden, Charles M. Fralick Nov 2001

Center for Coastal and Ocean Mapping

The U.S. Navy is undergoing a major transition from traditional, paper chart navigation to computer-based electronic charting. The Chief of Naval Operations (CNO) has mandated that all Navy ships will navigate strictly through electronic means by FY07. However, due to some recent groundings, the Navy is now striving to accelerate the full implementation of electronic navigation by FY04. The Naval Oceanographic Office (NAVOCEANO) is making a concerted effort to support this transition with upgrades to state-of-the-art survey ships, instrumentation, and data processing equipment. NAVOCEANO is increasing its capability to rapidly collect and process hydrographic survey data, and to quickly produce …


Component-Based Software Development, Luiz Fernando Capretz, Miriam Capretz, Dahai Li Nov 2001

Electrical and Computer Engineering Publications

Component-based software development (CBSD) strives to achieve a set of pre-built, standardized software components available to fit a specific architectural style for some application domain; the application is then assembled using these components. Component-based software reusability will be at the forefront of software development technology in the next few years. This paper describes a software life cycle that supports component-based development under an object-oriented framework. Development time versus software life cycle phases, which is an important assessment of the component-based development model put forward, is also mentioned.


Parameter Synthesis Of Higher Kinematic Planars, Min-Ho Kyung, Elisha Sacks Nov 2001

Department of Computer Science Technical Reports

No abstract provided.


A Round Trip Time And Timeout Aware Traffic Conditioner For Differentiated Services Networks, Ahsan Habib, Bharat Bhargava, Sonia Fahmy Nov 2001

Department of Computer Science Technical Reports

No abstract provided.


Knowledge Discovery In Biological Datasets Using A Hybrid Bayes Classifier/Evolutionary Algorithm, Michael L. Raymer, Leslie A. Kuhn, William F. Punch Nov 2001

Kno.e.sis Publications

A key element of bioinformatics research is the extraction of meaningful information from large experimental data sets. Various approaches, including statistical and graph theoretical methods, data mining, and computational pattern recognition, have been applied to this task with varying degrees of success. We have previously shown that a genetic algorithm coupled with a k-nearest-neighbors classifier performs well in extracting information about protein-water binding from X-ray crystallographic protein structure data. Using a novel classifier based on the Bayes discriminant function, we present a hybrid algorithm that employs feature selection and extraction to isolate salient features from large biological data sets. The …
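
As a rough illustration of the hybrid idea (not the authors' algorithm or their Bayes discriminant function), the sketch below runs a small genetic algorithm over binary feature masks, scoring each mask by the cross-validated accuracy of a Bayes classifier. scikit-learn's GaussianNB and its breast-cancer dataset stand in purely as conveniences; population size, mutation rate, and selection scheme are arbitrary.

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)
    n_features = X.shape[1]

    def fitness(mask):
        """Cross-validated accuracy of a Bayes classifier on the selected features."""
        if mask.sum() == 0:
            return 0.0
        return cross_val_score(GaussianNB(), X[:, mask.astype(bool)], y, cv=3).mean()

    # Minimal generational GA over binary feature masks.
    pop = rng.integers(0, 2, size=(20, n_features))
    for gen in range(30):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-10:]]              # truncation selection
        children = []
        for _ in range(20):
            a, b = parents[rng.integers(10)], parents[rng.integers(10)]
            cut = rng.integers(1, n_features)                # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_features) < 0.02             # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.array(children)

    best = pop[np.argmax([fitness(m) for m in pop])]
    print("selected features:", np.flatnonzero(best))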


Profile Combinatorics For Fragment Selection In Comparative Protein Structure Modeling, Deacon Sweeney, Travis E. Doom, Michael L. Raymer Nov 2001

Kno.e.sis Publications

Sequencing of the human genome was a great stride towards modeling cellular complexes, massive systems whose key players are proteins and DNA. A major bottleneck limiting the modeling process is structure and function annotation for the new genes. Contemporary protein structure prediction algorithms represent the sequence of every protein of known structure with a profile to which the profile of a protein sequence of unknown structure is compared for recognition. We propose a novel approach to increase the scope and resolution of protein structure profiles. Our technique locates equivalent regions among the members of a structurally similar fold family, and …


Polygonal Chains Cannot Lock In 4d, Roxana Cocan, Joseph O'Rourke Nov 2001

Computer Science: Faculty Publications

We prove that, in all dimensions d ≥ 4, every simple open polygonal chain and every tree may be straightened, and every simple closed polygonal chain may be convexified. These reconfigurations can be achieved by algorithms that use polynomial time in the number of vertices, and result in a polynomial number of “moves.” These results contrast to those known for d = 2, where trees can “lock,” and for d = 3, where open and closed chains can lock.


Using Ssm Proxies To Provide Efficient Multiple-Source Multicast Delivery, Daniel Zappala, Aaron Fabbri Nov 2001

Faculty Publications

We consider the possibility that single-source multicast (SSM) will become a universal multicast service, enabling large-scale distribution of content from a few well-known sources to a general audience. Operating under this assumption, we explore the problem of building the traditional IP model of any-source multicast on top of SSM. Toward this end, we design an SSM proxy service that allows any sender to efficiently deliver content to a multicast group. We demonstrate the performance improvements this service offers over standard SSM and describe extensions for access control, dynamic proxy discovery, and multicast proxy distribution.


Pickup And Delivery Problem With Time Windows: Algorithms And Test Case Generation, Hoong Chuin Lau, Zhe Liang Nov 2001

Research Collection School Of Computing and Information Systems

In the pickup and delivery problem with time windows (PDPTW), vehicles have to transport loads from origins to destinations respecting capacity and time constraints. In this paper, we present a two-phase method to solve the PDPTW. In the first phase, we apply a novel construction heuristic to generate an initial solution. In the second phase, a tabu search method is proposed to improve the solution. Another contribution of this paper is a strategy to generate good problem instances and benchmarking solutions for PDPTW, based on Solomon's benchmark test cases for VRPTW. Experimental results show that our approach yields very good …
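
The constraints named above are easy to make concrete: a route is feasible only if every delivery follows its pickup, the running load never exceeds the vehicle capacity, and every stop is reached within its time window. The sketch below is an illustrative single-route feasibility check (data layout and names are ours, not from the paper); a construction heuristic or tabu-search move would call something like it when inserting or relocating a pickup-delivery pair.

    def route_feasible(route, capacity, travel_time):
        """Feasibility check for one PDPTW route.  Each stop is a dict with keys
        'id', 'pair' (id of its pickup if this is a delivery, else None),
        'demand' (+q at pickups, -q at deliveries), 'window' (earliest, latest)
        and 'service'.  All field names are illustrative."""
        time, load, seen = 0.0, 0.0, set()
        prev = None
        for stop in route:
            time += travel_time(prev, stop)
            earliest, latest = stop['window']
            time = max(time, earliest)              # wait if we arrive early
            if time > latest:
                return False                        # time-window violation
            time += stop['service']
            if stop['pair'] is not None and stop['pair'] not in seen:
                return False                        # delivery before its pickup
            load += stop['demand']
            if load > capacity:
                return False                        # capacity violation
            seen.add(stop['id'])
            prev = stop
        return True

    # Tiny example: one request picked up at A and delivered at B.
    def travel_time(a, b):
        return 0.0 if a is None else 5.0            # constant travel time for the sketch

    pickup   = {'id': 'A', 'pair': None, 'demand': 3,  'window': (0, 50), 'service': 2}
    delivery = {'id': 'B', 'pair': 'A',  'demand': -3, 'window': (5, 60), 'service': 2}
    print(route_feasible([pickup, delivery], capacity=5, travel_time=travel_time))   # True
    print(route_feasible([delivery, pickup], capacity=5, travel_time=travel_time))   # False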


Privacy Protection For Transactions Of Digital Goods, Feng Bao, Robert H. Deng Nov 2001

Research Collection School Of Computing and Information Systems

In this paper we study the problem of how to protect users’ privacy in web transactions of digital goods. In particular, we introduce a system which allows a user to disclose his/her identity information (such as user account or credit card number) to a web site in exchange for a digital item, but prevents the web site from learning which specific item the user intends to obtain. The problem concerned here is orthogonal to the problem of anonymous transactions [RSG98], [RR98], but commensurate with the general problem of PIR (private information retrieval) [CGK95].


Hierarchical Text Classification And Evaluation, Aixin Sun, Ee Peng Lim Nov 2001

Research Collection School Of Computing and Information Systems

Hierarchical classification refers to assigning one or more suitable categories from a hierarchical category space to a document. While previous work in hierarchical classification focused on virtual category trees where documents are assigned only to the leaf categories, we propose a top-down level-based classification method that can classify documents to both leaf and internal categories. As the standard performance measures assume independence between categories, they have not considered the documents incorrectly classified into categories that are similar or not far from the correct ones in the category tree. We therefore propose the Category-Similarity Measures and Distance-Based Measures to consider the …
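
The exact Category-Similarity and Distance-Based measures are defined in the paper; the sketch below only illustrates the underlying idea of giving partial credit to predictions that land close to the correct category in the tree, using a plain tree distance and a linear falloff that are our own choices.

    # A small category tree given as child -> parent (root has parent None).
    parent = {
        'news': None,
        'sports': 'news', 'politics': 'news',
        'soccer': 'sports', 'tennis': 'sports',
        'elections': 'politics',
    }

    def tree_distance(a, b):
        """Number of edges between two categories in the tree."""
        def ancestors(c):
            path = []
            while c is not None:
                path.append(c)
                c = parent[c]
            return path
        pa, pb = ancestors(a), ancestors(b)
        common = next(x for x in pa if x in pb)          # lowest common ancestor
        return pa.index(common) + pb.index(common)

    def distance_credit(predicted, correct, max_dist=4):
        """Partial credit for near-misses instead of a flat 0/1 score;
        the linear falloff and max_dist are illustrative choices."""
        return max(0.0, 1.0 - tree_distance(predicted, correct) / max_dist)

    print(distance_credit('tennis', 'tennis'))     # 1.0  (exact match)
    print(distance_credit('soccer', 'tennis'))     # 0.5  (sibling leaf, distance 2)
    print(distance_credit('elections', 'tennis'))  # 0.0  (distance 4)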


Computational Geometry Column 42, Joseph S. B. Mitchell, Joseph O'Rourke Oct 2001

Computer Science: Faculty Publications

A compendium of thirty previously published open problems in computational geometry is presented.


Sensor-Assisted Video Mosaicing For Seafloor Mapping, Yuri Rzhanov, Randy G. Cutter Jr., Lloyd C. Huff Oct 2001

Center for Coastal and Ocean Mapping

This paper discusses a proposed processing technique for combining video imagery with auxiliary sensor information. The latter greatly simplifies image processing by reducing the complexity of the transformation model. The mosaics produced by this technique are adequate for many applications, in particular habitat mapping. The algorithm is demonstrated through simulations, and the hardware configuration is described.


Convergence Classes And Spaces Of Partial Functions, Anthony K. Seda, Roland Heinze, Pascal Hitzler Oct 2001

Computer Science and Engineering Faculty Publications

We study the relationship between convergence spaces and convergence classes given by means of both nets and filters; we consider the duality between them and identify in convergence terms when a convergence space coincides with a convergence class. We examine the basic operators in the Vienna Development Method of formal systems development, namely, extension, glueing, restriction, removal and override, from the perspective of the Logic for Computable Functions. Thus, we examine in detail the Scott continuity, or otherwise, of these operators when viewed as operators on the domain (X ⇀ Y) of partial functions mapping X into …
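
For the finite case, the VDM operators named above have direct analogues on Python dictionaries viewed as partial functions with finite domains; the sketch below shows restriction, removal, override, and glueing (extension is just glueing in maplets outside the current domain). It illustrates only the operators themselves, not the domain-theoretic Scott-continuity analysis carried out in the paper.

    def restrict(f, S):
        """Domain restriction: keep only the pairs whose argument is in S."""
        return {x: y for x, y in f.items() if x in S}

    def remove(f, S):
        """Domain removal (subtraction): drop the pairs whose argument is in S."""
        return {x: y for x, y in f.items() if x not in S}

    def override(f, g):
        """Override f by g: g wins wherever both are defined."""
        return {**f, **g}

    def glue(f, g):
        """Glueing: union of two maps that agree on their common domain."""
        for x in f.keys() & g.keys():
            assert f[x] == g[x], "maps conflict, cannot glue"
        return {**f, **g}

    f = {'a': 1, 'b': 2}
    g = {'b': 3, 'c': 4}
    print(restrict(f, {'a'}))   # {'a': 1}
    print(remove(f, {'a'}))     # {'b': 2}
    print(override(f, g))       # {'a': 1, 'b': 3, 'c': 4}
    print(glue(f, {'c': 4}))    # {'a': 1, 'b': 2, 'c': 4} -- an extension of f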


Solar: Towards A Flexible And Scalable Data-Fusion Infrastructure For Ubiquitous Computing, Guanling Chen, David Kotz Oct 2001

Dartmouth Scholarship

As we embed more computers into our daily environment, ubiquitous computing promises to make them less noticeable and to avoid information overload. We see, however, few ubiquitous applications that are able to adapt to the dynamics of user, physical, and computational context. The challenge is to allow applications flexible access to these sources, and yet scale to thousands of devices and sensors. In this paper we introduce our proposed infrastructure, Solar. In Solar, information sources produce events. Applications may subscribe to interesting sources directly, or they may instantiate and subscribe to a tree of operators that filter, transform, merge and …
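
To make the operator-tree idea concrete (this is our own miniature sketch, not Solar's actual API), the code below wires two event sources through filter and merge operators to a subscribing application.

    class Node:
        """Base class: anything that can publish events to subscribers."""
        def __init__(self):
            self.subscribers = []
        def subscribe(self, callback):
            self.subscribers.append(callback)
        def publish(self, event):
            for cb in self.subscribers:
                cb(event)

    class Source(Node):
        """An information source, e.g. a location sensor, that emits events."""
        def emit(self, event):
            self.publish(event)

    class Filter(Node):
        """Operator that forwards only the events matching a predicate."""
        def __init__(self, upstream, predicate):
            super().__init__()
            self.predicate = predicate
            upstream.subscribe(self.on_event)
        def on_event(self, event):
            if self.predicate(event):
                self.publish(event)

    class Merge(Node):
        """Operator that merges the event streams of several upstream nodes."""
        def __init__(self, *upstreams):
            super().__init__()
            for up in upstreams:
                up.subscribe(self.publish)

    # Hypothetical deployment: two badge sensors, filtered to one user, then merged.
    badge_a, badge_b = Source(), Source()
    alice_only = Merge(Filter(badge_a, lambda e: e['user'] == 'alice'),
                       Filter(badge_b, lambda e: e['user'] == 'alice'))
    alice_only.subscribe(lambda e: print('alice seen at', e['room']))

    badge_a.emit({'user': 'bob',   'room': 'lab'})      # filtered out
    badge_b.emit({'user': 'alice', 'room': 'office'})   # printed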