Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics Commons

Computer Sciences

Theses/Dissertations

2012

Articles 1 - 30 of 579

Full-Text Articles in Physical Sciences and Mathematics

Software Requirements As Executable Code, Karen Eileen Wasielewski Morand Dec 2012

Regis University Student Publications (comprehensive collection)

This project analyzed the effectiveness of using Story Testing frameworks to create an application directly from user specifications. It did this by taking an example business application with "traditional" specifications and rewriting those specifications in three different Story Testing Frameworks - Cucumber, FitNesse, and JBehave. Analysis of results drew the following conclusions: 1) Story Testing can help prove a project's completeness, 2) Specifications are still too technical, 3) Implementation is not overly complex, and 4) Story Testing is worth it. It proposed future research around evaluating natural languages and seeking more user-friendly ways of writing specifications in a natural language.


Application Of Web Mashup Technology To Oyster Information Services, Christian Chuindja Ngniah Dec 2012

University of New Orleans Theses and Dissertations

Web mashup is a lightweight technology used to integrate data from remote sources without direct access to their databases. As a data consumer, a Web mashup application creates new contents by retrieving data through the Web application programming interface (API) provided by the external sources. As a data provider, the service program publishes its Web API and implements the specified functions.

In the project reported by this thesis, we have implemented two Web mashup applications to enhance the Web site oystersentinel.org: the Perkinsus marinus model and the Oil Spill model. Each model overlays geospatial data from a local database …


Application Of Digital Forensic Science To Electronic Discovery In Civil Litigation, Brian Roux Dec 2012

University of New Orleans Theses and Dissertations

Following changes to the Federal Rules of Civil Procedure in 2006 dealing with the role of Electronically Stored Information, digital forensics is becoming necessary to the discovery process in civil litigation. The development of case law interpreting the rule changes since their enactment defines how digital forensics can be applied to the discovery process, the scope of discovery, and the duties imposed on parties. Herein, pertinent cases are examined to determine what trends exist and how they affect the field. These observations buttress case studies involving discovery failures in large corporate contexts, along with insights on the technical reasons those …


Methodology And Automated Metadata Extraction From Multiple Volume Shadow Copies, Henri Michael Van Goethem Dec 2012

Masters Theses, 2010-2019

Modern-day digital forensics investigations rely on timelines as a principal method for normalizing and chronologically categorizing artifacts recovered from computer systems. Timelines provide investigators with a chronological representation of digital evidence so they can depict altered and unaltered digital forensics data in context to drive conclusions about system events and/or user activities. While investigators rely on many system artifacts such as file system time/date stamps, operating system artifacts, program artifacts, logs, and/or registry artifacts as input for deriving chronological representations, using only the available or most recent version of the artifacts may provide a limited picture of historical changes on …
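For orientation, the normalization step the abstract describes amounts to merging time-stamped artifacts from heterogeneous sources into one chronological sequence. A minimal sketch (the artifact sources and events here are hypothetical examples, not the thesis's tooling):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Artifact:
    source: str        # e.g. "filesystem", "registry", "log" (illustrative labels)
    event: str
    timestamp: datetime

def build_timeline(artifacts):
    """Normalize artifacts from multiple sources into one chronological timeline."""
    return sorted(artifacts, key=lambda a: a.timestamp)

# Hypothetical evidence items from three different sources
events = [
    Artifact("registry", "key modified", datetime(2012, 5, 2, 9, 30)),
    Artifact("filesystem", "file created", datetime(2012, 5, 1, 14, 0)),
    Artifact("log", "user logon", datetime(2012, 5, 2, 8, 15)),
]
for a in build_timeline(events):
    print(a.timestamp.isoformat(), a.source, a.event)
```

The thesis's contribution is extracting such artifacts from multiple Volume Shadow Copies, so the same event may appear in several historical versions; the merge-and-sort step itself is unchanged.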


Investigating Volatility Trends Of Silver Through An Analysis Of Stock Options Prices, Dylan Houston Dec 2012

Honors Theses

Volatility is a statistical measure that describes the amount of fluctuation in prices for a given investment; generally, the higher the volatility for an investment, the riskier it is perceived to be. Traders study volatility history so that they can make informed decisions on how to invest capital. The purpose of this article is to analyze implied volatility values, which are derived from the investment's price and are considered the market's estimate of the investment's actual volatility, for silver exchange-traded fund (ETF) options in periods of both high and low price movement. In doing so, we desired to see …
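Implied volatility is typically recovered by numerically inverting an option-pricing model: given a market price, find the volatility at which the model reproduces it. A minimal sketch using the Black-Scholes call formula and bisection (a standard approach assumed here for illustration, not necessarily the thesis's exact method):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call (spot S, strike K, maturity T, rate r)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-6, hi=5.0, tol=1e-8):
    """Invert Black-Scholes by bisection; valid because price is increasing in sigma."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid            # model price too low -> true vol is higher
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

Because the call price is strictly increasing in volatility, the bisection is guaranteed to converge to the unique root.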


Connotational Subtyping And Runtime Class Mutability In Ruby, Ian S. Dillon Dec 2012

Electronic Theses and Dissertations

Connotational subtyping is an approach to typing that allows an object's type to change dynamically, following changes to the object's internal state. This allows for a more precise representation of a problem domain with logical objects that have variable behavior. Two approaches to supporting connotational subtyping in the Ruby programming language were implemented: a language-level implementation in pure Ruby and a modification to the Ruby 1.8.7 interpreter. Neither implementation was wholly successful: the language-level implementation created complications with reflective language features like self and super, and, while Ruby 1.8.7 has been obsoleted by Ruby 1.9 (YARV), the results …


Data Integration: A Case Study In The Financial Services Industry, Louis Epie Dec 2012

Regis University Student Publications (comprehensive collection)

Current economic conditions result in banks participating in multiple mergers and acquisitions, leaving them with siloed, heterogeneous systems. To remain competitive, banks must create a strategy to integrate data from these acquired systems in a dynamic, efficient, and consumable manner. This research considers a case study of a large financial services company that has successfully integrated data from different sources. In addition, this research investigates endeavors that experts in the field have undertaken to develop architectures that address the problems of data integration, along with appropriate solutions.


Automatic Classification Of Epilepsy Lesions, Junwei Sun Dec 2012

Electronic Thesis and Dissertation Repository

Epilepsy is a common and diverse set of chronic neurological disorders characterized by seizures. Epileptic seizures result from abnormal, excessive, or hypersynchronous neuronal activity in the brain. Seizure types are organized firstly according to whether the source of the seizure within the brain is localized or distributed. In this work, our objective is to validate the use of MRI (Magnetic Resonance Imaging) for localizing seizure focus for improved surgical planning. We apply computer vision and machine learning techniques to tackle the problem of epilepsy lesion classification. First, datasets of digitized histology images from the brain cortices of different patients are obtained …


Automatic Foreground Initialization For Binary Image Segmentation, Wei Li Dec 2012

Electronic Thesis and Dissertation Repository

Foreground segmentation is a fundamental problem in computer vision. A popular approach to foreground extraction is graph cuts in an energy minimization framework. Most existing graph-cuts-based image segmentation algorithms rely on the user's initialization. In this work, we aim to find an automatic initialization for graph cuts. Unlike many previous methods, no additional training dataset is needed. Collecting a training set is not only expensive and time consuming, but it also may bias the algorithm toward the particular data distribution of the collected dataset. We assume that the foreground differs significantly from the background in some unknown feature space …
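The graph-cuts machinery itself is beyond a short sketch, but the flavor of a training-free automatic initialization can be illustrated with a simple intensity-based seed selection via Otsu's threshold. This is only a stand-in for the unknown-feature-space method the thesis develops, not its actual algorithm:

```python
def otsu_threshold(pixels):
    """Pick the grayscale threshold maximizing between-class variance (Otsu's method)."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b, w_b = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]                      # background class: intensities <= t
        if w_b == 0:
            continue
        w_f = total - w_b                   # foreground class: intensities > t
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var = w_b * w_f * (m_b - m_f) ** 2  # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def initial_foreground(pixels):
    """Label pixels above the automatic threshold as candidate foreground seeds."""
    t = otsu_threshold(pixels)
    return [p > t for p in pixels]
```

Seeds produced this way could then serve as the hard constraints that a graph-cut solver refines into the final binary segmentation.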


An Outcome-Based Approach For Ensuring Regulatory Compliance Of Business Processes, Quanjun Yin Dec 2012

Electronic Thesis and Dissertation Repository

In service industries such as healthcare, catering, and tourism, there exist regulations that require organisations' services to comply with them. More and more regulations in the service sector are, or aim to be, outcome-focused regulations. An outcome prescribed in a regulation is what users should experience or achieve when the regulated business processes are compliant. Service providers need to proactively ensure that the outcomes specified in the regulations have been achieved, prior to conducting the relevant part of the business or prior to inspectors discovering noncompliance. Current approaches check system requirements or business processes, not outcomes, against regulations and …


Hybrid Solvers For The Boolean Satisfiability Problem: An Exploration, Nicole Nelson Dec 2012

Theses and Dissertations

The Boolean Satisfiability problem (SAT) is one of the most extensively researched NP-complete problems in Computer Science. This thesis focuses on the design of feasible solvers for this problem. A SAT problem instance is a formula in propositional logic. A SAT solver attempts to find a solution for the formula. Our research focuses on a newer solver paradigm, hybrid solvers, where two solvers are combined in order to gain the benefits from both solvers in the search for a solution. Our hybrid solver, AmbSAT, combines two well-known solvers: the systematic Davis-Putnam-Logemann-Loveland solver (DPLL) and the stochastic WalkSAT solver. AmbSAT's design …
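The stochastic half of such a hybrid can be sketched in a few lines. The following is a textbook WalkSAT loop for illustration only (clauses are lists of signed integers in the usual DIMACS convention); it is not AmbSAT's actual implementation, whose contribution is the combination with DPLL:

```python
import random

def walksat(clauses, n_vars, p=0.5, max_flips=10000, seed=0):
    """Stochastic local search for SAT.
    A literal v > 0 means variable v is true; v < 0 means it is false."""
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                       # satisfying assignment found
        clause = rng.choice(unsat)              # pick a violated clause
        if rng.random() < p:
            var = abs(rng.choice(clause))       # noise move: random literal
        else:
            # greedy move: flip the variable leaving the fewest clauses unsatisfied
            def broken(v):
                assign[v] = not assign[v]
                n = sum(not any(sat(l) for l in c) for c in clauses)
                assign[v] = not assign[v]
                return n
            var = min((abs(l) for l in clause), key=broken)
        assign[var] = not assign[var]
    return None                                 # gave up within the flip budget
```

A DPLL partner would instead branch systematically on variables and backtrack, which is what makes hybrids attractive: the stochastic side escapes local minima while the systematic side can prove unsatisfiability.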


Characteristics And Impact Of Interpersonal Conflicts On Requirements Risks, Avinder Walia Dec 2012

Electronic Thesis and Dissertation Repository

Interpersonal conflicts in software projects have an impact on a project's success, product quality, team performance, etc. However, in Requirements Engineering (RE), there is a dearth of research on this topic; previous research has focused largely on conflicts among requirements. We conducted a case study of an industrial project to determine the characteristics (e.g., type, severity, conflict management styles) and impact of interpersonal conflicts rooted in RE (RE-Conflicts) on project risks associated with requirements (e.g., inadequately identified requirements, incorrect requirements). The findings show that conflicts over administrative procedures (47%) had the highest frequency count. The highest number of …


A Guide To Documenting Software Design For Maximum Software Portability For Software Defined Radios, Joseph Snively Dec 2012

Regis University Student Publications (comprehensive collection)

The use of software defined communications systems is growing incredibly fast. The field of software engineering as a discipline has not adequately addressed the subject of software portability which makes large and costly software development efforts less ready to port to future platforms. By understanding the causes of portability problems, they can either be avoided altogether in development or very well documented so that they are easier to overcome in future efforts. Literature, case studies, and surveys are used to collect opinions and information about large software programs where portability is a desirable characteristic in order to best establish the …


Improvements On Seeding Based Protein Sequence Similarity Search, Weiming Li Dec 2012

Electronic Thesis and Dissertation Repository

The primary goal of bioinformatics is to increase our understanding of the biology of organisms. Computational, statistical, and mathematical theories and techniques have been developed for formal and practical problems that help achieve this primary goal. For the past three decades, the primary application of bioinformatics has been biological data analysis. DNA or protein sequence similarity search is perhaps the most common, yet vitally important, task for analyzing biological data.

The sequence similarity search is a process of finding optimal sequence alignments. On the theoretical level, the problem of sequence similarity search is complex. On the applicational level, …


Error Correction In Next Generation Dna Sequencing Data, Michael Z. Molnar Dec 2012

Electronic Thesis and Dissertation Repository

Motivation: High throughput Next Generation Sequencing (NGS) technologies can sequence the genome of a species quickly and cheaply. Errors that are introduced by NGS technologies limit the full potential of the applications that rely on their data. Current techniques used to correct these errors are not sufficient, and a more efficient and accurate program is needed to correct errors.

Results: We have designed and implemented RACER (Rapid Accurate Correction of Errors in Reads), an error correction program that targets the Illumina genome sequencer, which is currently the dominant NGS technology. RACER combines advanced data structures with an intricate analysis of …
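RACER's internal data structures are not reproduced here, but the k-mer-spectrum idea behind many NGS error correctors can be sketched: k-mers that occur frequently across all reads are trusted, while positions of a read covered only by rare k-mers are likely sequencing errors. A minimal illustration of the detection step (correction, which RACER also performs, is omitted):

```python
from collections import Counter

def kmer_counts(reads, k):
    """Count every k-mer across all reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def flag_error_positions(read, counts, k, threshold=2):
    """Return positions of the read covered only by rare (likely erroneous) k-mers."""
    suspect = [True] * len(read)
    for i in range(len(read) - k + 1):
        if counts[read[i:i + k]] >= threshold:
            for j in range(i, i + k):
                suspect[j] = False     # a trusted k-mer vouches for these positions
    return [i for i, s in enumerate(suspect) if s]
```

The `threshold` separating trusted from untrusted k-mers is the key tuning parameter; real correctors choose it from the coverage distribution rather than fixing it as done here.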


A New Algorithm For De Novo Genome Assembly, Md. Bahlul Haider Dec 2012

Electronic Thesis and Dissertation Repository

The enormous amount of short reads produced by next generation sequencing (NGS) techniques such as Roche/454, Illumina/Solexa and SOLiD sequencing opened the possibility of de novo genome assembly. Some of the de novo genome assemblers (e.g., Edena, SGA) use an overlap graph approach to assemble a genome, while others (e.g., ABySS and SOAPdenovo) use a de Bruijn graph approach. Currently, the approaches based on the de Bruijn graph are the most successful, yet their performance is far from being able to assemble entire genomic sequences. We developed a new overlap graph based genome assembler called Paired-End Genome ASsembly Using Short-sequences …
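The de Bruijn approach mentioned above can be sketched compactly: nodes are (k-1)-mers, each k-mer in a read contributes an edge, and contigs are read off unambiguous paths. This illustrates the general idea only, not the thesis's overlap-graph assembler:

```python
from collections import defaultdict

def de_bruijn(reads, k):
    """Build a de Bruijn graph: nodes are (k-1)-mers, edges come from k-mers."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])
    return graph

def contig_from(graph, start):
    """Greedily extend a contig from `start` while the outgoing edge is unambiguous."""
    contig, node = start, start
    while len(set(graph.get(node, []))) == 1:
        nxt = graph[node][0]
        contig += nxt[-1]          # each step appends one base
        if nxt == start:           # guard against walking a cycle forever
            break
        node = nxt
    return contig
```

Repeats longer than k-1 create branching nodes where extension stops, which is exactly why real assemblers add paired-end information, as the thesis's title suggests.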


Website Adaptive Navigation Effects On User Experiences, James C. Speirs Dec 2012

Theses and Dissertations

The information search process within a website can often be frustrating and confusing for visitors. Navigational structures are often complex and multitiered, hiding links that users might be interested in behind several layers of navigation. Poor navigation causes user frustration. Adaptive navigation can improve the user's navigational experience by flattening the navigational structure and reducing the accessible links to only those the user is likely to be interested in. This thesis examines the effects on a user's navigational experience of using adaptive navigation as the main navigational structure on a website. This study measured these effects …


Necessary And Sufficient Informativity Conditions For Robust Network Reconstruction Using Dynamical Structure Functions, Vasu Nephi Chetty Dec 2012

Theses and Dissertations

Dynamical structure functions were developed as a partial structure representation of linear time-invariant systems to be used in the reconstruction of biological networks. Dynamical structure functions contain more information about structure than a system's transfer function, while requiring less a priori information for reconstruction than the complete computational structure associated with the state space realization. Early sufficient conditions for network reconstruction with dynamical structure functions severely restricted the possible applications of the reconstruction process to networks where each input independently controls a measured state. The first contribution of this thesis is to extend the previously established sufficient conditions to incorporate …


Exploring Computational Chemistry On Emerging Architectures, David Dewayne Jenkins Dec 2012

Doctoral Dissertations

Emerging architectures, such as next generation microprocessors, graphics processing units, and Intel MIC cards, are being used with increased popularity in high performance computing. Each of these architectures has advantages over previous generations of architectures including performance, programmability, and power efficiency. With the ever-increasing performance of these architectures, scientific computing applications are able to attack larger, more complicated problems. However, since applications perform differently on each of the architectures, it is difficult to determine the best tool for the job. This dissertation makes the following contributions to computer engineering and computational science. First, this work implements the computational chemistry variational …


Parallel For Loops On Heterogeneous Resources, Frederick Edward Weber Dec 2012

Doctoral Dissertations

In recent years, Graphics Processing Units (GPUs) have piqued the interest of researchers in scientific computing. Their immense floating point throughput and massive parallelism make them ideal for not just graphical applications, but many general algorithms as well. Load balancing applications and taking advantage of all computational resources in a machine is a difficult challenge, especially when the resources are heterogeneous. This dissertation presents the clUtil library, which vastly simplifies developing OpenCL applications for heterogeneous systems. The core focus of this dissertation lies in clUtil's ParallelFor construct and our novel PINA scheduler which can efficiently load balance work onto multiple …


Dynamic Task Execution On Shared And Distributed Memory Architectures, Asim Yarkhan Dec 2012

Doctoral Dissertations

Multicore architectures with high core counts have come to dominate the world of high performance computing, from shared memory machines to the largest distributed memory clusters. The multicore route to increased performance has a simpler design and better power efficiency than the traditional approach of increasing processor frequencies. But standard programming techniques are not well adapted to this change in computer architecture design.

In this work, we study the use of dynamic runtime environments executing data driven applications as a solution to programming multicore architectures. The goals of our runtime environments are productivity, scalability and performance. We demonstrate productivity by …


Towards A Unification Of Supercomputing, Molecular Dynamics Simulation And Experimental Neutron And X-Ray Scattering Techniques, Benjamin Lindner Dec 2012

Doctoral Dissertations

Molecular dynamics simulation has become an essential tool for scientific discovery and investigation. The ability to evaluate every atomic coordinate for each time instant sets it apart from other methodologies, which can only access experimental observables as an outcome of the atomic coordinates. Here, the utility of molecular dynamics is illustrated by investigating the structure and dynamics of fundamental models of cellulose fibers. For that, a highly parallel code has been developed to compute static and dynamical scattering functions efficiently on modern supercomputing architectures. Using state of the art supercomputing facilities, molecular dynamics code and parallelization strategies, this work also …


Volatile Memory Message Carving: A "Per Process Basis" Approach, Aisha Ibrahim Ali-Gombe Dec 2012

University of New Orleans Theses and Dissertations

The pace at which data and information transfer and storage have shifted from PCs to mobile devices is of great concern to the digital forensics community. Android is fast becoming the operating system of choice for these hand-held devices; hence, the need to develop better forensic techniques for data recovery cannot be over-emphasized. This thesis analyzes the volatile memory of Motorola Android devices, with a shift from traditional physical memory extraction to carving residues of data on a “per process basis”. Each Android application runs in a separate process within its own Dalvik Virtual Machine (DVM) instance, thus, the proposed …


Health Care Informatics Support Of A Simulated Study, Zeinab Salari Far Dec 2012

Theses and Dissertations

The objective of this project is to assess the value of REDCap (Harris, 2009) by conducting a simulated breast cancer clinical trial and demonstration. REDCap is a free, secure, web-based application designed to support data capture for research studies. To assess REDCap's value, we conducted a simulation of a clinical trial designed to compare the use of two new technologies for breast cancer diagnosis and treatment with current best-practice diagnosis and treatment. We call the trial "Real-Time Operating Room BC Diagnostic Treatment (RORBCDT)". The RORBCDT clinical trial is designed to assess the value of a new …


A Flexible Consent Management System For Master Person Indices, Aditya Pakalapati Dec 2012

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

In healthcare, a Master Person Index (MPI) is a system that integrates information about an individual from multiple data sources. To ensure confidentiality, such systems, particularly in healthcare, need to respect individual and organizational constraints on the sharing of data. This report describes a reusable consent management system that enforces such constraints, and how it has been tested in the context of the Utah Department of Health (UDOH) MPI for public health.


Effective Use Of Interactive Learning Modules In Classroom Study For Computer Science Education, Goldee Jamwal Dec 2012

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

The National Science Foundation (NSF) is spending substantial resources to improve science, technology, engineering, and mathematics (STEM) education in the United States. The ultimate goal of these programs is to produce students with a better knowledge of math and science and who are more likely to pursue careers in STEM fields. Interactive learning modules can be used in the classroom environment for effective learning.

This study examines the learning preferences of Logan High School (located in Logan, Utah) students and evaluates the impacts of using interactive learning modules with classroom lectures compared to other traditional methods of teaching.


The Design And Implementation Of A Mobile Game Engine For The Android Platform, Jon Hammer Dec 2012

Computer Science and Computer Engineering Undergraduate Honors Theses

In this thesis, a two-dimensional game engine is proposed for the Android mobile platform that facilitates rapid development of such games by individual developers or hobbyists. The essential elements of game design are presented so as to introduce the reader to concepts crucial for comprehension of the paper. A brief overview of the Android operating system is also included for those unfamiliar with it. Three primary design goals are identified, and a prototype solution is described in detail. The prototype is then evaluated against those design goals to see how well it accomplishes each task. The results …


Web Based Virtual Environment For Education - S'Cape, Atul Thapliyal Dec 2012

All Graduate Plan B and other Reports, Spring 1920 to Spring 2023

Simulations provide an environment to experiment safely, openly, and repeatedly for learning mastery. However, many simulation environments experienced within a classroom fail to include automated assessment components or automated data collection. Even when assessments are included, often they fail to account for the unpredictable nature of decision-making within a complex, 3D, open-ended simulation environment. Embedding assessments within a virtual simulation environment poses several challenges. First, the program must provide assessments aligned with educational requirements that will not take the learner cognitively “away” from their activities. Second, the program must not detract from the game-like experience that learners find engaging. Third, …


Validation Of Weak Form Thermal Analysis Algorithms Supporting Thermal Signature Generation, Elton Lewis Freeman Dec 2012

Masters Theses

Extremization of a weak form for the continuum energy conservation principle differential equation naturally implements fluid convection and radiation as flux Robin boundary conditions associated with unsteady heat transfer. Combining a spatial semi-discretization via finite element trial space basis functions with time-accurate integration generates a totally node-based algebraic statement for computing. Closure for gray body radiation is a newly derived node-based radiosity formulation generating piecewise discontinuous solutions, while that for natural-forced-mixed convection heat transfer is extracted from the literature. Algorithm performance, mathematically predicted by asymptotic convergence theory, is subsequently validated with data obtained in 24 hour diurnal field experiments for …
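For orientation, a generic weak form of the unsteady heat equation with convection and gray-body radiation entering as Robin-type flux boundary terms can be written as follows; the symbols (test function φ, film coefficient h, emissivity ε, Stefan-Boltzmann constant σ) are standard textbook notation, not necessarily the thesis's:

```latex
\int_\Omega \rho c_p\, \phi\, \frac{\partial T}{\partial t}\, d\Omega
  + \int_\Omega k\, \nabla\phi \cdot \nabla T\, d\Omega
  + \int_{\partial\Omega} \phi \left[\, h\,(T - T_\infty)
  + \varepsilon\sigma\,(T^4 - T_{\mathrm{sur}}^4) \,\right] d\Gamma = 0
  \qquad \forall\, \phi
```

Expanding T over finite element trial space basis functions then yields the node-based algebraic system the abstract refers to, with the radiation term supplying the nonlinearity that the radiosity closure addresses.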


Exploration Of The Effectiveness Of Apple Ios Devices And Bluetooth Low-Energy Nodes In The Evaluation Of Stroke Patient Rehabilitation And Recovery, Robert Derveloy Dec 2012

Masters Theses and Doctoral Dissertations

The goal of this study is to help establish a foundation for cost effective stroke patient telerehabilitation by examining the efficacy of off-the-shelf consumer motion sensors. This paper examines the practical implications of utilizing Apple iOS mobile devices and Variable Technology KORE NODE devices in the evaluation of stroke patient recovery. Several algorithms are proposed to handle the user positioning and cheating detection requirements of the Functional Reach Test (FRT), efficient scoring of the National Institutes of Health Stroke Scale (NIHSS) motor arm exams, and real-time fall detection monitoring.