Open Access. Powered by Scholars. Published by Universities.®

Computer Engineering Commons


2012

Data Storage Systems


Articles 1 - 25 of 25

Full-Text Articles in Computer Engineering

Amaethon – A Web Application For Farm Management And An Assessment Of Its Utility, Tyler Yero Dec 2012


Master's Theses

Amaethon is a web application designed for enterprise farm management. It takes a job typically performed with spreadsheets, paper, or custom software and puts it on the web. Farm administration personnel may use it to schedule farm operations and manage their resources and equipment. A survey was conducted to assess Amaethon’s user interface design. Participants in the survey were two groups of students and a small group of agriculture professionals. Among other results, the survey indicated that a calendar interface inside Amaethon was preferred over, and statistically no less effective than, a map interface. This is despite the …


Is Tech M&A Value-Additive?, Ani Deshmukh Nov 2012


Undergraduate Economic Review

Given rising M&A deal volume across all high-tech subsectors, the ability to measure post-acquisition performance becomes critical. Despite this growth, the relevant academic literature is severely lacking (Kohers and Kohers 2000). Using an event-study approach, I find that acquirers and targets both realize statistically significant day-0 abnormal returns (1.23% [p<0.1] and 8.1% [p<0.01], respectively). As positive stock returns signal positive growth prospects in a semi-strong efficient market, regressions on the abnormal returns found that firms' technological relatedness, deal financing, purchase price premiums, and the relative book-to-market ratio explained most of the variance. Overall, high-tech transactions are value-additive for both targets and acquirers.
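For readers unfamiliar with the event-study methodology mentioned above, the following is a minimal sketch of the standard market-model computation of a day-0 abnormal return. The estimation window, gap, synthetic data, and function names are illustrative assumptions, not the paper's exact specification.

    # Minimal market-model event-study sketch (illustrative; the paper's exact
    # estimation windows, benchmarks, and test statistics may differ).
    import numpy as np

    def abnormal_return_day0(stock_ret, market_ret, event_idx, est_window=120, gap=10):
        """Estimate alpha/beta over a pre-event window, then compute the
        day-0 abnormal return AR_0 = R_0 - (alpha + beta * Rm_0)."""
        start = event_idx - gap - est_window
        end = event_idx - gap
        r = np.asarray(stock_ret[start:end])
        rm = np.asarray(market_ret[start:end])
        beta, alpha = np.polyfit(rm, r, 1)          # OLS fit of R on Rm
        expected = alpha + beta * market_ret[event_idx]
        return stock_ret[event_idx] - expected

    # Example with synthetic data:
    rng = np.random.default_rng(0)
    rm = rng.normal(0.0005, 0.01, 300)
    r = 0.0002 + 1.1 * rm + rng.normal(0, 0.005, 300)
    r[250] += 0.08                                   # pretend day 250 is the announcement
    print(f"AR_0 = {abnormal_return_day0(r, rm, 250):.4f}")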


Fashionask: Pushing Community Answers To Your Fingertips, Wei Zhang, Lei Pang, Chong-Wah Ngo Nov 2012


Research Collection School Of Computing and Information Systems

We demonstrate a multimedia-based question-answering system, named FashionAsk, which allows users to ask questions referring to pictures snapped by mobile devices. Specifically, instead of asking verbose questions to describe visual instances, pictures are provided directly as part of the questions. To answer these multi-modal questions, FashionAsk performs a large-scale instance search to infer the names of instances, and then matches against similar questions from community-contributed QA websites as answers. The demonstration is conducted on a million-scale dataset of Web images and QA pairs in the domain of fashion products. Asking a multimedia question through FashionAsk can take as short as five …


Predicting Domain Adaptivity: Redo Or Recycle?, Ting Yao, Chong-Wah Ngo, Shiai Zhu Nov 2012


Research Collection School Of Computing and Information Systems

Over the years, academic researchers have contributed various visual concept classifiers. Nevertheless, given a new dataset, most researchers still prefer to develop a large number of classifiers from scratch, despite expensive labeling efforts and limited computing resources. A valid question is why the multimedia community does not “embrace the green” and recycle off-the-shelf classifiers for new datasets. The difficulty originates from the domain gap: many different factors govern the development of a classifier and eventually drive its performance to emphasize certain aspects of a dataset. Reapplying a classifier to an unseen dataset may end up GIGO (garbage in, garbage …


Community As A Connector: Associating Faces With Celebrity Names In Web Videos, Zhineng Chen, Chong-Wah Ngo, Juan Cao, Wei Zhang Nov 2012


Research Collection School Of Computing and Information Systems

Associating celebrity faces appearing in videos with their names is of increasing importance with the popularity of both celebrity videos and related queries. However, the problem has not yet been seriously studied in the Web video domain. This paper proposes a Community connected Celebrity Name-Face Association approach (C-CNFA), where the community is regarded as an intermediate connector to facilitate the association. Specifically, with the names and faces extracted from Web videos, C-CNFA decomposes the association task into a three-step framework: community discovery, community matching, and celebrity face tagging. To achieve the goal of efficient name-face association under this umbrella, algorithms such as …


Large Scale Processing And Storage Solution: Dna Safeguard Project, Eric Copp Oct 2012


Eric Copp

The DNA Safeguard project involves processing DNA sequence data in order to find nullomer sequences (non-existent short DNA sequences). While the fundamental algorithm for finding nullomer sequences is simple, it is complicated by the amount of data that must be handled. Four methods for handling terabytes of data are investigated: a single instance of a MySQL database, PVFS (Parallel Virtual File System), Hadoop, and a custom MPI (Message Passing Interface) program.
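For context, the fundamental nullomer-finding algorithm the abstract refers to can be sketched on a single machine as follows; the MySQL, PVFS, Hadoop, and MPI approaches in the project are about scaling this same idea to terabytes of input. The function name and parameters below are illustrative.

    # Minimal single-machine nullomer search: find all k-mers that never occur
    # in the input DNA sequences. The project is about scaling this idea to
    # terabytes; this sketch only shows the core algorithm.
    from itertools import product

    def find_nullomers(sequences, k):
        seen = set()
        for seq in sequences:
            seq = seq.upper()
            for i in range(len(seq) - k + 1):
                kmer = seq[i:i + k]
                if "N" not in kmer:          # skip ambiguous bases
                    seen.add(kmer)
        all_kmers = ("".join(p) for p in product("ACGT", repeat=k))
        return [kmer for kmer in all_kmers if kmer not in seen]

    print(find_nullomers(["ACGTACGTAGGC", "TTGCA"], k=3))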


Contextualized Mobile Support For Learning By Doing In The Real World, Ray Bareiss, Natalie Linnell, Martin L. Griss Sep 2012


Martin L Griss

This research addresses the use of mobile devices with both embedded and external sensors to provide contextualized help, advice, and remediation to learners engaged in real-world learn-by-doing tasks. This work is situated within the context of learning a complex procedure, in particular emergency responders learning to conduct urban search and rescue operations. Research issues include the design and delivery of contextualized performance support, and inferring learner actions and intentions from sensor data to ensure that the right support is delivered just in time, when it is relevant to what the learner is doing.


Hair Data Model: A New Data Model For Spatio-Temporal Data Mining, Abbas Madraky, Zulaiha Ali Othman, Abdul Razak Hamdan Sep 2012


Abbas Madraky

Spatio-temporal data is related to many of the issues around us, such as satellite images, weather maps, transportation systems, and so on. Furthermore, this information is commonly not static and can change over time. Because data of this kind is huge by nature, analysing it is a complex task. This research aims to propose an intermediate data model that can suitably represent spatio-temporal data and support data mining tasks easily, even when the data changes frequently. In order to propose a suitable data model, this research also investigates the analytical parameters, the structure and its specifications …


Active Storage And Ssd Caching In An Object Storage Environment, Michael T. Runde Aug 2012


Master's Theses

Advancing performance and falling costs are increasingly making it possible to place additional processing power directly on system peripherals such as disk drives. Active Storage attempts to take advantage of this excess capacity by moving some computationally intensive applications directly to the disk drives. This can remove the bottlenecks seen across the interconnects between the drives and the CPU of the initiating system, as well as remove the need for that system to handle these applications.

The contributions of this thesis are in two areas. The first is the development of a framework designed …


Measuring Merci: Exploring Data Mining Techniques For Examining Surgical Outcomes Of Stroke Patients, Matthew Ronald Mcnabb Aug 2012


Masters Theses and Doctoral Dissertations

Mechanical Embolus Removal in Cerebral Ischemia (MERCI) has been supported by medical trials as an improved method of treating ischemic stroke past the safe window of time for administering clot-busting drugs, and was released for medical use in 2004. Analyzing real-world data collected from MERCI clinical trials is key to providing insights on the effectiveness of MERCI. Most of the existing data analysis of MERCI results has thus far employed conventional statistical analysis techniques. To the best of the knowledge acquired in preliminary research, advanced data analytics and data mining techniques have not yet been systematically applied. …


Presentation On Optimized Still Image Batch Processing Of Special Collections Bound Monographs And Manuscripts Using Dng, Jpeg 2000, And Embedded Xmp Metadata, Michael J. Bennett Jun 2012


UConn Library Presentations

No abstract provided.


Optimized Still Image Batch Processing Of Special Collections Bound Monographs And Manuscripts Using Dng, Jpeg 2000, And Embedded Xmp Metadata, Michael J. Bennett Jun 2012


Published Works

Batch still image processing is examined in the context of the operational reformatting of bound monographs and manuscripts. The scaling of overall workflows through the flexible use of Lightroom, Photoshop, VueScan, and Jhove on parametrically edited raw DNG and batch-rendered JPEG 2000 files is surveyed. Potential gains in processing efficiency, in comprehensive device data capture and preservation, in adaptable master image repurposing capabilities, and in the smoother growth of the required large-scale digital storage capacities that surround such operational conversions are considered.


Beaglebone Webcam Server, Alexander Corcoran Jun 2012


Computer Engineering

The Beaglebone Webcam Server is a Linux-based IP webcam built on an inexpensive ARM development board, which hosts its own web server to display the webcam feed. The server can either connect to a wired router or act as a wireless access point, so that users can connect and control its functions from any Wi-Fi enabled device.


Improvement Of Statistical Process Control At St. Jude Medical's Cardiac Manufacturing Facility, Christopher Lance Edwards Jun 2012


Master's Theses

Six Sigma is a methodology in which companies strive to produce results so consistent that there is a 99.9996% chance their product will be void of defects. In order for companies to reach Six Sigma, statistical process control (SPC) needs to be introduced. SPC has many different tools associated with it, control charts being one of them. Control charts play a vital role in monitoring how a process is behaving. Control charts allow users to identify special causes, or shifts, so that the process can be adjusted to keep producing good products, free of defects.

There are many factories and manufacturing facilities that have implemented …
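As a rough illustration of the control-chart mechanics described above, the sketch below computes an individuals chart with the center line at the mean and control limits at plus/minus three sigma (sigma estimated from the average moving range, one common SPC convention), then flags out-of-control points. It is only a generic example, not the St. Jude implementation discussed in the thesis.

    # Minimal individuals control chart: center line at the mean, control limits
    # at +/- 3 sigma (sigma estimated from the average moving range, a common
    # SPC convention). Points outside the limits are flagged as special causes.
    import numpy as np

    def control_limits(samples):
        x = np.asarray(samples, dtype=float)
        center = x.mean()
        mr = np.abs(np.diff(x)).mean()       # average moving range
        sigma = mr / 1.128                   # d2 constant for subgroups of size 2
        return center, center - 3 * sigma, center + 3 * sigma

    def out_of_control(samples):
        center, lcl, ucl = control_limits(samples)
        return [(i, v) for i, v in enumerate(samples) if v < lcl or v > ucl]

    data = [10.1, 9.9, 10.2, 10.0, 9.8, 10.1, 10.0, 10.2,
            9.9, 10.1, 10.0, 9.9, 10.2, 11.5, 10.0, 10.1]
    print(control_limits(data))
    print(out_of_control(data))              # flags the 11.5 reading at index 13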


A Scalable Inline Cluster Deduplication Framework For Big Data Protection, Yinjin Fu, Hong Jiang, Nong Xiao May 2012


CSE Technical Reports

Cluster deduplication has become a widely deployed technology in data protection services for Big Data to satisfy the requirements of service level agreement (SLA). However, it remains a great challenge for cluster deduplication to strike a sensible tradeoff between the conflicting goals of scalable deduplication throughput and high duplicate elimination ratio in cluster systems with low-end individual secondary storage nodes. We propose Σ-Dedupe, a scalable inline cluster deduplication framework, as a middleware deployable in cloud data centers, to meet this challenge by exploiting data similarity and locality to optimize cluster deduplication in inter-node and intra-node scenarios, …
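To make the deduplication bookkeeping concrete, here is a minimal, illustrative sketch of inline chunk-level deduplication with fingerprint-based routing across cluster nodes. Σ-Dedupe's actual design (superchunk handprints, similarity indexes, locality-preserving placement) is far richer; the class and method names below are hypothetical.

    # Minimal sketch of inline chunk deduplication with fingerprint-based routing
    # to cluster nodes. Sigma-Dedupe's real design is considerably richer; this
    # only illustrates the basic dedupe bookkeeping.
    import hashlib

    class DedupeCluster:
        def __init__(self, num_nodes, chunk_size=4096):
            self.chunk_size = chunk_size
            self.nodes = [dict() for _ in range(num_nodes)]   # fingerprint -> chunk

        def _route(self, fingerprint):
            # Simple stateless routing by fingerprint prefix (a stand-in for
            # similarity-aware superchunk routing).
            return int(fingerprint[:8], 16) % len(self.nodes)

        def write(self, data):
            stored = duplicates = 0
            for off in range(0, len(data), self.chunk_size):
                chunk = data[off:off + self.chunk_size]
                fp = hashlib.sha1(chunk).hexdigest()
                node = self.nodes[self._route(fp)]
                if fp in node:
                    duplicates += 1
                else:
                    node[fp] = chunk
                    stored += 1
            return stored, duplicates

    cluster = DedupeCluster(num_nodes=4)
    payload = b"A" * 8192 + b"B" * 4096
    print(cluster.write(payload))        # (2, 1): the second "A" chunk is a duplicate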


Millennium Systems On The Cloud: Experience On System Migration From Sun V490 To Vmware Platform, Yiu On Li, Jimmy Tsang Apr 2012


Hong Kong Innovative Users Group Meetings

No abstract provided.


The Censsis Web-Accessible Image Database System, Furong Yang, David Kaeli Apr 2012


David Kaeli

The Gordon-CenSSIS Web-accessible Image Database System (CenSSIS-DB) is a scientific database that enables effective collaborative scientific data sharing and accelerates fundamental research. We describe a state-of-the-art system using the Oracle RDBMS and J2EE technologies to provide remote, Internet-based data management. The system incorporates efficient submission and retrieval of images and metadata, indexing of metadata for efficient searching, and complex relational query capabilities.


Datalogger Sequence Execution Engine (Dsqee), Edmund Yingxiang Yee Apr 2012


Computer Engineering

The PolySat Research Group accepts projects from several companies that wish to use a CubeSat for some experiment. One of these projects, called the Intelligent Payload Experiment, or IPEX, needs software to interact with our system avionics. One such piece of software is the datalogger, which will be augmented from its original datalogging scheme to support sequential execution of the commands/algorithms that our client, the Jet Propulsion Laboratory (JPL), needs. My part of the project covers the software design decisions behind the datalogger.
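As an illustration of the sequential command execution the datalogger is meant to support, a toy sequence execution engine might look like the sketch below. The real PolySat flight software is not structured this way, and all command names and arguments here are hypothetical.

    # Illustrative sketch of a sequence execution engine: registered commands are
    # executed strictly in the order listed in a sequence. Names are hypothetical.
    COMMANDS = {}

    def command(name):
        def register(fn):
            COMMANDS[name] = fn
            return fn
        return register

    @command("capture_image")
    def capture_image(arg):
        print(f"capturing image to {arg}")

    @command("log_telemetry")
    def log_telemetry(arg):
        print(f"logging telemetry channel {arg}")

    def run_sequence(lines):
        for line in lines:
            name, _, arg = line.strip().partition(" ")
            COMMANDS[name](arg)              # execute sequentially, in order

    run_sequence(["capture_image /data/img001.raw", "log_telemetry battery_v"])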


Digital Content Preservation Across Domain Verticals, Soha Maad, Borislav Dimitrov Mar 2012


Borislav D Dimitrov

The authors present a novel approach to develop scalable systems and services for preserving digital content generated from various application domains. The aim is to deliver an integrative scalable approach for digital content preservation across domain verticals. This would involve consolidating approaches for modeling document workflow, preserving the integrity of heterogeneous data, and developing robust and scalable tools for digital preservation ensuring interoperability across domain verticals. The authors consider various application domains including: healthcare, public, business and finance, media and performing art, and education. The authors focus on specific case studies of digital content preservation across the considered domain verticals. …


An Interactive Visualization Model For Analyzing Data Storage System Workloads, Steven Charubhat Pungdumri Mar 2012


Master's Theses

The performance of hard disks has become increasingly important as the volume of data storage increases. At the bottom level of large-scale storage networks is the hard disk. Despite the importance of hard drives in a storage network, it is often difficult to analyze their performance due to the sheer size of the datasets they see. Additionally, hard drive workloads can have several multi-dimensional characteristics, such as access time, queue depth and block-address space. The result is that hard drive workloads are extremely diverse and large, making extracting meaningful information from hard drive workloads very …
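To give a flavor of the kind of multi-dimensional workload view such a visualization model targets, the sketch below plots one slice of a disk trace (block address versus time, colored by queue depth) with matplotlib. The trace fields and synthetic data are assumptions, not the thesis's actual trace schema or tooling.

    # Minimal sketch of visualizing one slice of a disk workload trace:
    # block address versus time, colored by queue depth. Field names and
    # data here are synthetic stand-ins for a parsed trace.
    import matplotlib.pyplot as plt
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0, 60, 2000))        # timestamps in seconds
    lba = rng.integers(0, 2**30, 2000)           # logical block addresses
    qdepth = rng.integers(1, 32, 2000)           # queue depth per request

    fig, ax = plt.subplots(figsize=(8, 4))
    sc = ax.scatter(t, lba, c=qdepth, s=4, cmap="viridis")
    ax.set_xlabel("time (s)")
    ax.set_ylabel("logical block address")
    fig.colorbar(sc, label="queue depth")
    plt.show()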


Naked Object File System (Nofs): A Framework To Expose An Object-Oriented Domain Model As A File System, Joseph P. Kaylor, Konstantin Läufer, George K. Thiruvathukal Jan 2012


Konstantin Läufer

We present Naked Objects File System (NOFS), a novel framework that allows a developer to expose a domain model as a file system by leveraging the Naked Objects design principle. NOFS allows a developer to construct a file system without having to understand or implement all details related to normal file system development. In this paper we explore file system frameworks and object-oriented frameworks in a historical context and present an example domain model using the framework. This paper is based on a fully-functional implementation that is distributed as free/open source software, including virtual machine images to demonstrate and study …
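The core Naked Objects idea that NOFS builds on, exposing a domain object's fields as file-like entries by reflection, can be illustrated with the toy sketch below. Real NOFS mounts the model through an actual file-system layer; this sketch only builds an in-memory path-to-content map, and the class and field names are hypothetical.

    # Illustration of the Naked Objects idea behind NOFS: a domain object's
    # fields are exposed as a directory of "files" by reflection.
    from dataclasses import dataclass, fields

    @dataclass
    class Contact:
        name: str = "Ada Lovelace"
        email: str = "ada@example.org"
        phone: str = "555-0100"

    def as_file_tree(obj, root):
        """Map each field of a dataclass instance to a path -> content entry."""
        return {f"{root}/{f.name}": str(getattr(obj, f.name)) for f in fields(obj)}

    tree = as_file_tree(Contact(), "/contacts/ada")
    for path, content in tree.items():
        print(f"{path}: {content}")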


Neutrosophic Masses & Indeterminate Models Applications To Information Fusion, Florentin Smarandache Jan 2012


Branch Mathematics and Statistics Faculty and Staff Publications

In this paper we introduce indeterminate models in information fusion, which arise either from indeterminate elements in the fusion space or from indeterminate masses. The best approach for dealing with such models is neutrosophic logic.
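For readers new to the notion, the general shape of a neutrosophic mass can be sketched as follows; the notation follows the common neutrosophic convention and may differ in detail from the paper's own definitions.

    % Sketch of the general form of a neutrosophic mass (notation may differ
    % slightly from the paper's): each focal element receives a truth,
    % indeterminacy, and falsity component instead of a single scalar mass.
    \[
      m_n : G^\Theta \to [0,1]^3, \qquad
      m_n(A) = \bigl(t(A),\, i(A),\, f(A)\bigr),
    \]
    \[
      \sum_{A \in G^\Theta} \bigl(t(A) + i(A) + f(A)\bigr) = 1 ,
    \]
    % where t(A), i(A), f(A) are the masses of belief, indeterminacy, and
    % disbelief committed to A; setting i(A) = f(A) = 0 for all A recovers a
    % classical (Dempster--Shafer style) basic belief assignment.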


The Practicality Of Cloud Computing, Xiaohua (Cindy) Li Jan 2012


Librarian Publications

Since its inception, cloud computing has become the current paradigm. Organizations of different sizes and types have embraced the concept because of both its technological and economic advantages. Sacred Heart University Library has recently published its newly designed website on the cloud. For a small academic library, what does it mean to put its online data on the cloud? This paper analyzes and discusses the advantages of cloud computing, and some potential obstacles it creates, based on the author’s observations. This paper hopes the uniqueness of the case will contribute to the improvement of the cloud computing experience of other …


How Is M&S Interoperability Different From Other Interoperability Domains?, Andreas Tolk, Saikou Y. Diallo, Jose J. Padilla, Charles D. Turnitsa Jan 2012


Computational Modeling & Simulation Engineering Faculty Publications

During every standards workshop or event, examples of working interoperability solutions are used to motivate 'plug and play' standards for M&S as well, like standardized batteries for electronics, or the use of XML to exchange data between heterogeneous systems. While these are successful applications of standards, they are off the mark regarding M&S interoperability. The challenge of M&S is that the product that needs to be made interoperable is not the service or the system alone, but the model behind it as well. The paper shows that the alignment of conceptualizations is the real problem that is not …


Erasure Techniques In Mrd Codes, Florentin Smarandache, W.B. Vasantha Kandasamy, R. Sujatha, R.S. Raja Durai Jan 2012


Branch Mathematics and Statistics Faculty and Staff Publications

In this book the authors study erasure techniques in concatenated Maximum Rank Distance (MRD) codes. For the first time, the authors introduce the notion of concatenating MRD codes with binary codes, where the outer code is the RD code and the inner code is the binary code. The concatenated code consists of the codewords of the outer code expressed in terms of the alphabets of the inner code. This new class of codes is defined as CRM codes. This concatenation technique helps one to construct any CRM code of desired minimum distance …
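For background, the general concatenation construction that CRM codes specialize (with an RD/MRD outer code) can be sketched as follows; the parameters shown are the textbook ones rather than the book's own notation.

    % General form of code concatenation underlying the CRM construction:
    % each outer-code symbol is re-encoded with the binary inner code.
    \[
      \phi : \mathbb{F}_{2^m} \to C_{\mathrm{in}} \subseteq \mathbb{F}_2^{\,n_i},
      \qquad
      (c_1,\dots,c_{n_o}) \in C_{\mathrm{out}}
      \;\longmapsto\;
      \bigl(\phi(c_1),\dots,\phi(c_{n_o})\bigr).
    \]
    % If C_out is an [n_o, k_o, d_o] code over F_{2^m} and C_in is a binary
    % [n_i, m, d_i] code, the concatenated code is a binary code of length
    % n_o * n_i, dimension k_o * m, and minimum distance at least d_o * d_i.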