Open Access. Powered by Scholars. Published by Universities.®

Engineering Commons

Articles 1 - 30 of 720

Full-Text Articles in Engineering

Speeding Up The Quantification Of Contrast Sensitivity Functions Using Multidimensional Bayesian Active Learning, Shohaib Shaffiey Aug 2022

McKelvey School of Engineering Theses & Dissertations

No abstract provided.


Model-Based Deep Learning For Computational Imaging, Xiaojian Xu Aug 2022

McKelvey School of Engineering Theses & Dissertations

This dissertation addresses model-based deep learning for computational imaging. Our work is motivated by growing interest in combining imaging models, which provide data-consistency guarantees with respect to the observed measurements, with deep learning, which provides advanced data-driven prior modeling. Following this idea, we develop multiple algorithms that integrate classical model-based optimization with modern deep learning to enable efficient and reliable imaging. We demonstrate our algorithms by validating their performance on various imaging applications and by providing rigorous theoretical analysis.

The dissertation evaluates and extends three general frameworks, plug-and-play priors (PnP), regularized …
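The plug-and-play (PnP) framework named above can be sketched minimally: a data-consistency gradient step alternates with a denoiser that takes the place of the prior's proximal operator. Everything below (identity forward model, moving-average "denoiser", step size) is an illustrative assumption, not the dissertation's algorithm.

```python
import math

def box_denoise(x, width=5):
    # Stand-in "prior": a moving-average smoother. In a real PnP method
    # this slot holds a learned denoiser, e.g. a trained CNN.
    n, half = len(x), width // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def pnp_denoise(y, step=0.5, iters=50):
    # Plug-and-play proximal gradient for an identity forward model:
    # alternate a gradient step on 0.5*||x - y||^2 with a denoising step.
    x = list(y)
    for _ in range(iters):
        x = box_denoise([xi - step * (xi - yi) for xi, yi in zip(x, y)])
    return x

# Toy demo: recover a smooth signal corrupted by alternating-sign noise.
truth = [math.sin(2 * math.pi * i / 200) for i in range(200)]
noisy = [t + 0.3 * (-1) ** i for i, t in enumerate(truth)]
recon = pnp_denoise(noisy)
```

The denoiser is the only problem-specific ingredient, which is what makes the framework "plug-and-play": swapping in a stronger denoiser changes the prior without touching the data-consistency step.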


Development Of The Assessment Of Clinical Prediction Model Transportability (Apt) Checklist, Sean Chonghwan Yu Aug 2022

McKelvey School of Engineering Theses & Dissertations

Clinical Prediction Models (CPMs) have long been used for Clinical Decision Support (CDS), initially based on simple clinical scoring systems and increasingly based on complex machine learning models relying on large-scale Electronic Health Record (EHR) data. External implementation – the application of a CPM at sites other than the one where it was developed – is valuable as it reduces the need for redundant de novo CPM development, enables CPM usage by low-resource organizations, facilitates external validation studies, and encourages collaborative development of CPMs. Further, adoption of externally developed CPMs has been facilitated by ongoing interoperability efforts in standards, policy, and …


The Challenges Of Applying Computational Legal Analysis To Mhealth Security And Privacy Regulations, Brian Tung Aug 2021

McKelvey School of Engineering Theses & Dissertations

As our world has grown in complexity, so have our laws. By one measure, the United States Code has grown over 30x as long since 1935, and the 186,000-page Code of Federal Regulations has grown almost 10x in length since 1938. Our growing legal system is too complicated; it’s impossible for people to know all the laws that apply to them. However, people are still subject to the law, even if they are unfamiliar with it. Therein lies the need for computational legal analysis. Tools of computation (e.g., data visualization, algorithms, and artificial intelligence) have the potential to transform civic …


Machine Learning For Analog/Mixed-Signal Integrated Circuit Design Automation, Weidong Cao Aug 2021

McKelvey School of Engineering Theses & Dissertations

Analog/mixed-signal (AMS) integrated circuits (ICs) play an essential role in electronic systems by processing analog signals and performing data conversion to bridge the analog physical world and our digital information world. Their ubiquity powers diverse applications ranging from smart devices and autonomous cars to critical infrastructure. Despite this importance, conventional design strategies for AMS circuits still follow an expensive and time-consuming manual process and are unable to meet the exponentially growing productivity demands from industry or to satisfy the rapidly changing design specifications of many emerging applications. Design automation of AMS ICs is thus the key to tackling these challenges and has been …


Continuous-Time And Complex Growth Transforms For Analog Computing And Optimization, Oindrila Chatterjee Aug 2021

McKelvey School of Engineering Theses & Dissertations

Analog computing is a promising and practical candidate for solving complex computational problems involving algebraic and differential equations. At the fundamental level, an analog computing framework can be viewed as a dynamical system that evolves following fundamental physical principles, like energy minimization, to solve a computing task. Additionally, conservation laws, such as conservation of charge, energy, or mass, provide a natural way to couple and constrain spatially separated variables. Taking a cue from these observations, in this dissertation, I have explored a novel dynamical system-based computing framework that exploits naturally occurring analog conservation constraints to solve a variety of optimization …


A Neuromorphic Machine Learning Framework Based On The Growth Transform Dynamical System, Ahana Gangopadhyay Aug 2021

McKelvey School of Engineering Theses & Dissertations

As computation increasingly moves from the cloud to the source of data collection, there is a growing demand for specialized machine learning algorithms that can perform learning and inference at the edge in energy- and resource-constrained environments. In this regard, we can take inspiration from small biological systems like insect brains that exhibit high energy efficiency within a small form factor, and show superior cognitive performance using fewer, coarser neural operations (action potentials, or spikes) than the high-precision floating-point operations used in deep learning platforms. Attempts at bridging this gap using neuromorphic hardware have produced silicon brains that are orders of magnitude …


Photoacoustic Imaging, Feature Extraction, And Machine Learning Implementation For Ovarian And Colorectal Cancer Diagnosis, Eghbal Amidi Aug 2021

McKelvey School of Engineering Theses & Dissertations

Among all cancers related to women’s reproductive systems, ovarian cancer has the highest mortality rate. Pelvic examination, transvaginal ultrasound (TVUS), and blood testing for cancer antigen 125 (CA-125) are the conventional screening tools for ovarian cancer, but they offer very low specificity. Other tools, such as magnetic resonance imaging (MRI), computed tomography (CT), and positron emission tomography (PET), also have limitations in detecting small lesions. In the USA, considering men and women separately, colorectal cancer is the third most common cause of cancer-related death; for men and women combined, it is the second leading cause of cancer deaths. …


Improving Additional Adversarial Robustness For Classification, Michael Guo May 2021

McKelvey School of Engineering Theses & Dissertations

Although neural networks have achieved remarkable success in classification, adversarial robustness remains a significant concern. There is now a series of approaches for designing adversarial examples, along with methods for defending against them. This paper consists of two projects. In our first work, we propose an approach that leverages cognitive salience to add robustness on top of these methods. Specifically, for image classification, we split an image into the foreground (salient region) and background (the rest) and allow significantly larger adversarial perturbations in the background to produce stronger attacks. Furthermore, we show that adversarial training with dual-perturbation attacks yields …
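The foreground/background split described above can be sketched as an FGSM-style step with a spatially varying budget. The flattened "image", the budget values, and the function name are all illustrative assumptions, not the thesis's exact formulation.

```python
def dual_perturbation_step(image, grad_sign, foreground,
                           eps_fg=0.03, eps_bg=0.3):
    # One attack step with a spatially varying budget: perturbations stay
    # small in the salient foreground but may be much larger in the
    # background, producing a stronger yet less visible attack.
    adv = []
    for pixel, g, is_fg in zip(image, grad_sign, foreground):
        eps = eps_fg if is_fg else eps_bg
        adv.append(min(1.0, max(0.0, pixel + eps * g)))  # clip to [0, 1]
    return adv

# Toy flattened "image": same gradient sign pattern, different budgets.
adv = dual_perturbation_step([0.5, 0.5, 0.5, 0.5], [1, 1, -1, -1],
                             [True, False, True, False])
```

In a real attack the gradient sign would come from backpropagating the classifier's loss, and the foreground mask from a saliency model.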


Assessment And Diagnosis Of Human Colorectal And Ovarian Cancer Using Optical Imaging And Computer-Aided Diagnosis, Yifeng Zeng May 2021

McKelvey School of Engineering Theses & Dissertations

Tissue optical scattering has recently emerged as an important diagnostic parameter associated with early tumor development and progression. To characterize the differences between benign and malignant colorectal tissues, we have created an automated optical scattering coefficient mapping algorithm using an optical coherence tomography (OCT) system. A novel feature called the angular spectrum index quantifies the scattering coefficient distribution. In addition to scattering, subsurface morphological changes are also associated with the development of colorectal cancer. We have observed a specific mucosa structure indicating normal human colorectal tissue, and have developed a real-time pattern recognition neural network to localize this specific structure …


A Collaborative Knowledge-Based Security Risk Assessments Solution Using Blockchains, Tara Thaer Salman May 2021

McKelvey School of Engineering Theses & Dissertations

Artificial intelligence and machine learning have recently gained wide adoption in building intelligent yet simple and proactive security risk assessment solutions. Intrusion identification, malware detection, and threat intelligence are examples of security risk assessment applications that have been revolutionized by these breakthrough technologies. With the increased risk and severity of cyber-attacks and the distributed nature of modern threats and vulnerabilities, it becomes critical to propose a distributed intelligent assessment solution that evaluates security risks collaboratively. Blockchain, as a decade-old successful distributed ledger technology, has the potential to build such collaborative solutions. However, in order to be used for such solutions, …


Deep Learning For Task-Based Image Quality Assessment In Medical Imaging, Weimin Zhou Jan 2021

McKelvey School of Engineering Theses & Dissertations

It has been advocated to use objective measures of image quality (IQ) for assessing and optimizing medical imaging systems. Objective measures of IQ quantify the performance of an observer at a specific diagnostic task. Binary signal detection tasks and joint signal detection and localization (detection-localization) tasks are commonly considered in medical imaging. When optimizing imaging systems for binary signal detection tasks, the performance of the Bayesian Ideal Observer (IO) has been advocated for use as a figure-of-merit (FOM). The IO maximizes the observer performance that is summarized by the receiver operating characteristic (ROC) curve. When signal detection-localization tasks are considered, …
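The ROC summary mentioned above can be computed empirically. Below is a minimal sketch of the area under the ROC curve via the Mann-Whitney statistic; the scores are toy values, not results from the dissertation.

```python
def empirical_auc(signal_scores, noise_scores):
    # Empirical area under the ROC curve via the Mann-Whitney statistic:
    # the probability that a randomly chosen signal-present score outranks
    # a randomly chosen signal-absent score (ties count half).
    wins = 0.0
    for s in signal_scores:
        for n in noise_scores:
            if s > n:
                wins += 1.0
            elif s == n:
                wins += 0.5
    return wins / (len(signal_scores) * len(noise_scores))

# Toy observer outputs for signal-present and signal-absent images.
auc = empirical_auc([0.9, 0.7, 0.4], [0.5, 0.3, 0.2])
```

An ideal observer maximizes exactly this quantity, which is why AUC serves as a task-based figure-of-merit when optimizing an imaging system for binary detection.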


Holistic Control For Cyber-Physical Systems, Yehan Ma Jan 2021

McKelvey School of Engineering Theses & Dissertations

The Industrial Internet of Things (IIoT) is transforming industries through emerging technologies such as wireless networks, edge computing, and machine learning. However, IIoT technologies are not ready for control systems for industrial automation, which demand control performance for physical processes, resiliency to both cyber and physical disturbances, and energy efficiency. To meet the challenges of IIoT-driven control, we propose holistic control as a cyber-physical system (CPS) approach to next-generation industrial automation systems. In contrast to traditional industrial automation systems, where computing, communication, and control are managed in isolation, holistic control orchestrates the management of cyber platforms (networks and computing platforms) …


Machine Learning Morphisms: A Framework For Designing And Analyzing Machine Learning Workflows, Applied To Separability, Error Bounds, And 30-Day Hospital Readmissions, Eric Zenon Cawi Jan 2021

McKelvey School of Engineering Theses & Dissertations

A machine learning workflow is the sequence of tasks necessary to implement a machine learning application, including data collection, preprocessing, feature engineering, exploratory analysis, and model training/selection. In this dissertation we propose the Machine Learning Morphism (MLM) as a mathematical framework to describe the tasks in a workflow. The MLM is a tuple consisting of: Input Space, Output Space, Learning Morphism, Parameter Prior, Empirical Risk Function. This contains the information necessary to learn the parameters of the learning morphism, which represents a workflow task. In chapter 1, we give a short review of typical tasks present in a workflow, as …
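The tuple structure described above might be sketched as follows. The field names, the string "spaces", and the composition helper are illustrative assumptions, not the dissertation's formal definitions.

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class MLMorphism:
    # One workflow task as the tuple named in the abstract: input space,
    # output space, learning morphism, parameter prior, empirical risk.
    # Spaces are plain labels here rather than formal sets.
    input_space: str
    output_space: str
    morphism: Callable[[Any], Any]
    prior: Any = None
    risk: Optional[Callable[[Any, Any], float]] = None

def compose(f: MLMorphism, g: MLMorphism) -> MLMorphism:
    # Chaining two tasks yields a task from f's input space to g's output
    # space, mirroring how preprocessing feeds model training.
    return MLMorphism(f.input_space, g.output_space,
                      lambda x: g.morphism(f.morphism(x)))

scale = MLMorphism("raw features", "scaled features", lambda x: 2 * x)
fit = MLMorphism("scaled features", "prediction", lambda x: x + 1)
workflow = compose(scale, fit)
```

Representing each task with the same tuple is what lets a whole workflow be analyzed as one composed morphism.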


Convex Relaxations For Particle-Gradient Flow With Applications In Super-Resolution Single-Molecule Localization Microscopy, Hesam Mazidisharfabadi Aug 2020

McKelvey School of Engineering Theses & Dissertations

Single-molecule localization microscopy (SMLM) techniques have become advanced bioanalytical tools by quantifying the positions and orientations of molecules in space and time at the nanoscale. With the noisy and heterogeneous nature of SMLM datasets in mind, we discuss leveraging particle-gradient flow 1) for quantifying the accuracy of localization algorithms with and without ground truth and 2) as a basis for novel, model-driven localization algorithms with empirically robust performance. Using experimental data, we demonstrate that overlapping images of molecules, a typical consequence of densely packed biological structures, cause biases in position estimates and reconstruction artifacts. To minimize such biases, we develop …


Domain Specific Computing In Tightly-Coupled Heterogeneous Systems, Anthony Michael Cabrera Aug 2020

McKelvey School of Engineering Theses & Dissertations

Over the past several decades, researchers and programmers across many disciplines have relied on Moore’s law and Dennard scaling for increases in the compute capability of modern processors. However, recent data suggest that the number of transistors per square inch on integrated circuits is falling behind Moore’s law’s projection due to the breakdown of Dennard scaling at smaller semiconductor process nodes. This has signaled the beginning of a new “golden age in computer architecture,” in which the paradigm will be shifted from improving traditional processor performance for general tasks to architecting hardware that executes a class of applications in a …


Investigating Single Precision Floating General Matrix Multiply In Heterogeneous Hardware, Steven Harris Aug 2020

McKelvey School of Engineering Theses & Dissertations

The fundamental operation of matrix multiplication is ubiquitous across a myriad of disciplines. Yet, the identification of new optimizations for matrix multiplication remains relevant for emerging hardware architectures and heterogeneous systems. Frameworks such as OpenCL enable computation orchestration on existing systems, and its availability using the Intel High Level Synthesis compiler allows users to architect new designs for reconfigurable hardware using C/C++. Using the HARPv2 as a vehicle for exploration, we investigate the utility of several of the most notable matrix multiplication optimizations to better understand the performance portability of OpenCL and the implications for such optimizations on this and …
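One of the classic optimizations such work investigates, loop tiling for GEMM, can be sketched in plain Python to show the blocked loop structure. The block size and matrix shapes are arbitrary; real OpenCL/HLS kernels look quite different, and here Python serves only to illustrate the traversal order.

```python
def blocked_matmul(A, B, block=2):
    # Tiled GEMM: iterate over sub-blocks of A, B, and C so each tile is
    # reused while it is hot in fast memory, the same locality argument
    # that motivates tiling on caches, FPGAs, and GPUs alike.
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for ii in range(0, n, block):
        for kk in range(0, k, block):
            for jj in range(0, m, block):
                for i in range(ii, min(ii + block, n)):
                    for p in range(kk, min(kk + block, k)):
                        a = A[i][p]
                        for j in range(jj, min(jj + block, m)):
                            C[i][j] += a * B[p][j]
    return C

A = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
B = [[1, 0], [0, 1], [1, 1], [2, 2]]
C = blocked_matmul(A, B, block=2)
```

The arithmetic is identical to the naive triple loop; only the iteration order changes, which is why tiling is a pure performance optimization.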


Exploring Usage Of Web Resources Through A Model Of Api Learning, Finn Voichick May 2020

McKelvey School of Engineering Theses & Dissertations

Application programming interfaces (APIs) are essential to modern software development, and new APIs are frequently being produced. Consequently, software developers must regularly learn new APIs, which they typically do on the job from online resources rather than in a formal educational context. The Kelleher–Ichinco COIL model, an acronym for “Collection and Organization of Information for Learning,” was recently developed to model the entire API learning process, drawing from information foraging theory, cognitive load theory, and external memory research. We ran an exploratory empirical user study in which participants performed a programming task using the React API with the goal of …


Exploring Attacks And Defenses In Additive Manufacturing Processes: Implications In Cyber-Physical Security, Nicholas Deily May 2020

McKelvey School of Engineering Theses & Dissertations

Many industries are rapidly adopting additive manufacturing (AM) because of the added versatility this technology offers over traditional manufacturing techniques. But with AM, there comes a unique set of security challenges that must be addressed. In particular, the issue of part verification is critically important given the growing reliance of safety-critical systems on 3D printed parts. In this thesis, the current state of part verification technologies will be examined in the context of AM-specific geometric-modification attacks, and an automated tool for 3D printed part verification will be presented. This work will cover: 1) the impacts of malicious attacks on …


Elicitation And Aggregation Of Data In Knowledge Intensive Crowdsourcing, Dohoon Kim May 2020

All Computer Science and Engineering Research

With significant advances in internet access and connectivity, crowdsourcing has gained popularity and various crowdsourcing platforms have emerged. This project focuses on knowledge-intensive crowdsourcing, in which agents are presented with tasks that require knowledge in a certain domain. Knowledge-intensive crowdsourcing requires agents to have experience in the specific domain. Given limited resources and the nature of sourcing from a crowd, a platform is likely to draw agents with different levels of expertise and knowledge, and posing the same task to everyone can result in poor performance. Some agents can give better information when they are asked a more general question or a more knowledge-specific task …


A Virtual 4d Ct Scanner, Xiwen Li May 2020

All Computer Science and Engineering Research

4D CT scanning is widely used in medical imaging. Images are acquired in phases, which lets us track the motion of organs such as the heart. However, phased acquisition also introduces motion artifacts, and much research focuses on removing them. It is difficult to acquire artifact data from a real CT scanner, so in this project we implement a virtual CT machine to simulate a real 4D CT scan. We also conduct experiments to check its clinical realism with respect to respiratory and cardiac motion parameters.


Centrality Of Blockchain, Zixuan Li May 2020

All Computer Science and Engineering Research

Decentralization is widely recognized as one of the most important advantages of blockchains over legacy systems. However, decentralization is usually discussed at the consensus layer, and recent research shows a trend toward centralization in several blockchain subsystems. In this project, we measured the centralization of Bitcoin and Ethereum at the source code, development ecosystem, and network node levels. We found that the choice of programming language is highly centralized, that code cloning is very common within the Bitcoin and Ethereum communities, and that the distribution of developer contributions is highly centralized. We further discuss how these forms of centralization could lead to security issues in blockchains. …
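One hypothetical way to quantify a contribution distribution's centralization, as measured above, is a Gini coefficient over per-developer commit counts. The metric choice and the toy data are illustrative assumptions, not necessarily what the project used.

```python
def gini(contributions):
    # Gini coefficient as a centralization measure: 0 means contributions
    # are spread evenly across developers; values near 1 mean a few
    # developers dominate the project.
    xs = sorted(contributions)
    n, total = len(xs), sum(xs)
    if total == 0:
        return 0.0
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * cum) / (n * total) - (n + 1.0) / n

g_equal = gini([5, 5, 5, 5])    # perfectly even contributions
g_skew = gini([0, 0, 0, 10])    # one developer wrote everything
```

The same statistic applies at other levels too, e.g. over stake or hash power per node.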


Solving Disappearance At Gastech With Visual Analytic Techniques, Saulet Yskak May 2020

All Computer Science and Engineering Research

We are living in a society where images and charts speak louder than words. Information visualization therefore plays a major role in solving complex problems, since it provides a visual summary of data that makes it easier to identify trends and patterns.

In this master’s project, I propose a web-based visual analytics tool for analyzing complex email and time-based event-series data. The visual analytics framework uses test data from the IEEE VAST Challenge 2014: Mini-Challenge 1, which concentrated on the disappearance of employees of the fictional GAStech company, but the tool allows users …


Predicting Disease Progression Using Deep Recurrent Neural Networks And Longitudinal Electronic Health Record Data, Seunghwan Kim May 2020

McKelvey School of Engineering Theses & Dissertations

Electronic Health Records (EHRs) are widely adopted throughout healthcare systems and collect longitudinal data that can be used to describe patient phenotypes. From the underlying data structures used in the EHR, discrete data can be extracted and analyzed to improve patient care and outcomes via tasks such as risk stratification and prospective disease management. Temporality is innately present in EHR data, however, and traditional classification models are limited in this context by the cross-sectional nature of their training and prediction processes. Finding temporal patterns in EHR is …
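The recurrence this abstract motivates can be illustrated with a single-unit toy model over a patient's visit sequence: each visit updates a hidden state that carries history forward, so the prediction depends on order, not just on the set of events. The weights and the scalar visit feature are arbitrary assumptions; real models use gated, multi-layer RNNs over coded events.

```python
import math

def rnn_risk(visits, w_in=0.8, w_rec=0.5, w_out=1.0):
    # Single-unit recurrent model: fold each visit's feature into a
    # hidden state, then squash the final state into a risk score.
    h = 0.0
    for x in visits:
        h = math.tanh(w_in * x + w_rec * h)
    return 1.0 / (1.0 + math.exp(-w_out * h))  # sigmoid risk
```

Because the state is updated sequentially, a recent abnormal visit raises the score more than the same visit far in the past, exactly the temporal sensitivity a cross-sectional classifier lacks.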


The Effects Of Mixed-Initiative Visualization Systems On Exploratory Data Analysis, Alvitta Ottley, Adam Kern Jan 2020

All Computer Science and Engineering Research

The primary purpose of information visualization is to act as a window between a user and the data. Historically, this has been accomplished via a single-agent framework: the only decision-maker in the relationship between visualization system and analyst is the analyst herself. Yet this framework arose not from first principles, but from necessity. Before this decade, computers were limited in their decision-making capabilities, especially in the face of large, complex datasets and visualization systems. This paper aims to present the design and evaluation of a mixed-initiative system that aids the user in handling large, complex datasets and dense visualization systems. …


Point Cloud Processing With Neural Networks, Stephanie Miller, Jiahao Li Dec 2019

All Computer Science and Engineering Research

In this project, we explore new techniques and architectures for applying deep neural networks when the input is point cloud data. We first consider applying convolutions on regular pixel and voxel grids, using polynomials of point coordinates and Fourier transforms to get a rich feature representation for all points mapped to the same pixel or voxel. We also apply these ideas to generalize the recently proposed "interpolated convolution", by learning continuous-space kernels as a combination of polynomial and Fourier basis kernels. Experiments on the ModelNet40 dataset demonstrate that our methods have superior performance over the baselines in 3D object recognition.


Static Taint Analysis Of Binary Executables Using Architecture-Neutral Intermediate Representation, Elaine Cole Dec 2019

All Computer Science and Engineering Research

Ghidra, the National Security Agency’s powerful reverse engineering framework, was recently released open-source in April 2019 and is capable of lifting instructions from a wide variety of processor architectures into its own register transfer language, called p-code. In this project, we present a new tool that leverages Ghidra’s architecture-neutral intermediate representation to construct a control flow graph modeling all program executions of a given binary and to apply static taint analysis. This technique is capable of identifying the information flow of malicious input from untrusted sources that may interact with key sinks or parts of the system, without needing access to …
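The core of the analysis described above, propagating taint from untrusted sources over a flow graph, can be sketched as a reachability computation. The edge list and node names below are hypothetical; Ghidra's p-code lifting and real sink checks are omitted.

```python
from collections import deque

def propagate_taint(edges, sources):
    # Forward taint propagation: starting from untrusted sources, walk
    # the flow edges breadth-first and collect every node the taint
    # can reach. Tainted nodes that are also sinks would be flagged.
    succs = {}
    for src, dst in edges:
        succs.setdefault(src, []).append(dst)
    tainted = set(sources)
    work = deque(sources)
    while work:
        node = work.popleft()
        for nxt in succs.get(node, []):
            if nxt not in tainted:
                tainted.add(nxt)
                work.append(nxt)
    return tainted

# Hypothetical flow edges for a small program.
edges = [("read_input", "parse"), ("parse", "exec_cmd"),
         ("config_file", "exec_cmd"), ("logger", "exit")]
tainted = propagate_taint(edges, {"read_input"})
```

Running the same fixpoint over an architecture-neutral IR is what makes the approach portable across processor families.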


Pipelined Parallelism In A Work-Stealing Scheduler, Thomas Kelly Sep 2019

All Computer Science and Engineering Research

A pipeline is a particular type of parallel program structure, often used to represent loops with cross-iteration dependencies. Pipelines cannot be expressed with the typical parallel language constructs offered by most environments. Therefore, in order to run pipelines, it is necessary to write a parallel language and scheduler with specialized support for them. Some such schedulers are written exclusively for pipelines and unable to run any other type of program, which allows for certain optimizations that take advantage of the pipeline structure. Other schedulers implement support for pipelines on top of a general-purpose scheduling algorithm. One example of such an …
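The pipeline structure described above can be sketched with one thread per stage and FIFO queues between them, so stage i of iteration k overlaps with stage i+1 of iteration k-1. This is a toy linear pipeline under simplifying assumptions; an actual work-stealing scheduler with pipeline support is far more involved.

```python
import queue
import threading

def pipeline(items, stages):
    # Run a linear pipeline: each stage is a thread consuming from the
    # previous stage's queue. A sentinel object shuts the stages down
    # in order once the input is exhausted.
    DONE = object()
    qs = [queue.Queue() for _ in range(len(stages) + 1)]

    def run(stage, qin, qout):
        while True:
            item = qin.get()
            if item is DONE:
                qout.put(DONE)  # pass shutdown downstream
                return
            qout.put(stage(item))

    threads = [threading.Thread(target=run, args=(s, qs[i], qs[i + 1]))
               for i, s in enumerate(stages)]
    for t in threads:
        t.start()
    for item in items:
        qs[0].put(item)
    qs[0].put(DONE)
    out = []
    while True:
        item = qs[-1].get()
        if item is DONE:
            break
        out.append(item)
    for t in threads:
        t.join()
    return out

result = pipeline(range(5), [lambda x: x * x, lambda x: x + 1])
```

Because each stage is a single FIFO consumer, output order matches input order, which is the cross-iteration dependency structure pipelines preserve.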


Decoupling Information And Connectivity Via Information-Centric Transport, Hila Ben Abraham Aug 2019

McKelvey School of Engineering Theses & Dissertations

The power of Information-Centric Networking architectures (ICNs) lies in their abstraction for communication --- the request for named data. This abstraction was popularized by the HyperText Transfer Protocol (HTTP) as an application-layer abstraction, and was extended by ICNs to also serve as their network-layer abstraction. In recent years, network mechanisms for ICNs, such as scalable name-based forwarding, named-data routing and in-network caching, have been widely explored and researched. However, to the best of our knowledge, the impact of this network abstraction on ICN applications has not been explored or well understood. The motivation of this dissertation is to address this …
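The request-for-named-data abstraction can be sketched as a node that answers Interests from a content store before asking upstream. This is a toy model with hypothetical names, not a real ICN stack: forwarding tables, Interest aggregation, and cache eviction are all omitted.

```python
class NamedDataNode:
    # Minimal named-data node: a consumer expresses an Interest in a
    # name; the node replies from its content store if it holds a cached
    # copy, otherwise fetches upstream and caches the result.
    def __init__(self, upstream):
        self.upstream = upstream   # callback: name -> data (producer)
        self.store = {}            # in-network content store (cache)
        self.cache_hits = 0

    def interest(self, name):
        if name in self.store:
            self.cache_hits += 1
            return self.store[name]
        data = self.upstream(name)
        self.store[name] = data
        return data

node = NamedDataNode(upstream=lambda name: "data:" + name)
first = node.interest("/video/seg1")
second = node.interest("/video/seg1")
```

The consumer names *what* it wants rather than *where* to fetch it, so any node holding the named data can satisfy the request, which is exactly the decoupling of information from connectivity that the dissertation studies.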