Engineering Commons

Open Access. Powered by Scholars. Published by Universities.®

Articles 1 - 16 of 16

Full-Text Articles in Engineering

Preserving User Data Privacy Through The Development Of An Android Solid Library, Alexandria Lim May 2023

Computer Science and Computer Engineering Undergraduate Honors Theses

In today’s world, where nearly all activity on the internet produces data, user data privacy and autonomy are not prioritized. Companies called data brokers gather billions of elements of personal information, ranging from purchase history and credit card records to downloaded applications and service subscriptions. This information can be analyzed and inferences drawn from it, categorizing people into groups that range in sensitivity from hobbies to race and income class. Not only do these data brokers routinely overlook data privacy, but this mass of data also makes them extremely …


How Blockchain Solutions Enable Better Decision Making Through Blockchain Analytics, Sammy Ter Haar May 2022

Information Systems Undergraduate Honors Theses

Since the advent of computers, data scientists have been able to engineer devices that increase individuals’ opportunities to communicate with one another. In the 1990s the internet took hold, though many people did not understand its utility. Fast forward 30 years, and we cannot live without our connection to the internet. Early adopters knew the internet of information, where individuals posted blogs for others to read, as Web 1.0. As the web progressed, platforms became social, allowing individuals in different areas to communicate and engage with each other; this was known as Web 2.0. As Dr. …


Modeling Damage Spread, Assessment, And Recovery Of Critical Systems, Justin Burns May 2022

Graduate Theses and Dissertations

Critical infrastructure systems have recently become more vulnerable to attacks on their data systems through internet connectivity. If an attacker succeeds in breaching a system’s defenses, it is imperative that the system’s operations be restored as quickly as possible. This thesis focuses on damage assessment and recovery following an attack. A literature review of work in both database protection and critical infrastructure protection is conducted first, and the thesis then defines how damage affects the relationships between data and software. The thesis then proposes a model that uses a graph construction to show the cascading effects within a system …
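The abstract cuts off before the model details, but the cascading-effects idea can be sketched as a walk over a dependency graph. The following is a minimal illustration only; the node names, edge semantics, and breadth-first propagation rule are assumptions, not the thesis’s actual construction.

# Hypothetical sketch of cascading damage over a data/software dependency graph.
# The graph, node names, and propagation rule are illustrative assumptions only.
from collections import deque

# Directed edges point from a damaged item to the items that depend on it.
dependents = {
    "sensor_db": ["state_estimator"],
    "state_estimator": ["control_software", "operator_display"],
    "control_software": ["actuator_log"],
    "operator_display": [],
    "actuator_log": [],
}

def cascade(initially_damaged):
    """Breadth-first spread: anything that depends on a damaged item is suspect."""
    damaged = set(initially_damaged)
    queue = deque(initially_damaged)
    while queue:
        item = queue.popleft()
        for dep in dependents.get(item, []):
            if dep not in damaged:
                damaged.add(dep)
                queue.append(dep)
    return damaged

print(sorted(cascade({"sensor_db"})))
# ['actuator_log', 'control_software', 'operator_display', 'sensor_db', 'state_estimator']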


Optimized Damage Assessment And Recovery Through Data Categorization In Critical Infrastructure System., Shruthi Ramakrishnan May 2022

Graduate Theses and Dissertations

Critical infrastructures (CI) play a vital role in most fields and sectors worldwide. They contribute significantly to national economies and to the wellbeing of society. They are tightly coupled and interconnected, and their interdependencies make them highly complex systems. Thus, when damage occurs in a CI system, these complex interdependencies subject it to cascading effects that propagate quickly from one infrastructure to another, resulting in widespread service degradation, which in turn causes economic and societal harm. The propagation of cascading effects of disruptive events could be handled efficiently if the assessment and …
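As a rough illustration of how data categorization could drive recovery ordering, the sketch below sorts damaged items by an assumed category priority; the categories, priorities, and item names are hypothetical and not taken from the thesis.

# Hypothetical sketch of category-driven recovery ordering.
# Categories, priorities, and damaged items are illustrative assumptions only.
CATEGORY_PRIORITY = {"safety_critical": 0, "operational": 1, "historical": 2}

damaged_items = [
    {"name": "archived_readings", "category": "historical"},
    {"name": "breaker_status", "category": "safety_critical"},
    {"name": "billing_records", "category": "operational"},
]

# Recover the most critical categories of data first.
recovery_order = sorted(damaged_items,
                        key=lambda item: CATEGORY_PRIORITY[item["category"]])
print([item["name"] for item in recovery_order])
# ['breaker_status', 'billing_records', 'archived_readings']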


Analysis Of Gpu Memory Vulnerabilities, Jarrett Hoover May 2022

Computer Science and Computer Engineering Undergraduate Honors Theses

Graphics processing units (GPUs) have become a widely used technology for various purposes. While their intended use is accelerating graphics rendering, their parallel computing capabilities have expanded their use into other areas, including computer gaming, deep learning for artificial intelligence, and cryptocurrency mining. Their rise in popularity has led to research into several security aspects, including this paper’s focus, memory vulnerabilities. Prior research has documented many vulnerabilities, including GPUs not implementing address space layout randomization, not zeroing out memory after deallocation, and not initializing newly allocated memory. These vulnerabilities can lead to a victim’s sensitive data being leaked to an …
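For a sense of what the uninitialized-allocation issue looks like in practice, here is a minimal single-process sketch using CuPy; it assumes a CUDA-capable GPU and the cupy package, and the attacks documented in the literature typically cross process or user boundaries rather than staying inside one program as this toy does.

# Hypothetical illustration of reading uninitialized GPU memory with CuPy.
# Assumes a CUDA-capable GPU; names and sizes are illustrative only.
import cupy as cp

# A "victim" fills a device buffer with sensitive values, then releases it.
secret = cp.full(1 << 20, 42.0, dtype=cp.float32)  # pretend this is sensitive data
del secret                                         # returned to CuPy's memory pool, not zeroed

# Later code allocates uninitialized memory of the same size. cupy.empty does
# not clear the allocation, so stale contents of a reused block may be visible.
leak = cp.empty(1 << 20, dtype=cp.float32)
print("possible residue:", cp.asnumpy(leak[:8]))   # may print 42.0s rather than zeros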


Design, Extraction, And Optimization Tool Flows And Methodologies For Homogeneous And Heterogeneous Multi-Chip 2.5d Systems, Md Arafat Kabir Dec 2021

Graduate Theses and Dissertations

The chip and packaging industries are making significant progress in 2.5D design as a result of the increasing popularity of its applications. In advanced high-density 2.5D packages, package redistribution layers become similar to chip Back-End-of-Line routing layers, and the gap between them scales down as pin density improves. Chiplet-package interactions become significant and severely affect system performance and reliability. Moreover, 2.5D integration offers opportunities to apply novel design techniques. The traditional die-by-die design approach neither carefully considers these interactions nor fully exploits cross-boundary design opportunities.

This thesis presents chiplet-package cross-boundary design, extraction, analysis, and optimization tool flows and methodologies for high-density …


Privacy-Preserving Cloud-Assisted Data Analytics, Wei Bao Jul 2021

Graduate Theses and Dissertations

Nowadays, industries are collecting massive and exponentially growing amounts of data that can be used to extract useful insights for improving various aspects of our lives. Data analytics (e.g., via the use of machine learning) has been extensively applied to make important decisions in various real-world applications. However, it is challenging for resource-limited clients to analyze their data efficiently when its scale is large. Additionally, data resources are increasingly distributed among different owners. At the same time, users' data may contain private information that needs to be protected.

Cloud computing has become more and more popular in …


Securing Fog Federation From Behavior Of Rogue Nodes, Mohammed Saleh H. Alshehri May 2021

Graduate Theses and Dissertations

As the technological revolution advanced, information security evolved with an increased need to protect confidential data on the internet. Individuals and organizations typically prefer outsourcing their confidential data to the cloud for processing and storage. As promising as the cloud computing paradigm is, it creates challenges, from data security to latency issues with data computation and delivery to end users. In response to these challenges, Cisco introduced the fog computing paradigm in 2012. The intent was to overcome issues such as latency and communication overhead and to bring computing and storage resources close to the ground and the …


Data Forgery Detection In Automatic Generation Control: Exploration Of Automated Parameter Generation And Low-Rate Attacks, Yatish R. Dubasi May 2021

Computer Science and Computer Engineering Undergraduate Honors Theses

Automatic Generation Control (AGC) is a key control system utilized in electric power systems. AGC uses frequency and tie-line power flow measurements to determine the Area Control Error (ACE). ACE is then used by the AGC to adjust power generation and maintain an acceptable power system frequency. Attackers might inject false frequency and/or tie-line power flow measurements to mislead AGC into falsely adjusting power generation, which can harm power system operations. Various data forgery detection models are studied in this thesis. First, to make the use of predictive detection models easier for users, we propose a method for automated generation …
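The abstract does not give the ACE expression, but one common textbook form combines the tie-line interchange error with a frequency-bias term. The sketch below uses that form with an illustrative bias value; sign and unit conventions (e.g., MW per 0.1 Hz) vary between balancing authorities and are not taken from this thesis.

def area_control_error(p_tie_actual_mw: float,
                       p_tie_sched_mw: float,
                       freq_actual_hz: float,
                       freq_sched_hz: float = 60.0,
                       bias_mw_per_hz: float = 250.0) -> float:
    """One common textbook form of the Area Control Error (ACE).

    ACE = (tie-line interchange error) + (frequency bias) * (frequency error).
    The bias value and conventions here are purely illustrative.
    """
    delta_p_tie = p_tie_actual_mw - p_tie_sched_mw
    delta_f = freq_actual_hz - freq_sched_hz
    return delta_p_tie + bias_mw_per_hz * delta_f

# Example: exporting 5 MW more than scheduled while frequency sags by 0.02 Hz.
print(area_control_error(105.0, 100.0, 59.98))  # 5 + 250 * (-0.02) = 0.0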


Lecture 11: The Road To Exascale And Legacy Software For Dense Linear Algebra, Jack Dongarra Apr 2021

Mathematical Sciences Spring Lecture Series

In this talk, we will look at the current state of high performance computing and at the next stage, extreme computing. With extreme computing, there will be fundamental changes in the character of floating point arithmetic and data movement. We will also look at how extreme-scale computing has caused algorithm and software developers to change their way of thinking about implementing and programming specific applications.


Lecture 01: Scalable Solvers: Universals And Innovations, David Keyes Apr 2021

Mathematical Sciences Spring Lecture Series

As simulation and analytics enter the exascale era, numerical algorithms, particularly implicit solvers that couple vast numbers of degrees of freedom, must span a widening gap between ambitious applications and austere architectures to support them. We present fifteen universals for researchers in scalable solvers: imperatives from computer architecture that scalable solvers must respect, strategies towards achieving them that are currently well established, and additional strategies currently being developed for an effective and efficient exascale software ecosystem. We consider recent generalizations of what it means to “solve” a computational problem, which suggest that we have often been “oversolving” them at the …


An Embarrassment Of Riches: Data Integration In Vr Pompeii, Adam Schoelz May 2018

Computer Science and Computer Engineering Undergraduate Honors Theses

It is fair to say that Pompeii is the most studied archaeological site in the world. Beyond the extensive remains of the city itself, the timing of its rediscovery and excavation places it in a unique historiographical position. The city has been continuously studied since the 18th century, with historians and archaeologists constantly reevaluating older sources as our knowledge of the ancient world expands. While several studies have approached the city from a data-driven perspective, none has taken a quantitative, holistic approach on the scale of the VR Pompeii project. Hyper-specificity has been the order …


Modeling Information Reliability And Maintenance: A Systematic Literature Review, Daysi A. Guerra Garcia Dec 2015

Industrial Engineering Undergraduate Honors Theses

Operating a business efficiently depends on effective everyday decision-making. In turn, those decisions are influenced by the quality of the data used in the decision-making process, and maintaining good data quality becomes more challenging as a business expands. Protecting the quality of data and of the information it generates is a challenge faced by many companies across all industrial sectors. As companies begin to use data from large databases, they will need to develop strategies for maintaining and assessing the reliability of the information they generate from that data. A considerable amount of literature exists on data …


Data Integrity Verification In Cloud Computing, Katanosh Morovat May 2015

Graduate Theses and Dissertations

Cloud computing is an architectural model that provides computing and storage capacity as a service over the internet. It should also provide secure services for users and owners of data. Cloud computing services are a completely internet-based technology in which data are stored and maintained in the data center of a cloud provider. Lack of appropriate control over that data can give rise to several security issues. As a result, some data stored in the cloud must be protected at all times. These types of data are called sensitive data. Sensitive data is defined as data that must be protected against …


An Open Source, Line Rate Datagram Protocol Facilitating Message Resiliency Over An Imperfect Channel, Christina Marie Smith Dec 2013

Graduate Theses and Dissertations

Remote Direct Memory Access (RDMA) is the transfer of data into buffers between two compute nodes without the involvement of a CPU or operating system (OS). The idea is borrowed from Direct Memory Access (DMA), which allows data within a compute node to be transferred without transiting through the CPU. RDMA is termed a zero-copy protocol because it eliminates the need to copy data between buffers within the protocol stack. Because of this and other features, RDMA promotes reliable, high-throughput, low-latency transfer for packet-switched networking. While the benefits of RDMA are well known and …


Parallelizing Scale Invariant Feature Transform On A Distributed Memory Cluster, Stanislav Bobovych Jan 2011

Inquiry: The University of Arkansas Undergraduate Research Journal

Scale Invariant Feature Transform (SIFT) is a computer vision algorithm that is widely used to extract features from images. We explored accelerating an existing implementation of this algorithm with message passing in order to analyze large data sets. We successfully tested two approaches to data decomposition to parallelize SIFT on a distributed-memory cluster.
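The article summary does not include the implementation, but one of the simplest data decompositions is to split each image into horizontal strips and let every rank extract features from its own strip. The sketch below illustrates that idea with mpi4py and OpenCV as stand-ins; the authors' actual code and decomposition strategies may differ.

# Hypothetical strip-wise data decomposition for SIFT on a distributed-memory
# cluster, using mpi4py and OpenCV as illustrative stand-ins.
# Run with, e.g.:  mpirun -n 4 python sift_strips.py image.png
import sys
import cv2
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Rank 0 loads the image and cuts it into one horizontal strip per rank.
strips = None
if rank == 0:
    img = cv2.imread(sys.argv[1], cv2.IMREAD_GRAYSCALE)
    strips = np.array_split(img, size, axis=0)
strip = comm.scatter(strips, root=0)

# Each rank runs SIFT on its own strip independently.
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(strip, None)

# Keypoint objects are awkward to ship over MPI, so send plain coordinates.
coords = np.array([kp.pt for kp in keypoints], dtype=np.float32)
all_coords = comm.gather(coords, root=0)
if rank == 0:
    total = sum(len(c) for c in all_coords)
    print(f"features found across {size} strips: {total}")

Note that strip-local keypoint coordinates would need each strip's row offset added back on rank 0, and features near the cut lines would call for overlapping strips; both are omitted here for brevity.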