Open Access. Powered by Scholars. Published by Universities.®

Computer Engineering Commons

Articles 1 - 4 of 4

Full-Text Articles in Computer Engineering

Performance Modeling Of Inline Compression With Software Caching For Reducing The Memory Footprint In Pysdc, Sansriti Ranjan Aug 2023

All Theses

Modern HPC applications compute and analyze massive amounts of data. The data volume is growing faster than memory capacity and storage improvements, leading to performance bottlenecks. An example is pySDC, a framework for solving collocation problems iteratively using parallel-in-time methods. These methods require storing and exchanging 3D volume data for each parallel point in time. If a simulation consists of M parallel-in-time stages, where the full spatial problem has to be stored for the next iteration, the memory demand for a single state variable is M × Nx × Ny × Nz per time-step. For an application simulation with many state …
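
As a back-of-the-envelope illustration of that M × Nx × Ny × Nz demand (a sketch for this listing, not code from the thesis), the following estimates the per-time-step footprint of one state variable; the grid sizes and double-precision dtype are assumptions for the example:

```python
# Illustrative estimate of the memory demand described above: one copy of the
# full spatial problem per parallel-in-time stage, for a single state variable.
import numpy as np

def footprint_bytes(M, Nx, Ny, Nz, dtype=np.float64):
    """Bytes needed per time-step for one state variable across M stages."""
    return M * Nx * Ny * Nz * np.dtype(dtype).itemsize

# Example: 8 stages of a 256^3 volume in double precision.
gib = footprint_bytes(M=8, Nx=256, Ny=256, Nz=256) / 2**30
print(f"{gib:.2f} GiB per state variable per time-step")  # 1.00 GiB
```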


A Study Of Non-Datapath Cache Replacement Algorithms, Steven G. Lyons Jr. Mar 2021

FIU Electronic Theses and Dissertations

Conventionally, caching algorithms have been designed for the datapath: the levels of memory that must contain the data before it is made available to the CPU. Attaching a fast device (such as an SSD) as a cache to the host that runs the application workload is a more recent development. These host-side caches open up the possibility of what are referred to as non-datapath caches. Non-datapath caches are so named because they do not sit on the traditional datapath; instead, they are optional locations for data. Because these caches are optional, a new capability is available to …
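
A minimal sketch (not taken from the thesis) of what makes a non-datapath cache "optional": its admission policy may simply decline to store an item, and correctness is unaffected because a miss is always served from the backing store. The class and admission rule below are illustrative assumptions:

```python
from collections import OrderedDict

class OptionalAdmissionCache:
    """Toy LRU cache that may decline to admit items, as a non-datapath
    (host-side) cache can: skipping admission never affects correctness,
    only the hit rate. The admission policy here is a placeholder."""

    def __init__(self, capacity, admit=lambda key: True):
        self.capacity = capacity
        self.admit = admit          # pluggable admission decision
        self.data = OrderedDict()   # key -> value, kept in LRU order

    def get(self, key):
        if key in self.data:
            self.data.move_to_end(key)   # mark as most recently used
            return self.data[key]
        return None                      # miss: caller fetches from backing store

    def put(self, key, value):
        if key not in self.data and not self.admit(key):
            return                       # cache chooses not to store this item
        self.data[key] = value
        self.data.move_to_end(key)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used

# Example: admit only even-numbered blocks, simulating selective admission.
cache = OptionalAdmissionCache(capacity=2, admit=lambda k: k % 2 == 0)
cache.put(1, "a"); cache.put(2, "b")
print(cache.get(1), cache.get(2))  # None b
```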


On Optimizations Of Virtual Machine Live Storage Migration For The Cloud, Yaodong Yang Jul 2016

Department of Computer Science and Engineering: Dissertations, Theses, and Student Research

Virtual machine (VM) live storage migration is widely performed in cloud data centers for the purposes of load balancing, reliability, availability, hardware maintenance, and system upgrades. It entails moving all the state information of the VM being migrated, including memory state, network state, and storage state, from one physical server to another within the same data center or across different data centers. To minimize its performance impact, this migration process is required to be transparent to applications running within the migrating VM, meaning that applications will keep running inside the VM as if there were …
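
One common way such transparency is achieved, sketched below under the assumption of an iterative pre-copy scheme (the thesis may optimize a different mechanism), is to keep re-copying blocks that the running VM dirties until the remainder is small enough to move during a brief pause. The helper callbacks are hypothetical stand-ins for hypervisor interfaces:

```python
# Highly simplified sketch of iterative pre-copy of storage state while the
# VM keeps running; block contents and dirty tracking are stand-ins.

def live_copy(blocks, read_block, send_block, get_dirty, stop_threshold=8):
    """Copy all blocks, then re-copy blocks dirtied by the running VM until
    the remaining dirty set is small enough for a short final pause."""
    dirty = set(blocks)                    # first pass: everything is "dirty"
    while len(dirty) > stop_threshold:
        for b in dirty:
            send_block(b, read_block(b))   # transfer current contents
        dirty = get_dirty()                # blocks written during the pass
    return dirty                           # copied after briefly pausing the VM
```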


Mitigating Interference During Virtual Machine Live Migration Through Storage Offloading, Morgan S. Stuart Jan 2016

Theses and Dissertations

Today's cloud landscape has evolved computing infrastructure into a dynamic, high-utilization, service-oriented paradigm. This shift has enabled the commoditization of large-scale storage and distributed computation, allowing engineers to tackle previously untenable problems without large upfront investment. A key enabler of flexibility in the cloud is the ability to transfer running virtual machines across subnets or even data centers using live migration. However, live migration can be a costly process, one that has the potential to interfere with other applications not involved in the migration. This work investigates storage interference through experimentation with real-world systems and well-established benchmarks. In order to …
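
As a simple illustration of how such interference might be quantified (the samples and the metric below are assumptions for this sketch, not the thesis's methodology), one can compare a co-located workload's I/O latency with and without a concurrent migration:

```python
import statistics

def interference(baseline_latencies_ms, during_migration_latencies_ms):
    """Summarize interference as the slowdown of a co-located workload's
    median I/O latency while a live migration runs, relative to running alone."""
    base = statistics.median(baseline_latencies_ms)
    contended = statistics.median(during_migration_latencies_ms)
    return contended / base   # e.g. 1.8 means 80% higher median latency

# Example with made-up samples: the workload's reads slow down under migration.
print(f"slowdown: {interference([2.1, 2.3, 2.0], [3.9, 4.4, 4.1]):.2f}x")
```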