- Discipline
- Computer and Systems Architecture (2)
- Electrical and Computer Engineering (2)
- Bioinformatics (1)
- Biomedical Engineering and Bioengineering (1)
- Computational Engineering (1)
- Computer Sciences (1)
- Data Storage Systems (1)
- Hardware Systems (1)
- Life Sciences (1)
- Other Computer Engineering (1)
- Other Electrical and Computer Engineering (1)
- Physical Sciences and Mathematics (1)
- Physics (1)
- Programming Languages and Compilers (1)
- Quantum Physics (1)
- Systems and Communications (1)
- Systems and Integrative Engineering (1)
Articles 1 - 3 of 3
Full-Text Articles in Computer Engineering
Compilation Optimizations To Enhance Resilience Of Big Data Programs And Quantum Processors, Travis D. Lecompte
LSU Doctoral Dissertations
Modern computers can experience a variety of transient errors caused by the surrounding environment, known as soft faults. Although these faults are rare enough to go unnoticed on personal computers, they become a considerable concern in large-scale distributed computations or in systems deployed in more vulnerable environments, such as satellites. A soft fault occurs as a bit flip of some value in a register, operation, or memory during execution. It surfaces as a program crash, a hang, or silent data corruption (SDC), each of which can waste time, money, and resources. Hardware methods, such as shielding or error-correcting memory (ECM), …
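The bit-flip mechanism the abstract describes can be illustrated with a minimal Python sketch (not taken from the dissertation): flipping a single bit in a 64-bit float silently changes the value without any crash, which is exactly the silent-data-corruption failure mode.

```python
import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit of a 64-bit float, emulating a transient soft fault."""
    # Reinterpret the float's bytes as an unsigned 64-bit integer,
    # XOR the chosen bit, then reinterpret the bytes as a float again.
    (bits,) = struct.unpack("<Q", struct.pack("<d", value))
    (corrupted,) = struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))
    return corrupted

# A single flipped exponent bit silently corrupts the result: the program
# neither crashes nor hangs -- it just returns a wrong answer (SDC).
clean = float(sum(range(100)))     # 4950.0
faulty = flip_bit(clean, 52)       # flip the lowest exponent bit -> 2475.0
```

Flipping bit 52 (the least-significant exponent bit of an IEEE 754 double) halves or doubles the value, so a checksum or result comparison is the only way such a fault would ever be detected.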
Large-Scale Data Analysis And Deep Learning Using Distributed Cyberinfrastructures And High Performance Computing, Richard Dodge Platania
LSU Doctoral Dissertations
Data in many research fields continues to grow in both size and complexity. For instance, recent technological advances have increased data throughput in various biology-related endeavors, such as DNA sequencing, molecular simulations, and medical imaging. In addition, the variety of data types (textual, signal, image, etc.) adds further complexity to analyzing the data. As such, there is a need for purpose-built applications that cater to the type of data at hand. Several considerations must be made when attempting to create a tool for a particular dataset. First, we must consider the type of algorithm required …
A Study Of Scalability And Cost-Effectiveness Of Large-Scale Scientific Applications Over Heterogeneous Computing Environment, Arghya K. Das
LSU Doctoral Dissertations
Recent advances in large-scale experimental facilities have ushered in an era of data-driven science. These large-scale data offer the opportunity to answer many fundamental questions in basic science, but they also pose new challenges to the scientific community in terms of optimal processing and transfer. Consequently, scientists are in dire need of robust high-performance computing (HPC) solutions that can scale to terabytes of data.
In this thesis, I address the challenges in three major aspects of scientific big data processing as follows: 1) Developing scalable software and algorithms for data- and compute-intensive scientific applications. 2) Proposing new cluster architectures …