Computer Engineering Commons

Articles 1 - 10 of 10

Full-Text Articles in Computer Engineering

Compilation Optimizations To Enhance Resilience Of Big Data Programs And Quantum Processors, Travis D. Lecompte Nov 2022

LSU Doctoral Dissertations

Modern computers can experience a variety of transient errors caused by the surrounding environment, known as soft faults. Although these faults occur too infrequently to be noticeable on personal computers, they become a considerable concern in large-scale distributed computations or in systems deployed in more vulnerable environments such as satellites. A soft fault occurs as a bit flip of some value in a register, operation, or memory during execution, and it surfaces as a program crash, a hang, or silent data corruption (SDC), each of which can waste time, money, and resources. Hardware methods, such as shielding or error correcting memory (ECM), …
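
The dissertation itself is not quoted below; purely as a hedged illustration of the soft-fault model this abstract describes, the following Python sketch flips one randomly chosen bit of an in-memory integer, the way a transient fault might corrupt a register or a stored value. The function name and the 64-bit width are assumptions made for the example.

    import random

    def flip_random_bit(value: int, width: int = 64) -> int:
        """Simulate a soft fault by flipping one randomly chosen bit of a value."""
        bit = random.randrange(width)  # position of the corrupted bit
        return value ^ (1 << bit)

    # Depending on where such a flip lands, the program may crash, hang,
    # or silently continue with a wrong result (silent data corruption).
    original = 1_000_000
    corrupted = flip_random_bit(original)
    print(f"original={original} corrupted={corrupted}")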


Lightweight Distributed Computing Framework For Orchestrating High Performance Computing And Big Data, Muhammed Numan İnce, Melih Günay, Joseph Ledet May 2022

Turkish Journal of Electrical Engineering and Computer Sciences

In recent years, the need to work remotely, and consequently the need for available remote computer-based systems, has increased substantially, a trend that accelerated dramatically with the onset of the 2020 pandemic. Data produced locally is often stored and processed in the cloud to cope with this flood of computation and storage needs. Historically, HPC (high performance computing) and big data technologies have been utilized for the storage and processing of large datasets. However, both HPC and Hadoop can be utilized as solutions for analytical work, though the differences between these may …


Optimization Of Real-Time Wireless Sensor Based Big Data With Deep Autoencoder Network: A Tourism Sector Application With Distributed Computing, Bekir Aksoy, Utku Kose Jan 2020

Turkish Journal of Electrical Engineering and Computer Sciences

Internet usage has increased rapidly with the development of information and communication technologies. This increase has driven the growth of data volumes on the internet and the emergence of the big data concept, making it even more important to analyze the data and extract meaning from it. In this study, 690 million queries and approximately 5.9 quadrillion data collected daily from different servers were recorded on Redis servers for a company operating in the tourism sector, using a real-time big data analysis method and a load-balancing structure. Here, wireless networks were used as a triggering factor …
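
No code accompanies this abstract; as a rough sketch of the kind of real-time recording it describes, the snippet below appends incoming query records to a Redis list with the redis-py client. The host, port, key name, and record fields are assumptions for illustration, not details from the study.

    import json
    import time

    import redis  # redis-py client

    # Hypothetical connection details; a load balancer would normally sit
    # in front of several such Redis servers.
    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    def record_query(query: dict) -> None:
        """Append one incoming query record to a Redis list for later analysis."""
        query["ts"] = time.time()
        r.rpush("tourism:queries", json.dumps(query))

    record_query({"endpoint": "/search", "destination": "Antalya"})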


Data Analysis Through Social Media According To The Classified Crime, Serkan Savaş, Nurettin Topaloğlu Jan 2019

Turkish Journal of Electrical Engineering and Computer Sciences

The amount and variety of data generated through social media sites have increased along with their widespread use, and the rate of data production has risen in the same way. Because these data include personal information, it is important to process them and extract meaningful information from them. This process can be called intelligence, and the resulting information may serve commercial, academic, or security purposes. In this study, an example intelligence application is developed for Twitter. Crimes in Turkey are classified according to Turkish Statistical Institute criminal data and keywords are …
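
The study's actual categories and keyword lists are not reproduced here; the Python sketch below only illustrates the general idea of matching tweet text against per-category keyword lists. The categories and keywords are made-up placeholders.

    # Hypothetical crime categories and keywords, for illustration only.
    CRIME_KEYWORDS = {
        "theft": ["stolen", "burglary", "pickpocket"],
        "fraud": ["scam", "phishing", "counterfeit"],
    }

    def classify_tweet(text: str) -> list:
        """Return the crime categories whose keywords appear in the tweet text."""
        lowered = text.lower()
        return [
            category
            for category, keywords in CRIME_KEYWORDS.items()
            if any(keyword in lowered for keyword in keywords)
        ]

    print(classify_tweet("My phone was stolen near the station"))  # ['theft']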


Web Personalization Issues In Big Data And Semantic Web: Challenges And Opportunities, Bujar Raufi, Florije Ismaili, Jaumin Ajdari, Xhemal Zenuni Jan 2019

Turkish Journal of Electrical Engineering and Computer Sciences

Web personalization is a process that uses a set of methods, techniques, and actions to adapt the linking structure of an information space, its content, or both to user interaction preferences. The aim of personalization is to enhance the user experience by retrieving relevant resources and presenting them in a meaningful fashion. The advent of big data has introduced new challenges that place the user modeling and personalization community in a new research setting. In this paper, we introduce the research challenges related to Web personalization analyzed in the context of big data and the Semantic Web. This paper also introduces …


A Study Of Scalability And Cost-Effectiveness Of Large-Scale Scientific Applications Over Heterogeneous Computing Environment, Arghya K. Das Jun 2018

LSU Doctoral Dissertations

Recent advances in large-scale experimental facilities have ushered in an era of data-driven science. These large-scale data increase the opportunity to answer many fundamental questions in basic science, but they also pose new challenges to the scientific community in terms of optimal processing and transfer. Consequently, scientists are in dire need of robust high performance computing (HPC) solutions that can scale with terabytes of data.

In this thesis, I address the challenges in three major aspects of scientific big data processing as follows: 1) Developing scalable software and algorithms for data- and compute-intensive scientific applications. 2) Proposing new cluster architectures …


Hadoop Framework Implementation And Performance Analysis On A Cloud, Göksu Zekiye Özen, Mehmet Tekerek, Rayimbek Sultanov Jan 2017

Turkish Journal of Electrical Engineering and Computer Sciences

The Hadoop framework uses the MapReduce programming paradigm to process big data by distributing the data across a cluster and aggregating the results. MapReduce is one of the methods used to process big data hosted on large clusters; in this method, jobs are divided into small pieces and distributed over the nodes. Parameters such as the method of distribution over nodes, the number of jobs run in parallel, and the number of nodes in the cluster affect the execution time of jobs. The aim of this paper is to determine how the numbers of nodes, maps, and reduces affect the performance of …
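
The paper's Hadoop experiments are not reproduced here; as a generic, self-contained illustration of the map/shuffle/reduce split the abstract describes, the Python sketch below counts words across several input splits in-process. The split contents and function names are assumptions for the example.

    from collections import defaultdict

    def map_phase(chunk: str):
        """Map step: emit (word, 1) pairs for one input split."""
        for word in chunk.split():
            yield word.lower(), 1

    def reduce_phase(key: str, values: list) -> tuple:
        """Reduce step: aggregate all counts emitted for one key."""
        return key, sum(values)

    # Each element stands in for a split that Hadoop would hand to a map task.
    splits = ["big data on large clusters", "data distributed across a cluster"]

    # Shuffle: group intermediate pairs by key, as the framework would do
    # between the map and reduce phases.
    grouped = defaultdict(list)
    for split in splits:
        for key, value in map_phase(split):
            grouped[key].append(value)

    word_counts = dict(reduce_phase(k, v) for k, v in grouped.items())
    print(word_counts)  # {'big': 1, 'data': 2, ...}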


Internet Of Things To Smart IoT Through Semantic, Cognitive, And Perceptual Computing, Amit P. Sheth Jan 2016

Publications

Rapid growth in the Internet of Things (IoT) has resulted in massive growth of the data generated by the devices and sensors connected to the Internet. Physical-cyber-social (PCS) big data consist of this IoT data, complemented by relevant Web-based and social data of various modalities. Smart data is about exploiting this PCS big data to gain deep, actionable insights, making it possible to build intelligent systems and applications. This article discusses key AI research in semantic computing, cognitive computing, and perceptual computing. Their synergistic use is expected to power future progress in building intelligent systems …


An Automated Approach For Digital Forensic Analysis Of Heterogeneous Big Data, Hussam Mohammed, Nathan Clarke, Fudong Li Jan 2016

Journal of Digital Forensics, Security and Law

The major challenges in big data examination and analysis are volume, complex interdependence across content, and heterogeneity. The examination and analysis phases are considered essential to the digital forensics process, yet traditional forensic investigation techniques use one or more forensic tools to examine and analyse each resource individually. In addition, when multiple resources are included in one case, the inability to cross-correlate findings often leads to inefficiencies in processing and identifying evidence. Furthermore, most current forensics tools cannot cope with large volumes of data. This paper develops a novel framework for digital forensic analysis of heterogeneous …


Exploring Hidden Coherent Feature Groups And Temporal Semantics For Multimedia Big Data Analysis, Yimin Yang Aug 2015

FIU Electronic Theses and Dissertations

Thanks to advanced technologies and social networks that allow data to be shared widely across the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data in various areas. In response to this demand, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval with …