Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Electrical and Computer Engineering

Old Dominion University

Machine learning

Articles 1 - 3 of 3

GPU Utilization: Predictive SARIMAX Time Series Analysis, Dorothy Dorie Parry Apr 2023

Modeling, Simulation and Visualization Student Capstone Conference

This work explores collecting performance metrics and leveraging the output for prediction on a memory-intensive parallel image classification algorithm, Inception v3 (or "Inception3"). Experimental results were collected with nvidia-smi on a DGX-1 computational node equipped with eight Tesla V100 Graphics Processing Units (GPUs). Time series analysis was performed on the GPU utilization data collected over multiple runs of the Inception3 image classification algorithm (see Figure 1). The time series model applied was the Seasonal Autoregressive Integrated Moving Average with Exogenous regressors (SARIMAX) model.
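
As a rough illustration of the workflow this abstract describes, the sketch below fits a SARIMAX model to GPU utilization sampled with nvidia-smi. The file name, column handling, sampling interval, train/test split, and model orders are illustrative assumptions, not the settings used in the paper.

# Sketch: fit a SARIMAX model to GPU utilization sampled by nvidia-smi.
# Assumptions (not from the paper): a CSV "gpu_util.csv" with timestamp and
# utilization columns, 1-second sampling, and an illustrative (1,0,1)x(1,0,1,60) order.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Example of collecting the data (run separately, outside this script):
#   nvidia-smi --query-gpu=timestamp,utilization.gpu \
#              --format=csv,noheader -l 1 > gpu_util.csv

df = pd.read_csv("gpu_util.csv", names=["timestamp", "utilization_pct"])
df["timestamp"] = pd.to_datetime(df["timestamp"])
series = df.set_index("timestamp")["utilization_pct"]
# nvidia-smi reports utilization as e.g. "87 %": strip spaces and the unit.
series = series.astype(str).str.strip(" %").astype(float)

# Hold out the tail of the series for evaluating the forecast.
train, test = series.iloc[:-120], series.iloc[-120:]

# Seasonal ARIMA with exogenous-regressor support (no exogenous series used here).
model = SARIMAX(train,
                order=(1, 0, 1),               # (p, d, q)
                seasonal_order=(1, 0, 1, 60),  # (P, D, Q, s); s is an assumed period
                enforce_stationarity=False,
                enforce_invertibility=False)
result = model.fit(disp=False)

forecast = result.forecast(steps=len(test))
print(result.summary().tables[1])
print("Mean absolute error:", np.mean(np.abs(forecast.values - test.values)))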


A Survey Of Using Machine Learning In IoT Security And The Challenges Faced By Researchers, Khawlah M. Harahsheh, Chung-Hao Chen Jan 2023

Electrical & Computer Engineering Faculty Publications

The Internet of Things (IoT) has become more popular in the last 15 years as it has matured significantly and gained ground in multiple fields. We are now surrounded by billions of IoT devices that integrate directly into our lives; some sit at the center of our homes, while others handle sensitive data in domains such as the military, healthcare, and data centers. This popularity drives manufacturers and companies to compete in producing and developing many types of these devices without much regard for how secure they are. At the same time, the IoT is considered an insecure environment attractive to cyber …


Runtime Energy Savings Based On Machine Learning Models For Multicore Applications, Vaibhav Sundriyal, Masha Sosonkina Jun 2022

Electrical & Computer Engineering Faculty Publications

To reduce the power consumption of parallel applications at runtime, modern processors provide frequency-scaling and power-limiting capabilities. In this work, a runtime strategy is proposed to maximize energy savings under a given performance degradation bound. Machine learning techniques were used to develop performance models that accurately predict performance as the operating core and uncore frequencies change. Experiments performed on a 28-core node of a modern computing platform showed energy savings of as much as 26%, with performance degradation as low as 5%, under the proposed strategy compared with execution in the unconstrained-power case.
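
The abstract describes learning a performance model over core-uncore frequency settings and then choosing frequencies that save energy within a performance-degradation bound. The sketch below illustrates that general idea with an off-the-shelf regression model; the synthetic training data, feature set, frequency grid, toy power model, and 5% bound are illustrative assumptions, not the paper's actual models, hardware interfaces, or measurements.

# Sketch: pick core/uncore frequencies that minimize estimated energy while
# keeping predicted slowdown within a 5% bound, using a learned performance model.
# The training data and power estimate below are synthetic placeholders; in
# practice they would come from profiling runs at different frequency settings.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: (core GHz, uncore GHz, memory-boundedness 0..1)
# mapped to a measured runtime in seconds.
X = rng.uniform([1.2, 1.2, 0.0], [2.8, 2.4, 1.0], size=(500, 3))
runtime = 10.0 * ((1 - X[:, 2]) * 2.8 / X[:, 0] + X[:, 2] * 2.4 / X[:, 1])
perf_model = RandomForestRegressor(n_estimators=100, random_state=0)
perf_model.fit(X, runtime)

def choose_frequencies(mem_boundedness, max_slowdown=1.05):
    """Return the (core, uncore) pair with the lowest estimated energy whose
    predicted runtime stays within max_slowdown of the fastest setting."""
    core_grid = np.arange(1.2, 2.81, 0.1)
    uncore_grid = np.arange(1.2, 2.41, 0.1)
    settings = np.array([(c, u, mem_boundedness)
                         for c in core_grid for u in uncore_grid])
    t_pred = perf_model.predict(settings)
    baseline = t_pred.min()                     # fastest predicted runtime
    # Toy power model in watts, roughly cubic in frequency -- an assumption.
    power = 40 + 15 * settings[:, 0] ** 3 + 8 * settings[:, 1] ** 3
    energy = power * t_pred
    feasible = t_pred <= max_slowdown * baseline
    best = np.argmin(np.where(feasible, energy, np.inf))
    return settings[best, 0], settings[best, 1], t_pred[best]

core, uncore, t = choose_frequencies(mem_boundedness=0.7)
print(f"core={core:.1f} GHz, uncore={uncore:.1f} GHz, predicted runtime={t:.2f} s")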