Open Access. Powered by Scholars. Published by Universities.®

Computer Engineering Commons

Articles 1 - 4 of 4

Full-Text Articles in Computer Engineering

Multi-Scale Attention Networks For Pavement Defect Detection, Junde Chen, Yuxin Wen, Yaser Ahangari Nanehkaran, Defu Zhang, Adan Zeb Jul 2023

Engineering Faculty Articles and Research

Pavement defects such as cracks, net cracks, and pit slots can cause potential traffic safety problems. Timely detection and identification play a key role in reducing the harm caused by various pavement defects. In particular, recent developments in deep learning-based CNNs have shown competitive performance in image detection and classification. To detect pavement defects automatically and improve detection performance, a multi-scale mobile attention-based network, termed MANet, is proposed. MANet uses an encoder-decoder architecture, where the encoder adopts MobileNet as the backbone network to extract pavement defect features. …
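
The abstract describes an encoder-decoder design built on a MobileNet backbone with attention over the extracted features. The PyTorch sketch below illustrates that general pattern only; the module names, channel sizes, decoder layout, and the squeeze-and-excitation-style attention block are illustrative assumptions, not the authors' exact MANet.

```python
import torch
import torch.nn as nn
from torchvision.models import mobilenet_v2

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention (illustrative, not MANet's exact block)."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # global context per channel
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)                            # reweight channels

class MobileEncoderDecoder(nn.Module):
    """Encoder-decoder with a MobileNetV2 backbone and channel attention (generic sketch)."""
    def __init__(self, num_classes=2):
        super().__init__()
        # MobileNetV2 feature extractor: downsamples by 32, outputs 1280 channels.
        self.encoder = mobilenet_v2().features
        self.attention = ChannelAttention(1280)
        # Simple upsampling decoder back to the input resolution.
        self.decoder = nn.Sequential(
            nn.Conv2d(1280, 256, 3, padding=1), nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(256, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=8, mode="bilinear", align_corners=False),
            nn.Conv2d(64, num_classes, 1),
        )

    def forward(self, x):
        return self.decoder(self.attention(self.encoder(x)))

# Example: a 224x224 road image yields a per-pixel defect map of the same size.
logits = MobileEncoderDecoder()(torch.randn(1, 3, 224, 224))  # -> (1, 2, 224, 224)
```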


Towards QoS-Based Embedded Machine Learning, Tom Springer, Erik Linstead, Peiyi Zhao, Chelsea Parlett-Pelleriti Oct 2022

Engineering Faculty Articles and Research

Due to various breakthroughs and advancements in machine learning and computer architectures, machine learning models are beginning to proliferate through embedded platforms. These models cover a range of applications including computer vision, speech recognition, healthcare efficiency, industrial IoT, robotics and many more. However, there is a critical limitation in implementing ML algorithms efficiently on embedded platforms: the computational and memory expense of many machine learning models can make them unsuitable for resource-constrained environments. Therefore, to efficiently implement these memory-intensive and computationally expensive algorithms in an embedded computing environment, innovative resource management techniques are required at the …


Exploring The Efficacy Of Transfer Learning In Mining Image‑Based Software Artifacts, Natalie Best, Jordan Ott, Erik J. Linstead Aug 2020

Engineering Faculty Articles and Research

Background

Transfer learning allows us to train deep architectures requiring a large number of learned parameters, even if the amount of available data is limited, by leveraging existing models previously trained for another task. In previous attempts to classify image-based software artifacts in the absence of big data, it was noted that standard off-the-shelf deep architectures such as VGG could not be utilized due to their large parameter space and therefore had to be replaced by customized architectures with fewer layers. This proves challenging for empirical software engineers who would like to make use of existing architectures without …
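
The transfer-learning setup described above, reusing a large architecture such as VGG and retraining only a small task-specific head, can be sketched as follows. This is a minimal illustration under common assumptions: the function name and the choice to freeze the convolutional features are ours, and in real use pretrained ImageNet weights would be loaded rather than random initialization.

```python
import torch.nn as nn
from torchvision.models import vgg16

def build_transfer_model(num_classes, freeze_features=True):
    """Reuse a VGG16 feature extractor and retrain only a small classification head."""
    model = vgg16()  # in practice, load ImageNet-pretrained weights here
    if freeze_features:
        for p in model.features.parameters():
            p.requires_grad = False  # keep the convolutional filters fixed
    # Replace the final 1000-way ImageNet classifier with a task-specific head,
    # so only this small layer (plus any unfrozen parts) is learned from limited data.
    model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, num_classes)
    return model

# Example: a binary classifier over image-based software artifacts.
model = build_transfer_model(num_classes=2)
```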


Learning In The Machine: To Share Or Not To Share?, Jordan Ott, Erik Linstead, Nicholas Lahaye, Pierre Baldi Mar 2020

Engineering Faculty Articles and Research

Weight-sharing is one of the pillars behind Convolutional Neural Networks and their successes. However, in physical neural systems such as the brain, weight-sharing is implausible. This discrepancy raises the fundamental question of whether weight-sharing is necessary. If so, to what degree of precision? If not, what are the alternatives? The goal of this study is to investigate these questions, primarily through simulations where the weight-sharing assumption is relaxed. Taking inspiration from neural circuitry, we explore the use of Free Convolutional Networks and neurons with variable connection patterns. Using Free Convolutional Networks, we show that while weight-sharing is a pragmatic optimization …
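
Relaxing weight-sharing, as in the Free Convolutional Networks mentioned in the abstract, amounts to giving every spatial output location its own filter instead of sliding one shared filter across the image. The layer below is a minimal, illustrative PyTorch sketch of such an unshared ("locally connected") layer, written by us for exposition; it is not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocallyConnected2d(nn.Module):
    """Conv-like layer without weight-sharing: each output position owns its own filter."""
    def __init__(self, in_ch, out_ch, in_h, in_w, kernel_size, stride=1):
        super().__init__()
        self.kernel_size, self.stride = kernel_size, stride
        self.out_h = (in_h - kernel_size) // stride + 1
        self.out_w = (in_w - kernel_size) // stride + 1
        n_pos = self.out_h * self.out_w
        # One filter per output location: (out_ch, positions, in_ch * k * k).
        self.weight = nn.Parameter(0.01 * torch.randn(out_ch, n_pos, in_ch * kernel_size ** 2))
        self.bias = nn.Parameter(torch.zeros(out_ch, self.out_h, self.out_w))

    def forward(self, x):
        # Extract sliding patches: (N, in_ch * k * k, positions).
        patches = F.unfold(x, self.kernel_size, stride=self.stride)
        # Per-position dot product with that position's private filter: (N, out_ch, positions).
        out = torch.einsum("ncl,olc->nol", patches, self.weight)
        return out.view(x.size(0), -1, self.out_h, self.out_w) + self.bias

# A shared-weight Conv2d of the same shape stores in_ch*k*k*out_ch weights; the unshared
# version multiplies that by the number of output positions, which is the cost of dropping
# the weight-sharing assumption.
layer = LocallyConnected2d(in_ch=3, out_ch=8, in_h=32, in_w=32, kernel_size=3)
features = layer(torch.randn(4, 3, 32, 32))  # -> (4, 8, 30, 30)
```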