Open Access. Powered by Scholars. Published by Universities.®

Computer Engineering Commons

Chapman University · Physical Sciences and Mathematics · Deep learning

Articles 1 - 4 of 4

Full-Text Articles in Computer Engineering

A Deep Learning-Based Approach To Extraction Of Filler Morphology In SEM Images With The Application Of Automated Quality Inspection, Md. Fashiar Rahman, Tzu-Liang Bill Tseng, Jianguo Wu, Yuxin Wen, Yirong Lin Mar 2022

Engineering Faculty Articles and Research

Automatic extraction of filler morphology (size, orientation, and spatial distribution) in Scanning Electron Microscopic (SEM) images is essential in many applications such as automatic quality inspection in composite manufacturing. Extraction of filler morphology greatly depends on accurate segmentation of fillers (fibers and particles), which is a challenging task due to the overlap of fibers and particles and their obscure presence in SEM images. Convolutional Neural Networks (CNNs) have been shown to be very effective at object recognition in digital images. This paper proposes an automatic filler detection system in SEM images, utilizing a Mask Region-based CNN architecture. The proposed system …
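
As a rough illustration of the kind of pipeline this abstract describes (not the authors' implementation), the sketch below fine-tunes a torchvision Mask R-CNN for filler segmentation and then derives size and orientation statistics from the predicted masks with scikit-image. The class count, score threshold, and function names are illustrative assumptions.

```python
# Hedged sketch: fine-tune a torchvision Mask R-CNN for filler segmentation,
# then derive size/orientation morphology from the predicted masks.
# NUM_CLASSES and the score threshold are assumptions, not the paper's values.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor
from skimage.measure import label, regionprops

NUM_CLASSES = 3  # background, fiber, particle (assumed)

def build_filler_detector():
    # Pretrained backbone (torchvision >= 0.13 weights API), new task-specific heads.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    in_feats = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_feats, NUM_CLASSES)
    in_ch = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_ch, 256, NUM_CLASSES)
    return model

@torch.no_grad()
def filler_morphology(model, sem_image, score_thresh=0.5):
    """Return per-filler area, orientation, and axis lengths for one SEM image tensor."""
    model.eval()
    pred = model([sem_image])[0]
    stats = []
    for mask, score in zip(pred["masks"], pred["scores"]):
        if score < score_thresh:
            continue
        binary = (mask[0] > 0.5).cpu().numpy()
        for region in regionprops(label(binary)):
            stats.append({
                "area": region.area,                      # size in pixels
                "orientation": region.orientation,        # radians
                "major_axis": region.major_axis_length,
                "minor_axis": region.minor_axis_length,
            })
    return stats
```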


On-Device Deep Learning Inference For System-On-Chip (SoC) Architectures, Tom Springer, Elia Eiroa-Lledo, Elizabeth Stevens, Erik Linstead Mar 2021

Engineering Faculty Articles and Research

As machine learning becomes ubiquitous, the need to deploy models on real-time, embedded systems will become increasingly critical. This is especially true for deep learning solutions, whose large models pose interesting challenges for resource-constrained target architectures at the “edge.” The realization of machine learning, and deep learning, is being driven by the availability of specialized hardware, such as system-on-chip solutions, which provide some alleviation of constraints. Equally important, however, are the operating systems that run on this hardware, and specifically the ability to leverage commercial real-time operating systems which, unlike general-purpose operating systems such as Linux, can …
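
The paper is concerned with commercial real-time operating systems rather than any particular toolchain, but as a hedged sketch of the general workflow it motivates (train off-device, shrink the model, run it with a lightweight interpreter on the SoC), one common route is post-training quantization with TensorFlow Lite. The model shape and class count below are placeholders, not the authors' setup.

```python
# Hedged sketch: train a small model off-device, quantize it, and run the
# resulting flatbuffer with a lightweight interpreter, as one would on a
# resource-constrained SoC. Architecture and sizes are illustrative only.
import numpy as np
import tensorflow as tf

def build_tiny_classifier(num_classes=10):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(32, 32, 3)),
        tf.keras.layers.Conv2D(8, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

# Off-device: convert to a quantized TensorFlow Lite flatbuffer.
model = build_tiny_classifier()
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # post-training quantization
tflite_bytes = converter.convert()

# On-device side: run the flatbuffer with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)  # one softmax vector per input
```

The same flatbuffer could instead be executed by a C/C++ runtime under an RTOS; the Python interpreter here simply stands in for whichever on-device runtime the target provides.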


Exploring The Efficacy Of Transfer Learning In Mining Image‑Based Software Artifacts, Natalie Best, Jordan Ott, Erik J. Linstead Aug 2020

Engineering Faculty Articles and Research

Background

Transfer learning allows us to train deep architectures requiring a large number of learned parameters, even when the amount of available data is limited, by leveraging existing models previously trained for another task. In previous attempts to classify image-based software artifacts in the absence of big data, it was noted that standard off-the-shelf deep architectures such as VGG could not be utilized due to their large parameter space and therefore had to be replaced by customized architectures with fewer layers. This proves challenging for empirical software engineers who would like to make use of existing architectures without …
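
As a minimal sketch of the transfer-learning setup the abstract alludes to (assuming a torchvision VGG16, which is not necessarily the exact variant the authors evaluated), one can freeze the pretrained convolutional features and retrain only a small classification head, so most of the large parameter space never has to be learned from the limited artifact dataset. The class count is a placeholder.

```python
# Hedged sketch: transfer learning with an off-the-shelf VGG16 by freezing the
# pretrained convolutional features and replacing only the final classifier layer.
import torch.nn as nn
import torchvision

def build_transfer_model(num_classes=5):
    model = torchvision.models.vgg16(weights="DEFAULT")
    for param in model.features.parameters():
        param.requires_grad = False          # keep pretrained filters fixed
    # Swap the last fully connected layer for a task-specific head.
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)
    return model

model = build_transfer_model()
trainable = [p for p in model.parameters() if p.requires_grad]
print(sum(p.numel() for p in trainable), "trainable parameters")
```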


Learning In The Machine: To Share Or Not To Share?, Jordan Ott, Erik Linstead, Nicholas Lahaye, Pierre Baldi Mar 2020

Engineering Faculty Articles and Research

Weight-sharing is one of the pillars behind Convolutional Neural Networks and their successes. However, in physical neural systems such as the brain, weight-sharing is implausible. This discrepancy raises the fundamental question of whether weight-sharing is necessary. If so, to what degree of precision? If not, what are the alternatives? The goal of this study is to investigate these questions, primarily through simulations where the weight-sharing assumption is relaxed. Taking inspiration from neural circuitry, we explore the use of Free Convolutional Networks and neurons with variable connection patterns. Using Free Convolutional Networks, we show that while weight-sharing is a pragmatic optimization …
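
Free Convolutional Networks as studied here relax the weight-sharing constraint; as an illustrative sketch (not the authors' code), a locally connected layer keeps a convolution's local receptive fields but gives every spatial position its own weights. The layer sizes below are arbitrary.

```python
# Hedged sketch: a "free" (locally connected) layer relaxes weight-sharing by
# giving each output position its own filter, unlike a standard convolution.
import torch
import torch.nn as nn

class LocallyConnected2d(nn.Module):
    """Convolution-like layer with untied (position-specific) weights."""
    def __init__(self, in_ch, out_ch, in_size, kernel_size):
        super().__init__()
        out_size = in_size - kernel_size + 1          # valid padding, stride 1
        self.unfold = nn.Unfold(kernel_size)
        # One weight matrix per output location: (positions, out_ch, in_ch*k*k)
        self.weight = nn.Parameter(
            torch.randn(out_size * out_size, out_ch, in_ch * kernel_size ** 2) * 0.01
        )
        self.out_size = out_size

    def forward(self, x):
        patches = self.unfold(x)                      # (N, in_ch*k*k, positions)
        patches = patches.permute(2, 0, 1)            # (positions, N, in_ch*k*k)
        out = torch.einsum("pni,poi->pno", patches, self.weight)
        n = x.shape[0]
        return out.permute(1, 2, 0).reshape(n, -1, self.out_size, self.out_size)

x = torch.randn(2, 3, 8, 8)
layer = LocallyConnected2d(in_ch=3, out_ch=4, in_size=8, kernel_size=3)
print(layer(x).shape)  # torch.Size([2, 4, 6, 6])
```

Compared with a shared-weight nn.Conv2d of the same receptive field, this layer has out_size² times as many parameters, which is exactly the trade-off the weight-sharing question concerns.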