Open Access. Powered by Scholars. Published by Universities.®

Software Engineering Commons

Articles 1 - 3 of 3

Full-Text Articles in Software Engineering

An Empirical Study Of Pre-Trained Model Reuse In The Hugging Face Deep Learning Model Registry, Wenxin Jiang, Nicholas Synovic, Matt Hyatt, Taylor R. Schorlemmer, Rohan Sethi, Yung-Hsiang Lu, George K. Thiruvathukal, James C. Davis Jan 2023

Department of Electrical and Computer Engineering Faculty Publications

Deep Neural Networks (DNNs) are being adopted as components in software systems. Creating and specializing DNNs from scratch has grown increasingly difficult as state-of-the-art architectures grow more complex. Following the path of traditional software engineering, machine learning engineers have begun to reuse large-scale pre-trained models (PTMs) and fine-tune these models for downstream tasks. Prior works have studied reuse practices for traditional software packages to guide software engineers towards better package maintenance and dependency management. We lack a similar foundation of knowledge to guide behaviors in pre-trained model ecosystems.

In this work, we present the first empirical investigation of PTM reuse. …
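
The reuse workflow this study examines can be illustrated with a short, hedged sketch: load a pre-trained model (PTM) from the Hugging Face registry and fine-tune it for a downstream task. The sketch below uses the `transformers` library; the checkpoint name (`bert-base-uncased`), the two-label task, and the toy batch are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of PTM reuse: download a pre-trained model from the
# Hugging Face registry and adapt it to a downstream classification task.
# Checkpoint, label count, and batch contents are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "bert-base-uncased"  # hypothetical choice of PTM
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Freeze the pre-trained encoder so only the classification head is trained.
for param in model.base_model.parameters():
    param.requires_grad = False

optimizer = torch.optim.AdamW(
    [p for p in model.parameters() if p.requires_grad], lr=5e-5
)

# One illustrative training step on a toy batch.
batch = tokenizer(
    ["build failed on CI", "all tests passed"],
    padding=True, truncation=True, return_tensors="pt",
)
labels = torch.tensor([1, 0])

model.train()
outputs = model(**batch, labels=labels)  # loss is computed when labels are given
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Freezing the encoder and training only the head is one common form of the fine-tuning for downstream tasks described above; full fine-tuning simply skips the freezing step.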


On-Device Deep Learning Inference For System-On-Chip (SoC) Architectures, Tom Springer, Elia Eiroa-Lledo, Elizabeth Stevens, Erik Linstead Mar 2021

Engineering Faculty Articles and Research

As machine learning becomes ubiquitous, the need to deploy models on real-time, embedded systems will become increasingly critical. This is especially true for deep learning solutions, whose large models pose interesting challenges for resource-constrained target architectures at the “edge”. The realization of machine learning, and deep learning in particular, is being driven by the availability of specialized hardware such as system-on-chip solutions, which alleviate some of these constraints. Equally important, however, are the operating systems that run on this hardware, and specifically the ability to leverage commercial real-time operating systems which, unlike general purpose operating systems such as Linux, can …
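
To make the deployment scenario concrete, the sketch below shows the general shape of on-device inference with a pre-converted TensorFlow Lite model. It is only an illustration of the pattern (the paper targets commercial real-time operating systems on SoC hardware, which this Python snippet does not reproduce); the `model.tflite` path and the random input are placeholders.

```python
# A minimal sketch of on-device inference with a pre-converted TensorFlow Lite
# model, the kind of workflow described for resource-constrained edge targets.
# The model path and the random input are illustrative placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")  # hypothetical file
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Fabricate an input matching the model's expected shape and dtype.
input_shape = input_details[0]["shape"]
dummy_input = np.random.random_sample(input_shape).astype(input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()  # run inference entirely on-device

prediction = interpreter.get_tensor(output_details[0]["index"])
print("model output shape:", prediction.shape)
```

The same interpreter API is also commonly used through the lighter `tflite-runtime` package on targets where the full TensorFlow dependency is too heavy.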


Exploring The Efficacy Of Transfer Learning In Mining Image-Based Software Artifacts, Natalie Best, Jordan Ott, Erik J. Linstead Aug 2020

Engineering Faculty Articles and Research

Background

Transfer learning allows us to train deep architectures requiring a large number of learned parameters, even if the amount of available data is limited, by leveraging existing models previously trained for another task. In previous attempts to classify image-based software artifacts in the absence of big data, it was noted that standard off-the-shelf deep architectures such as VGG could not be utilized due to their large parameter space and therefore had to be replaced by customized architectures with fewer layers. This proves challenging for empirical software engineers who would like to make use of existing architectures without …
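
The transfer-learning setup the abstract describes can be sketched as follows: reuse a VGG16 network pre-trained on ImageNet as a frozen feature extractor and train only a small classifier head on the limited artifact dataset. This is a hedged illustration using Keras, not the authors' code; the input size, head architecture, and class count are assumptions.

```python
# A minimal sketch of transfer learning with an off-the-shelf architecture:
# reuse VGG16 weights learned on ImageNet as a frozen feature extractor and
# train only a small classifier head on a limited labeled dataset.
# Input size, head layers, and the 5-class output are illustrative assumptions.
import tensorflow as tf

base = tf.keras.applications.VGG16(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3)
)
base.trainable = False  # keep the pre-trained convolutional weights fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),  # e.g., 5 artifact classes
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
# model.fit(train_images, train_labels, epochs=5)  # with a small labeled set
```

Freezing the convolutional base keeps the number of trainable parameters small, which addresses the same limited-data constraint that previously forced a switch to customized architectures with fewer layers.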