
Computer Engineering Commons


Articles 1 - 6 of 6

Full-Text Articles in Computer Engineering

Survey On Deep Neural Networks In Speech And Vision Systems, M. Alam, Manar D. Samad, Lasitha Vidyaratne, Alexander Glandon, Khan M. Iftekharuddin Dec 2020

Computer Science Faculty Research

This survey presents a review of state-of-the-art deep neural network architectures, algorithms, and systems in speech and vision applications. Recent advances in deep artificial neural network algorithms and architectures have spurred rapid innovation and development of intelligent speech and vision systems. With the availability of vast amounts of sensor data and cloud computing for processing and training deep neural networks, and with increased sophistication in mobile and embedded technology, next-generation intelligent systems are poised to revolutionize personal and commercial computing. This survey begins by providing the background and evolution of some of the most successful deep learning models for intelligent …


Exploring The Efficacy Of Transfer Learning In Mining Image‑Based Software Artifacts, Natalie Best, Jordan Ott, Erik J. Linstead Aug 2020

Engineering Faculty Articles and Research

Background

Transfer learning allows us to train deep architectures requiring a large number of learned parameters, even if the amount of available data is limited, by leveraging existing models previously trained for another task. In previous attempts to classify image-based software artifacts in the absence of big data, it was noted that standard off-the-shelf deep architectures such as VGG could not be utilized due to their large parameter space and therefore had to be replaced by customized architectures with fewer layers. This proves challenging for empirical software engineers who would like to make use of existing architectures without …
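As a rough illustration of the transfer-learning setup described above, the sketch below freezes a pretrained VGG16 backbone and trains only a small new classification head. This is a minimal PyTorch example, not the study's actual pipeline; the two-class task, input size, and learning rate are assumed for illustration.

```python
# A minimal transfer-learning sketch, assuming a small two-class image dataset
# of software artifacts; the class count and learning rate are illustrative,
# not taken from the paper.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 2  # assumed binary artifact classification task

# Reuse convolutional features learned on ImageNet instead of fitting
# VGG16's very large parameter space to the limited artifact data.
vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
for p in vgg.features.parameters():
    p.requires_grad = False  # freeze the pretrained convolutional backbone

# Swap the 1000-way ImageNet classifier for a small task-specific head;
# only the unfrozen layers are updated during fine-tuning.
vgg.classifier[6] = nn.Linear(4096, NUM_CLASSES)

optimizer = torch.optim.Adam(
    (p for p in vgg.parameters() if p.requires_grad), lr=1e-4)
criterion = nn.CrossEntropyLoss()
```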


Learning In The Machine: To Share Or Not To Share?, Jordan Ott, Erik Linstead, Nicholas Lahaye, Pierre Baldi Mar 2020

Engineering Faculty Articles and Research

Weight-sharing is one of the pillars behind Convolutional Neural Networks and their successes. However, in physical neural systems such as the brain, weight-sharing is implausible. This discrepancy raises the fundamental question of whether weight-sharing is necessary. If so, to what degree of precision? If not, what are the alternatives? The goal of this study is to investigate these questions, primarily through simulations where the weight-sharing assumption is relaxed. Taking inspiration from neural circuitry, we explore the use of Free Convolutional Networks and neurons with variable connection patterns. Using Free Convolutional Networks, we show that while weight-sharing is a pragmatic optimization …
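To make the weight-sharing question concrete, the sketch below contrasts a standard convolution (one filter bank reused at every spatial position) with an unshared, locally connected layer that learns an independent filter bank per position, in the spirit of the Free Convolutional Networks mentioned above. It is a minimal PyTorch illustration with assumed layer sizes, not the architecture used in the paper.

```python
# Weight-shared convolution vs. an unshared ("free") locally connected layer.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocallyConnected2d(nn.Module):
    """Like a 3x3 convolution, but filters are NOT shared across positions."""
    def __init__(self, in_ch, out_ch, in_size, kernel=3):
        super().__init__()
        self.kernel = kernel
        self.out_size = in_size - kernel + 1          # stride 1, no padding
        # One independent filter bank per output location.
        self.weight = nn.Parameter(
            torch.randn(out_ch, self.out_size ** 2, in_ch * kernel * kernel) * 0.01)

    def forward(self, x):
        patches = F.unfold(x, self.kernel)             # (N, in_ch*k*k, locations)
        out = torch.einsum("ncl,olc->nol", patches, self.weight)
        return out.view(x.size(0), -1, self.out_size, self.out_size)

x = torch.randn(1, 3, 8, 8)
shared = nn.Conv2d(3, 16, kernel_size=3)      # one filter bank reused everywhere
free = LocallyConnected2d(3, 16, in_size=8)   # a different filter bank at each position
print(shared(x).shape, free(x).shape)         # both: torch.Size([1, 16, 6, 6])
print(sum(p.numel() for p in shared.parameters()),   # 448 shared parameters
      sum(p.numel() for p in free.parameters()))     # 15552 unshared parameters
```

The two layers compute outputs of identical shape, but relaxing weight-sharing multiplies the parameter count by roughly the number of spatial positions, which is the trade-off the study examines.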


Coverage Guided Differential Adversarial Testing Of Deep Learning Systems, Jianmin Guo, Houbing Song, Yue Zhao, Yu Jiang Jan 2020

Publications

Deep learning is increasingly applied to safety-critical application domains such as autonomous cars and medical devices, so it is of significant importance to ensure the reliability and robustness of such systems. In this paper, we propose DLFuzz, a coverage-guided differential adversarial testing framework that guides deep learning systems toward exposing incorrect behaviors. DLFuzz keeps minutely mutating the input to maximize both the neuron coverage and the prediction difference between the original input and the mutated input, without manual labeling effort or cross-referencing oracles from other systems with the same functionality. We also design multiple novel strategies for neuron selection to improve the neuron coverage. The …
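The core idea can be pictured as gradient-based mutation against a joint objective. The sketch below is a loose, simplified rendering of that idea in PyTorch, not DLFuzz itself: the `layers` argument stands in for the paper's neuron-selection strategies, and the mean activation of those layers is used as a crude coverage proxy.

```python
# Coverage-guided mutation sketch: raise a coverage proxy while pushing the
# mutated input's prediction away from the original prediction.
import torch

def mutate_for_coverage(model, layers, x, steps=20, lr=0.01, lam=1.0):
    """Assumes a batch of one input x; `layers` are intermediate modules
    whose activations serve as a stand-in for neuron coverage."""
    acts = []
    hooks = [m.register_forward_hook(lambda _m, _i, out: acts.append(out))
             for m in layers]
    for p in model.parameters():
        p.requires_grad_(False)
    model.eval()
    with torch.no_grad():
        orig_label = model(x).argmax(dim=1).item()   # prediction on the seed input
    x_adv = x.clone().requires_grad_(True)
    for _ in range(steps):
        acts.clear()
        logits = model(x_adv)
        coverage = sum(a.mean() for a in acts)        # crude neuron-coverage proxy
        divergence = -logits[0, orig_label]           # move away from the original label
        (coverage + lam * divergence).backward()
        with torch.no_grad():
            x_adv += lr * x_adv.grad.sign()           # small, FGSM-like mutation step
            x_adv.grad.zero_()
            if model(x_adv).argmax(dim=1).item() != orig_label:
                break                                 # behavioral difference exposed
    for h in hooks:
        h.remove()
    return x_adv.detach()
```

A caller would pass a handful of intermediate modules (for example, selected activation layers) as `layers` and seed inputs drawn from the test set; the real framework's neuron-selection and mutation strategies are more elaborate than this single gradient loop.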


Transformer Neural Networks For Automated Story Generation, Kemal Araz Jan 2020

Dissertations

Over the last two decades, Artificial Intelligence (AI) has proven its usefulness on tasks such as image recognition, natural language processing, and automated driving. As described by Moore's law, computational power has increased rapidly over the past few decades (Moore, 1965), making it possible to use techniques that were once computationally expensive. These techniques include Deep Learning (DL), which changed the field of AI and outperformed other models in many fields, some of which are mentioned above. However, natural language generation, especially for creative tasks, requires artificially intelligent models to have not only a precise understanding of the given …


Deepmag+ : Sniffing Mobile Apps In Magnetic Field Through Deep Learning, Rui Ning, Cong Wang, Chunsheng Xin, Jiang Li, Hongyi Wu Jan 2020

Electrical & Computer Engineering Faculty Publications

This paper reports a new side-channel attack on smartphones using unrestricted magnetic sensor data. We demonstrate that attackers can effectively infer the Apps being used on a smartphone with an accuracy of over 80% by training a deep Convolutional Neural Network (CNN). Various signal processing strategies have been studied for feature extraction, including a tempogram-based scheme. Moreover, by further exploiting the unrestricted motion sensor to cluster magnetometer data, the sniffing accuracy can increase to as high as 98%. To mitigate such attacks, we propose a noise injection scheme that can effectively reduce the App sniffing accuracy to only …
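To give a rough sense of the classification step, the sketch below maps a fixed-length, 3-axis magnetometer window to per-app scores with a small 1D CNN. It is an illustrative PyTorch example only; the window length, app count, and layer sizes are assumed and do not reproduce the paper's network, tempogram features, or motion-sensor clustering.

```python
# Illustrative 1D CNN over a 3-axis magnetometer trace -> per-app scores.
import torch
import torch.nn as nn

NUM_APPS = 10   # assumed number of candidate apps
WINDOW = 512    # assumed samples per magnetometer window (x, y, z axes)

classifier = nn.Sequential(
    nn.Conv1d(3, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
    nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
    nn.Conv1d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(128, NUM_APPS),
)

# A batch of recorded magnetometer windows -> scores over candidate apps.
traces = torch.randn(8, 3, WINDOW)
print(classifier(traces).shape)   # torch.Size([8, 10])
```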