
Computer Engineering Commons


Articles 1 - 2 of 2

Full-Text Articles in Computer Engineering

Toward Generating Efficient Deep Neural Networks, Chengcheng Li May 2023

Doctoral Dissertations

Recent advances in deep neural networks have led to tremendous applications in various tasks, such as object classification and detection, image synthesis, natural language processing, game playing, and biological imaging. However, deploying these pre-trained networks on resource-limited devices poses a challenge, as most state-of-the-art networks contain millions of parameters, making them cumbersome and slow in real-world applications. To address this problem, numerous network compression and acceleration approaches, also known as efficient deep neural networks or efficient deep learning, have been investigated, in terms of hardware and software (algorithms), training, and inference. The aim of this dissertation is to study …
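For readers unfamiliar with network compression, the sketch below illustrates one generic technique from that literature, magnitude-based weight pruning, applied to a single dense layer's weight matrix. It is a minimal, assumed example for illustration only and is not the specific method developed in the dissertation.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries until `sparsity` fraction are zero.

    Illustrative sketch of one common compression strategy; not the
    dissertation's method.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]   # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0      # drop low-magnitude weights
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256))                    # stand-in for a dense layer
w_sparse = magnitude_prune(w, sparsity=0.9)
print(f"nonzero weights remaining: {np.count_nonzero(w_sparse) / w.size:.1%}")
```

In practice such pruning is usually followed by fine-tuning to recover accuracy, and the resulting sparse weights only yield speedups on hardware or libraries that exploit sparsity.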


Neuron Clustering For Mitigating Catastrophic Forgetting In Supervised And Reinforcement Learning, Benjamin Frederick Goodrich Dec 2015

Doctoral Dissertations

Neural networks have had many great successes in recent years, particularly with the advent of deep learning and many novel training techniques. One issue that has affected neural networks and prevented them from performing well in more realistic online environments is that of catastrophic forgetting. Catastrophic forgetting affects supervised learning systems when input samples are temporally correlated or are non-stationary. However, most real-world problems are non-stationary in nature, resulting in prolonged periods of time separating inputs drawn from different regions of the input space.

Reinforcement learning represents a worst-case scenario when it comes to precipitating catastrophic forgetting in neural networks. …
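To make the phenomenon concrete, the sketch below trains a toy linear classifier (standing in for a neural network) sequentially on two synthetic tasks drawn from different regions of the input space. Accuracy on the first task collapses after training on the second, which is the basic pattern of catastrophic forgetting under non-stationary input. The data, names, and setup here are assumed for illustration and are not taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(center):
    """Two-class Gaussian blobs around +center and -center (toy data)."""
    x_pos = rng.normal(loc=center, scale=0.5, size=(200, 2))
    x_neg = rng.normal(loc=-center, scale=0.5, size=(200, 2))
    X = np.vstack([x_pos, x_neg])
    y = np.hstack([np.ones(200), np.zeros(200)])
    return X, y

def train(w, b, X, y, lr=0.1, epochs=200):
    """Plain logistic regression via gradient descent (stands in for a network)."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy(w, b, X, y):
    return np.mean(((X @ w + b) > 0) == y)

# Two "tasks" whose inputs occupy different regions of the input space.
Xa, ya = make_task(np.array([2.0, 2.0]))
Xb, yb = make_task(np.array([2.0, -2.0]))

w, b = np.zeros(2), 0.0
w, b = train(w, b, Xa, ya)
print("task A accuracy after training on A:", accuracy(w, b, Xa, ya))

w, b = train(w, b, Xb, yb)   # sequential training, no rehearsal of task A
print("task A accuracy after training on B:", accuracy(w, b, Xa, ya))
```

Running this shows near-perfect accuracy on task A after the first phase and roughly chance-level accuracy on task A after training on task B, since the shared parameters are overwritten to fit the new input distribution.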