Open Access. Powered by Scholars. Published by Universities.®

Engineering Commons

Articles 1 - 5 of 5

Full-Text Articles in Engineering

Architectures And Algorithms For Intrinsic Computation With Memristive Devices, Jens Bürger Aug 2016

Dissertations and Theses

Neuromorphic engineering is the research field dedicated to the study and design of brain-inspired hardware and software tools. Recent advances in emerging nanoelectronics promote the implementation of synaptic connections based on memristive devices. Their non-volatile, modifiable conductance has been shown to exhibit the synaptic properties used to connect and train neural layers. With their nanoscale size and non-volatile memory, memristive devices promise a next step toward more area- and energy-efficient neuromorphic hardware.

My research deals with the challenges of harnessing memristive device properties that go beyond the behaviors utilized for synaptic weight storage. Based on devices that exhibit …
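
Although the abstract is truncated here, the basic mechanism it builds on, a memristive device acting as a synaptic weight through its bounded, non-volatile conductance, can be illustrated with a minimal sketch. The device model, class name, and all parameter values below are simplified placeholders for illustration, not the device models studied in the dissertation.

```python
# Minimal sketch (assumption, not the dissertation's model) of a memristive
# synapse: a bounded, non-volatile conductance nudged by programming pulses
# and read out as a synaptic weight.
import numpy as np

class MemristiveSynapse:
    def __init__(self, g_min=1e-6, g_max=1e-4, g0=5e-5, dg=2e-6):
        self.g_min, self.g_max, self.dg = g_min, g_max, dg
        self.g = g0                      # conductance = stored synaptic weight

    def program(self, pulses):
        """Apply signed programming pulses: positive potentiates, negative depresses."""
        self.g = float(np.clip(self.g + pulses * self.dg, self.g_min, self.g_max))

    def read(self, v):
        """Read the current for a small read voltage (non-destructive)."""
        return self.g * v

syn = MemristiveSynapse()
syn.program(+10)                         # potentiate the synapse
print("current at 0.1 V after potentiation:", syn.read(0.1))
syn.program(-25)                         # depress the synapse
print("current at 0.1 V after depression:", syn.read(0.1))
```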


Memory And Information Processing In Recurrent Neural Networks, Alireza Goudarzi, Sarah Marzen, Peter Banda, Guy Feldman, Matthew R. Lakin, Christof Teuscher, Darko Stefanovic Apr 2016

Electrical and Computer Engineering Faculty Publications and Presentations

Recurrent neural networks (RNNs) are simple dynamical systems whose computational power has been attributed to their short-term memory. The short-term memory of RNNs has previously been studied analytically only for orthogonal networks, under the annealed approximation, and with uncorrelated input. Here, for the first time, we present an exact solution for the memory capacity and the task-solving performance as a function of the structure of a given network instance, enabling direct determination of the function-structure relation in RNNs. We calculate the memory capacity for arbitrary networks with exponentially correlated input and further relate it to the performance of …
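
The abstract is truncated here. As a companion to the notion of short-term memory capacity, the sketch below numerically estimates the memory capacity of a small linear recurrent network driven by i.i.d. (uncorrelated) input, by training linear readouts to reconstruct delayed inputs and summing the squared correlations. This is a rough numerical estimate, not the exact analytical solution the paper derives; all sizes and parameters are illustrative assumptions.

```python
# Sketch: numerical estimate of short-term memory capacity of a linear RNN.
import numpy as np

rng = np.random.default_rng(0)
N, T, washout, max_delay = 50, 5000, 200, 100

# Random recurrent weights rescaled to spectral radius 0.9 (stable linear dynamics)
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.standard_normal(N)

u = rng.standard_normal(T)                # i.i.d. input (uncorrelated case)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = W @ x + w_in * u[t]               # linear reservoir update
    states[t] = x

X = states[washout:]
mc = 0.0
for k in range(1, max_delay + 1):
    target = u[washout - k:T - k]                        # input delayed by k steps
    w_out, *_ = np.linalg.lstsq(X, target, rcond=None)   # linear readout for delay k
    r = np.corrcoef(X @ w_out, target)[0, 1]
    mc += r ** 2                                         # memory function m(k)
print("estimated memory capacity:", mc)
```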


Information Representation And Computation Of Spike Trains In Reservoir Computing Systems With Spiking Neurons And Analog Neurons, Amin Almassian Mar 2016

Dissertations and Theses

Real-time processing of space- and time-variant signals is imperative for perception and real-world problem-solving. In the brain, spatio-temporal stimuli are converted into spike trains by sensory neurons and projected to neurons in subcortical and cortical layers for further processing.

Reservoir Computing (RC) is a neural computation paradigm inspired by cortical Neural Networks (NNs). It is promising for real-time, online computation of spatio-temporal signals. An RC system incorporates a Recurrent Neural Network (RNN) called a reservoir, whose state is changed by a trajectory of perturbations caused by a spatio-temporal input sequence. A trained, non-recurrent, linear readout layer interprets the …
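
As a rough illustration of the RC pipeline described in this abstract, the sketch below builds a small echo state reservoir with analog (tanh) neurons, perturbs it with a toy input sequence, and trains a non-recurrent linear readout by ridge regression for one-step-ahead prediction. The network size, task, and parameters are assumptions for illustration and do not reproduce the thesis's spiking-neuron experiments.

```python
# Sketch: echo state reservoir with analog neurons and a ridge-regression readout.
import numpy as np

rng = np.random.default_rng(1)
N, T, washout = 100, 2000, 100

W = rng.standard_normal((N, N))
W *= 0.95 / max(abs(np.linalg.eigvals(W)))     # roughly enforce the echo state property
w_in = rng.uniform(-0.5, 0.5, N)

u = np.sin(0.2 * np.arange(T + 1))             # toy input signal
target = u[1:]                                 # task: one-step-ahead prediction

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])           # reservoir perturbed by the input
    states[t] = x

# Train the non-recurrent linear readout (ridge regression), discarding the washout
X, y = states[washout:], target[washout:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)

pred = X @ W_out
print("NRMSE on training signal:", np.sqrt(np.mean((pred - y) ** 2)) / np.std(y))
```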


Sparse Adaptive Local Machine Learning Algorithms For Sensing And Analytics, Jack Cannon Jan 2016

Undergraduate Research & Mentoring Program

The goal of digital image processing is to capture, transmit, and display images as efficiently as possible. Such tasks are computationally intensive because an image is digitally represented by large amounts of data. It is possible to render an image by reconstructing it from a subset of the most relevant data. One procedure commonly used to accomplish this is sparse coding. For our purposes, we present images of handwritten digits to an artificial neural network. The network implements Rozell's locally competitive algorithm (LCA) to generate a sparse code. This sparse code is …
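
A minimal sketch of the LCA dynamics mentioned in this abstract is given below: internal unit states are driven by the dictionary's response to the input, inhibited by overlapping atoms, and passed through a soft threshold to yield a sparse code. The random dictionary and input stand in for the handwritten-digit images; all parameter values are illustrative assumptions.

```python
# Sketch: Rozell's Locally Competitive Algorithm (LCA) for sparse coding.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_atoms = 64, 128

Phi = rng.standard_normal((n_pixels, n_atoms))
Phi /= np.linalg.norm(Phi, axis=0)              # unit-norm dictionary atoms
x = rng.standard_normal(n_pixels)               # stand-in for an image patch

lam, tau, dt, n_steps = 0.1, 10.0, 1.0, 300
b = Phi.T @ x                                   # feed-forward drive
G = Phi.T @ Phi - np.eye(n_atoms)               # lateral inhibition between atoms

def threshold(u):
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)   # soft threshold

u = np.zeros(n_atoms)                           # internal (membrane-like) states
for _ in range(n_steps):
    a = threshold(u)                            # current sparse coefficients
    u += (dt / tau) * (b - u - G @ a)           # LCA dynamics

a = threshold(u)
print("active coefficients:", np.count_nonzero(a), "of", n_atoms)
print("reconstruction error:", np.linalg.norm(x - Phi @ a))
```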


A Brief Review Of Speaker Recognition Technology, Clark D. Shaver, John M. Acken Jan 2016

Electrical and Computer Engineering Faculty Publications and Presentations

This paper reviews the development of speaker recognition systems from pre-computing days to current trends. It also reviews the advances in various sciences that have allowed autonomous speaker recognition systems to become a practical means of identity authentication.