Open Access. Powered by Scholars. Published by Universities.®
Physical Sciences and Mathematics Commons™
- Keyword
- Recurrent Neural Networks (2)
- Artificial Spin Ice (1)
- Batch Normalization (1)
- Convolutional Neural Networks (1)
- Deep Learning (1)
- Dynamic inversion; high-parameter stabilization; high-gain control; unknown relative degree; control with model uncertainty (1)
- Exploding Gradients (1)
- Ferromagnetic Domains (1)
- Frustrated Magnetism (1)
- Machine Learning (1)
- Magnetic Order (1)
- Micromagnetism (1)
- Neural Networks (1)
- Protein Contact Map (1)
- Quasicrystals (1)
- RNA Secondary Structure (1)
- Symmetrized CNN Architecture (1)
- Vanishing Gradients (1)
Articles 1 - 4 of 4
Full-Text Articles in Physical Sciences and Mathematics
Effects Of Aperiodicity And Frustration On The Magnetic Properties Of Artificial Quasicrystals, Barry Farmer
Theses and Dissertations--Physics and Astronomy
Quasicrystals have been shown to exhibit physical properties that are dramatically different from those of their periodic counterparts. A limited number of magnetic quasicrystals have been fabricated and measured, and they do not exhibit long-range magnetic order, in direct conflict with simulations that indicate such a state should be accessible. This dissertation adopts a metamaterials approach in which artificial quasicrystals are fabricated and studied with the specific goal of identifying how aperiodicity affects long-range magnetic order. Electron beam lithography techniques were used to pattern magnetic thin films into two types of aperiodic tilings, the Penrose P2 and Ammann-Beenker tilings. SQUID …
Filtered-Dynamic-Inversion Control For Unknown Minimum-Phase Systems With Unknown Relative Degree, Sumit Suryakant Kamat
Theses and Dissertations--Mechanical Engineering
We present filtered-dynamic-inversion (FDI) control for unknown linear time-invariant systems that are multi-input multi-output and minimum phase with unknown-but-bounded relative degree. The FDI controller requires limited model information, specifically, knowledge of an upper bound on the relative degree and knowledge of the first nonzero Markov parameter. The FDI controller is a single-parameter high-parameter-stabilizing controller that is robust to uncertainty in the relative degree. We characterize the stability of the closed-loop system. We present numerical examples in which the FDI controller is implemented in feedback with mathematical and physical systems. The numerical examples demonstrate that the FDI controller for unknown relative degree …
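The model information this abstract mentions can be made concrete: for a state-space system (A, B, C), the Markov parameters are H_k = C A^(k-1) B, and the relative degree is the index of the first nonzero one. A minimal sketch, using a hypothetical second-order system (the matrices below are illustrative, not from the dissertation):

```python
import numpy as np

# Hypothetical LTI system x' = Ax + Bu, y = Cx (not from the thesis;
# chosen so the relative degree works out to 2).
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

# Markov parameters H_k = C A^(k-1) B for k = 1, 2, 3, ...
markov = [C @ np.linalg.matrix_power(A, k - 1) @ B for k in range(1, 5)]

# The relative degree is the index of the first nonzero Markov parameter;
# that parameter is the other piece of model knowledge the FDI controller needs.
rel_deg = next(k for k, H in enumerate(markov, start=1)
               if not np.allclose(H, 0.0))
print(rel_deg)              # 2, since H_1 = C B = 0 and H_2 = C A B = 1
print(markov[rel_deg - 1])  # the first nonzero Markov parameter
```

Here an upper bound on `rel_deg` plus `markov[rel_deg - 1]` is exactly the "limited model information" the abstract describes.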
Orthogonal Recurrent Neural Networks And Batch Normalization In Deep Neural Networks, Kyle Eric Helfrich
Theses and Dissertations--Mathematics
Despite the recent success of various machine learning techniques, there are still numerous obstacles that must be overcome. One obstacle is known as the vanishing/exploding gradient problem. This problem refers to gradients that either become zero or unbounded. This is a well-known problem that commonly occurs in Recurrent Neural Networks (RNNs). In this work we describe how this problem can be mitigated, establish three different architectures that are designed to avoid this issue, and derive update schemes for each architecture. Another portion of this work focuses on the often-used technique of batch normalization. Although found to be successful …
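The vanishing/exploding gradient problem the abstract describes can be illustrated numerically: in a vanilla RNN, backpropagated gradients are multiplied by the recurrent Jacobian at every time step, so repeated products with a matrix whose spectral radius differs from 1 shrink toward zero or blow up, while an orthogonal matrix preserves the gradient norm exactly. A minimal sketch (the scaling factors and sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_norm_after(W, steps):
    """Norm of a gradient repeatedly multiplied by the recurrent Jacobian W^T."""
    g = np.ones(W.shape[0])
    for _ in range(steps):
        g = W.T @ g
    return np.linalg.norm(g)

n = 32
W = rng.standard_normal((n, n)) / np.sqrt(n)

vanish = gradient_norm_after(0.5 * W, 50)   # spectral radius < 1: gradient dies
explode = gradient_norm_after(2.0 * W, 50)  # spectral radius > 1: gradient blows up

# An orthogonal recurrent matrix keeps the gradient norm constant,
# which is the motivation for the orthogonal-RNN architectures above.
Q, _ = np.linalg.qr(W)
stable = gradient_norm_after(Q, 50)

print(vanish < 1e-4, explode > 1e4, np.isclose(stable, np.sqrt(n)))
```

This is the motivation for constraining the recurrent weight matrix to be orthogonal or unitary, as in the architectures this dissertation develops.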
Unitary And Symmetric Structure In Deep Neural Networks, Kehelwala Dewage Gayan Maduranga
Theses and Dissertations--Mathematics
Recurrent neural networks (RNNs) have been successfully used on a wide range of sequential data problems. A well-known difficulty in using RNNs is the vanishing or exploding gradient problem. Recently, several RNN architectures have tried to mitigate this issue by maintaining an orthogonal or unitary recurrent weight matrix. One such architecture is the scaled Cayley orthogonal recurrent neural network (scoRNN), which parameterizes the orthogonal recurrent weight matrix through a scaled Cayley transform. This parametrization contains a diagonal scaling matrix consisting of ±1 entries that cannot be optimized by gradient descent. Thus the …
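The scaled Cayley transform mentioned in the abstract can be sketched directly: scoRNN writes the recurrent weight as W = (I + A)⁻¹(I − A)D, where A is a trainable skew-symmetric matrix and D is the fixed diagonal ±1 scaling matrix that gradient descent cannot optimize. Any W of this form is orthogonal. A minimal NumPy sketch (the size and the particular choice of D are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# Trainable part: a skew-symmetric matrix A (A^T = -A), built from an
# arbitrary matrix X.  Its eigenvalues are purely imaginary, so I + A
# is always invertible.
X = rng.standard_normal((n, n))
A = X - X.T

# Fixed part: the diagonal scaling matrix of +/-1 entries.  The split
# between +1s and -1s here is an arbitrary illustrative choice; it is
# this discrete matrix that cannot be tuned by gradient descent.
D = np.diag([1, 1, 1, -1, -1, -1])

# Scaled Cayley transform: W = (I + A)^(-1) (I - A) D.
I = np.eye(n)
W = np.linalg.solve(I + A, I - A) @ D

# The resulting recurrent weight is orthogonal: W^T W = I.
print(np.allclose(W.T @ W, I))  # True
```

Because W stays exactly orthogonal for any skew-symmetric A, training can update A freely while the recurrent Jacobian preserves gradient norms, which is the point of the scoRNN construction.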