Other Mathematics Commons

Full-Text Articles in Other Mathematics (Articles 1 - 4 of 4)

Normalization Techniques For Sequential And Graphical Data, Cole Pospisil Jan 2023

Theses and Dissertations--Mathematics

Normalization methods have proven to be an invaluable tool in the training of deep neural networks. In particular, Layer and Batch Normalization are commonly used to mitigate the risks of exploding and vanishing gradients. This work presents two methods related to these normalization techniques. The first is Batch Normalization Preconditioning (BNP) for recurrent neural networks (RNNs) and graph convolutional networks (GCNs). BNP has been proposed for fully connected and convolutional networks as a way to achieve performance benefits similar to those of Batch Normalization by controlling the condition number of the Hessian through preconditioning of the gradients. We extend …
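For a rough sense of the mechanism in the graph setting, the sketch below rescales a GCN layer's weight gradient using batch statistics of the layer's effective inputs, the neighborhood-aggregated node features. This is a hypothetical simplification under stated assumptions, not the preconditioner constructed in the thesis; the function name and the inverse-variance rule are illustrative.

    import torch

    def precondition_gcn_grad(weight_grad: torch.Tensor,
                              adj_norm: torch.Tensor,
                              features: torch.Tensor,
                              eps: float = 1e-5) -> torch.Tensor:
        # In a GCN layer H_out = adj_norm @ features @ W, the weights
        # multiply the aggregated features, so those are the "inputs"
        # whose scale affects the conditioning of the Hessian.
        agg = adj_norm @ features               # (num_nodes, in_features)
        var = agg.var(dim=0, unbiased=False)    # per-feature variance over nodes
        # weight_grad has shape (in_features, out_features); dividing each
        # input-feature row by its batch variance mimics the equalizing
        # effect of explicit normalization. Illustrative rule only.
        return weight_grad / (var + eps).unsqueeze(1)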


Novel Architectures And Optimization Algorithms For Training Neural Networks And Applications, Vasily I. Zadorozhnyy Jan 2023

Theses and Dissertations--Mathematics

The two main areas of Deep Learning are Unsupervised and Supervised Learning. Unsupervised Learning studies a class of data processing problems in which only descriptions of objects are known, without label information. Generative Adversarial Networks (GANs) have become one of the most widely used unsupervised neural network models. A GAN combines two neural networks, a generator and a discriminator, which are trained simultaneously. We introduce a new family of discriminator loss functions, which we call adaptive weighted loss functions, that take a weighted sum of the real and fake parts of the loss. Using the gradient information, we can adaptively choose the weights to train the discriminator in the direction …
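As a hedged illustration of this weighted-sum construction (the function names and the specific adaptive rule below are assumptions made for the sketch, not the loss family or the weight choice derived in the thesis):

    import torch
    import torch.nn.functional as F

    def weighted_d_loss(d_real: torch.Tensor, d_fake: torch.Tensor,
                        w_real: float, w_fake: float) -> torch.Tensor:
        # Discriminator loss as a weighted sum of a real part and a fake
        # part; standard binary cross-entropy parts stand in for the
        # thesis's family of loss functions.
        loss_real = F.binary_cross_entropy_with_logits(
            d_real, torch.ones_like(d_real))
        loss_fake = F.binary_cross_entropy_with_logits(
            d_fake, torch.zeros_like(d_fake))
        return w_real * loss_real + w_fake * loss_fake

    def adaptive_weights(g_real: float, g_fake: float, eps: float = 1e-8):
        # One hypothetical rule using gradient information: weight each
        # part inversely to the norm of its gradient with respect to the
        # discriminator parameters, so neither part dominates the update,
        # then renormalize the weights to sum to one.
        w_real, w_fake = 1.0 / (g_real + eps), 1.0 / (g_fake + eps)
        total = w_real + w_fake
        return w_real / total, w_fake / total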


Batch Normalization Preconditioning For Neural Network Training, Susanna Luisa Gertrude Lange Jan 2022

Theses and Dissertations--Mathematics

Batch normalization (BN) is a ubiquitous method in deep learning that has been shown to decrease training time and improve the generalization performance of neural networks. Despite its success, BN is not theoretically well understood, and it is not suitable for very small mini-batch sizes or online learning. In this work, we propose a new method called Batch Normalization Preconditioning (BNP). Instead of applying normalization explicitly through a batch normalization layer, as is done in BN, BNP applies normalization by conditioning the parameter gradients directly during training. This is designed to improve the Hessian matrix of the loss …
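A minimal sketch of the idea for a single fully connected layer, assuming weights stored as in torch.nn.Linear (shape out_features x in_features); the inverse-variance rescaling below illustrates gradient preconditioning with batch statistics, not the exact preconditioner developed in the thesis:

    import torch

    def bnp_like_precondition(weight_grad: torch.Tensor,
                              inputs: torch.Tensor,
                              eps: float = 1e-5) -> torch.Tensor:
        # Per-feature batch statistics of the layer's inputs.
        var = inputs.var(dim=0, unbiased=False)   # shape: (in_features,)
        # Dividing each input-feature column of the gradient by the batch
        # variance mimics the scale-equalizing effect of an explicit
        # normalization layer without changing the network architecture.
        return weight_grad / (var + eps)

In training, this would be applied to layer.weight.grad after the backward pass and before the optimizer step; unlike BN, it adds no layer and leaves the forward pass untouched.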


On The Dimension Of A Certain Measure Arising From A Quasilinear Elliptic Partial Differential Equation, Murat Akman Jan 2014

Theses and Dissertations--Mathematics

We study the Hausdorff dimension of a certain Borel measure associated with a positive weak solution of a certain quasilinear elliptic partial differential equation in a simply connected domain in the plane. We also assume that the solution vanishes on the boundary of the domain. We then show that the Hausdorff dimension of this measure is less than one, equal to one, or greater than one, depending on the homogeneity of a certain function. This work generalizes the work of Makarov, where the partial differential equation is the usual Laplace equation, and the work of Lewis and his coauthors when …
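For orientation, the setting can be sketched as follows; the divergence-form equation, the degree-p homogeneity, and the assignment of the three cases are assumptions drawn from the related p-harmonic literature, since the truncated abstract does not state them explicitly:

    \[
      \nabla \cdot \big( \nabla f(\nabla u) \big) = 0 \ \text{in } \Omega \subset \mathbb{R}^{2},
      \qquad u > 0 \ \text{in } \Omega,
      \qquad u = 0 \ \text{on } \partial\Omega,
    \]
    with $f$ homogeneous of degree $p > 1$. The Borel measure $\mu$ is characterized by
    \[
      \int \nabla f(\nabla u) \cdot \nabla \phi \, dA = - \int \phi \, d\mu
      \qquad \text{for all } \phi \in C_{0}^{\infty}(\mathbb{R}^{2}),
    \]
    and the trichotomy for its Hausdorff dimension reads
    \[
      \dim_{\mathcal{H}} \mu
      \begin{cases}
        > 1 & \text{if } 1 < p < 2, \\
        = 1 & \text{if } p = 2 \ \text{(Laplace's equation; Makarov)}, \\
        < 1 & \text{if } p > 2.
      \end{cases}
    \]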