Open Access. Powered by Scholars. Published by Universities.®

Theory and Algorithms Commons

Articles 1 - 2 of 2

Full-Text Articles in Theory and Algorithms

Anomaly Detection In Sequential Data: A Deep Learning-Based Approach, Jayesh Soni Jun 2022

FIU Electronic Theses and Dissertations

Anomaly detection has been researched in various domains, with applications in intrusion detection, fraud detection, system health management, and bioinformatics. Conventional anomaly detection methods analyze each data instance independently (univariate or multivariate) and ignore the sequential characteristics of the data. Some anomalies, however, only emerge when individual instances are grouped into sequences, so conventional instance-by-instance analysis cannot detect them. Currently: (1) Deep learning-based algorithms are widely used for anomaly detection. However, significant computational overhead is incurred during training due to a static, constant batch size and learning …


An Angle-Based Stochastic Gradient Descent Method For Machine Learning: Principle And Application, Chongya Song Feb 2021

FIU Electronic Theses and Dissertations

In deep learning, optimization algorithms are employed to expedite convergence to accurate models by calibrating the current gradient and the associated learning rate. A major shortcoming of existing methods is how the calibration terms are computed: they rely only on previous gradients. Because the gradient is a time-sensitive quantity computed at a specific moment, older gradients can introduce significant deviation into the calibration terms. Although most algorithms alleviate this by taking an exponential moving average of the previous gradients, we found that this method …
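The exponential moving average of past gradients mentioned above is the familiar momentum-style calibration used by optimizers such as SGD-with-momentum and Adam. A minimal sketch of that remedy (illustrative only; this is the baseline the abstract critiques, not the angle-based method the dissertation proposes):

```python
def ema_sgd(params, grads_fn, lr=0.1, beta=0.9, steps=200):
    """Gradient descent where each step follows an exponential moving
    average (EMA) of past gradients rather than the raw current gradient.

    beta controls how quickly old gradients decay: with beta=0.9, a
    gradient from k steps ago is weighted by 0.9**k, so stale gradients
    still contribute, which is exactly the deviation the abstract discusses.
    """
    m = [0.0] * len(params)          # EMA of gradients, one per parameter
    for _ in range(steps):
        grads = grads_fn(params)
        for i, g in enumerate(grads):
            m[i] = beta * m[i] + (1 - beta) * g   # blend old and new gradients
            params[i] -= lr * m[i]                # step along the smoothed direction
    return params

# Hypothetical usage: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
grads_fn = lambda p: [2.0 * (p[0] - 3.0)]
result = ema_sgd([0.0], grads_fn)
```

The EMA smooths noisy gradient estimates, but because old gradients never fully vanish, the smoothed direction can lag behind the true gradient when the loss surface changes quickly, which motivates alternative calibrations such as the angle-based one studied here.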