Open Access. Powered by Scholars. Published by Universities.®

Machine Learning Faculty Publications Series

2023

Articles 1 - 3 of 3

Full-Text Articles in Physical Sciences and Mathematics

A Unified Optimization Framework Of Ann-Snn Conversion: Towards Optimal Mapping From Activation Values To Firing Rates, Haiyan Jiang, Srinivas Anumasa, Giulia De Masi, Huan Xiong, Bin Gu Jul 2023


Machine Learning Faculty Publications

Spiking Neural Networks (SNNs) have gained significant attention for their energy efficiency and fast inference, but training SNNs from scratch can be challenging due to the discrete nature of spikes. One alternative is to convert an Artificial Neural Network (ANN) into an SNN, known as ANN-SNN conversion. Existing ANN-SNN conversion methods often redesign the ANN with a new activation function in place of the traditional ReLU before converting it to an SNN. However, these methods do not account for the potential performance gap between the regular ReLU ANN and the tailored ANN. In this work, we …
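To make the conversion idea concrete, here is a minimal sketch (not the paper's method) of the standard intuition behind ANN-SNN conversion: an integrate-and-fire (IF) neuron driven by a constant input for T timesteps fires at a rate that approximates a ReLU activation clipped at the firing threshold. The threshold theta and horizon T below are illustrative choices, not values from the paper.

```python
# Minimal sketch of the activation-to-firing-rate mapping behind
# ANN-SNN conversion. Assumption: an IF neuron with soft reset and
# constant input current; theta and T are illustrative.
import numpy as np

def if_firing_rate(x, T=64, theta=1.0):
    """Simulate an IF neuron with constant input current x for T steps.

    Returns the threshold-scaled firing rate, which approximates
    relu(x) clipped at theta as T grows.
    """
    v, spikes = 0.0, 0
    for _ in range(T):
        v += x                  # integrate the input current
        if v >= theta:          # fire when the membrane crosses threshold
            spikes += 1
            v -= theta          # soft reset: subtract the threshold
    return theta * spikes / T

for x in np.linspace(-0.5, 1.5, 9):
    print(f"x={x:+.2f}  relu={max(x, 0.0):.3f}  rate={if_firing_rate(x):.3f}")
```

As T grows, the quantization error of the rate shrinks; this mapping from activation values to firing rates is exactly what conversion methods aim to preserve.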


High-Probability Bounds For Stochastic Optimization And Variational Inequalities: The Case Of Unbounded Variance, Abdurakhmon Sadiev, Marina Danilova, Eduard Gorbunov, Samuel Horváth, Gauthier Gidel, Pavel Dvurechensky, Alexander Gasnikov, Peter Richtárik Jul 2023


Machine Learning Faculty Publications

In recent years, interest in the high-probability convergence of stochastic optimization methods has been growing in the optimization and machine learning communities. One of the main reasons for this is that high-probability complexity bounds are more accurate and less studied than in-expectation ones. However, state-of-the-art (SOTA) high-probability non-asymptotic convergence results are derived under strong assumptions, such as boundedness of the gradient noise variance or of the objective's gradient itself. In this paper, we propose several algorithms with high-probability convergence results under less restrictive assumptions. In particular, we derive new high-probability convergence results under the assumption that the gradient/operator noise has bounded …
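Gradient clipping is the standard device behind high-probability guarantees under heavy-tailed noise, of the kind the paper's clipped methods refine. The sketch below shows only the generic clipping step on a toy problem with unbounded-variance (Student-t) gradient noise; the step size, clipping level, and objective are illustrative assumptions, not the paper's schedules.

```python
# Minimal sketch of norm-clipped SGD under heavy-tailed gradient noise.
# Assumptions: toy quadratic objective, fixed step size and clip level.
import numpy as np

rng = np.random.default_rng(0)

def clipped_sgd(grad_oracle, x0, lr=0.1, clip_level=1.0, steps=1000):
    """Run SGD where each stochastic gradient is clipped in Euclidean norm."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad_oracle(x)                 # stochastic gradient
        norm = np.linalg.norm(g)
        if norm > clip_level:              # clip: g <- g * min(1, c / ||g||)
            g = g * (clip_level / norm)
        x = x - lr * g
    return x

# Toy quadratic f(x) = ||x||^2 / 2 with Student-t gradient noise;
# for df <= 2 the noise variance is unbounded.
grad = lambda x: x + rng.standard_t(df=2, size=x.shape)
print(clipped_sgd(grad, x0=np.ones(5)))
```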


On The Accelerated Noise-Tolerant Power Method, Zhiqiang Xu Apr 2023


Machine Learning Faculty Publications

We revisit the acceleration of the noise-tolerant power method, for which, despite previous studies, the results remain unsatisfactory: they are either wrong or suboptimal, and they lack generality. In this work, we present a simple yet general and optimal analysis via noise-corrupted Chebyshev polynomials, which allows an iteration rank p larger than the target rank k, requires weaker noise conditions in a new form, and achieves the optimal iteration complexity (Equation presented) for some q satisfying k ≤ q ≤ p in a certain regime of the momentum parameter. Interestingly, it shows a dynamic dependence of the noise tolerance on the …
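For intuition, the following is a minimal sketch of a momentum-accelerated power iteration under additive noise in the single-vector case (iteration rank p = target rank k = 1). The momentum parameter beta and the Gaussian noise model are illustrative assumptions; the paper's analysis covers the general block setting.

```python
# Minimal sketch of an accelerated (Chebyshev/momentum-style) power
# iteration with additive per-step noise, single-vector case.
# Assumptions: symmetric A, illustrative beta and Gaussian noise.
import numpy as np

rng = np.random.default_rng(1)

def accelerated_noisy_power(A, beta, noise_scale=1e-3, iters=200):
    """Estimate the top eigenvector of symmetric A via x_{t+1} = A x_t - beta x_{t-1}."""
    n = A.shape[0]
    x_prev = np.zeros(n)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        g = rng.standard_normal(n) * noise_scale      # additive noise
        x_next = A @ x - beta * x_prev + g            # momentum recurrence
        s = np.linalg.norm(x_next)
        x_prev, x = x / s, x_next / s                 # rescale both to keep the recurrence
    return x

# Toy symmetric matrix; beta ~ lambda_2^2 / 4 is the classical momentum choice.
A = np.diag([1.0, 0.9, 0.5, 0.2])
v = accelerated_noisy_power(A, beta=0.9**2 / 4)
print(np.abs(v))  # should concentrate on the first coordinate
```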