Engineering Commons

Open Access. Powered by Scholars. Published by Universities.®

Articles 1 - 3 of 3

Full-Text Articles in Engineering

Recall Distortion In Neural Network Pruning And The Undecayed Pruning Algorithm, Aidan Good, Jiaqi Lin, Hannah Sieg, Mikey Ferguson, Xin Yu, Shandian Zhe, Jerzy Wieczorek, Thiago Serra Nov 2022

Faculty Conference Papers and Presentations

Pruning techniques have been successfully used in neural networks to trade accuracy for sparsity. However, the impact of network pruning is not uniform: prior work has shown that the recall for underrepresented classes in a dataset may be more negatively affected. In this work, we study such relative distortions in recall by hypothesizing an intensification effect that is inherent to the model: namely, that pruning makes recall relatively worse for a class with recall below accuracy and, conversely, relatively better for a class with recall above accuracy. In addition, we propose a new pruning algorithm aimed …
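As a concrete illustration of the effect the abstract describes, here is a minimal sketch, not the authors' undecayed pruning algorithm, of global magnitude pruning together with the per-class recall bookkeeping involved. NumPy is assumed and all function names are hypothetical:

```python
import numpy as np

def global_magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude."""
    flat = np.concatenate([w.ravel() for w in weights])
    threshold = np.quantile(np.abs(flat), sparsity)
    return [np.where(np.abs(w) < threshold, 0.0, w) for w in weights]

def per_class_recall(y_true, y_pred, n_classes):
    """Fraction of each class's true instances that are predicted correctly."""
    return np.array([(y_pred[y_true == c] == c).mean() for c in range(n_classes)])

def intensification_gap(recall_dense, recall_pruned, acc_dense, acc_pruned):
    """Per-class recall change minus the overall accuracy change.

    Under the hypothesized intensification effect, this gap tends to be
    negative for classes whose dense recall was below the dense accuracy,
    and positive for classes whose dense recall was above it.
    """
    return (recall_pruned - recall_dense) - (acc_pruned - acc_dense)
```

Comparing the sign of `intensification_gap` against whether each class started below or above the overall accuracy is one way to probe the hypothesis empirically.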


Training Thinner And Deeper Neural Networks: Jumpstart Regularization, Carles Riera, Camilo Rey, Thiago Serra, Eloi Puertas, Oriol Pujol Jun 2022

Faculty Conference Papers and Presentations

Neural networks are more expressive when they have multiple layers. However, conventional training methods are only successful if the depth does not lead to numerical issues such as exploding or vanishing gradients, which occur less frequently when the layers are sufficiently wide. Increasing width to attain greater depth, in turn, entails heavier computational resources and leads to overparameterized models. These issues have been partially addressed by model compression methods such as quantization and pruning, some of which rely on normalization-based regularization of the loss function to make the effect of most parameters negligible. In this work, …
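The jumpstart regularizer itself is not spelled out in this excerpt, so the sketch below shows only the generic pattern the abstract alludes to: a deliberately thin, deep ReLU network whose training loss is augmented with a norm-based penalty (plain L2 here, standing in for weight decay, not for the paper's method). PyTorch is assumed and all names are illustrative:

```python
import torch
import torch.nn as nn

def thin_deep_mlp(in_dim=16, width=8, depth=32, out_dim=10):
    """A deliberately thin and deep ReLU network: the regime where
    exploding or vanishing gradients make conventional training fragile."""
    layers = [nn.Linear(in_dim, width), nn.ReLU()]
    for _ in range(depth - 2):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, out_dim))
    return nn.Sequential(*layers)

def training_step(model, x, y, optimizer, lam=1e-4):
    """One step of loss-plus-penalty training; the L2 term illustrates the
    normalization-based regularization the abstract mentions, which pushes
    most parameters toward zero so that compression can later discard them."""
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    penalty = sum(p.pow(2).sum() for p in model.parameters())
    (loss + lam * penalty).backward()
    optimizer.step()
    return float(loss)
```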


Strengthening A Linear Reformulation Of The 0-1 Cubic Knapsack Problem Via Variable Reordering, Richard Forrester, Lucas Waddell Jan 2022

Faculty Journal Articles

The 0-1 cubic knapsack problem (CKP), a generalization of the classical 0-1 quadratic knapsack problem, is an extremely challenging NP-hard combinatorial optimization problem. An effective exact solution strategy for the CKP is to reformulate the nonlinear problem into an equivalent linear form that can then be solved using a standard mixed-integer programming solver. We consider a classical linearization method and propose a variant of a more recent technique for linearizing 0-1 cubic programs applied to the CKP. Using a variable reordering strategy, we show how to improve the strength of the linear programming relaxation of our proposed reformulation, which ultimately …
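For readers unfamiliar with the starting point, the classical linearization replaces each cubic product x_i x_j x_k with a bounded continuous variable y_ijk that the constraints force to equal the product at any 0-1 point; the variable-reordering strengthening proposed in the paper is not reproduced here. A sketch using PuLP, with an illustrative data layout (c linear profits, d cubic profits keyed by index triples, a item weights, b capacity):

```python
import itertools
import pulp

def linearized_ckp(c, d, a, b):
    """Classical linearization of the 0-1 cubic knapsack problem:
    maximize sum_i c[i]*x_i + sum_{i<j<k} d[i,j,k]*x_i*x_j*x_k
    subject to sum_i a[i]*x_i <= b, with x binary."""
    n = len(c)
    prob = pulp.LpProblem("CKP_linearized", pulp.LpMaximize)
    x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(n)]
    y = {}
    for i, j, k in itertools.combinations(range(n), 3):
        y_ijk = pulp.LpVariable(f"y_{i}_{j}_{k}", lowBound=0, upBound=1)
        # These constraints force y_ijk = x_i * x_j * x_k at any 0-1 point:
        prob += y_ijk <= x[i]
        prob += y_ijk <= x[j]
        prob += y_ijk <= x[k]
        prob += y_ijk >= x[i] + x[j] + x[k] - 2
        y[i, j, k] = y_ijk
    # Linearized objective and the knapsack capacity constraint.
    prob += (pulp.lpSum(c[i] * x[i] for i in range(n))
             + pulp.lpSum(d[t] * y[t] for t in y))
    prob += pulp.lpSum(a[i] * x[i] for i in range(n)) <= b
    return prob, x, y
```

How tight the LP relaxation of this model is depends on which of these constraints are retained and, per the abstract, on the order in which the variables are indexed; solving the resulting model is then a call to prob.solve() with any MIP backend PuLP supports.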