
Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Physical Sciences and Mathematics

Selected Works

Hava Siegelmann

1994

Articles 1 - 1 of 1

Full-Text Articles in Entire DC Network

On A Learnability Question Associated To Neural Networks With Continuous Activations, Bhaskar Dasgupta, Hava Siegelmann, Eduardo Sontag Jun 1994

This paper deals with the learnability of concept classes defined by neural networks, showing the hardness of PAC-learning (in the computational-complexity sense, not merely the information-theoretic sense) for networks with a particular class of continuous activations. The obstruction lies not with the VC dimension, which is known to grow slowly; instead, the result follows from the fact that the loading problem is NP-complete. (The hardness scales badly with the input dimension; the loading problem is solvable in polynomial time when the input dimension is held constant.) Similar and well-known theorems had already been proved by Megiddo and by Blum and Rivest for binary-threshold networks. It turns out the general problem …
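As a reading aid (not part of the indexed abstract), the loading problem mentioned above can be phrased as a consistency question; the notation in this sketch is illustrative and not taken from the paper:

\[
\textsc{Loading}(\mathcal{A}):\quad \text{given } S=\{(x_i,y_i)\}_{i=1}^{m}\subset\mathbb{R}^{n}\times\{0,1\},\ \text{decide whether } \exists\, w \ \text{such that } \mathcal{A}_{w}(x_i)=y_i \ \text{for all } i.
\]

By the standard argument of Pitt and Valiant, if \(\textsc{Loading}(\mathcal{A})\) is NP-hard, then there is no polynomial-time PAC learner that outputs hypotheses of architecture \(\mathcal{A}\) unless RP = NP; this is the complexity-theoretic route the abstract alludes to.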