Digital Commons Network

Open Access. Powered by Scholars. Published by Universities.®

Articles 1 - 16 of 16

Full-Text Articles in Entire DC Network

The Computational Power Of Interactive Recurrent Neural Networks, Jérémie Cabessa, Hava Siegelmann Mar 2012

In classical computation, rational- and real-weighted recurrent neural networks were shown to be, respectively, equivalent to and strictly more powerful than the standard Turing machine model. Here, we study the computational power of recurrent neural networks in a more biologically oriented computational framework, capturing the aspects of sequential interactivity and persistence of memory. In this context, we prove that so-called interactive rational- and real-weighted neural networks have the same computational power as interactive Turing machines and interactive Turing machines with advice, respectively. A mathematical characterization of each of these computational powers is also provided. It follows from these results that …


Neuronal Integration Of Dynamic Sources: Bayesian Learning And Bayesian Inference, Hava Siegelmann, Lars E. Holzman Sep 2010

One of the brain’s most basic functions is integrating sensory data from diverse sources. This ability leads us to ask whether the neural system is computationally capable of intelligently integrating data, not only when sources have known, fixed relative dependencies but also when it must determine such relative weightings based on dynamic conditions, and then use these learned weightings to accurately infer information about the world. We suggest that the brain is, in fact, fully capable of computing this parallel task in a single network and describe a neurally inspired circuit with this property. Our implementation suggests the possibility that …
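
As a toy illustration of the computation described above (not the paper's circuit), the sketch below fuses two noisy readings of the same quantity by precision weighting, with each source's reliability learned from its observed variance; all data and parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two sensors observe the same latent quantity with different noise levels.
true_value = 2.0
sigma = np.array([0.5, 1.5])              # unknown to the learner
samples = true_value + sigma * rng.standard_normal((1000, 2))

# "Bayesian learning": estimate each source's precision from its running variance.
precision = 1.0 / samples.var(axis=0)

# "Bayesian inference": precision-weighted fusion of a fresh pair of readings.
reading = true_value + sigma * rng.standard_normal(2)
fused = (precision * reading).sum() / precision.sum()

print("readings:", reading, "fused estimate:", fused)
```

For Gaussian noise, the precision-weighted average is exactly the posterior mean, so the fused estimate is reliably closer to the truth than the noisier sensor alone.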


Flexible Kernel Memory, Dimitri Nowicki, Hava Siegelmann Jun 2010

This paper introduces a new model of associative memory, capable of both binary and continuous-valued inputs. Based on kernel theory, the memory model is, on one hand, a generalization of Radial Basis Function networks and, on the other, analogous in feature space to a Hopfield network. Attractors can be added, deleted, and updated on-line simply, without harming existing memories, and the number of attractors is independent of input dimension. Input vectors do not have to adhere to a fixed or bounded dimensionality; they can increase and decrease it without relearning previous memories. A memory consolidation process enables the network …
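
The flavor of retrieval in such a kernel memory can be sketched as an attractor iteration: similarity to each stored pattern is computed through a Gaussian kernel, and the state moves toward the similarity-weighted recombination. This is a generic kernel associative recall, offered on the assumption that it captures the spirit, not the letter, of the paper's model; names and constants are illustrative.

```python
import numpy as np

def recall(x, memories, gamma=4.0, steps=20):
    """Iterate toward the stored pattern most similar to x.

    Each step recombines the memories weighted by a Gaussian kernel
    similarity, so stored patterns act as attractors of the map.
    """
    for _ in range(steps):
        d2 = ((memories - x) ** 2).sum(axis=1)   # squared distances to memories
        w = np.exp(-gamma * d2)
        w /= w.sum()                             # kernel similarities -> weights
        x = w @ memories                         # move toward weighted memory
    return x

memories = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
noisy = np.array([0.9, 0.2, 0.8])                # corrupted version of pattern 0
print(recall(noisy, memories))                   # converges near [1, 0, 1]
```

Note that adding or deleting an attractor here is just adding or deleting a row of `memories`, independent of the other rows, echoing the on-line update property claimed above.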


Probabilistic Analysis Of A Differential Equation For Linear Programming, Asa Ben-Hur, Joshua Feinberg, Shmuel Fishman, Hava Siegelmann Apr 2009

In this paper we address the complexity of solving linear programming problems with a set of differential equations that converge to a fixed point representing the optimal solution. Assuming a probabilistic model in which the inputs are i.i.d. Gaussian variables, we compute the distribution of the convergence rate to the attracting fixed point. Using the framework of Random Matrix Theory, we derive a simple expression for this distribution in the asymptotic limit of large problem size. In this limit, we find the surprising result that the distribution of the convergence rate is a scaling function of a single variable. This …
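
To make the setting concrete, here is a minimal sketch of one such flow for the special case of minimizing a linear cost over the probability simplex: a replicator-type differential equation whose attracting fixed point is the optimal vertex. This is an illustration of the problem class, not necessarily the exact vector field analyzed in the paper; the Euler step size and problem size are arbitrary.

```python
import numpy as np

def lp_flow(c, steps=20000, dt=1e-3):
    """Replicator-type flow on the probability simplex for min c.x.

    dx_i/dt = x_i * ((c.x) - c_i): mass flows toward low-cost coordinates,
    and the attracting fixed point is the optimal vertex.
    """
    x = np.full(len(c), 1.0 / len(c))        # start at the simplex center
    for _ in range(steps):
        x += dt * x * (c @ x - c)
        x = np.clip(x, 1e-12, None)
        x /= x.sum()                          # guard against drift off the simplex
    return x

rng = np.random.default_rng(1)
c = rng.standard_normal(8)                    # i.i.d. Gaussian costs, as in the probabilistic model
x = lp_flow(c)
print("argmin:", np.argmin(c), "flow solution:", np.argmax(x))
# For this flow the asymptotic convergence rate is the gap between
# the two smallest costs, an instance-dependent quantity.
print("rate ~", np.sort(c)[1] - np.sort(c)[0])
```

The convergence rate of such a flow depends on the random instance, which is exactly the kind of quantity whose distribution the paper computes under the Gaussian input model.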


Transcriptional Responses To Estrogen And Progesterone In Mammary Gland Identify Networks Regulating P53 Activity, Shaolei Lu, Klaus A. Becker, Mary J. Hagen, Haoheng Yan, Amy L. Roberts, Lesley A. Mathews, Sallie S. Schneider, Hava Siegelmann, Kyle J. Macbeth, Stephen M. Tirrell, Jeffrey L. Blanchard, D. Joseph Jerry Sep 2008

Estrogen and progestins are essential for mammary growth and differentiation but also enhance the activity of the p53 tumor suppressor protein in the mammary epithelium. However, the pathways by which these hormones regulate p53 activity are unknown. Microarrays were used to profile the transcriptional changes within the mammary gland after administration of either vehicle, 17β-estradiol (E), or progesterone (P) individually and combined (EP). Treatment with EP yielded 1182 unique genes that were differentially expressed compared to the vehicle-treated group. Although 30% of genes were responsive to either E or P individually, combined treatment with both EP had a synergistic effect …


Time-Warped Longest Common Subsequence Algorithm For Music Retrieval, Anyuan Guo, Hava Siegelmann Sep 2004

Recent advances in music information retrieval have enabled users to query a database by singing or humming into a microphone. The queries are often inaccurate versions of the original songs, due to singing errors and errors introduced in the music transcription process. In this paper, we present the Time-Warped Longest Common Subsequence algorithm (T-WLCS), which deals with singing errors involving rhythmic distortions. The algorithm is employed on song retrieval tasks, where its performance is compared to that of the longest common subsequence algorithm.
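
A compact dynamic-programming sketch of the idea: it follows the classic LCS recurrence, except that a match may also extend from the left or upper cell, so one note can match several consecutive notes in the other sequence, absorbing rhythmic stretching. The exact recurrence is my reading of the algorithm's description and should be treated as an assumption.

```python
def t_wlcs(a, b):
    """Time-warped LCS: a matched symbol may be re-used by its neighbor."""
    n, m = len(a), len(b)
    t = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                # Unlike plain LCS, a match may also extend horizontally
                # or vertically, so repeated notes keep scoring.
                t[i][j] = 1 + max(t[i - 1][j - 1], t[i - 1][j], t[i][j - 1])
            else:
                t[i][j] = max(t[i - 1][j], t[i][j - 1])
    return t[n][m]

# A query whose rhythm is stretched still matches the full motif:
print(t_wlcs("CDEC", "CCDDEECC"))   # 8: every repeated note is absorbed
```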


Computation In Gene Networks, Hava Siegelmann, Asa Ben-Hur Feb 2004

Genetic regulatory networks have the complex task of controlling all aspects of life. Using a model of gene expression based on piecewise linear differential equations, we show that this process can be considered a process of computation. This is demonstrated by showing that the model can simulate memory-bounded Turing machines. The simulation is robust with respect to perturbations of the system, an important property for both analog computers and biological systems. Robustness is achieved using a condition that ensures that the model equations, which are generally chaotic, follow predictable dynamics.
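
The model class is easy to state concretely: each gene's expression level decays linearly and is produced at a rate given by a Boolean function of which genes are currently above threshold. Below is a minimal Euler-integration sketch of such a piecewise linear network; the two-gene logic, threshold, and rates are invented for illustration.

```python
import numpy as np

THETA = 0.5  # common expression threshold

def production(bits):
    """Boolean logic of the two-gene network (illustrative).

    Gene 0 is produced while gene 1 is off; gene 1 is produced while
    gene 0 is on -- a simple negative-feedback loop.
    """
    b0, b1 = bits
    return np.array([0.0 if b1 else 1.0, 1.0 if b0 else 0.0])

def simulate(x0, dt=0.01, steps=2000):
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        bits = x > THETA                  # current orthant of state space
        x += dt * (production(bits) - x)  # dx/dt = F(orthant) - x
    return x

print(simulate([0.1, 0.9]))
```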


Verifying Properties Of Neural Networks, Pedro Rodrigues, J. Félix Costa, Hava Siegelmann May 2001

In the early nineties, Hava Siegelmann proposed a new computational model, the Artificial Recurrent Neural Network (ARNN), and proved that it could perform hypercomputation. She also established the equivalence between the ARNN and other analog systems that support hypercomputation, laying the foundations of an alternative computational theory. In this paper we contribute to this alternative theory by exploring the use of formal methods in the verification of temporal properties of ARNNs. Based on the work of Bradfield on the verification of temporal properties of infinite systems, we simplify his tableau system, keeping its expressive power, and show that it is …


Symbolic Dynamics And Computation In Model Gene Networks, R. Edwards, Hava Siegelmann, K. Aziza, L. Glass Feb 2001

We analyze a class of ordinary differential equations representing a simplified model of a genetic network. In this network, the model genes control the production rates of other genes by a logical function. The dynamics in these equations are represented by a directed graph on an n-dimensional hypercube (n-cube) in which each edge is directed in a unique orientation. The vertices of the n-cube correspond to orthants of state space, and the edges correspond to boundaries between adjacent orthants. The dynamics in these equations can be represented symbolically. Starting from a point on the boundary between neighboring orthants, the equation …
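
The symbolic viewpoint can be illustrated directly: simulate a small piecewise linear network of this kind and record the sequence of orthants (vertices of the n-cube) that the trajectory visits. The two-gene logic table below is an invented example, not one from the paper.

```python
import numpy as np

def step(x, logic, dt=0.01):
    """One Euler step of dx/dt = F(orthant) - x for a two-gene network."""
    bits = tuple(int(v > 0.5) for v in x)
    return x + dt * (np.array(logic[bits], float) - x), bits

# Production rates per orthant (an invented logic table): the induced
# edge directions make the trajectory walk around the 2-cube.
logic = {(0, 0): (1, 0), (1, 0): (1, 1), (1, 1): (0, 1), (0, 1): (0, 0)}

x = np.array([0.1, 0.2])
itinerary = []
for _ in range(5000):
    x, bits = step(x, logic)
    if not itinerary or itinerary[-1] != bits:
        itinerary.append(bits)            # record each orthant transition once

print(itinerary[:8])                      # the symbolic dynamics of the flow
```

Each recorded transition corresponds to crossing a boundary between adjacent orthants, i.e., traversing one directed edge of the n-cube described above.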


Clustering Irregular Shapes Using High-Order Neurons, H. Lipson, Hava Siegelmann Sep 2000

This article introduces a method for clustering irregularly shaped data arrangements using high-order neurons. Complex analytical shapes are modeled by replacing the classic synaptic weight of the neuron with high-order tensors in homogeneous coordinates. In the first- and second-order cases, this neuron corresponds to a classic neuron and to an ellipsoidal-metric neuron, respectively. We show how high-order shapes can be formulated to follow the maximum-correlation activation principle and permit simple local Hebbian learning. We also demonstrate decomposition of spatial arrangements of data clusters, including very close and partially overlapping clusters, which are difficult to distinguish using classic neurons. Superior results are …
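
A rough sketch of the second-order case: each neuron carries a matrix-valued weight, so membership is judged by an ellipsoidal metric rather than Euclidean distance. The loop below is essentially Mahalanobis k-means, which conveys the flavor but not the paper's homogeneous-coordinate tensor formulation or its Hebbian update; all parameters are illustrative.

```python
import numpy as np

def ellipsoidal_cluster(X, k=2, iters=25, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    covs = [np.eye(X.shape[1]) for _ in range(k)]
    for _ in range(iters):
        # Assign each point to the nearest neuron under that neuron's metric.
        d = np.stack([
            np.einsum('ij,jk,ik->i', X - c, np.linalg.inv(S), X - c)
            for c, S in zip(centers, covs)
        ])
        labels = d.argmin(axis=0)
        for j in range(k):                 # update the second-order weights
            pts = X[labels == j]
            if len(pts) > X.shape[1]:
                centers[j] = pts.mean(axis=0)
                covs[j] = np.cov(pts.T) + 1e-3 * np.eye(X.shape[1])
    return labels

rng = np.random.default_rng(1)
# Two elongated, nearby clusters that plain (spherical) k-means tends to confuse.
a = rng.standard_normal((200, 2)) * [3.0, 0.2]
b = rng.standard_normal((200, 2)) * [3.0, 0.2] + [0.0, 1.0]
X = np.vstack([a, b])
print(np.bincount(ellipsoidal_cluster(X, k=2)))
```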


A Support Vector Method For Clustering, Asa Ben-Hur, David Horn, Hava Siegelmann, Vladimir Vapnik Aug 2000

We present a novel method for clustering using the support vector machine approach. Data points are mapped to a high-dimensional feature space, where support vectors are used to define a sphere enclosing them. The boundary of the sphere forms, in data space, a set of closed contours containing the data. Data points enclosed by each contour are defined as a cluster. As the width parameter of the Gaussian kernel is decreased, these contours fit the data more tightly and splitting of contours occurs. The algorithm works by separating clusters according to valleys in the underlying probability distribution, and thus …
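
The pipeline can be sketched with off-the-shelf pieces: a one-class SVM with a Gaussian kernel plays the role of the enclosing sphere in feature space, and two points are assigned to the same cluster when the straight segment between them stays inside the learned contour. This is a rough approximation of the method, not the authors' implementation; parameters and the segment-sampling density are arbitrary.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (40, 2)),        # two well-separated blobs
               rng.normal(3, 0.3, (40, 2))])

# Sphere in Gaussian-kernel feature space; decision_function >= 0 inside.
svm = OneClassSVM(kernel="rbf", gamma=2.0, nu=0.05).fit(X)

def same_cluster(a, b, n=20):
    """Points are connected if the segment between them stays in-contour."""
    seg = a + np.linspace(0, 1, n)[:, None] * (b - a)
    return (svm.decision_function(seg) >= 0).all()

# Greedy labeling: connect each point to the first compatible earlier point.
labels = -np.ones(len(X), dtype=int)
next_label = 0
for i in range(len(X)):
    for j in range(i):
        if same_cluster(X[i], X[j]):
            labels[i] = labels[j]
            break
    if labels[i] < 0:
        labels[i] = next_label
        next_label += 1

# Expect mostly one label per blob (boundary outliers may get their own).
print(np.unique(labels[:40]), np.unique(labels[40:]))
```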


Computational Complexity For Continuous Time Dynamics, Hava Siegelmann, Asa Ben-Hur, Shmuel Fishman Aug 1999

Dissipative flows model a large variety of physical systems. In this Letter the evolution of such systems is interpreted as a process of computation; the attractor of the dynamics represents the output. A framework for an algorithmic analysis of dissipative flows is presented, enabling the comparison of the performance of discrete and continuous time analog computation models. A simple algorithm for finding the maximum of n numbers is analyzed and shown to be highly efficient. The notion of tractable (polynomial) computation in the Turing model is conjectured to correspond to computation with tractable (analytically solvable) dynamical systems having polynomial complexity.
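
The max-finding example can be made concrete with a classic winner-take-all flow, offered as an illustration of the framework rather than the Letter's specific algorithm: the attractor encodes the answer, and the settling time plays the role of running time.

```python
import numpy as np

def analog_max(c, dt=1e-3, eps=1e-6):
    """Winner-take-all flow: dx_i/dt = x_i * (c_i - sum_j x_j).

    From a uniform start, the coordinate with the largest c_i suppresses
    the rest; the attracting fixed point encodes the answer.  The number
    of steps until the flow settles is the analog 'running time'.
    """
    x = np.full(len(c), 0.1)
    steps = 0
    while True:
        dx = x * (c - x.sum())
        x += dt * dx
        steps += 1
        if np.abs(dx).max() < eps:        # reached the attractor
            return np.argmax(x), steps

c = np.array([0.3, 0.9, 0.7, 0.5])
winner, steps = analog_max(c)
print("max index:", winner, "settling steps:", steps)
```

The settling time depends on the gap between the largest and second-largest inputs, illustrating how an instance-dependent convergence rate becomes the complexity measure for a flow.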


Discontinuities In Recurrent Neural Networks, R. Gavaldà, Hava Siegelmann Mar 1999

This article studies the computational power of various discontinuous real computational models that are based on the classical analog recurrent neural network (ARNN). This ARNN consists of a finite number of neurons; each neuron computes a polynomial net function and a sigmoid-like continuous activation function. We introduce arithmetic networks as ARNNs augmented with a few simple discontinuous (e.g., threshold or zero-test) neurons. We argue that even with weights restricted to polynomial-time computable reals, arithmetic networks are able to compute arbitrarily complex recursive functions. We identify many types of neural networks that are at least as powerful as arithmetic nets, …
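
To fix ideas, the sketch below runs a tiny ARNN with the usual saturated-linear sigmoid (an affine net function, i.e., the degree-1 polynomial case) augmented with one discontinuous zero-test neuron of the kind the paper adds; the weights are arbitrary illustrations, not a construction from the paper.

```python
import numpy as np

def sigma(x):
    """Saturated-linear sigmoid used in the classic ARNN model."""
    return np.clip(x, 0.0, 1.0)

def zero_test(x):
    """A discontinuous neuron: exact test against zero."""
    return np.where(x == 0.0, 1.0, 0.0)

W = np.array([[0.5, -1.0], [0.25, 0.0]])   # recurrent weights (illustrative)
b = np.array([0.25, 0.5])

x = np.array([0.75, 0.0])
for _ in range(3):
    z = zero_test(x[1])                     # discontinuous side-channel
    x = sigma(W @ x + b + np.array([0.0, 0.5]) * z)
    print(x, "zero-test fired:", bool(z))
```

The continuous part alone can never branch on an exact equality; the single discontinuous neuron is what adds that extra power, which is the theme of the article.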


Attractor Systems And Analog Computation, Hava Siegelmann, Shmuel Fishman Mar 1998

Attractor systems are useful in neurodynamics, mainly in the modeling of associative memory. This paper presents a complexity theory for continuous phase space dynamical systems with discrete or continuous time update, which evolve to attractors. In our framework we associate complexity classes with different types of attractors. Fixed points belong to the class BPP_d, while chaotic attractors are in NP_d. The BPP = NP question of classical complexity theory is translated into a question in the realm of chaotic dynamical systems. This theory enables an algorithmic analysis of attractor networks and flows for the solution of various problems such as linear programming. …


Turing Universality Of Neural Nets (Revisited), J. Pedro Neto, Hava Siegelmann, J. Félix Costa, C. P. Suárez Araujo Jan 1997

We show how to use recursive function theory to prove the Turing universality of finite analog recurrent neural nets with a piecewise linear sigmoid as the activation function. We emphasize the modular construction of nets within nets, a relevant issue from the software engineering point of view.
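
The modular style is easy to illustrate with the piecewise linear sigmoid σ(x) = min(1, max(0, x)): Boolean gates become one-neuron modules over {0, 1} signals, and larger nets are wired from them. The gate weights are standard constructions; the XOR composition is mine, for illustration, not taken from the paper.

```python
def sigma(x):
    """Piecewise linear sigmoid: sigma(x) = min(1, max(0, x))."""
    return min(1.0, max(0.0, x))

# One-neuron Boolean modules over {0, 1} activations:
def AND(a, b): return sigma(a + b - 1)
def OR(a, b):  return sigma(a + b)
def NOT(a):    return sigma(1 - a)

# Nets within nets: compose the modules into an XOR sub-network.
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", XOR(a, b))
```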


On A Learnability Question Associated To Neural Networks With Continuous Activations, Bhaskar Dasgupta, Hava Siegelmann, Eduardo Sontag Jun 1994

This paper deals with the learnability of concept classes defined by neural networks, showing the hardness of PAC-learning (in the complexity sense, not merely the information-theoretic sense) for networks with a particular class of activation functions. The obstruction lies not with the VC dimension, which is known to grow slowly; instead, the result follows from the fact that the loading problem is NP-complete. (The complexity scales badly with input dimension; the loading problem is polynomial-time if the input dimension is constant.) Similar and well-known theorems had already been proved by Megiddo and by Blum and Rivest for binary-threshold networks. It turns out the general problem …