Open Access. Powered by Scholars. Published by Universities.®

Digital Commons Network


Selected Works, Physical Sciences and Mathematics, 2000

Hava Siegelmann

Articles 1 - 2 of 2


Clustering Irregular Shapes Using High-Order Neurons, H. Lipson, Hava Siegelmann Sep 2000


This article introduces a method for clustering irregularly shaped data arrangements using high-order neurons. Complex analytical shapes are modeled by replacing the classic synaptic weight of the neuron with high-order tensors in homogeneous coordinates. In the first- and second-order cases, this neuron corresponds to a classic neuron and to an ellipsoidal-metric neuron, respectively. We show how high-order shapes can be formulated to follow the maximum-correlation activation principle and permit simple local Hebbian learning. We also demonstrate decomposition of spatial arrangements of data clusters, including very close and partially overlapping clusters, which are difficult to distinguish using classic neurons. Superior results are …
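The second-order ("ellipsoidal") case described in the abstract lends itself to a compact sketch. The toy batch version below is illustrative only, not the paper's method: the farthest-point initialization and the whitening-style update (taking each neuron's tensor as the inverse square root of its cluster's homogeneous correlation matrix, so that the activation reduces to a Mahalanobis-like distance) are assumptions made for this example; the paper derives a local Hebbian update instead.

```python
import numpy as np

def ellipsoidal_cluster(X, k, n_iter=20):
    """Toy clustering with second-order neurons: each neuron j stores a
    matrix Z_j acting on homogeneous coordinates, and a point x is
    assigned to the neuron minimizing ||Z_j @ [x, 1]||."""
    n, d = X.shape
    Xh = np.hstack([X, np.ones((n, 1))])          # homogeneous coordinates
    # simple farthest-point heuristic to pick k seed points (illustrative)
    seeds = [0]
    for _ in range(k - 1):
        d2 = ((X[:, None, :] - X[seeds]) ** 2).sum(-1).min(axis=1)
        seeds.append(int(d2.argmax()))
    labels = ((X[:, None, :] - X[seeds]) ** 2).sum(-1).argmin(axis=1)
    Z = [np.eye(d + 1) for _ in range(k)]
    for _ in range(n_iter):
        for j in range(k):
            pts = Xh[labels == j]
            if len(pts) == 0:
                continue
            # Z_j ~ C_j^{-1/2}: inverse square root of the cluster's
            # homogeneous correlation matrix (a whitening transform)
            C = pts.T @ pts / len(pts) + 1e-6 * np.eye(d + 1)
            w, V = np.linalg.eigh(C)
            Z[j] = V @ np.diag(w ** -0.5) @ V.T
        # reassign: maximum activation = minimum ||Z_j @ x_h||
        acts = np.stack([np.linalg.norm(Xh @ Zj, axis=1) for Zj in Z])
        labels = acts.argmin(axis=0)
    return labels
```

Because the homogeneous correlation matrix encodes both the cluster mean and its second moments, a single matrix per neuron captures an ellipsoidal region without a separate center parameter.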


A Support Vector Method For Clustering, Asa Ben-Hur, David Horn, Hava Siegelmann, Vladimir Vapnik Aug 2000


We present a novel method for clustering using the support vector machine approach. Data points are mapped to a high-dimensional feature space, where support vectors are used to define a sphere enclosing them. The boundary of the sphere forms, in data space, a set of closed contours containing the data. Data points enclosed by each contour are defined as a cluster. As the width parameter of the Gaussian kernel is decreased, these contours fit the data more tightly and splitting of contours occurs. The algorithm works by separating clusters according to valleys in the underlying probability distribution, and thus …
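The cluster-labeling idea in this abstract can be roughly sketched as follows, with scikit-learn's OneClassSVM standing in for the kernel-space enclosing sphere (the boundary substitution, the parameter values, and the segment-sampling resolution are assumptions of this example, not the paper's implementation): two points are placed in the same cluster when the straight segment between them stays inside the learned boundary, and clusters are the connected components of the resulting graph.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def svc_labels(X, gamma=1.0, nu=0.05, n_samples=10):
    """Sketch of support-vector-clustering-style labeling: connect two
    points if the segment between them stays inside the trained boundary
    (decision_function >= 0), then take connected components."""
    boundary = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X)
    n = len(X)
    ts = np.linspace(0.0, 1.0, n_samples)
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            # sample points along the segment X[i] -> X[j]
            seg = np.outer(1 - ts, X[i]) + np.outer(ts, X[j])
            if np.all(boundary.decision_function(seg) >= 0):
                adj[i, j] = adj[j, i] = True
    # connected components of the adjacency graph give cluster labels
    labels = -np.ones(n, dtype=int)
    cur = 0
    for i in range(n):
        if labels[i] == -1:
            labels[i] = cur
            stack = [i]
            while stack:
                k = stack.pop()
                for m in np.nonzero(adj[k])[0]:
                    if labels[m] == -1:
                        labels[m] = cur
                        stack.append(m)
            cur += 1
    return labels
```

Consistent with the abstract, raising `gamma` (a narrower Gaussian width) tightens the contours until they split, producing more clusters; points left outside the boundary end up as singleton labels.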