Open Access. Powered by Scholars. Published by Universities.®


Computer Sciences

Brigham Young University

Particle swarm optimization

Articles 1 - 7 of 7

Full-Text Articles in Physical Sciences and Mathematics

A Speculative Approach To Parallelization In Particle Swarm Optimization, Matthew Gardner, Andrew McNabb, Kevin Seppi Dec 2011

Faculty Publications

Particle swarm optimization (PSO) has previously been parallelized primarily by distributing the computation corresponding to particles across multiple processors. In these approaches, the only benefit of additional processors is an increased swarm size. However, in many cases this is not efficient when scaled to very large swarm sizes (on very large clusters). Current methods cannot answer well the question: “How can 1000 processors be fully utilized when 50 or 100 particles is the most efficient swarm size?” In this paper we attempt to answer that question with a speculative approach to the parallelization of PSO that we refer to as …
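For context, the baseline that the paper's speculative method parallelizes is standard serial global-best PSO. Below is a minimal sketch on the sphere function; the constants (inertia `w`, acceleration coefficients `c1`, `c2`, swarm size, iteration count) are illustrative defaults, not the paper's settings, and the speculative evaluation of future iterations is not reproduced here.

```python
import random

def pso_sphere(dim=2, particles=10, iters=50, seed=0):
    """Minimal serial global-best PSO minimizing the sphere function
    f(x) = sum(x_i ** 2). A baseline sketch only; the paper's speculative
    approach additionally evaluates candidate next-iteration states in
    parallel to use processors beyond the efficient swarm size."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # illustrative inertia and acceleration constants
    f = lambda x: sum(v * v for v in x)

    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]            # each particle's personal best
    gbest = min(pbest, key=f)              # swarm-wide best position

    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                # pull toward personal best and global best, keep some inertia
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=f)
    return f(gbest)
```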


An Exploration Of Topologies And Communication In Large Particle Swarms, Matthew Gardner, Andrew McNabb, Kevin Seppi May 2009

Faculty Publications

Particle Swarm Optimization (PSO) has typically been used with small swarms of about 50 particles. However, PSO is more efficiently parallelized with large swarms. We formally describe existing topologies and identify variations which are better suited to large swarms in both sequential and parallel computing environments. We examine the performance of PSO for benchmark functions with respect to swarm size and topology. We develop and demonstrate a new PSO variant which leverages the unique strengths of large swarms. “Hearsay PSO” allows for information to flow quickly through the swarm, even with very loosely connected topologies. These loosely connected topologies are …
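A topology in PSO is usually encoded as a neighbor function: it maps each particle to the set of particles whose personal bests it can see. The sketch below shows the two standard sociometries the paper builds on, ring (sparse) and star (fully connected); "Hearsay PSO" itself is not reproduced, and the function names are illustrative.

```python
def ring_neighbors(i, n, k=1):
    """Indices of particle i's neighbors in a ring of n particles,
    including i itself, with k neighbors on each side."""
    return [(i + off) % n for off in range(-k, k + 1)]

def star_neighbors(i, n):
    """Star (fully connected / gbest) topology: every particle is a
    neighbor of every other."""
    return list(range(n))

def neighborhood_best(i, values, neighbors):
    """Index of the best (lowest) objective value visible to particle i
    under the given topology (minimization)."""
    return min(neighbors(i, len(values)), key=lambda j: values[j])
```

With a ring, information about a good solution spreads only a few hops per iteration, which slows convergence but preserves diversity in very large swarms; with a star it spreads instantly.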


Parallel PSO Using MapReduce, Andrew McNabb, Christopher K. Monson, Kevin Seppi Sep 2007

Faculty Publications

In optimization problems involving large amounts of data, such as web content, commercial transaction information, or bioinformatics data, individual function evaluations may take minutes or even hours. Particle Swarm Optimization (PSO) must be parallelized for such functions. However, large-scale parallel programs must communicate efficiently, balance work across all processors, and address problems such as failed nodes. We present MapReduce Particle Swarm Optimization (MRPSO), a PSO implementation based on the MapReduce parallel programming model. We describe MapReduce and show how PSO can be naturally expressed in this model, without explicitly addressing any of the details of parallelization. We present a benchmark …
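The natural fit between PSO and MapReduce can be sketched in plain Python: the map step updates one particle independently of all others (which is what makes the expensive function evaluations parallelizable), and the reduce step combines the personal bests into a new global best. A serial driver stands in for the MapReduce framework below; the function names are illustrative, not MRPSO's actual API.

```python
import random

def pso_map(particle, gbest, f, rng, w=0.7, c1=1.5, c2=1.5):
    """Map step: update one particle's velocity, position, and personal
    best. Each particle is processed independently of the rest, which is
    what makes the MapReduce formulation natural."""
    pos, vel, pbest = particle
    vel = [w * v + c1 * rng.random() * (pb - p) + c2 * rng.random() * (gb - p)
           for p, v, pb, gb in zip(pos, vel, pbest, gbest)]
    pos = [p + v for p, v in zip(pos, vel)]
    if f(pos) < f(pbest):
        pbest = pos[:]
    return (pos, vel, pbest)

def pso_reduce(swarm, f):
    """Reduce step: combine all personal bests into the new global best."""
    return min((p[2] for p in swarm), key=f)

def run(f, dim=2, n=5, iters=20, seed=0):
    """Serial driver standing in for a MapReduce framework."""
    rng = random.Random(seed)
    swarm = []
    for _ in range(n):
        pos = [rng.uniform(-5, 5) for _ in range(dim)]
        swarm.append((pos, [0.0] * dim, pos[:]))
    gbest = pso_reduce(swarm, f)
    for _ in range(iters):
        swarm = [pso_map(p, gbest, f, rng) for p in swarm]   # parallelizable
        gbest = pso_reduce(swarm, f)
    return f(gbest)
```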


Particle Swarm Optimization In Dynamic Pricing, Christopher K. Monson, Patrick B. Mullen, Kevin Seppi, Sean C. Warnick Jul 2006

Faculty Publications

Dynamic pricing is a real-time machine learning problem with scarce prior data and a concrete learning cost. While the Kalman Filter can be employed to track hidden demand parameters and extensions to it can facilitate exploration for faster learning, the exploratory nature of Particle Swarm Optimization makes it a natural choice for the dynamic pricing problem. We compare both the Kalman Filter and existing particle swarm adaptations for dynamic and/or noisy environments with a novel approach that time-decays each particle's previous best value; this new strategy provides more graceful and effective transitions between exploitation and exploration, a necessity in the …
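The time-decay idea can be illustrated for minimization with a positive objective: worsen each particle's stored best slightly on every step, so stale information eventually loses to fresh evaluations and the particle resumes exploring. The function name and decay rate below are illustrative, and the exact scheme in the paper may differ.

```python
def decayed_pbest(pbest_value, current_value, decay=0.95):
    """One personal-best update with time decay (minimization, assuming a
    positive objective). The remembered best is inflated each step, so a
    best found under old market conditions is gradually forgotten.
    `decay` near 1 forgets slowly; the rate here is illustrative."""
    return min(pbest_value / decay, current_value)
```

For example, a stale best of 1.0 competing against a current value of 3.0 is worsened step by step and eventually replaced, restarting exploration around the particle's present position.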


Linear Equality Constraints And Homomorphous Mappings In PSO, Christopher K. Monson, Kevin Seppi Sep 2005

Faculty Publications

We present a homomorphous mapping that converts problems with linear equality constraints into fully unconstrained and lower-dimensional problems for optimization with PSO. This approach, in contrast with feasibility preservation methods, allows any unconstrained optimization algorithm to be applied to a problem with linear equality constraints, making available tools that are known to be effective and simplifying the process of choosing an optimizer for these kinds of constrained problems. The application of some PSO algorithms to a problem that has undergone the mapping presented here is shown to be more effective and more consistent than other approaches to handling linear equality …
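The core idea, mapping a linearly constrained problem to an unconstrained, lower-dimensional one, can be shown with a single hand-derived constraint. Here the feasible set of x1 + x2 = 1 is a line in 2-D, parameterized by one free variable z, so any unconstrained optimizer can search over z and every candidate is feasible by construction. The paper builds such mappings for general systems Ax = b; this sketch and its brute-force search are purely illustrative.

```python
def embed(z):
    """Map an unconstrained 1-D point z onto the feasible set of the
    constraint x1 + x2 = 1 (a line in the original 2-D space)."""
    return (z, 1.0 - z)

def f(x):
    """Objective on the original 2-D space: minimize x1^2 + x2^2."""
    return x[0] ** 2 + x[1] ** 2

# Brute-force search over the single unconstrained variable; in the paper
# this role is played by PSO (or any unconstrained optimizer).
zs = [i / 1000.0 for i in range(-1000, 2001)]
z_star = min(zs, key=lambda z: f(embed(z)))
```

The minimizer of x1^2 + x2^2 on the line x1 + x2 = 1 is (0.5, 0.5), so the search over z recovers z = 0.5, and the constraint holds exactly for every point the optimizer ever evaluates.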


Choosing A Starting Configuration For Particle Swarm Optimization, Mark Richards, Dan A. Ventura Jul 2004

Faculty Publications

The performance of Particle Swarm Optimization can be improved by strategically selecting the starting positions of the particles. This work suggests the use of generators from centroidal Voronoi tessellations as the starting points for the swarm. The performance of swarms initialized with this method is compared with the standard PSO algorithm on several standard test functions. Results suggest that CVT initialization improves PSO performance in high-dimensional spaces.
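CVT generators can be approximated without any geometry library using MacQueen's probabilistic method: repeatedly draw a random sample, find its nearest generator, and nudge that generator toward the sample with a shrinking step. The sketch below works in the unit cube; the parameters are illustrative, not the paper's.

```python
import random

def cvt_generators(k, dim, iters=2000, seed=0):
    """Approximate generators of a centroidal Voronoi tessellation of the
    unit cube via MacQueen's method. The resulting k points are spread
    more evenly than uniform random draws, which is the property that
    makes them useful as PSO starting positions."""
    rng = random.Random(seed)
    gens = [[rng.random() for _ in range(dim)] for _ in range(k)]
    counts = [1] * k
    for _ in range(iters):
        s = [rng.random() for _ in range(dim)]          # random sample point
        j = min(range(k),                               # nearest generator
                key=lambda i: sum((a - b) ** 2 for a, b in zip(gens[i], s)))
        counts[j] += 1
        # move the winning generator toward the sample; step size shrinks
        # as 1/counts[j], so generators settle near their cell centroids
        gens[j] = [g + (v - g) / counts[j] for g, v in zip(gens[j], s)]
    return gens
```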


Dynamic Sociometry In Particle Swarm Optimization, Mark Richards, Dan A. Ventura Sep 2003

Faculty Publications

The performance of Particle Swarm Optimization is greatly affected by the size and sociometry of the swarm. This research proposes a dynamic sociometry, which is shown to be more effective on some problems than the standard star and ring sociometries. The performance of various combinations of swarm size and sociometry on six different test functions is qualitatively analyzed.
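One common form of dynamic sociometry is a ring whose neighborhood radius grows over the run, shifting the swarm from exploration (sparse communication) toward exploitation (dense communication). The linear schedule below is a generic illustration of that idea, not necessarily the scheme proposed in the paper.

```python
def dynamic_ring_k(iteration, max_iters, n):
    """Dynamic sociometry sketch: neighborhood half-width k for a ring of
    n particles, growing linearly from k = 1 (sparse ring) at the start
    to a fully connected swarm by the final iteration. The linear
    schedule is illustrative, not the paper's exact scheme."""
    half = (n - 1) // 2                 # half-width that fully connects the ring
    k = 1 + (half - 1) * iteration // max(1, max_iters - 1)
    return min(half, max(1, k))
```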