
Particle swarm optimization

Concepts and working of particle swarm optimization.



  1. Particle Swarm Optimization (PSO)
  2. Introduction: Many difficulties such as multi-modality, dimensionality and differentiability are associated with the optimization of large-scale problems. Traditional techniques such as steepest descent, linear programming and dynamic programming generally fail to solve such large-scale problems, especially those with nonlinear objective functions.
  3. Introduction…: Traditional techniques often fail to solve optimization problems that have many local optima. To overcome these problems, there is a need to develop more powerful optimization techniques.
  4. Introduction…: Some of the well-known population-based optimization techniques are: Genetic Algorithms (GA), Artificial Immune Algorithms (AIA), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), Bacteria Foraging Optimization (BFO), Artificial Bee Colony (ABC), Biogeography-Based Optimization (BBO), etc.
  5. Particle Swarm Optimization (PSO): Particle swarm optimization (PSO) is an evolutionary computation technique developed by Kennedy and Eberhart. It exhibits common evolutionary computation attributes, including initialization with a population of random solutions and searching for optima by updating generations.
  6. Concept: A simulation of a simplified social system. The original intent was to graphically simulate the graceful but unpredictable choreography of a bird flock. Each particle keeps track of its coordinates in the problem space, which are associated with the best solution (fitness) it has achieved so far.
  7. How it works? PSO is initialized with a group of random particles (solutions) and then searches for optima by updating generations. Potential solutions, called particles, are "flown" through the problem space by following the current optimum particles.
  8. How it works? Each particle keeps track of its coordinates in the problem space, which are associated with the best solution (fitness) it has achieved so far; this value is called pBest. Another "best" value tracked by the particle swarm optimizer is the best value obtained so far by any particle in the population; this second value is the global best, called gBest.
  9. How it works? The particle swarm optimization concept consists of, at each step, changing the velocity of (i.e. accelerating) each particle toward its pBest and gBest locations (global version of PSO); the standard form of this update rule is written out after the slide list.
  10. PSO Algorithm (general): the algorithm searches the hyperspace of the problem for an optimum. Define the problem to search (how many dimensions? what are the solution criteria?). Initialize the population with random initial positions and random initial velocities. Determine the best positions, both the global best position and each personal best position. Apply the velocity and position update equations.
  11. The step-by-step implementation (a minimal Python sketch of these steps follows the slide list).
  12. Step 1: Initialize the PSO parameters that are necessary for the algorithm: the population size, which indicates the number of individuals; the number of generations necessary for the termination criterion; the cognitive constant and the social constant; the variation of the inertia weight; the maximum velocity; and the number of design variables with their respective ranges.
  13. Step 2: Generate a random population equal to the specified population size. Each population member contains a value for every design variable, randomly generated within the range specified for that variable. The population is the group of birds (particles) that represents the set of solutions.
  14. Step 3: Obtain the values of the objective function for all population members. For the first iteration, the value of the objective function gives the pBest for the respective particle. Identify the particle with the best objective function value as gBest. If the problem is a constrained optimization problem, a specific approach such as a static, dynamic or adaptive penalty is used to convert the constrained optimization problem into an unconstrained one (a small static-penalty sketch follows the slide list).
  15. Step 4: Update the velocity of each particle and check it against the maximum velocity. If the obtained velocity exceeds the maximum velocity, reduce it to the maximum velocity.
  16. Step 5: Update the positions of the particles and check all design variables against their upper and lower limits.
  17. Step 6: Obtain the value of the objective function for all particles. The new solution replaces pBest if it has a better function value. Identify the gBest from the population. Update the value of the inertia weight if required.
  18. Step 7: The best results obtained are saved using elitism. Elite members are not modified by crossover and mutation operators, but they can be replaced if better solutions are found in any iteration.
  19. Step 8: Repeat the steps (from Step 4) until the specified number of generations or the termination criterion is reached.
  20. Advantages: PSO is based on swarm intelligence and can be applied to both scientific research and engineering use. PSO has no overlapping or mutation calculations. The search is carried out by the velocity of the particles; over the course of several generations, only the best particle transmits information to the other particles, and the search is very fast.
  21. Advantages…: The calculation in PSO is very simple; compared with other methods, it offers greater optimization ability and is easy to implement. PSO adopts a real-number encoding, which is decided directly by the solution; the number of dimensions is equal to the number of variables of the solution.
  22. Disadvantages: The method easily suffers from partial optimism, which makes the regulation of its speed and direction less exact. The method cannot solve problems of scattering. The method cannot solve problems of non-coordinate systems, such as the solution to the energy field and the moving rules of the particles in the energy field.
  23. Thank You!
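
Slides 9 and 10 refer to velocity and position update equations without writing them out. For reference, the standard global-best PSO update with an inertia weight is usually written as below, where w is the inertia weight, c1 and c2 are the cognitive and social constants from Step 1, r1 and r2 are uniform random numbers in [0, 1], and v_i and x_i are the velocity and position of particle i (this notation is supplied here and is not taken from the slides):

```latex
v_i^{t+1} = w\,v_i^{t} + c_1 r_1 \bigl(\mathrm{pBest}_i - x_i^{t}\bigr) + c_2 r_2 \bigl(\mathrm{gBest} - x_i^{t}\bigr)
x_i^{t+1} = x_i^{t} + v_i^{t+1}
```

If the new velocity exceeds the maximum velocity from Step 4, it is clamped to that maximum before the position is updated.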
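The step-by-step procedure on slides 12 to 19 maps onto a short program. The sketch below is a minimal illustration under stated assumptions: the sphere function stands in for the objective, all parameter values are placeholders rather than values from the presentation, the inertia weight is decreased linearly, elitism is reduced to simply keeping gBest, and constraint handling is omitted.

```python
import random

# Step 1: PSO parameters (placeholder values, not taken from the slides).
POP_SIZE = 30             # population size (number of individuals)
MAX_GEN = 100             # number of generations (termination criterion)
C1, C2 = 2.0, 2.0         # cognitive and social constants
W_MAX, W_MIN = 0.9, 0.4   # inertia weight varied linearly from W_MAX to W_MIN
V_MAX = 1.0               # maximum velocity per dimension
DIM = 5                   # number of design variables
LOWER, UPPER = -5.0, 5.0  # range of each design variable


def objective(x):
    # Example objective only: the sphere function (minimization).
    return sum(xi * xi for xi in x)


def pso():
    # Step 2: random population within the design-variable ranges.
    pos = [[random.uniform(LOWER, UPPER) for _ in range(DIM)] for _ in range(POP_SIZE)]
    vel = [[random.uniform(-V_MAX, V_MAX) for _ in range(DIM)] for _ in range(POP_SIZE)]

    # Step 3: evaluate; first-iteration values are the pBest, and the best of them is gBest.
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(POP_SIZE), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for gen in range(MAX_GEN):
        # Step 6 (inertia part): vary the inertia weight over the generations.
        w = W_MAX - (W_MAX - W_MIN) * gen / (MAX_GEN - 1)

        for i in range(POP_SIZE):
            for d in range(DIM):
                # Step 4: velocity update toward pBest and gBest, clamped to V_MAX.
                r1, r2 = random.random(), random.random()
                v = (w * vel[i][d]
                     + C1 * r1 * (pbest[i][d] - pos[i][d])
                     + C2 * r2 * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-V_MAX, min(V_MAX, v))

                # Step 5: position update, kept inside the variable limits.
                pos[i][d] = max(LOWER, min(UPPER, pos[i][d] + vel[i][d]))

            # Step 6: re-evaluate; replace pBest/gBest when a better value is found.
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    # Step 7 (simplified elitism): gBest is kept as the elite solution.
                    gbest, gbest_val = pos[i][:], val

    # Step 8: the loop above stops after the specified number of generations.
    return gbest, gbest_val


if __name__ == "__main__":
    best_x, best_f = pso()
    print("best value found:", best_f)
```

Running the file prints the best objective value found after MAX_GEN generations; for a real problem, objective(), DIM and the variable ranges would be replaced accordingly.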
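Slide 14 (Step 3) mentions converting a constrained problem into an unconstrained one with a static, dynamic or adaptive penalty. The fragment below is a minimal static-penalty sketch: the constraint sum(x) - 1 <= 0 and the penalty factor 1e6 are illustrative assumptions, not taken from the presentation.

```python
def sphere(x):
    # Same example objective as in the sketch above.
    return sum(xi * xi for xi in x)


def penalized_objective(x, penalty=1e6):
    # Static penalty: the unconstrained objective plus a fixed multiple of the
    # constraint violation. Illustrative constraint: g(x) = sum(x) - 1 <= 0.
    violation = max(0.0, sum(x) - 1.0)
    return sphere(x) + penalty * violation
```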
