== Variations and practicalities ==
There are a number of considerations in using PSO in practice; for example, one might wish to clamp the positions or velocities to a certain range, e.g., keeping each velocity component within <math>[-v_{max}, v_{max}]</math> so that particles do not overshoot or leave the feasible region of the search space.
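A minimal sketch of such velocity clamping, using NumPy (the function name, array shapes, and the value of <code>v_max</code> here are illustrative assumptions, not part of any standard PSO implementation):

```python
import numpy as np

def clamp_velocity(v, v_max):
    """Limit each velocity component to the range [-v_max, v_max]."""
    return np.clip(v, -v_max, v_max)

# Two hypothetical particles in a 2-dimensional search space.
velocities = np.array([[3.0, -7.5],
                       [0.2, 12.0]])
clamped = clamp_velocity(velocities, v_max=5.0)
# Every component of `clamped` now lies within [-5, 5].
```

Position clamping works the same way, with the bounds of the feasible region in place of <code>±v_max</code>.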
Note that the research literature contains many heuristics and variants found to improve convergence speed and robustness, such as clever choices of <math>\omega</math>, <math>c_i</math>, and <math>r_i</math>. There are also other variants of the algorithm, such as discretized versions for searching over subsets of <math>\mathbb{Z}^n</math> rather than <math>\mathbb{R}^n</math>, and coevolutionary versions of PSO have been tried with good reported results. Very frequently the value of <math>\omega</math> is taken to decrease over time; e.g., one might have the PSO run for a fixed number of iterations while <math>\omega</math> decreases linearly from a starting value (say 0.9) to a final value (say 0.4), in order to favor exploitation over exploration in later stages of the search. The literature is full of such heuristics. In other words, the canonical PSO algorithm is outperformed on several common function-optimization benchmarks by various improvements that have since been developed, so consulting the literature for parameter choices and variants suited to a particular problem is likely to be helpful.
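The linearly decreasing inertia weight described above can be sketched as a simple schedule (the function name, default values 0.9 and 0.4, and the iteration count are the illustrative choices from the text, not fixed constants of the algorithm):

```python
def inertia(t, t_max, w_start=0.9, w_end=0.4):
    """Linearly decrease the inertia weight from w_start at iteration 0
    to w_end at iteration t_max."""
    return w_start - (w_start - w_end) * t / t_max

# At each iteration t of a run with t_max total iterations, the velocity
# update would use inertia(t, t_max) in place of a constant omega,
# shifting the swarm from exploration toward exploitation over time.
```

For example, halfway through a run of 100 iterations the weight is 0.65, the midpoint of the schedule.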