SwarmSize to give to particleswarm optimization

12 views (last 30 days)
Rafael
Rafael on 27 March 2023
Commented: Rafael on 29 March 2023
I'm using PSO to find the minimum of a function of 3 variables. The first time I did not customize the parameters, so the errors I got were acceptable but not great.
Eventually I found that I can change 'SwarmSize' and 'MaxIterations'. Changing 'MaxIterations' did not make any difference; however, changing 'SwarmSize' improved the error a lot.
These are the options I give to the particleswarm function:
options = optimoptions('particleswarm','MaxIterations',10e12,'SwarmSize',1200);
As you can see, the size I'm using is 1200 instead of the default nvars*10.
The problem that came up, as you are probably guessing, is that the run time increased too.
So I would like to do some sort of analysis of run time vs. error in order to find an optimal size, but I'm not sure how I should start.
The only idea I have is to run the optimization 100 times for each value of 'SwarmSize' (e.g. 100, 150, 200, ...), and then average the error and time for each size, but I don't know if this is an efficient way to do it, or even a correct way to do it.
I would then choose the size with the lowest time*error product.
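A minimal sketch of the sweep described above. The objective name myObjective and the bounds lb, ub are hypothetical placeholders for your actual problem; adjust the candidate sizes and run count to your time budget.

```matlab
% Sweep SwarmSize, repeating each setting because PSO is stochastic.
sizes  = 100:100:600;      % candidate SwarmSize values (adjust as needed)
nRuns  = 100;              % repetitions per size
nvars  = 3;
meanErr  = zeros(size(sizes));
meanTime = zeros(size(sizes));

for i = 1:numel(sizes)
    errs  = zeros(nRuns, 1);
    times = zeros(nRuns, 1);
    for r = 1:nRuns
        opts = optimoptions('particleswarm', ...
            'SwarmSize', sizes(i), 'Display', 'off');
        t0 = tic;
        [~, fval] = particleswarm(@myObjective, nvars, lb, ub, opts);
        times(r) = toc(t0);
        errs(r)  = fval;   % or abs(fval - knownMinimum) if the true minimum is known
    end
    meanErr(i)  = mean(errs);
    meanTime(i) = mean(times);
end

% Pick the size minimizing the proposed time*error product:
[~, best] = min(meanTime .* meanErr);
bestSize = sizes(best);
```

Note that a time*error product only makes sense if the error is positive; if your objective can be negative, compare against a known minimum or rank the sizes on error subject to a time cap instead.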

Answers (1)

Alan Weiss
Alan Weiss on 28 March 2023
For most optimization problems, as opposed to algorithm development, the question is how to obtain a good solution in as few function evaluations as possible. To do so, usually the best advice is to give the tightest bounds possible, meaning upper and lower limits on each of your three variables with these limits as close together as possible for each variable. After that, the best advice is to choose the most appropriate solver. For smooth problems, that is usually fmincon. For nonsmooth problems, that is usually patternsearch. Also, if you want a global minimum rather than a local minimum, the best thing to do is usually run MultiStart for smooth problems using fmincon as a solver, or run patternsearch starting from a variety of different points.
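The two approaches mentioned here can be sketched as follows. This is only an illustration; the objective name myObjective and the bounds lb, ub are placeholders for the actual problem.

```matlab
nvars = 3;

% Smooth objective: MultiStart with fmincon as the local solver.
problem = createOptimProblem('fmincon', ...
    'objective', @myObjective, ...
    'x0', (lb + ub)/2, 'lb', lb, 'ub', ub);
ms = MultiStart;
[xSmooth, fSmooth] = run(ms, problem, 20);   % 20 local runs from varied starts

% Nonsmooth objective: patternsearch from several random start points.
bestF = Inf;
for k = 1:10
    x0 = lb + rand(1, nvars) .* (ub - lb);
    [x, f] = patternsearch(@myObjective, x0, [], [], [], [], lb, ub);
    if f < bestF
        bestF = f;
        xBest = x;
    end
end
```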
I am not sure what you are trying to do: solve an optimization problem efficiently or investigate the particleswarm algorithm. I answered as if you are trying to optimize efficiently.
Of course, there are a host of considerations that would modify my basic advice. Provide more information and we might be able to help you more relevantly.
Alan Weiss
MATLAB mathematical toolbox documentation
  1 comment
Rafael
Rafael on 29 March 2023
Thank you for your answer, it was very insightful; however, I'm trying to do both of the things you suggested.
To be clearer: I tried to solve the problem using fmincon and fminsearch and they never output a good estimate, so I concluded that my function was not smooth. Then I found that particleswarm was appropriate to use in this case.
So now I've chosen to use particleswarm, and as such I want to study the best way to tune it. This is where my question comes from: how should I study the efficiency of the PSO? Right now I'm running the algorithm 100 times for each (pre-determined) swarm size, and then I will plot the mean error and the mean time the algorithm took to run; from that I think I can conclude which size is optimal.
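A hypothetical sketch of the comparison plot described here, assuming vectors sizes, meanErr, and meanTime have already been collected from the repeated runs:

```matlab
% Plot mean error and mean run time against swarm size on twin axes.
yyaxis left
plot(sizes, meanErr, '-o');
ylabel('Mean final objective value');
yyaxis right
plot(sizes, meanTime, '-s');
ylabel('Mean run time (s)');
xlabel('SwarmSize');
title('particleswarm: accuracy vs. cost per swarm size');
```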
I am aware that tightening my bounds will improve the optimization, and I have already done that.
I will consider your suggestions and, after studying the best way to use particleswarm, will try other solvers such as patternsearch.
So basically, yes, I want the best results possible, and that might lead me to change algorithms, but right now I would like to study the performance of particleswarm on my particular problem.
