Calling GA() with 10 generations 10x vs GA() with 100 generations.
I'm running GA optimization on a problem, and I want a way to save intermediate results, both to recover from potential crashes and to allow exiting early if desired.
I've been running GA like this:
options = optimoptions('ga','PlotFcn',@gaplotbestf,'Display','iter','PopulationSize',150,'MaxGenerations',10);
[x,fval,exitflag,output,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],options);
for idx = 1:9
    file_name = 'dummy'; % file naming code excluded for brevity
    save(file_name,'final_pop','final_scores');
    options = optimoptions('ga','PlotFcn',@gaplotbestf,'Display','iter','PopulationSize',150,'MaxGenerations',10,'InitialPopulationMatrix',final_pop,'InitialScoresMatrix',final_scores);
    [x,fval,exitflag,output,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],options);
end
Clarification edit: In every call after the first, the options pass ga() the final population and scores from the previous ga() run, via 'InitialPopulationMatrix',final_pop and 'InitialScoresMatrix',final_scores in optimoptions().
Is there any difference in the results GA will give (beyond differences due to random seeding) versus running once with 100 generations?
options = optimoptions('ga','PlotFcn', @gaplotbestf,'Display','iter','PopulationSize',150,'MaxGenerations',100,'UseParallel',true,'SelectionFcn',{@selectiontournament,2},'EliteCount',13);
[x,fval,exitflag,output,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],options);
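For the crash-safety goal specifically, ga also supports an 'OutputFcn' option that is called every generation, which avoids restarting ga in a loop at all. A sketch under my own naming assumptions (the function name ga_checkpoint and the file checkpoint.mat are illustrative, not from the original post):

```matlab
% Sketch: checkpoint every generation from inside ga, so a crash loses at
% most one generation. Uses the documented ga output-function signature.
function [state,options,optchanged] = ga_checkpoint(options,state,flag)
    optchanged = false;           % we do not modify the options
    if strcmp(flag,'iter')        % called once per generation
        final_pop    = state.Population;  % current population
        final_scores = state.Score;       % current scores
        generation   = state.Generation;
        save('checkpoint.mat','final_pop','final_scores','generation');
    end
end
```

It would be hooked in with 'OutputFcn',@ga_checkpoint in optimoptions, letting a single 100-generation run save intermediate results without being split into ten 10-generation runs.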
Answers (1)
John D'Errico
17 Apr 2024
Edited: John D'Errico, 17 Apr 2024
Is there any difference? Of course! Make it even more extreme: suppose you were to stop after 1 generation. ONLY 1. But do it 100 times. Each call will not come even remotely close to convergence, so you will get random crapola 100 times. Surely that is different from allowing the optimizer to run to the point where it has converged, but doing it only once?
Even the average of 100 sets of randomly unconverged results would still probably not be very good.
2 comments
John D'Errico
17 Apr 2024
Ok. That is different.
GA does not generate anything like a Hessian matrix, which is how memory of past iterations would normally be carried between calls. So I think the two cases would now be similar, as long as each restart begins from an initial population identical to where GA ended up last. I don't know that they would be identical, of course. The random seed state would influence things, so if that seed gets touched between calls, all bets are off.
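To make that comparison as controlled as possible, the RNG state can be captured at each hand-off and restored before resuming. A sketch under stated assumptions (the seed 12345 is arbitrary, and fun, nvars, lb, ub come from the original post):

```matlab
% Sketch: resume ga from the previous final population while carrying the
% RNG state across calls, so restarts differ from one long run as little
% as possible.
rng(12345,'twister');   % fix the seed (arbitrary value)
opts = optimoptions('ga','PopulationSize',150,'MaxGenerations',10);
[x,fval,~,~,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],opts);
s = rng;                % capture RNG state at the hand-off
for idx = 1:9
    rng(s);             % restore the state before resuming
    opts = optimoptions(opts, ...
        'InitialPopulationMatrix',final_pop, ...
        'InitialScoresMatrix',final_scores);
    [x,fval,~,~,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],opts);
    s = rng;
end
```

Even with the RNG pinned this way, the runs need not match a single 100-generation run exactly: each restart resets ga's internal per-run state (stall counters, adaptive mutation scaling, and the like), which the final population and scores do not capture.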