What optimization algorithm would work best for my network problem?

8 views (last 30 days)
Robert Gray on 8 Oct 2024
Answered: Sam Chak on 14 Oct 2024
I need to optimize a network of nodes and links, but don't know which MATLAB optimization algorithm I should learn and use. A very simplified figure of the network looks like this:
"I" is some initial Intensity input. Iout1 and Iout2 are 2 intensity outputs for this example. A general node has at most 2 inputs and 2 outputs. Each Node has at most 4 intensity reduction factors f1, f2, f3, f4, and is pictured like this:
Each node has a "knob" (k1) that can be "turned" to change the values of this node's 4 intensity reduction factors f1, f2, f3, f4. So f1(k1), f2(k1), f3(k1), f4(k1); these are not independent reduction factors. (Each reduction factor has a value 0 < f < 1.)
I need to do an optimization over the nodes' knobs so that the output intensities Iout1, Iout2 have the same value, and are as large as possible.
Above is for illustration only. The real network has 77 nodes (so 77 knobs to be adjusted) and 25 outputs that should all be the same (or nearly the same).
I can define the knobs to have discrete values, say 50 different ones, or I can code the knobs as continuous values from 0 to 1. It's not the knob values but the reduction factors f1, f2, f3, f4 that go into the equations; the knobs only tell me which values to use. The reduction factors are tied to one another only by knob value. That is, changing the knob value so that f1 gets larger does not tell me whether f2, f3, f4 will be smaller or larger, only that f2, f3, f4 will change their values as well.
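In code, I picture a single node behaving something like this (just my reading of the figure; how f1-f4 pair up with the inputs and outputs is an assumption, and reductionFactors is a stand-in for whatever lookup or curve maps a knob value to the four factors):

% Sketch of one node (my interpretation, not the actual model):
function [Iout1, Iout2] = nodeForward(Iin1, Iin2, k)
    f = reductionFactors(k);           % stand-in: knob value -> [f1 f2 f3 f4], each in (0, 1)
    Iout1 = f(1)*Iin1 + f(3)*Iin2;     % output 1 collects both inputs, each attenuated
    Iout2 = f(2)*Iin1 + f(4)*Iin2;     % output 2 likewise
end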
Which MATLAB Global Optimization Toolbox algorithm would be best to optimize this type of network?
Any suggestions for what to do or look into, and what not to do, would be very helpful.

Answers (2)

TED MOSBY on 14 Oct 2024
Hi Robert,
I have outlined some suggestions for optimization algorithms for your model using MATLAB's Global Optimization Toolbox:
Genetic Algorithm (GA):
  • Why: GA is well-suited for complex optimization problems with non-linear, non-convex objective functions and discrete or continuous variables. It can handle the large search space and multiple local minima typical in network optimization.
  • How to Use: You can define a custom fitness function that evaluates the difference between the output intensities and penalizes deviations from equality, while maximizing the intensity.
  • MATLAB Function: ga
Particle Swarm Optimization (PSO):
  • Why: PSO is another global optimization technique that can efficiently explore the search space. It's good for continuous optimization problems and can be adapted for discrete problems as well.
  • How to Use: Like GA, you'll need a custom objective function. PSO can be more straightforward to implement and tune compared to GA.
  • MATLAB Function: particleswarm
Simulated Annealing (SA):
  • Why: SA is effective for discrete and continuous optimization problems and can escape local minima by allowing uphill moves. It's useful when the search space is rugged.
  • How to Use: Define an objective function that balances the equality of outputs and their maximization.
  • MATLAB Function: simulannealbnd
Surrogate Optimization:
  • Why: This method is useful for expensive-to-evaluate functions, where the optimization process involves approximating the objective function. It can be efficient with many variables.
  • How to Use: Define a surrogate model that approximates your network's behavior and iteratively refines the model.
  • MATLAB Function: surrogateopt
However, for your network I suggest the first method, the Genetic Algorithm (a minimal sketch follows at the end of this answer), because:
  • Flexibility: GA can handle both discrete and continuous variables, making it suitable for your problem where knob values can be either.
  • Robustness: It can navigate complex, non-linear landscapes with multiple local optima.
  • Parallelization: MATLAB's GA implementation supports parallel computing, which can significantly speed up the optimization process for large problems like yours.
Refer to the documentation for the Global Optimization Toolbox: www.mathworks.com/help/gads/index.html?s_tid=CRUX_lftnav
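As a rough starting point, here is what the ga setup could look like. This is only a sketch: it assumes the 77 knobs are coded as continuous values in [0, 1], that you already have a function (here called networkOutputs, a placeholder name for your own model) that maps the 77 knob values to the 25 output intensities, and that the penalty weight is a tuning choice.

% Minimal sketch, not a drop-in solution: networkOutputs is a placeholder for
% whatever code evaluates your real network.
nKnobs = 77;
lb = zeros(1, nKnobs);               % knobs coded as continuous values in [0, 1]
ub = ones(1, nKnobs);
opts = optimoptions('ga', 'UseParallel', true, 'Display', 'iter');
[kBest, Jbest] = ga(@knobCost, nKnobs, [], [], [], [], lb, ub, [], opts);

function J = knobCost(k)
    Iout   = networkOutputs(k);      % placeholder: 77 knob values -> 25 output intensities
    spread = max(Iout) - min(Iout);  % how far the outputs are from being all equal
    w      = 100;                    % penalty weight on unequal outputs (needs tuning)
    J      = -mean(Iout) + w*spread; % ga minimizes, so the minus sign maximizes mean intensity
end

The same cost function can be passed unchanged to particleswarm or surrogateopt if you want to compare solvers. If you prefer the discrete 50-level knobs, ga also accepts integer design variables through its IntCon argument: optimize over integer level indices and look up the corresponding factors inside the cost function.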
Hope this helps!

Sam Chak on 14 Oct 2024
It may be difficult to determine which optimization algorithm would work best for your network problem. Some optimization algorithms perform well with specific classes of problems. Take a look at the optimization solvers in the Global Optimization Toolbox:
Here is a simple demo based on what you described regarding the node and the "knob" of the activation function.
%% Target
x = linspace(0, 1, 1001);
y = power(x, 1/4); % Target
bias = [0; 0.75]; % a fixed point where two vectors intersect
in1 = [0; -0.75]; % input vector 1
in2 = [1; 0.25]; % input vector 2
figure
hold on
plot(x, y, 'linewidth', 1.5), grid on
plot( bias(1), bias(2), 'p', 'linewidth', 1.5, 'markersize', 15) % bias
plot([bias(1), bias(1)+in1(1)], [bias(2), bias(2)+in1(2)], '--', 'linewidth', 1.5) % input vector 1
plot([bias(1), bias(1)+in2(1)], [bias(2), bias(2)+in2(2)], '--', 'linewidth', 1.5) % input vector 2
hold off
axis("padded")
xlabel({'$x$'}, 'interpreter', 'latex', 'fontsize', 14);
ylabel({'$y$'}, 'interpreter', 'latex', 'fontsize', 14);
title({'Target function: $y = \sqrt[4]{x}$'}, 'interpreter', 'latex', 'fontsize', 14);
%% Single-variable Optimization using fmincon
costfun = @cost; % cost function
k0 = 1; % initial guess of knob
knob = fmincon(costfun, k0)
Local minimum found that satisfies the constraints.
Optimization completed because the objective function is non-decreasing in feasible directions, to within the value of the optimality tolerance, and constraints are satisfied to within the value of the constraint tolerance.
knob = 2.5858
%% Initial activation function
f1 = (1 - x).^k0; % activation function f1, evaluated at the initial knob guess k0
f2 = (    x).^k0; % activation function f2, evaluated at the initial knob guess k0
figure
plot(x, [f1; f2], 'linewidth', 1.5), grid on
axis("padded")
legend({'$f_{1}$', '$f_{2}$'}, 'interpreter', 'latex', 'location', 'east', 'fontsize', 14)
xlabel({'$x$'}, 'interpreter', 'latex', 'fontsize', 14);
ylabel({'$\mathbf{f}$'}, 'interpreter', 'latex', 'fontsize', 14);
title({'Initial activation functions'}, 'interpreter', 'latex', 'fontsize', 14);
%% Optimized activation function
f1 = (1 - x).^knob; % Optimized activation function, f1(knob)
f2 = ( x).^knob; % Optimized activation function, f2(knob)
figure
plot(x, [f1; f2], 'linewidth', 1.5), grid on
axis("padded")
legend({'$f_{1}$', '$f_{2}$'}, 'interpreter', 'latex', 'location', 'best', 'fontsize', 14)
xlabel({'$x$'}, 'interpreter', 'latex', 'fontsize', 14);
ylabel({'$\mathbf{f}$'}, 'interpreter', 'latex', 'fontsize', 14);
title({'Optimized activation functions'}, 'interpreter', 'latex', 'fontsize', 14);
%% Plot results
xout = f1*in1(1) + f2*in2(1) + bias(1); % Output 1 (horizontal coordinate traced by the node)
yout = f1*in1(2) + f2*in2(2) + bias(2); % Output 2 (vertical coordinate traced by the node)
figure
plot(x, y , 'linewidth', 1.5), hold on
plot(xout, yout, 'linewidth', 1.5), grid on, grid minor, hold off
axis("padded")
legend({'$\sqrt[4]{x}$', '$\mathcal{N}(x)$'}, 'interpreter', 'latex', 'location', 'best', 'fontsize', 14)
xlabel({'$x$'}, 'interpreter', 'latex', 'fontsize', 14);
ylabel({'$y$'}, 'interpreter', 'latex', 'fontsize', 14);
title({'Approximation of $\sqrt[4]{x}$'}, 'interpreter', 'latex', 'fontsize', 14);
%% Cost function
function J = cost(param)
x = linspace(0, 1, 1001);
bias = [0; 0.75]; % a fixed point where two vectors intersect
in1 = [0; -0.75]; % input vector 1
in2 = [1; 0.25]; % input vector 2
knob = param;
f1 = (1 - x).^knob; % Custom activation function, f1(knob)
f2 = ( x).^knob; % Custom activation function, f2(knob)
xout = f1*in1(1) + f2*in2(1) + bias(1);
yout = f1*in1(2) + f2*in2(2) + bias(2);
err = power(xout, 1/4) - yout; % pointwise deviation of the node output from the target curve
J = trapz(x, err.^2);          % integrated squared error over [0, 1]
end
