How to use regularization with lsqnonlin function

Hi,
I would like to use the lsqnonlin function with regularization (e.g. Tikhonov regularization). However, to my big surprise, there seems to be no such option!
I like lsqnonlin because I can conveniently specify a cost function with an anonymous function.
Please tell me there is a way to add regularization, or another function that does the same thing with regularization?
Also, please note that my function takes a matrix as input and convolves it with a small kernel k:
fun = @(x) conv2(x,k,'same') - y;
lsqnonlinoptions = optimset('Algorithm','Levenberg-Marquardt','MaxIter',maxIterArg);
x_lsq = lsqnonlin(fun,x0,[],[],lsqnonlinoptions);

Accepted Answer

Bruno Luong
Bruno Luong on 14 Nov 2020
Edited: Bruno Luong on 14 Nov 2020
You are free to incorporate anything into the objective function, e.g., l^2 regularization:
regcoef = some_small_coef;
fun = @(x) [reshape(conv2(x,k,'same')-y,[],1); regcoef*x(:)];
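An end-to-end sketch of this answer, in the same style as the question (the regcoef value is a placeholder; k, y, x0, and maxIterArg are assumed to exist as in the original post):

```matlab
% Tikhonov (l^2) regularization folded into the lsqnonlin residual vector.
regcoef = 1e-3;   % small, problem-dependent coefficient (placeholder value)
fun = @(x) [reshape(conv2(x,k,'same') - y, [], 1); ...  % data-fitting residuals
            regcoef * x(:)];                            % regularization residuals
opts  = optimset('Algorithm','Levenberg-Marquardt','MaxIter',maxIterArg);
x_reg = lsqnonlin(fun, x0, [], [], opts);
```

lsqnonlin minimizes the sum of squares of the stacked vector, which is exactly the data-fitting term plus regcoef^2 times the squared norm of x.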

12 comments

AD
AD on 14 Nov 2020
Oh thanks, that sounds great! But how does lsqnonlin know that this is the regularization? And do you know why it is not documented (since regularization is such a well-known technique)?
AD
AD on 14 Nov 2020
And I guess you put just regcoef*x(:) and not regcoef*norm(x)^2 because lsqnonlin will square and sum the residuals itself, is that correct?
Bruno Luong
Bruno Luong on 14 Nov 2020
No, it doesn't care, as long as the regularization is a 2-norm in some basis.
It doesn't need to be documented, since the user is free to put whatever they like in the objective; here it's the fitness term plus the regularization.
Bruno Luong
Bruno Luong on 14 Nov 2020
Edited: Bruno Luong on 14 Nov 2020
lsqnonlin will minimize
conv_fitness_term + sum((regcoef*x(:)).^2)
which is the initial problem plus l^2 regularization with (regcoef^2) as the Tikhonov regularization coefficient:
conv_fitness_term + (regcoef^2) * norm(x,2)^2
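This identity is easy to check numerically; a quick sketch with random placeholder data:

```matlab
% Check that the squared norm of the stacked residual equals
% fitness + regcoef^2 * ||x||^2.
x = randn(8);  y = randn(8);  k = randn(3);  regcoef = 0.1;
r   = [reshape(conv2(x,k,'same') - y, [], 1); regcoef * x(:)];
lhs = norm(r)^2;
rhs = norm(conv2(x,k,'same') - y, 'fro')^2 + regcoef^2 * norm(x(:))^2;
% lhs and rhs agree up to rounding error
```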
AD
AD on 14 Nov 2020
Edited: AD on 14 Nov 2020
OK, thanks! So implicitly, when you append regcoef*x(:), lsqnonlin somehow knows that the extra rows of the long vector are the regularization and must therefore be summed in... The result of [reshape(conv2(x,k,'same')-y,[],1); regcoef*x(:)] is in fact one long vector, since you vectorized everything. I must say this is very confusing, and I doubt many people would find it by themselves without an explanation (so thanks again for that!). Do you know how lsqnonlin detects which part of this long vector corresponds to the regularization and which part to the conv_fitness term? If it had been two separate outputs I could perhaps understand, but one long vector is strange?
AD
AD on 14 Nov 2020
If I may ask (I don't want to abuse your time): how would you do an L1 regularization?
Bruno Luong
Bruno Luong on 15 Nov 2020
Edited: Bruno Luong on 15 Nov 2020
For L1 regularization you are better off using a tool such as lasso.
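For the convolution problem in the question, lasso (Statistics and Machine Learning Toolbox) needs the convolution written as a matrix. A sketch assuming an odd-sized kernel and convmtx2 from the Image Processing Toolbox; the Lambda value is a placeholder:

```matlab
% L1-regularized deconvolution via lasso.
[m, n] = size(y);  [p, q] = size(k);           % p, q assumed odd
A = convmtx2(k, m, n);                         % A*x(:) = vec(conv2(x,k,'full'))
[I, J] = ndgrid((p-1)/2 + (1:m), (q-1)/2 + (1:n));
idx  = sub2ind([m+p-1, n+q-1], I(:), J(:));    % rows of the 'same' region
xhat = lasso(A(idx,:), y(:), 'Lambda', 1e-3);  % placeholder lambda
xhat = reshape(xhat, m, n);
```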
"Do you know how lsqnonlin detects which part of this long vector corresponds to the regularization and which part to the conv_fitness term?"
Again, lsqnonlin doesn't care what is what (that is the user's interpretation); it just tries to minimize the l2 norm of the output of the user-provided model function, whatever it is. A little experimental suggestion: you can even shuffle the vector
[reshape(conv2(x,k,'same')-y,[],1); regcoef*x(:)]
so that everything is mixed up (keep the shuffle fixed, however), and it still works.
AD
AD on 15 Nov 2020
Thanks for the info. OK, I understand that lsqnonlin will try to minimize the norm of whatever vector my function outputs, i.e. some vector of real values, say v. But then, does that mean lsqnonlin has no way of computing the gradients of the cost function, and therefore does not use gradients for the minimization?
Bruno Luong
Bruno Luong on 15 Nov 2020
"But then, does that mean lsqnonlin has no way of computing the gradients of the cost function?"
What makes you think that? lsqnonlin estimates the gradient by finite differences, as many smooth optimization algorithms do, or uses a user-supplied Jacobian to compute the gradient (see the SpecifyObjectiveGradient option in the doc).
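A sketch of supplying the Jacobian for the regularized problem, assuming a matrix A such that A*x(:) equals reshape(conv2(x,k,'same'),[],1); since the residual is linear in x, the Jacobian is constant:

```matlab
function x_reg = solve_reg(A, y, regcoef, x0)
% Regularized least squares via lsqnonlin with a user-supplied Jacobian.
J = [A; regcoef * speye(size(A,2))];          % constant Jacobian of the residual
opts  = optimoptions('lsqnonlin', 'SpecifyObjectiveGradient', true);
x_reg = lsqnonlin(@resid, x0(:), [], [], opts);
    function [F, Jout] = resid(x)
        F = [A*x - y(:); regcoef * x];        % stacked residual vector
        if nargout > 1, Jout = J; end         % supply Jacobian when requested
    end
end
```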
BTW, in your deconvolution problem f(x) is linear in x, so you can solve it directly with linear algebra, including the additional l^2 regularization. No need for lsqnonlin.
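A sketch of that direct linear-algebra solve, assuming a matrix A such that A*x(:) equals reshape(conv2(x,k,'same'),[],1) (e.g. built from convmtx2 in the Image Processing Toolbox):

```matlab
% Tikhonov-regularized linear least squares in one backslash:
% minimizes ||A*x - y(:)||^2 + regcoef^2 * ||x||^2.
n = size(A, 2);
x_reg = [A; regcoef * speye(n)] \ [y(:); zeros(n, 1)];
% Equivalent normal-equations form:
% x_reg = (A'*A + regcoef^2 * speye(n)) \ (A' * y(:));
```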
AD
AD on 15 Nov 2020
Edited: AD on 15 Nov 2020
Also, I noticed lsqnonlin works with the conv2 problem I showed in the OP, but when I try:
fun = @(theta) reshape(imrotate(x,theta,'crop')-y,[],1);
it doesn't find the rotation angle and seems to just output the same x I gave it (with a very minor rotation, maybe 1.5 degrees, whereas I gave a y rotated by 20 degrees). Note that x is the ground-truth image (no rotation) and y is the rotated image (see the full code below).
Any idea why it does not work with rotation? (I know there are other methods for estimating such parameters, but I am curious about lsqnonlin.)
Full function code so you can try it yourself:
function lsq_rec_estimRot_withReg(x,y,maxIterArg)
fun = @(theta) reshape(imrotate(x,theta,'crop')-y,[],1);
theta0 = randn();  % initial guess
lsqnonlinoptions = optimset('Algorithm','Levenberg-Marquardt','MaxIter',maxIterArg);
theta_hat = lsqnonlin(fun,theta0,[],[],lsqnonlinoptions);
%% disp inside func for now
figure;
subplot(131);
imshow(x);
title('x');
subplot(132);
imshow(y);
title('y');
subplot(133);
imshow(imrotate(x,theta_hat,'crop'));
title(['x_{hat}, rot angle:' num2str(theta_hat) ] );
end
Bruno Luong
Bruno Luong on 15 Nov 2020
I have no Image Processing Toolbox so I can't try it, though I think I can guess the reason. But I will stop answering the many questions that drift from the original one.
AD
AD on 15 Nov 2020
Edited: AD on 15 Nov 2020
OK, sorry for the many questions; it's just that I am trying things out. Shall I post another question with this? It's just to get a better intuition of things... I know there are algorithms for image re-alignment.


More Answers (0)

Version: R2020a

Asked: AD on 14 Nov 2020
Edited: AD on 15 Nov 2020
