fminsearch and different results on estimated parameters.

3 views (last 30 days)
Dev Vencappa on 20 Nov 2013
Answered: Marc on 21 Nov 2013
Dear Matlab users,
I am new to MATLAB and intend later on to maximize a customised log-likelihood function. In my early attempts to learn how to do this in MATLAB, I decided to first experiment with the exponential distribution model for survival analysis. Below is the code (I used the censored model for this) for the objective function, i.e. the negative log-likelihood that fminsearch minimizes (admittedly, this may not be the most efficient coding for this problem, so suggestions are very welcome):
function like = expcovariate_mle(params, xb, t, c)
% Negative log-likelihood for the censored exponential survival model
theta = exp(xb*params');                 % rate parameter for each observation
like  = -sum(c.*log(theta) - t.*theta);  % negative log-likelihood to be minimised
grad  = -sum(c./theta - t);              % not used by fminsearch (derivative-free)
hess  = -sum(-c./(theta.^2));            % not used by fminsearch
end
and here is the code for executing the above:
clear all;
warning off all
format long
cd 'E:\work\'

% Read the data and assign the output directly instead of relying on ans
data = xlsread('final_data.xls','Sheet1','A2:Y759');

c       = data(:,8);    % censoring/settlement indicator passed to the likelihood
t       = data(:,1);    % duration times
settled = data(:,8);
def     = data(:,17);
est     = data(:,18);
lia     = data(:,19);
nol     = data(:,20);
pat     = data(:,21);
lap     = data(:,22);
nla     = data(:,23);
censor  = 1 - settled;  % not used in the call below
cons    = ones(758,1);  % intercept column

xb = [cons, def, est, lia, lap, nla];   % design matrix (6 columns)

options = optimset('MaxFunEvals',10000);
params0 = ones(1,6);
params  = fminsearch(@(params) expcovariate_mle(params,xb,t,c), params0, options);
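As a quick sanity check (just a sketch, reusing the variables defined above), I also evaluate the objective once at the starting values to make sure it returns a finite scalar:
% Sanity check: the objective should return a single finite value at the
% starting point, otherwise the optimiser has nothing sensible to work with.
f0 = expcovariate_mle(params0, xb, t, c);
fprintf('Objective at starting values: %g (finite: %d)\n', f0, isfinite(f0));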
Out of interest, I have been comparing how the above works when coded in other software (I use STATA and Limdep). What was quite puzzling for me was that when I have up to 6 columns in the matrix xb (i.e. xb=[ones(758,1),def,est,lia,nol,pat]) the MATLAB code above produces exactly the same results as STATA/Limdep. However, if my xb matrix contains one or two additional variables (i.e. xb=[ones(758,1),def,est,lia,nol,pat,lap,nla]), then MATLAB produces very different parameters from STATA/Limdep. I am puzzled as to why this is, as I have been experimenting with the above incrementally, i.e. starting with 2 columns, then 3 columns, etc. in my matrix xb, but this only produces results similar to STATA/Limdep up to and including 6 columns in xb. When I add further variables, I get very different results from STATA/Limdep.
I wrote similar code in MATLAB to estimate the Weibull Accelerated Failure Time survival model and I have exactly the same problem as above, i.e. the estimated parameters match only up to 6 variables; more than 6 variables produce different parameters from STATA/Limdep.
Has anyone ever experienced this problem before, or am I missing something obvious in the use of the command fminsearch?
As a novice user, I would very much appreciate any help/suggestions/guidance on this please.
Many thanks, Dev

Accepted Answer

Matt J on 20 Nov 2013
Edited: Matt J on 20 Nov 2013
FMINSEARCH doesn't use a very robust minimization algorithm, which is why you're charged extra money for the Optimization Toolbox. FMINSEARCH is rigorous for 1 unknown variable and empirically successful for small numbers of variables. More than 6 unknowns would be pushing it, though.
I'd be interested to see, though, whether things behave better if you rewrite that part of your code as follows:
theta=xb*params';
like = -sum(c.*theta-t.*exp(theta));
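For clarity, the whole objective with that substitution would look something like this (just a sketch, keeping the function and variable names from the question):
function like = expcovariate_mle(params, xb, t, c)
% Same negative log-likelihood, but written in terms of the linear
% predictor xb*params' so that exp() is applied only once.
theta = xb*params';                     % here theta is the log of the rate
like  = -sum(c.*theta - t.*exp(theta)); % negative log-likelihood to minimise
end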
  2 comments
Dev Vencappa on 20 Nov 2013
Thanks for clarifying this Matt.
I tried estimating the parameters in log space as you suggested, but this gives exactly the same parameter values as before (i.e. they are the same as STATA/Limdep up to 6 variables, but different parameter values with 7 or more variables).
I will investigate and try the Optimization Toolbox if it provides a better algorithm.
Thanks and regards, Dev
Matt J on 20 Nov 2013
You could also try playing with the tolerance parameters, e.g. make TolFun smaller. In any case, you should be checking the additional diagnostic outputs of fminsearch,
[x,fval,exitflag,output] = fminsearch(...)
to see whether the algorithm stopped in a healthy manner.
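Something along these lines, for instance (the tolerance values here are only placeholders to experiment with):
% Tighten the stopping tolerances and capture the diagnostic outputs
% (the tolerance values below are illustrative, not recommendations)
options = optimset('MaxFunEvals',1e5, 'MaxIter',1e5, ...
                   'TolFun',1e-10, 'TolX',1e-10);
[params, fval, exitflag, output] = ...
    fminsearch(@(p) expcovariate_mle(p, xb, t, c), params0, options);
if exitflag ~= 1
    disp(output.message)   % why fminsearch stopped
end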


More Answers (1)

Marc on 21 Nov 2013
fminsearch is based on a Nelder-Mead simplex, which does not require knowledge of the Jacobian. For fitting parameters in ODEs or DAEs, I have found it, and my own modified version from Numerical Recipes in Fortran 95, to hold up very well against the solvers in the Optimization and Global Optimization Toolboxes. That said, each method has its strengths. Typically, trivial problems are handled very similarly by each; it's when your needs become more demanding that it really helps to understand the strengths of each.
The good news is that the various functions are all set up very similarly, so it's easy to swap methods, or try them all if we are only talking seconds of run time; see the sketch below.
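For example, assuming the Optimization Toolbox is on the path, swapping the derivative-free simplex for a quasi-Newton solver is essentially a one-line change (a sketch reusing the objective and data from the question):
% Same objective handle, two different solvers (sketch; fminunc assumes
% the Optimization Toolbox is installed)
obj = @(p) expcovariate_mle(p, xb, t, c);

p_simplex = fminsearch(obj, params0, optimset('MaxFunEvals',1e5));
p_quasiN  = fminunc(obj, params0, optimset('LargeScale','off', ...
                                           'MaxFunEvals',1e5));
disp([p_simplex(:), p_quasiN(:)])   % compare the two sets of estimates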
salt to suit
