Published on *OpenMx* (http://openmx.psyc.virginia.edu)

By *carey*

Created *12/22/2010 - 13:01*


There *may* be a problem with the optimization under some likelihood functions in OpenMx. Specifically, the gradient and the standard errors in an MxModel's output slots are consistent with minimizing -log(L). The calculatedHessian slot, however, is consistent with minimizing -2log(L). The attached R code provides an illustration.
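The scale mismatch matters because standard errors come from the inverse Hessian. A minimal numeric sketch (in Python rather than R, using a unit-variance normal likelihood; none of this is OpenMx code) shows that the Hessian of -2log(L) is twice the Hessian of -log(L), so standard errors computed from the wrong one are too small by a factor of sqrt(2):

```python
import numpy as np

def neg_loglik(mu, x):
    # -log(L) for N(mu, 1) data, dropping the additive constant
    return 0.5 * np.sum((x - mu) ** 2)

def num_hess(f, mu, h=1e-4):
    # central-difference second derivative of a scalar function of mu
    return (f(mu + h) - 2 * f(mu) + f(mu - h)) / h**2

rng = np.random.default_rng(0)
x = rng.normal(size=100)
mu_hat = x.mean()  # MLE of mu for known unit variance

h1 = num_hess(lambda m: neg_loglik(m, x), mu_hat)      # Hessian of -log(L):  n
h2 = num_hess(lambda m: 2 * neg_loglik(m, x), mu_hat)  # Hessian of -2log(L): 2n

se_right = np.sqrt(1.0 / h1)  # SE from the -log(L) Hessian
se_wrong = np.sqrt(1.0 / h2)  # SE from the -2log(L) Hessian
print(se_right / se_wrong)    # sqrt(2): the -2log(L) Hessian shrinks SEs
```

So if the reported standard errors match -log(L) while calculatedHessian matches -2log(L), the two slots cannot both be on the same scale.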

The potential problem is which function, -log(L) or -2log(L), NPSOL actually minimizes. If it is -log(L), then the optimization itself is fine, but OpenMx is calculating the Hessian incorrectly. If it is -2log(L), however, things *may* get gnarly. The estimated gradient elements will be twice as large as they should be: d(-2log(L))/dθ = 2 · d(-log(L))/dθ. One of the convergence criteria involves the norm of the gradient; if g is the gradient, the squared norm is t(g) %*% g. If the function being minimized is -2log(L), then this quantity will be *four* times its appropriate value. That could lead to convergence problems, particularly with ill-conditioned problems.
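The doubling-and-quadrupling claim can be checked numerically. A small sketch (again Python with a unit-variance normal likelihood, purely illustrative, not OpenMx internals): compute the gradient of -log(L) and of -2log(L) at the same parameter value and compare.

```python
import numpy as np

def neg_loglik(mu, x):
    # -log(L) for N(mu, 1) data, dropping the additive constant
    return 0.5 * np.sum((x - mu) ** 2)

def num_grad(f, mu, h=1e-6):
    # central-difference derivative of a scalar function of mu
    return (f(mu + h) - f(mu - h)) / (2 * h)

rng = np.random.default_rng(1)
x = rng.normal(loc=1.0, scale=1.0, size=50)
mu = 0.8  # evaluate away from the optimum so the gradient is nonzero

g1 = num_grad(lambda m: neg_loglik(m, x), mu)      # gradient of -log(L)
g2 = num_grad(lambda m: 2 * neg_loglik(m, x), mu)  # gradient of -2log(L)

print(g2 / g1)                  # 2: gradient elements double
print((g2 * g2) / (g1 * g1))    # 4: squared gradient norm t(g) %*% g quadruples
```

A gradient-norm test that is tuned for -log(L) would therefore be hit with a value four times too large when -2log(L) is the function being minimized.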

Best,

Greg
