Error when fitting latent growth curve using ordinal data and sex definition vars

ngillespie (Joined: 09/24/2009)

Hi all,
I'm getting the following error when I ask for summary output after running a latent growth curve with ordinal data and sex definition variables:

> lgcOrdACESumm <- summary(lgcOrdACEFit)
Error in omxConvertLabel(refNames[j], modelname, dataname, namespace) :
trying to get slot "name" from an object of a basic class ("NULL") with no slots

Cheers,

Nathan

Attachments:
LongitudinalTwinAnalysis_MatrixRawOrd.R (6.9 KB)
jepq_ord.txt (162.3 KB)
mspiegel (Joined: 07/31/2009)

My bad. We hadn't written test cases for models that include grandchildren. I've fixed two related bugs in the library to get the script working. To get the script running under OpenMx 0.2.9, here's a small rewrite so that the script contains five child models instead of three child models and two grandchild models.

Attachment: LongitudinalTwinAnalysis_MatrixRawOrd.R (6.94 KB)
ngillespie (Joined: 09/24/2009)

Hi Tim (Brick),
Did you ever get to the bottom of the identical -2LLs in OpenMx and classic Mx, each with different parameter estimates? Just to recap: I wrote a latent growth curve script for ordinal data with sex definition variables on the latent intercept and slope means.
Cheers,
Nathan

tbrick (Joined: 07/31/2009)

Nathan,

I haven't found a specific error or misspecification in the script, but my guess is that the model is not uniquely identified. If you recall, OpenMx agrees with classic Mx about the likelihoods at both sets of estimates. This implies that there is a wide, flat region at the minimum of the likelihood surface--there is no single correct answer, but a set of answers with identical (and minimal) likelihoods.

I'll dig into it again and see if I can figure out why this is the case, and whether there is a misspecification in the code or an issue with OpenMx. I'll get back to you if I find anything.

I'd recommend posting a problem description and the latest version of the code, in case somebody else wants to take a look at it, too.

carey (Joined: 10/19/2009)

tim is correct.
the model is not identified. the attached code fits the same model from different start values and creates the object lgcOrdACEFit1. note how
lgcOrdACEFit@output$minimum - lgcOrdACEFit1@output$minimum
is essentially zero (the function values agree), yet
lgcOrdACEFit@output$estimate - lgcOrdACEFit1@output$estimate
shows that the parameter estimates differ.

hate to sound like a broken record (http://openmx.psyc.virginia.edu/thread/428), but the issue could have been avoided by
eigen(lgcOrdACEFit@output$calculatedHessian, TRUE, only.values=TRUE)
or
eigen(lgcOrdACEFit1@output$calculatedHessian, TRUE, only.values=TRUE)
both of which clearly show that the calculated Hessian is not positive definite. again, i strongly suggest adding a warning to the user when this is the case. it would have saved some time for nathan.
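to make the check concrete, here is a toy sketch in base R (no OpenMx objects needed; the 2x2 matrix below is just a made-up stand-in for a calculated Hessian, singular because two parameters enter the likelihood only through their sum):

```r
# stand-in "Hessian": rank 1, as from a model that is not identified
hess <- matrix(c(2, 2,
                 2, 2), nrow = 2, byrow = TRUE)

# same call pattern as above: second argument TRUE means symmetric
ev <- eigen(hess, TRUE, only.values = TRUE)$values
ev                 # 4 and (numerically) 0
any(ev <= 1e-8)    # TRUE: not positive definite, so check identification
```

a positive definite hessian has strictly positive eigenvalues; a zero (or negative) eigenvalue flags a flat direction in the likelihood surface.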

greg

Attachment: LongitudinalTwinAnalysis_MatrixRawOrd_1.R (6.98 KB)
mspiegel (Joined: 07/31/2009)

An error message for underidentified models has been added to the OpenMx 1.0 feature set: http://openmx.psyc.virginia.edu/thread/449

carey (Joined: 10/19/2009)

caution here!

a WARNING is needed, not an ERROR. one can get negative eigenvalues in an identified model with high multicollinearity simply because of numerical error in the calculation of the Hessian. all of the canned routines (e.g., PROC MIXED in SAS) print a warning and not an error.

the solution is to try different starting values. if one gets the same (or very similar) function values but different parameter estimates, then the model is not identified.
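to see the diagnostic in miniature, here is a base R sketch with optim() (a toy objective, not the actual twin model): only the sum p[1] + p[2] is identified, so different starts reach the same minimum with different estimates.

```r
# toy objective in which only p[1] + p[2] is determined by the data
f <- function(p) (p[1] + p[2] - 3)^2

fit1 <- optim(c(0, 0), f)     # one set of starting values
fit2 <- optim(c(10, -5), f)   # a very different set

fit1$value - fit2$value       # essentially zero: same function value
fit1$par - fit2$par           # clearly nonzero: different estimates
```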

greg

mspiegel (Joined: 07/31/2009)

Ah, OK, a warning can be thrown instead of an error. From a developer's point of view, I will note that users often ignore warnings. They are ignored either deliberately (a user reads the warning, doesn't understand what it means, and just accepts the final output as correct) or less deliberately (say, when 1000 models are executing simultaneously and a tenth of them generate warnings, or when one model generates 100 warnings, and the user decides to ignore them all).

neale (Joined: 07/31/2009)

I agree that a warning is necessary in this case, and accept that users may ignore warnings at their peril. While we should do as much as possible to insulate them from danger, we cannot make optimization an exact science, and therefore cannot protect users from every possible problem. In this case, the user needs the output (i.e., a warning, not an error) in order to decide whether there really is a problem, as Greg noted.

Steve (Joined: 07/30/2009)

Agreed.

ngillespie (Joined: 09/24/2009)

Thanks for this. This was very helpful. N