Optimizers
http://openmx.psyc.virginia.edu/taxonomy/term/4/0
What is the numerical approximation method used to find the maximum likelihood estimator from the log-likelihood function in OpenMx?
http://openmx.psyc.virginia.edu/thread/1375
<p>1.<br />
What is the numerical approximation method used to find the maximum likelihood estimator from the log-likelihood function in OpenMx?</p>
<p>2. How does OpenMx deal with multiple parameters when maximizing the log-likelihood function? For example, does it use profile likelihood or partial derivatives (this may not be the case)?</p>
<p>3. Could someone please point me to a document describing how OpenMx maximizes the log-likelihood function to find the MLE?</p>
<p>Thank you so much.</p>
<p>Arnond</p>
http://openmx.psyc.virginia.edu/thread/1375#comments | Optimizers | Sat, 28 Apr 2012 06:32:29 +0000 | Arnond Sakworawich | 1375 at http://openmx.psyc.virginia.edu

Starting values
http://openmx.psyc.virginia.edu/thread/1345
<p>Hi all!</p>
<p>When reading through the optimization innards of OpenMx, I came across this piece of code:</p>
<p>in npsolWrap.c, l. 356:<br />
<div class="geshifilter"><pre class="rsplus geshifilter-rsplus" style="font-family:monospace;">if ((x[k] == 0.0) && !disableOptimizer) {
    x[k] += 0.1;
}</pre></div></p>
<p>If I am not mistaken, the array x holds the starting values at that point. I presume this if-clause is included to avoid some "bad" starting conditions for the model-implied covariance matrix. However, shouldn't users be informed or warned that the model is actually fitted with different starting values than those they specified?</p>
<p>best regards,<br />
Andreas</p>
http://openmx.psyc.virginia.edu/thread/1345#comments | Optimizers | Tue, 10 Apr 2012 07:41:23 +0000 | brandmaier | 1345 at http://openmx.psyc.virginia.edu

Why are MLEs in a CFA model different between analyses of raw data and covariance matrix data?
http://openmx.psyc.virginia.edu/thread/1266
<p>I have conducted a large-scale numerical simulation study to examine the<br />
small-sample properties of SEM, and I have the following question:<br />
Could you explain why the MLEs in a CFA model differ between analyses of<br />
raw data and covariance matrix data in the OpenMx package for R? Are the<br />
optimization methods employed different?<br />
The maximum difference I encountered is 0.084267.</p>
<p>When comparing the results, I adjusted their scales, i.e., multiplied the<br />
factor loadings by sqrt(N/(N-1)) and the error variances by N/(N-1).<br />
This adjustment brings the two sets of estimates close to each other.</p>
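<p>The scale factor described above comes from the two covariance divisors: raw-data (FIML) estimation implicitly uses the ML covariance with divisor N, while a covariance matrix computed with R's cov() uses divisor N-1. A minimal illustrative sketch of the relationship (the data here are random and purely hypothetical):</p>

```r
# Illustrative sketch: the unbiased (divisor N-1) and ML (divisor N)
# covariance matrices differ by a constant factor, which propagates to
# roughly sqrt(N/(N-1)) for loadings and N/(N-1) for variances.
set.seed(1)
N <- 200
X <- matrix(rnorm(N * 3), N, 3)
S_unbiased <- cov(X)                       # divisor N - 1 (what cov() returns)
S_ml       <- S_unbiased * (N - 1) / N     # divisor N (ML form)
all.equal(S_unbiased, S_ml * N / (N - 1))  # TRUE
```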
<p>The code and sample covariance matrix are in the attachment.<br />
I would be grateful if someone could help me.</p>
<table id="attachments" class="sticky-enabled">
<thead><tr><th>Attachment</th><th>Size</th> </tr></thead>
<tbody>
<tr class="odd"><td><a href="http://openmx.psyc.virginia.edu/sites/default/files/source.R">source.R</a></td><td>2.23 KB</td> </tr>
<tr class="even"><td><a href="http://openmx.psyc.virginia.edu/sites/default/files/sample_covariance_matrix.txt">sample_covariance_matrix.txt</a></td><td>4.43 KB</td> </tr>
</tbody>
</table>
http://openmx.psyc.virginia.edu/thread/1266#comments | Optimizers | Thu, 23 Feb 2012 00:37:45 +0000 | Ami | 1266 at http://openmx.psyc.virginia.edu

convergence status OK but calculated Hessian with negative eigenvalue
http://openmx.psyc.virginia.edu/thread/781
<p>Hello,</p>
<p>I just found in a run that the calculated Hessian has a negative eigenvalue (which also results in NAs for some standard errors), and some of the gradients seem large. However, the convergence code is 0, so no error or warning is displayed. Should a warning be displayed in this case?</p>
<p>I tried another set of starting values and the problem went away, with the objective function decreasing by about 0.28, indicating that the first run did not reach the minimum.</p>
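<p>A check of this kind can be scripted by the user. A hedged sketch, assuming OpenMx 1.x, where the optimizer's Hessian is stored in the fitted model's output slot and <code>fit</code> is a hypothetical mxRun() result:</p>

```r
# Sketch: warn if the calculated Hessian is not positive definite.
# 'fit' is assumed to be the object returned by mxRun().
H  <- fit@output$calculatedHessian
ev <- eigen(H, symmetric = TRUE, only.values = TRUE)$values
if (any(ev < 0))
    warning("calculated Hessian has a negative eigenvalue; ",
            "the solution may not be a local minimum")
```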
<p>The code and data are in the attachment.</p>
<p>Thanks.</p>
<p>- Hao</p>
<table id="attachments" class="sticky-enabled">
<thead><tr><th>Attachment</th><th>Size</th> </tr></thead>
<tbody>
<tr class="odd"><td><a href="http://openmx.psyc.virginia.edu/sites/default/files/code.R">code.R</a></td><td>1.84 KB</td> </tr>
<tr class="even"><td><a href="http://openmx.psyc.virginia.edu/sites/default/files/Data.dat">Data.dat</a></td><td>6.6 KB</td> </tr>
</tbody>
</table>
http://openmx.psyc.virginia.edu/thread/781#comments | Optimizers | Tue, 11 Jan 2011 22:51:13 +0000 | wuhao_osu | 781 at http://openmx.psyc.virginia.edu

SEM without free parameters troubles OpenMx (and myself)
http://openmx.psyc.virginia.edu/thread/737
<p>Hello!</p>
<p>Today, I felt like fixing all my free parameters and calling mxRun() on my model. I expected the optimizer to converge instantly, or rather not to be invoked at all. Instead, the backend seems to run into an infinite loop somehow. Maybe the backend should check for the condition of no free parameters in the model.</p>
<p>I did this (admittedly a rather strange setting) because I intended to evaluate the -2LL of dataset "A" on a model that I had previously fitted to dataset "B". I thought that if I fixed all parameters, set the dataset to "A", and called mxRun(), I would get the desired -2LL of "A" under the model estimated from "B". So, I wonder what you suggest as the standard way to obtain the likelihood of a different dataset given an estimated model.</p>
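<p>One way this workflow might be scripted (a hedged sketch, not an official recipe; it assumes <code>fitB</code> is the mxModel already fitted to dataset "B", <code>dataA</code> is a hypothetical data frame holding dataset "A" with matching variables, and all free parameters carry labels):</p>

```r
# Fix every free parameter at its estimate from "B", swap in dataset "A",
# and rerun to obtain the -2LL of "A" under B's estimates.
est   <- omxGetParameters(fitB)                    # estimates from "B"
fixed <- omxSetParameters(fitB, labels = names(est),
                          values = est, free = FALSE)
fixed <- mxModel(fixed, mxData(dataA, type = "raw"))
fitA  <- mxRun(fixed)            # nothing left to optimize
fitA@output$minimum              # -2LL of "A" under B's estimates
```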
<p>best,<br />
Andreas</p>
http://openmx.psyc.virginia.edu/thread/737#comments | Optimizers | Wed, 10 Nov 2010 11:20:44 +0000 | brandmaier | 737 at http://openmx.psyc.virginia.edu

Referencing Data in an Algebra for mxAlgebraObjective
http://openmx.psyc.virginia.edu/thread/648
<p>I'm trying to spec out a least-squares-based optimizer using mxAlgebraObjective, but I've been unable to include data in an algebra. Is that possible/feasible, or do I need to move to mxRObjective?</p>
http://openmx.psyc.virginia.edu/thread/648#comments | Optimizers | Mon, 23 Aug 2010 16:25:38 +0000 | Ryne | 648 at http://openmx.psyc.virginia.edu

Repeatability across platforms
http://openmx.psyc.virginia.edu/thread/272
<p>I have run into an interesting problem. A model (which is admittedly of low stability) that I used in class today ran on the Mac (with warnings, but reasonably close estimates) but not on the PC (a non-invertible matrix error). Both are Intel processors, both running R 2.9.2 and OpenMx 0.2.2. The model is of simulated data with boundary constraints on residuals because the variance explained is very high.</p>
<p>I attach the script and data as a test case for the limits of similarities between architectures.</p>
<table id="attachments" class="sticky-enabled">
<thead><tr><th>Attachment</th><th>Size</th> </tr></thead>
<tbody>
<tr class="odd"><td><a href="http://openmx.psyc.virginia.edu/sites/default/files/LDEUnivariateExample091116_0.R">LDEUnivariateExample091116.R</a></td><td>2.89 KB</td> </tr>
<tr class="even"><td><a href="http://openmx.psyc.virginia.edu/sites/default/files/LDEUnivariateExample091116_0.txt">LDEUnivariateExample091116.txt</a></td><td>2.88 KB</td> </tr>
</tbody>
</table>
http://openmx.psyc.virginia.edu/thread/272#comments | Optimizers | Mon, 16 Nov 2009 17:22:34 +0000 | Steve | 272 at http://openmx.psyc.virginia.edu

crash with raw data and mxMLObjective
http://openmx.psyc.virginia.edu/thread/137
<p>Just got this crash running a FIML version of the front-page factor model without updating the objective function to FIML:</p>
<div class="geshifilter"><pre class="rsplus geshifilter-rsplus" style="font-family:monospace;">> data(demoOneFactor)
> factorModel <- mxModel("One Factor",
+     mxMatrix("Full", 5, 1, values=0.2, free=T, name="A"),
+     mxMatrix("Symm", 1, 1, values=1, free=F, name="L"),
+     mxMatrix("Diag", 5, 5, values=1, free=T, name="U"),
+     mxAlgebra(A %*% L %*% t(A) + U, name="R",
+         dimnames = list(names(demoOneFactor), names(demoOneFactor))),
+     mxMLObjective("R"),
+     mxData(demoOneFactor, type="raw"))
> summary(mxRun(factorModel))
Running One Factor</pre></div>
<div class="geshifilter"><pre class="rsplus geshifilter-rsplus" style="font-family:monospace;"> *** caught segfault ***
address 0x15c0bdf0, cause 'memory not mapped'

Traceback:
 1: .Call("callNPSOL", objective, startVals, constraints, matrices,
        parameters, algebras, data, options, state, PACKAGE = "OpenMx")
 2: mxRun(factorModel)</pre></div>
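<p>For reference, the raw-data version of this model would normally use mxFIMLObjective, which also requires an expected means matrix. A sketch under the OpenMx 1.x API (the means matrix "M" is an addition not present in the original post):</p>

```r
library(OpenMx)
data(demoOneFactor)
factorModel <- mxModel("One Factor",
    mxMatrix("Full", 5, 1, values=0.2, free=TRUE, name="A"),
    mxMatrix("Symm", 1, 1, values=1, free=FALSE, name="L"),
    mxMatrix("Diag", 5, 5, values=1, free=TRUE, name="U"),
    mxMatrix("Full", 1, 5, values=0, free=TRUE, name="M"),  # expected means
    mxAlgebra(A %*% L %*% t(A) + U, name="R"),
    mxFIMLObjective(covariance="R", means="M",
                    dimnames=names(demoOneFactor)),
    mxData(demoOneFactor, type="raw"))
fit <- mxRun(factorModel)
```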
http://openmx.psyc.virginia.edu/thread/137#comments | Optimizers | Fri, 04 Sep 2009 16:03:31 +0000 | tbates | 137 at http://openmx.psyc.virginia.edu

Crashing R in OS X
http://openmx.psyc.virginia.edu/thread/103
<p>Here's an interesting one...</p>
<p>So I've recently been working on converting an Mx script to OpenMx (file attached). I might be doing something I shouldn't, but the script caused R on my Mac to close without explanation or warning. Everything goes fine until I hit mxRun(). I ran the same script under Windows XP and got the following error:</p>
<div class="geshifilter"><pre class="rsplus geshifilter-rsplus" style="font-family:monospace;">> EstModel <- mxRun(CoupledModel)
Running CoupledModel
Error in mxRun(CoupledModel) :
  BLAS/LAPACK routine 'DGEMM ' gave error code -13</pre></div>
<p>So the first problem is this error code, the second is that R did not give this warning in OS X but rather terminated the program. I used source() to get OpenMx, so I'm still on version 708. </p>
<table id="attachments" class="sticky-enabled">
<thead><tr><th>Attachment</th><th>Size</th> </tr></thead>
<tbody>
<tr class="odd"><td><a href="http://openmx.psyc.virginia.edu/sites/default/files/CoupledOscillator_0.R">CoupledOscillator.R</a></td><td>3.71 KB</td> </tr>
</tbody>
</table>
http://openmx.psyc.virginia.edu/thread/103#comments | Optimizers | Sun, 23 Aug 2009 14:24:36 +0000 | pdeboeck | 103 at http://openmx.psyc.virginia.edu

likelihood statistic returned from npsol
http://openmx.psyc.virginia.edu/thread/78
<p>I am following up here on the discussion previously started on the email list. After perusing the code and the NPSOL manual, I'm still unsure whether the raw likelihood or the -2 log-likelihood is being returned from the optimizer. Does anyone know where I could find this information, or how I might reasonably search for it in the code?</p>
http://openmx.psyc.virginia.edu/thread/78#comments | Optimizers | Fri, 14 Aug 2009 17:51:31 +0000 | skenny | 78 at http://openmx.psyc.virginia.edu

shared memory space between R & OpenMx AND between R & xxm
http://openmx.psyc.virginia.edu/thread/50-shared-memory-space-between-r-openmx-and-between-r-xxm
<p>Posted on behalf of Paras:</p>
<p>Hi,</p>
<p>I am hoping that the c gurus can help me understand how OpenMx matrices<br />
are stored, updated and passed around.</p>
<p>Here is the situation:<br />
We are trying to design data structures for the block-sparse BLAS routines<br />
and are struggling with the issue of ownership of "double" matrices. Our<br />
key data structure is called the "blockSparseMatrix". For example, there<br />
may be a huge factor-loading matrix where each dense sub-block would<br />
map onto one or more OpenMx parameter matrices. In the ordinary<br />
factor model, each dense sub-block would point to the single<br />
factor-loading matrix. Alternatively, in the growth-curve case with<br />
definition variables, each dense block would point to a person-specific<br />
factor-loading matrix.</p>
<p>The struct blockSparseMatrix has a field **dns or a "pointer to an array<br />
of doubles representing the dense matrix". From a memory and efficiency<br />
perspective, it makes sense for OpenMx to create, update and destroy the<br />
memory for dense-matrices. Pointer to the dense matrix would be passed<br />
to the backend at the beginning and would be retained throughout the<br />
estimation process as a member of the blockSparseMatrix structure.</p>
<p>From an OO perspective, having the main application write directly to a<br />
struct-member seems to be problematic. However, I see no alternative to<br />
doing so -- without copying huge amounts of data at each iteration.</p>
<p>From OpenMx's design perspective and how you envision development of<br />
other plug-ins, what approach would you recommend?</p>
<p>How does OpenMx deal with R? If I understand correctly, R insists that<br />
all data be "copied" between R and external-packages, unless the access<br />
is read-only. Is this what happens at each iteration?</p>
http://openmx.psyc.virginia.edu/thread/50-shared-memory-space-between-r-openmx-and-between-r-xxm#comments | Optimizers | Wed, 05 Aug 2009 22:07:04 +0000 | mspiegel | 50 at http://openmx.psyc.virginia.edu

Schedule for categorical outcome estimation
http://openmx.psyc.virginia.edu/thread/45-schedule-categorical-outcome-estimation
<p>Now that the closed beta is released, we can get back to categorical outcome estimation. What is the status and expected completion date for the backend? Are there front-end issues that need to be dealt with?</p>
http://openmx.psyc.virginia.edu/thread/45-schedule-categorical-outcome-estimation#comments | Optimizers | Wed, 05 Aug 2009 13:44:40 +0000 | Steve | 45 at http://openmx.psyc.virginia.edu