[R] Optimization of large nonlinear models

Ruben Roa Ureta ruben.roa.ureta at mail.com
Wed Dec 24 11:50:32 CET 2025


Dear R experts,

I am running customized versions of nonlinear models in my package CatDyn.
These are models with 140 parameters to estimate and composite likelihoods built from mixtures of an adjusted profile normal, an adjusted profile lognormal, and a robust version of the lognormal.
There are 3^6 = 729 composite likelihoods, because each of the 6 agents producing the data for the model can be assigned one of the 3 core likelihoods.
The numerical methods I'm using are CG and spg, as these have worked best for these models in other, smaller optimization problems within the same family of models in CatDyn.
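One general observation for gradient-based methods like CG and spg with 140 parameters: if no gradient function is supplied, the gradient is approximated by finite differences, which costs on the order of 2 x 140 likelihood evaluations per gradient, and that often dominates the run time. Below is a minimal sketch of passing an analytic gradient to `optim(method = "CG")`; the objective is an illustrative quadratic, not CatDyn's likelihood.

```r
## Minimal sketch: supplying an analytic gradient to optim(method = "CG").
## The quadratic objective below is a stand-in, NOT CatDyn's likelihood.
p <- 140
set.seed(1)
target <- rnorm(p)

fn <- function(x) sum((x - target)^2)   # objective: squared distance
gr <- function(x) 2 * (x - target)      # analytic gradient

## With gr supplied, CG avoids ~2*p finite-difference evaluations per step.
fit <- optim(par = rep(0, p), fn = fn, gr = gr,
             method = "CG", control = list(maxit = 1000))
fit$convergence   # 0 indicates successful convergence
```

If an analytic gradient is impractical for the composite likelihoods, automatic differentiation (e.g. via the TMB package) or at least a vectorized, byte-compiled likelihood function usually helps for problems of this size.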

My motivation for this message is that each of the 3^6 composite likelihoods is taking days to optimize on an Ubuntu 24.04 machine with an AMD Ryzen 7 8700G (16 threads, Radeon 780M graphics) and 128 GB of RAM.
I was expecting much faster optimization with 128 GB of RAM.

Some of you may have experience running large nonlinear optimization problems in R.
Is there any advice on how to speed up these rather large optimization problems in R,
either in software, hardware, or both?
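Since the 729 composite-likelihood fits are independent of each other, one option is to run them in parallel across the 16 hardware threads with the base `parallel` package rather than sequentially. A sketch, where `fit_one()` is a hypothetical wrapper around the CatDyn call for one likelihood combination, and the likelihood labels are placeholders:

```r
## Sketch: running the 3^6 = 729 independent composite-likelihood fits in
## parallel with the base 'parallel' package (fork-based on Linux).
library(parallel)

## All assignments of 3 core likelihoods (placeholder labels) to 6 agents.
combos <- expand.grid(rep(list(c("apnormal", "aplnormal", "roblnormal")), 6),
                      stringsAsFactors = FALSE)
nrow(combos)   # 729

## Hypothetical wrapper: fit the model for one likelihood combination.
fit_one <- function(combo) {
  ## ... call the customized CatDyn optimizer for this combo here ...
  combo   # return the fitted object in real use
}

## mclapply forks workers; 14-16 suits a 16-thread machine, leaving
## headroom for the OS. Each fit is still serial, but 14+ run at once.
fits <- mclapply(seq_len(nrow(combos)),
                 function(i) fit_one(unlist(combos[i, ])),
                 mc.cores = 14)
```

This does not make any single fit faster, but it cuts the wall-clock time for the full set of 729 fits by roughly the number of workers, and it uses the CPU rather than the RAM, which is likely the actual bottleneck here.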

I apologize in advance if you consider this not a proper question for the mailing list.

Ruben
---
Ruben H. Roa-Ureta, Ph. D.
Consultant in Statistical Modeling
ORCID ID 0000-0002-9620-5224
