mlad - maximizing likelihood functions using automatic differentiation
mlad maximizes a log-likelihood function that is programmed in Python. This enables the gradient and Hessian matrix to be obtained using automatic differentiation, and it makes better use of multiple CPUs. With large datasets, mlad tends to be substantially faster than ml, and it has the important advantage that you do not have to derive the gradient and the Hessian matrix analytically.
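To illustrate the idea, here is a minimal sketch (not mlad's actual API) of how jax can supply exact gradients and Hessians for a log-likelihood written in Python. The Weibull log-likelihood below and the small simulated dataset are illustrative assumptions of mine, not taken from mlad itself.

```python
import jax
import jax.numpy as jnp

def loglik(params, t, d):
    # params = [log(lambda), log(gamma)]; t = survival times, d = event indicator
    lam, gam = jnp.exp(params[0]), jnp.exp(params[1])
    logh = jnp.log(lam) + jnp.log(gam) + (gam - 1.0) * jnp.log(t)  # log hazard
    H = lam * t ** gam                                             # cumulative hazard
    return jnp.sum(d * logh - H)

grad_fn = jax.grad(loglik)     # exact gradient via automatic differentiation
hess_fn = jax.hessian(loglik)  # exact Hessian, no analytic derivation needed

t = jnp.array([1.2, 0.7, 3.4, 2.1])
d = jnp.array([1.0, 0.0, 1.0, 1.0])
params = jnp.zeros(2)
print(grad_fn(params, t, d))   # 2-vector of first derivatives
print(hess_fn(params, t, d))   # 2x2 matrix of second derivatives
```

Because jax traces the Python function itself, changing the likelihood requires no new derivative code, which is the advantage mlad exploits over ml's analytic-derivative evaluators.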
You can install mlad within Stata using
. ssc install mlad
You will also need access to Python from Stata and the following Python modules installed: jax, numpy, scipy, and importlib.
Please note that I have only tested with the CPU-only version of jax. mlad previously only worked under Linux, but it is now possible to install jax on Windows.
You can find installation details for jax on the JAX GitHub page.
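A quick way to confirm the required modules are available (run from Python, or via Stata's `python:` block) is a sketch like the following; the `__version__` attributes are standard for these packages, and the script simply reports anything missing rather than raising.

```python
import importlib

required = ("jax", "numpy", "scipy")
missing = []
for mod in required:
    try:
        m = importlib.import_module(mod)
        # report the installed version, or "?" if the module has no __version__
        print(mod, getattr(m, "__version__", "?"))
    except ImportError:
        missing.append(mod)

if missing:
    print("missing modules:", ", ".join(missing))
```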
Examples of using mlad
I have developed some tutorial examples using mlad. These include speed tests; of course, speed depends on the capabilities of your computer. All speed tests are performed on the following setup.
- AMD Ryzen 9 5900X - 12 Cores (2 threads per core)
- CPU speed 4800 MHz
- RAM 32 GB
- Running Windows 10
I have Stata/MP (2 cores), but I restrict to 1 processor for most examples.
The examples are listed below; I intend to add more in the future.
- A Weibull model - a first example
- Interval censoring
- Cure models
- Splines for the log hazard function
- Splines for the log hazard function - using pysetup()
- Flexible parametric model with random effects
- Poisson regression (post back estimates to