Using external minimizers

We show how to use an external minimizer to find the minimum of a function and then use iminuit to compute the parameter uncertainties.

We demonstrate this with a maximum-likelihood fit of a normal distribution: the minimization is carried out with scipy.optimize.minimize, and iminuit is then used to compute the parameter uncertainties.

[1]:
from iminuit import Minuit
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize
[2]:
# normally distributed data
rng = np.random.default_rng(1)
x = rng.normal(size=1000)

# negative log-likelihood for a normal distribution
def nll(par):
    return -np.sum(norm(par[0], par[1]).logpdf(x))

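# setting errordef to Minuit.LIKELIHOOD (0.5) tells Minuit that nll is a negative
# log-likelihood, so 1-sigma errors correspond to an increase of 0.5 in nll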
nll.errordef = Minuit.LIKELIHOOD

# minimize nll with scipy.optimize.minimize
result = minimize(nll, np.ones(2))
result
[2]:
      fun: 1405.1044194490182
 hess_inv: array([[ 9.11426039e-04, -4.57424334e-05],
       [-4.57424334e-05,  4.68208132e-04]])
      jac: array([0., 0.])
  message: 'Optimization terminated successfully.'
     nfev: 45
      nit: 12
     njev: 15
   status: 0
  success: True
        x: array([-0.05425323,  0.98626113])
[3]:
# initialize Minuit with the fit result from scipy.optimize.minimize
m = Minuit(nll, result.x)
m.hesse()  # this also works without calling MIGRAD before
[3]:
     Name    Value   Hesse Error   Minos Error-   Minos Error+   Limit-   Limit+   Fixed
 0   x0     -0.054   0.031
 1   x1      0.986   0.022

Covariance matrix:
            x0         x1
 x0   0.000973   6.92e-08
 x1   6.92e-08   0.000486

We can also compute the “Hesse errors” at any point other than the minimum. These cannot be interpreted as parameter uncertainties; they are just numbers related to the second derivative of the cost function at that point.

[4]:
m.values = (1.0, 0.5)
m.hesse()
[4]:
     Name    Value   Hesse Error   Minos Error-   Minos Error+   Limit-   Limit+   Fixed
 0   x0      1.000   0.031
 1   x1      0.500   0.006

Covariance matrix (correlations in parentheses):
                     x0                  x1
 x0            0.000963    0.000169 (0.861)
 x1    0.000169 (0.861)            4.01e-05
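As an illustrative cross-check (not part of the original notebook): with errordef = 0.5, the covariance reported by HESSE should be approximately the inverse of the Hessian of the negative log-likelihood at the current parameter values. The helper numerical_hessian below is a hypothetical name and simply uses symmetric finite differences.

def numerical_hessian(f, p, eps=1e-4):
    """Symmetric finite-difference approximation of the Hessian of f at p (hypothetical helper)."""
    p = np.asarray(p, dtype=float)
    n = len(p)
    h = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            pp = p.copy(); pp[i] += eps; pp[j] += eps
            pm = p.copy(); pm[i] += eps; pm[j] -= eps
            mp = p.copy(); mp[i] -= eps; mp[j] += eps
            mm = p.copy(); mm[i] -= eps; mm[j] -= eps
            h[i, j] = (f(pp) - f(pm) - f(mp) + f(mm)) / (4 * eps**2)
    return h

# compare with the covariance computed by HESSE at the same point (1.0, 0.5)
h = numerical_hessian(nll, [1.0, 0.5])
print(np.linalg.inv(h))
print(m.covariance)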

Likewise, one can run MINOS to obtain MINOS estimates. Note that MINOS can fail if the starting point is not actually a minimum, so here we reset the values to the solution found by scipy.optimize.minimize.
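As a small aside (not in the original notebook), one simple safeguard before reusing the scipy result is to check that the minimizer actually reported convergence:

# hypothetical safeguard: only reuse the scipy result if the minimization converged
if not result.success:
    raise RuntimeError(f"scipy.optimize.minimize did not converge: {result.message}")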

[5]:
m.values = result.x
m.minos()
[5]:
     Name    Value   Hesse Error   Minos Error-   Minos Error+   Limit-   Limit+   Fixed
 0   x0     -0.054   0.031         -0.031          0.031
 1   x1      0.986   0.022         -0.022          0.022

MINOS status:
                      x0                 x1
 Error        -0.031   0.031     -0.022   0.022
 Valid         True    True       True    True
 At Limit      False   False      False   False
 Max FCN       False   False      False   False
 New Min       False   False      False   False

Covariance matrix:
             x0         x1
 x0    0.000973          0
 x1           0   0.000486

We can see that MINOS ran successfully. The Hesse errors were also updated, because MINOS requires HESSE to have run first; HESSE is called automatically in this case.
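To wrap up, here is a small sketch (assuming the current iminuit API for parameters, merrors, errors, and covariance) of how to read the fitted values, HESSE errors, and asymmetric MINOS intervals programmatically:

# read the results programmatically; merrors holds the asymmetric MINOS intervals
for name in m.parameters:
    me = m.merrors[name]
    print(name, m.values[name], m.errors[name], me.lower, me.upper, me.is_valid)

m.covariance  # HESSE covariance matrix, as shown in the table above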