LS#
- class cuqi.solver.LS(func, x0, jacfun=None, method='trf', loss='linear', tol=1e-06, maxit=10000.0)#
Wrapper for scipy.optimize.least_squares(). Solves nonlinear least-squares problems with bounds:
\[\min F(x) = 0.5 \sum_{i=0}^{m-1} \rho\left(f_i(x)^2\right)\]subject to \(lb \leq x \leq ub\).
- Parameters:
func (callable f(x,*args)) – Function to minimize.
x0 (ndarray) – Initial guess.
jacfun (callable f(x,*args), optional) – The Jacobian of func. If None, the solver approximates the Jacobian numerically.
loss (str or callable, optional) – Determines the loss function:
‘linear’: rho(z) = z. Gives a standard least-squares problem.
‘soft_l1’: rho(z) = 2*((1 + z)**0.5 - 1). Smooth approximation of l1 (absolute value) loss.
‘huber’: rho(z) = z if z <= 1 else 2*z**0.5 - 1. Works similarly to ‘soft_l1’.
‘cauchy’: rho(z) = ln(1 + z). Severely weakens the influence of outliers, but may cause difficulties in the optimization process.
‘arctan’: rho(z) = arctan(z). Limits the maximum loss on a single residual; has properties similar to ‘cauchy’.
method (str or callable, optional) – Type of solver. Should be one of:
‘trf’: Trust Region Reflective algorithm, suitable for large sparse problems with bounds.
‘dogbox’: dogleg algorithm with rectangular trust regions, for small problems with bounds.
‘lm’: Levenberg-Marquardt algorithm as implemented in MINPACK. Does not handle bounds or sparse Jacobians.
tol (float, optional) – Tolerance for termination.
maxit (int, optional) – Maximum number of function evaluations.
- __init__(func, x0, jacfun=None, method='trf', loss='linear', tol=1e-06, maxit=10000.0)#
Methods
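Since LS wraps scipy.optimize.least_squares(), its behavior can be illustrated directly with the underlying SciPy call. The sketch below fits a one-parameter exponential decay model to noise-free samples using the default ‘trf’ method and ‘linear’ loss, with a lower bound on the parameter; the data and residual function are illustrative, not part of the cuqi API.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic, noise-free data generated from y = exp(-0.5 * t)
t = np.linspace(0, 4, 20)
y = np.exp(-0.5 * t)

# Residual function f_i(x): model prediction minus data, one entry per sample
def residual(x):
    return np.exp(-x[0] * t) - y

# Trust Region Reflective method with the bound 0 <= x, standard least squares
result = least_squares(residual, x0=np.array([1.0]), method='trf',
                       loss='linear', bounds=(0, np.inf))
print(result.x)  # recovers the decay rate, approximately [0.5]
```

The same problem expressed through this class would pass `residual` as `func` and the starting point as `x0`; when `jacfun` is omitted, the Jacobian is approximated by finite differences.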