ScipyLSQ#
- class cuqi.solver.ScipyLSQ(func, x0, jacfun='2-point', method='trf', loss='linear', tol=1e-06, maxit=10000.0, **kwargs)#
Wrapper for scipy.optimize.least_squares(). Solves nonlinear least-squares problems with bounds:
\[\min_x F(x) = \frac{1}{2} \sum_{i=0}^{m-1} \rho\left(f_i(x)^2\right)\]
subject to \(lb \leq x \leq ub\).
- Parameters:
func (callable f(x,*args)) – Function to minimize.
x0 (ndarray) – Initial guess.
jacfun (callable f(x,*args) or str, optional) – The Jacobian of func. If not specified, the solver approximates the Jacobian with a finite difference scheme.
loss (str or callable rho(z), optional) – Determines the loss function:
‘linear’ : rho(z) = z. Gives a standard least-squares problem.
‘soft_l1’ : rho(z) = 2*((1 + z)**0.5 - 1). Smooth approximation of l1 (absolute value) loss.
‘huber’ : rho(z) = z if z <= 1 else 2*z**0.5 - 1. Works similarly to ‘soft_l1’.
‘cauchy’ : rho(z) = ln(1 + z). Severely weakens the influence of outliers, but may cause difficulties in the optimization process.
‘arctan’ : rho(z) = arctan(z). Limits the maximum loss on a single residual; has properties similar to ‘cauchy’.
method (str or callable, optional) – Type of solver:
‘trf’ : Trust Region Reflective algorithm; suitable for large sparse problems with bounds.
‘dogbox’ : dogleg algorithm with rectangular trust regions; for small problems with bounds.
‘lm’ : Levenberg-Marquardt algorithm as implemented in MINPACK. Does not handle bounds or sparse Jacobians.
tol (float) – The numerical tolerance for convergence checks.
maxit (int) – The maximum number of iterations.
kwargs – Additional keyword arguments passed to scipy’s least_squares. Empty by default. See the documentation for scipy.optimize.least_squares.
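Since this class wraps scipy.optimize.least_squares, the underlying call can be sketched directly. The example below fits an exponential model to synthetic, noise-free data; the residual function and data are illustrative, not part of the wrapper:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data from y = a * exp(b * t) with a = 2.0, b = -1.5 (illustrative)
t = np.linspace(0, 1, 20)
y = 2.0 * np.exp(-1.5 * t)

def residuals(x):
    # Residual vector f_i(x) whose squared entries enter the objective F(x)
    return x[0] * np.exp(x[1] * t) - y

x0 = np.ones(2)  # initial guess
res = least_squares(residuals, x0, jac='2-point', method='trf', loss='linear')
# res.x should be close to the true parameters [2.0, -1.5]
```

With CUQIpy installed, the same problem would presumably be passed to the wrapper as ScipyLSQ(residuals, x0, jacfun='2-point', method='trf', loss='linear'), with tol and maxit mapping onto scipy's tolerance and iteration-limit settings.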
- __init__(func, x0, jacfun='2-point', method='trf', loss='linear', tol=1e-06, maxit=10000.0, **kwargs)#
Methods