Nonlinear Solvers

This section is still missing a lot of content. If you have used or developed Julia packages in this domain, we would love your help! Please visit the "Contributing" section of the repository that hosts this website for information on contributions.

This section is split into two categories: numerical nonlinear solvers and symbolic nonlinear solvers.

Numerical Nonlinear Solvers

The most complete package is NonlinearSolve.jl, which is part of the SciML ecosystem. It takes the role of a meta-package, building on top of other packages that implement the actual algorithms. The benefit is that you can define the problem once and then solve it with a number of different solvers by changing a single argument.
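As a small sketch of that workflow (assuming a recent NonlinearSolve.jl release; the problem here, finding the root of u² − p, is chosen purely for illustration):

```julia
using NonlinearSolve

# Root-finding problem in residual form f(u, p) = 0; here u² - p = 0 with p = 2.
f(u, p) = u .^ 2 .- p
u0 = [1.0]                            # initial guess
prob = NonlinearProblem(f, u0, 2.0)

# The same problem object can be handed to different algorithms.
sol_newton = solve(prob, NewtonRaphson())
sol_tr     = solve(prob, TrustRegion())
```

The problem definition is decoupled from the algorithm choice, so benchmarking several solvers on one problem is a one-line change.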

The JuMP.dev framework provides a simple grammar for defining optimization cost functions, or models. It allows nonlinear models to be defined, which can then be optimized using any of the compatible solvers, such as Ipopt.jl.
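A minimal sketch of that pattern, assuming JuMP 1.x (which accepts nonlinear expressions directly in `@objective`) with Ipopt.jl as the backend; the Rosenbrock-style objective is just an illustration:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x, start = 0.0)
@variable(model, y, start = 0.0)

# A classic nonlinear test objective (Rosenbrock); its minimum is at (1, 1).
@objective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)

optimize!(model)
value(x), value(y)
```

Because the model is declared independently of the solver, `Ipopt.Optimizer` could be swapped for any other compatible optimizer without touching the model definition.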

There are also a number of other packages that provide nonlinear solver algorithms. Several of them are part of the JuliaNLSolvers organization, such as NLsolve.jl and Optim.jl. There are also Roots.jl and SIAMFANLEquations.jl.
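For scalar root-finding, Roots.jl is often the lightest-weight of these; a small sketch using its `find_zero` interface:

```julia
using Roots

# The bracketing interval (1, 2) contains the root of x² - 2, i.e. √2.
root = find_zero(x -> x^2 - 2, (1.0, 2.0))
```

Passing a bracketing interval selects a bisection-style method; a single starting point would instead select a derivative-free iteration.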

Finally, there are a number of packages that specialize in optimizing nonlinear least-squares functions, discussed below.

Nonlinear Least Squares Solvers

Nonlinear Least Squares (NLLS) solvers are a particular class of numerical nonlinear solvers that optimize problems of the form:

$$\begin{aligned}
\operatorname*{arg\,min}_{\mathbf{x}} \quad & \frac12 \sum_i \rho_i\left(\| f_i(\mathbf{x})\|^2\right), \\
\mathrm{s.~to} \quad & c_j(\mathbf{x}) < b_j \quad \forall j, \\
& c_k(\mathbf{x}) = e_k \quad \forall k,
\end{aligned}$$

where

  • $\mathbf{x}$ is the set of variables to be optimized over.

  • $f_i()$ are (potentially multi-dimensional) nonlinear functions of $\mathbf{x}$, whose squared norms are to be minimized.

  • $\rho_i()$ are monotonically increasing robustification functions, that can be used to downweight larger errors.

  • $c_j()$ are linear or nonlinear scalar functions of $\mathbf{x}$ on the output of which bound constraints $b_j$ are placed.

  • $c_k()$ are linear or nonlinear scalar functions of $\mathbf{x}$ on the output of which equality constraints $e_k$ are placed.
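To make the unconstrained, unrobustified case concrete, here is a minimal Gauss–Newton iteration in plain Julia; the exponential model and data are hypothetical, purely for illustration, and real NLLS packages add damping, line searches, and sparsity handling on top of this core step:

```julia
using LinearAlgebra

# Minimise ½‖f(x)‖² by repeatedly solving the linearised normal equations.
function gauss_newton(f, J, x; iters = 20)
    for _ in 1:iters
        x = x - (J(x)' * J(x)) \ (J(x)' * f(x))   # Gauss–Newton step
    end
    return x
end

# Hypothetical example: fit y ≈ a·exp(b·t) to data, with x = [a, b].
t = [0.0, 1.0, 2.0, 3.0]
y = [2.0, 2.7, 3.6, 4.9]
f(x) = x[1] .* exp.(x[2] .* t) .- y                          # residual vector
J(x) = hcat(exp.(x[2] .* t), x[1] .* t .* exp.(x[2] .* t))   # its Jacobian

x_opt = gauss_newton(f, J, [1.0, 0.1])
```

Each step solves a linear least-squares problem in the Jacobian; this is exactly the structure that specialized NLLS solvers exploit for performance.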

A number of packages and solvers exist for this class of problem; several of them appear in the feature comparison below.

More general nonlinear solvers, such as those listed above, can also often be used to optimize NLLS problems.

However, the more specialized packages tend to offer better performance.

Feature comparison

Different packages and solvers offer different features. Here's a summary of the important ones:

| | Ipopt | tron | trunk | CaNNOLeS.jl | NLLSsolver.jl | LeastSquaresOptim.jl |
| --- | --- | --- | --- | --- | --- | --- |
| Registered | | | | | | |
| Uses JuMP model definition | | | | | | |
| Bound constraints | | | | | | |
| Equality constraints | | | | | | |
| Non-linear constraints | | | | | | |
| Robustified cost functions | | | | | | |
| Non-Euclidean variables | | | | | | |
| Dense auto-differentiation | | | | | | |
| Supports sparsity | | | | | | |
| Sparse auto-differentiation | | | | | | |
  • Registered: The solver can be installed automatically using Julia's package manager.

  • Uses JuMP model definition: The JuMP.dev framework provides a simple grammar for defining optimization cost functions, or models. Some solvers support these models.

  • Bound constraints: Solvers can require some function output to be above or below a certain value.

  • Equality constraints: Solvers can require some function output to equal a certain value.

  • Non-linear constraints: Functions used in constraints can be some nonlinear function of the variables.

  • Robustified cost functions: A scalar, monotonic function $\rho : \mathbb{R}^+ \rightarrow \mathbb{R}^+$ can be used to downweight larger errors.

  • Non-Euclidean variables: Variables can exist on a non-linear manifold in a higher dimensional space, e.g. 3D rotations represented by a 9-element 3x3 matrix.

  • Dense auto-differentiation: The solver supports auto-differentiation of a dense Jacobian of the cost function.

  • Supports sparsity: The solver can exploit sparsity within the Jacobian to optimize very large, sparse problems.

  • Sparse auto-differentiation: The solver supports auto-differentiation of a sparse Jacobian of the cost function.
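As a concrete (hypothetical) example of such a robustification function, here is the Huber robustifier in plain Julia, written to match the $\rho : \mathbb{R}^+ \rightarrow \mathbb{R}^+$ convention above rather than any particular package's API:

```julia
# Huber robustifier ρ applied to a squared residual norm s = ‖f(x)‖².
# Below the threshold δ² the cost is left quadratic in ‖f(x)‖; above it,
# growth is only linear in ‖f(x)‖, so outliers are downweighted.
huber(s; δ = 1.0) = s <= δ^2 ? s : 2δ * sqrt(s) - δ^2
```

The two branches agree at $s = \delta^2$ (both give $\delta^2$), so the function is continuous and monotonically increasing, as the definition above requires.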

Performance evaluation

Different solvers provide different performance on different problems, so any evaluation is subjective. Here, performance is evaluated on unconstrained, unrobustified problems, some small and dense, others larger and sparse. Only solvers able to optimize all problems are included. Performance is evaluated by the time taken to optimize the cost function. This script was used to evaluate the algorithms, on an Apple M1 Pro CPU. Except where timings are omitted, solvers converged to the global optimum.

Timing results are plotted for two sets of benchmarks: small, dense problems, and medium sized, sparse problems.

Symbolic Nonlinear Solvers

A nonlinear symbolic problem could, for example, be to solve $x^2 = 4$ for $x$. The current state of symbolic nonlinear solving in native Julia is unfortunately quite poor. There is an open issue for Symbolics.jl to add such functionality, but for now, the best option is to use SymPy.jl.
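A small sketch with SymPy.jl, which wraps Python's SymPy (assuming its `@syms`/`solve` interface):

```julia
using SymPy

@syms x
sols = solve(x^2 - 4, x)   # symbolic roots of x² - 4
```

`solve` returns symbolic expressions, so the roots come back exactly rather than as floating-point approximations.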


Last modified: May 03, 2024. Built with Franklin.jl