Sum of squares decompositions

YALMIP has a built-in module for sum of squares calculations. In its most basic formulation, a sum of squares (SOS) problem takes a polynomial p(x) and tries to find a real-valued polynomial vector function h(x) such that p(x) = h^T(x)h(x) (or equivalently p(x) = v^T(x)Qv(x), where Q is positive semidefinite and v(x) is a vector of monomials), hence proving non-negativity of the polynomial p(x). Read more about standard sum of squares decompositions in [Parrilo]. In addition to standard SOS decompositions, YALMIP also supports linearly and nonlinearly parameterized problems, decomposition of matrix-valued polynomials, and computation of low-rank decompositions. The examples below only introduce the basic features related to sum-of-squares in YALMIP. For more features and details, run the example sosex.m.
Doing it yourself the hard way

Before we introduce the efficiently implemented SOS module in YALMIP, let us briefly mention how one could implement an SOS solver in high-level YALMIP code. Of course, you would never use this approach in practice, but it might be useful to see the basic idea. Define a polynomial.
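For instance (the particular polynomial below is our own choice for illustration; any polynomial suspected to be SOS works):

```matlab
% Two scalar decision variables and an example polynomial
x = sdpvar(1,1);
y = sdpvar(1,1);
p = (1+x)^4 + (1-y)^2;   % example polynomial (assumed here)
```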
Introduce a decomposition.
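A candidate decomposition p(x) = v^T(x)Qv(x) can be set up with a monomial basis of half the degree of p; a sketch (continuing with the example polynomial above):

```matlab
sdpvar x y
p = (1+x)^4 + (1-y)^2;              % example polynomial (assumed)
% Monomial basis up to half the degree of p, and a symmetric Gramian Q
v = monolist([x y], degree(p)/2);
Q = sdpvar(length(v));              % symmetric matrix variable
p_sos = v'*Q*v;                     % candidate SOS representation
```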
The polynomials have to match, hence all coefficients in the difference of the two polynomials have to be zero.
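A sketch of the resulting feasibility problem, using YALMIP's coefficients command to extract and match the coefficients:

```matlab
sdpvar x y
p = (1+x)^4 + (1-y)^2;              % example polynomial (assumed)
v = monolist([x y], degree(p)/2);
Q = sdpvar(length(v));
% All coefficients of p - v'*Q*v must vanish, and Q must be PSD
F = [coefficients(p - v'*Q*v, [x y]) == 0, Q >= 0];
optimize(F);
```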
The problem above is typically more efficiently solved when interpreted in primal form, hence we dualize it.
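Using YALMIP's automatic dualization, this might look like:

```matlab
sdpvar x y
p = (1+x)^4 + (1-y)^2;              % example polynomial (assumed)
v = monolist([x y], degree(p)/2);
Q = sdpvar(length(v));
F = [coefficients(p - v'*Q*v, [x y]) == 0, Q >= 0];
% Solve the same problem after interpreting it in primal SDP form
optimize(F, [], sdpsettings('dualize', 1));
```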
Adding parameterizations does not change the code significantly. Here is the code to find a lower bound on a polynomial.
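A sketch, with an example polynomial of our own choosing (the scalar t is the parametric lower bound):

```matlab
% Maximize t such that p - t is a sum of squares
sdpvar x y t
p = (1+x*y)^2 - x*y + (1-y)^2;      % example polynomial (assumed)
v = monolist([x y], degree(p)/2);
Q = sdpvar(length(v));
F = [coefficients(p - t - v'*Q*v, [x y]) == 0, Q >= 0];
optimize(F, -t);                    % minimize -t, i.e. maximize the bound
value(t)
```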
Matrix valued decompositions are a bit trickier, but still straightforward.
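One possible formulation matches the coefficients of the upper triangular part of the residual (the polynomial matrix below is our own example, and the indexing scheme is one way of doing it):

```matlab
% Matrix SOS: P(x) = kron(I,v(x))'*Q*kron(I,v(x)) with Q PSD
sdpvar x y
P = [1+x^2, -x+y+x^2; -x+y+x^2, 2*x^2-2*x*y+y^2];  % example matrix (assumed)
m = size(P,1);
v = monolist([x y], degree(P)/2);
Q = sdpvar(length(v)*m);
R = kron(eye(m),v)'*Q*kron(eye(m),v) - P;
% By symmetry, it suffices to match the upper triangular part of R
ix = find(triu(ones(m)));
s = coefficients(R(ix), [x y]);
optimize([s == 0, Q >= 0]);
```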
Once again, this is the basic idea behind the SOS module in YALMIP. However, the built-in implementation is far more efficient and uses various approaches to reduce complexity, hence the approaches above should never be used in practice.

Simple sum of squares problems

The following lines of code present some typical manipulations when working with SOS calculations (a more detailed description is available if you run the sum of squares example sosex.m). The most important commands are sos (to define an SOS constraint) and solvesos (to solve the problem).
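Basic usage could look as follows (the polynomial is our own example):

```matlab
x = sdpvar(1,1);
y = sdpvar(1,1);
p = (1+x)^4 + (1-y)^2;   % example polynomial (assumed)
F = sos(p);              % declare the SOS constraint
solvesos(F);             % and solve the decomposition problem
```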
The sum of squares decomposition is extracted with the command sosd.
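A self-contained sketch:

```matlab
sdpvar x y
p = (1+x)^4 + (1-y)^2;   % example polynomial (assumed)
F = sos(p);
solvesos(F);
h = sosd(F);             % polynomial vector h(x) with p(x) = h'(x)*h(x)
sdisplay(h)
```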
To see if the decomposition was successful, we simply calculate p(x) - h^T(x)h(x), which should be 0. However, due to numerical errors, the difference will not be exactly zero. A useful command then is clean. Using clean, we remove all monomials with coefficients smaller than, e.g., 1e-6.
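In code (same example polynomial as above):

```matlab
sdpvar x y
p = (1+x)^4 + (1-y)^2;   % example polynomial (assumed)
F = sos(p);
solvesos(F);
h = sosd(F);
clean(p - h'*h, 1e-6)    % drop monomials with coefficients below 1e-6
```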
The decomposition p(x) = v^T(x)Qv(x) can also be obtained easily.
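The monomial vector and Gramian are returned by solvesos; a sketch:

```matlab
sdpvar x y
p = (1+x)^4 + (1-y)^2;                 % example polynomial (assumed)
[sol, v, Q] = solvesos(sos(p));        % v: monomials, Q: computed Gramian
clean(p - v{1}'*Q{1}*v{1}, 1e-6)       % residual of the decomposition
```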
Note: Even though h^T(x)h(x) and v^T(x)Qv(x) should be the same in theory, they are typically not. The reason is partly numerical issues with floating-point numbers, but more importantly the way YALMIP treats the case when Q is not positive definite (often the case due to numerical issues in the SDP solver). The decomposition is in theory defined as h(x) = chol(Q)v(x). If Q is indefinite, YALMIP uses an approximation of the Cholesky factor based on a singular value decomposition. The quality of the decomposition can alternatively be evaluated using checkset. The value reported is the largest coefficient in the polynomial p(x) - h^T(x)h(x).
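For example:

```matlab
sdpvar x y
p = (1+x)^4 + (1-y)^2;   % example polynomial (assumed)
F = sos(p);
solvesos(F);
checkset(F)              % reports the largest coefficient of p(x) - h'(x)h(x)
```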
It is very rare (except in contrived academic examples) that one finds a decomposition with a positive definite Q and zero mismatch between v^T(x)Qv(x) and the polynomial p(x). The reason is simply that all SDP solvers use finite-precision arithmetic. Hence, in principle, the decomposition has no theoretical value as a certificate for non-negativity unless additional post-analysis is performed (relating the size of the residual to the eigenvalues of the Gramian).

Parameterized problems

The involved polynomials can be parameterized, and we can optimize the parameters under the constraint that the polynomial is a sum of squares. As an example, we can calculate a lower bound on a polynomial. The second argument of solvesos can be used for an objective function to be minimized. Here, we maximize the lower bound, i.e., minimize the negative lower bound. The third argument is an options structure, while the fourth argument is a vector containing all parametric variables in the polynomial (in this example we only have one parametric variable, but several parametric variables can be defined by simply concatenating them).
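In code, with an example polynomial assumed here:

```matlab
sdpvar x y t
p = (1+x*y)^2 - x*y + (1-y)^2;   % example polynomial (assumed)
F = sos(p - t);                  % p - t should be a sum of squares
solvesos(F, -t, [], t);          % objective -t, parametric variable t
value(t)                         % the computed lower bound
```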
YALMIP automatically declares all variables appearing in the objective function and in non-SOS constraints as parametric variables. Hence, the following code is equivalent.
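Hence this shorter sketch should give the same result:

```matlab
sdpvar x y t
p = (1+x*y)^2 - x*y + (1-y)^2;   % example polynomial (assumed)
solvesos(sos(p - t), -t);        % t appears in the objective, hence parametric
value(t)
```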
Multiple SOS constraints can also be used. Consider the following problem of finding the smallest possible t such that the polynomials t(1+xy)^2 - xy + (1-y)^2 and (1-xy)^2 + xy + t(1+y)^2 are both sums of squares.
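A sketch of this problem:

```matlab
sdpvar x y t
p1 = t*(1+x*y)^2 - x*y + (1-y)^2;
p2 = (1-x*y)^2 + x*y + t*(1+y)^2;
F = [sos(p1), sos(p2)];
solvesos(F, t);          % minimize t subject to both SOS constraints
value(t)
```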
If you have parametric variables, bounding the feasible region typically improves numerical behavior. Having lower bounds will additionally decrease the size of the optimization problem (variables bounded from below can be treated as translated cone variables in dualization, which is used by solvesos). One of the most common mistakes people make when using the sum of squares module is to forget to declare some parametric variables. This typically leads to an (of course erroneous) huge sum of squares problem which easily freezes MATLAB and/or gives strange error messages (trivially infeasible, nonlinear parameterization, etc.). Make sure to initially run the module in verbose mode to see how YALMIP characterizes the problem (most importantly, to check the number of parametric variables and independent variables). If you use nonlinear operators (min, max, abs, ...) on parametric variables in your problem, it is recommended to always declare the parametric variables explicitly.
When you use a kernel representation (sos.model=1), there is typically a small mismatch between the polynomial and its decomposition, while an image representation (sos.model=2) may instead return a Gramian that slightly violates positive semidefiniteness.

The quality of the SOS approximation is typically improved substantially if the tolerance and precision options of the semidefinite solver are decreased.

To evaluate the quality and success of the sum of squares decomposition, do not forget to study the discrepancy between the decomposition and the original polynomial.

Polynomial parameterizations

A special feature of the sum of squares package in YALMIP is the possibility to work with nonlinear SOS parameterizations, i.e., SOS problems resulting in PMIs (polynomial matrix inequalities) instead of LMIs. The following piece of code solves a nonlinear control synthesis problem using sum of squares. Note that this requires the solver PENBMI.
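The original synthesis example is not reproduced here; the following hypothetical sketch only illustrates the structure of such a problem. The dynamics f, the Lyapunov candidate V, and the multiplier s are all our own assumptions; the key point is that the product s*V makes the constraint nonlinear in the parametric variables, yielding a PMI:

```matlab
% Hypothetical Lyapunov-type sketch (not the example from the original page)
sdpvar x1 x2
f = [-x1 + x2^2; -x2];            % assumed system dynamics
P = sdpvar(2,2);                  % parametric (Lyapunov) variables
V = [x1 x2]*P*[x1; x2];
dV = jacobian(V, [x1 x2])*f;      % dV/dt along trajectories
s = sdpvar(1,1);                  % multiplier: s*V is bilinear in (s,P)
F = [sos(V - 1e-6*(x1^2 + x2^2)), sos(-dV - s*V), s >= 0];
solvesos(F, [], sdpsettings('solver', 'penbmi'), [P(:); s]);
```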
Matrix valued problems

In the same sense that the moment implementation in YALMIP supports optimization over nonlinear semidefiniteness constraints using moments, YALMIP also supports the dual SOS approach. Instead of computing a decomposition p(x) = v^T(x)Qv(x), the matrix SOS decomposition is P(x) = kron(I,v(x))^T Q kron(I,v(x)).
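Declaring a matrix-valued SOS constraint uses the same syntax as the scalar case; a sketch with an assumed polynomial matrix:

```matlab
sdpvar x
P = [1+x^2, -x; -x, 2+x^2];   % example polynomial matrix (assumed)
F = sos(P);                   % matrix-valued SOS constraint
solvesos(F);
```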
Of course, parameterized problems etc. can also be solved with matrix-valued SOS constraints. At the moment, YALMIP extends some of the reduction techniques from the scalar case to exploit symmetry and structure of the polynomial matrix, but there is room for obvious improvements. If you think you might need this, make a feature request. Keep in mind that a simple scalarization can be more efficient in many cases.
This approach will however only prove positive semidefiniteness; it will not provide a decomposition of the polynomial matrix.

Low-rank sum-of-squares (experimental)

By using the capabilities of the solver LMIRANK, we can pose sum-of-squares problems where we search for decompositions with few terms (a low-rank Gramian). Consider the following problem where a trace heuristic leads to an effective rank of 5, perhaps 6.
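A trace heuristic of this kind can be sketched as follows; the polynomial is a placeholder (the one from the original example is not shown here), and the sos.traceobj option asks YALMIP to minimize the trace of the Gramian:

```matlab
x = sdpvar(1,1);
y = sdpvar(1,1);
p = (1 + x*y + x^2)^4 + (x - y)^2;                 % placeholder polynomial (assumed)
F = sos(p);
[sol, v, Q] = solvesos(F, [], sdpsettings('sos.traceobj', 1));
rank(Q{1}, 1e-3)                                   % effective rank of the Gramian
```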
We solve the problem using LMIRANK instead, and aim for a rank less than or equal to 4. The desired rank is specified easily in the sos construct.
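The text above states that the desired rank is specified in the sos construct; assuming it is passed as a second argument (and reusing the placeholder polynomial), a sketch could look like:

```matlab
x = sdpvar(1,1);
y = sdpvar(1,1);
p = (1 + x*y + x^2)^4 + (x - y)^2;                 % placeholder polynomial (assumed)
F = sos(p, 4);                                     % rank bound in sos (assumed syntax)
[sol, v, Q] = solvesos(F, [], sdpsettings('solver', 'lmirank'));
rank(Q{1}, 1e-3)                                   % check the achieved rank
```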
The resulting rank is indeed effectively 4. Note though that LMIRANK works on the dual problem side, and can return slightly infeasible solutions (in terms of positive definiteness). Keep in mind that sum-of-squares decompositions will almost always be slightly wrong, in one way or the other. If a dual (also called image) approach is used (corresponding to sos.model=2), positive definiteness may be violated, and if a primal (also called kernel) approach is used (corresponding to sos.model=1), there is typically a difference between the polynomial and its decomposition. This is simply due to the way SDP solvers and floating-point arithmetic work. See more in the example sosex.m. Remember that LMIRANK is a local solver, hence there are no guarantees that it will find a low-rank solution even though one is known to exist. Moreover, note that LMIRANK does not support objective functions.

Options

In the examples above, we mainly use the default settings when solving the SOS problem, but there are a couple of options that can be changed to alter the computations, most importantly those in the sos field of sdpsettings.