<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<html>

<head>
<meta http-equiv="Content-Language" content="en-us">
<title>YALMIP Example : Sum-of-squares decompositions</title>
<meta http-equiv="Content-Type" content="text/html; charset=windows-1251">
<meta content="Microsoft FrontPage 6.0" name="GENERATOR">
<meta name="ProgId" content="FrontPage.Editor.Document">
<link href="yalmip.css" type="text/css" rel="stylesheet">
<base target="_self">
</head>

<body leftMargin="0" topMargin="0">

<div align="left">
<table border="0" cellpadding="4" cellspacing="3" style="border-collapse: collapse" bordercolor="#000000" width="100%" align="left" height="100%">
<tr>
<td width="100%" align="left" height="100%" valign="top">
<h2>Sum of squares decompositions</h2>
<hr noShade SIZE="1">
<p>YALMIP has a built-in module for sum of squares calculations. In its most basic formulation, a
sum of squares (SOS) problem takes a polynomial
<strong>p(x)</strong> and tries to find a real-valued polynomial vector function
<strong>h(x)</strong> such
that <strong>p(x)=h<sup>T</sup>(x)h(x)</strong> (or equivalently <strong>p(x)=v<sup>T</sup>(x)Qv(x)</strong>,
where <b>Q</b> is positive semidefinite and <strong>v(x)</strong> is a vector of monomials), hence proving non-negativity of the polynomial <strong>p(x)</strong>.
Read more about standard sum of squares decompositions in
<a href="readmore.htm#PARRILO">[Parrilo]</a>.</p>
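<p>As a small concrete illustration of the idea (a hypothetical toy example checked numerically with NumPy, not YALMIP code): the polynomial <strong>p(x)=2x<sup>2</sup>+2x+1</strong> is a sum of squares, since with <strong>v(x)=[1; x]</strong> the Gramian <b>Q=[1 1; 1 2]</b> is positive definite and satisfies <strong>p(x)=v<sup>T</sup>(x)Qv(x)</strong>.</p>

```python
import numpy as np

# Toy SOS certificate for p(x) = 2x^2 + 2x + 1 (illustration only,
# not YALMIP code). With v(x) = [1, x] and the Gramian below,
# p(x) = v(x)' Q v(x) and Q is positive definite, so p(x) >= 0.
Q = np.array([[1.0, 1.0],
              [1.0, 2.0]])

def p(x):
    return 2*x**2 + 2*x + 1

def vQv(x):
    v = np.array([1.0, x])
    return v @ Q @ v

# Q is positive definite: both eigenvalues strictly positive
assert np.all(np.linalg.eigvalsh(Q) > 0)

# The decomposition matches the polynomial at sample points
for x in np.linspace(-3, 3, 13):
    assert abs(p(x) - vQv(x)) < 1e-12

# A Cholesky factor of Q gives h(x) with p(x) = h(x)' h(x)
L = np.linalg.cholesky(Q)            # Q = L L'
h = lambda x: L.T @ np.array([1.0, x])
assert abs(p(0.7) - h(0.7) @ h(0.7)) < 1e-12
```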
<p>In addition to standard SOS decompositions, YALMIP also supports
<a href="#linear">linearly</a> and <a href="#nonlinear">nonlinearly</a>
parameterized problems, decomposition of <a href="#matrix">matrix
valued</a> polynomials and computation of <a href="#lowrank">low rank
decompositions</a>.</p>
<p>The examples below only introduce the basic features related to
sum-of-squares in YALMIP. For more features and details, run the example
<code>sosex.m</code>.</p>
<h3>Doing it yourself the hard way</h3>
<p>Before we introduce the efficiently implemented SOS module in YALMIP,
let us briefly mention how one could implement a SOS solver in
high-level YALMIP code. Of course, you would never use this approach,
but it might be useful to see the basic idea. Define a polynomial.</p>
<table cellPadding="10" width="100%" id="table5">
<tr>
<td class="xmpcode">
<pre>x = sdpvar(1,1);y = sdpvar(1,1);
p = (1+x)^4 + (1-y)^2;</pre>
</td>
</tr>
</table>
<p>Introduce a decomposition</p>
<table cellPadding="10" width="100%" id="table6">
<tr>
<td class="xmpcode">
<pre>v = monolist([x y],degree(p)/2);
Q = sdpvar(length(v));
p_sos = v'*Q*v;</pre>
</td>
</tr>
</table>
<p>The polynomials have to match, hence all coefficients in the polynomial
describing the difference of the two polynomials have to be zero.</p>
<table cellPadding="10" width="100%" id="table7">
<tr>
<td class="xmpcode">
<pre>F = set(coefficients(p-p_sos,[x y]) == 0) + set(Q > 0);
solvesdp(F)</pre>
</td>
</tr>
</table>
<p>The problem above is typically solved more efficiently when interpreted
in primal form, hence we <a href="dual.htm#dualize">dualize</a> it.</p>
<table cellPadding="10" width="100%" id="table11">
<tr>
<td class="xmpcode">
<pre>F = set(coefficients(p-p_sos,[x y]) == 0) + set(Q > 0);
[F,obj] = dualize(F,[]);
solvesdp(F,-obj)</pre>
</td>
</tr>
</table>
<p>Adding parameterizations does not change the code significantly. Here is the
code to find a lower bound on the polynomial.</p>
<table cellPadding="10" width="100%" id="table8">
<tr>
<td class="xmpcode">
<pre>sdpvar t
F = set(coefficients((p-t)-p_sos,[x y]) == 0) + set(Q > 0);
solvesdp(F,-t)</pre>
</td>
</tr>
</table>
<p>Matrix valued decompositions are a bit trickier, but still
straightforward.</p>
<table cellPadding="10" width="100%" id="table9">
<tr>
<td class="xmpcode">
<pre>sdpvar x y
P = [1+x^2 -x+y+x^2;-x+y+x^2 2*x^2-2*x*y+y^2];
m = size(P,1);
v = monolist([x y],degree(P)/2);
Q = sdpvar(length(v)*m);
R = kron(eye(m),v)'*Q*kron(eye(m),v)-P;
s = coefficients(R(find(triu(R))),[x y]);
solvesdp(set(Q > 0) + set(s==0));
sdisplay(clean(kron(eye(m),v)'*double(Q)*kron(eye(m),v),1e-6))</pre>
</td>
</tr>
</table>
<p>Once again, this is the basic idea behind the SOS module in YALMIP.
However, the implementation is far more efficient and uses various
approaches to reduce complexity, hence the approaches above should never be
used in practice.</p>
<h3>Simple sum of squares problems</h3>
<p>The following lines of code present some typical manipulations when working
with SOS calculations (a more detailed description is available if you run
the sum of squares example in <code>yalmipdemo.m</code>).</p>
<p>The most important commands are <a href="reference.htm#sos">
sos</a> (to define a SOS constraint) and
<a href="reference.htm#solvesos">
solvesos</a> (to solve the problem).</p>
<table cellPadding="10" width="100%">
<tr>
<td class="xmpcode">
<pre>x = sdpvar(1,1);y = sdpvar(1,1);
p = (1+x)^4 + (1-y)^2;
F = set(sos(p));
solvesos(F);</pre>
</td>
</tr>
</table>
<p>The sum of squares decomposition is extracted with the command
<a href="reference.htm#sosd">
sosd</a>.</p>
<table cellPadding="10" width="100%">
<tr>
<td class="xmpcode">
<pre>h = sosd(F);
sdisplay(h)<font color="#000000">

ans = </font></pre>
<pre><font color="#000000">    '-1.203-1.9465x+0.22975y-0.97325x^2'
    '0.7435-0.45951x-0.97325y-0.22975x^2'
    '0.0010977+0.00036589x+0.0010977y-0.0018294x^2'</font></pre>
</td>
</tr>
</table>
<p>To see if the decomposition was successful, we simply calculate <strong>
p(x)-h<sup>T</sup>(x)h(x)</strong>, which should be 0. However, due to numerical errors,
the difference will not be exactly zero. A useful command then is
<a href="reference.htm#clean">
clean</a>. Using
<a href="reference.htm#clean">
clean</a>, we remove all monomials with coefficients smaller than, e.g., 1e-6.</p>
<table cellPadding="10" width="100%">
<tr>
<td class="xmpcode">
<pre>clean(p-h'*h,1e-6)
<font color="#000000">
ans =</font></pre>
<pre><font color="#000000">     0</font></pre>
</td>
</tr>
</table>
<p>The decomposition <strong>
p(x)=v<sup>T</sup>(x)Qv(x)</strong> can also be obtained easily.</p>
<table cellPadding="10" width="100%">
<tr>
<td class="xmpcode">
<pre>x = sdpvar(1,1);y = sdpvar(1,1);
p = (1+x)^4 + (1-y)^2;
F = set(sos(p));
[sol,v,Q] = solvesos(F);
clean(p-v{1}'*Q{1}*v{1},1e-6)</pre>
<pre><font color="#000000">ans =
     0</font></pre>
</td>
</tr>
</table>
<p><font color="#FF0000">Note</font>: Even though <strong>
h<sup>T</sup>(x)h(x)</strong> and <strong>
v<sup>T</sup>(x)Qv(x)</strong> should be the same in theory, they are
typically not. The reason is partly numerical issues with floating point
numbers, but more importantly the way YALMIP treats the case when
<b>Q</b>
is not positive definite (often the case due to numerical issues in the
SDP solver). The decomposition is in theory defined as <strong>h(x)=chol(Q)v(x)</strong>.
If <strong>Q</strong> is indefinite, YALMIP uses an approximation of the Cholesky
factor based on a singular value decomposition.</p>
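<p>The idea behind such an SVD-based factor can be sketched numerically (a minimal NumPy illustration of one way to do it; YALMIP's internal implementation may differ in details):</p>

```python
import numpy as np

def svd_factor(Q):
    # For symmetric Q, the SVD gives Q = U*diag(s)*U' with s = |eigenvalues|,
    # so R = diag(sqrt(s))*U' satisfies R'*R ~= Q whenever the negative part
    # of Q is only numerical noise. Illustration only, not YALMIP's code.
    U, s, _ = np.linalg.svd((Q + Q.T) / 2)   # symmetrize first
    return np.diag(np.sqrt(s)) @ U.T

# A Gramian that is indefinite only at noise level, as an SDP solver
# might return it
Q = np.array([[2.0, 1.0],
              [1.0, 0.5 - 1e-9]])
assert np.linalg.eigvalsh(Q).min() < 0       # slightly indefinite
R = svd_factor(Q)                            # Cholesky would fail here
assert np.allclose(R.T @ R, Q, atol=1e-6)    # factor recovers Q up to noise
```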
<p>The quality of the decomposition can alternatively be evaluated using
<a href="reference.htm#checkset">checkset</a>. The value reported is the
largest coefficient in the polynomial <strong>
p(x)-h<sup>T</sup>(x)h(x)</strong>.</p>
<table cellPadding="10" width="100%" id="table1">
<tr>
<td class="xmpcode">
<pre>checkset(F)
<font color="#000000">++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
| ID| Constraint| Type| Primal residual|
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
| #1| Numeric value| SOS constraint (polynomial)| 7.3674e-012|
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++</font></pre>
<pre>e = checkset(F(is(F,'sos')))

<font color="#000000">e =</font></pre>
<pre><font color="#000000">   7.3674e-012</font></pre>
</td>
</tr>
</table>
<p>It is very rare (except in contrived academic examples) that one finds a
decomposition with a positive definite <strong>Q</strong> and zero mismatch
between <strong>v<sup>T</sup>(x)Qv(x)</strong> and the polynomial <strong>
p(x)</strong>. The reason is simply that all SDP solvers use real
arithmetic. Hence, in principle, the decomposition has no theoretical
value as a certificate for non-negativity unless additional post-analysis
is performed (relating the size of the residual to the eigenvalues of
the Gramian).</p>
<h3><a name="linear"></a>Parameterized problems</h3>
<p>The involved polynomials can be parameterized, and we can optimize the parameters
under the constraint that the polynomial is a sum of squares. As an example,
we can calculate a lower bound on a polynomial. The second argument to
solvesos can be used for an objective function to be
minimized. Here, we maximize the lower bound, i.e. minimize the negative
lower bound. The third argument is an options structure, while the fourth argument is a vector
containing all parametric variables in the polynomial (in this example
we only have one parametric variable, but several parametric variables can be defined
by simply concatenating them).</p>
<table cellPadding="10" width="100%">
<tr>
<td class="xmpcode">
<pre>sdpvar x y lower
p = (1+x*y)^2-x*y+(1-y)^2;
F = set(sos(p-lower));
solvesos(F,-lower,[],lower);
double(lower)
<font color="#000000">
ans =</font></pre>
<pre><font color="#000000">    0.75</font></pre>
</td>
</tr>
</table>
<p>YALMIP automatically declares all variables appearing in the objective
function and in non-SOS constraints as parametric variables. Hence, the
following code is equivalent.</p>
<table cellPadding="10" width="100%">
<tr>
<td class="xmpcode">
<pre>solvesos(F,-lower);
double(lower)
<font color="#000000">
ans =</font></pre>
<pre><font color="#000000">    0.75</font></pre>
</td>
</tr>
</table>
<p>Multiple SOS constraints can also be used. Consider the following problem of
finding the smallest possible <b>t</b> such that the polynomials <b>t(1+xy)<sup>2</sup>-xy+(1-y)<sup>2</sup></b>
and <b>(1-xy)<sup>2</sup>+xy+t(1+y)<sup>2</sup></b> are both
sums of squares.</p>
<table cellPadding="10" width="100%">
<tr>
<td class="xmpcode">
<pre>sdpvar x y t
p1 = t*(1+x*y)^2-x*y+(1-y)^2;
p2 = (1-x*y)^2+x*y+t*(1+y)^2;
F = set(sos(p1))+set(sos(p2));
solvesos(F,t);</pre>
<pre>double(t)
<font color="#000000">
ans =

    0.2500</font></pre>
<pre>sdisplay(sosd(F(1)))</pre>
<pre><font color="#000000">ans = </font></pre>
<pre><font color="#000000">    '-1.102+0.95709y+0.14489xy'
    '-0.18876-0.28978y+0.47855xy'</font></pre>
<pre>sdisplay(sosd(F(2)))</pre>
<pre><font color="#000000">ans = </font></pre>
<pre><font color="#000000">    '-1.024-0.18051y+0.76622xy'
    '-0.43143-0.26178y-0.63824xy'
    '0.12382-0.38586y+0.074568xy'</font></pre>
<pre> </pre>
</td>
</tr>
</table>
<p>
<img border="0" src="demoicon.gif" width="16" height="16">If you have
parametric variables, bounding the feasible region typically
improves numerical behavior. Having lower bounds will
additionally decrease the size of the optimization problem
(variables bounded from below can be treated as translated cone
variables in <a href="dual.htm#dualization">dualization</a>, which
is used by <a href="reference.htm#solvesos">
solvesos</a>).</p>
<p>
<img border="0" src="demoicon.gif" width="16" height="16">One of the
most common mistakes people make when using the sum of squares module
is to forget to declare some parametric variables. This will
typically lead to a (of course erroneous) huge sum of squares
problem which easily freezes MATLAB and/or gives strange error
messages (trivially infeasible, nonlinear parameterization, etc.).
Make sure to initially run the module in verbose mode to see how
YALMIP characterizes the problem (most importantly, to check the
number of parametric variables and independent variables). If you
use nonlinear operators (<b>min</b>, <b>max</b>, <b>abs</b>,...) on
parametric variables in your problem, it is recommended to always
declare the parametric variables.</p>
<p>
<img border="0" src="demoicon.gif" width="16" height="16">When you use
a kernel representation (<code>sos.model=1</code>, and typically also
for <code>sos.model=0</code>), YALMIP will derive and solve a
problem which is related to the dual of your original problem. This
means that warnings about infeasibility after solving the SDP actually mean
an unbounded objective, and vice versa. If you use <code>sos.model=2</code>,
a primal problem is solved, and YALMIP error messages are directly related to
your problem.</p>
<p>
<img border="0" src="demoicon.gif" width="16" height="16">The quality
of the SOS approximation is typically improved substantially if the
tolerance and precision options of the semidefinite solver are
tightened. As an example, setting <code>sedumi.eps</code>
to less than 10<sup>-10</sup> when solving sum of squares problems is
typically recommended for anything but trivial problems. There is a
higher likelihood that the semidefinite solver will complain about
numerical problems in the end-phase, but the resulting solutions are
typically much better. This seems to be even more important in
parameterized problems.</p>
<p>
<img border="0" src="demoicon.gif" width="16" height="16">To evaluate
the quality and success of the sum of squares decomposition, do not
forget to study the discrepancy between the decomposition and the
original polynomial, e.g. using <a href="reference.htm#checkset">checkset</a>. No problems in the
semidefinite solver is no guarantee of a successful decomposition
(due to numerical reasons, tolerances in the solvers, etc.).</p>
<h3><a name="nonlinear"></a>Polynomial parameterizations</h3>
<p>A special feature of the sum of squares package in YALMIP is the
possibility to work with nonlinear SOS parameterizations, i.e. SOS problems
resulting in PMIs (polynomial matrix inequalities) instead of LMIs. The following piece of code solves a
nonlinear control <i>synthesis</i> problem using sum of squares. Note
that this requires the solver <a href="solvers.htm#penbmi">PENBMI</a>.</p>
<table cellPadding="10" width="100%">
<tr>
<td class="xmpcode">
<pre>clear all
yalmip('clear');

% States...
sdpvar x1 x2
x = [x1;x2];

% Non-quadratic Lyapunov z'Pz
z = [x1;x2;x1^2];
P = sdpvar(3,3);
V = z'*P*z;

% Non-linear state feedback
v = [x1;x2;x1^2];
K = sdpvar(1,3);
u = K*v;

% System x' = f(x)+Bu
f = [1.5*x1^2-0.5*x1^3-x2; 3*x1-x2];
B = [0;1];

% Closed loop system, u = Kv
fc = f+B*K*v;

% Stability and performance constraint dVdt < -x'x-u'u
% NOTE : This polynomial is bilinear in P and K
F = set(sos(-jacobian(V,x)*fc-x'*x-u'*u));

% P is positive definite, bound P and K for numerical reasons
F = F + set(P>0)+set(25>P(:)>-25)+set(25>K>-25);

% Minimize trace(P)
% Parametric variables P and K automatically detected
% by YALMIP since they are both constrained
solvesos(F,trace(P));</pre>
</td>
</tr>
</table>
<h3><a name="matrix"></a>Matrix valued problems</h3>
<p>In the same sense that the moment implementation in YALMIP supports
<a href="moment.htm#matrix">optimization over nonlinear semidefiniteness constraints</a>
using moments, YALMIP also supports the dual SOS approach. Instead of computing a
decomposition <strong>p(x) =
v<sup>T</sup>(x)Qv(x)</strong>, the matrix SOS decomposition is <strong>P(x) = kron(I,v(x))<sup>T</sup>Q kron(I,v(x))</strong>.</p>
<table cellPadding="10" width="100%" id="table4">
<tr>
<td class="xmpcode">
<pre>sdpvar x1 x2
P = [1+x1^2 -x1+x2+x1^2;-x1+x2+x1^2 2*x1^2-2*x1*x2+x2^2];
[sol,v,Q] = solvesos(set(sos(P)));
sdisplay(v{1})
<font color="#000000">ans = </font></pre>
<pre><font color="#000000">    '1'  '0'
    'x2' '0'
    'x1' '0'
    '0'  '1'
    '0'  'x2'
    '0'  'x1'

</font>clean(v{1}'*Q{1}*v{1}-P,1e-8)
<font color="#000000">ans =</font></pre>
<pre><font color="#000000">     0     0
     0     0</font></pre>
</td>
</tr>
</table>
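<p>The kron identity behind this decomposition can be sanity-checked numerically outside YALMIP (a NumPy sketch, not YALMIP code): for any positive semidefinite Gramian <b>Q</b>, the matrix <strong>kron(I,v(x))<sup>T</sup>Q kron(I,v(x))</strong> is positive semidefinite at every point <b>x</b>.</p>

```python
import numpy as np

# Numeric illustration of the matrix-SOS identity
# P(x) = kron(I, v(x))' Q kron(I, v(x)): for any PSD Q, the resulting
# m-by-m matrix is positive semidefinite at every sampled point.
rng = np.random.default_rng(0)
m, nv = 2, 3                        # 2x2 matrix, monomials v = [1, x, y]
A = rng.standard_normal((m*nv, m*nv))
Q = A @ A.T                         # an arbitrary PSD Gramian

def P_at(x, y):
    v = np.array([1.0, x, y])
    W = np.kron(np.eye(m), v)       # block-diagonal copies of v(x)
    return W @ Q @ W.T              # m-by-m value of the matrix polynomial

for _ in range(20):
    x, y = rng.standard_normal(2)
    assert np.linalg.eigvalsh(P_at(x, y)).min() >= -1e-9   # PSD pointwise
```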
<p>Of course, parameterized problems etc. can also be solved with matrix
valued SOS constraints.</p>
<p>At the moment, YALMIP extends some of the reduction techniques from
the scalar case to exploit symmetry and structure of the polynomial
matrix, but there is room for obvious improvements. If you think you
might need this, make a feature request.</p>
<p>Keep in mind that a simple scalarization can be more efficient in
many cases.</p>
<table cellPadding="10" width="100%" id="table12">
<tr>
<td class="xmpcode">
<pre>w = sdpvar(length(P),1);
[sol,v,Q] = solvesos(set(sos(w'*P*w)));
clean(v{1}'*Q{1}*v{1}-w'*P*w,1e-8)
<font color="#000000">ans =
     0</font></pre>
</td>
</tr>
</table>
<p>This approach will however only prove positive semidefiniteness; it
will not provide a decomposition of the polynomial matrix.</p>
<h3><a name="lowrank"></a>Low-rank sum-of-squares (<font color="#FF0000">experimental</font>)</h3>
<p>By using the capabilities of the solver <a href="solvers.htm#LMIRANK">LMIRANK</a>,
we can pose sum-of-squares problems where we search for decompositions
with few terms (low-rank Gramian). Consider the following problem where
a trace heuristic leads to an effective rank of 5, perhaps 6.</p>
<table cellPadding="10" width="100%" id="table2">
<tr>
<td class="xmpcode">
<pre>x = sdpvar(1,1);
y = sdpvar(1,1);
f = 200*(x^3 - 4*x)^2+200 * (y^3 - 4*y)^2+(y - x)*(y + x)*x*(x + 2)*(x*(x - 2)+2*(y^2 - 4));
g = 1 + x^2 + y^2;
p = f * g;
F = set(sos(p));
[sol,v,Q] = solvesos(F,[],sdpsettings('sos.traceobj',1));

eig(Q{1})
<font color="#000000">ans =</font></pre>
<pre><font color="#000000">   1.0e+003 *</font></pre>
<pre><font color="#000000">    0.0000
    0.0000
    0.0000
    0.0000
    0.0000
    0.0000
    0.0000
    0.0000
    0.0001
    0.0124
    0.3977
    3.3972
    3.4000
    6.7972</font></pre>
</td>
</tr>
</table>
<p>We solve the problem using <a href="solvers.htm#LMIRANK">LMIRANK</a>
instead, and aim for a rank less than or equal to 4. The desired rank is
specified easily in the <a href="reference.htm#sos">sos</a> construct.</p>
<table cellPadding="10" width="100%" id="table3">
<tr>
<td class="xmpcode">
<pre>x = sdpvar(1,1);
y = sdpvar(1,1);
f = 200*(x^3 - 4*x)^2+200 * (y^3 - 4*y)^2+(y - x)*(y + x)*x*(x + 2)*(x*(x - 2)+2*(y^2 - 4));
g = 1 + x^2 + y^2;
p = f * g;
F = set(sos(p,<font color="#FF0000">4</font>));
[sol,v,Q] = solvesos(F,[],sdpsettings('lmirank.eps',0));

eig(Q{1})
<font color="#000000">ans =</font></pre>
<pre><font color="#000000">   1.0e+003 *</font></pre>
<pre><font color="#000000">   -0.0000
   -0.0000
   -0.0000
   -0.0000
   -0.0000
   -0.0000
   -0.0000
   -0.0000
    0.0000
    0.0000
    0.4634
    4.2674
    4.6584
    7.1705</font></pre>
</td>
</tr>
</table>
<p>The resulting rank is indeed effectively 4. Note though that <a href="solvers.htm#LMIRANK">LMIRANK</a>
works on the dual problem side, and can return slightly infeasible
solutions (in terms of positive definiteness). Keep in mind that
sum-of-squares decompositions will <i>almost always</i> be slightly wrong, in
one way or the other. If a dual (also called image) approach is used
(corresponding to <font color="#0000FF">sos.model=2</font>), positive
definiteness may be violated, and if a primal (also called
kernel) approach is used (corresponding to <font color="#0000FF">
sos.model=1</font>), there is typically a difference between the
polynomial and its decomposition. This is simply due to the way SDP solvers
and floating point arithmetic work. See more in the example <font color="#0000FF">sosex.m</font>.</p>
<p>Remember that <a href="solvers.htm#LMIRANK">LMIRANK</a> is a local
solver, hence there are no guarantees that it will find a low rank
solution even though one is known to exist. Moreover, note that <a href="solvers.htm#LMIRANK">LMIRANK</a>
does not support objective functions.</p>
<h3>Options</h3>
<p>In the examples above, we are mainly using the default settings when
solving the SOS problem, but there are a couple of options that can be
changed to alter the computations. The most important are:</p>
<table border="1" cellspacing="1" style="border-collapse: collapse" width="100%" bordercolor="#000000" bgcolor="#FFFFE0" id="table10">
<tr>
<td width="301"><code>sdpsettings('sos.model',[0|1|2])</code></td>
<td>The SDP formulation of a SOS problem is not unique but can be
done in several ways. YALMIP supports two versions, here called image
and kernel representations. If you set the value to 1, a kernel
representation will be used, while 2 will result in an image
representation. If the option is set to the default value 0, YALMIP
will automatically select the representation (kernel by default).<p>
The kernel representation will most often give a smaller and
numerically more robust semidefinite program, but cannot be used for
nonlinearly parameterized SOS programs (i.e. problems resulting in
BMIs etc.) or problems with integrality
constraints on parametric variables.</p></td>
</tr>
<tr>
<td width="301"><code>sdpsettings('sos.newton',[0|1])</code></td>
<td>To reduce the size of the resulting SDP, a Newton polytope
reduction algorithm is applied by default. For efficiency, you must
have <a href="solvers.htm#cdd">CDDMEX</a> or <a href="solvers.htm#glpk">
GLPKMEX</a> installed.</td>
</tr>
<tr>
<td width="301"><code>sdpsettings('sos.congruence',[0|1])</code></td>
<td>A useful feature in YALMIP is the use of symmetry of the
polynomial to block-diagonalize the SDP. This can make a huge difference
for some SOS problems and is applied by default.</td>
</tr>
<tr>
<td width="301"><code>sdpsettings('sos.inconsistent',[0|1])</code></td>
<td>This option can be used to further reduce the size of the SDP. It
is turned off by default since it typically does not give any major
reduction in problem size once the Newton polytope reduction has been
applied. In some situations, it might however be useful to use this
strategy instead of the linear programming based Newton reduction (it
cannot suffer from numerical problems and does not require any
efficient LP solver), and for some problems, it can reduce models that
are minimal in the Newton polytope sense, leading to a more
numerically robust solution of the resulting SDP.</td>
</tr>
<tr>
<td width="301"><code>sdpsettings('sos.extlp',[0|1])</code></td>
<td>When a kernel representation model is used, the SDP problem is
derived using the <a href="dual.htm#dualize">dualization</a>
function. For some problems, the strategy in the dualization may
affect the numerical conditioning of the problem. If you encounter
numerical problems, they can sometimes be resolved by setting this
option to 0. This will typically yield a slightly larger problem, but
can improve the numerical performance. Note that this option is only of use
if you have parametric variables with explicit non-zero lower bounds (constraints like <b>set(t>-100)</b>).</td>
</tr>
<tr>
<td width="301"><code>sdpsettings('sos.postprocess',[0|1])</code></td>
<td>In practice, the SDP computations will never give a completely
correct decomposition (due to floating point numbers and, in many
cases, numerical problems in the solver). YALMIP can try to recover
from these problems by applying a heuristic post-processing
algorithm. This can in many cases improve the results.</td>
</tr>
</table>
<p>&nbsp;</p>
</td>
</tr>
</table>
</div>

</body>

</html>