Sums of Squares and Global Optimization
Mehdi Ghasemi, Murray Marshall
Department of Mathematics and Statistics, University of Saskatchewan
Real Algebra, Geometry and Convexity, Leipzig 2011
Outline
1. Polynomial Optimization: SOS Approximation; Notations
2. Sufficient Condition: The Main Result; Application
3. Application to Global Optimization: An Alternative for SDP; Runtime Comparison; Explicit Lower Bounds
Unconstrained Polynomial Optimization
Fix a polynomial $f(X) \in \mathbb{R}[X] = \mathbb{R}[X_1, \ldots, X_n]$, where $n \ge 1$ is an integer, and let $f_* = \inf\{f(a) : a \in \mathbb{R}^n\}$. Finding $f_*$ is known as Unconstrained Polynomial Optimization. We say $f$ is PSD if $f_* \ge 0$, and PD if $f_* > 0$. The problem has many applications, and it is NP-hard.
The Dual Problem
- $f_* > -\infty \iff \exists r \in \mathbb{R}$ such that $f - r$ is PSD. Moreover, $f_* = \sup\{r : f - r \text{ is PSD}\}$.
- If $\deg(f) = m$ then $f = f_0 + f_1 + \cdots + f_m$ where $\deg(f_i) = i$, $i = 0, \ldots, m$.
- If $f_* > -\infty$ then $f_m$ is PSD. If $f_m$ is PD then $f_* > -\infty$.
- If $f$ is PSD then $\deg(f)$ is even, say $\deg(f) = 2d$.
SOS Approximation: Semidefinite Programming
- If $f = g_1^2 + \cdots + g_k^2$ with $g_1, \ldots, g_k \in \mathbb{R}[X]$, then $f$ is PSD.
- It is known that deciding whether a polynomial is SOS is much easier than deciding its PSD-ness.
- $P_{2d,n} = \{\text{PSD forms of degree } 2d \text{ in } n \text{ variables}\}$; $\Sigma_{2d,n} = \{\text{SOS forms of degree } 2d \text{ in } n \text{ variables}\}$.
- Hilbert: $P_{2d,n} = \Sigma_{2d,n}$ if and only if ($n \le 2$) or ($d = 1$) or ($n = 3$ and $d = 2$).
Let $\sum \mathbb{R}[X]^2$ be the cone of all SOS polynomials in $\mathbb{R}[X]$, and for $f \in \mathbb{R}[X]$ define
$$f_{sos} := \sup\Big\{r \in \mathbb{R} : f - r \in \sum \mathbb{R}[X]^2\Big\}.$$
- $f_{sos} \le f_*$.
- If $f_{2d} \in \Sigma_{2d,n}^{\circ}$ then $f_{sos} \ne -\infty$.
- If $f_{sos} > -\infty$ then it can be computed in polynomial time, using Semidefinite Programming (SDP).
- When the size of the input grows, SDP becomes awkward.
SOS Approximation: Alternative Approaches
Let $\Phi_m(f, f_0)$ be a formula in terms of the coefficients of $f$ such that $\Phi_m(f, f_0) \Rightarrow f$ is SOS. Then:
- $\forall r\, (\Phi_m(f, f_0 - r) \Rightarrow r \le f_{sos})$.
- $f_\Phi = \sup\{r \in \mathbb{R} : \Phi_m(f, f_0 - r)\}$ is a lower bound for $f_{sos}$, and hence for $f_*$.
Notations
- $\mathbb{N} = \{0, 1, 2, \ldots\}$.
- For $X = (X_1, \ldots, X_n)$, $\alpha = (\alpha_1, \ldots, \alpha_n) \in \mathbb{N}^n$, and $a = (a_1, \ldots, a_n) \in \mathbb{R}^n$: $|\alpha| := \alpha_1 + \cdots + \alpha_n$, $X^\alpha := X_1^{\alpha_1} \cdots X_n^{\alpha_n}$, and $a^\alpha := a_1^{\alpha_1} \cdots a_n^{\alpha_n}$, with the convention $0^0 = 1$.
- $0 := (0, \ldots, 0)$; $i := (\delta_{i1}, \ldots, \delta_{in})$, where $\delta_{ij} = 1$ if $i = j$ and $0$ otherwise.
- For a univariate polynomial of the form $p(t) = t^n - \sum_{i=0}^{n-1} a_i t^i$, where each $a_i$ is nonnegative and at least one $a_i$ is nonzero, we denote by $C(p)$ the unique positive root of $p$. By convention, $C(t^n) := 0$.
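As a small computational aside (not from the slides): $C(p)$ can be computed numerically from the coefficients $a_0, \ldots, a_{n-1}$, for instance via numpy's root finder. A minimal sketch:

```python
import numpy as np

def C(a, n):
    """Unique positive root of p(t) = t^n - sum_{i<n} a_i t^i,
    assuming all a_i >= 0 and at least one is nonzero
    (C(t^n) := 0 by convention).  a = (a_0, ..., a_{n-1})."""
    if not any(a):
        return 0.0
    # numpy wants coefficients ordered from highest degree down
    coeffs = [1.0] + [-a[i] for i in range(n - 1, -1, -1)]
    roots = np.roots(coeffs)
    pos = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 1e-12]
    return max(pos)

# p(t) = t^2 - t - 2 = (t - 2)(t + 1): unique positive root is 2
print(C([2.0, 1.0], 2))  # -> 2.0 (up to floating-point error)
```

Uniqueness of the positive root (by Descartes' rule of signs) is what makes taking the largest real positive root safe here.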
- Every polynomial $f \in \mathbb{R}[X]$ can be written as $f(X) = \sum_{\alpha \in \mathbb{N}^n} f_\alpha X^\alpha$, with $f_\alpha \ne 0$ for only finitely many $\alpha$.
- $\Omega_f := \{\alpha : f_\alpha \ne 0\} \setminus \{0, 2d \cdot 1, \ldots, 2d \cdot n\}$, where $2d = \deg(f)$.
- $f_0 := f_0$ (the constant term), $f_{2d,i} := f_{2d \cdot i}$ (the coefficient of $X_i^{2d}$).
- $\Delta_f := \{\alpha \in \Omega_f : f_\alpha X^\alpha \text{ is not a square}\} = \{\alpha \in \Omega_f : f_\alpha < 0 \text{ or } \alpha \notin 2\mathbb{N}^n\}$.
- $f(X) = \sum_{i=1}^n f_{2d,i} X_i^{2d} + \sum_{\alpha \in \Delta_f} f_\alpha X^\alpha + \sum_{\beta \in \Omega_f \setminus \Delta_f} f_\beta X^\beta + f_0$.
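The set $\Delta_f$ is easy to compute once $f$ is stored sparsely. A minimal sketch (the dict representation `{exponent tuple: coefficient}` is an assumption of this example, not from the slides):

```python
def delta(f, n, two_d):
    """Return Delta_f: exponents alpha (excluding the constant term and
    the pure powers X_i^{2d}) whose monomial f_alpha X^alpha is not a
    square, i.e. f_alpha < 0 or some alpha_i is odd."""
    pure = {tuple(two_d if j == i else 0 for j in range(n)) for i in range(n)}
    excluded = pure | {tuple([0] * n)}
    return {a for a, c in f.items()
            if c != 0 and a not in excluded
            and (c < 0 or any(ai % 2 for ai in a))}

# f = x^4 + y^4 - 3 x^2 y + 5 x^2 y^2 + 7  (n = 2, 2d = 4)
f = {(4, 0): 1.0, (0, 4): 1.0, (2, 1): -3.0, (2, 2): 5.0, (0, 0): 7.0}
print(delta(f, 2, 4))  # -> {(2, 1)}: negative coefficient, odd exponent
```

Here $5x^2y^2$ is excluded from $\Delta_f$ since it is a square, while $-3x^2y$ is not.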
Hurwitz-Reznick and Fidalgo-Kovacec Theorems
Hurwitz-Reznick
Suppose $p(X) = \sum_{i=1}^n \alpha_i X_i^{2d} - 2d\, X^\alpha$, where $\alpha = (\alpha_1, \ldots, \alpha_n) \in \mathbb{N}^n$ and $|\alpha| = 2d$. Then $p$ is SOS.

Fidalgo-Kovacec
For a form $p(X) = \sum_{i=1}^n \beta_i X_i^{2d} - \mu X^\alpha$ such that $\alpha_i \ge 0$ and $\beta_i \ge 0$ for every $i = 1, \ldots, n$, and $\mu \ge 0$ if all $\alpha_i$ are even, the following are equivalent:
1. $p$ is PSD.
2. $|\mu| \le 2d \prod_{i=1}^n \left(\frac{\beta_i}{\alpha_i}\right)^{\alpha_i/2d}$.
3. $p$ is SOS.
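The threshold in condition (2) is a one-line computation. A minimal sketch (function name and conventions are this example's, not the slides'):

```python
def fk_bound(beta, alpha, two_d):
    """Fidalgo-Kovacec threshold: p = sum_i beta_i X_i^{2d} - mu X^alpha
    is PSD (equivalently SOS) iff |mu| <= 2d * prod (beta_i/alpha_i)^(alpha_i/2d).
    Terms with alpha_i = 0 contribute a factor 1."""
    prod = 1.0
    for b, a in zip(beta, alpha):
        if a:
            prod *= (b / a) ** (a / two_d)
    return two_d * prod

# x^4 + y^4 - mu x^2 y^2: threshold 4*(1/2)^(1/2)*(1/2)^(1/2) = 2,
# consistent with the boundary case (x^2 - y^2)^2 = x^4 + y^4 - 2 x^2 y^2
print(fk_bound([1.0, 1.0], [2, 2], 4))  # -> 2.0
```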
New Criterion: The Statement
Theorem 1
Suppose $f$ is a form of degree $2d$. A sufficient condition for $f$ to be SOS is that there exist real numbers $a_{\alpha,i}$, for $\alpha \in \Delta_f$ and $i = 1, \ldots, n$, such that (writing $a_\alpha := (a_{\alpha,1}, \ldots, a_{\alpha,n})$):
1. $a_{\alpha,i} \ge 0$, and $a_{\alpha,i} = 0$ if and only if $\alpha_i = 0$;
2. $(2d)^{2d} a_\alpha^\alpha = |f_\alpha|^{2d} \alpha^\alpha$ for all $\alpha \in \Delta_f$;
3. $f_{2d,i} \ge \sum_{\alpha \in \Delta_f} a_{\alpha,i}$, $i = 1, \ldots, n$.
New Criterion: Proof
Proof. Suppose that such real numbers exist. Condition (2), together with the Fidalgo-Kovacec theorem, implies that $\sum_{i=1}^n a_{\alpha,i} X_i^{2d} + f_\alpha X^\alpha$ is SOS for each $\alpha \in \Delta_f$. So
$$\sum_{i=1}^n \Big(\sum_{\alpha \in \Delta_f} a_{\alpha,i}\Big) X_i^{2d} + \sum_{\alpha \in \Delta_f} f_\alpha X^\alpha$$
is SOS. Combining with (3), it follows that $f(X) = \sum_{i=1}^n f_{2d,i} X_i^{2d} + \sum_{\alpha \in \Delta_f} f_\alpha X^\alpha$ is SOS.
Applications: Lasserre's Criterion
Lasserre
For any polynomial $f \in \mathbb{R}[X]$ of degree $2d$, if
- $f_0 \ge \sum_{\alpha \in \Delta} |f_\alpha| \frac{2d - |\alpha|}{2d}$, and
- $f_{2d,i} \ge \sum_{\alpha \in \Delta} |f_\alpha| \frac{\alpha_i}{2d}$, $i = 1, \ldots, n$,
then $f$ is SOS.

Proof. Let $\bar{f}(X, Y) = Y^{2d} f(\frac{X_1}{Y}, \ldots, \frac{X_n}{Y})$ be the homogenization of $f$. Apply Theorem 1 to $\bar{f}$ with $a_{\alpha,i} = |\bar{f}_\alpha| \frac{\alpha_i}{2d}$ and $a_{\alpha,Y} = |\bar{f}_\alpha| \frac{2d - |\alpha|}{2d}$.
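Lasserre's criterion is purely coefficient-wise, so it can be checked directly. A minimal sketch, reusing the hypothetical sparse dict representation `{exponent tuple: coefficient}` (an assumption of this example):

```python
def lasserre_sos_check(f, n, two_d):
    """Check Lasserre's sufficient condition for f of degree 2d to be SOS:
      f_0      >= sum_{a in Delta} |f_a| (2d - |a|)/2d, and
      f_{2d,i} >= sum_{a in Delta} |f_a| a_i/2d   for each i."""
    pure = {tuple(two_d if j == i else 0 for j in range(n)): i for i in range(n)}
    zero = tuple([0] * n)
    delta = [(a, c) for a, c in f.items()
             if c != 0 and a != zero and a not in pure
             and (c < 0 or any(ai % 2 for ai in a))]
    if f.get(zero, 0.0) < sum(abs(c) * (two_d - sum(a)) / two_d for a, c in delta):
        return False
    for p, i in pure.items():
        if f.get(p, 0.0) < sum(abs(c) * a[i] / two_d for a, c in delta):
            return False
    return True

# f = x^4 + y^4 - x y + 1: Delta = {(1, 1)}, |f_a| = 1;
# needs f_0 >= (4 - 2)/4 = 0.5 and f_{4,i} >= 1/4 -> condition holds
print(lasserre_sos_check({(4, 0): 1.0, (0, 4): 1.0, (1, 1): -1.0, (0, 0): 1.0}, 2, 4))  # -> True
```

The condition is only sufficient: a `False` answer does not mean $f$ fails to be SOS.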
Applications: Fidalgo-Kovacec's Criterion
Fidalgo-Kovacec
Suppose $f \in \mathbb{R}[X]$ is a form of degree $2d$ and
$$\min_{i=1,\ldots,n} f_{2d,i} \ge \frac{1}{2d} \sum_{\alpha \in \Delta_f} |f_\alpha| (\alpha^\alpha)^{\frac{1}{2d}}.$$
Then $f$ is SOS.

Proof. Apply Theorem 1 with $a_{\alpha,i} = |f_\alpha| \frac{(\alpha^\alpha)^{1/2d}}{2d}$, for all $\alpha \in \Delta_f$, $i = 1, \ldots, n$.
Geometric Program
A function $\phi : \mathbb{R}^n_{>0} \to \mathbb{R}$ defined as $\phi(x) = c\, x_1^{a_1} \cdots x_n^{a_n}$, where $c > 0$, $a_i \in \mathbb{R}$ and $x = (x_1, \ldots, x_n)$, is called a monomial function. A sum of monomial functions, i.e., a function of the form
$$\phi(x) = \sum_{i=1}^K c_i\, x_1^{a_{1i}} \cdots x_n^{a_{ni}},$$
where $c_i > 0$ for $i = 1, \ldots, K$, is called a posynomial function.
An optimization problem of the form
$$\text{Minimize } \phi_0(x) \quad \text{subject to } \phi_i(x) \le 1,\; i = 1, \ldots, m, \qquad \psi_i(x) = 1,\; i = 1, \ldots, p,$$
where $\phi_0, \ldots, \phi_m$ are posynomials and $\psi_1, \ldots, \psi_p$ are monomial functions, is called a geometric program.
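Geometric programs are efficiently solvable because the change of variables $x_i = e^{u_i}$ makes the objective and constraints convex. A toy illustration of that standard trick (this particular problem and the use of scipy are this example's assumptions, not the slides'):

```python
import numpy as np
from scipy.optimize import minimize

# Toy GP: minimize 1/(x*y) subject to x + y <= 1, x, y > 0.
# Substituting x = e^u, y = e^v makes both the posynomial objective and
# the posynomial constraint convex in (u, v), so a local solver
# (SLSQP here) finds the global optimum.
obj = lambda z: np.exp(-z[0] - z[1])
cons = [{"type": "ineq", "fun": lambda z: 1.0 - np.exp(z[0]) - np.exp(z[1])}]
res = minimize(obj, x0=[-1.0, -1.0], constraints=cons)
x, y = np.exp(res.x)
print(x, y, 1 / (x * y))  # optimum at x = y = 0.5, value 4
```

Dedicated GP solvers exploit this structure at scale; the point of the talk is that the SOS lower-bound computation above fits this tractable class.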
Theorem
Let $f$ be a polynomial of degree $2d$ and $r \in \mathbb{R}$. Suppose there exist real numbers $a_{\alpha,i}$, $\alpha \in \Delta$, $i = 1, \ldots, n$, such that
1. $a_{\alpha,i} \ge 0$, and $a_{\alpha,i} = 0$ if and only if $\alpha_i = 0$;
2. $(2d)^{2d} a_\alpha^\alpha = |f_\alpha|^{2d} \alpha^\alpha$ for each $\alpha \in \Delta$ with $|\alpha| = 2d$;
3. $f_{2d,i} \ge \sum_{\alpha \in \Delta} a_{\alpha,i}$ for $i = 1, \ldots, n$;
4. $f_0 - r \ge \sum_{\alpha \in \Delta_{<2d}} (2d - |\alpha|) \left[\frac{|f_\alpha|^{2d} \alpha^\alpha}{(2d)^{2d} a_\alpha^\alpha}\right]^{\frac{1}{2d - |\alpha|}}$.
Then $f - r$ is SOS. In particular, $f_{sos} \ge r$.
Fix a polynomial $f \in \mathbb{R}[X]$ of degree $2d$ and take
$$\phi_0(a) = \sum_{\alpha \in \Delta_{<2d}} (2d - |\alpha|) \left[\frac{|f_\alpha|^{2d} \alpha^\alpha}{(2d)^{2d} a_\alpha^\alpha}\right]^{\frac{1}{2d - |\alpha|}},$$
$$\phi_i(a) = \sum_{\alpha \in \Delta} \frac{a_{\alpha,i}}{f_{2d,i}}, \quad i = 1, \ldots, n, \qquad \psi_\alpha(a) = \frac{(2d)^{2d} a_\alpha^\alpha}{|f_\alpha|^{2d} \alpha^\alpha}, \quad \alpha \in \Delta,\; |\alpha| = 2d.$$
The $\phi_i$ are posynomials and the $\psi_\alpha$ are monomial functions. Let $f_{GP} = f_0 - r^*$, where $r^*$ is the optimum value of $\phi_0$. A special case occurs when $\{\alpha \in \Delta : |\alpha| = 2d\} = \emptyset$: then the monomial constraints are vacuous and the feasibility set is always non-empty, so $-\infty < f_{GP}$.
If $f_{2d} \in \Sigma^{\circ}_{2d,n}$ then $-\infty < f_0 - a^* \le f_{sos}$, where $a^*$ is the optimum value of the following geometric program:
$$\text{Minimize } \sum_{\alpha \in \Delta_{<2d}} (2d - |\alpha|) \left[\frac{|f_\alpha|^{2d} \alpha^\alpha}{(2d)^{2d} a_\alpha^\alpha}\right]^{\frac{1}{2d - |\alpha|}} \quad \text{subject to } \sum_{\alpha \in \Delta} \frac{a_{\alpha,i}}{\epsilon} \le 1, \; i = 1, \ldots, n.$$
Here $\epsilon > 0$ is given such that $f_{2d} - \epsilon \Big(\sum_{i=1}^n X_i^{2d}\Big) \in \Sigma_{2d,n}$.
GP vs. SDP
Table: Average running time (seconds) to calculate $f_{sos}$ / $f_{GP}$ (each cell: SDP time / GP time)

2d \ n        3             4             5              6
4        0.73 / 0.08   0.98 / 0.08   1.43 / 0.08    1.59 / 0.09
6        1 / 0.08      1.8 / 0.13    4.13 / 0.25    13.24 / 0.37
8        1.66 / 0.12   5.7 / 0.3     44.6 / 0.8     573 / 2.2
10       2.9 / 0.28    25.5 / 0.76   -              -
12       6.38 / 0.36   -             -              -
Explicit Lower Bounds
Suppose that $f(X) = \sum_{i=1}^n X_i^{2d} + \sum_{\alpha \in \Omega} f_\alpha X^\alpha + f_0$, where $|\alpha| < 2d$ for all $\alpha \in \Omega$. Let
$$r_L := f_0 - \sum_{\alpha \in \Delta_{<2d}} |f_\alpha| \frac{2d - |\alpha|}{2d}\, k^{|\alpha|}, \quad k \ge \max_{i=1,\ldots,n} C\Big(t^{2d} - \sum_{\alpha \in \Delta_{<2d}} |f_\alpha| \frac{\alpha_i}{2d}\, t^{|\alpha|}\Big);$$
$$r_{FK} := f_0 - k^{2d}, \quad k \ge C\Big(t^{2d} - \sum_{i=1}^{2d-1} b_i t^i\Big), \quad b_i := \frac{1}{2d} (2d - i)^{\frac{2d - i}{2d}} \sum_{\alpha \in \Delta,\, |\alpha| = i} |f_\alpha| (\alpha^\alpha)^{\frac{1}{2d}}, \; i = 1, \ldots, 2d - 1;$$
$$r_{dmt} := f_0 - \sum_{\alpha \in \Delta,\, |\alpha| < 2d} (2d - |\alpha|) \left[\Big(\frac{f_\alpha}{2d}\Big)^{2d} t^{|\alpha|}\, \alpha^\alpha\right]^{\frac{1}{2d - |\alpha|}}.$$
Then $r_L, r_{FK}, r_{dmt} \le f_{sos}$.
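The bound $r_L$ is fully explicit once the root $C(\cdot)$ is available. A minimal worked sketch for one concrete polynomial (the example $f$ and the sparse representation are this example's assumptions, not from the slides):

```python
import numpy as np

# Compute r_L for f = x^4 + y^4 - 4 x y + 6  (n = 2, 2d = 4),
# for which Delta consists of the single exponent (1, 1).
two_d, f0 = 4, 6.0
delta = {(1, 1): -4.0}  # exponent -> coefficient

def pos_root(coeffs):
    """Largest positive real root, coefficients highest-degree first."""
    r = np.roots(coeffs)
    return max(x.real for x in r if abs(x.imag) < 1e-9 and x.real > 0)

# k >= max_i C(t^{2d} - sum |f_a| (a_i/2d) t^{|a|})
ks = []
for i in range(2):
    c = [0.0] * (two_d + 1)
    c[0] = 1.0  # leading t^{2d}
    for a, fa in delta.items():
        c[two_d - sum(a)] -= abs(fa) * a[i] / two_d
    ks.append(pos_root(c))
k = max(ks)  # here: positive root of t^4 - t^2, i.e. k = 1

r_L = f0 - sum(abs(fa) * (two_d - sum(a)) / two_d * k ** sum(a)
               for a, fa in delta.items())
print(r_L)  # -> 4.0; here f_* = 4 as well, attained at x = y = ±1
```

For this $f$ the bound happens to be tight; in general $r_L$ is only a lower bound for $f_{sos}$, hence for $f_*$.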
References
- M. Marshall, Positive Polynomials and Sums of Squares, Mathematical Surveys and Monographs, Vol. 146, AMS, 2008.
- C. Fidalgo and A. Kovacec, Diagonal minus tail forms and Lasserre's sufficient conditions for sums of squares, Math. Z., 2010.
- M. Ghasemi and M. Marshall, Lower bounds for a polynomial in terms of its coefficients, Arch. Math. (Basel) 95 (2010), 343-353.
- J. B. Lasserre, Global optimization with polynomials and the problem of moments, SIAM J. Optim. 11 (2001), no. 3, 796-817.
- B. Reznick, Forms derived from the arithmetic geometric inequality, Math. Ann. 283 (1989), 431-464.