
From Convex Optimization to DC Optimization (凸最適化からDC最適化まで)

ざきまつ
August 17, 2025


These are slides I used in my research lab's seminar during my master's program. They give a very rough overview of convex optimization and DC optimization.



Transcript

  1. From Convex Optimization to DC Optimization. ざきまつ, Mechanical Systems Course, Master's 1st year. February 10, 2023
  2. Table of Contents: 1 Property of Convex, 2 Convex Optimization, 3 DC Optimization
  3. Property of Convex (section divider). Table of Contents: 1 Property of Convex, 2 Convex Optimization, 3 DC Optimization
  4. Property of Convex: Convex sets
     Definition (convex set): Let C be a set. C ⊂ Rn is a convex set if
     x1, x2 ∈ C, θ ∈ [0, 1]  ⇒  (1 − θ)x1 + θx2 ∈ C.
     Intuitively, every point of the line segment connecting x1 and x2 lies inside the set C.
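     A quick worked example (added for illustration, not on the original slide): the halfspace H = {x ∈ Rn : a⊤x ≤ b} is convex, since for x1, x2 ∈ H and θ ∈ [0, 1],
     a⊤((1 − θ)x1 + θx2) = (1 − θ)a⊤x1 + θa⊤x2 ≤ (1 − θ)b + θb = b,
     so the convex combination again lies in H.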
  5. Property of Convex: Convex functions
     Definition (convex function): Let f be a real-valued function. f is a convex function if its epigraph is a convex set.
     Definition (epigraph): The epigraph of f is the set
     epi f ≜ {(x, y) ∈ Rn × R : y ≥ f(x)}.
     A simple way to read these definitions:
     • the epigraph is the region on and above the graph of the function
     • if that region has no dents, f is a convex function (convex downward)
  6. Property of Convex: Convex functions
     From the two definitions above, a convex function is generally characterized by the inequality
     (1 − θ)f(x1) + θf(x2) ≥ f((1 − θ)x1 + θx2)   (1)
     Figure 2: a convex function and a non-convex function
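     A small sanity check (added here, not part of the original slides): the sketch below numerically tests inequality (1) at randomly drawn points for f(x) = x² (convex) and f(x) = sin(3x) (non-convex). The function choices, the interval [-3, 3] and the number of trials are illustrative; a numerical test like this gives evidence, not a proof.

        import numpy as np

        def satisfies_convexity_inequality(f, n_trials=10_000, lo=-3.0, hi=3.0, seed=0):
            """Numerically test inequality (1): (1-t)f(x1) + t f(x2) >= f((1-t)x1 + t x2)."""
            rng = np.random.default_rng(seed)
            x1 = rng.uniform(lo, hi, n_trials)
            x2 = rng.uniform(lo, hi, n_trials)
            theta = rng.uniform(0.0, 1.0, n_trials)
            lhs = (1 - theta) * f(x1) + theta * f(x2)      # chord value
            rhs = f((1 - theta) * x1 + theta * x2)         # value at the convex combination
            return bool(np.all(lhs >= rhs - 1e-12))        # small tolerance for floating point

        print(satisfies_convexity_inequality(lambda x: x**2))          # True: x^2 is convex
        print(satisfies_convexity_inequality(lambda x: np.sin(3 * x))) # False: sin(3x) is not convex on [-3, 3]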
  7. Property of Convex: By the way
     "If there are convex functions, aren't there concave ones too??" → Yes, there are. Of course there are.
     Definition (concave function): Let g be a real-valued function. g is a concave function if its sign flip −g is a convex function, i.e.
     (1 − θ)g(x1) + θg(x2) ≤ g((1 − θ)x1 + θx2)   (2)
     Examples of concave functions:
     • logarithm g(x) = log x
     • sine g(x) = sin x (0 ≤ x ≤ π)
     • quadratic g(x) = −x²
     • square root g(x) = √x
  8. Convex Optimization (section divider). Table of Contents: 1 Property of Convex, 2 Convex Optimization, 3 DC Optimization
  9. Convex Optimization: What is convex optimization?
     When the function f : Rn → R ∪ {∞} is a proper convex function and the set C ⊂ Rn is a closed convex set, convex optimization is the problem of finding the x ∈ C that minimizes f(x).
     A function f is called proper if there exists at least one x with f(x) < ∞ (that is, dom f ≠ ∅).
     Definition (dom f, the effective domain): the effective domain of f is the set of points of its domain at which f takes a finite real value,
     dom f ≜ {x ∈ Rn : f(x) < ∞}.
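     A standard example (added for illustration, not on the original slide): the indicator function of the feasible set,
     δ_C(x) ≜ 0 if x ∈ C, and +∞ otherwise,
     has dom δ_C = C, so it is proper exactly when C ≠ ∅; adding it to the objective turns the constrained problem above into the unconstrained problem of minimizing f(x) + δ_C(x) over all x ∈ Rn.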
  10. Convex Optimization: Representative methods (algorithms) for convex optimization
     • least squares
     • Newton's method (non-iterative)
     • active-set method
     • interior-point method
     • proximal gradient method
     • alternating direction method of multipliers (ADMM)
  11. Convex Optimization: An example algorithm, the proximal gradient method
     Consider the optimization problem formulated as
     minimize_ω  f(ω) ≜ g(ω) + h(ω)   (3)
     where g is a differentiable convex function and h is a convex function that is not necessarily differentiable.
     The proximal gradient method for problem (3) updates the point ω by
     ωk+1 = prox_γh(ωk − γ∇g(ωk)),
     prox_γh(v) ≜ argmin_ω { h(ω) + (1/(2γ))∥ω − v∥²₂ }.
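     A minimal sketch of this update (added for illustration), applied to the lasso problem minimize_ω (1/2)∥Aω − b∥²₂ + λ∥ω∥₁: here g is the smooth least-squares term, h = λ∥·∥₁, and its prox is elementwise soft-thresholding. The matrix A, vector b, λ, the step size γ = 1/L and the iteration count are illustrative choices, not taken from the slides.

        import numpy as np

        def soft_threshold(v, tau):
            """prox of tau*||.||_1 : elementwise soft-thresholding."""
            return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

        def proximal_gradient_lasso(A, b, lam, n_iter=500):
            """Minimize 0.5*||A w - b||^2 + lam*||w||_1 with
            w_{k+1} = prox_{gamma*h}(w_k - gamma * grad g(w_k))."""
            gamma = 1.0 / np.linalg.norm(A, 2) ** 2      # step 1/L, L = Lipschitz constant of grad g
            w = np.zeros(A.shape[1])
            for _ in range(n_iter):
                grad_g = A.T @ (A @ w - b)               # gradient of the smooth part g
                w = soft_threshold(w - gamma * grad_g, gamma * lam)  # prox step on h = lam*||.||_1
            return w

        # toy usage with random data (illustrative only)
        rng = np.random.default_rng(0)
        A = rng.standard_normal((50, 20))
        w_true = np.zeros(20); w_true[:3] = [2.0, -1.5, 1.0]
        b = A @ w_true + 0.01 * rng.standard_normal(50)
        print(np.round(proximal_gradient_lasso(A, b, lam=0.5), 2))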
  12. Convex Optimization: Summary of convex optimization
     • Convex optimization is an optimization problem whose objective function is a convex function.
     • In convex optimization, a globally optimal solution can be found even using only gradient (local) information.
     • Specialized optimization methods allow the computation to be carried out quickly:
     ▶ least squares  ▶ interior-point method  ▶ proximal gradient method  ▶ alternating direction method of multipliers, etc.
  13. DC Optimization (section divider). Table of Contents: 1 Property of Convex, 2 Convex Optimization, 3 DC Optimization
  14. DC Optimization: About DC optimization
     DC optimization is a methodology for minimizing non-convex optimization problems whose objective function is expressed as a difference of convex functions.
     Using two convex functions g, h : Rn → R ∪ {∞}, the problem is written as
     minimize_ω  g(ω) − h(ω)   (4)
     For a function f : Rn → R ∪ {∞}, if there exist two closed convex functions g, h such that
     f(ω) = g(ω) − h(ω),  ∀ω ∈ Rn   (5)
     then f is called a DC function, and (5) is called a DC representation (DC decomposition) of f.
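     A concrete example (added for illustration, not taken from the slides): least squares with the ℓ1 − ℓ2 sparse regularizer is a DC function, since both parts below are convex:
     f(ω) = (1/2)∥Aω − b∥²₂ + λ(∥ω∥₁ − ∥ω∥₂) = [ (1/2)∥Aω − b∥²₂ + λ∥ω∥₁ ] − [ λ∥ω∥₂ ] = g(ω) − h(ω),
     with g(ω) = (1/2)∥Aω − b∥²₂ + λ∥ω∥₁ and h(ω) = λ∥ω∥₂.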
  15. DC Optimization: About DC optimization
     Why does this become a non-convex optimization problem?
     • The sum of convex functions is convex.
     • The difference of convex functions, however, is not necessarily convex.
     An optimization problem that takes the difference of convex functions is therefore non-convex in general, and finding a globally optimal solution is difficult.
     The DC algorithm (DCA) converts the DC optimization problem into convex optimization problems, so a solution can be obtained with convex optimization methods.
     Note that many functions are DC functions, for example any twice-differentiable function.
     → Optimization problems from a wide range of classes can be expressed as DC optimization problems.
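     A one-line example (added for illustration): take g(ω) = 0 and h(ω) = ∥ω∥²₂, both convex; their difference g(ω) − h(ω) = −∥ω∥²₂ is concave, hence non-convex.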
  16. DC Optimization: DC Algorithm
     Algorithm 1: solve Problem (4) with the DCA
     1: Choose a suitable initial point ω0.
     2: for k = 0, 1, 2, . . . do
     3:   Compute a subgradient sk ∈ ∂h(ωk) of h at the current iterate ωk.
     4:   ωk+1 = argmin_ω ( g(ω) − sk⊤ω )
     5: end for
     ∂h(ωk) is the subdifferential of h at the iterate ωk,
     ∂h(ωk) ≜ {s ∈ Rn : h(ω) ≥ h(ωk) + s⊤(ω − ωk) for all ω ∈ Rn}.
     The subdifferential extends the notion of the derivative to non-differentiable functions (such as the absolute value function).
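     A minimal sketch of the DCA (added for illustration), applied to the ℓ1 − ℓ2 regularized least-squares problem used as the DC example above, with g(ω) = (1/2)∥Aω − b∥²₂ + λ∥ω∥₁ and h(ω) = λ∥ω∥₂. The convex subproblem in line 4 is only solved approximately here, by an inner proximal gradient loop; that inner solver, the iteration counts and the toy data are all illustrative choices.

        import numpy as np

        def soft_threshold(v, tau):
            return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

        def dca_l1_minus_l2(A, b, lam, n_outer=30, n_inner=200):
            """DCA for min_w g(w) - h(w) with
            g(w) = 0.5*||A w - b||^2 + lam*||w||_1  (convex, handled by an inner solver)
            h(w) = lam*||w||_2                      (convex, linearized via a subgradient)."""
            n = A.shape[1]
            gamma = 1.0 / np.linalg.norm(A, 2) ** 2          # inner step size 1/L
            w = np.zeros(n)
            for _ in range(n_outer):
                norm_w = np.linalg.norm(w)
                s = lam * w / norm_w if norm_w > 0 else np.zeros(n)   # s_k in subdifferential of h at w_k
                # inner loop: approximately solve the convex subproblem argmin_w g(w) - s^T w
                # by proximal gradient on the smooth part 0.5*||A w - b||^2 - s^T w
                for _ in range(n_inner):
                    grad = A.T @ (A @ w - b) - s
                    w = soft_threshold(w - gamma * grad, gamma * lam)
            return w

        # toy usage (illustrative only)
        rng = np.random.default_rng(1)
        A = rng.standard_normal((40, 15))
        w_true = np.zeros(15); w_true[[0, 5]] = [1.0, -2.0]
        b = A @ w_true
        print(np.round(dca_l1_minus_l2(A, b, lam=0.3), 2))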
  17. DC Optimization: Drawback of the DC Algorithm
     ωk+1 ∈ argmin_ω ( g(ω) − sk⊤ω )   (6)
     The convex optimization problem (6) is contained as a subproblem, and its optimal solution must be computed at every iteration.
     For large-scale or complicated problems, this can therefore lead to an enormous computational cost.
     Figure 4: flow of the DCA
  18. DC Optimization: pDCAε
     Recently, methods that exploit problem structure to make each iteration cheaper have been proposed.
     pDCAε: an improved version of the DCA that combines it with the proximal gradient method and Nesterov extrapolation.
     Algorithm 2: solve Problem (4) with pDCAε
     1: Choose {βk} ⊂ [0, 1) with sup_k βk < 1, and set ω−1 = ω0.
     2: for k = 0, 1, 2, . . . do
     3:   Compute a subgradient sk ∈ ∂h(ωk) of h at the current iterate ωk.
     4:   yk = ωk + βk(ωk − ωk−1)
     5:   ωk+1 = argmin_y { (∇f(yk) − sk)⊤y + (L/2)∥y − yk∥²₂ + g(y) }
     6: end for
     (Here, following the cited paper by Wen, Chen and Pong (2018), the objective is understood as f(ω) + g(ω) − h(ω) with f smooth; L is a Lipschitz constant of ∇f, and the βk are extrapolation weights.)
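     A minimal sketch of a pDCAε-style iteration (added for illustration), again on the ℓ1 − ℓ2 regularized least-squares problem, with the smooth part f(ω) = (1/2)∥Aω − b∥²₂, g = λ∥·∥₁ and h = λ∥·∥₂. The fixed extrapolation schedule, step size and toy data are illustrative choices, not taken from the slides or from the paper.

        import numpy as np

        def soft_threshold(v, tau):
            return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

        def pdca_e(A, b, lam, n_iter=300):
            """pDCAε-style sketch for min 0.5*||A w - b||^2 + lam*(||w||_1 - ||w||_2):
            f(w) = 0.5*||A w - b||^2 (smooth), g = lam*||.||_1, h = lam*||.||_2."""
            n = A.shape[1]
            L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of grad f
            w_prev = w = np.zeros(n)
            for k in range(n_iter):
                beta = min(k / (k + 3.0), 0.98)           # extrapolation weights, sup_k beta_k < 1
                y = w + beta * (w - w_prev)               # extrapolated point y_k
                norm_w = np.linalg.norm(w)
                s = lam * w / norm_w if norm_w > 0 else np.zeros(n)   # s_k in subdifferential of h at w_k
                grad_f = A.T @ (A @ y - b)                # grad f(y_k)
                w_prev = w
                # closed form of line 5: argmin_y (grad f(y_k)-s_k)^T y + (L/2)||y-y_k||^2 + lam*||y||_1
                w = soft_threshold(y - (grad_f - s) / L, lam / L)
            return w

        # toy usage (illustrative only)
        rng = np.random.default_rng(2)
        A = rng.standard_normal((40, 15))
        w_true = np.zeros(15); w_true[[2, 7]] = [1.5, -1.0]
        b = A @ w_true
        print(np.round(pdca_e(A, b, lam=0.3), 2))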
  19. DC Optimization: Summary of DC optimization
     • A methodology for non-convex optimization problems whose objective is expressed as the difference of two convex functions.
     • With the DCA, the non-convex optimization problem can be converted into convex optimization problems.
     • Many functions are DC functions, so optimization problems from a wide range of classes can be expressed as DC optimization problems.
     • However, depending on the problem, the computational cost can be large.
     • Recently, methods aiming to reduce this cost have been proposed.
  20. References
     • 寒野善博, 土谷隆, 『東京大学工学教程 基礎系 数学 最適化と変分法』, 丸善出版, 2014. (in Japanese)
     • 永原正章, 『スパースモデリング―基礎から動的システムの応用―』, コロナ社, 2017. (in Japanese)
     • 伊藤勝, 「凸最適化問題に対する一次法とその理論：加速勾配法とその周辺」, オペレーションズ・リサーチ (Communications of the Operations Research Society of Japan), 64(6), pp. 326-334, 2019. (in Japanese)
     • 小野峻佑, 「近接分離アルゴリズムとその応用：信号処理・画像処理的観点から」, オペレーションズ・リサーチ (Communications of the Operations Research Society of Japan), 64(6), pp. 316-325, 2019. (in Japanese)
     • 小野峻佑, 「近接分離による分散凸最適化―交互方向乗数法に基づくアプローチを中心として―」, 計測と制御, 55(11), pp. 954-959, 2016. (in Japanese)
     • 田中未来, 奥野貴之, 「DC最適化の理論と応用」, 応用数理, 29(6), pp. 14-23, 2019. (in Japanese)
     • 後藤順哉, 武田朗子, 「DCアプローチに基づくスパース最適化」, オペレーションズ・リサーチ (Communications of the Operations Research Society of Japan), 64(6), pp. 352-359, 2019. (in Japanese)
     • B. Wen, X. Chen, and T. K. Pong, "A proximal difference-of-convex algorithm with extrapolation", Computational Optimization and Applications, 69, pp. 297-324, 2018.