Datamining 6th Svm
k-NN (recap)
• k-NN predicts the label (Yes / No) of each test example by a majority
  vote over its nearest neighbors in the Training Data.
• The model is built from the Training Data and evaluated on the Test Data.
Linearly Separable Data
• Each training example is a feature vector xi with a label yi that is
  either 1 or −1:

      (xi, yi)   (i = 1, ..., l,  xi ∈ R^n,  yi ∈ {1, −1})

• The data are linearly separable if there exist w and b such that

      yi(w · xi + b) > 0   (i = 1, ..., l)
Linear Discriminant Function
• Classify a point x by which side of the hyperplane w · x + b = 0 it
  falls on:

      d(x) = 1    if w · x + b ≥ 0
      d(x) = −1   otherwise

• Learning amounts to choosing w and b from the training data.
  • The hyperplane w · x + b = 0 is the decision boundary.
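The decision rule above is just the sign of w · x + b; a minimal sketch in plain Python (the weights and points below are made-up numbers):

```python
def d(x, w, b):
    """Linear discriminant: +1 if w . x + b >= 0, else -1."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s >= 0 else -1

# Hypothetical boundary x1 + x2 - 1 = 0
w, b = [1.0, 1.0], -1.0
print(d([2.0, 2.0], w, b))   # 1  (above the boundary)
print(d([0.0, 0.0], w, b))   # -1 (below the boundary)
```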
Fisher's Linear Discriminant (1)
• Project the two classes onto a direction w and separate them by a
  hyperplane.
  • The decision boundary is the hyperplane w · x + b = 0.
  • w is the normal vector of that boundary.
Fisher's Linear Discriminant (2)
• Class means m+ and m−:

      m+ = Σ_{d(x)=1} x / |{x | d(x) = 1}|
      m− = Σ_{d(x)=−1} x / |{x | d(x) = −1}|

• Between-class separation of the projected means: |(m+ − m−) · w|
  • We want this to be large.
• Within-class scatter of the projections onto w:

      Σ_{d(x)=1} ((x − m+) · w)²  +  Σ_{d(x)=−1} ((x − m−) · w)²

  • We want this to be small.
Fisher's Linear Discriminant (3)
• Combine the two goals into a single criterion.
• Maximize J(w) with respect to w, under |w| = 1:
  • Only the direction of w matters for the projection w · x + b.
  • b does not appear in J(w), so it is fixed afterwards.

      J(w) = |(m+ − m−) · w|²
             / [ Σ_{d(x)=1} ((x − m+) · w)²  +  Σ_{d(x)=−1} ((x − m−) · w)² ]

  To maximize J(w), set its derivative with respect to w to 0.
Fisher's Linear Discriminant (4)
Rewrite J(w) with scatter matrices:

      J(w) = (wᵀ S_B w) / (wᵀ S_W w)

      S_B = (m+ − m−)(m+ − m−)ᵀ
      S_W = Σ_{d(x)=1} (x − m+)(x − m+)ᵀ  +  Σ_{d(x)=−1} (x − m−)(x − m−)ᵀ

Set ∂J(w)/∂w = 0, using the quotient rule (f/g)′ = (f′g − f g′)/g²:

      (wᵀ S_B w) S_W w = (wᵀ S_W w) S_B w

Both quadratic forms are scalars, and S_B w is parallel to m+ − m−, so

      w ∝ S_W⁻¹ (m+ − m−)

(assuming S_W is invertible).
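A small numeric sketch of the closed form w ∝ S_W⁻¹(m+ − m−), on a hypothetical 2-D two-class dataset in plain Python (the points are made up; the 2×2 system is solved with the explicit inverse):

```python
pos = [[2.0, 2.0], [3.0, 3.0], [3.0, 2.0]]   # class +1
neg = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]   # class -1

def mean(pts):
    n = len(pts)
    return [sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n]

def scatter(pts, m):
    # sum of (x - m)(x - m)^T as a 2x2 matrix
    s = [[0.0, 0.0], [0.0, 0.0]]
    for p in pts:
        dv = [p[0] - m[0], p[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += dv[i] * dv[j]
    return s

mp, mn = mean(pos), mean(neg)
sp, sn = scatter(pos, mp), scatter(neg, mn)
SW = [[sp[i][j] + sn[i][j] for j in range(2)] for i in range(2)]

# Solve SW w = (m+ - m-) via the explicit 2x2 inverse
det = SW[0][0] * SW[1][1] - SW[0][1] * SW[1][0]
dm = [mp[0] - mn[0], mp[1] - mn[1]]
w = [(SW[1][1] * dm[0] - SW[0][1] * dm[1]) / det,
     (-SW[1][0] * dm[0] + SW[0][0] * dm[1]) / det]

# The projected class means should be well separated along w
proj_p = sum(wi * mi for wi, mi in zip(w, mp))
proj_n = sum(wi * mi for wi, mi in zip(w, mn))
print(w, proj_p > proj_n)
```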
SVM (Support Vector Machine)
• Many hyperplanes can separate linearly separable data.
  • Which one should generalize best to unseen examples?
• SVM chooses the hyperplane that maximizes the margin: the distance to
  the closest training examples (the support vectors).
Margin
• The margin ρ(w, b) is the gap between the closest positive and closest
  negative examples, measured along w:

      ρ(w, b) = min_{xi: yi=1} (xi · w)/|w|  −  max_{xi: yi=−1} (xi · w)/|w|
Canonical Hyperplane
Rescale to w0, b0 so that the examples closest to the boundary satisfy
w0 · x + b0 = ±1. Then

      ρ(w0, b0) = min_{xi: yi=1} (xi · w0)/|w0|  −  max_{xi: yi=−1} (xi · w0)/|w0|
                = (1 − b0)/|w0| − (−1 − b0)/|w0|
                = 2/|w0|
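A quick numeric check that ρ(w0, b0) = 2/|w0| on a hypothetical canonical pair: the closest positive and negative points are placed exactly on w0 · x + b0 = +1 and −1.

```python
import math

w0, b0 = [3.0, 4.0], 0.0
norm = math.hypot(*w0)                        # |w0| = 5
x_pos = [w0[0] / norm**2, w0[1] / norm**2]    # w0 . x_pos + b0 = +1
x_neg = [-x_pos[0], -x_pos[1]]                # w0 . x_neg + b0 = -1

dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
rho = dot(x_pos, w0) / norm - dot(x_neg, w0) / norm
print(rho, 2 / norm)   # both 0.4
```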
• Maximizing the margin 2/|w0| is equivalent to minimizing w0 · w0
  subject to

      yi(w0 · xi + b) ≥ 1   (i = 1, ..., l)

• This is a quadratic program:
  • the objective w0 · w0 is quadratic (degree 2),
  • the constraints are linear (degree 1).
• Because the quadratic objective is convex, the problem has a unique
  minimum.
Solving the Optimization Problem (1)
Minimize w0 · w0 subject to

      yi(w0 · xi + b) ≥ 1   (i = 1, ..., l)                (1)

Introduce Lagrange multipliers Λ = (α1, ..., αl) with αi ≥ 0:

      L(w, b, Λ) = |w|²/2 − Σ_{i=1}^{l} αi (yi(xi · w + b) − 1)

• Minimize L with respect to w and b, and maximize it with respect to Λ.
Solving the Optimization Problem (2)
• At the optimum w = w0, b = b0, the partial derivatives of L(w, b, Λ)
  vanish:

      ∂L(w, b, Λ)/∂w |_{w=w0} = w0 − Σ_{i=1}^{l} αi yi xi = 0
      ∂L(w, b, Λ)/∂b |_{b=b0} = −Σ_{i=1}^{l} αi yi = 0          (2)

  hence

      w0 = Σ_{i=1}^{l} αi yi xi,    Σ_{i=1}^{l} αi yi = 0

• Substituting w = w0, b = b0 back into L:

      L(w0, b0, Λ) = ½ w0 · w0 − Σ_{i=1}^{l} αi [yi(xi · w0 + b0) − 1]
                   = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj (xi · xj)

• w and b have been eliminated; only Λ remains.
SVM Dual Problem
• Maximize over Λ, subject to Σ_{i=1}^{l} αi yi = 0 and αi ≥ 0:   (3)

      L(w0, b0, Λ) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj (xi · xj)

• This is the dual form of SVM training.
• w0 is recovered from Λ:
  • by (2), w0 = Σ_{i=1}^{l} αi yi xi.
• By (2), the examples xi with αi ≠ 0 are the support vectors; they lie
  exactly on the margin, by the KKT condition:
  • KKT condition: αi [yi(xi · w0 + b0) − 1] = 0
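The dual objective can be evaluated directly; a toy sketch in plain Python (the two points, labels, and α values are made up, though this α happens to be the exact maximizer for this particular pair):

```python
X = [[0.0, 0.0], [2.0, 2.0]]
y = [-1, 1]
alpha = [0.25, 0.25]   # satisfies sum(alpha_i * y_i) = 0

dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))

# L_D = sum(alpha_i) - 1/2 * sum_ij alpha_i alpha_j y_i y_j (x_i . x_j)
LD = sum(alpha) - 0.5 * sum(
    alpha[i] * alpha[j] * y[i] * y[j] * dot(X[i], X[j])
    for i in range(len(X)) for j in range(len(X)))
print(LD)   # 0.25
```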
Nonlinear Extension (Feature Mapping)
• Linearly non-separable data may become separable after mapping into a
  higher-dimensional feature space.
• In the dual objective, the data appear only through inner products:

      L(w0, b0, Λ) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj (xi · xj)

• Map each x to Φ(x) and replace xi · xj with Φ(xi) · Φ(xj):

      L(w0, b0, Λ) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj Φ(xi) · Φ(xj)

• The decision boundary also needs only inner products:

      Φ(x) · w0 + b0 = Σ_{i=1}^{l} αi yi Φ(x) · Φ(xi) + b0 = 0

• So Φ itself never has to be computed explicitly; only the inner
  products Φ(x) · Φ(y) do.
Kernel Functions
• A kernel computes the feature-space inner product directly:
  K(x, y) = Φ(x) · Φ(y)
• Example: Φ((x1, x2)) = (x1², √2 x1x2, x2², √2 x1, √2 x2, 1). Then

      Φ((x1, x2)) · Φ((y1, y2))
        = (x1 y1)² + 2 x1 y1 x2 y2 + (x2 y2)² + 2 x1 y1 + 2 x2 y2 + 1
        = (x1 y1 + x2 y2 + 1)²
        = ((x1, x2) · (y1, y2) + 1)²

  • One 2-D inner product replaces a 6-dimensional one.
• Commonly used kernels:
  • Polynomial: (x · y + 1)^d
  • RBF (Gaussian): exp(−||x − y||² / 2σ²)
  • Sigmoid: tanh(κ x · y − δ)
    • σ, κ, δ are user-chosen parameters.
    • A valid kernel must correspond to some Φ (Mercer's condition).
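Checking numerically that Φ(x) · Φ(y) = (x · y + 1)² for the 2-D mapping above (the test points are arbitrary made-up values):

```python
import math

def phi(x1, x2):
    r2 = math.sqrt(2.0)
    return [x1 * x1, r2 * x1 * x2, x2 * x2, r2 * x1, r2 * x2, 1.0]

def poly_kernel(x, y):
    return (x[0] * y[0] + x[1] * y[1] + 1.0) ** 2

x, y = (1.0, 2.0), (3.0, -1.0)
lhs = sum(a * b for a, b in zip(phi(*x), phi(*y)))   # 6-D inner product
rhs = poly_kernel(x, y)                               # 2-D kernel
print(lhs, rhs)   # both 4.0
```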
Soft Margin
• Real data are often not perfectly separable, even in feature space.
• Allow some examples to violate the margin:
  • introduce a slack variable ξi for each example:

      yi(w · xi + b) ≥ 1 − ξi
        where ξi ≥ 0  (i = 1, ..., l)

• Minimize the margin term plus a penalty on the total slack:

      ½ w · w + C Σ_{i=1}^{l} ξi
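A sketch of evaluating the soft-margin objective in plain Python. The data, w, b, and C below are made-up numbers; each slack is taken at its smallest feasible value ξi = max(0, 1 − yi(w · xi + b)), which is what the optimizer would choose.

```python
X = [[0.0, 0.0], [2.0, 2.0], [1.2, 1.2]]   # last point violates the margin
y = [-1, 1, -1]
w, b, C = [0.5, 0.5], -1.0, 1.0

dot = lambda a, v: sum(ai * vi for ai, vi in zip(a, v))

# Smallest feasible slack for each example
xi = [max(0.0, 1.0 - yi * (dot(x, w) + b)) for x, yi in zip(X, y)]
obj = 0.5 * dot(w, w) + C * sum(xi)
print(xi, obj)   # slacks ~ [0, 0, 1.2], objective ~ 1.45
```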
Soft-Margin Optimization (1)
• Introduce multipliers Λ = (α1, ..., αl) for the margin constraints and
  R = (r1, ..., rl) for the constraints ξi ≥ 0. The Lagrangian L is

      L(w, ξ, b, Λ, R) = ½ w · w + C Σ_{i=1}^{l} ξi
                         − Σ_{i=1}^{l} αi [yi(xi · w + b) − 1 + ξi]
                         − Σ_{i=1}^{l} ri ξi

At the optimum w0, b0, ξi⁰, the derivatives of L with respect to
w, b, ξi vanish (KKT):

      ∂L(w, ξ, b, Λ, R)/∂w |_{w=w0}  = w0 − Σ_{i=1}^{l} αi yi xi = 0
      ∂L(w, ξ, b, Λ, R)/∂b |_{b=b0}  = −Σ_{i=1}^{l} αi yi = 0
      ∂L(w, ξ, b, Λ, R)/∂ξi |_{ξ=ξi⁰} = C − αi − ri = 0
Soft-Margin Optimization (2)
• Substituting back, the dual objective is the same as before; C and ξ
  drop out:

      L(w, ξ, b, Λ, R) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj (xi · xj)

• The difference from the hard-margin SVM is the constraint on αi:
  • each αi is bounded above by C;
  • C controls the penalty for margin violations.
• From C − αi − ri = 0 and ri ≥ 0, it follows that 0 ≤ αi ≤ C.

Dual problem: maximize over Λ, subject to

      Σ_{i=1}^{l} αi yi = 0,   0 ≤ αi ≤ C
Reference: Karush-Kuhn-Tucker (KKT) Conditions
• Optimality conditions for constrained optimization.
• Minimize f(x) subject to gi(x) ≤ 0, where x = (x1, x2, ..., xn).
• KKT conditions:

      ∂f(x)/∂xj + Σ_{i=1}^{m} λi ∂gi(x)/∂xj = 0,   j = 1, 2, ..., n
      λi gi(x) = 0,  λi ≥ 0,  gi(x) ≤ 0,            i = 1, 2, ..., m

• If f(x) and the gi(x) are convex, any (x, λ) satisfying the KKT
  conditions minimizes f(x).
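A one-variable sanity check of the KKT conditions: minimize f(x) = x² subject to g(x) = 1 − x ≤ 0. The optimum is x = 1 with multiplier λ = 2, and all four conditions hold there.

```python
f_grad = lambda x: 2 * x     # df/dx
g_grad = lambda x: -1.0      # dg/dx
g = lambda x: 1.0 - x

x, lam = 1.0, 2.0
stationarity = f_grad(x) + lam * g_grad(x)   # 2 - 2 = 0
print(stationarity, lam * g(x), lam >= 0, g(x) <= 0)
```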
SMO (Sequential Minimal Optimization)
• A standard algorithm for solving the SVM dual problem.
• Directly optimizing all of Λ = (α1, α2, ..., αl) at once is expensive.
• The number of αi equals the number of training examples.
  • With 6000 examples there are 6000 variables to optimize jointly.
• SMO instead repeatedly picks a pair (αi, αj) and optimizes just those
  two, holding the rest fixed.
  • The two-variable subproblem can be solved analytically.
• Iterating this to convergence is the SMO algorithm.

• The objective being maximized is the dual L_D:

      L_D = L(w, ξ, b, Λ, R) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj (xi · xj)
Two-Variable Subproblem (1)
• Fix all multipliers except α1, α2 and maximize L_D over them.
• Update the old values α1_old, α2_old to new values α1_new, α2_new.

Define

      Ei ≡ w_old · xi + b_old − yi
      η ≡ 2K12 − K11 − K22,  where Kij = xi · xj

Setting the derivative L_D′ with respect to α2 to 0 gives

      α2_new = α2_old − y2(E1_old − E2_old)/η

subject to the linear constraint implied by Σ_{i=1}^{l} αi yi = 0:

      γ ≡ α1 + s α2 = const,  where s = y1 y2

Note that η = 2K12 − K11 − K22 = −|x2 − x1|² ≤ 0.
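One analytic α2 update, worked through on a hypothetical two-point problem starting from α = (0, 0) and b = 0. With only two multipliers, the constraint line is the whole feasible set, so this single step already lands on the pair's optimum.

```python
K = lambda a, b: sum(p * q for p, q in zip(a, b))
x1, y1, x2, y2 = [0.0, 0.0], -1, [2.0, 2.0], 1

# E_i = w_old . x_i + b_old - y_i; with alpha = 0 the weight vector is 0
E1, E2 = 0.0 - y1, 0.0 - y2                   # 1.0, -1.0
eta = 2 * K(x1, x2) - K(x1, x1) - K(x2, x2)   # -8
a2_new = 0.0 - y2 * (E1 - E2) / eta           # 0 - 1*2/(-8) = 0.25
print(a2_new)   # 0.25; gamma with s = y1*y2 = -1 then gives a1 = 0.25
```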
Two-Variable Subproblem (2)
• α1 and α2 must stay on the line γ ≡ α1 + s α2 = const.
• In addition, the new values α1_new, α2_new must stay inside the box
  [0, C] × [0, C].
  • So α2_new is clipped to the feasible segment of the line, giving
    α2_clipped.
Two-Variable Subproblem (3)
When y1 = y2 (s = 1):

      L = max(0, α1_old + α2_old − C),   H = min(C, α1_old + α2_old)

When y1 ≠ y2 (s = −1):

      L = max(0, α2_old − α1_old),   H = min(C, C + α2_old − α1_old)

The feasible segment is L ≤ α2 ≤ H; L and H follow from s, γ, and the
box [0, C]. Clip the unconstrained update into it:

      α2_clipped = H         if α2_new ≥ H
                 = α2_new    if L < α2_new < H
                 = L         if α2_new ≤ L

The clipped value maximizes L_D on the feasible segment.
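The box bounds and the clipping step above, as two small helpers (the sample values fed to them are made up):

```python
def bounds(a1_old, a2_old, y1, y2, C):
    """Feasible segment [L, H] for alpha2 on the constraint line."""
    if y1 == y2:                        # s = 1
        return max(0.0, a1_old + a2_old - C), min(C, a1_old + a2_old)
    else:                               # s = -1
        return max(0.0, a2_old - a1_old), min(C, C + a2_old - a1_old)

def clip(a2_new, L, H):
    """alpha2_clipped: project the unconstrained optimum into [L, H]."""
    return min(H, max(L, a2_new))

L, H = bounds(0.25, 0.75, 1, -1, 1.0)   # y1 != y2: L = 0.5, H = 1.0
print(L, H, clip(1.4, L, H))            # 0.5 1.0 1.0
```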
• Whether the unconstrained optimum falls inside L ≤ α2 ≤ H determines
  the result: inside the segment, α2_clipped = α2_new; outside it (the
  cases sketched in the original figures (A)-(D)), α2_clipped is the
  nearer endpoint L or H.
  (Figure legend: (α1_new, α2_new) marks the unconstrained optimum;
   (α1_new, α2_clipped) marks the feasible solution.)
Two-Variable Update Procedure
 1. Compute η = 2K12 − K11 − K22.
 2. If η < 0, update the multipliers:
    (a) α2_new = α2_old + y2(E2_old − E1_old)/η
    (b) clip α2_new to [L, H], giving α2_clipped
    (c) α1_new = α1_old − s(α2_clipped − α2_old)
 3. If η = 0, L_D is linear in α2: evaluate it at the endpoints L and H,
    keep the value giving the larger L_D, then compute α1 as in 2(c).
 4. Update the remaining quantities from the new α1, α2:
    • choose b_new so that E_new = 0 on an updated support vector.

      w_new = w_old + (α1_new − α1_old) y1 x1 + (α2_clipped − α2_old) y2 x2

      E_new(x, y) = E_old(x, y) + y1(α1_new − α1_old) x1 · x
                    + y2(α2_clipped − α2_old) x2 · x − b_old + b_new

      b_new = b_old − E_old(x, y) − y1(α1_new − α1_old) x1 · x
              − y2(α2_clipped − α2_old) x2 · x
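The whole update loop can be sketched compactly. This is a simplified SMO in plain Python: it sweeps over all pairs (i, j) and applies the η / clipping / b-update steps above, with a linear kernel. The toy data, C, tolerance, and pass count are made up, pair selection is brute force rather than the heuristics described later, and η = 0 pairs are simply skipped.

```python
X = [[0.0, 0.0], [0.0, 1.0], [2.0, 2.0], [3.0, 2.0]]
y = [-1, -1, 1, 1]
C, tol, max_passes = 10.0, 1e-9, 100

K = lambda a, b: sum(p * q for p, q in zip(a, b))   # linear kernel
n = len(X)
alpha, b = [0.0] * n, 0.0

def f(x):
    """Current decision value: sum_k alpha_k y_k K(x_k, x) + b."""
    return sum(alpha[k] * y[k] * K(X[k], x) for k in range(n)) + b

for _ in range(max_passes):
    changed = False
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            Ei, Ej = f(X[i]) - y[i], f(X[j]) - y[j]
            s = y[i] * y[j]
            if s == 1:
                L, H = max(0.0, alpha[i] + alpha[j] - C), min(C, alpha[i] + alpha[j])
            else:
                L, H = max(0.0, alpha[j] - alpha[i]), min(C, C + alpha[j] - alpha[i])
            eta = 2 * K(X[i], X[j]) - K(X[i], X[i]) - K(X[j], X[j])
            if eta >= 0 or L >= H:
                continue
            aj = alpha[j] - y[j] * (Ei - Ej) / eta   # unconstrained optimum
            aj = min(H, max(L, aj))                  # clip to [L, H]
            if abs(aj - alpha[j]) < tol:
                continue
            ai = alpha[i] + s * (alpha[j] - aj)
            # b_new chosen so that E becomes 0 on example i
            b = b - Ei - y[i] * (ai - alpha[i]) * K(X[i], X[i]) \
                       - y[j] * (aj - alpha[j]) * K(X[i], X[j])
            alpha[i], alpha[j] = ai, aj
            changed = True
    if not changed:
        break

preds = [1 if f(x) >= 0 else -1 for x in X]
print(preds)   # the training labels should be recovered
```

Note that b cancels in E1 − E2, so the α updates form a pure coordinate ascent on L_D; b only matters for the final predictions.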
Choosing αi
• Heuristics decide which pair α1, α2 to optimize at each step.
• Choosing α1:
  • scan for examples that violate the KKT conditions; a KKT violator
    can always make progress;
  • prefer the non-bound examples:
    • those with 0 < αi < C,
    • since they are the ones most likely to need adjustment.
• Choosing α2:
  • pick the α2 expected to give the largest increase in L_D;
  • in practice, the one maximizing the step-size proxy |E1 − E2|:
    • if E1 is positive, choose the example with the smallest E2, and
      if E1 is negative, the largest.
Training an SVM with SMO
• Initialize all multipliers αi to 0.
• Repeat:
  • the examples with α ≠ 0 are the current support-vector candidates;
  • choose α1 by the KKT-violation heuristic, then choose a second
    multiplier α2;
    • solve the two-variable subproblem analytically;
  • update the pair of α's:
    • picking α2 to maximize |E2 − E1|.
• Stop when L_D no longer increases and the KKT conditions hold.
Extensions
• Classification into three or more classes (multiclass).
  • Combine two-class SVMs, e.g. one classifier for each pair of classes
    A vs. B, and decide by voting over the pairwise results.
• Predicting a numeric value.
  • Regression (regression problem): support vector regression.
  • Alternatively, discretize a 0-100 target into bins 0-10, 10-20, ...
    and classify.
• Learning from examples of only one class.
  • Example: given 100 Web pages a user liked, find other pages the user
    will like.
    • There are no explicit negative examples.
  • One-Class SVM handles this setting.

Más contenido relacionado

La actualidad más candente

ตัวอย่างข้อสอบเก่า วิชาคณิตศาสตร์ ม.6 ปีการศึกษา 2553
ตัวอย่างข้อสอบเก่า วิชาคณิตศาสตร์ ม.6 ปีการศึกษา 2553ตัวอย่างข้อสอบเก่า วิชาคณิตศาสตร์ ม.6 ปีการศึกษา 2553
ตัวอย่างข้อสอบเก่า วิชาคณิตศาสตร์ ม.6 ปีการศึกษา 2553
Destiny Nooppynuchy
 
Emat 213 midterm 2 fall 2005
Emat 213 midterm 2 fall 2005Emat 213 midterm 2 fall 2005
Emat 213 midterm 2 fall 2005
akabaka12
 
修士論文発表会
修士論文発表会修士論文発表会
修士論文発表会
Keikusl
 
分かりやすいパターン認識第8章 学習アルゴリズムの一般化
分かりやすいパターン認識第8章 学習アルゴリズムの一般化分かりやすいパターン認識第8章 学習アルゴリズムの一般化
分かりやすいパターン認識第8章 学習アルゴリズムの一般化
Yohei Sato
 
X2 T06 02 Cylindrical Shells
X2 T06 02 Cylindrical ShellsX2 T06 02 Cylindrical Shells
X2 T06 02 Cylindrical Shells
Nigel Simmons
 
Solution 3 3
Solution 3 3Solution 3 3
Solution 3 3
usepnuh
 
Nonlinear Filtering and Path Integral Method (Paper Review)
Nonlinear Filtering and Path Integral Method (Paper Review)Nonlinear Filtering and Path Integral Method (Paper Review)
Nonlinear Filtering and Path Integral Method (Paper Review)
Kohta Ishikawa
 

La actualidad más candente (20)

One way to see higher dimensional surface
One way to see higher dimensional surfaceOne way to see higher dimensional surface
One way to see higher dimensional surface
 
ตัวอย่างข้อสอบเก่า วิชาคณิตศาสตร์ ม.6 ปีการศึกษา 2553
ตัวอย่างข้อสอบเก่า วิชาคณิตศาสตร์ ม.6 ปีการศึกษา 2553ตัวอย่างข้อสอบเก่า วิชาคณิตศาสตร์ ม.6 ปีการศึกษา 2553
ตัวอย่างข้อสอบเก่า วิชาคณิตศาสตร์ ม.6 ปีการศึกษา 2553
 
Digital fiiter
Digital fiiterDigital fiiter
Digital fiiter
 
Mesh Processing Course : Multiresolution
Mesh Processing Course : MultiresolutionMesh Processing Course : Multiresolution
Mesh Processing Course : Multiresolution
 
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...
 
Emat 213 midterm 2 fall 2005
Emat 213 midterm 2 fall 2005Emat 213 midterm 2 fall 2005
Emat 213 midterm 2 fall 2005
 
Signal Processing Course : Convex Optimization
Signal Processing Course : Convex OptimizationSignal Processing Course : Convex Optimization
Signal Processing Course : Convex Optimization
 
Query Suggestion @ tokyotextmining#2
Query Suggestion @ tokyotextmining#2Query Suggestion @ tokyotextmining#2
Query Suggestion @ tokyotextmining#2
 
修士論文発表会
修士論文発表会修士論文発表会
修士論文発表会
 
分かりやすいパターン認識第8章 学習アルゴリズムの一般化
分かりやすいパターン認識第8章 学習アルゴリズムの一般化分かりやすいパターン認識第8章 学習アルゴリズムの一般化
分かりやすいパターン認識第8章 学習アルゴリズムの一般化
 
Iceaa07 Foils
Iceaa07 FoilsIceaa07 Foils
Iceaa07 Foils
 
Lecture 1: linear SVM in the primal
Lecture 1: linear SVM in the primalLecture 1: linear SVM in the primal
Lecture 1: linear SVM in the primal
 
Ism et chapter_12
Ism et chapter_12Ism et chapter_12
Ism et chapter_12
 
X2 T06 02 Cylindrical Shells
X2 T06 02 Cylindrical ShellsX2 T06 02 Cylindrical Shells
X2 T06 02 Cylindrical Shells
 
集合知プログラミングゼミ第1回
集合知プログラミングゼミ第1回集合知プログラミングゼミ第1回
集合知プログラミングゼミ第1回
 
03 finding roots
03 finding roots03 finding roots
03 finding roots
 
Dmss2011 public
Dmss2011 publicDmss2011 public
Dmss2011 public
 
Solution 3 3
Solution 3 3Solution 3 3
Solution 3 3
 
Nonlinear Filtering and Path Integral Method (Paper Review)
Nonlinear Filtering and Path Integral Method (Paper Review)Nonlinear Filtering and Path Integral Method (Paper Review)
Nonlinear Filtering and Path Integral Method (Paper Review)
 
Inse
InseInse
Inse
 

Destacado

Bd T Eq6 Cuadro De Coparacion
Bd T Eq6 Cuadro De CoparacionBd T Eq6 Cuadro De Coparacion
Bd T Eq6 Cuadro De Coparacion
eduardo martinez
 
Iturrama
IturramaIturrama
Iturrama
andarin
 
Te deseo
Te deseoTe deseo
Te deseo
anacjg
 
Canarias Excelencia Tecnológica
Canarias Excelencia TecnológicaCanarias Excelencia Tecnológica
Canarias Excelencia Tecnológica
Jaime Romero
 
Lévai Dóra, Molnár Patrik: ELGG, nyílt forráskódú közösségi portálmotor
Lévai Dóra, Molnár Patrik: ELGG, nyílt forráskódú közösségi portálmotorLévai Dóra, Molnár Patrik: ELGG, nyílt forráskódú közösségi portálmotor
Lévai Dóra, Molnár Patrik: ELGG, nyílt forráskódú közösségi portálmotor
HUNMOOT
 
Niños del mundo!!!
Niños del mundo!!!Niños del mundo!!!
Niños del mundo!!!
anacjg
 

Destacado (20)

Ficha 3 bsc exploradores
Ficha 3 bsc exploradoresFicha 3 bsc exploradores
Ficha 3 bsc exploradores
 
Bd T Eq6 Cuadro De Coparacion
Bd T Eq6 Cuadro De CoparacionBd T Eq6 Cuadro De Coparacion
Bd T Eq6 Cuadro De Coparacion
 
Iss
IssIss
Iss
 
iPod PP
iPod PPiPod PP
iPod PP
 
Electrónica digital
Electrónica digitalElectrónica digital
Electrónica digital
 
Terzomodulo
TerzomoduloTerzomodulo
Terzomodulo
 
Projektowanie instalacji z miedzi - podstawowe zalecenia
Projektowanie instalacji z miedzi - podstawowe zaleceniaProjektowanie instalacji z miedzi - podstawowe zalecenia
Projektowanie instalacji z miedzi - podstawowe zalecenia
 
Copia de unidad 8
Copia de unidad 8Copia de unidad 8
Copia de unidad 8
 
Iturrama
IturramaIturrama
Iturrama
 
Te deseo
Te deseoTe deseo
Te deseo
 
2ºtrabajomireia
2ºtrabajomireia2ºtrabajomireia
2ºtrabajomireia
 
123
123123
123
 
Canarias Excelencia Tecnológica
Canarias Excelencia TecnológicaCanarias Excelencia Tecnológica
Canarias Excelencia Tecnológica
 
Millinet Səsvermə
Millinet SəsverməMillinet Səsvermə
Millinet Səsvermə
 
Lévai Dóra, Molnár Patrik: ELGG, nyílt forráskódú közösségi portálmotor
Lévai Dóra, Molnár Patrik: ELGG, nyílt forráskódú közösségi portálmotorLévai Dóra, Molnár Patrik: ELGG, nyílt forráskódú közösségi portálmotor
Lévai Dóra, Molnár Patrik: ELGG, nyílt forráskódú közösségi portálmotor
 
Livro nacional de_regras_2015_
Livro nacional de_regras_2015_Livro nacional de_regras_2015_
Livro nacional de_regras_2015_
 
Gorbeia Central Park
Gorbeia Central ParkGorbeia Central Park
Gorbeia Central Park
 
Niños del mundo!!!
Niños del mundo!!!Niños del mundo!!!
Niños del mundo!!!
 
Tokyo Cabinet
Tokyo CabinetTokyo Cabinet
Tokyo Cabinet
 
Josh, juan capó nando
Josh, juan capó nandoJosh, juan capó nando
Josh, juan capó nando
 

Similar a Datamining 6th Svm

Datamining 7th Kmeans
Datamining 7th KmeansDatamining 7th Kmeans
Datamining 7th Kmeans
sesejun
 
Datamining 7th kmeans
Datamining 7th kmeansDatamining 7th kmeans
Datamining 7th kmeans
sesejun
 
Absolute value function
Absolute value functionAbsolute value function
Absolute value function
gindar
 
S101-52國立新化高中(代理)
S101-52國立新化高中(代理)S101-52國立新化高中(代理)
S101-52國立新化高中(代理)
yustar1026
 
Week 7 [compatibility mode]
Week 7 [compatibility mode]Week 7 [compatibility mode]
Week 7 [compatibility mode]
Hazrul156
 

Similar a Datamining 6th Svm (20)

Datamining 7th Kmeans
Datamining 7th KmeansDatamining 7th Kmeans
Datamining 7th Kmeans
 
Datamining 7th kmeans
Datamining 7th kmeansDatamining 7th kmeans
Datamining 7th kmeans
 
Complex varible
Complex varibleComplex varible
Complex varible
 
Complex varible
Complex varibleComplex varible
Complex varible
 
Ch33 11
Ch33 11Ch33 11
Ch33 11
 
Alg2 lesson 6-6
Alg2 lesson 6-6Alg2 lesson 6-6
Alg2 lesson 6-6
 
Optimal Finite Difference Grids
Optimal Finite Difference GridsOptimal Finite Difference Grids
Optimal Finite Difference Grids
 
Preserving Personalized Pagerank in Subgraphs(ICML 2011)
Preserving Personalized Pagerank in Subgraphs(ICML 2011) Preserving Personalized Pagerank in Subgraphs(ICML 2011)
Preserving Personalized Pagerank in Subgraphs(ICML 2011)
 
In to el
In to elIn to el
In to el
 
Tut 1
Tut 1Tut 1
Tut 1
 
calculo vectorial
calculo vectorialcalculo vectorial
calculo vectorial
 
Special second order non symmetric fitted method for singular
Special second order non symmetric fitted method for singularSpecial second order non symmetric fitted method for singular
Special second order non symmetric fitted method for singular
 
2º mat emática
2º mat emática2º mat emática
2º mat emática
 
Absolute value function
Absolute value functionAbsolute value function
Absolute value function
 
Simultaneous eqn2
Simultaneous eqn2Simultaneous eqn2
Simultaneous eqn2
 
Cs 601
Cs 601Cs 601
Cs 601
 
Chapter4 tf
Chapter4 tfChapter4 tf
Chapter4 tf
 
S101-52國立新化高中(代理)
S101-52國立新化高中(代理)S101-52國立新化高中(代理)
S101-52國立新化高中(代理)
 
Ch02 31
Ch02 31Ch02 31
Ch02 31
 
Week 7 [compatibility mode]
Week 7 [compatibility mode]Week 7 [compatibility mode]
Week 7 [compatibility mode]
 

Más de sesejun

次世代シーケンサが求める機械学習
次世代シーケンサが求める機械学習次世代シーケンサが求める機械学習
次世代シーケンサが求める機械学習
sesejun
 
20110524zurichngs 2nd pub
20110524zurichngs 2nd pub20110524zurichngs 2nd pub
20110524zurichngs 2nd pub
sesejun
 
20110524zurichngs 1st pub
20110524zurichngs 1st pub20110524zurichngs 1st pub
20110524zurichngs 1st pub
sesejun
 
20110214nips2010 read
20110214nips2010 read20110214nips2010 read
20110214nips2010 read
sesejun
 
Datamining 9th association_rule.key
Datamining 9th association_rule.keyDatamining 9th association_rule.key
Datamining 9th association_rule.key
sesejun
 
Datamining 8th hclustering
Datamining 8th hclusteringDatamining 8th hclustering
Datamining 8th hclustering
sesejun
 
Datamining r 4th
Datamining r 4thDatamining r 4th
Datamining r 4th
sesejun
 
Datamining r 3rd
Datamining r 3rdDatamining r 3rd
Datamining r 3rd
sesejun
 
Datamining r 2nd
Datamining r 2ndDatamining r 2nd
Datamining r 2nd
sesejun
 
Datamining r 1st
Datamining r 1stDatamining r 1st
Datamining r 1st
sesejun
 
Datamining 5th knn
Datamining 5th knnDatamining 5th knn
Datamining 5th knn
sesejun
 
Datamining 4th adaboost
Datamining 4th adaboostDatamining 4th adaboost
Datamining 4th adaboost
sesejun
 
Datamining 3rd naivebayes
Datamining 3rd naivebayesDatamining 3rd naivebayes
Datamining 3rd naivebayes
sesejun
 
Datamining 2nd decisiontree
Datamining 2nd decisiontreeDatamining 2nd decisiontree
Datamining 2nd decisiontree
sesejun
 
100401 Bioinfoinfra
100401 Bioinfoinfra100401 Bioinfoinfra
100401 Bioinfoinfra
sesejun
 
Datamining 8th Hclustering
Datamining 8th HclusteringDatamining 8th Hclustering
Datamining 8th Hclustering
sesejun
 
Datamining 9th Association Rule
Datamining 9th Association RuleDatamining 9th Association Rule
Datamining 9th Association Rule
sesejun
 

Más de sesejun (20)

RNAseqによる変動遺伝子抽出の統計: A Review
RNAseqによる変動遺伝子抽出の統計: A ReviewRNAseqによる変動遺伝子抽出の統計: A Review
RNAseqによる変動遺伝子抽出の統計: A Review
 
バイオインフォマティクスによる遺伝子発現解析
バイオインフォマティクスによる遺伝子発現解析バイオインフォマティクスによる遺伝子発現解析
バイオインフォマティクスによる遺伝子発現解析
 
次世代シーケンサが求める機械学習
次世代シーケンサが求める機械学習次世代シーケンサが求める機械学習
次世代シーケンサが求める機械学習
 
20110602labseminar pub
20110602labseminar pub20110602labseminar pub
20110602labseminar pub
 
20110524zurichngs 2nd pub
20110524zurichngs 2nd pub20110524zurichngs 2nd pub
20110524zurichngs 2nd pub
 
20110524zurichngs 1st pub
20110524zurichngs 1st pub20110524zurichngs 1st pub
20110524zurichngs 1st pub
 
20110214nips2010 read
20110214nips2010 read20110214nips2010 read
20110214nips2010 read
 
Datamining 9th association_rule.key
Datamining 9th association_rule.keyDatamining 9th association_rule.key
Datamining 9th association_rule.key
 
Datamining 8th hclustering
Datamining 8th hclusteringDatamining 8th hclustering
Datamining 8th hclustering
 
Datamining r 4th
Datamining r 4thDatamining r 4th
Datamining r 4th
 
Datamining r 3rd
Datamining r 3rdDatamining r 3rd
Datamining r 3rd
 
Datamining r 2nd
Datamining r 2ndDatamining r 2nd
Datamining r 2nd
 
Datamining r 1st
Datamining r 1stDatamining r 1st
Datamining r 1st
 
Datamining 5th knn
Datamining 5th knnDatamining 5th knn
Datamining 5th knn
 
Datamining 4th adaboost
Datamining 4th adaboostDatamining 4th adaboost
Datamining 4th adaboost
 
Datamining 3rd naivebayes
Datamining 3rd naivebayesDatamining 3rd naivebayes
Datamining 3rd naivebayes
 
Datamining 2nd decisiontree
Datamining 2nd decisiontreeDatamining 2nd decisiontree
Datamining 2nd decisiontree
 
100401 Bioinfoinfra
100401 Bioinfoinfra100401 Bioinfoinfra
100401 Bioinfoinfra
 
Datamining 8th Hclustering
Datamining 8th HclusteringDatamining 8th Hclustering
Datamining 8th Hclustering
 
Datamining 9th Association Rule
Datamining 9th Association RuleDatamining 9th Association Rule
Datamining 9th Association Rule
 

Último

Breaking Down the Flutterwave Scandal What You Need to Know.pdf
Breaking Down the Flutterwave Scandal What You Need to Know.pdfBreaking Down the Flutterwave Scandal What You Need to Know.pdf
Breaking Down the Flutterwave Scandal What You Need to Know.pdf
UK Journal
 

Último (20)

WebAssembly is Key to Better LLM Performance
WebAssembly is Key to Better LLM PerformanceWebAssembly is Key to Better LLM Performance
WebAssembly is Key to Better LLM Performance
 
Intro in Product Management - Коротко про професію продакт менеджера
Intro in Product Management - Коротко про професію продакт менеджераIntro in Product Management - Коротко про професію продакт менеджера
Intro in Product Management - Коротко про професію продакт менеджера
 
FDO for Camera, Sensor and Networking Device – Commercial Solutions from VinC...
FDO for Camera, Sensor and Networking Device – Commercial Solutions from VinC...FDO for Camera, Sensor and Networking Device – Commercial Solutions from VinC...
FDO for Camera, Sensor and Networking Device – Commercial Solutions from VinC...
 
State of the Smart Building Startup Landscape 2024!
State of the Smart Building Startup Landscape 2024!State of the Smart Building Startup Landscape 2024!
State of the Smart Building Startup Landscape 2024!
 
Simplified FDO Manufacturing Flow with TPMs _ Liam at Infineon.pdf
Simplified FDO Manufacturing Flow with TPMs _ Liam at Infineon.pdfSimplified FDO Manufacturing Flow with TPMs _ Liam at Infineon.pdf
Simplified FDO Manufacturing Flow with TPMs _ Liam at Infineon.pdf
 

Datamining 6th Svm

  • 1.
  • 2.
  • 3. Review: k-NN classifies a test point (Yes / No) by a majority vote among its nearest neighbors in the training data.
  • 4.
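The k-NN vote on slide 3 can be sketched as follows. This is a minimal illustration; the toy dataset, the value of `k`, and breaking the vote by summing the ±1 labels are my assumptions, not from the slides:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points (labels are +1/-1)."""
    dist = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to every training point
    nearest = np.argsort(dist)[:k]               # indices of the k closest points
    return 1 if y_train[nearest].sum() > 0 else -1

X = np.array([[0., 0.], [0., 1.], [3., 3.], [3., 4.]])
y = np.array([-1, -1, 1, 1])
print(knn_predict(X, y, np.array([0.2, 0.5])))   # near the -1 cluster -> -1
```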
  • 5. Setup: training examples (xi, yi) (i = 1, . . . , l, xi ∈ Rⁿ, yi ∈ {1, −1}), where yi = 1 marks the positive class and yi = −1 the negative class. Find w, b such that yi(w · xi + b) > 0 (i = 1, . . . , l).
  • 6. Decision function: classify x by which side of the hyperplane w · x + b = 0 it falls on: d(x) = 1 if w · x + b ≥ 0, and d(x) = −1 otherwise.
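Slide 6's decision rule is a one-liner; a sketch (the vector shapes are my assumption):

```python
import numpy as np

def d(x, w, b):
    """Slide 6 decision function: +1 on or above the hyperplane w·x + b = 0, else -1."""
    return 1 if np.dot(w, x) + b >= 0 else -1
```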
  • 7. Fisher's linear discriminant (1): project the two classes onto a direction w and separate them with the hyperplane w · x + b = 0.
  • 8. Fisher's linear discriminant (2): class means
    m₊ = Σ_{d(x)=1} x / |{x | d(x) = 1}|,   m₋ = Σ_{d(x)=−1} x / |{x | d(x) = −1}|.
    Between-class separation along w: |(m₊ − m₋) · w|.
    Within-class scatter along w (for the hyperplane w · x + b = 0):
    Σ_{d(x)=1} ((x − m₊) · w)² + Σ_{d(x)=−1} ((x − m₋) · w)².
  • 9. Fisher's linear discriminant (3): with |w| = 1, maximize
    J(w) = |(m₊ − m₋) · w|² / ( Σ_{d(x)=1} ((x − m₊) · w)² + Σ_{d(x)=−1} ((x − m₋) · w)² ).
    The projection w · x + b depends on b only through a shift, so b does not affect the optimal w; the maximizing w is found by setting the derivative of J(w) with respect to w to 0.
  • 10. Fisher's linear discriminant (4): rewriting J(w) with scatter matrices,
    J(w) = (wᵀ S_B w) / (wᵀ S_W w),
    S_B = (m₊ − m₋)(m₊ − m₋)ᵀ,
    S_W = Σ_{d(x)=1} (x − m₊)(x − m₊)ᵀ + Σ_{d(x)=−1} (x − m₋)(x − m₋)ᵀ.
    Setting ∂J(w)/∂w = 0 via the quotient rule (f/g)′ = (f′g − fg′)/g² gives
    (wᵀ S_B w) S_W w = (wᵀ S_W w) S_B w.
    Since S_B w is proportional to m₊ − m₋, the solution is w ∝ S_W⁻¹ (m₊ − m₋).
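The closed-form solution w ∝ S_W⁻¹(m₊ − m₋) of slide 10 is easy to compute directly. A sketch (the toy clusters are mine; S_W is assumed nonsingular):

```python
import numpy as np

def fisher_direction(Xp, Xn):
    """Fisher discriminant direction w ∝ S_W^{-1}(m+ - m-), assuming S_W is nonsingular."""
    mp, mn = Xp.mean(axis=0), Xn.mean(axis=0)
    # Within-class scatter S_W, summed over both classes
    Sw = (Xp - mp).T @ (Xp - mp) + (Xn - mn).T @ (Xn - mn)
    w = np.linalg.solve(Sw, mp - mn)
    return w / np.linalg.norm(w)

Xp = np.array([[2., 0.], [3., 0.], [2., 1.], [3., 1.]])      # positive class
Xn = np.array([[-2., 0.], [-3., 0.], [-2., 1.], [-3., 1.]])  # negative class
w = fisher_direction(Xp, Xn)   # clusters differ only along x1, so w ~ (1, 0)
```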
  • 11. SVM (Support Vector Machine)
  • 12. Margin: for a separating hyperplane (w, b), the margin is
    ρ(w, b) = min_{xi | yi=1} xi · w / |w| − max_{xi | yi=−1} xi · w / |w|.
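Slide 12's margin ρ(w, b) can be computed directly from the projections onto w. A sketch (the toy data are mine; note b cancels in the difference, matching the formula):

```python
import numpy as np

def margin(X, y, w):
    """Slide 12 margin: gap between the closest projections of the two classes along w."""
    proj = X @ w / np.linalg.norm(w)
    return proj[y == 1].min() - proj[y == -1].max()

X = np.array([[1., 0.], [2., 0.], [-1., 0.], [-3., 0.]])
y = np.array([1, 1, -1, -1])
print(margin(X, y, np.array([1., 0.])))   # closest points at x1 = +1 and -1 -> margin 2
```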
  • 13. Canonical form: scale (w0, b0) so that the closest points of each class lie on the two hyperplanes w0 · x + b0 = ±1. Then
    ρ(w0, b0) = min_{xi | yi=1} xi · w0 / |w0| − max_{xi | yi=−1} xi · w0 / |w0|
              = (1 − b0)/|w0| − (−1 − b0)/|w0| = 2/|w0|.
  • 14. Maximizing the margin 2/|w0| is equivalent to minimizing w0 · w0 subject to yi(w0 · xi + b) ≥ 1 (i = 1, . . . , l) — a quadratic program.
  • 15. Lagrangian (1): minimize w0 · w0 subject to yi(w0 · xi + b) ≥ 1 (i = 1, . . . , l)  (1). Introduce multipliers Λ = (α1, . . . , αl) (αi ≥ 0):
    L(w, b, Λ) = |w|²/2 − Σ_{i=1}^{l} αi (yi(xi · w + b) − 1).
    • Minimize L with respect to w, b and maximize it with respect to Λ.
  • 16. Lagrangian (2): at the optimum w = w0, b = b0, the partial derivatives of L(w, b, Λ) vanish:
    ∂L(w, b, Λ)/∂w |_{w=w0} = w0 − Σ_{i=1}^{l} αi yi xi = 0,
    ∂L(w, b, Λ)/∂b |_{b=b0} = −Σ_{i=1}^{l} αi yi = 0,   (2)
    hence w0 = Σ_{i=1}^{l} αi yi xi and Σ_{i=1}^{l} αi yi = 0.
    • Substituting w = w0, b = b0 back:
    L(w0, b0, Λ) = ½ w0 · w0 − Σ_{i=1}^{l} αi [yi(xi · w0 + b0) − 1]
                 = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj xi · xj,
    which eliminates w and b, leaving a function of Λ alone.
  • 17. SVM dual problem: instead of optimizing over w, b, maximize over Λ
    L(w0, b0, Λ) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj xi · xj
    subject to Σ_{i=1}^{l} αi yi = 0, αi ≥ 0.   (3)
    • Recover w0 from Λ via (2) (w0 = Σ_{i=1}^{l} αi yi xi).
    • By (2), only the xi with αi ≠ 0 contribute to w; these are the support vectors.
    • KKT complementarity condition: αi [yi(xi · w0 + b0) − 1] = 0.
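One way to see the dual (3) in action is to hand it to a generic solver. This sketch uses `scipy.optimize.minimize` with SLSQP (my choice of solver, not something the slides prescribe) on a toy 1-D-separable dataset, then recovers w0 via (2) and b0 from a support vector:

```python
import numpy as np
from scipy.optimize import minimize

def svm_dual(X, y):
    """Solve the hard-margin dual of slide 17 numerically (didactic sketch, not a real QP solver)."""
    l = len(y)
    Q = (y[:, None] * X) @ (y[:, None] * X).T          # Q_ij = y_i y_j x_i·x_j
    obj = lambda a: 0.5 * a @ Q @ a - a.sum()          # minimize the negated dual objective
    cons = {"type": "eq", "fun": lambda a: a @ y}      # equality constraint: sum alpha_i y_i = 0
    res = minimize(obj, np.zeros(l), method="SLSQP",
                   bounds=[(0, None)] * l, constraints=cons)
    a = res.x
    w = (a * y) @ X                                    # w0 = sum alpha_i y_i x_i  (slide 16)
    sv = np.argmax(a)                                  # any support vector fixes b0
    b = y[sv] - w @ X[sv]
    return w, b

X = np.array([[1., 0.], [2., 0.], [-1., 0.], [-2., 0.]])
y = np.array([1., 1., -1., -1.])
w, b = svm_dual(X, y)   # expect roughly w = (1, 0), b = 0: margin points at x1 = +/-1
```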
  • 18. [figure comparing cases (A) and (B)]
  • 19. Nonlinear SVM: map each input x to a higher-dimensional feature vector Φ(x) and run the linear SVM there. The dual objective
    L(w0, b0, Λ) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj xi · xj
    becomes
    L(w0, b0, Λ) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj Φ(xi) · Φ(xj),
    and the decision boundary is
    Φ(x) · w0 + b0 = Σ_{i=1}^{l} αi yi Φ(x) · Φ(xi) + b0 = 0.
    • Both involve Φ only through inner products Φ(·) · Φ(·).
  • 20. Kernel trick: replace the inner product with a kernel K(x, y) = Φ(x) · Φ(y).
    • Example: Φ((x1, x2)) = (x1², √2·x1 x2, x2², √2·x1, √2·x2, 1) (a 6-dimensional map). Then
    Φ((x1, x2)) · Φ((y1, y2)) = (x1 y1)² + 2 x1 y1 x2 y2 + (x2 y2)² + 2 x1 y1 + 2 x2 y2 + 1
                              = (x1 y1 + x2 y2 + 1)² = ((x1, x2) · (y1, y2) + 1)²,
    so the 6-dimensional inner product is computed without ever forming Φ.
    • Common kernels: polynomial (x · y + 1)^d; RBF exp(−||x − y||² / 2σ²); sigmoid tanh(κ x · y − δ) (σ, κ, δ are parameters).
    • A function is a valid kernel if it satisfies Mercer's condition.
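The identity on slide 20 is easy to verify numerically: the explicit 6-dimensional map gives exactly the same value as the closed-form kernel (x · y + 1)². The test vectors below are my own:

```python
import numpy as np

def phi(v):
    """Explicit feature map for the degree-2 polynomial kernel (slide 20)."""
    x1, x2 = v
    s = np.sqrt(2.0)
    return np.array([x1**2, s * x1 * x2, x2**2, s * x1, s * x2, 1.0])

x, y = np.array([1.0, 2.0]), np.array([3.0, -1.0])
lhs = phi(x) @ phi(y)        # inner product in the 6-dim feature space
rhs = (x @ y + 1.0) ** 2     # kernel evaluated directly in the input space
print(lhs, rhs)              # both equal 4.0 here (x·y = 1, so (1+1)^2 = 4)
```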
  • 21. Soft margin: when the data are not linearly separable, introduce slack variables ξ:
    yi(w · xi + b) ≥ 1 − ξi, where ξi ≥ 0 (i = 1, . . . , l),
    and minimize ½ w · w + C Σ_{i=1}^{l} ξi.
  • 22. Soft margin dual (1): with multipliers Λ = (α1, . . . , αl) and R = (r1, . . . , rl), the Lagrangian L is
    L(w, ξ, b, Λ, R) = ½ w · w + C Σ_{i=1}^{l} ξi − Σ_{i=1}^{l} αi [yi(xi · w + b) − 1 + ξi] − Σ_{i=1}^{l} ri ξi.
    At the optimum w0, b0, ξi, the KKT stationarity conditions set the derivatives of L with respect to w, b, ξi to 0:
    ∂L(w, ξ, b, Λ, R)/∂w |_{w=w0} = w0 − Σ_{i=1}^{l} αi yi xi = 0,
    ∂L(w, ξ, b, Λ, R)/∂b |_{b=b0} = −Σ_{i=1}^{l} αi yi = 0,
    ∂L(w, ξ, b, Λ, R)/∂ξi = C − αi − ri = 0.
  • 23. Soft margin dual (2): substituting the stationarity conditions, C and ξ drop out of L and the objective is the same as for the hard-margin SVM:
    L(w, ξ, b, Λ, R) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj xi · xj.
    • The only change is the constraint on αi: from C − αi − ri = 0 and ri ≥ 0 it follows that 0 ≤ αi ≤ C.
    • So maximize over Λ subject to Σ_{i=1}^{l} αi yi = 0, 0 ≤ αi ≤ C.
  • 24. Background: Karush-Kuhn-Tucker (KKT) conditions.
    • For minimizing f(x) subject to gi(x) ≤ 0 (x = (x1, x2, ..., xn)), the KKT conditions are
    ∂f(x)/∂xj + Σ_{i=1}^{m} λi ∂gi(x)/∂xj = 0,  j = 1, 2, ..., n,
    λi gi(x) = 0, λi ≥ 0, gi(x) ≤ 0,  i = 1, 2, ..., m.
    • When f(x) and the gi(x) are convex, any (x, λ) satisfying the KKT conditions gives a global minimum of f(x).
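A tiny numeric check of the KKT conditions on slide 24, on a one-dimensional toy problem of my own: minimize f(x) = x² subject to g(x) = 1 − x ≤ 0. The constrained minimum is x = 1 with multiplier λ = 2, and both f and g are convex, so the KKT point is the global minimum:

```python
# Candidate point and multiplier for: minimize x^2 subject to 1 - x <= 0
x, lam = 1.0, 2.0
df, dg = 2 * x, -1.0         # derivatives of f and g at x
g = 1 - x                    # constraint value at x

print(df + lam * dg)         # stationarity: f' + lambda * g' = 0
print(lam * g)               # complementarity: lambda * g(x) = 0
print(g <= 0, lam >= 0)      # primal and dual feasibility
```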
  • 25. SMO (Sequential Minimal Optimization): a training algorithm for the SVM dual.
    • The dual has one variable per example, Λ = (α1, α2, ..., αl); with, say, 6000 examples the QP has 6000 variables, so generic solvers are expensive.
    • SMO instead repeatedly picks a pair (αi, αj) and optimizes just those two, which can be done analytically; two variables are needed because the equality constraint couples the αi.
    • The objective being maximized is
    LD = L(w, ξ, b, Λ, R) = Σ_{i=1}^{l} αi − ½ Σ_{i=1}^{l} Σ_{j=1}^{l} αi αj yi yj xi · xj.
  • 26. Two-variable subproblem (1): fix all α except α1, α2 and maximize LD. Writing α1_old, α2_old for the current values and α1_new, α2_new for the updated ones, define
    Ei ≡ w_old · xi + b_old − yi,
    η ≡ 2K12 − K11 − K22, where Kij = xi · xj.
    Setting LD′ = 0 gives the unconstrained update
    α2_new = α2_old − y2(E1 − E2)/η.
    The constraint Σ_{i=1}^{l} αi yi = 0 forces γ ≡ α1 + s·α2 = const (s = y1 y2), and
    η = 2K12 − K11 − K22 = −|x2 − x1|² ≤ 0,
    so LD is concave in α2.
  • 27. Two-variable subproblem (2): α1, α2 move along the line γ ≡ α1 + s·α2 = const. The new values α1_new, α2_new must also stay in [0, C], so the unconstrained α2_new is clipped to the feasible segment, giving α2_clipped (cases (A) and (B) in the figure).
  • 28. Two-variable subproblem (3): the clipping bounds are
    y1 = y2 (s = 1):  L = max(0, α1_old + α2_old − C),  H = min(C, α1_old + α2_old);
    y1 ≠ y2 (s = −1): L = max(0, α2_old − α1_old),      H = min(C, C + α2_old − α1_old).
    With L ≤ α2 ≤ H (the segment determined by s and γ):
    α2_clipped = H if α2_new ≥ H;  α2_new if L < α2_new < H;  L if α2_new ≤ L.
    This is the maximizer of LD on the feasible segment.
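The bound and clipping rules of slide 28 translate directly into code (the numeric test cases below are my own):

```python
def bounds(a1, a2, y1, y2, C):
    """Feasible segment [L, H] for alpha2 given alpha1 + s*alpha2 = const, s = y1*y2 (slide 28)."""
    if y1 == y2:                      # s = 1
        return max(0.0, a1 + a2 - C), min(C, a1 + a2)
    return max(0.0, a2 - a1), min(C, C + a2 - a1)   # s = -1

def clip_alpha2(a2_new, L, H):
    """Clip the unconstrained alpha2 update onto [L, H]."""
    if a2_new >= H:
        return H
    if a2_new <= L:
        return L
    return a2_new
```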
  • 29. [figure: the four cases (A)-(D) for the bounds L ≤ α2 ≤ H]
  • 30. [figure: the clipped α2 in cases (A)-(D); legend: (α1_new, α2_new) before clipping, (α1_new, α2_clipped) after]
  • 31. Two-variable update procedure:
    1. Compute η = 2K12 − K11 − K22.
    2. If η < 0:
       (a) α2_new = α2_old + y2(E2 − E1)/η;
       (b) clip α2_new to obtain α2_clipped;
       (c) α1_new = α1_old − s(α2_clipped − α2_old).
    3. If η = 0, LD is linear in α2; evaluate LD at the endpoints L and H, take the better one, and update α1 as in 2(c).
    4. After updating α1,2, update b and the cached errors so that E_new = 0 on the support vectors:
       w_new = w_old + (α1_new − α1_old) y1 x1 + (α2_clipped − α2_old) y2 x2,
       E_new(x, y) = E_old(x, y) + y1(α1_new − α1_old) x1 · x + y2(α2_clipped − α2_old) x2 · x − b_old + b_new,
       b_new = b_old − E_old(x, y) − y1(α1_new − α1_old) x1 · x − y2(α2_clipped − α2_old) x2 · x.
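The update rules on slides 26-31 assemble into a working trainer. The sketch below follows the "simplified SMO" style: it scans for a KKT violator and pairs it with a random second index, rather than the |E1 − E2| heuristic of the next slide; the toy data, parameters, and termination rule are my own choices:

```python
import numpy as np

def smo_train(X, y, C=10.0, tol=1e-3, max_passes=5, rng_seed=0):
    """Simplified SMO for the linear soft-margin SVM dual (didactic sketch)."""
    rng = np.random.default_rng(rng_seed)
    l = len(y)
    K = X @ X.T                                    # Gram matrix K_ij = x_i . x_j
    a = np.zeros(l)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(l):
            Ei = (a * y) @ K[:, i] + b - y[i]      # E_i = f(x_i) - y_i
            # Update only if example i violates the KKT conditions (within tol)
            if (y[i] * Ei < -tol and a[i] < C) or (y[i] * Ei > tol and a[i] > 0):
                j = rng.integers(l - 1)            # random second index j != i
                j += j >= i
                Ej = (a * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = a[i], a[j]
                if y[i] == y[j]:                   # bounds from slide 28
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                else:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if L == H or eta >= 0:
                    continue
                # Unconstrained optimum (slide 26), clipped to [L, H] (slide 28)
                a[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(a[j] - aj_old) < 1e-5:
                    continue
                a[i] = ai_old + y[i] * y[j] * (aj_old - a[j])   # step 2(c)
                # Threshold update (slide 31, step 4)
                b1 = b - Ei - y[i]*(a[i]-ai_old)*K[i, i] - y[j]*(a[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(a[i]-ai_old)*K[i, j] - y[j]*(a[j]-aj_old)*K[j, j]
                b = b1 if 0 < a[i] < C else (b2 if 0 < a[j] < C else (b1 + b2) / 2)
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (a * y) @ X                                # w = sum alpha_i y_i x_i
    return w, b

X = np.array([[2., 0.], [3., 1.], [-2., 0.], [-3., -1.]])
y = np.array([1., 1., -1., -1.])
w, b = smo_train(X, y)   # should separate this toy set
```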
  • 32. Choosing which αi to update: pick α1 first, then α2.
    • First choice (α1): an example that violates the KKT conditions; examples with 0 < αi < C are scanned preferentially.
    • Second choice (α2): the example that maximizes |E1 − E2|, i.e. the one expected to give the largest step in LD (if E1 is positive, pick the smallest E2, and vice versa).
  • 33. SMO summary: train the SVM by repeating
    • scan for an α that violates the KKT conditions (candidates with α ≠ 0, i.e. 0 < α < C, are checked preferentially);
    • pair it with the second α maximizing |E2 − E1|;
    • solve the two-variable subproblem for the pair analytically;
    until LD stops improving and every α satisfies the KKT conditions.
  • 34. Extensions (overview):
    • Multi-class classification: the SVM separates two classes A and B; problems with three or more classes are handled by combining binary SVMs.
    • Regression (regression problem): predict a real value, e.g. scores in 0-100 grouped into ranges 0-10, 10-20, ...
    • Ranking: e.g. ordering 100 Web pages by relevance.
    • One-Class SVM: learn from examples of a single class.