Self-Adaptive Simulated Binary Crossover
     for Real-Parameter Optimization



   Kalyanmoy Deb*, S. Karthik*, Tatsuya Okabe**
                      GECCO 2007

         *Indian Institute of Technology Kanpur, India
              **Honda Research Institute, Japan



                       Reviewed by

                   Paskorn Champrasert
                   paskorn@cs.umb.edu
                   http://dssg.cs.umb.edu
Outline
• Simulated Binary Crossover (SBX), revisited
• Problem in SBX
• Self-Adaptive SBX
• Simulation Results
   – Self-Adaptive SBX for single-objective optimization problems (SOOP)
   – Self-Adaptive SBX for multi-objective optimization problems (MOOP)
• Conclusion


 November 7, 08        DSSG Group Meeting       2/31
SBX
            Simulated Binary Crossover
• Simulated Binary Crossover (SBX) was proposed in
  1995 by Deb and Agrawal.

• SBX was designed to mimic the properties of one-point
  crossover in binary-coded GAs.
    – Average Property:
      The average of the decoded parameter values is the same
      before and after the crossover operation.
    – Spread Factor Property:
      A spread factor of β ≈ 1 is more likely to occur than any
      other β value.


November 7, 08              DSSG Group Meeting                       3/31
Important Property of One-Point Crossover
Spread factor:
  The spread factor β is defined as the ratio of the spread of the
  offspring points to that of the parent points:

                  β = |c1 − c2| / |p1 − p2|

- Contracting crossover (β < 1):
   The offspring points are enclosed by the parent points.

- Expanding crossover (β > 1):
   The offspring points enclose the parent points.

- Stationary crossover (β = 1):
   The offspring points coincide with the parent points.
 November 7, 08              DSSG Group Meeting                               4/31
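
To make the three cases concrete, here is a minimal Python sketch (with hypothetical parent and offspring values, not from the paper) that computes β from the definition above and classifies the crossover:

def spread_factor(p1, p2, c1, c2):
    # beta = |c1 - c2| / |p1 - p2|  (ratio of offspring spread to parent spread)
    return abs(c1 - c2) / abs(p1 - p2)

# Illustrative values:
spread_factor(2.0, 5.0, 1.0, 6.0)   # 5/3  -> expanding   (beta > 1)
spread_factor(2.0, 5.0, 3.0, 4.0)   # 1/3  -> contracting (beta < 1)
spread_factor(2.0, 5.0, 2.0, 5.0)   # 1.0  -> stationary  (beta = 1)
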
Spread Factor (β) in SBX
• The probability distribution of β in SBX should be similar in shape to
  the probability distribution of β in binary-coded one-point crossover.

• The probability distribution function is defined as:

      c(β) = 0.5 (nc + 1) β^nc,          β ≤ 1
      c(β) = 0.5 (nc + 1) / β^(nc+2),    β > 1

  [Plot: probability distribution c(β) versus β for nc = 2.]

November 7, 08              DSSG Group Meeting                                                        5/31
A large value of n gives a higher probability of creating ‘near-parent’
solutions.

   contracting:   c(β) = 0.5 (n + 1) β^n,          β ≤ 1

   expanding:     c(β) = 0.5 (n + 1) / β^(n+2),    β > 1

                                                 Mostly, n = 2 to 5
November 7, 08              DSSG Group Meeting                      6/31
Spread Factor (β) as a random number
• β is a random number that follows the proposed probability
  distribution function.

  [Plot: probability distribution c(β) for nc = 2; a uniform random
   number u fixes the area under the curve and thereby a β value.]

  To get a β value:
  1) A uniform random number u is generated in [0, 1].
  2) The β value that makes the area under the curve equal to u is taken.

  Offspring:
      c1 = x̄ − (1/2) β (p2 − p1)
      c2 = x̄ + (1/2) β (p2 − p1)
  where x̄ = (p1 + p2)/2 is the parent mean.
                    November 7, 08                                 DSSG Group Meeting                    7/31
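
As a minimal sketch of this step: integrating the density above gives the closed-form inverse β = (2u)^(1/(nc+1)) for u ≤ 0.5 and β = (1/(2(1−u)))^(1/(nc+1)) for u > 0.5. The Python sketch below (function name is illustrative, not from the paper) samples β this way and builds the two offspring for a single variable:

import random

def sbx_offspring(p1, p2, nc=2.0):
    # Inverse-transform sampling of beta from c(beta):
    #   u <= 0.5 -> beta = (2u)^(1/(nc+1))            (contracting, beta <= 1)
    #   u >  0.5 -> beta = (1/(2(1-u)))^(1/(nc+1))    (expanding,   beta >  1)
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (nc + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (nc + 1.0))
    mean = 0.5 * (p1 + p2)                   # average property: c1 + c2 = p1 + p2
    half_spread = 0.5 * beta * abs(p2 - p1)
    return mean - half_spread, mean + half_spread, beta, u

c1, c2, beta, u = sbx_offspring(2.0, 5.0, nc=2.0)
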
SBX Summary
• Simulated Binary Crossover (SBX) is a real-parameter recombination
  operator that is commonly used in evolutionary algorithms (EAs) for
  optimization problems.

    – The SBX operator takes two parents and creates two offspring.

    – The SBX operator uses a parameter called the distribution index
      (nc), which is kept fixed (e.g., at 2 or 5) throughout a simulation
      run.
          • If nc is large, the offspring solutions are close to the parent solutions.
          • If nc is small, the offspring solutions are far from the parent solutions.

    – The distribution index (nc) therefore has a direct effect in controlling the
      spread of the offspring solutions.
 November 7, 08                     DSSG Group Meeting                                  8/31
Research Issues
•     Setting the distribution index (nc) to an appropriate value
      is an important task.
    –      The distribution index (nc) affects
          1) Convergence speed:
                 if nc is very high, the offspring solutions are very close to the parent
                 solutions, so the convergence speed is very low.
          2) Local/global optimum solutions:
             Past studies found that SBX-based GAs cannot work well on
             complicated problems (e.g., Rastrigin's function).




    November 7, 08                         DSSG Group Meeting                                     9/31
Self-Adaptive SBX
• This paper proposes a procedure for updating the
  distribution index (nc) adaptively while solving an optimization
  problem.

    – The distribution index (nc) is updated every generation.

    – Whether the distribution index of the next generation (n̂c) is
      increased or decreased depends on whether the offspring
      outperforms (has a better fitness value than) its two
      parents.
• The next slides derive the equation used to update the
  distribution index.

November 7, 08               DSSG Group Meeting                      10/31
Probability density per offspring
         Spread factor distribution

  (1)   c(β) = 0.5 (nc + 1) β^nc,          β ≤ 1
  (2)   c(β) = 0.5 (nc + 1) / β^(nc+2),    β > 1

  [Plot: c(β) for nc = 2, with region (1) the contracting part (β ≤ 1)
   and region (2) the expanding part (β > 1); the corresponding
   per-offspring densities are shown for an example with p1 = 2, p2 = 5
   and β > 1.]

         Offspring:
             c1 = x̄ − (1/2) β (p2 − p1)
             c2 = x̄ + (1/2) β (p2 − p1)

                            November 7, 08                                  DSSG Group Meeting                           11/31
Self-Adaptive SBX (1)
Consider the expanding case, β > 1 (c1 < p1 < p2 < c2):

     β = |c2 − c1| / |p2 − p1|

     β = ((c2 − p2) + (p2 − p1) + (p1 − c1)) / (p2 − p1)

By the average property, (c2 − p2) = (p1 − c1), so

     β = 1 + 2 (c2 − p2) / (p2 − p1)        --- (1)

 November 7, 08                   DSSG Group Meeting         12/31
Self-Adaptive SBX (2)

  (1)   c(β) = 0.5 (nc + 1) β^nc,          β ≤ 1
  (2)   c(β) = 0.5 (nc + 1) / β^(nc+2),    β > 1

  [Plot: c(β) with a uniform random number u > 0.5 falling in the
   expanding region (2).]

For the expanding region (2), the area under the curve from 1 to β
equals u − 0.5:

     ∫_1^β  0.5 (nc + 1) / t^(nc+2)  dt  =  (u − 0.5)        --- (2)

                            November 7, 08                                DSSG Group Meeting                        13/31
Self-Adaptive SBX (3)

     ∫_1^β  0.5 (nc + 1) / t^(nc+2)  dt  =  (u − 0.5)        --- (2)

     ∫_1^β  0.5 (nc + 1) t^(−(nc+2))  dt  =  (u − 0.5)

     −0.5 β^(−nc−1) − (−0.5) = (u − 0.5)

     −0.5 β^(−nc−1) = (u − 1)

     β^(−nc−1) = −(u − 1) / 0.5 = 2 (1 − u)

     (−nc − 1) log β = log 2(1 − u)

     (−nc − 1) = log 2(1 − u) / log β

     nc = −(1 + log 2(1 − u) / log β)        --- (3)
  November 7, 08                DSSG Group Meeting             14/31
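
As a quick numerical sanity check of equation (3) (a sketch with illustrative values, not from the paper): sample β for the expanding case from the inverse of equation (2) and confirm that (3) recovers the nc that generated it.

import math

def beta_expanding(u, nc):
    # Inverse CDF for the expanding case (u > 0.5), from equation (2)
    return (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (nc + 1.0))

def nc_from_beta(u, beta):
    # Equation (3): nc = -(1 + log 2(1-u) / log beta)
    return -(1.0 + math.log(2.0 * (1.0 - u)) / math.log(beta))

u, nc = 0.8, 2.0
beta = beta_expanding(u, nc)          # ~1.357
print(nc_from_beta(u, beta))          # prints 2.0, recovering nc
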
Self-Adaptive SBX (4)

    β > 1:    c1    p1    p2    c2    ĉ2

• If c2 is better than both parent solutions,
    – the authors assume that the region in which the child
      solution is created is better than the region in which
      the parent solutions are.
    – the authors intend to extend the child solution further
      away from the closest parent solution (p2) by a factor α
      (α > 1):

                 (ĉ2 − p2) = α (c2 − p2)        --- (4)

November 7, 08                DSSG Group Meeting             15/31
β Update

    β > 1:    c1    p1    p2    c2    ĉ2

     β = 1 + 2 (c2 − p2) / (p2 − p1)   --- (1)        (ĉ2 − p2) = α (c2 − p2)   --- (4)

From (1):
     β̂ = 1 + 2 (ĉ2 − p2) / (p2 − p1)
       = 1 + 2 ((α (c2 − p2) + p2) − p2) / (p2 − p1)
       = 1 + α · 2 (c2 − p2) / (p2 − p1)

     β̂ = 1 + α (β − 1)        --- (5)
November 7, 08                    DSSG Group Meeting                      16/31
Distribution Index (nc) Update

     nc = −(1 + log 2(1 − u) / log β)   --- (3)        β̂ = 1 + α (β − 1)   --- (5)

Applying (3) with β̂ in place of β:

     n̂c = −(1 + log 2(1 − u) / log β̂)

     n̂c = −(1 + log 2(1 − u) / log(1 + α (β − 1)))

From (3), log 2(1 − u) = −(nc + 1) log β, so

     n̂c = −1 + (nc + 1) log β / log(1 + α (β − 1))        --- (6)



 November 7, 08                   DSSG Group Meeting                 17/31
Distribution Index (nc) Update

When c is better than both p1 and p2, the distribution index for the
next generation is calculated as (6). The distribution index is
decreased in order to create an offspring (c') farther away from its
parent (p2) than the offspring (c) of the previous generation.

     n̂c = −1 + (nc + 1) log β / log(1 + α (β − 1))        --- (6)

                           (n̂c is kept within [0, 50]; α is a constant, α > 1)

 November 7, 08               DSSG Group Meeting                       18/31
Distribution Index (nc) Update
When c is worse than both p1 and p2, the distribution index for the
next generation is calculated as (7). The distribution index is
increased in order to create an offspring (c') closer to its
parent (p2) than the offspring (c) of the previous generation.

1/α is used instead of α:

      n̂c = −1 + (nc + 1) log β / log(1 + α (β − 1))        --- (6)

      n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α)        --- (7)

                                                    (α is a constant, α > 1)
 November 7, 08                DSSG Group Meeting                             19/31
Distribution Index (nc) Update
So we are done with the expanding case (2). How about the contracting
case (1)? The same process is applied for (1).

  (1)   c(β) = 0.5 (nc + 1) β^nc,          β ≤ 1
  (2)   c(β) = 0.5 (nc + 1) / β^(nc+2),    β > 1

  [Plot: c(β) with a uniform random number u < 0.5 falling in the
   contracting region (1).]

For the contracting region (1), the area under the curve from 0 to β
equals u:

     ∫_0^β  0.5 (nc + 1) t^nc  dt  =  u        --- (x)
                           November 7, 08                                        DSSG Group Meeting                            20/31
Distribution Index (nc) Update
                         for the contracting case
When c is better than both p1 and p2, the distribution index for the
next generation is calculated as (8). The distribution index is
decreased in order to create an offspring (c') farther away from its
parent (p2) than the offspring (c) of the previous generation:

     n̂c = (1 + nc)/α − 1        --- (8)

When c is worse than both p1 and p2, 1/α is used instead of α:

     n̂c = α (1 + nc) − 1        --- (9)

                                                   α is a constant (α >1 )
 November 7, 08               DSSG Group Meeting                             21/31
Distribution Index Update Summary

                            Expanding case (u > 0.5)                    Contracting case (u < 0.5)

c is better than parents    n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))      n̂c = (1 + nc)/α − 1

c is worse than parents     n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α)     n̂c = α (1 + nc) − 1

If u = 0.5, n̂c = nc.
 November 7, 08                  DSSG Group Meeting                       22/31
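
The four update rules in the table (plus the u = 0.5 case) can be collected into one small function. This is a sketch under stated assumptions: the function name is ours, and the "better than both parents" flag is passed in by the caller.

import math

def update_nc(nc, beta, u, child_better, alpha=1.5):
    # Distribution-index update for the next generation (summary table above).
    # child_better: True if the offspring is better than both parents.
    if u == 0.5:                                 # beta = 1: keep nc unchanged
        return nc
    a = alpha if child_better else 1.0 / alpha   # use 1/alpha when the child is worse
    if u > 0.5:                                  # expanding case, equations (6)/(7)
        return -1.0 + (nc + 1.0) * math.log(beta) / math.log(1.0 + a * (beta - 1.0))
    return (1.0 + nc) / a - 1.0                  # contracting case, equations (8)/(9)
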
Self-SBX Main Loop

  [Flowchart: from the initial population Pt, selection picks Parent 1
   and Parent 2; with probability pc (if rand < pc) crossover creates
   Offspring C1 and C2 and nc is updated; otherwise (if rand > pc) the
   mutation step is applied; the results form the next population Pt+1.
   Parameters: N = 150, initial nc = 2, α = 1.5.]

November 7, 08                           DSSG Group Meeting                                        23/31
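
Putting the pieces together, here is a minimal single-objective sketch of the main loop, reusing the sbx_offspring() and update_nc() helpers sketched earlier. The tournament selection, the uniform-reset mutation, the "better than both parents" test, and the [0, 50] clamp on nc are illustrative choices, not the authors' exact implementation.

import random

def self_sbx_ga(f, bounds, pop_size=150, pc=0.7, nc=2.0, alpha=1.5,
                generations=500):
    # Minimization of f over box constraints `bounds` = [(lo, hi), ...].
    n = len(bounds)
    pm = 1.0 / n                                     # per-variable mutation probability
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # Binary tournament selection of the two parents
            p1 = min(random.sample(pop, 2), key=f)
            p2 = min(random.sample(pop, 2), key=f)
            c1, c2 = p1[:], p2[:]
            if random.random() < pc:                 # crossover with probability pc
                for i in range(n):
                    c1[i], c2[i], beta, u = sbx_offspring(p1[i], p2[i], nc)
                better = min(f(c1), f(c2)) < min(f(p1), f(p2))
                nc = min(max(update_nc(nc, beta, u, better, alpha), 0.0), 50.0)
            # Uniform-reset mutation as a simple stand-in for the paper's mutation
            for c in (c1, c2):
                for i in range(n):
                    if random.random() < pm:
                        c[i] = random.uniform(*bounds[i])
            nxt += [c1, c2]
        pop = nxt[:pop_size]
    return min(pop, key=f)

# Example: 30-variable sphere function (the bounds are illustrative)
best = self_sbx_ga(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 30)
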
Simulation Results
Sphere Problem:

 SBX
 • Initial nc is 2.
 • Population size = 150
 • Pc = 0.9
 • The number of variables (n) = 30
 • Pm = 1/30
 • A run is terminated when a solution with a function value of 0.001 is found.
 • 11 runs are performed.

 Self-SBX
 • Pc = 0.7
 • α = 1.5

November 7, 08                 DSSG Group Meeting                        24/31
Fitness Value

                                   • Self-SBX can
                                     find better
                                     solutions




November 7, 08      DSSG Group Meeting              25/31
α Study
                         • α is varied in [1.05,2]
                         • The effect of α is not
                           significant since the
                           average number of
                           function evaluations is
                           similar.




November 7, 08   DSSG Group Meeting              26/31
Simulation Results
Rastrigin's Function:

 Many local optima, one global minimum (0)
 The number of variables = 20
 SBX
 • Initial nc is 2.
 • Population size = 100
 • Pc = 0.9
 • The number of variables (n) = 30
 • Pm = 1/30
 • A run is terminated when a solution with a function value of 0.001 is found or the
   maximum of 40,000 generations is reached.
 • 11 runs are performed.

 Self-SBX
 • Pc = 0.7
 • α = 1.5

November 7, 08                               DSSG Group Meeting                   27/31
Fitness Value

  [Plot: best, median, and worst fitness values for SBX and Self-SBX on
   Rastrigin's function.]

• Self-SBX can find a solution very close to the global optimum.

November 7, 08      DSSG Group Meeting              28/31
α Study
                              • α is varied in
                                [1.05,10]
                              • The effect of α is
                                not significant since
                                the average number
                                of function
                                evaluations is
                                similar.
                              • α = 3 is the best

November 7, 08   DSSG Group Meeting                29/31
Self-SBX in MOOP
                       • Self-SBX is applied in
                         NSGA-II.
                       • A larger hypervolume is
                         better.




November 7, 08        DSSG Group Meeting          30/31
Conclusion
• Self-SBX is proposed in this paper.
    – The distribution index is adjusted adaptively at run
      time.
• The simulation results show that Self-SBX can
  find better solutions in both single-objective and
  multi-objective optimization problems.

• However, a constant α still has to be chosen to tune the
  distribution index update.

November 7, 08           DSSG Group Meeting                31/31
