The document summarizes a project on multi-objective optimization using the NSGA II and SPEA2 algorithms. A team of five students implemented both algorithms in MATLAB and tested them on various benchmark functions with two or more objectives, then compared the results of the two algorithms and analyzed the Pareto fronts obtained.
Multi-objective optimization and benchmark function results
1. Project In-charge: Mr. Divya Kumar
Multi-objective optimization using NSGA II and SPEA2
Team(CS07):
1 Piyush Agarwal
2 Saquib Aftab
3 Ravi Ratan
4 Ravi Shankar
5 Pradhumna Mainali
Multi-objective optimization 1/31
2. What we did
Studied genetic algorithms: single-objective and
multi-objective optimization problems.
Implemented NSGA II and the Strength Pareto Evolutionary
Algorithm (SPEA2) in MATLAB
Tested the SPEA2 algorithm on all benchmark functions
Tested the NSGA II algorithm on all benchmark functions
Compared the results
3. What is an Evolutionary Algorithm?
Evolutionary algorithms (EAs) are often well-suited for
optimization problems involving several, often conflicting
objectives
Evolutionary algorithms typically generate sets of
solutions, allowing computation of an approximation of
the entire Pareto front
SPEA2 and NSGA II are two such Evolutionary
Algorithms implemented on multi-objective functions
4. Life cycle of EA
Initialization: Initializing the population in the first
generation, satisfying the bounds and constraints of the
problem.
Parent Selection: Selection of the fittest individuals for the
mating pool.
Recombination: Forming new individuals from the
mating pool. Crossover and mutation are applied to the
parents to produce new individuals.
Survivor Selection: The fittest individuals from parents and
children combined are selected as the population for the next
generation.
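The four phases above can be sketched as a generic single-objective EA loop. This is a minimal Python illustration, not the project's MATLAB code; the population size, operator rates, and the arithmetic-crossover/Gaussian-mutation choices are our own placeholders:

```python
import random

def evolve(fitness, lower, upper, pop_size=20, generations=50,
           mutation_rate=0.1):
    # Initialization: random individuals within the bounds
    pop = [random.uniform(lower, upper) for _ in range(pop_size)]
    for _ in range(generations):
        # Parent selection: binary tournaments fill the mating pool
        pool = [min(random.sample(pop, 2), key=fitness)
                for _ in range(pop_size)]
        # Recombination: arithmetic crossover plus Gaussian mutation
        children = []
        for a, b in zip(pool[::2], pool[1::2]):
            w = random.random()
            child = w * a + (1 - w) * b
            if random.random() < mutation_rate:
                child += random.gauss(0, 0.1 * (upper - lower))
            children.append(min(max(child, lower), upper))
        # Survivor selection: fittest of parents and children survive
        pop = sorted(pop + children, key=fitness)[:pop_size]
    return pop[0]

# Minimizing (x - 2)^2 should drive the population toward x = 2
best = evolve(lambda x: (x - 2) ** 2, lower=-10, upper=10)
```

The elitist survivor selection (keeping the best of parents and children combined) mirrors the survivor-selection step above and guarantees the best solution found is never lost.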
5. Life cycle of EA contd.
Figure 1: Life cycle of Evolutionary Algorithm
6. NSGA II Algorithm
Input
1 N (Population Size)
2 P (Population)
3 Q (Offspring)
4 T (Maximum number of generations)
Output
1 A (non-dominated set)
7. NSGA II Algorithm Contd.
1 Initialization: Generate an initial population P.
2 Mating selection: Perform binary tournament selection
with replacement on P in order to fill the mating pool.
3 Variation: Apply recombination and mutation operators to
the mating pool and store the resulting offspring
population in Q.
4 Non-dominated sort: Perform non-dominated sorting on the
combined population P and Q.
5 Front division: Divide the combined population into fronts;
front 0 is the non-dominated front.
6 New generation: Select the new population of size N from
the fronts in order.
8. Fast non-dominated sorting
1 Each individual i is compared with every other individual j.
2 ni is the count of individuals that dominate the ith
individual.
3 Si is the set of individuals that i dominates.
4 When ni = 0, the individual is non-dominated and is
assigned to the first front.
5 After obtaining the first front, for each individual j in Si,
nj is decremented by 1; individuals whose count drops to 0
form the next front, as in Step 4.
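The steps above translate almost directly into code. A minimal Python sketch for minimization, where `pop` is a list of objective vectors and the variables `n` and `S` mirror ni and Si above:

```python
def dominates(p, q):
    """p dominates q if p is no worse in every objective and
    strictly better in at least one (minimization)."""
    return (all(a <= b for a, b in zip(p, q))
            and any(a < b for a, b in zip(p, q)))

def fast_non_dominated_sort(pop):
    n = [0] * len(pop)        # n[i]: how many individuals dominate i
    S = [[] for _ in pop]     # S[i]: individuals that i dominates
    fronts = [[]]
    for i, p in enumerate(pop):
        for j, q in enumerate(pop):
            if dominates(p, q):
                S[i].append(j)
            elif dominates(q, p):
                n[i] += 1
        if n[i] == 0:         # nobody dominates i: first front
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in S[i]:    # peel off the current front
                n[j] -= 1
                if n[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]        # drop the trailing empty front

# Two mutually non-dominated points, one dominated point:
fronts = fast_non_dominated_sort([(1, 4), (2, 2), (3, 5)])
# fronts == [[0, 1], [2]]
```

The point (3, 5) is worse than both (1, 4) and (2, 2) in every objective, so it lands in the second front on its own.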
9. Generation of NSGA II
Figure 2: One generation of NSGA II algorithm
11. SPEA2 Algorithm
Input
1 N (Population size)
2 N̄ (Archive size)
3 T (Maximum number of generations)
Output
1 A (non-dominated set)
12. SPEA2 Algorithm contd.
1 Initialization: Generate an initial population P0 and create
the empty archive P̄0 = ∅. Set t = 0.
2 Fitness assignment: Calculate fitness values of individuals
in Pt and P̄t.
3 Environmental selection: Copy all non-dominated
individuals in Pt and P̄t to P̄t+1, truncating or filling as
needed to keep the archive size N̄.
4 Termination: If t ≥ T or another stopping criterion is
satisfied, set A to the set of non-dominated decision
vectors in P̄t+1 and stop.
5 Mating selection: Perform binary tournament selection
with replacement on P̄t+1 in order to fill the mating pool.
6 Variation: Apply recombination and mutation operators to
the mating pool and set Pt+1 to the resulting population.
Increment the generation counter (t = t + 1) and go to Step 2.
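Step 2 (fitness assignment) is not expanded above. In SPEA2 the fitness of an individual combines a raw fitness, the summed strengths of its dominators, with a density estimate based on the distance to its k-th nearest neighbour. A minimal sketch over the union of population and archive (objective vectors only, minimization assumed):

```python
import math

def dominates(p, q):
    return (all(a <= b for a, b in zip(p, q))
            and any(a < b for a, b in zip(p, q)))

def spea2_fitness(union):
    m = len(union)
    # Strength S(i): how many individuals i dominates
    S = [sum(dominates(p, q) for q in union) for p in union]
    # Raw fitness R(i): summed strengths of i's dominators
    R = [sum(S[j] for j in range(m) if dominates(union[j], union[i]))
         for i in range(m)]
    k = int(math.sqrt(m))
    fitness = []
    for i in range(m):
        # Density D(i) = 1 / (sigma_k + 2), where sigma_k is the
        # distance to the k-th nearest neighbour
        dists = sorted(math.dist(union[i], q) for q in union)
        D = 1.0 / (dists[k] + 2.0)
        fitness.append(R[i] + D)
    return fitness

# Non-dominated individuals get R(i) = 0, hence fitness < 1
f = spea2_fitness([(1, 4), (2, 2), (3, 5)])
```

Because D(i) is always below 1, a fitness below 1 marks a non-dominated individual, which is what environmental selection copies into the next archive.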
14. Testing on Benchmark Functions
Both algorithms, NSGA II and SPEA2, were tested on all
benchmark functions. Benchmark functions may be convex or
non-convex (unconstrained) or may have one or more
constraints. In all benchmark plots, the red curve
represents NSGA II and the yellow curve represents SPEA2.
The x-axis represents the first objective function and the
y-axis represents the second objective function.
15. Schaffer function N. 1
Minimize:
f1(x) = x^2
f2(x) = (x − 2)^2
subject to:
−A ≤ x ≤ A, where 10 ≤ A ≤ 10^5
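The two objectives are trivial to evaluate, which is what makes Schaffer N. 1 a standard smoke test; its Pareto-optimal set is x in [0, 2], where improving one objective necessarily worsens the other. A quick check (illustrative, not the project's MATLAB harness):

```python
def schaffer_n1(x):
    # f1 is minimized at x = 0, f2 at x = 2
    f1 = x ** 2
    f2 = (x - 2) ** 2
    return f1, f2

# Along the Pareto set x in [0, 2] the objectives trade off:
print(schaffer_n1(0.0))   # (0.0, 4.0)
print(schaffer_n1(1.0))   # (1.0, 1.0)
print(schaffer_n1(2.0))   # (4.0, 0.0)
```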
17. Schaffer function N. 2
Minimize:
f1(x) = −x,    if x ≤ 1
        x − 2,  if 1 < x ≤ 3
        4 − x,  if 3 < x ≤ 4
        x − 4,  if x > 4
f2(x) = (x − 5)^2
subject to:
−5 ≤ x ≤ 10
28. Constr-Ex problem function
Minimize:
f1(x, y) = x
f2(x, y) = (1 + y) / x
subject to:
g1(x, y) = y + 9x ≥ 6
g2(x, y) = −y + 9x ≥ 1
0.1 ≤ x ≤ 1
0 ≤ y ≤ 5
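A point is only admissible when both constraints and both variable bounds hold; a small helper makes the feasible region explicit (a sketch; the function names are ours):

```python
def constr_ex(x, y):
    # The two objectives of the Constr-Ex problem
    f1 = x
    f2 = (1 + y) / x
    return f1, f2

def feasible(x, y):
    # g1, g2 and the box bounds from the problem statement
    return (y + 9 * x >= 6 and
            -y + 9 * x >= 1 and
            0.1 <= x <= 1 and
            0 <= y <= 5)

print(feasible(0.5, 3))   # True:  g1 = 7.5 >= 6, g2 = 1.5 >= 1
print(feasible(0.2, 1))   # False: g1 = y + 9x = 2.8 < 6
```

Constrained evolutionary algorithms typically either repair infeasible individuals or penalize them during selection; a check like this is the building block for both.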
29. Work in progress
One application of multi-objective optimization to real-life
problems is portfolio optimization. In a portfolio problem with
an asset universe of n securities, let xi (i = 1, 2, . . . , n)
designate the proportion of initial capital to be allocated to
security i. There are typically two conflicting goals:
Minimize risk: Σ_{i=1}^{n} Σ_{j=1}^{n} xi σij xj
Maximize profit: Σ_{i=1}^{n} ri xi
where ri is the expected return of the ith security and σij is
the covariance between the ith and jth securities.
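With the covariance matrix and expected returns in hand, both objectives are single expressions. A minimal sketch, where the two-asset data below are made-up numbers purely for illustration:

```python
def portfolio_risk(x, sigma):
    # Risk: sum_i sum_j x_i * sigma_ij * x_j
    n = len(x)
    return sum(x[i] * sigma[i][j] * x[j]
               for i in range(n) for j in range(n))

def portfolio_return(x, r):
    # Profit: sum_i r_i * x_i
    return sum(ri * xi for ri, xi in zip(r, x))

# Two assets, illustrative covariance matrix and expected returns
sigma = [[0.04, 0.01],
         [0.01, 0.09]]
r = [0.08, 0.12]
x = [0.6, 0.4]          # weights sum to 1

risk = portfolio_risk(x, sigma)   # 0.0336
ret = portfolio_return(x, r)      # 0.096
```

A multi-objective EA would evolve the weight vector x, minimizing the first function while maximizing the second, and return the whole risk-return Pareto front rather than a single portfolio.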
30. Constraints of Portfolio Optimization
Σ_{i=1}^{n} xi = 1
α ≤ xi ≤ β
dmin ≤ d ≤ dmax
5/10/40 rule
0 ≤ α ≤ β ≤ 1
where α and β are the minimum and maximum capital
proportions to be allocated to a single security, d is the number
of non-zero securities held, and dmin and dmax are the minimum
and maximum numbers of non-zero securities in the portfolio.
31. Conclusion
The project showed that multi-objective evolutionary
algorithms can solve multi-objective problems while satisfying
given sets of constraints. A higher number of generations leads
to better solutions, up to an upper bound beyond which the
solutions converge. Multi-objective optimization algorithms can
address various real-life applications by casting them as sets of
objective functions subject to constraints.