Reducing the time of heuristic algorithms for the Symmetric TSP
1. Reducing the time of heuristic algorithms for the Symmetric TSP
Guilherme Polo
Available at http://goo.gl/rjSw
3. Summary
Problem
Strategy
Greedy
2-Opt
GRASP
Final results
Extras
Bibliography
4. Problem
The Traveling Salesman Problem (TSP) consists in finding a Hamiltonian cycle of minimal cost.
For N points: N! possible tours
Possible tours with distinct costs in the STSP: N! / (2N)
5. Strategy
Find a valid initial tour
  Constructive algorithm: Greedy NN
Try to reduce the cost of the initial tour
  Local search: 2-Opt, 2.1/4-Opt
Try to escape from local optima by using GRASP
7. Greedy
Algorithm 1 naive-greedy-nn(start, n, C)
  let T[1..n] be a new array
  visited = ∅; curr = start; cost = 0
  for i = 1 to n − 1
    T[i] = curr; visited = visited ∪ {curr}
    min = ∞; next = NIL
    for j = 1 to n
      if j ∉ visited and C(curr, j) < min
        min = C(curr, j)
        next = j
    cost = cost + min
    curr = next
  T[n] = curr
  cost = cost + C(curr, start)
  return (T, cost)
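As a concrete sketch, the pseudocode above translates to the following Python (an illustrative rendering, not the deck's C implementation; the function name and the Euclidean cost closure are introduced here):

```python
import math

def naive_greedy_nn(start, points):
    """Naive O(n^2) nearest-neighbor tour construction, as in
    NAIVE-GREEDY-NN: points is a list of (x, y), start an index."""
    def cost(i, j):
        (x1, y1), (x2, y2) = points[i], points[j]
        return math.hypot(x1 - x2, y1 - y2)

    n = len(points)
    tour, visited = [], set()
    curr, total = start, 0.0
    for _ in range(n - 1):
        tour.append(curr)
        visited.add(curr)
        best, nxt = math.inf, None
        for j in range(n):  # scans every point: the bottleneck the deck fixes
            if j not in visited and cost(curr, j) < best:
                best, nxt = cost(curr, j), j
        total += best
        curr = nxt
    tour.append(curr)
    total += cost(curr, start)  # close the Hamiltonian cycle
    return tour, total
```

On four corners of a unit square this yields the perimeter tour of cost 4.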
8. Problems with NAIVE-GREEDY-NN
1. Considers every neighbor
2. Not practical to work with an n×n matrix
  ~47.68 GB for an 80,000-point instance (assuming 8-byte integers)
  Excessive paging.
9. Problems with NAIVE-GREEDY-NN
1. Considers every neighbor
2. Not practical to work with an n×n matrix
  ~47.68 GB for an 80,000-point instance (assuming 8-byte integers)
  80000 * 80000 * 8 = 51,200,000,000 bytes
  51,200,000,000 / 1024 / 1024 / 1024 ≈ 47.68 GB
  Excessive paging.
10. Problems with NAIVE-GREEDY-NN
1. Not practical to work with an n×n matrix
  In the pseudocode, C is a function that computes the cost between two points
  Repeated calculations, but still cheaper than working with an n×n matrix
11. Problems with NAIVE-GREEDY-NN
1. Considers every neighbor
Testing empirically with some TSPLIB instances (gcc -O2, Mac OS X):
  vm1084 ~0.02 s    d18512   ~2.94 s
  d2103  ~0.04 s    pla33810 ~8.58 s
  d15112 ~2.15 s    pla85900 ~56.04 s
Average of 5 executions, user time + system time
12. Improving the time of the Greedy algorithm
A more adequate structure* solves the problem
  vm1084 ~0.01 s    d18512   ~0.04 s
  d2103  ~0.01 s    pla33810 ~0.07 s
  d15112 ~0.04 s    pla85900 ~0.17 s
* Along with operations that make good use of the structure
13. Improving the time of the Greedy algorithm
[Bar chart, log scale: execution time (seconds), Naïve vs Improved —
vm1084: 0.02 vs 0.01; d2103: 0.04 vs 0.01; d15112: 2.15 vs 0.04;
d18512: 2.94 vs 0.04; pla33810: 8.58 vs 0.07; pla85900: 56.04 vs 0.17]
14. Improving the time of the Greedy algorithm
[Same chart with improvement factors: vm1084 2.00×; d2103 4.00×;
d15112 53.75×; d18512 73.50×; pla33810 122.57×; pla85900 329.64×]
15. Improving the time of the Greedy algorithm
[Stacked bar chart: share of execution time in the improved Greedy spent on output, input, initialization, other work, and the greedy tour itself, for d15112, d18512, pla33810 and pla85900; as instances grow, initialization's share shrinks (~37% down to ~15%) while the greedy tour's share grows (~25% up to ~34%)]
16. Time for running Greedy NN
[Chart over pla85900, rl2-1116700, rl2-2147500, rl2-3178300, rl2-4209100:
time (s) with linear trend y = 0.0966x − 0.1196, R² = 0.9988]
17. Time for structure initialization
[Chart over the same instances: time (s) with power-law trend
y = 0.0408x^1.4135, R² = 0.9941]
18. Initialization and Greedy tour
[Chart comparing initialization time (s) and greedy-tour time (s) for
pla85900, rl2-1116700, rl2-2147500, rl2-3178300, rl2-4209100]
19. Improving the time of the Greedy algorithm
Data structure: k-d tree (2-d), Bentley 1975
Optimized for efficient search: Friedman, Bentley, Finkel 1977
Adequate operations: Bentley 1990 (points don't change)
21. k-d tree
or kd-tree; or kdtree; or multidimensional binary search tree
Allows performing proximity operations efficiently:
  Nearest neighbor
  Neighbors within a fixed radius
Example uses: ray tracing, n-body simulation, 2-opt, databases
22. Example construction algorithm
build(P, l, u, d)
  node = kdnode-new()
  if u − l == 0
    node.val = P[l]
  else
    m = (l + u)/2
    if (d mod 2) == 0
      sort(P, l, u, X)
      node.cut = P[m].x
    else
      sort(P, l, u, Y)
      node.cut = P[m].y
    node.left = build(P, l, m, d + 1)
    node.right = build(P, m + 1, u, d + 1)
  return node
23. Example construction algorithm
build(P, l, u, d)
  node = kdnode-new()
  if u − l == 0          // maximum amount of points in a bucket; 1 here
    node.val = P[l]
  else
    m = (l + u)/2
    if (d mod 2) == 0
      sort(P, l, u, X)
      node.cut = P[m].x
    else
      sort(P, l, u, Y)
      node.cut = P[m].y
    node.left = build(P, l, m, d + 1)
    node.right = build(P, m + 1, u, d + 1)
  return node
24. Example construction algorithm
build(P, l, u, d)
  node = kdnode-new()
  if u − l == 0          // maximum amount of points in a bucket; 1 here
    node.val = P[l]
  else
    m = (l + u)/2
    if (d mod 2) == 0
      sort(P, l, u, X)   // beware of the cost
      node.cut = P[m].x
    else
      sort(P, l, u, Y)   // beware of the cost
      node.cut = P[m].y
    node.left = build(P, l, m, d + 1)
    node.right = build(P, m + 1, u, d + 1)
  return node
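The BUILD pseudocode above can be sketched in Python as follows (an illustrative translation using nested dicts and a bucket size of 1; the deck's actual tree uses arrays of kdnodes and larger buckets):

```python
def build(points, depth=0):
    """k-d tree construction sketch: cycle x/y cuts, split at the median.
    Leaves hold a point in "val"; internal nodes hold the "cut" value."""
    if len(points) == 1:
        return {"val": points[0]}              # bucket of size 1
    axis = depth % 2                           # even depth cuts on x, odd on y
    pts = sorted(points, key=lambda p: p[axis])  # the costly per-level sort
    m = (len(pts) + 1) // 2                    # 1-based median, m = (l + u)/2
    return {
        "cut": pts[m - 1][axis],
        "left": build(pts[:m], depth + 1),
        "right": build(pts[m:], depth + 1),
    }
```

Running it on the six points of the construction example reproduces the cuts traced on the following slides (root x = 10, then y = 20 and y = 15, and so on).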
25. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points; the trace table (l, u, d, m, cut, val) is still empty]
26. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m cut | val
1 6 0 | 3  –  | NIL
<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)> — Sort by x
27. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m cut | val
1 6 0 | 3  –  | NIL
<(5;5), (10;20), (10;30), (20;15), (25;10), (30;25)> — Sorted by x
28. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m cut | val
1 6 0 | 3 10  | NIL
<(5;5), (10;20), (10;30), (20;15), (25;10), (30;25)> — Sorted by x
29. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m cut | val
1 6 0 | 3 10  | NIL
1 3 1 | 2 20  | NIL
<(5;5), (10;20), (10;30), (20;15), (25;10), (30;25)> — Sorted by y (left sub-array)
30. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m cut | val
1 6 0 | 3 10  | NIL
1 3 1 | 2 20  | NIL
1 2 2 | 1  5  | NIL
<(5;5), (10;20), (10;30), (20;15), (25;10), (30;25)> — Sorted by x (sub-array)
31. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m   cut | val
1 6 0 | 3   10  | NIL
1 3 1 | 2   20  | NIL
1 2 2 | 1    5  | NIL
1 1 3 | NIL NIL | (5;5)
<(5;5), (10;20), (10;30), (20;15), (25;10), (30;25)>
32. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m   cut | val
1 6 0 | 3   10  | NIL
1 3 1 | 2   20  | NIL
1 2 2 | 1    5  | NIL
1 1 3 | NIL NIL | (5;5)
2 2 3 | NIL NIL | (10;20)
<(5;5), (10;20), (10;30), (20;15), (25;10), (30;25)>
33. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m   cut | val
1 6 0 | 3   10  | NIL
1 3 1 | 2   20  | NIL
1 2 2 | 1    5  | NIL
1 1 3 | NIL NIL | (5;5)
2 2 3 | NIL NIL | (10;20)
3 3 2 | NIL NIL | (10;30)
<(5;5), (10;20), (10;30), (20;15), (25;10), (30;25)>
34. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m   cut | val
1 6 0 | 3   10  | NIL
1 3 1 | 2   20  | NIL
1 2 2 | 1    5  | NIL
1 1 3 | NIL NIL | (5;5)
2 2 3 | NIL NIL | (10;20)
3 3 2 | NIL NIL | (10;30)
4 6 1 | 5   15  | NIL
<(5;5), (10;20), (10;30), (25;10), (20;15), (30;25)> — Sorted by y (right sub-array)
35. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
[Scatter plot of the six points]
l u d | m   cut | val
1 6 0 | 3   10  | NIL
1 3 1 | 2   20  | NIL
1 2 2 | 1    5  | NIL
1 1 3 | NIL NIL | (5;5)
2 2 3 | NIL NIL | (10;20)
3 3 2 | NIL NIL | (10;30)
4 6 1 | 5   15  | NIL
4 5 2 | 4   20  | NIL
<(5;5), (10;20), (10;30), (20;15), (25;10), (30;25)> — Sorted by x (sub-array)
40. Construction example
tree = BUILD(<(5;5), (25;10), (20;15), (10;20), (30;25), (10;30)>, 1, 6, 0)
Another representation (internal nodes show the cut dimension and value):

              X:10
            /      \
        Y:20        Y:15
        /   \       /    \
     X:5  (10;30) X:20  (30;25)
     /  \         /   \
 (5;5) (10;20) (20;15) (25;10)
41. Example algorithm for Top-Down NN search
kdtree-nn(root, nntarget)
  target = nntarget
  dist = ∞
  nn = NIL
  rnn(root, 0)
  return nn

rnn(root, d)
  if root.val ≠ NIL                 // leaf
    if root.val == target
      return
    thisdist = euc-2d(root.val, target)
    if thisdist < dist
      dist = thisdist
      nn = root.val
  else
    cutval = root.cut
    if (d mod 2) == 0
      thisval = target.x
    else
      thisval = target.y
    if thisval ≤ cutval
      rnn(root.left, d + 1)
      if (thisval + dist) > cutval
        rnn(root.right, d + 1)
    else
      rnn(root.right, d + 1)
      if (thisval − dist) < cutval
        rnn(root.left, d + 1)
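The top-down search above can be sketched in Python over a nested-dict tree (an illustrative translation; the tree literal below is the example tree from the construction slides, and the helper names are introduced here):

```python
import math

# The example tree (leaves: {"val": p}; internal: {"cut": c, "left", "right"}).
TREE = {"cut": 10,
        "left": {"cut": 20,
                 "left": {"cut": 5,
                          "left": {"val": (5, 5)},
                          "right": {"val": (10, 20)}},
                 "right": {"val": (10, 30)}},
        "right": {"cut": 15,
                  "left": {"cut": 20,
                           "left": {"val": (20, 15)},
                           "right": {"val": (25, 10)}},
                  "right": {"val": (30, 25)}}}

def nearest(node, target, depth=0, best=None):
    """Top-down NN search: descend toward the target, then visit the far
    subtree only if the current ball around the target crosses the cut.
    The target itself is skipped, as in RNN."""
    if best is None:
        best = [None, math.inf]        # [nn, dist], mutated in place
    if "val" in node:                  # leaf
        if node["val"] != target:
            d = math.dist(node["val"], target)
            if d < best[1]:
                best[0], best[1] = node["val"], d
    else:
        axis = depth % 2
        near, far = ((node["left"], node["right"])
                     if target[axis] <= node["cut"]
                     else (node["right"], node["left"]))
        nearest(near, target, depth + 1, best)
        if abs(target[axis] - node["cut"]) < best[1]:   # ball crosses the cut
            nearest(far, target, depth + 1, best)
    return best[0], best[1]
```

Querying (10;30) returns (10;20) at distance 10, matching the traced example on the next slides.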
42. Example NN search
[k-d tree diagram from slide 40]
Find the closest point to (10;30)
43. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
Find the closest point to (10;30)
44. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
Find the closest point to (10;30)
45. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
Find the closest point to (10;30)
46. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
Find the closest point to (10;30)
47. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
Find the closest point to (10;30)
48. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
Euclidean distances:
        5;5    10;20  10;30  20;15  25;10
10;20   15.81
10;30   25.50  10
20;15   18.03  11.18  18.03
25;10   20.62  18.03  25     7.07
30;25   32.02  20.62  20.62  14.14  15.81
Find the closest point to (10;30)
49. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
Find the closest point to (10;30)
50. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
10 − 10 < 5
dist unaffected
Find the closest point to (10;30)
51. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
10 − 10 < 5
dist unaffected
Euclidean distances:
        5;5    10;20  10;30  20;15  25;10
10;20   15.81
10;30   25.50  10
20;15   18.03  11.18  18.03
25;10   20.62  18.03  25     7.07
30;25   32.02  20.62  20.62  14.14  15.81
Find the closest point to (10;30)
52. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
10 − 10 < 5
dist unaffected
Find the closest point to (10;30)
53. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
10 − 10 < 5
dist unaffected
10 + 10 > 10
Find the closest point to (10;30)
54. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
10 − 10 < 5
dist unaffected
10 + 10 > 10
30 > 15
dist unaffected
30 − 10 ≮ 15
Find the closest point to (10;30)
55. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
10 − 10 < 5
dist unaffected
10 + 10 > 10
30 > 15
dist unaffected
30 − 10 ≮ 15
Euclidean distances:
        5;5    10;20  10;30  20;15  25;10
10;20   15.81
10;30   25.50  10
20;15   18.03  11.18  18.03
25;10   20.62  18.03  25     7.07
30;25   32.02  20.62  20.62  14.14  15.81
Find the closest point to (10;30)
56. Example NN search
[k-d tree diagram from slide 40]
10 <= 10
30 > 20
return
30 − ∞ < 20
10 > 5
dist = 10; nn = (10;20)
10 − 10 < 5
dist unaffected
10 + 10 > 10
30 > 15
dist unaffected
30 − 10 ≮ 15
Find the closest point to (10;30)
57. Example NN search
[k-d tree diagram from slide 40]
Find the closest point to (10;30)
59. Choices for the k-d tree
Max bucket size: 5 kdnodes
Cutting hyperplane selection: select-rs* + insertion sort for |sub-array| ≤ 16
BNDS_LEVEL: 3 (not used in the greedy search)
* Implementation found in Robert Sedgewick's book
60. Greedy
Algorithm 2 kdtree-greedy-nn(tree, start, n, C)
  let T[1..n] be a new array
  T[1] = start
  curr = start
  cost = 0
  for i = 2 to n
    kdtree-delete(tree, curr)
    next = kdtree-nearest(tree, curr, C)
    T[i] = next
    cost = cost + C(curr, next)
    curr = next
  cost = cost + C(curr, start)
  kdtree-undelete-all(tree, n)
  return (T, cost)
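The delete/nearest interplay above can be sketched end to end in Python (an illustrative stand-in for the deck's implementation: it restates the earlier build/search sketches so it runs standalone, and models kdtree-delete as a lazy "deleted" flag on leaves; all helper names are introduced here):

```python
import math

def build(points, depth=0):
    """Bucket-size-1 k-d tree; leaves carry a 'deleted' flag so visited
    cities can be removed and later restored (cf. KDTREE-UNDELETE-ALL)."""
    if len(points) == 1:
        return {"val": points[0], "deleted": False}
    axis = depth % 2
    pts = sorted(points, key=lambda p: p[axis])
    m = (len(pts) + 1) // 2
    return {"cut": pts[m - 1][axis],
            "left": build(pts[:m], depth + 1),
            "right": build(pts[m:], depth + 1)}

def nearest(node, target, depth=0, best=None):
    """Top-down NN search that skips deleted leaves."""
    if best is None:
        best = [None, math.inf]
    if "val" in node:
        if not node["deleted"]:
            d = math.dist(node["val"], target)
            if d < best[1]:
                best[:] = [node["val"], d]
    else:
        axis = depth % 2
        near, far = ((node["left"], node["right"])
                     if target[axis] <= node["cut"]
                     else (node["right"], node["left"]))
        nearest(near, target, depth + 1, best)
        if abs(target[axis] - node["cut"]) < best[1]:
            nearest(far, target, depth + 1, best)
    return best

def find_leaf(node, p, depth=0):
    """Locate the leaf holding point p (lazy delete marks it)."""
    if "val" in node:
        return node
    axis = depth % 2
    child = "left" if p[axis] <= node["cut"] else "right"
    return find_leaf(node[child], p, depth + 1)

def kdtree_greedy_nn(points, start):
    """KDTREE-GREEDY-NN sketch: delete the current city, query its nearest
    remaining neighbor, extend the tour, repeat."""
    tree = build(points)
    tour, cost, curr = [start], 0.0, start
    for _ in range(len(points) - 1):
        find_leaf(tree, curr)["deleted"] = True   # kdtree-delete (lazy)
        nxt, d = nearest(tree, curr)
        tour.append(nxt)
        cost += d
        curr = nxt
    cost += math.dist(curr, start)                # close the cycle
    return tour, cost
```

On the six-point construction example, starting at (5;5), this produces the tour (5;5) → (10;20) → (10;30) → (20;15) → (25;10) → (30;25).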
61. Results with Greedy search
           Naïve        Improved*    Diff ~
vm1084     290,806      286,437      1.52%
d2103      86,504       86,765       -0.29%
d15112     1,960,503    1,921,015    2.05%
d18512     799,220      779,783      2.49%
pla33810   77,332,499   81,131,055   -4.68%
pla85900   163,516,994  174,486,522  -6.29%
* Best result found starting from every point
62. Results with Greedy search — Diff = (Naïve / Improved − 1) * 100
           Naïve        Improved*    Diff ~
vm1084     290,806      286,437      1.52%
d2103      86,504       86,765       -0.29%
d15112     1,960,503    1,921,015    2.05%
d18512     799,220      779,783      2.49%
pla33810   77,332,499   81,131,055   -4.68%
pla85900   163,516,994  174,486,522  -6.29%
* Best result found starting from every point
63. Results with Greedy search — Diff = (Naïve / Improved − 1) * 100
           Naïve            Improved*           Diff ~
vm1084     290,806 [3]      286,437 [304]       1.52%
d2103      86,504 [761]     86,765 [806]        -0.29%
d15112     1,960,503 [0]    1,921,015 [6523]    2.05%
d18512     799,220 [0]      779,783 [10818]     2.49%
pla33810   77,332,499 [0]   81,131,055 [11394]  -4.68%
pla85900   163,516,994 [0]  174,486,522 [38007] -6.29%
* Best result found starting from every point
65. Local search — 2-Opt
[Diagram: a 9-city cycle before and after a 2-exchange]
1 5 4 3 2 6 7 8 9  →  1 2 3 4 5 6 7 8 9
a b   c d          →  a c   b d
66. 2-Opt Algorithm
naive-2opt(T, n, cost, C)
  repeat
    gain = 0
    for i = 1 to n
      for j = i + 2 to n
        if ((j + 1) mod n) == i
          continue
        a = T[i]; b = T[(i + 1) mod n]
        c = T[j]; d = T[(j + 1) mod n]
        new-gain = C(a, c) + C(b, d) − (C(a, b) + C(c, d))
        if new-gain < gain
          gain = new-gain
          best = [b, c]
    if gain < 0
      cost = cost + gain
      invert(T, best[1], best[2])
  until gain == 0
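A Python sketch of the best-improvement loop above (an illustrative translation with 0-based indexing; the function name and the cost callback are introduced here):

```python
def naive_2opt(tour, cost):
    """NAIVE-2OPT sketch: repeatedly find the 2-exchange with the largest
    cost reduction and apply it (segment reversal) until none improves."""
    n = len(tour)
    total = sum(cost(tour[i], tour[(i + 1) % n]) for i in range(n))
    while True:
        gain, best = 0.0, None
        for i in range(n):
            for j in range(i + 2, n):
                if (j + 1) % n == i:     # same edge pair seen the other way
                    continue
                a, b = tour[i], tour[(i + 1) % n]
                c, d = tour[j], tour[(j + 1) % n]
                g = cost(a, c) + cost(b, d) - (cost(a, b) + cost(c, d))
                if g < gain:
                    gain, best = g, (i + 1, j)
            # j + 1 would wrap onto i only for the closing edge, skipped above
        if best is None:
            return tour, total
        total += gain
        lo, hi = best
        tour[lo:hi + 1] = reversed(tour[lo:hi + 1])   # invert segment b..c
```

On a crossed tour over the four corners of a unit square, one exchange uncrosses it and the cost drops to the perimeter, 4.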
67. Problems with NAIVE-2OPT
Always considers every point
Impracticable to apply to non-small instances
68. Problems with NAIVE-2OPT
         Time (s)  Exchanges  Cost       Prox ~
vm1084   3.01      120        257,295    7.52%
d2103    8.30      86         82,125     2.08%
d15112   26700*    2,064      1,659,003  5.46%
Time only for the 2-Opt execution
Starting points: tours from the Greedy results table
* Only one execution, on another computer (the other instances took ~50% longer to run on it)
69. Improving the 2-Opt time
Given the points a and b, consider only the points that are closer to a than a is to b
The k-d tree enables this search
Operation: search for near neighbors within a fixed radius dist(a, b)
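The fixed-radius operation can be sketched on the nested-dict tree used earlier (an illustrative sketch; the tree literal is the construction-example tree and the function name is introduced here):

```python
import math

# The example tree from the construction slides.
TREE = {"cut": 10,
        "left": {"cut": 20,
                 "left": {"cut": 5,
                          "left": {"val": (5, 5)},
                          "right": {"val": (10, 20)}},
                 "right": {"val": (10, 30)}},
        "right": {"cut": 15,
                  "left": {"cut": 20,
                           "left": {"val": (20, 15)},
                           "right": {"val": (25, 10)}},
                  "right": {"val": (30, 25)}}}

def within_radius(node, center, r, depth=0, found=None):
    """Fixed-radius near-neighbor search: collect every point within
    distance r of `center`, descending only into halves the ball overlaps."""
    if found is None:
        found = []
    if "val" in node:
        if node["val"] != center and math.dist(node["val"], center) <= r:
            found.append(node["val"])
    else:
        axis = depth % 2
        if center[axis] - r <= node["cut"]:    # ball overlaps the left half
            within_radius(node["left"], center, r, depth + 1, found)
        if center[axis] + r > node["cut"]:     # ball overlaps the right half
            within_radius(node["right"], center, r, depth + 1, found)
    return found
```

For example, within radius 15 of (10;30) only (10;20) qualifies; widening to 20 also picks up (20;15).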
70. Improving the 2-Opt time
Approximations:
Upon discovering an exchange that reduces the cost, perform it and end the current search
If a 2-Opt move does not reduce the cost, try to perform a 2.1/4-Opt move
71. Improving the 2-Opt time
Other considerations:
The fixed-radius search is only executed if point b is not the closest one to a
After performing an exchange, the loop goes back one step
72. 2.1/4-Opt
[Diagrams: a 9-city tour with edges (a,b) and (c,d) highlighted; the plain 2-Opt exchange is marked with an x]
73. 2.1/4-Opt
[Diagram: the 2.1/4-Opt move transforms the tour
1 2 6 7 8 9 5 4 3  into  1 2 9 6 7 8 5 4 3]
75. Exchanges in the approximated 2-Opt
[Charts over pla85900, rl2-515400, rl2-944900:
2-Opt swaps, linear trend y = 20546x + 252.05, R² = 0.999
Approximated 2-Opt time (s), trend y = 1.3369x² − 7.1294x + 10.326, R² = 0.9936
Time per "2-exchange" (s), trend y = 4.722E-6x² − 1.905E-5x + 7.019E-5, R² = 0.9957]
76. Exchanges in the approximated 2.1/4-Opt
[Charts over pla85900, rl2-515400, rl2-944900:
2.1/4-Opt swaps, linear trend y = 25368x − 2354.1, R² = 0.9976
Approximated 2.1/4-Opt time (s), trend y = 1.7278x² − 9.8544x + 14.61, R² = 0.9848
Time per "2.1/4-exchange" (s), trend y = 4.631E-6x² − 1.768E-5x + 6.255E-5, R² = 0.9852]
77. Exchanges in the local search
[Charts over pla85900, rl2-515400, rl2-944900 comparing 2-Opt and 2.1/4-Opt:
number of swaps, total time (s), and time per exchange (s)]
78. Evaluation
The "true" 2-Opt did beat the approximations in 2 of the 3 instances executed
But it took much longer (> 380,000×)
In the approximated 2-Opt, considering other neighbors after finding an improving exchange did not result in a better final tour in most cases
79. Evaluation
Tour representation by simple arrays consumed ~25% of the execution time performing inversions and "slides" in the pla85900 instance
Suggestions:
satellite list
two-level tree
80. Evaluation
Tour representation by simple arrays consumed ~25% of the execution time performing inversions and "slides" in the pla85900 instance
Suggestions:
satellite list
two-level tree
2.1/4-Opt movement
81. GRASP
kdtree-grasp(n)
  let B[1..n] be a new array
  best-cost = ∞
  while stopping conditions not met
    GT = kdtree-semigreedy-tour(random(n))
    T, cost = kdtree-2opt(GT)
    if cost < best-cost
      best-cost = cost
      B = T
  return (B, best-cost)
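The restart loop above has the classic GRASP shape, which can be sketched generically in Python (an illustrative skeleton: `construct` and `local_search` stand in for kdtree-semigreedy-tour and kdtree-2opt, and the patience-based stop is a simplification of the deck's stopping conditions):

```python
import math
import random

def grasp(construct, local_search, max_iters=100, patience=10, seed=0):
    """Generic GRASP loop: build a randomized greedy solution, improve it
    with local search, keep the best; stop after `max_iters` rounds or
    `patience` consecutive rounds without improvement."""
    rng = random.Random(seed)
    best_tour, best_cost = None, math.inf
    stale = 0
    for _ in range(max_iters):
        tour, cost = local_search(construct(rng))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
            stale = 0
        else:
            stale += 1
            if stale >= patience:
                break
    return best_tour, best_cost
```

As a toy usage, random restarts over four square corners (with local search as a plain cost evaluation) always return a valid tour no worse than the crossed one.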
82. GRASP
Strategy
Apply the improved algorithms
discussed earlier
Modify KDTREE-NEAREST in order to
collect the neighbors found during
the search for the nearest one
83. GRASP
Strategy
Apply the improved algorithms discussed earlier
Modify KDTREE-NEAREST in order to collect the neighbors found during the search for the nearest one ("KNN")
84. GRASP
Params
Max number of executions: N²
Stops if the cost remains the same for 5000 iterations
Max RCL size: drawn from [1, 3]
Selection: logarithmic bias
85. GRASP
RCL
Max heap
Easy to exchange the current largest element for a new smaller one (even though the RCL is currently quite small)
Ranking by the bias function is performed backwards
86. GRASP
RCL - Observation
Max heap: inserting <9 8 3 4 1 7 5 2 11> yields 11 9 8 5 3 7 4 2 1
Elements do not necessarily decrease to the right (does this contribute to the randomness in GRASP?)
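The observation is easy to check (an illustrative demo using Python's heapq on negated keys as a stand-in max heap; heapq's sift order may produce a different backing array than the slide's own insertion routine, but the point — the level-order array is not sorted — holds either way):

```python
import heapq

def max_heap_levels(values):
    """Insert values into a max heap (heapq is a min heap, so keys are
    negated) and return the backing array in level order."""
    heap = []
    for v in values:
        heapq.heappush(heap, -v)
    return [-v for v in heap]
```

The root is always the maximum and every parent dominates its children, yet the array read left to right is not non-increasing.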
87. GRASP
Results
           KG+2.1/4            GRASP*
           T (s)  Cost         T (s)  Execs.  Cost         Prox. ~
vm1084     0.00   254,355      >>     16,207  249,164      4.12%
d2103      0.00   82,402       >>     7,260   87,860 †     =
d15112     0.07   1,666,102    >>     6,451   1,668,978    =
d18512     0.09   680,798      >>     10,014  685,330      =
pla33810   0.24   70,744,397   >>     11,903  71,776,398   =
pla85900   0.70   150,777,638  >>>    7,445   152,935,500  =
† Worse than Greedy-NN
* Results found using the rand48 family of functions; seed fixed at 0
88. Evaluation
Managed to improve only the smallest among these instances
Worse than the pure Greedy on the d2103 problem
For slightly bigger instances, GRASP with a k-d tree does not seem to be a good choice*
* Considering all the choices implemented in this k-d tree and in this GRASP
89. Evaluation
For smaller TSPLIB instances (not shown up to this point), GRASP demonstrates good results:
a280, att48, ch130: < 1% from optimal
berlin52, dantzig42, eil101, eil51, fri26, kroA100, kroC100, rat99, st70: optimal
91. Bibliography
K-d tree
Bentley, J. L. (1975). Multidimensional binary search trees used for associative searching. Commun. ACM, 18(9):509–517.
Bentley, J. L. (1990). K-d trees for semidynamic point sets. In SCG '90: Proceedings of the Sixth Annual Symposium on Computational Geometry, pages 187–197, New York, NY, USA. ACM.
Bentley, J. J. (1992). Fast algorithms for geometric traveling salesman problems. ORSA Journal on Computing, 4(4):387–411.
Friedman, J. H., Bentley, J. L., and Finkel, R. A. (1977). An algorithm for finding best matches in logarithmic expected time. ACM Trans. Math. Softw., 3(3):209–226.
92. Bibliography
Bresina, J. L. (1996). Heuristic-biased stochastic sampling. In Proceedings of AAAI-96, pages 271–278.
Mateus, G. R., Resende, M. G. C., and Silva, R. M. A. (2009). GRASP: Procedimentos de busca gulosos, aleatórios e adaptativos [greedy randomized adaptive search procedures]. Technical report, AT&T Labs Research.
Sedgewick, R. (1990). Algorithms in C. Addison-Wesley Professional.