International Journal of Information, Communication and Computing Technology
Jagan Institute of Management Studies, New Delhi
1 Assistant Professor, Department of Computer Science & IT, Jagannath University, Jaipur
2 Professor, Department of Mathematics, Jagannath University, Jaipur
3 Assistant Professor, Department of Computer Science, St. Xavier’s College, Jaipur
Email: 1 [email protected] ,
2 [email protected] ,
3 [email protected]
Copyright ©IJICCT, Vol II, Issue II (July-Dec 2014): ISSN 2347-7202
Self-Adaptive Spider Monkey Optimization Algorithm for Engineering Optimization Problems
1Sandeep Kumar,
2Vivek Kumar Sharma,
3Rajani Kumari
ABSTRACT
Algorithms inspired by the intelligent behavior of simple agents are currently very popular among researchers. Spider Monkey Optimization (SMO) is a comparatively young algorithm motivated by the extraordinary foraging behavior of spider monkeys, and it has proved very successful at solving optimization problems. This work presents a Self-Adaptive Spider Monkey Optimization (SaSMO) algorithm for optimization problems. The proposed strategy is self-adaptive in nature, so no manual parameter setting is required. SaSMO gives better results for the considered problems; results are compared with basic SMO and its recent variant MPU-SMO.
KEYWORDS
Spider Monkey Optimization Algorithm, Swarm
intelligence, Engineering optimization problems, Nature
Inspired Algorithms.
1. INTRODUCTION
The intelligent food-foraging behavior of spider monkeys inspired J. C. Bansal and his colleagues to develop a new algorithm: J. C. Bansal et al. [15] proposed the Spider Monkey Optimization (SMO) algorithm in 2013. It is a recent, popular algorithm motivated by the intelligent behavior of simple natural agents, in this case the extraordinary behavior of spider monkeys while searching for food, and it is based on their fission-fusion social structure (FFSS). SMO falls into the category of Nature Inspired Algorithms (NIA), which are inspired by natural phenomena or by the extraordinary behavior of intelligent animals. Depending on their source of inspiration and motivation, NIAs include evolutionary, immune, neural, physical, probabilistic, stochastic and swarm algorithms. In the last decade a number of new algorithms inspired by natural phenomena have been developed. Some recent developments include the Water Wave Optimization (WWO) algorithm [2], motivated by shallow water wave theory, which mimics the eye-catching phenomena of water waves such as propagation, refraction, and breaking. A. Brabazon et al. [3] proposed a new population-based strategy based on the social roosting and foraging behavior of one species of bird and named it the Raven Roosting Optimization algorithm. Based on the living behavior of microalgae, a photosynthetic species, S. A. Uymaz [4] developed the Artificial Algae Algorithm (AAA). S. Mirjalili et al. [6] proposed a novel nature-inspired strategy for continuous optimization problems, the Grey Wolf Optimizer (GWO), which mimics the leadership hierarchy and hunting mechanism of grey wolves. A. Askarzadeh introduced the Bird Mating Optimizer (BMO) [7] for designing effective search techniques; this algorithm figuratively emulates the behavior of bird species that breed broods with superior genes. X. Li et al. [11] developed the Animal Migration Optimization (AMO) algorithm, which mimics the migration activities of animals.
Like other population-based algorithms, SMO maintains a population of candidate solutions, here represented by the food sources of spider monkeys. The quality of a food source is decided by computing its fitness. The SMO algorithm is a comparatively simple, fast, population-based stochastic search strategy. While searching for the optimal solution it must maintain a balance between two basic activities: the selection process, which ensures exploitation of the preceding knowledge, and the adaptation process, which enables exploration of diverse regions of the search space. It has been observed that SMO is very good at exploring the local search region and at exploiting the best feasible solutions in its vicinity [1] [5]. Therefore, to solve a complex problem like parameter estimation for a frequency-modulated sound wave, this paper uses a new variant of the SMO algorithm. The proposed algorithm is adaptive in nature: it automatically modifies the radius of the search area during the local leader phase and the local leader decision phase, and it updates positions with a fitness-based rule. The fitness of a solution decides its quality; there are a number of methods to calculate the fitness of a solution, but all of them involve the function value.
There are very few research papers on the SMO algorithm in the literature, as it is a very young algorithm. Recently S. Kumar et
al. proposed a couple of variants of the SMO algorithm. The first is Modified Position Update in Spider Monkey Optimization (MPU-SMO) [5], which tries to enhance exploration and exploitation together by modifying the local leader phase and global leader phase with the help of the golden section search (GSS) [22] technique, stimulated by memetic search in ABC [16]. GSS-inspired memetic strategies have recently been used in various algorithms such as RMABC [8], IMeABC [9], EnABC [10], MSDE [12] and IoABC [13]. The second is Fitness-Based Position Update in Spider Monkey Optimization (FPSMO) [1]. The fitness of a solution plays a very important role in population-based stochastic algorithms: it denotes the quality of a solution, i.e., how suitable it is for the next iteration. FPSMO updates positions based on each individual's fitness. Assuming that a highly fit solution has good neighbors, it modifies the step size according to fitness, taking large steps for highly fit solutions and small steps for poorly fit ones. FPSMO thus moves away from bad solutions and keeps searching in the proximity of good solutions.
The paper is organized as follows: the major steps of the SMO algorithm are explained in Section 2. Section 3 describes the newly proposed variant of SMO. In Section 4, some engineering optimization problems are discussed in detail. The solutions of the selected problems and the performance of the proposed strategy are analyzed in Sections 5 and 6. Finally, Section 7 concludes the paper, followed by references.
2. SPIDER MONKEY OPTIMIZATION
ALGORITHM
The extraordinary food-foraging conduct of spider monkeys motivated J. C. Bansal et al. [15] to develop a new population-based metaheuristic, which they named the Spider Monkey Optimization algorithm. The original SMO algorithm [15] consists of seven phases:
Population initialization
Local Leader Phase (LLP)
Global Leader Phase (GLP)
Global Leader Learning (GLL) phase
Local Leader Learning (LLL) phase
Local Leader Decision (LLD) phase
Global Leader Decision (GLD) phase
The Spider Monkey Optimization algorithm is not considered a pure swarm intelligence algorithm, as it elects leaders (coordinators) to control the groups. It is stochastic in nature, as there are random components in each step.
2.1 Phases of Spider Monkey Optimization (SMO)
Algorithm
The SMO algorithm is a population-based iterative approach with seven key steps, including the initialization of the population. Every phase is summarized in the next few subsections.
Population initialization
At first, a population of N spider monkeys is initialized. Each member of the initial population is denoted by a D-dimensional vector SMOi, where i = 1, 2, ..., N. Every SMOi represents a feasible solution of the problem under consideration and is initialized using Eq. (1):

SMOij = SMOminj + U(0, 1) × (SMOmaxj − SMOminj)    (1)

Here SMOminj and SMOmaxj indicate the lower and upper bounds of SMOi in the jth direction, and U(0, 1) is a uniformly distributed random number in [0, 1].
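The initialization of Eq. (1) can be sketched as follows. This is a minimal Python illustration (the paper's experiments are in C); the function and variable names are ours, not from the paper:

```python
import numpy as np

def init_population(n, dim, lower, upper, rng=None):
    """Initialize N spider monkeys uniformly inside the bounds, Eq. (1):
    SMO_ij = SMO_minj + U(0,1) * (SMO_maxj - SMO_minj)."""
    rng = np.random.default_rng(rng)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    return lower + rng.random((n, dim)) * (upper - lower)

# Example: a swarm of 50 monkeys in a 30-dimensional [-5.12, 5.12] space.
swarm = init_population(50, 30, lower=[-5.12] * 30, upper=[5.12] * 30, rng=1)
```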
Local Leader Phase (LLP)
The subsequent phase is the Local Leader Phase. Based on the experience of the local leader and the group members, each SMO updates its current location; the fitness of the new location is compared with that of the current location, and greedy selection is applied. The ith SMO belonging to the kth local group updates its location using Eq. (2):

SMOnewij = SMOij + rand[0, 1] × (LLkj − SMOij) + rand[−1, 1] × (SMOrj − SMOij)    (2)

where SMOij denotes the ith SMO in the jth dimension, LLkj corresponds to the location of the kth local group leader in the jth dimension, and SMOrj is the rth SMO, selected at random from the kth group such that r ≠ i.
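Eq. (2) can be sketched as a componentwise update. This is an illustrative Python fragment (names are ours); greedy selection against the old position happens afterward in the full algorithm:

```python
import numpy as np

def local_leader_phase(sm_i, local_leader, sm_r, rng=None):
    """Candidate position for spider monkey i, Eq. (2):
    new = sm_i + U(0,1)*(LL_k - sm_i) + U(-1,1)*(sm_r - sm_i),
    with fresh random weights drawn per dimension."""
    rng = np.random.default_rng(rng)
    dim = sm_i.shape[0]
    return (sm_i
            + rng.random(dim) * (local_leader - sm_i)
            + rng.uniform(-1.0, 1.0, dim) * (sm_r - sm_i))
```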
Global Leader Phase (GLP)
The Global Leader Phase (GLP) starts just after the LLP finishes. Based on the experience of the global leader and the members of the local group, each SMO updates its position using Eq. (3):

SMOnewij = SMOij + rand[0, 1] × (GLj − SMOij) + rand[−1, 1] × (SMOrj − SMOij)    (3)
where GLj stands for the global leader's position in the jth dimension and j ∈ {1, 2, ..., D} denotes a randomly selected index.
The SMOi update their locations with probabilities pi, where the probability of a particular solution is calculated from its fitness. There are a number of different methods for computing fitness and probability; here pi is computed using Eq. (4):

pi = 0.9 × (fitnessi / max_fitness) + 0.1    (4)
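Eq. (4) maps fitness into a selection probability in [0.1, 1.0], so even the worst solution keeps a small chance of being updated. A minimal Python sketch (names are ours):

```python
import numpy as np

def selection_probability(fitness):
    """p_i = 0.9 * fitness_i / max(fitness) + 0.1, Eq. (4).
    The best individual always gets p = 1.0; the floor of 0.1
    keeps poor individuals selectable."""
    fitness = np.asarray(fitness, dtype=float)
    return 0.9 * fitness / fitness.max() + 0.1
```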
Global Leader Learning (GLL) phase
The global leader now updates its position using a greedy approach: the most highly fit solution in the current swarm is chosen as the global leader. This phase also checks whether the position of the global leader has changed and updates the Global Limit Count accordingly.
Local Leader Learning (LLL) phase
Each local leader likewise updates its position using a greedy approach: the most highly fit solution within a group is chosen as its local leader. This phase also checks whether the position of the local leader has changed and updates the Local Limit Count accordingly.
Local Leader Decision (LLD) phase
In this phase a decision is taken about the position of a local leader that has not been updated for a threshold number of iterations, the Local Leader Limit (LLlimit). In that case the position of the LL is either randomly re-initialized or decided with the help of Eq. (5):

SMnewij = SMij + rand[0, 1] × (GLj − SMij) + rand[0, 1] × (SMij − LLkj)    (5)
Global Leader Decision (GLD) phase
In this phase a decision is taken about the position of the global leader: if it has not been updated for a threshold number of iterations, the Global Leader Limit (GLlimit), then the swarm is divided into subgroups of smaller size. The number of subgroups has an upper bound, the maximum number of groups (MG). During this phase, local leaders are decided for the newly created subgroups using the LLL process.
The SMO algorithm is very simple and somewhat resembles the artificial bee colony algorithm. It has very few control parameters: the local leader limit, the global leader limit, the maximum number of groups, and the perturbation rate. For a swarm of size N, the maximum number of groups should be N/10, the local leader limit should be D × N (with dimension D), the global leader limit should lie in the range [N/2, 2N], and the perturbation rate should lie in the range [0.1, 0.9]. These parameter values are not hard bounds and may vary as per the requirements of the problem, but the values suggested above generally give the best results.
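The suggested parameter settings above can be collected in a small helper. This is an illustrative sketch; the function and key names are ours, not part of the SMO specification:

```python
def suggested_parameters(n, dim):
    """Suggested SMO control-parameter settings, derived from the rules
    in the text for a swarm of size n and problem dimension dim."""
    return {
        "max_groups": n // 10,                          # MG = N/10
        "local_leader_limit": dim * n,                  # LLlimit = D*N
        "global_leader_limit_range": (n // 2, 2 * n),   # within [N/2, 2N]
        "perturbation_rate_range": (0.1, 0.9),
    }
```

For instance, with N = 50 and D = 30 this reproduces the setup used later in the paper (MG = 5, LLlimit = 1500).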
3. SELF-ADAPTIVE SPIDER MONKEY
OPTIMIZATION (SaSMO) ALGORITHM
In order to solve complex optimization problems, this paper presents a novel and efficient approach based on the Spider Monkey Optimization algorithm. The newly proposed strategy is self-adaptive in nature, as it modifies the position of the local leader based on its current position and uses cognitive learning to update the current positions of agents. The weight given to the current location declines linearly from 100 percent to 50 percent over the iterations. This is based on the assumption that the solution of the local leader phase will be far from the optimal solution in the first iteration and will converge closely to the optimal solution in later iterations, so the position of the local leader is adjusted dynamically. The proposed algorithm also modifies the position update process of the global leader phase: it uses the selection probability of each individual to modify the position.
During the local leader phase, SaSMO generates a new location for every group member using Eq. (9) and applies greedy selection between the existing position and the newly computed position:

SMnewij = SMij + rand[0, 1] × (LLkj − SMij) + pi × (SMrj − SMij)    (9)

Here pi is the selection probability of the ith solution, calculated from the fitness of the individuals as per Eq. (4). During the global leader phase it generates a new position for every group member using Eq. (10):

SMnewij = SMij + rand[0, 1] × (GLj − SMij) + pi × (SMrj − SMij)    (10)
In order to update the position of the local leader, it uses Algorithm 1.
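The SaSMO update of Eqs. (9)-(10) differs from basic SMO only in the last term: the rand[−1, 1] weight is replaced by the selection probability pi. A minimal Python sketch (names are ours):

```python
import numpy as np

def sasmo_update(sm_i, leader, sm_r, p_i, rng=None):
    """SaSMO candidate position, Eqs. (9)/(10): pass the local leader
    LL_k for the LLP, or the global leader GL for the GLP. The last term
    is weighted by the selection probability p_i of Eq. (4) instead of a
    random number in [-1, 1]."""
    rng = np.random.default_rng(rng)
    dim = sm_i.shape[0]
    return sm_i + rng.random(dim) * (leader - sm_i) + p_i * (sm_r - sm_i)
```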
The values of wmax and wmin (the bounds of the adaptive weight in Algorithm 1) are preset to 1 and 0.5, respectively, so the weight on the local leader's existing location dwindles linearly from 100 percent to 50 percent over the run; these values were decided after a series of experiments.
The position of the global leader is updated in the same manner as in the basic SMO algorithm. The proposed algorithm is summarized in Algorithm 2. It relies on the idea that the solution of the local leader phase will be distant from the best feasible solution in the first iteration and will converge closely to the optimal solution in subsequent iterations: Algorithm 2 dynamically regulates the location of the local leader by allowing a spider monkey to wander with a large step size in the first iterations, and the step size for this wandering decreases as the iteration count grows. This is based on the concept that initial solutions are poor and get refined with an increasing number of iterations, together with the assumption that better solutions lie in the proximity of the best-fitted solution; to take advantage of this, the step size is reduced automatically with the iterations.
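The linearly declining weight described above can be written as a one-line schedule. This is an illustrative Python sketch; the symbols wmax and wmin stand for the "max" and "min" constants of the text:

```python
def local_leader_weight(iteration, max_iter, w_max=1.0, w_min=0.5):
    """Self-adaptive weight on the (SM_ij - LL_kj) term in Algorithm 1:
    declines linearly from w_max (100%) at iteration 0 to w_min (50%)
    at the final iteration."""
    return w_max - (w_max - w_min) * iteration / max_iter
```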
4. OPTIMIZATION PROBLEMS
Selecting the best suitable solution from a large set of available solutions under constraints matters in the fields of engineering, management and science. Optimization is the complex process of finding the best feasible value of a particular objective function subject to a set of constraints; objective functions may be unimodal or multimodal, differentiable or non-differentiable, and even discontinuous. The performance of the SaSMO algorithm is tested on various standard benchmark functions (Table 1) and on some real-world problems, described in the next subsections.
4.1 Pressure Vessel Design Problem
This problem minimizes the total cost of the material, forming and welding of a cylindrical vessel [19]. In pressure vessel design, four design variables are generally considered: shell thickness (x1), spherical head thickness (x2), radius of the cylindrical shell (x3) and shell length (x4). A simple mathematical formulation of this problem is:

f(x) = 0.6224 x1 x3 x4 + 1.7781 x2 x3² + 3.1611 x1² x4 + 19.84 x1² x3
Algorithm 1: Position update of the local leader
If the position of a local leader is not updated after LLlimit iterations, then apply the following steps:
if U(0, 1) > pr then
    SMnewij = SMminj + U(0, 1) × (SMmaxj − SMminj)
else
    SMnewij = SMij + rand[0, 1] × (GLj − SMij) + (wmax − (wmax − wmin) × iter / max_iter) × (SMij − LLkj)
Algorithm 2: Self-Adaptive Spider Monkey Optimization (SaSMO) Algorithm
Initialize all parameters;
Calculate fitness;
Choose leaders (both global and local);
Repeat until the termination criterion is fulfilled:
    Generate new locations for all group members (pi is the selection probability):
        SMnewij = SMij + rand[0, 1] × (LLkj − SMij) + pi × (SMrj − SMij)
    Apply greedy selection between the existing position and the newly computed position;
    Compute the probability pi (the probability of the ith solution) for every group member:
        pi = 0.9 × (fitnessi / max_fitness) + 0.1
    Generate new positions for every member of the group:
        SMnewij = SMij + rand[0, 1] × (GLj − SMij) + pi × (SMrj − SMij)
    Update the local and global leaders' positions;
    If the position of a local leader is not updated after LLlimit, then:
        if U(0, 1) > pr then
            SMnewij = SMminj + rand[0, 1] × (SMmaxj − SMminj)
        else
            SMnewij = SMij + rand[0, 1] × (GLj − SMij) + (wmax − (wmax − wmin) × iter / max_iter) × (SMij − LLkj)
    If the position of the global leader is not updated after GLlimit, then:
        if (Global_Limit_Count > GLlimit) then
            Set Global_Limit_Count = 0;
            if (number of groups < MG) then
                Divide the population into groups;
            else
                Merge all the groups into a single large group;
            Update the positions of the local leaders.
subject to

g1(x) = x1 − 0.0193 x3 ≥ 0,  g2(x) = x2 − 0.00954 x3 ≥ 0,
g3(x) = π x3² x4 + (4/3) π x3³ − 750 × 1728 ≥ 0

The search boundaries for the variables are
1.125 ≤ x1 ≤ 12.5, 0.625 ≤ x2 ≤ 12.5, 1.0 × 10⁻⁸ ≤ x3 ≤ 240 and 1.0 × 10⁻⁸ ≤ x4 ≤ 240.
The best-known global optimum is f(1.125, 0.625, 55.8592, 57.7315) = 7197.729 [19]. The tolerable error for this problem is 1.0E-05.
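The pressure vessel cost function can be evaluated directly. This is an illustrative Python sketch of the objective only (names are ours); in practice the three constraints would be handled, e.g., by a penalty term:

```python
def pressure_vessel_cost(x1, x2, x3, x4):
    """Cost of the pressure vessel design: material, forming and
    welding terms as given in the problem formulation."""
    return (0.6224 * x1 * x3 * x4
            + 1.7781 * x2 * x3 ** 2
            + 3.1611 * x1 ** 2 * x4
            + 19.84 * x1 ** 2 * x3)
```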
4.2 Lennard-Jones Problem
The objective is to minimize the potential energy of a set of N atoms. The location Xi of atom i has three coordinates, so the dimension of the search space is 3N: a point x is the concatenation of the Xi, written x = (X1, X2, ..., XN). The function can be written as

f(x) = Σ(i=1..N−1) Σ(j=i+1..N) [ ‖Xi − Xj‖^(−2α) − ‖Xi − Xj‖^(−α) ]

Here the search space is [−2, 2], N = 5 and α = 6 [6].
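The pairwise potential above can be sketched as follows (illustrative Python; names are ours). With α = 6 this is the familiar 12-6 Lennard-Jones form:

```python
import numpy as np

def lennard_jones(x, n=5, alpha=6):
    """Total pairwise Lennard-Jones potential of n atoms; x concatenates
    the 3-D coordinates X_1..X_n, so len(x) == 3*n."""
    pts = np.asarray(x, dtype=float).reshape(n, 3)
    total = 0.0
    for i in range(n - 1):
        for j in range(i + 1, n):
            d = np.linalg.norm(pts[i] - pts[j])
            total += d ** (-2 * alpha) - d ** (-alpha)
    return total
```

For two atoms the minimum of each pair term sits at distance 2^(1/6), where the term equals −0.25.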
4.3 Parameter estimation for frequency-modulated sound waves
Frequency-modulated (FM) sound wave synthesis has a significant role in several contemporary music systems. Parameter optimization of an FM synthesizer is a six-dimensional optimization problem in which the vector to be optimized is X = {a1, w1, a2, w2, a3, w3} of the sound wave given in Eq. (11). The problem is to produce a sound (11) analogous to the target (12). This is a highly intricate multimodal problem with strong epistasis, with minimum value f(X) = 0. The expressions for the estimated sound wave and the target sound wave are:

y(t) = a1 · sin(w1 · t · θ + a2 · sin(w2 · t · θ + a3 · sin(w3 · t · θ)))    (11)

y0(t) = 1.0 · sin(5.0 · t · θ + 1.5 · sin(4.8 · t · θ + 2.0 · sin(4.9 · t · θ)))    (12)

where θ = 2π/100 and the parameters are defined in the range [−6.4, 6.35]. The fitness function is the sum of squared errors between the estimated wave (11) and the target wave (12):

f(X) = Σ(t=0..100) (y(t) − y0(t))²    (13)

The acceptable error for this problem is 1.0E-05, i.e. an algorithm is considered successful if it finds an error less than the acceptable error within a given number of generations.
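Eqs. (11)-(13) translate directly into code. This is an illustrative Python sketch (names are ours); note that the target wave (12) is just Eq. (11) evaluated at the target parameter vector:

```python
import math

THETA = 2 * math.pi / 100  # theta from the problem definition

def fm_wave(params, t):
    """Six-parameter FM sound wave of Eq. (11)."""
    a1, w1, a2, w2, a3, w3 = params
    return a1 * math.sin(
        w1 * t * THETA
        + a2 * math.sin(w2 * t * THETA + a3 * math.sin(w3 * t * THETA)))

# Target parameters of Eq. (12).
TARGET = (1.0, 5.0, 1.5, 4.8, 2.0, 4.9)

def fm_fitness(params):
    """Sum of squared errors against the target wave, Eq. (13)."""
    return sum((fm_wave(params, t) - fm_wave(TARGET, t)) ** 2
               for t in range(101))
```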
The Parameter Estimation for Frequency-Modulated Sound Waves problem [15] has been solved by a number of researchers with various algorithms. Recently S. Das et al. [20] used differential evolution with a neighborhood-based mutation operator to tackle this problem. A. Rajshekhar et al. [19] made use of a Lévy-mutated Artificial Bee Colony algorithm for global optimization. A novel approach, Particle Swarm Optimization with Dynamic Local Search for frequency modulation parameter identification, was proposed by Y. Zhang et al. [18] in 2012. Y. Lai et al. [21] proposed a new approach to mechanize the optimization of the parameters of an FM synthesizer with the help of a genetic algorithm. S. Ghorai et al. [17] developed a faster DE algorithm to automate parameter calibration for frequency-modulated sound waves. S. Kumar et al. [12] give a memetic approach in DE to find the parameters, and recently S. Kumar et al. [14] developed a new strategy using opposition-based learning and applied it to this problem. The literature thus offers a large number of techniques for estimating parameters of frequency-modulated sound waves.
4.4 Compression Spring Problem
The compression spring problem [6] minimizes the weight of a compression spring subject to constraints on shear stress, surge frequency and minimum deflection, and to limits on the outside diameter and on the design variables. Three design variables are considered: the number of active coils (x1), the mean coil diameter (x2) and the wire diameter (x3). A simple mathematical representation of this problem is:

x1 ∈ {1, 2, 3, ..., 70} with granularity 1, x2 ∈ [0.6, 3], x3 ∈ [0.207, 0.5] with granularity 0.001

and four constraints

g1: (8 Cf Fmax x2) / (π x3³) − S ≤ 0,  g2: lf − lmax ≤ 0,
g3: σp − σpm ≤ 0,  g4: σw − (Fmax − Fp) / K ≤ 0

where

Cf = 1 + 0.75 x3 / (x2 − x3) + 0.615 x3 / x2, Fmax = 1000, S = 189000,
lf = Fmax / K + 1.05 (x1 + 2) x3, lmax = 14, σp = Fp / K, σpm = 6,
Fp = 300, K = 11.5 × 10⁶ x3⁴ / (8 x1 x2³), σw = 1.25

and the function to be minimized is

f(X) = π² x2 x3² (x1 + 2) / 4

The best-known solution is (7, 1.386599591, 0.292), which gives the fitness value f = 2.6254; 1.0E-04 is the tolerable error for the compression spring problem.
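The spring objective can be checked against the best-known solution. This is an illustrative Python sketch of the objective only (names are ours); constraint handling is omitted:

```python
import math

def spring_weight(x1, x2, x3):
    """Compression spring objective:
    f(X) = pi^2 * x2 * x3^2 * (x1 + 2) / 4,
    with x1 = number of active coils, x2 = mean coil diameter,
    x3 = wire diameter."""
    return math.pi ** 2 * x2 * x3 ** 2 * (x1 + 2) / 4

# Best-known solution quoted in the text.
best = spring_weight(7, 1.386599591, 0.292)
```

Evaluating at (7, 1.386599591, 0.292) reproduces the quoted optimum of about 2.6254, which confirms the formulation.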
Table 1: Test Problems

Test Problem | Objective Function | Search Range | Optimum Value | D | Acceptable Error
Sphere | f1(x) = Σ(i=1..D) xi² | [-5.12, 5.12] | f(0) = 0 | 30 | 1.0E-05
De-Jong's f4 | f2(x) = Σ(i=1..D) i·xi⁴ | [-5.12, 5.12] | f(0) = 0 | 30 | 1.0E-05
Griewank | f3(x) = 1 + (1/4000) Σ xi² − Π cos(xi/√i) | [-600, 600] | f(0) = 0 | 30 | 1.0E-05
Rosenbrock | f4(x) = Σ(i=1..D−1) [100(xi+1 − xi²)² + (xi − 1)²] | [-30, 30] | f(1) = 0 | 30 | 1.0E-01
Rastrigin | f5(x) = Σ [xi² − 10 cos(2π xi) + 10] | [-5.12, 5.12] | f(0) = 0 | 30 | 1.0E-05
Alpine | f6(x) = Σ |xi sin(xi) + 0.1 xi| | [-10, 10] | f(0) = 0 | 30 | 1.0E-05
Michalewicz | f7(x) = −Σ sin(xi) (sin(i·xi²/π))^20 | [0, π] | fmin = −9.66015 | 10 | 1.0E-05
Cosine Mixture | f8(x) = Σ xi² − 0.1 Σ cos(5π xi) | [-1, 1] | f(0) = −D × 0.1 | 30 | 1.0E-05
Exponential Problem | f9(x) = −exp(−0.5 Σ xi²) | [-1, 1] | f(0) = −1 | 10 | 1.0E-05
Brown 3 | f10(x) = Σ(i=1..D−1) [(xi²)^(x²(i+1)+1) + (x²(i+1))^(xi²+1)] | [-1, 4] | f(0) = 0 | 30 | 1.0E-05
Schwefel | f11(x) = Σ |xi| + Π |xi| | [-10, 10] | f(0) = 0 | 30 | 1.0E-05
Salomon Problem | f12(x) = 1 − cos(2π p) + 0.1 p, where p = √(Σ xi²) | [-100, 100] | f(0) = 0 | 30 | 1.0E-01
Axis Parallel hyper-ellipsoid | f13(x) = Σ i·xi² | [-5.12, 5.12] | f(0) = 0 | 30 | 1.0E-05
Sum of different powers | f14(x) = Σ |xi|^(i+1) | [-1, 1] | f(0) = 0 | 30 | 1.0E-05
Step Function | f15(x) = Σ (⌊xi + 0.5⌋)² | [-100, 100] | f(−0.5 ≤ x ≤ 0.5) = 0 | 30 | 1.0E-05
Inverted Cosine wave function | f16(x) = −Σ(i=1..D−1) exp(−I/8) cos(4√I), where I = xi² + x²(i+1) + 0.5 xi x(i+1) | [-5, 5] | f(0) = −D + 1 | 10 | 1.0E-05
Neumaier 3 Problem (NF3) | f17(x) = Σ (xi − 1)² − Σ(i=2..D) xi x(i−1) | [-100, 100] | fmin = −210 | 10 | 1.0E-01
Rotated hyper-ellipsoid | f18(x) = Σ(i=1..D) (Σ(j=1..i) xj)² | [-65.536, 65.536] | f(0) = 0 | 30 | 1.0E-05
Levy Montalvo 1 | f19(x) = (π/D) (10 sin²(π y1) + Σ(i=1..D−1) (yi − 1)² (1 + 10 sin²(π y(i+1))) + (yD − 1)²), where yi = 1 + (xi + 1)/4 | [-10, 10] | f(−1) = 0 | 30 | 1.0E-05
Levy Montalvo 2 | f20(x) = 0.1 (sin²(3π x1) + Σ(i=1..D−1) (xi − 1)² (1 + sin²(3π x(i+1))) + (xD − 1)² (1 + sin²(2π xD))) | [-5, 5] | f(1) = 0 | 30 | 1.0E-05
Ellipsoidal | f21(x) = Σ (xi − i)² | [-D, D] | f(1, 2, ..., D) = 0 | 30 | 1.0E-05
Beale function | f22(x) = (1.5 − x1(1 − x2))² + (2.25 − x1(1 − x2²))² + (2.625 − x1(1 − x2³))² | [-4.5, 4.5] | f(3, 0.5) = 0 | 2 | 1.0E-05
Colville function | f23(x) = 100(x2 − x1²)² + (1 − x1)² + 90(x4 − x3²)² + (1 − x3)² + 10.1[(x2 − 1)² + (x4 − 1)²] + 19.8(x2 − 1)(x4 − 1) | [-10, 10] | f(1) = 0 | 4 | 1.0E-05
Branin function | f24(x) = a(x2 − b x1² + c x1 − d)² + e(1 − f) cos(x1) + e | x1 ∈ [−5, 10], x2 ∈ [0, 15] | f(−π, 12.275) = 0.3979 | 2 | 1.0E-05
Kowalik function | f25(x) = Σ(i=1..11) [ai − x1(bi² + bi x2) / (bi² + bi x3 + x4)]² | [-5, 5] | f(0.1928, 0.1908, 0.1231, 0.1357) = 3.07E-04 | 4 | 1.0E-05
2D Tripod function | f26(x) = p(x2)(1 + p(x1)) + |x1 + 50 p(x2)(1 − 2p(x1))| + |x2 + 50(1 − 2p(x2))| | [-100, 100] | f(0, −50) = 0 | 2 | 1.0E-04
Shifted Rosenbrock | f27(x) = Σ(i=1..D−1) [100(zi² − z(i+1))² + (zi − 1)²] + fbias, z = x − o + 1 | [-100, 100] | f(o) = fbias = 390 | 10 | 1.0E-01
Shifted Sphere | f28(x) = Σ zi² + fbias, z = x − o | [-100, 100] | f(o) = fbias = −450 | 10 | 1.0E-05
Shifted Rastrigin | f29(x) = Σ (zi² − 10 cos(2π zi) + 10) + fbias, z = x − o | [-5, 5] | f(o) = fbias = −330 | 10 | 1.0E-02
Shifted Schwefel | f30(x) = Σ(i=1..D) (Σ(j=1..i) zj)² + fbias, z = x − o | [-100, 100] | f(o) = fbias = −450 | 10 | 1.0E-05
Shifted Griewank | f31(x) = Σ zi²/4000 − Π cos(zi/√i) + 1 + fbias, z = x − o | [-600, 600] | f(o) = fbias = −180 | 10 | 1.0E-05
Shifted Ackley | f32(x) = −20 exp(−0.2 √(Σ zi²/D)) − exp(Σ cos(2π zi)/D) + 20 + e + fbias, z = x − o | [-32, 32] | f(o) = fbias = −140 | 10 | 1.0E-05
Goldstein-Price | f33(x) = (1 + (x1 + x2 + 1)² (19 − 14x1 + 3x1² − 14x2 + 6x1x2 + 3x2²)) (30 + (2x1 − 3x2)² (18 − 32x1 + 12x1² + 48x2 − 36x1x2 + 27x2²)) | [-2, 2] | f(0, −1) = 3 | 2 | 1.0E-14
Six-hump camel back | f34(x) = (4 − 2.1x1² + x1⁴/3) x1² + x1x2 + (−4 + 4x2²) x2² | [-5, 5] | f(−0.0898, 0.7126) = −1.0316 | 2 | 1.0E-05
Easom's function | f35(x) = −cos(x1) cos(x2) exp(−((x1 − π)² + (x2 − π)²)) | [-10, 10] | f(π, π) = −1 | 2 | 1.0E-13
Dekkers and Aarts | f36(x) = 10⁵ x1² + x2² − (x1² + x2²)² + 10⁻⁵ (x1² + x2²)⁴ | [-20, 20] | f(0, 15) = f(0, −15) = −24777 | 2 | 5.0E-01
Hosaki Problem | f37(x) = (1 − 8x1 + 7x1² − (7/3)x1³ + (1/4)x1⁴) x2² exp(−x2) | x1 ∈ [0, 5], x2 ∈ [0, 6] | fmin = −2.3458 | 2 | 1.0E-06
McCormick | f38(x) = sin(x1 + x2) + (x1 − x2)² − (3/2)x1 + (5/2)x2 + 1 | x1 ∈ [−1.5, 4], x2 ∈ [−3, 3] | f(−0.547, −1.547) = −1.9133 | 30 | 1.0E-04
Meyer and Roth Problem | f39(x) = Σ(i=1..5) [x1 x3 ti / (1 + x1 ti + x2 vi) − yi]² | [-10, 10] | f(3.13, 15.16, 0.78) = 0.4E-04 | 3 | 1.0E-03
Shubert | f40(x) = −Σ(i=1..5) i cos((i + 1)x1 + 1) · Σ(i=1..5) i cos((i + 1)x2 + 1) | [-10, 10] | f(7.0835, 4.8580) = −186.7309 | 2 | 1.0E-05
Sinusoidal | f41(x) = −[A Π sin(xi − z) + Π sin(B(xi − z))], A = 2.5, B = 5, z = 30 | [0, 180] | f(90 + z) = −(A + 1) | 10 | 1.0E-02
Moved axis parallel hyper-ellipsoid | f42(x) = Σ 5i·xi² | [-5.12, 5.12] | f(x) = 0; x(i) = 5i, i = 1:D | 30 | 1.0E-15
5. EXPERIMENTAL SETUP
This paper validates the performance of the proposed Self-Adaptive Spider Monkey Optimization algorithm against the original technique on forty-two benchmark problems and four real-world problems: the Pressure Vessel Design problem, the Lennard-Jones problem, the Parameter Estimation for FM Sound Waves problem and the Compression Spring problem. The performance of the newly proposed algorithm is compared with the basic SMO algorithm [15] and with SMO's recent variant MPU-SMO [5], based on standard deviation (SD), mean error (ME), average function evaluations (AFE) and success rate (SR).
Experiments are performed in the C programming language with the following experimental setup for SaSMO:
The size of the swarm N = 50 (number of spider monkeys at initialization)
MG = 5 (maximum number of groups, limiting the number of spider monkeys in a group, as MG = N/10)
Global Leader Limit (GLlimit) = 50
Local Leader Limit (LLlimit) = 1500
pr ∈ [0.1, 0.4], linearly growing over the iterations
All remaining parameters are the same as in the basic SMO algorithm [15].
Table 2: Evaluation of the outcomes of test problems

Test Problem | SaSMO (SD, ME, AFE, SR) | MPU-SMO (SD, ME, AFE, SR) | SMO (SD, ME, AFE, SR)
Sphere | 1.60E-06, 8.25E-06, 14597.25, 100 | 6.57E-07, 9.54E-06, 44435.12, 100 | 9.33E-07, 8.89E-06, 16219.17, 100
De-Jong's | 3.13E-06, 5.77E-06, 11563.4, 100 | 6.45E-07, 9.47E-06, 27691.99, 100 | 1.22E-06, 8.31E-06, 12848.22, 100
Griewank | 9.80E-04, 1.05E-04, 28207.11, 99 | 3.42E-03, 1.12E-03, 87401.67, 89 | 1.45E-06, 8.84E-06, 31341.23, 100
Rosenbrock | 1.27E+01, 6.43E+00, 177433, 11 | 5.15E+01, 5.42E+01, 201808.6, 4 | 4.81E+01, 5.41E+01, 203397.1, 1
Rastrigin | 2.76E-06, 6.90E-06, 81293.64, 100 | 1.99E-06, 7.81E-06, 91623.6, 100 | 1.27E-06, 8.60E-06, 90326.27, 100
Alpine | 1.33E-05, 1.37E-05, 75277.59, 70 | 9.63E-06, 1.17E-05, 173750.6, 81 | 5.06E-07, 9.54E-06, 72530.65, 100
Michalewicz | 3.53E-06, 4.34E-06, 32553.34, 100 | 3.14E-06, 5.64E-06, 48314.99, 100 | 3.53E-06, 5.55E-06, 36170.38, 100
Cosine Mixture | 1.85E-06, 8.07E-06, 17603.42, 100 | 5.54E-07, 9.53E-06, 34828.34, 100 | 1.47E-02, 1.49E-03, 19559.36, 99
Exponential Problem | 1.38E-06, 8.65E-06, 11182.94, 100 | 3.12E-07, 9.70E-06, 31575.63, 100 | 9.39E-07, 8.91E-06, 12425.49, 100
Brown 3 | 1.84E-06, 8.08E-06, 14545.58, 100 | 4.82E-07, 9.50E-06, 53797.81, 100 | 9.23E-07, 8.79E-06, 16161.75, 100
Schwefel | 7.39E-07, 9.30E-06, 26217.68, 100 | 7.61E-07, 9.60E-06, 67482.73, 100 | 4.78E-07, 9.43E-06, 29130.75, 100
Salomon Problem | 1.49E-06, 8.40E-06, 16916.93, 100 | 4.23E-02, 9.38E-01, 77315.53, 89 | 4.15E-02, 9.24E-01, 18796.59, 100
Axis Parallel hyper-ellipsoid | 4.96E-01, 2.58E+00, 160839.9, 0 | 3.39E-07, 9.62E-06, 53227.79, 100 | 6.81E-07, 9.20E-06, 18711, 100
Sum of different powers | 2.71E-06, 6.17E-06, 5301.45, 100 | 2.03E-06, 7.54E-06, 7658.64, 100 | 1.96E-06, 7.34E-06, 5890.5, 100
Step Function | 0.00E+00, 0.00E+00, 10033.55, 100 | 9.95E-02, 1.00E-02, 27487.95, 99 | 0.00E+00, 0.00E+00, 11148.39, 100
Inverted Cosine wave function | 2.25E-06, 7.38E-06, 53613.46, 100 | 2.02E-06, 8.13E-06, 90650.07, 100 | 1.66E-06, 7.92E-06, 59570.51, 100
Neumaier 3 Problem (NF3) | 7.58E-01, 3.96E-01, 86206.15, 62 | 5.79E-02, 9.84E-02, 42451.28, 98 | 5.44E-03, 9.80E-02, 79455.2, 100
Rotated hyper-ellipsoid | 1.64E-06, 8.23E-06, 21396.47, 100 | 4.29E-07, 9.57E-06, 63116.97, 100 | 8.91E-07, 8.87E-06, 23773.86, 100
Levy Montalvo 1 | 1.67E-06, 8.10E-06, 13486.18, 100 | 5.83E-07, 9.44E-06, 27431.67, 100 | 8.69E-07, 8.95E-06, 14984.64, 100
Levy Montalvo 2 | 1.72E-06, 7.95E-06, 13605.57, 100 | 6.55E-07, 9.43E-06, 35504.89, 100 | 9.58E-07, 8.89E-06, 15117.3, 100
Ellipsoidal | 1.69E-06, 8.42E-06, 17563.39, 100 | 8.64E-07, 9.53E-06, 65693.17, 100 | 7.78E-07, 8.99E-06, 19514.88, 100
Beale function | 2.89E-06, 5.74E-06, 2898.423, 100 | 2.77E-06, 5.28E-06, 4414.41, 100 | 3.03E-06, 5.01E-06, 3220.47, 100
Colville function | 2.00E-03, 8.17E-03, 17195.96, 100 | 2.19E-03, 7.31E-03, 19106.62, 100 | 2.31E-03, 7.34E-03, 21878.6, 100
Branin function | 6.88E-06, 6.22E-06, 18496.32, 88 | 7.09E-06, 6.08E-06, 31362.01, 86 | 6.34E-06, 5.37E-06, 20551.47, 91
Kowalik function | 1.45E-04, 1.16E-04, 75072.94, 95 | 1.58E-05, 8.68E-05, 56696.51, 100 | 1.41E-05, 8.57E-05, 50081.04, 100
2D Tripod function | 2.32E-05, 6.23E-05, 5202.549, 100 | 2.42E-05, 6.05E-05, 5780.61, 100 | 2.55E-05, 6.46E-05, 9623.38, 100
Shifted Rosenbrock | 2.16E-01, 1.41E-01, 98742.48, 83 | 7.05E-02, 9.94E-02, 98602.75, 93 | 4.04E+00, 1.17E+00, 141195.5, 70
Shifted Sphere | 2.12E-06, 7.34E-06, 6980.094, 100 | 1.21E-06, 8.26E-06, 14371.83, 100 | 1.71E-06, 7.71E-06, 7755.66, 100
Shifted Rastrigin | 1.64E+01, 1.16E+02, 186736.1, 0 | 1.54E+01, 1.16E+02, 207583.9, 0 | 1.74E+01, 1.08E+02, 207484.5, 0
Shifted Schwefel | 5.39E+03, 2.06E+04, 186807.6, 0 | 6.57E+03, 2.21E+04, 207600.7, 0 | 5.56E+03, 1.94E+04, 207564, 0
Shifted Griewank | 1.59E-03, 2.77E-04, 69724.47, 96 | 3.55E-03, 1.54E-03, 103522.6, 79 | 2.92E-06, 5.53E-06, 68582.74, 100
Shifted Ackley | 1.37E-06, 8.51E-06, 10824.76, 100 | 1.01E-06, 8.99E-06, 24075.81, 100 | 1.00E-06, 8.83E-06, 12027.51, 100
Goldstein-Price | 4.08E-15, 4.95E-15, 4885.353, 100 | 4.55E-15, 5.73E-15, 8595.18, 100 | 4.52E-15, 4.67E-15, 5428.17, 100
Six-hump camel back | 1.49E-05, 1.64E-05, 94389.39, 50 | 1.49E-05, 1.66E-05, 104877.1, 50 | 1.42E-05, 1.80E-05, 117140.3, 44
Easom's function | 3.03E-14, 4.45E-14, 20198.66, 100 | 3.01E-14, 5.05E-14, 34858.85, 100 | 2.90E-14, 4.80E-14, 22442.95, 100
Dekkers and Aarts | 5.49E-03, 4.91E-01, 1407.78, 100 | 5.20E-03, 4.90E-01, 2181.96, 100 | 4.56E-03, 4.89E-01, 1564.2, 100
Hosaki Problem | 6.47E-06, 6.23E-06, 23193.59, 88 | 6.35E-06, 5.83E-06, 25821.46, 88 | 6.54E-06, 6.34E-06, 36012.41, 83
McCormick | 6.80E-06, 8.84E-05, 937.332, 100 | 6.82E-06, 8.77E-05, 1202.85, 100 | 6.38E-06, 8.80E-05, 1041.48, 100
Meyer and Roth Problem | 2.74E-06, 1.95E-03, 2437.272, 100 | 2.86E-06, 1.95E-03, 3827.12, 100 | 2.93E-06, 1.94E-03, 2708.08, 100
Shubert | 5.63E-06, 5.07E-06, 4921.884, 100 | 5.78E-06, 5.15E-06, 10377.09, 100 | 5.36E-06, 4.68E-06, 5468.76, 100
Sinusoidal | 2.53E-03, 8.29E-03, 127494.4, 93 | 4.51E-03, 8.77E-03, 147882.4, 86 | 7.35E-03, 1.14E-02, 144957.7, 72
Moved axis parallel hyper-ellipsoid | 1.32E-16, 8.38E-16, 39369.73, 100 | 1.27E-16, 9.68E-16, 154819.9, 95 | 9.10E-17, 8.90E-16, 43744.14, 100
Pressure Vessel Design problem | 1.58E-01, 1.21E-01, 14212.9, 84 | 1.13E-04, 4.58E-05, 163569.9, 61 | 5.78E-02, 1.93E-02, 202147, 13
Lennard-Jones problem | 2.55E-03, 2.33E-03, 192194, 46 | 1.31E-03, 1.55E-03, 181965.2, 48 | 1.49E-03, 1.79E-03, 180215.5, 44
Parameter Estimation for FM Sound Waves Problem | 5.89E+00, 5.76E+00, 105576.9, 83 | 4.23E+00, 2.20E+00, 161490.1, 64 | 3.68E+00, 1.51E+00, 117307.7, 74
Compression Spring problem | 1.01E-02, 6.26E-03, 26004.47, 91 | 3.11E-04, 5.26E-04, 28893.86, 100 | 2.43E-03, 1.47E-03, 87129.91, 83
6. RESULTS AND DISCUSSION
The results obtained with the basic SMO, MPU-SMO and the proposed SaSMO are reported in Table 2 in terms of success rate (SR), average number of function evaluations (AFE), mean error (ME) and standard deviation (SD).
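As an illustration, these four statistics can be computed from per-run records in a few lines. The following sketch is not the authors' code; the run data and the success tolerance are made-up values for the example only.

```python
import statistics

# Illustrative sketch: computing the Table 2 statistics (ME, SD, AFE,
# SR) from a set of independent runs. The data below is made up; the
# paper reports statistics over 100 runs per problem.

def summarize_runs(errors, evaluations, tolerance=1e-5):
    """errors: final error of each run; evaluations: function
    evaluations used by each run. A run counts as successful if its
    error is within the tolerance (an assumed threshold)."""
    me = statistics.mean(errors)              # mean error (ME)
    sd = statistics.stdev(errors)             # standard deviation (SD)
    afe = statistics.mean(evaluations)        # average function evaluations (AFE)
    sr = sum(e <= tolerance for e in errors)  # successful runs (SR)
    return me, sd, afe, sr

# Five hypothetical runs.
me, sd, afe, sr = summarize_runs(
    errors=[2.1e-6, 8.9e-6, 4.0e-6, 1.2e-6, 6.3e-6],
    evaluations=[6980, 7120, 6855, 7301, 6990])
print(me, sd, afe, sr)
```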
It can be observed from Table 2 and Figure 1 that the proposed SaSMO algorithm achieves a better success rate than the other algorithms considered, and that it reaches the optimum in fewer function evaluations. Additionally, the speed of convergence of the considered algorithms is compared with the help of AFE: a smaller AFE indicates a higher rate of convergence. Since the SMO algorithm is stochastic in nature, the measured AFEs of all algorithms are averaged over 100 runs for each problem in order to diminish the influence of this randomness. The Acceleration Rate (AR) is computed to compare rates of convergence; based on the AFEs of the two algorithms ALGO and SaSMO, it is defined as follows:
AR = AFE_ALGO / AFE_SaSMO (12)
Here ALGO ∈ {SMO, MPU-SMO}. AR > 1 indicates that SaSMO has the higher rate of convergence. The comparison of convergence rates for SaSMO-SMO and SaSMO-MPU-SMO is reported in Table 3, from which it can be observed that the SaSMO algorithm always has the higher rate of convergence.
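Equation (12) is a simple ratio of averaged AFEs; the following sketch (with made-up AFE values, not taken from Table 2) shows the computation and the AR > 1 criterion:

```python
# Acceleration Rate (AR) as defined in equation (12):
# AR = AFE_ALGO / AFE_SaSMO. AR > 1 means SaSMO needed fewer function
# evaluations on average, i.e. SaSMO converges faster than ALGO.

def acceleration_rate(afe_algo, afe_sasmo):
    return afe_algo / afe_sasmo

# Hypothetical averaged AFE values for a single test problem.
afe = {"SMO": 12000.0, "MPU-SMO": 21000.0, "SaSMO": 10500.0}
for algo in ("SMO", "MPU-SMO"):
    print(algo, acceleration_rate(afe[algo], afe["SaSMO"]))
```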
Table 3: Acceleration Rate (AR) of SaSMO compared to the basic SMO and MPU-SMO for the test problems
Test Function MPU-SMO SMO Test Function MPU-SMO SMO
Sphere 3.04 1.11 Branin's Function 1.70 1.11
De-Jong's 2.39 1.12 Kowalik function 1.26 1.11
Griewank 3.10 1.10 2D Tripod function 1.11 1.85
Rosenbrock 1.14 1.15 Shifted Rosenbrock 1.11 1.59
Rastrigin 1.13 1.11 Shifted Sphere 2.06 1.11
Alpine 2.66 1.13 Shifted Rastrigin 1.11 1.09
Michalewicz 1.48 1.12 Shifted Schwefel 1.12 1.11
Cosine Mixture 1.98 1.11 Shifted Griewank 1.68 1.11
Exponential Problem 2.82 1.09 Shifted Ackley 2.22 1.11
Brown 3 3.70 1.11 Goldstein-Price 1.76 1.11
Schwefel 2.57 1.11 Six-hump camel back 1.11 1.24
Salomon Problem 4.57 1.09 Easom's function 1.73 1.11
Axis Parallel hyper-ellipsoid 3.16 1.11 Dekkers and Aarts 1.55 1.12
Sum of different powers 1.44 1.12 Hosaki Problem 1.11 1.55
Step Function 2.74 1.11 McCormick 1.28 1.11
Inverted Cosine wave function 1.69 1.11 Meyer and Roth Problem 1.57 1.11
Neumaier 3 Problem (NF3) 1.11 2.08 Shubert 2.11 1.12
Rotated hyper-ellipsoid 2.95 1.11 Sinusoidal 1.16 1.14
Levy Montalvo 1 2.03 1.11 Moved axis parallel hyper-ellipsoid 3.93 1.11
Levy Montalvo 2 2.61 1.13 Pressure Vessel Design problem 1.11 1.37
Ellipsoidal 3.74 1.11 Lennard-Jones problem 1.12 1.12
Beale function 1.52 1.16 Parameter Estimation for FM Sound Waves Problem 1.53 1.11
Colville function 1.11 1.27 Compression Spring problem 1.11 3.35
Figure 1: Boxplot graph for Average Number of Function Evaluation using SMO, MPU-SMO and SaSMO
7. CONCLUSION
The Self-Adaptive Spider Monkey Optimization algorithm is an efficient variant of the SMO algorithm, as it requires far fewer function evaluations than the basic SMO and MPU-SMO algorithms; the comparison of AFEs is shown as a box-plot graph in Figure 1. This paper proposed a novel approach for optimization problems by modifying the basic Spider Monkey Optimization algorithm: the modified algorithm adds a probability-based position update in the global leader phase and dynamically decreases the size of the local leader's search radius as iterations advance, and is accordingly named the Self-Adaptive Spider Monkey Optimization algorithm. The proposed SaSMO algorithm easily solved the considered problems with a high success rate and a small number of function evaluations, i.e. with a higher rate of convergence.
On some problems, the convergence rate of SaSMO is up to four times that of the compared algorithms. The SaSMO algorithm is significantly better than the existing variants in terms of reliability (success rate), efficiency (average number of function evaluations) and accuracy (mean objective function value).
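The two modifications named above can be sketched schematically as follows. This is a hedged illustration, not the authors' implementation: the fitness-based selection probability follows basic SMO [15], while the linearly shrinking local-leader radius and the exact update formulas are assumptions based on the description in this paper.

```python
import random

# Schematic sketch of the SaSMO modifications (assumed forms, not the
# authors' code).

def selection_probability(fitness, max_fitness):
    # Probability-based position update: fitter spider monkeys are
    # more likely to be updated in the global leader phase (basic SMO
    # rule from Bansal et al. [15]).
    return 0.9 * fitness / max_fitness + 0.1

def global_leader_update(pos, global_leader, random_member):
    # Dimension-wise move toward the global leader plus a random
    # perturbation from another group member (basic SMO form).
    return [p + random.random() * (g - p) + random.uniform(-1, 1) * (r - p)
            for p, g, r in zip(pos, global_leader, random_member)]

def local_radius(iteration, max_iterations, r_max=1.0, r_min=0.1):
    # Assumed schedule: the local leader's search radius shrinks
    # linearly as iterations advance, so no manual tuning is needed.
    return r_max - (r_max - r_min) * iteration / max_iterations

def local_leader_update(pos, local_leader, radius):
    # Perturb each dimension within the current (shrinking) radius
    # in the direction of the local leader.
    return [p + random.uniform(-radius, radius) * (l - p)
            for p, l in zip(pos, local_leader)]
```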
REFERENCES
[1] S Kumar et al. “Fitness Based Position Update in Spider
Monkey Optimization Algorithm”, In Proceedings of
the 2015 International Conference on Soft Computing
and Software Engineering [SCSE'15], University of
California at Berkeley (UC Berkeley), USA, March 5-6,
2015. Accepted
[2] Zheng, Yu-Jun. "Water wave optimization: A new
nature-inspired metaheuristic." Computers &
Operations Research 55 (2015): 1-11.
[3] Brabazon, Anthony, Wei Cui, and Michael O’Neill.
"The raven roosting optimisation algorithm." Soft
Computing (2015): 1-21.
[4] Uymaz, Sait Ali, Gulay Tezel, and Esra Yel. "Artificial
Algae Algorithm (AAA) For Nonlinear Global
Optimization." Applied Soft Computing (2015).
[5] S Kumar et al. “Modified Position Update in Spider
Monkey Optimization Algorithm”, Int. J. of Emerging
Technologies in Computational and Appl. Sci. 7(2):
198-204, 2014.
[6] Mirjalili, Seyedali, Seyed Mohammad Mirjalili, and
Andrew Lewis. "Grey wolf optimizer." Advances in
Engineering Software 69 (2014): 46-61.
[7] Askarzadeh, Alireza. "Bird mating optimizer: an
optimization algorithm inspired by bird mating
strategies." Communications in Nonlinear Science and
Numerical Simulation 19.4 (2014): 1213-1228.
[8] S Kumar, VK. Sharma and R. Kumari, “Randomized
memetic artificial bee colony algorithm,” Int. J. of
Emerging Trends and Technology in Computer Science
3(1): pp. 52-62, 2014
[9] S Kumar, VK Sharma, and R Kumari. "An Improved
Memetic Search in Artificial Bee Colony
Algorithm." Int. J. of Computer Sci. and Information
Technology, 5(2), (2014): 1237-1247.
[10] S Kumar, VK Sharma and R Kumari, “Enhanced Local
Search in Artificial Bee Colony Algorithm”, Int. J. of
Emerging Technologies in Computational and Applied
Sciences, 7(2): pp. 177-184, 2014.
[11] Li, Xiangtao, Jie Zhang, and Minghao Yin. "Animal
migration optimization: an optimization algorithm
inspired by animal migration behavior." Neural
Computing and Applications 24.7-8 (2014): 1867-1877.
[12] S Kumar, VK Sharma and R Kumari, “Memetic Search
in Differential Evolution Algorithm”, Int. J. of
Computer Applications, 90(6):40-47, 2014. DOI:
10.5120/15582-4406
[13] S Kumar, VK Sharma and Rajani Kumari, “Improved
Onlooker Bee Phase in Artificial Bee Colony
Algorithm”, Int. J. of Computer Applications, 90(6): pp.
31-39, 2014. DOI: 10.5120/15579-4304
[14] Kumar, Sandeep; Sharma, Vivek Kumar; Kumari,
Rajani; Sharma, Vishnu Prakash; Sharma, Harish,
"Opposition based levy flight search in differential
evolution algorithm," Signal Propagation and
Computer Technology (ICSPCT), 2014 International
Conference on , pp.361,367, 12-13 July 2014. doi:
10.1109/ICSPCT.2014.6884915.
[15] JC Bansal et al. “Spider Monkey Optimization
algorithm for numerical optimization”. Memetic
Computing (2013), pp. 1-17.
[16] JC Bansal, H Sharma, KV Arya and A Nagar, “Memetic
search in artificial bee colony algorithm.” Soft
Computing (2013): 1-18.
[17] Ghorai, S., & Pal, S. K. Automatic Parameter
Calibration for Frequency Modulated Sound Synthesis
with a faster Differential Evolution. WSEAS
TRANSACTIONS on SYSTEMS. Issue 12, Volume 12,
pp. 604-613. December 2013
[18] Y Zhang and L Chen (2012), “Particle swarm optimization with dynamic local search for frequency modulation parameter identification”. Int. J. of Advancements in Computing Technology, 4(3): 189-195.
[19] A Rajasekhar, Abraham, A., & Pant, M. (2011,
October). “Levy mutated Artificial Bee Colony
algorithm for global optimization”. In Systems, Man,
and Cybernetics (SMC), 2011 IEEE Int. Conf. (pp. 655-
662). IEEE.
[20] S Das et al. "Differential evolution using a
neighborhood-based mutation operator." Evolutionary
Computation, IEEE Transactions on 13.3 (2009): 526-
553.
[21] Lai, Y., Jeng, S. K., Liu, D. T., & Liu, Y. C. (2006,
March). Automated optimization of parameters for FM
sound synthesis with genetic algorithms. In Int.
Workshop on Computer Music and Audio Technology.
[22] J Kiefer (1953) Sequential minimax search for a
maximum. In: Proc. of American Mathematical Society,
vol. 4, pp 502–506.