Research Article
Two-Dimensional IIR Filter Design Using Simulated Annealing Based Particle Swarm Optimization
Supriya Dhabal (1) and Palaniandavar Venkateswaran (2)
(1) Department of Electronics and Communication Engineering, Netaji Subhash Engineering College, Garia, Kolkata, West Bengal 700152, India
(2) Department of Electronics and Telecommunication Engineering, Jadavpur University, Kolkata, West Bengal 700032, India
Correspondence should be addressed to Supriya Dhabal; supriya.dhabal@yahoo.co.in
Received 14 May 2014; Revised 16 August 2014; Accepted 23 August 2014; Published 9 September 2014
Academic Editor: Ling Wang
Copyright © 2014 S. Dhabal and P. Venkateswaran. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
We present a novel hybrid algorithm based on particle swarm optimization (PSO) and simulated annealing (SA) for the design of two-dimensional recursive digital filters. The proposed method, known as SA-PSO, integrates the global search ability of PSO with the local search ability of SA, so that each offsets the weaknesses of the other. The Metropolis acceptance criterion is included in the basic PSO algorithm to increase the swarm's diversity by occasionally accepting weaker solutions as well. The experimental results reveal that the performance of the optimal filter designed by the proposed SA-PSO method is improved. Further, the convergence behavior and optimization accuracy of the proposed method are improved significantly, and computational time is also reduced. In addition, the proposed SA-PSO method produces the best optimal solution with lower mean and variance, which indicates that the algorithm can be used more efficiently in realizing two-dimensional digital filters.
1. Introduction
Design of two-dimensional (2D) filters has been considered extensively over the past two decades, as it plays a very significant role in the domains of biomedical image processing, satellite imaging, seismic data processing, and so forth [1]. It is well known that digital filters are typically classified into two groups: recursive or infinite impulse response (IIR) filters and nonrecursive or finite impulse response (FIR) filters. The design of IIR filters has received much more attention because IIR filters can provide better performance than FIR filters with the same number of filter coefficients. But the main problems of IIR filters are that they have a multimodal error surface and that they may be unstable in some cases. To overcome the problem of a multimodal error surface, a global optimization method can be adopted more efficiently. The stability problem can be tackled by limiting the problem space with appropriate constraints from the beginning of the optimization routine [1, 2]. Similar to 1D filters, 2D IIR filters can meet the same desired specifications with fewer coefficients than an equivalent 2D FIR filter requires. Design methodologies for 2D filters are grouped in two ways: McClellan-transformation-based design of 2D filters from a 1D prototype, and design based on appropriate optimization techniques [1–6]. In optimization-based methods the design problem is formulated as a constrained minimization problem and solved by various global optimization techniques. Previously reported work on this problem has applied different optimization techniques, such as neural networks (NN) [1], genetic algorithms (GA) [2], the computer language GENETICA [3], a Taguchi-based immune algorithm [4], the Bees algorithm [5], and particle swarm optimization (PSO) [6]. Most of these algorithms exhibit slow convergence to a good near-optimum solution and are easily trapped in local optima. These shortcomings can be avoided by combining SA with PSO, because the former has strong local exploration capabilities while PSO exhibits fast global searching abilities.
Hindawi Publishing Corporation, Journal of Optimization, Volume 2014, Article ID 239721, 10 pages, http://dx.doi.org/10.1155/2014/239721

Earlier reported works demonstrated that, by combining SA with PSO, significant improvements are achieved for
different engineering problems, such as partner selection in virtual enterprises [7], the job shop scheduling problem [8], a holon task allocation scheme [9], energy consumption reduction in embedded systems [10], and numerical verification on several benchmark functions [11]. Zhao et al. [7, 9] and Jamili et al. [8] combined SA with PSO for solving different practical problems, where SA starts with the global best solution generated by PSO. These hybrid algorithms [7–9] can easily be converted to basic PSO by ignoring the SA subroutine, and likewise can be used as conventional SA by assuming a population size of one particle. It is demonstrated in [7–9] that the hybrid method outperforms conventional GA, SA, and PSO, not only by providing quality solutions but also by exhibiting greater convergence speed. Idoumghar et al. [10] and Shieh et al. [11] applied this hybrid algorithm to several benchmark functions and to reducing energy consumption in embedded system memories. It is shown in [10] that the hybrid SA-PSO outperforms most recently introduced PSO-based methods, such as QIPSO (quadratic interpolation based PSO) and GMPSO (Gaussian mutation based PSO), in terms of accuracy, robustness, and convergence speed. Motivated by the improved features of hybrid SA-PSO, the design problem of the 2D IIR filter is implemented here more efficiently. Simulation results using this approach ensure better quality solutions with less computational time. The novelty of the proposed work is as follows: (i) a new hybrid algorithm for designing 2D recursive filters; (ii) the effective application of the Metropolis criterion to escape from local optima; and (iii) adaptive simulated annealing (ASA) for better control of convergence speed and accuracy.
The rest of the paper is organized as follows. The design problem of 2D IIR filters and the stability condition are presented in Section 2. Section 3 introduces the preliminary concepts of PSO. The proposed design method using hybrid SA-PSO is described in Section 4. Simulation results, including comparisons with different design examples, are shown in Section 5. Finally, conclusions and future work are given in Section 6.
2. Problem Formulation of the 2D IIR Filter
Here we consider the design problem of an Nth-order 2D IIR filter with the transfer function given in [6]. To compare the performance of the proposed SA-PSO method, the same design specifications for the desired amplitude response have been used as in [1–6]:
M_d(ω1, ω2) =
  1,   if √(ω1² + ω2²) ≤ 0.08π,
  0.5, if 0.08π ≤ √(ω1² + ω2²) ≤ 0.12π,
  0,   otherwise.   (8)
Consequently, the desired amplitude response of the 2D filter satisfying (8) is depicted in Figure 1.
Figure 1: Desired amplitude response |M_d(ω1, ω2)| of the 2D filter.
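The piecewise specification in (8) is straightforward to evaluate numerically. A minimal sketch (the function name, grid size, and plotting range are our own illustration, not from the paper):

```python
import numpy as np

def desired_response(w1, w2):
    """Desired amplitude M_d(w1, w2) from (8): unit passband up to
    0.08*pi, a 0.5 transition ring up to 0.12*pi, stopband elsewhere."""
    r = np.sqrt(w1**2 + w2**2)
    return np.where(r <= 0.08 * np.pi, 1.0,
                    np.where(r <= 0.12 * np.pi, 0.5, 0.0))

# Sample on a grid like the one shown in Figure 1 (grid density assumed)
w = np.linspace(-np.pi, np.pi, 51)
W1, W2 = np.meshgrid(w, w)
Md = desired_response(W1, W2)
```

Such a sampled grid is also what an error function comparing designed and desired responses would be evaluated over.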
3. An Overview of Particle Swarm Optimization
Particle swarm optimization (PSO) is a population-based evolutionary algorithm that mimics biological mechanisms such as bird flocking and fish schooling [12]. During the evolutionary process the swarm particles are attracted towards the location of the best fitness found by the particle itself (local best) and the location of the best fitness achieved by the whole population (global best). The position and velocity of the ith particle are represented as X_i = (x_i1, x_i2, ..., x_iD) and V_i = (v_i1, v_i2, ..., v_iD). Particle velocities V_i on each dimension are limited to (V_min, V_max), which controls the local and global exploration capability of a particle. The best position of each particle is denoted pbest_i = (p_i1, p_i2, ..., p_iD), and the fittest particle in the group is pbest_g = (p_g1, p_g2, ..., p_gD). The new velocities and positions for the next evaluations are then calculated by [12–14]:
V_iD = ω·V_iD + c1·rand1()·(pbest_iD − X_iD) + c2·rand2()·(pbest_gD − X_iD)   (9)

X_iD = X_iD + V_iD   (10)
where c1 and c2 are the cognitive and social acceleration control parameters, and rand1() and rand2() are two random numbers in [0, 1]. One important variant of standard PSO is the constriction-factor-based PSO (CPSO) proposed by Clerc and Kennedy [14]. CPSO can generate higher-quality solutions than the standard PSO with inertia weight, and it guarantees convergence of the search procedure. In CPSO the particle velocity is updated by the following equation:
V_iD = χ·(ω·V_iD + c1·rand1()·(pbest_iD − X_iD) + c2·rand2()·(pbest_gD − X_iD))   (11)
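Updates (9)–(11) translate directly into code. The sketch below is illustrative (function name, velocity limits, and default parameters are ours); with chi = 1 it applies (9)–(10), and a chi from the constriction formula gives the CPSO variant (11):

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_step(X, V, pbest, gbest, w=0.9, c1=2.05, c2=2.05, chi=1.0,
             vmin=-1.0, vmax=1.0):
    """One velocity/position update per (9)-(10); chi != 1 applies the
    constriction-factor form (11). Rows of X are particles."""
    r1 = rng.random(X.shape)   # rand1() drawn per dimension
    r2 = rng.random(X.shape)   # rand2() drawn per dimension
    V = chi * (w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X))
    V = np.clip(V, vmin, vmax)             # limit velocities to (Vmin, Vmax)
    return X + V, V                        # position update (10)
```

The velocity clamp implements the (V_min, V_max) limit described above; the chosen bounds here are placeholders, not values from the paper.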
Table 1: Different inertia weight strategies.

Linear decreasing inertia weight (PSO-TVIW): ω = (ωmax − ωmin)·(itermax − iter)/itermax + ωmin, with ωmax = 0.9, ωmin = 0.4.
Random inertia weight (PSO-RANDIW): ω = 0.5 + rand()/2, with rand() ∈ [0, 1].
Chaotic inertia weight (PSO-CIW): z = 4·z·(1 − z), ω = (ωmax − ωmin)·(itermax − iter)/itermax + ωmin·z, with ωmax = 0.9, ωmin = 0.4, z ∈ [0, 1].
Table 2: Parameter settings for GA and the proposed SA-PSO.

GA: population size 100; selection: roulette wheel, probability 13; crossover: two-point, rate 0.8; mutation: Gaussian, rate 0.01.
SA-PSO: population size 100; inertia weight (ω): 0.9 → 0.4; c1, c2: 2.05; initial temperature T0 = 50; minimum temperature Tmin = 10^−10; cooling schedule: C = 0.01, Q = 0.01.
where χ is the constriction factor, given by χ = 2/|2 − φ − √(φ² − 4φ)| with φ = c1 + c2 > 4.
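With the Table 2 settings c1 = c2 = 2.05 (so φ = 4.1 > 4), this formula gives χ ≈ 0.7298, the value commonly quoted for CPSO; a quick check:

```python
import math

def constriction(c1, c2):
    """Clerc-Kennedy constriction factor; requires phi = c1 + c2 > 4."""
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - math.sqrt(phi**2 - 4.0 * phi))

print(round(constriction(2.05, 2.05), 4))  # -> 0.7298
```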
It is well known that in the PSO process the inertia weight (ω) plays an important role in balancing the exploration-exploitation trade-off. In [15, 16] it has been demonstrated that chaotic-inertia-weight PSO (PSO-CIW) [17] is the best strategy for accuracy, whereas PSO with random inertia weight (PSO-RANDIW) produces optimal solutions with better efficiency. Therefore these two best-performing inertia weight strategies, together with the time-varying inertia weight (PSO-TVIW), are considered here to compare the performance of the proposed SA-PSO. A summary of the different inertia weight strategies reported earlier is given in Table 1, along with the required constraints.
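The three schedules of Table 1 can be sketched as follows (function names are ours; z is the chaotic state of the logistic map):

```python
import random

def w_tviw(it, it_max, w_max=0.9, w_min=0.4):
    # PSO-TVIW: linear decrease from w_max to w_min over it_max iterations
    return (w_max - w_min) * (it_max - it) / it_max + w_min

def w_randiw():
    # PSO-RANDIW: random inertia weight in [0.5, 1.0)
    return 0.5 + random.random() / 2.0

def w_ciw(it, it_max, z, w_max=0.9, w_min=0.4):
    # PSO-CIW: logistic map z = 4z(1-z) drives the chaotic term;
    # returns the new weight together with the updated chaotic state
    z = 4.0 * z * (1.0 - z)
    return (w_max - w_min) * (it_max - it) / it_max + w_min * z, z
```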
4. The Proposed SA-PSO Algorithm
The main benefit of PSO is that it is a problem-independent stochastic search optimization method. Due to its stochastic nature, it reveals insufficient global searching ability at the end of a run, particularly on multimodal functions. Hence, to escape from local minima and to increase the diversity of particles, SA is integrated with PSO. Like PSO, SA is a heuristic random-search global optimization method, proposed by Kirkpatrick et al. [18]. During the search process, SA accepts not only better candidate solutions but also, to a certain degree, worse solutions according to the Metropolis criterion [19], which can be mathematically represented by
a(x) =
  1,            if ΔE ≤ 0,
  exp(−ΔE/T),   if ΔE > 0,   (12)
where ΔE denotes the change in energy due to perturbation of the parameters and T is the current temperature of the system. To apply (12), a random number ρ ∈ [0, 1] is generated and it is checked whether ρ ≤ a(x) or not. A cooling schedule based on adaptive simulated annealing (ASA) is introduced here to update the temperature of the system [20]. The annealing schedule for the kth iteration is written as follows:
where T(0) is the starting temperature, Q represents the quenching factor, and D is the dimension of the search space. Finally, in our proposed method a well-constructed stopping condition is included to minimize execution time and computational effort. The proposed algorithm terminates when either the minimum value of the objective function, the final temperature, or the maximum number of iterations is reached [21].
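Criterion (12) and an ASA-style cooling schedule can be sketched as below. Since equation (13) is not reproduced in this excerpt, the schedule shown uses Ingber's exponential form T(k) = T0·exp(−C·k^(Q/D)); wiring C, Q, and D together this way is our assumption, matched to the parameters named in the text and in Table 2:

```python
import math
import random

def metropolis_accept(delta_e, T):
    """Acceptance test per (12): always keep improvements, keep a worse
    solution with probability exp(-delta_e / T)."""
    if delta_e <= 0:
        return True
    return random.random() <= math.exp(-delta_e / T)

def asa_temperature(k, T0=50.0, C=0.01, Q=0.01, D=15):
    """Assumed ASA cooling: T(k) = T0 * exp(-C * k**(Q/D)).
    T0, C, Q follow Table 2; D = 15 is the particle dimension."""
    return T0 * math.exp(-C * k ** (Q / D))
```

At high T the exponential is close to 1, so many weaker solutions are accepted early in the search; as T falls towards Tmin the behavior approaches greedy acceptance.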
The design of the 2D recursive filter starts by assuming a population of random solutions in a multidimensional search space. Here each particle consists of 15 positional coordinates, represented by
X = [a01, a02, a10, a11, a12, a20, a21, a22, b1, b2, c1, c2, d1, d2, H0]   (14)
The 2D filter coefficients are therefore represented as a vector X whose entries are selected in the interval [−3, 3]. Each entry of X is optimized based on the flowchart presented in Figure 2, and the proposed design flow is given below.
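Given a coefficient vector X of the form (14), the amplitude response can be evaluated on a frequency grid. The transfer-function form used below (numerator Σ a_pq·z1^p·z2^q with a00 fixed at 1, denominator a product of two bilinear factors with coefficients b_k, c_k, d_k, overall gain H0, evaluated at z_i = exp(−jω_i)) follows the usual formulation of [1, 2]; since the equation itself is not reproduced in this excerpt, treat it as our reading rather than the paper's exact expression:

```python
import numpy as np

def magnitude_response(coeffs, w1, w2):
    """|M(w1, w2)| for the 15-coefficient vector of (14); the
    transfer-function form is assumed from [1, 2]."""
    (a01, a02, a10, a11, a12, a20, a21, a22,
     b1, b2, c1, c2, d1, d2, H0) = coeffs
    z1 = np.exp(-1j * w1)
    z2 = np.exp(-1j * w2)
    a = np.array([[1.0, a01, a02],   # a00 is fixed at 1
                  [a10, a11, a12],
                  [a20, a21, a22]])
    num = sum(a[p, q] * z1**p * z2**q for p in range(3) for q in range(3))
    den = ((1 + b1 * z1 + c1 * z2 + d1 * z1 * z2) *
           (1 + b2 * z1 + c2 * z2 + d2 * z1 * z2))
    return np.abs(H0 * num / den)
```

With all coefficients zero except H0, the response reduces to |H0| at every frequency, which is a convenient sanity check.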
Figure 2: Flowchart of the proposed algorithm (parameter settings; particle initialization; fitness evaluation and selection of the initial best; velocity and position updates; Metropolis-based acceptance of new particles into the next generation; temperature reduction via the cooling schedule; stopping test; and design of the 2D filter using the best particle).
Table 3: Best results of 2D filter coefficients obtained by different approaches for ρ = 2.
NN [1] | GA [2] | GENETICA [3] | TBIA [4] | QPSO-DGM [6] | ASA | PSO-TVIW
Step 1. Specify the desired filter specifications M_d as given in (8). Assume the initial temperature (T0) and the minimum temperature level (Tmin); initialize ρ = 2, 4, or 8 and N1 = N2 = 50; and use (5) to calculate c_pq and s_pq for p, q = 0, 1, 2 before optimization starts.

Step 2. Initialize the swarm size, the minimum value of the objective function (min E), a random population of coefficient vectors (x_i), and the maximum number of evaluations (k_max > 0).

Step 3. Set the iteration number k = 1, assign the best solution x_g = x, and use (2) to compute the fitness of the best solution pbest.

Step 4 (calculate fitness). Compute fitness values for each particle in the swarm.

Step 5 (update pbest). The particle best position pbest is updated based on the Metropolis criterion (12) as follows: if the current fitness of the selected particle < fitness of pbest, then the current position of the particle is accepted as the new pbest with probability 1; otherwise the current particle will be accepted according to ρ ≤ a, where ρ ∈ [0, 1] and a = exp(−(φ(p_current) − φ(pbest))/T).

Step 6 (update gbest). Now assign the best particle's pbest value to gbest, and calculate the new velocity and position by (9) and (10) for all particles.

Step 7 (reduce temperature). Calculate the new temperature T specified by the cooling schedule (13). If T ≤ Tmin, then terminate the search for the best solution; otherwise go to Step 8.

Step 8 (update velocities and positions). Calculate velocities V_i using (9) and positions for each particle by (10).

Step 9. Update k = k + 1 and check whether k ≥ k_max or the minimum value of the cost function has been achieved by any particle. If yes, then assign x = gbest; otherwise go back to Step 4.

Step 10. Design the 2D filter based on the coefficients of x = gbest and compute the magnitude response.
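Steps 1–10 can be condensed into a single driver loop. The sketch below is ours: the objective phi stands in for the error function (2), which is not reproduced in this excerpt, and the cooling schedule is an assumed ASA exponential form built from the C, Q, and D parameters named in the text:

```python
import math
import random

def sa_pso(phi, dim=15, swarm=100, k_max=400, T0=50.0, T_min=1e-10,
           C=0.01, Q=0.01, w_max=0.9, w_min=0.4, c1=2.05, c2=2.05,
           lo=-3.0, hi=3.0):
    """Hedged sketch of the SA-PSO steps; phi maps a coefficient
    vector to a scalar error to be minimized."""
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    pbest = [x[:] for x in X]
    pcost = [phi(x) for x in X]
    g = min(range(swarm), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    T = T0
    for k in range(1, k_max + 1):
        w = (w_max - w_min) * (k_max - k) / k_max + w_min   # TVIW schedule
        for i in range(swarm):
            cost = phi(X[i])
            dE = cost - pcost[i]
            # Metropolis update of pbest (Step 5): always keep improvements,
            # sometimes keep a weaker solution to preserve diversity
            if dE <= 0 or random.random() <= math.exp(-dE / T):
                pbest[i], pcost[i] = X[i][:], cost
                if cost < gcost:
                    gbest, gcost = X[i][:], cost
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (pbest[i][d] - X[i][d])
                           + c2 * random.random() * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))  # keep in [-3, 3]
        T = T0 * math.exp(-C * k ** (Q / dim))   # assumed ASA cooling
        if T <= T_min:
            break
    return gbest, gcost
```

Calling this with the 15-dimensional filter error of (2) in place of phi reproduces the overall flow of Figure 2; any standard test function (e.g. a sphere) exercises the loop.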
5. Simulation Results and Discussion
In order to validate the performance of the proposed SA-PSO, we implemented the algorithm using MATLAB 2009 on a Genuine Intel(R) Core 2 Duo CPU E7300 at 2.66 GHz with 2 GB RAM. All the associated algorithms are executed for 20 independent runs with 40,000 function evaluations (FE). Table 2 shows the choice of the several required parameters for the GA- and SA-PSO-based methods. For the proposed SA-PSO-based methods the initial population size is chosen as 100; it is observed that with a further increase in population size the solution quality remains unchanged. In SA-PSO-TVIW the inertia weight is decreased from 0.9 to 0.4 to obtain the best possible solutions, whereas in SA-PSO-RANDIW it is varied randomly between 0.5 and 1. The starting temperature for annealing is chosen to be very high so that the algorithm can escape from local optima easily. The parameters of the cooling schedule are chosen to be very small to assure global convergence; otherwise the search can skip the true global solutions [20, 21].
Table 3 shows an analysis of the experimental results of the proposed method against NN (2001), GA (2003), GENETICA (2006), TBIA (2008), QPSO-DGM (2010), ASA, and PSO-TVIW with respect to the best known solutions. The best results among the different methods are highlighted in bold,
Figure 3: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 2: (a) SA-PSO-TVIW, (b) SA-PSO-RANDIW, (c) NN, (d) GA, (e) GENETICA, and (f) TBIA.
and it can be easily verified that the experimental results of the proposed methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, are better than those of all other reported methods [1–6]; for example, the proposed SA-PSO-TVIW method improves φ2 relative to each of the earlier approaches.
Further, in Table 4, for different values of ρ, the best possible solutions (best) and the worst solutions (worst) are listed for the different methods, which clearly reveals that the
Figure 4: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 4: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.
Table 4: Comparative results of different algorithms in terms of best and worst case solutions.
best/worst solutions obtained by the proposed SA-PSO-based algorithm are the best over all runs. Furthermore, Table 5 presents statistics for the average (mean) and variance (VAR) of the optimal solutions for different values of ρ. From Table 5 it can be observed that the proposed SA-PSO algorithm exhibits better performance with less computational time (i.e., less than half the time required by GA or ASA) and produces optimal solutions with very low variance. Hence it can be implemented more efficiently in real-time design of 2D filters.
Figures 3(a)–3(f) show the calculated amplitude response of the 2D filter for ρ = 2. The best results obtained by SA-PSO-TVIW and SA-PSO-RANDIW are shown in Figures 3(a) and 3(b), respectively. For comparison, the results obtained by NN, GA, GENETICA, and TBIA are also included in Figures 3(c) to 3(f). A closer look at these figures reveals that the SA-PSO-based methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, yield a better approximation to the desired response, and the stop-band ripple is much lower than with the other competitive methods [1–6]. Figures 4(a) and 4(b) show the magnitude response obtained using the SA-PSO-based methods for ρ = 4, whereas Figures 5(a) and 5(b) present the magnitude response for ρ = 8.
Figures 6(a) and 6(b) show the best convergence profiles of the SA-PSO-based methods for ρ = 2 and 4, respectively. Among the 20 independent executions of the four algorithms (the existing GA and PSO-RANDIW, and the proposed SA-PSO-TVIW and SA-PSO-RANDIW), the best-performing runs are plotted. Initially, for a few thousand FEs, PSO-RANDIW converges faster than GA, SA-PSO-RANDIW, and SA-PSO-TVIW. After a certain number of iterations, PSO-RANDIW and GA both exhibit premature convergence and settle to near-optimal solutions. After 30,000 FEs the SA-PSO-based methods fall below the curves of both GA and PSO, because SA helps particles jump out of local optima from the beginning of the search. Consequently, at the end of the search procedure the proposed hybrid method provides the best candidate solution. Hence the proposed hybrid algorithm produces better optimal solutions during the search process for 2D recursive filters.
6. Conclusions
In this paper a novel hybrid evolutionary algorithm based on PSO and SA is proposed for finding the global best solution for second-order two-dimensional recursive digital filters. The proposed hybrid method integrates the global search capability of PSO with SA to escape from local minima. The experimental results are compared with earlier reported algorithms, namely NN, GA, GENETICA, and TBIA, in addition to different variants of PSO. The comparison results indicate that the SA-PSO-based methods exhibit better performance in all experiments and provide the best optimum solution during the search. The proposed method also produces the best solution with lower mean and variance. Hence the proposed approach could be an alternative for implementing two-dimensional digital filters in real-time applications.
Table 5: Comparative results of different algorithms in terms of mean and VAR.

Algorithm        | ρ = 2: Mean, VAR   | ρ = 4: Mean, VAR     | ρ = 8: Mean, VAR      | Time (s)
GA               | 4.6137, 0.6160     | 0.4300, 0.0688       | 0.0066, 7.08E−06      | 89.80
ASA              | 4.2475, 0.0288     | 0.2679, 3.14E−04     | 0.0043, 3.14E−06      | 62.04
PSO-TVIW         | 4.4858, 0.2347     | 0.3044, 0.0086       | 0.0039, 1.60E−06      | 46.84
PSO-CIW          | 4.2127, 0.2778     | 0.2847, 0.0022       | 0.0035, 1.42E−06      | 39.02
PSO-RANDIW       | 4.066, 0.3453      | 0.2549, 0.0013       | 0.0037, 5.50E−07      | 40.54
SA-PSO (TVIW)    | 3.8762, 0.5053     | 0.3123, 0.0029       | 0.0051, 3.61E−06      | 35.68
SA-PSO (RANDIW)  | 2.9059, 0.0094     | 0.1845, 1.58E−04     | 0.0023, 7.28E−07      | 30.07
Figure 5: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 8: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.
Figure 6: Best fitness profile (fitness on a log scale versus number of iterations, ×10^4) for GA, PSO-RANDIW, SA-PSO-TVIW, and SA-PSO-RANDIW: (a) ρ = 2 and (b) ρ = 4.
Emphasis on higher-order two-dimensional recursive filter design with guaranteed stability, and on further reduction of computational complexity, is left as future work.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
References
[1] V. M. Mladenov and N. E. Mastorakis, "Design of two-dimensional recursive filters by using neural networks," IEEE Transactions on Neural Networks, vol. 12, no. 3, pp. 585–590, 2001.
[2] N. Mastorakis, I. F. Gonos, and M. N. S. Swamy, "Design of two-dimensional recursive filters using genetic algorithms," IEEE Transactions on Circuits and Systems, vol. 50, no. 5, pp. 634–639, 2003.
[3] I. F. Gonos, L. I. Virirakis, N. E. Mastorakis, and M. N. S. Swamy, "Evolutionary design of 2-dimensional recursive filters via the computer language GENETICA," IEEE Transactions on Circuits and Systems II, vol. 53, no. 4, pp. 254–258, 2006.
[4] J.-T. Tsai, W.-H. Ho, and J.-H. Chou, "Design of two-dimensional recursive filters by using Taguchi-based immune algorithm," IET Signal Processing, vol. 2, no. 2, pp. 110–117, 2008.
[5] D. T. Pham and E. Koc, "Design of a two-dimensional recursive filter using the bees algorithm," International Journal of Automation and Computing, vol. 7, no. 3, pp. 399–402, 2010.
[6] J. Sun, W. Fang, and W. Xu, "A quantum-behaved particle swarm optimization with diversity-guided mutation for the design of two-dimensional IIR digital filters," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 57, no. 2, pp. 141–145, 2010.
[7] F. Zhao, Q. Zhang, D. Yu, X. Chen, and Y. Yang, "A hybrid algorithm based on PSO and simulated annealing and its applications for partner selection in virtual enterprise," in Proceedings of the International Conference on Advances in Intelligent Computing, pp. 380–389, Springer, 2005.
[8] A. Jamili, M. A. Shafia, and R. Tavakkoli-Moghaddam, "A hybrid algorithm based on particle swarm optimization and simulated annealing for a periodic job shop scheduling problem," International Journal of Advanced Manufacturing Technology, vol. 54, no. 1–4, pp. 309–322, 2011.
[9] F. Zhao, Y. Hong, D. Yu, Y. Yang, Q. Zhang, and H. Yi, "A hybrid algorithm based on particle swarm optimization and simulated annealing to holon task allocation for holonic manufacturing system," International Journal of Advanced Manufacturing Technology, vol. 32, no. 9-10, pp. 1021–1032, 2007.
[10] L. Idoumghar, M. Melkemi, R. Schott, and M. I. Aouad, "Hybrid PSO-SA type algorithms for multimodal function optimization and reducing energy consumption in embedded systems," Applied Computational Intelligence and Soft Computing, vol. 2011, Article ID 138078, 12 pages, 2011.
[11] H.-L. Shieh, C.-C. Kuo, and C.-M. Chiang, "Modified particle swarm optimization algorithm with simulated annealing behavior and its numerical verification," Applied Mathematics and Computation, vol. 218, no. 8, pp. 4365–4383, 2011.
[12] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, 1995.
[13] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the IEEE International Congress on Evolutionary Computation, vol. 3, pp. 101–106, 1999.
[14] M. Clerc and J. Kennedy, "The particle swarm - explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[15] A. Nickabadi, M. M. Ebadzadeh, and R. Safabakhsh, "A novel particle swarm optimization algorithm with adaptive inertia weight," Applied Soft Computing Journal, vol. 11, no. 4, pp. 3658–3670, 2011.
[16] J. C. Bansal, P. K. Singh, M. Saraswat, A. Verma, S. S. Jadon, and A. Abraham, "Inertia weight strategies in particle swarm optimization," in Proceedings of the 3rd World Congress on Nature and Biologically Inspired Computing (NaBIC '11), pp. 633–640, Salamanca, Spain, October 2011.
[17] Y. Feng, G.-F. Teng, A.-X. Wang, and Y.-M. Yao, "Chaotic inertia weight in particle swarm optimization," in Proceedings of the 2nd International Conference on Innovative Computing, Information and Control (ICICIC '07), p. 475, Kumamoto, Japan, September 2007.
[18] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[19] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[20] L. Ingber, "Adaptive simulated annealing (ASA): lessons learned," Journal of Control and Cybernetics, vol. 25, no. 1, pp. 33–54, 1996.
[21] S. Dhabal and P. Venkateswaran, "An efficient nonuniform cosine modulated filter bank design using simulated annealing," Journal of Signal and Information Processing, vol. 3, no. 3, pp. 330–338, 2012.
different engineering problems like partner selection invirtual enterprise [7] job shop scheduling problem [8] holontask allocation scheme [9] energy consumption reductionin embedded systems [10] and numerical verification ofseveral benchmark functions [11] Zhao et al [7 9] andJamili et al [8] combined SA with PSO for solving differentpractical problems where SA starts with the global bestsolution generated by PSO These hybrid algorithms [7ndash9] can be easily converted to basic PSO by ignoring theSA subroutine and at the same time it can be used asa conventional SA by assuming population size to be oneparticle It is demonstrated in [7ndash9] that the hybrid methodoutperforms conventional GA SA and PSO not only byproviding quality solution but also by exhibiting greaterconvergence speed Idoumghar et al [10] and Shieh et al [11]applied this hybrid algorithm for solving several benchmarkfunctions and also for reducing energy consumption inembedded system memories It is proved that [10] the hybridSA-PSO outperforms most of the recently introduced PSObased methods like QIPSO (quadratic interpolation basedPSO) GMPSO (Gaussianmutation based PSO) and so forthin terms of accuracy robustness and convergence speedMotivated by the improved features of hybrid SA-PSO herethe design problem of 2D IIR filter is implemented moreefficiently Simulation results using this approach ensurebetter quality solutions with less computational time Thenovelty of the proposed work is as follows (i) new hybridalgorithm for designing 2D recursive filters (ii) the effectiveapplication of Metropolis criterion to escape from localoptimum and (iii) adaptive simulated annealing (ASA) forbetter control of convergence speed and accuracy
The rest of the paper is organized as follows The designproblem of 2D IIR filters and the stability condition arepresented in Section 2 Section 3 introduces the preliminaryconcept of PSO The proposed design method using hybridSA-PSO is described in Section 4 The simulation resultsindicating the comparisons with different design examplesare shown in Section 5 Finally conclusions and future workare given in Section 6
2 Problem Formulation of 2D IIR Filter
Here we consider the design problem of 119873th order 2D IIRfilter with transfer function [6]
To compare the performance of proposed SA-PSO methodthe similar design specifications have been used for desiredamplitude response [1ndash6]
119872119889(1205961 1205962) =
1 if radic1205962
1+ 1205962
2le 008
05 if 008120587 le radic1205962
1+ 1205962
2le 012120587
0 otherwise
(8)
Consequently the desired amplitude response of 2D filtersatisfying (8) is depicted in Figure 1
020
2
0
02
04
06
08
1
minus2minus2
1205961
1205962
Abs(M
)
Figure 1 Desired amplitude response |119872119889(1205961 1205962)| of 2D filter
3 An Overview of ParticleSwarm Optimization
Particle swarm optimization (PSO) is a population-based evolutionary algorithm that mimics biological mechanisms such as bird flocking or fish schooling [12]. During the evolutionary process, the swarm particles are attracted towards the location of the best fitness found by each particle itself (local best) and the location of the best fitness achieved by the whole population (global best). The position and velocity of the i-th particle are represented as X_i = (x_i1, x_i2, ..., x_iD) and V_i = (v_i1, v_i2, ..., v_iD). Particle velocities V_i on each dimension are limited to (V_min, V_max), which controls the local and global exploration capability of a particle. The best position of each particle is denoted pbest_i = (p_i1, p_i2, ..., p_iD), and the fittest particle in the group is pbest_g = (p_g1, p_g2, ..., p_gD). The new velocities and positions for the next evaluations are then calculated by [12–14]:

V_iD = ω·V_iD + c₁·rand₁()·(pbest_iD − X_iD) + c₂·rand₂()·(pbest_gD − X_iD)   (9)

X_iD = X_iD + V_iD   (10)

where c₁ and c₂ are the cognitive and social acceleration parameters, and rand₁() and rand₂() are two random numbers in [0, 1]. One important variant of standard PSO is the constriction-factor-based PSO (CPSO) proposed by Clerc and Kennedy [14]. CPSO can generate higher-quality solutions than the standard PSO with inertia weight, and it guarantees the convergence of the search procedure. In CPSO the particle velocity is updated by the following equation:

V_iD = χ·(ω·V_iD + c₁·rand₁()·(pbest_iD − X_iD) + c₂·rand₂()·(pbest_gD − X_iD))   (11)
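The updates (9)–(11) can be sketched in a few lines of Python (an illustrative fragment, not the authors' implementation; setting chi = 1 recovers the inertia-weight update (9), and chi < 1 the CPSO update (11); all names are our own):

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_velocity_update(V, X, pbest, gbest, w=0.9, c1=2.05, c2=2.05, chi=1.0):
    """One velocity step: chi = 1.0 gives the inertia-weight update (9);
    chi < 1 (e.g. 0.7298) gives the constriction-factor update (11)."""
    r1 = rng.random(X.shape)   # rand1() in [0, 1), drawn per dimension
    r2 = rng.random(X.shape)   # rand2() in [0, 1)
    return chi * (w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X))

# One iteration for a toy swarm of 5 particles in D = 15 dimensions
X = rng.uniform(-3.0, 3.0, (5, 15))   # coefficients restricted to [-3, 3]
V = np.zeros((5, 15))
pbest = X.copy()                      # personal bests start at the positions
gbest = X[0]                          # pretend particle 0 is the global best
V = np.clip(pso_velocity_update(V, X, pbest, gbest), -1.0, 1.0)  # (Vmin, Vmax)
X = X + V                             # position update, eq. (10)
```

Clipping the velocity models the (V_min, V_max) limit mentioned above.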
4 Journal of Optimization
Table 1: Different inertia weight strategies.

Linear decreasing inertia weight (PSO-TVIW): ω = (ω_max − ω_min)·(iter_max − iter)/iter_max + ω_min, with ω_max = 0.9, ω_min = 0.4.
Random inertia weight (PSO-RANDIW): ω = 0.5 + rand()/2, with rand() ∈ [0, 1].
Chaotic inertia weight (PSO-CIW): z = 4·z·(1 − z); ω = (ω_max − ω_min)·(iter_max − iter)/iter_max + ω_min·z, with ω_max = 0.9, ω_min = 0.4, z ∈ [0, 1].
Table 2: Parameter settings for GA and the proposed SA-PSO.

GA: population size 100; selection: roulette wheel (probability 1/3); crossover: two-point (rate 0.8); mutation: Gaussian (rate 0.01).
SA-PSO: population size 100; inertia weight ω: 0.9 → 0.4; c₁ = c₂ = 2.05; initial temperature T₀ = 50; minimum temperature T_min = 10⁻¹⁰; cooling schedule C = 0.01, Q = 0.01.
where χ is the constriction factor, given by χ = 2/|2 − φ − √(φ² − 4φ)| with φ = c₁ + c₂ > 4.
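As a quick check (our own snippet, not from the paper), the constriction factor for the Table 2 setting c₁ = c₂ = 2.05 works out to the widely used value χ ≈ 0.7298:

```python
from math import sqrt

def constriction_factor(c1, c2):
    """chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, valid for phi = c1 + c2 > 4."""
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - sqrt(phi * phi - 4.0 * phi))

chi = constriction_factor(2.05, 2.05)  # ~0.7298 for phi = 4.1
```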
It is well known that in the PSO process the inertia weight (ω) plays an important role in balancing the exploration-exploitation trade-off. In [15, 16] it has been demonstrated that chaotic-inertia-weight PSO (PSO-CIW) [17] is the best strategy for accuracy, whereas PSO with random inertia weight (PSO-RANDIW) produces optimal solutions with better efficiency. Therefore these two best-performing inertia weight strategies, together with the time-varying inertia weight (PSO-TVIW), are considered here to compare the performance of the proposed SA-PSO. Table 1 summarizes the inertia weight strategies reported earlier, along with the required constraints.
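The three strategies of Table 1 can be written as small helpers (illustrative Python, our own names; ω_max = 0.9 and ω_min = 0.4 as in the table, and z must be seeded in (0, 1)):

```python
import random

W_MAX, W_MIN = 0.9, 0.4

def w_tviw(it, it_max):
    """Linear decreasing inertia weight (PSO-TVIW): 0.9 -> 0.4."""
    return (W_MAX - W_MIN) * (it_max - it) / it_max + W_MIN

def w_randiw():
    """Random inertia weight (PSO-RANDIW): uniform in [0.5, 1.0)."""
    return 0.5 + random.random() / 2.0

def w_ciw(it, it_max, z):
    """Chaotic inertia weight (PSO-CIW): the logistic map z = 4z(1-z)
    modulates the w_min term; returns the new (w, z) pair."""
    z = 4.0 * z * (1.0 - z)
    return (W_MAX - W_MIN) * (it_max - it) / it_max + W_MIN * z, z
```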
4. The Proposed SA-PSO Algorithm
The main benefit of PSO is that it is a problem-independent stochastic search optimization method. Due to its stochastic nature, however, it shows insufficient global searching ability at the end of a run, particularly on multimodal functions. Hence, to escape from local minima and increase the diversity of particles, SA is integrated with PSO. Like PSO, SA is a heuristic random-search global optimization method, proposed by Kirkpatrick et al. [18]. During the search process, SA accepts not only better candidate solutions but also, to a certain degree, worse solutions, according to the Metropolis criterion [19], which can be represented mathematically by

a(x) = 1,           if ΔE ≤ 0
a(x) = exp(−ΔE/T),  if ΔE > 0   (12)
where ΔE denotes the change in energy due to perturbation of the parameters and T is the current temperature of the system. To evaluate (12), a random number ρ ∈ [0, 1] is generated and checked against the condition ρ ≤ a(x). A cooling schedule based on adaptive simulated annealing (ASA) is introduced here to update the temperature of the system [20]. The annealing schedule for the k-th iteration is written as

T(k) = T(0)·exp(−C·k^(Q/D))   (13)

where T(0) is the starting temperature, C is the cooling control parameter, Q is the quenching factor, and D is the dimension of the search space. Finally, a well-constructed stopping condition is included in the proposed method to minimize execution time and computational effort: the algorithm terminates when the minimum value of the objective function, the final temperature, or the maximum number of iterations is reached [21].
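The Metropolis rule (12) and the ASA-style cooling can be sketched as follows (illustrative only; the schedule form T(k) = T0·exp(−C·k^(Q/D)) is our reading of the ASA reference [20], with C, Q, and D as defined above):

```python
import math
import random

def metropolis_accept(delta_e, T):
    """Eq. (12): improvements (dE <= 0) are always accepted; worse moves
    are accepted with probability exp(-dE/T)."""
    if delta_e <= 0:
        return True
    return random.random() <= math.exp(-delta_e / T)

def asa_temperature(T0, k, C=0.01, Q=0.01, D=15):
    """ASA-style schedule with quenching: T(k) = T0 * exp(-C * k**(Q/D))."""
    return T0 * math.exp(-C * k ** (Q / D))
```

At high T almost every move passes, which is what injects diversity into the swarm; as T falls, only improvements survive.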
The design of the 2D recursive filter starts by assuming a population of random solutions in a multidimensional search space. Here each particle consists of 15 positional coordinates, represented by

X = [a01, a02, a10, a11, a12, a20, a21, a22, b1, b2, c1, c2, d1, d2, H0]   (14)

The 2D filter coefficients are therefore represented by a vector X whose entries are selected in the interval [−3, 3]. Each entry of X is optimized based on the flowchart presented in Figure 2, and the proposed design flow is given below.
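Initializing a swarm of such particles is then a one-liner (a sketch; the helper name and seed are our own):

```python
import numpy as np

# Coefficient layout of eq. (14)
COEFF_NAMES = ["a01", "a02", "a10", "a11", "a12", "a20", "a21", "a22",
               "b1", "b2", "c1", "c2", "d1", "d2", "H0"]

def init_population(n_particles, seed=42):
    """Random swarm of coefficient vectors X drawn from [-3, 3]^15."""
    rng = np.random.default_rng(seed)
    return rng.uniform(-3.0, 3.0, size=(n_particles, len(COEFF_NAMES)))

swarm = init_population(100)   # population size used in Table 2
```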
Figure 2: Flowchart of the proposed algorithm. (Start → parameter settings → initialize particles → calculate fitness for all particles in the population and select the initial best → update velocity and position → select one particle randomly from the updated population and calculate its fitness → accept the new particle for the next generation if it is better than the old one, or, with a ∈ [0, 1], if a > e^(−ΔE/T) → reduce temperature using the cooling schedule → repeat until the stop criterion is reached → design the 2D filter using the best particle.)
Table 3: Best results of 2D filter coefficients obtained by different approaches for ρ = 2 (compared methods: NN [1], GA [2], GENETICA [3], TBIA [4], QPSO-DGM [6], ASA, and PSO-TVIW).
Step 1. Specify the desired filter specifications (M_d) as given in (8). Assume an initial temperature T₀ and a minimum temperature level T_min; initialize ρ = 2, 4, or 8 and N₁ = N₂ = 50; and use (5) to calculate c_pq and s_pq for p, q = 0, 1, 2 before optimization starts.

Step 2. Initialize the swarm size, the minimum value of the objective function (min E), a random population of coefficient vectors (x_i), and the maximum number of evaluations (k_max > 0).

Step 3. Set the iteration number k = 1, assign the best solution x_g = x, and use (2) to compute the fitness of the best solution pbest.

Step 4 (calculate fitness). Compute the fitness value of each particle in the swarm.

Step 5 (update pbest). The particle best position pbest is updated based on the Metropolis criterion (12) as follows: if the current fitness of the selected particle < the fitness of pbest, the current position of the particle is accepted as the new pbest with probability 1; otherwise the current particle is accepted according to ρ ≤ a, where ρ ∈ [0, 1] is a random number and a = exp(−(φ(p_current) − φ(pbest))/T).

Step 6 (update gbest). Assign the best particle's pbest value to gbest, and calculate the new velocity and position by (9) and (10) for all particles.

Step 7 (reduce temperature). Calculate the new temperature T specified by the cooling schedule (13). If T ≤ T_min, terminate the search for the best solution; otherwise go to Step 8.

Step 8 (update velocities and positions). Calculate the velocity V_i of each particle using (9) and its position using (10).

Step 9. Update k = k + 1 and check whether k ≥ k_max or the minimum value of the cost function has been achieved by any particle. If yes, assign x = gbest; otherwise go back to Step 4.

Step 10. Design the 2D filter based on the coefficients x = gbest and compute the magnitude response.
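Steps 1–10 can be condensed into the following Python skeleton (a hedged sketch under our own naming: the `fitness` callable stands in for the paper's cost function (2), which depends on the desired response (8), and a simple sphere function is used below only as a smoke-test objective):

```python
import math
import numpy as np

rng = np.random.default_rng(7)

def sa_pso(fitness, dim=15, n=30, k_max=200, T0=50.0, T_min=1e-10,
           C=0.01, Q=0.01, w_max=0.9, w_min=0.4, c1=2.05, c2=2.05,
           v_max=1.0):
    """Skeleton of Steps 1-10; `fitness` maps a coefficient vector to a cost."""
    X = rng.uniform(-3.0, 3.0, (n, dim))        # Step 2: random swarm
    V = np.zeros((n, dim))
    pbest = X.copy()                            # Step 3: initial bests
    pfit = np.array([fitness(x) for x in X])
    g = int(np.argmin(pfit))
    gbest, gfit = pbest[g].copy(), pfit[g]
    T = T0
    for k in range(1, k_max + 1):
        w = (w_max - w_min) * (k_max - k) / k_max + w_min      # TVIW weight
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)  # eq. (9)
        V = np.clip(V, -v_max, v_max)           # velocity limiting
        X = X + V                               # eq. (10)
        for i in range(n):                      # Steps 4-5: Metropolis pbest
            f = fitness(X[i])
            dE = f - pfit[i]
            if dE <= 0 or rng.random() <= math.exp(-dE / T):
                pbest[i], pfit[i] = X[i].copy(), f
        g = int(np.argmin(pfit))                # Step 6: track global best
        if pfit[g] < gfit:
            gbest, gfit = pbest[g].copy(), pfit[g]
        T = T0 * math.exp(-C * k ** (Q / dim))  # Step 7: cooling, eq. (13)
        if T <= T_min:
            break                               # Step 9: stop criterion
    return gbest, gfit

best, best_f = sa_pso(lambda x: float(np.sum(x * x)))  # stand-in objective
```

The global best is kept monotone while personal bests may occasionally worsen under the Metropolis rule, which is the mechanism the paper credits for escaping local optima.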
5. Simulation Results and Discussion
To validate the performance of the proposed SA-PSO, we implemented the algorithm in MATLAB 2009 on an Intel(R) Core 2 Duo CPU E7300 (2.66 GHz, 2 GB RAM). All the associated algorithms are executed for 20 independent runs with 40,000 function evaluations (FE). Table 2 shows the choice of the required parameters for the GA- and SA-PSO-based methods. For the proposed SA-PSO-based methods, the initial population size is chosen as 100; it is observed that further increases in population size leave the solution quality unchanged. In SA-PSO-TVIW the inertia weight is decreased from 0.9 to 0.4 to obtain the best possible solutions, whereas in SA-PSO-RANDIW it is varied randomly between 0.5 and 1. The starting temperature for annealing is chosen to be very high so that the search can escape from local optima easily. The parameters of the cooling schedule are chosen to be very small to assure global convergence; otherwise the search can skip the true global solutions [20, 21].
Table 3 compares the experimental results of the proposed method with NN (2001), GA (2003), GENETICA (2006), TBIA (2008), QPSO-DGM (2010), ASA, and PSO-TVIW with respect to the best known solutions. The best results among the different methods are highlighted in bold.
Figure 3: Amplitude response |M(ω₁, ω₂)| of the 2D filter for ρ = 2: (a) SA-PSO-TVIW, (b) SA-PSO-RANDIW, (c) NN, (d) GA, (e) GENETICA, and (f) TBIA.
It can be easily verified that the experimental results of the proposed methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, are better than all other reported methods [1–6], as reflected, for example, in the percentage improvement in φ₂ provided by the proposed SA-PSO-TVIW method. Further, for different values of ρ, the best possible solutions (best) and worst solutions (worst) of the different methods are listed in Table 4.
Figure 4: Amplitude response |M(ω₁, ω₂)| of the 2D filter for ρ = 4: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.
Table 4: Comparative results of different algorithms in terms of best- and worst-case solutions.

The best/worst solutions obtained by the proposed SA-PSO-based algorithm are the best for all runs. Furthermore, Table 5 presents the mean and variance (VAR) of the optimal solutions for different values of ρ. From Table 5 it can be observed that the proposed SA-PSO algorithm exhibits better performance with less computational time (i.e., less than half the time of GA or ASA) and produces optimal solutions with much lower variance. Hence it can be implemented more efficiently in real-time design of 2D filters.
Figures 3(a)–3(f) show the calculated amplitude response of the 2D filter for ρ = 2. The best results obtained by SA-PSO-TVIW and SA-PSO-RANDIW are shown in Figures 3(a) and 3(b), respectively; for comparison, the results obtained by NN, GA, GENETICA, and TBIA are also included in Figures 3(c)–3(f). A closer look at these figures reveals that the SA-PSO-based methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, yield a better approximation to the desired response, with much smaller stop-band ripple than the other competitive methods [1–6]. Figures 4(a) and 4(b) show the magnitude response obtained using the SA-PSO-based methods for ρ = 4, whereas Figures 5(a) and 5(b) show the magnitude response for ρ = 8.

Figures 6(a) and 6(b) show the best convergence profiles of the SA-PSO-based methods for ρ = 2 and 4, respectively. Among the 20 independent executions of the four algorithms (the two existing algorithms, GA and PSO-RANDIW, and the two proposed algorithms, SA-PSO-TVIW and SA-PSO-RANDIW), the best-performing runs are plotted. Initially, for the first few thousand FEs, PSO-RANDIW converges faster than GA, SA-PSO-RANDIW, and SA-PSO-TVIW. After a certain number of iterations, PSO-RANDIW and GA both exhibit premature convergence and settle to near-optimal solutions. After 30,000 FEs, the SA-PSO-based methods fall below the curves of both GA and PSO, because SA helps particles jump out of local optima from the beginning of the search. As a result, at the end of the search procedure the proposed hybrid method provides the best candidate solution. Hence the proposed hybrid algorithm produces better optimal solutions during the search process for 2D recursive filters.
6. Conclusions

In this paper, a novel hybrid evolutionary algorithm based on PSO and SA is proposed for finding the global best solution of second-order two-dimensional recursive digital filters. The proposed hybrid method integrates the global search capability of PSO with SA to escape from local minima. The experimental results are compared with earlier reported algorithms, namely NN, GA, GENETICA, and TBIA, in addition to different variants of PSO. The comparison indicates that the SA-PSO-based methods exhibit better performance in all experiments and provide the best optimum solution during the search. The proposed method also produces the best solution with lower mean and variance. Hence the proposed approach could be an alternative for implementing two-dimensional digital filters in real-time applications.
Table 5: Comparative results of different algorithms in terms of mean and VAR.

Algorithm        | ρ = 2: Mean / VAR  | ρ = 4: Mean / VAR   | ρ = 8: Mean / VAR   | Time (s)
GA               | 4.6137 / 0.6160    | 0.4300 / 0.0688     | 0.0066 / 7.08E−06   | 89.80
ASA              | 4.2475 / 0.0288    | 0.2679 / 3.14E−04   | 0.0043 / 3.14E−06   | 62.04
PSO-TVIW         | 4.4858 / 0.2347    | 0.3044 / 0.0086     | 0.0039 / 1.60E−06   | 46.84
PSO-CIW          | 4.2127 / 0.2778    | 0.2847 / 0.0022     | 0.0035 / 1.42E−06   | 39.02
PSO-RANDIW       | 4.066  / 0.3453    | 0.2549 / 0.0013     | 0.0037 / 5.50E−07   | 40.54
SA-PSO (TVIW)    | 3.8762 / 0.5053    | 0.3123 / 0.0029     | 0.0051 / 3.61E−06   | 35.68
SA-PSO (RANDIW)  | 2.9059 / 0.0094    | 0.1845 / 1.58E−04   | 0.0023 / 7.28E−07   | 30.07
Figure 5: Amplitude response |M(ω₁, ω₂)| of the 2D filter for ρ = 8: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.
Figure 6: Best fitness profiles for GA, PSO-RANDIW, SA-PSO-TVIW, and SA-PSO-RANDIW (fitness on a log scale versus number of iterations, ×10⁴): (a) ρ = 2 and (b) ρ = 4.
Emphasis on higher-order two-dimensional recursive filter design with guaranteed stability, and on further reduction of the computational complexity, is left for future work.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
References
[1] V. M. Mladenov and N. E. Mastorakis, "Design of two-dimensional recursive filters by using neural networks," IEEE Transactions on Neural Networks, vol. 12, no. 3, pp. 585–590, 2001.
[2] N. Mastorakis, I. F. Gonos, and M. N. S. Swamy, "Design of two-dimensional recursive filters using genetic algorithms," IEEE Transactions on Circuits and Systems, vol. 50, no. 5, pp. 634–639, 2003.
[3] I. F. Gonos, L. I. Virirakis, N. E. Mastorakis, and M. N. S. Swamy, "Evolutionary design of 2-dimensional recursive filters via the computer language GENETICA," IEEE Transactions on Circuits and Systems II, vol. 53, no. 4, pp. 254–258, 2006.
[4] J.-T. Tsai, W.-H. Ho, and J.-H. Chou, "Design of two-dimensional recursive filters by using Taguchi-based immune algorithm," IET Signal Processing, vol. 2, no. 2, pp. 110–117, 2008.
[5] D. T. Pham and E. Koc, "Design of a two-dimensional recursive filter using the bees algorithm," International Journal of Automation and Computing, vol. 7, no. 3, pp. 399–402, 2010.
[6] J. Sun, W. Fang, and W. Xu, "A quantum-behaved particle swarm optimization with diversity-guided mutation for the design of two-dimensional IIR digital filters," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 57, no. 2, pp. 141–145, 2010.
[7] F. Zhao, Q. Zhang, D. Yu, X. Chen, and Y. Yang, "A hybrid algorithm based on PSO and simulated annealing and its applications for partner selection in virtual enterprise," in Proceedings of the International Conference on Advances in Intelligent Computing, pp. 380–389, Springer, 2005.
[8] A. Jamili, M. A. Shafia, and R. Tavakkoli-Moghaddam, "A hybrid algorithm based on particle swarm optimization and simulated annealing for a periodic job shop scheduling problem," International Journal of Advanced Manufacturing Technology, vol. 54, no. 1–4, pp. 309–322, 2011.
[9] F. Zhao, Y. Hong, D. Yu, Y. Yang, Q. Zhang, and H. Yi, "A hybrid algorithm based on particle swarm optimization and simulated annealing to holon task allocation for holonic manufacturing system," International Journal of Advanced Manufacturing Technology, vol. 32, no. 9-10, pp. 1021–1032, 2007.
[10] L. Idoumghar, M. Melkemi, R. Schott, and M. I. Aouad, "Hybrid PSO-SA type algorithms for multimodal function optimization and reducing energy consumption in embedded systems," Applied Computational Intelligence and Soft Computing, vol. 2011, Article ID 138078, 12 pages, 2011.
[11] H.-L. Shieh, C.-C. Kuo, and C.-M. Chiang, "Modified particle swarm optimization algorithm with simulated annealing behavior and its numerical verification," Applied Mathematics and Computation, vol. 218, no. 8, pp. 4365–4383, 2011.
[12] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, 1995.
[13] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the IEEE International Congress on Evolutionary Computation, vol. 3, pp. 101–106, 1999.
[14] M. Clerc and J. Kennedy, "The particle swarm-explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[15] A. Nickabadi, M. M. Ebadzadeh, and R. Safabakhsh, "A novel particle swarm optimization algorithm with adaptive inertia weight," Applied Soft Computing Journal, vol. 11, no. 4, pp. 3658–3670, 2011.
[16] J. C. Bansal, P. K. Singh, M. Saraswat, A. Verma, S. S. Jadon, and A. Abraham, "Inertia weight strategies in particle swarm optimization," in Proceedings of the 3rd World Congress on Nature and Biologically Inspired Computing (NaBIC '11), pp. 633–640, Salamanca, Spain, October 2011.
[17] Y. Feng, G.-F. Teng, A.-X. Wang, and Y.-M. Yao, "Chaotic inertia weight in particle swarm optimization," in Proceedings of the 2nd International Conference on Innovative Computing, Information and Control (ICICIC '07), p. 475, Kumamoto, Japan, September 2007.
[18] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[19] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[20] L. Ingber, "Adaptive simulated annealing (ASA): lessons learned," Journal of Control and Cybernetics, vol. 25, no. 1, pp. 33–54, 1996.
[21] S. Dhabal and P. Venkateswaran, "An efficient nonuniform cosine modulated filter bank design using simulated annealing," Journal of Signal and Information Processing, vol. 3, no. 3, pp. 330–338, 2012.
To compare the performance of the proposed SA-PSO method, the same design specifications have been used for the desired amplitude response [1–6]:

M_d(ω₁, ω₂) = 1,    if √(ω₁² + ω₂²) ≤ 0.08π
M_d(ω₁, ω₂) = 0.5,  if 0.08π ≤ √(ω₁² + ω₂²) ≤ 0.12π
M_d(ω₁, ω₂) = 0,    otherwise   (8)

The desired amplitude response of the 2D filter satisfying (8) is depicted in Figure 1.
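The piecewise specification (8) maps directly to code: a circular pass-band of radius 0.08π, a transition ring up to 0.12π, and a stop-band elsewhere (a sketch; the function name is ours):

```python
import math

def desired_response(w1, w2):
    """Desired amplitude |Md(w1, w2)| from eq. (8)."""
    r = math.hypot(w1, w2)           # radial frequency sqrt(w1^2 + w2^2)
    if r <= 0.08 * math.pi:
        return 1.0                   # pass-band
    if r <= 0.12 * math.pi:
        return 0.5                   # transition ring
    return 0.0                       # stop-band
```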
10 Journal of Optimization
Emphasis on higher order two-dimensional recursive filterdesign with guaranteed stability and further reduction incomputational complexity is left as future work
Conflict of Interests
The authors declare that there is no conflict of interestsregarding the publication of this paper
References
[1] V M Mladenov and N E Mastorakis ldquoDesign of two-dimensional recursive filters by using neural networksrdquo IEEETransactions on Neural Networks vol 12 no 3 pp 585ndash5902001
[2] NMastorakis I F Gonos andM N S Swamy ldquoDesign of two-dimensional recursive filters using genetic algorithmsrdquo IEEETransactions on Circuits and Systems vol 50 no 5 pp 634ndash6392003
[3] I F Gonos L I Virirakis N EMastorakis andMN S SwamyldquoEvolutionary design of 2-dimensional recursive filters via thecomputer language GENETICArdquo IEEE Transactions on Circuitsand Systems II vol 53 no 4 pp 254ndash258 2006
[4] J-T Tsai W-H Ho and J-H Chou ldquoDesign of two-dimensional recursive filters by using Taguchi-based immunealgorithmrdquo IET Signal Processing vol 2 no 2 pp 110ndash117 2008
[5] D T Pham and E Koc ldquoDesign of a two-dimensional recur-sive filter using the bees algorithmrdquo International Journal ofAutomation and Computing vol 7 no 3 pp 399ndash402 2010
[6] J SunW Fang andWXu ldquoA quantum-behaved particle swarmoptimization with diversity-guided mutation for the designof two-dimensional IIR digital filtersrdquo IEEE Transactions onCircuits and Systems II Express Briefs vol 57 no 2 pp 141ndash1452010
[7] F Zhao Q Zhang D Yu X Chen and Y Yang ldquoA hybridalgorithm based on PSO and simulated annealing and itsapplications for partner selection in virtual enterpriserdquo inProceedings of the International Conference on Advances inIntelligent Computing pp 380ndash389 Springer 2005
[8] A Jamili M A Shafia and R Tavakkoli-Moghaddam ldquoAhybrid algorithm based on particle swarm optimization andsimulated annealing for a periodic job shop scheduling prob-lemrdquo International Journal of AdvancedManufacturing Technol-ogy vol 54 no 1ndash4 pp 309ndash322 2011
[9] F Zhao Y Hong D Yu Y Yang Q Zhang and H Yi ldquoA hybridalgorithm based on particle swarm optimization and simulatedannealing to holon task allocation for holonic manufacturingsystemrdquo International Journal of AdvancedManufacturing Tech-nology vol 32 no 9-10 pp 1021ndash1032 2007
[10] L IdoumgharMMelkemi R Schott andM I Aouad ldquoHybridPSO-SA type algorithms for multimodal function optimizationand reducing energy consumption in embedded systemsrdquoApplied Computational Intelligence and Soft Computing vol2011 Article ID 138078 12 pages 2011
[11] H-L Shieh C-C Kuo and C-M Chiang ldquoModified parti-cle swarm optimization algorithm with simulated annealingbehavior and its numerical verificationrdquo Applied Mathematicsand Computation vol 218 no 8 pp 4365ndash4383 2011
[12] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks vol 4 pp 1942ndash1948 1995
[13] Y Shi and R C Eberhart ldquoEmpirical study of particle swarmoptimizationrdquo in Proceedings of the IEEE International Congresson Evolutionary Computation vol 3 pp 101ndash106 1999
[14] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002
[15] A Nickabadi M M Ebadzadeh and R Safabakhsh ldquoA novelparticle swarm optimization algorithm with adaptive inertiaweightrdquoApplied Soft Computing Journal vol 11 no 4 pp 3658ndash3670 2011
[16] J C Bansal P K Singh M Saraswat A Verma S S Jadonand A Abraham ldquoInertia weight strategies in particle swarmoptimizationrdquo in Proceedings of the 3rd World Congress onNature and Biologically Inspired Computing (NaBIC 11) pp633ndash640 Salamanca Spain October 2011
[17] Y Feng G-F Teng A-XWang andY-M Yao ldquoChaotic inertiaweight in particle swarmoptimizationrdquo inProceedings of the 2ndInternational Conference on Innovative Computing Informationand Control (ICICIC 07) p 475 Kumamoto Japan September2007
[18] S Kirkpatrick J Gelatt and M P Vecchi ldquoOptimization bysimulated annealingrdquo Science vol 220 no 4598 pp 671ndash6801983
[19] N Metropolis A W Rosenbluth M N Rosenbluth A HTeller and E Teller ldquoEquation of state calculations by fastcomputing machinesrdquo The Journal of Chemical Physics vol 21no 6 pp 1087ndash1092 1953
[20] L Ingber ldquoAdaptive simulated annealing (ASA) lessonslearnedrdquo Journal of Control and Cybernetics vol 25 no 1 pp33ndash54 1996
[21] S Dhabal and P Venkateswaran ldquoAn efficient nonuniformcosinemodulated filter bank design using simulated annealingrdquoJournal of Signal and Information Processing vol 3 no 3 pp330ndash338 2012
Table 1: Summary of different inertia weight strategies.

Name of inertia weight | Formula | Parameters
Linear decreasing inertia weight (PSO-TVIW) | ω = (ωmax − ωmin) × (itermax − iter)/itermax + ωmin | ωmax = 0.9, ωmin = 0.4
Random inertia weight (PSO-RANDIW) | ω = 0.5 + rand()/2 | rand() ∈ [0, 1]
Chaotic inertia weight (PSO-CIW) | z = 4z(1 − z); ω = (ωmax − ωmin) × (itermax − iter)/itermax + ωmin × z | ωmax = 0.9, ωmin = 0.4, z ∈ [0, 1]
Table 2: Parameter settings for GA and the proposed SA-PSO.

Algorithm | Parameter | Value
GA | Population size | 100
GA | Selection | Roulette wheel, probability 1/3
GA | Crossover | Two-point, rate 0.8
GA | Mutation | Gaussian, rate 0.01
SA-PSO | Population size | 100
SA-PSO | Inertia weight (ω) | 0.9 → 0.4
SA-PSO | c1, c2 | 2.05
SA-PSO | Initial temperature | T0 = 50
SA-PSO | Minimum temperature | Tmin = 10^−10
SA-PSO | Cooling schedule | C = 0.01, Q = 0.01
where χ is called the constriction factor, given by χ = 2 / |2 − φ − √(φ² − 4φ)|, with φ = c1 + c2 > 4.
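As a quick numerical check, the constriction factor can be computed directly. This is a minimal sketch assuming the Table 2 settings c1 = c2 = 2.05 (so φ = 4.1); the function name is ours:

```python
import math

def constriction_factor(c1, c2):
    # Clerc-Kennedy constriction: chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|,
    # defined only for phi = c1 + c2 > 4.
    phi = c1 + c2
    if phi <= 4:
        raise ValueError("constriction requires c1 + c2 > 4")
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

# For c1 = c2 = 2.05, chi is approximately 0.7298.
print(constriction_factor(2.05, 2.05))
```

For the paper's choice c1 = c2 = 2.05, this gives the widely used value χ ≈ 0.7298.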
It is well known that in the PSO process the inertia weight (ω) plays an important role in balancing the exploration-exploitation trade-off. In [15, 16] it has been demonstrated that chaotic inertia weight based PSO (PSO-CIW) [17] is the best strategy for accuracy, whereas PSO with random inertia weight (PSO-RANDIW) produces optimal solutions most efficiently. Therefore these two best performing inertia weight strategies, together with the time-varying inertia weight (PSO-TVIW), are considered here to compare the performance of the proposed SA-PSO. A summary of the different inertia weight strategies reported earlier is given in Table 1, along with the required constraints.
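The three strategies of Table 1 can be sketched as follows; this is a minimal illustration, the function names are ours, and the defaults follow the table:

```python
import random

W_MAX, W_MIN = 0.9, 0.4  # bounds from Table 1

def tviw(iteration, iter_max, w_max=W_MAX, w_min=W_MIN):
    # Linear decreasing (time-varying) inertia weight, PSO-TVIW.
    return (w_max - w_min) * (iter_max - iteration) / iter_max + w_min

def randiw():
    # Random inertia weight, PSO-RANDIW: uniform in [0.5, 1.0].
    return 0.5 + random.random() / 2.0

def ciw(iteration, iter_max, z, w_max=W_MAX, w_min=W_MIN):
    # Chaotic inertia weight, PSO-CIW: the logistic map z = 4z(1 - z)
    # modulates the w_min term of the TVIW formula. Returns (w, new z).
    z = 4.0 * z * (1.0 - z)
    w = (w_max - w_min) * (iter_max - iteration) / iter_max + w_min * z
    return w, z
```

Note that TVIW sweeps deterministically from 0.9 down to 0.4, while RANDIW and CIW inject randomness or chaos into the weight at every iteration.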
4. The Proposed SA-PSO Algorithm

The main benefit of PSO is that it is a problem-independent stochastic search optimization method. Owing to its stochastic nature, however, it shows insufficient global searching ability at the end of a run, particularly on multimodal functions. Hence, to escape from local minima and to increase the diversity of the particles, SA is integrated with PSO. Like PSO, SA is a heuristic random-search global optimization method, proposed by Kirkpatrick et al. [18]. During the search process SA accepts not only better candidate solutions but also, to a certain degree, worse solutions, according to the Metropolis criterion [19], which can be mathematically represented by
a(x) = { 1, if ΔE ≤ 0; exp(−ΔE/T), if ΔE > 0 }  (12)
where ΔE denotes the change in energy due to the perturbation of parameters and T is the current temperature of the system. To apply (12), a random number ρ ∈ [0, 1] is generated and it is checked whether ρ ≤ a(x) holds. A suitable cooling schedule based on adaptive simulated annealing (ASA) is introduced here to update the temperature of the system [20]. The annealing schedule for the kth iteration is written as follows:
T(k) = T(0) exp(−C k^(Q/D))  (13)

where T(0) is the starting temperature, C is the cooling constant, Q represents the quenching factor, and D is the dimension of the search space. Finally, in our proposed method a well-constructed stopping condition is included to minimize the execution time and computational effort: the proposed algorithm terminates when either the minimum value of the objective function, the final temperature, or the maximum number of iterations is reached [21].
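Under these definitions, the acceptance rule (12) and an ASA-style schedule of this form can be sketched as follows. This is a hedged illustration, not the authors' code: the function names are ours, the defaults C = Q = 0.01 and T(0) = 50 follow Table 2, and dim = 15 assumes the 15-coefficient particle introduced below:

```python
import math
import random

def metropolis_accept(delta_e, temperature):
    # Metropolis criterion (12): improvements (delta_e <= 0) are always
    # accepted; a worse solution is accepted with probability exp(-dE/T).
    if delta_e <= 0:
        return True
    return random.random() <= math.exp(-delta_e / temperature)

def asa_temperature(t0, k, c=0.01, q=0.01, dim=15):
    # ASA-style schedule (13): T(k) = T(0) * exp(-C * k**(Q / D)).
    # With C = Q = 0.01 and D = 15, cooling is deliberately very slow,
    # so worse solutions keep a fair acceptance chance for a long time.
    return t0 * math.exp(-c * k ** (q / dim))
```

As the temperature decreases, exp(−ΔE/T) shrinks for any fixed ΔE > 0, so the search gradually shifts from exploration toward pure greedy acceptance.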
The design of the 2D recursive filter starts by assuming a population of random solutions in a multidimensional search space. Here each particle consists of 15 positional coordinates, represented by

X = [a01, a02, a10, a11, a12, a20, a21, a22, b1, b2, c1, c2, d1, d2, H0].  (14)

Therefore the 2D filter coefficients are represented as a vector X whose entries are selected in the interval [−3, 3]. Each entry of X is optimized based on the flowchart presented in Figure 2, and the proposed design flow is given below.
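The coefficient vector of (14) and its initialization range can be illustrated as follows; the helper names are hypothetical, not from the paper:

```python
import random

# Coefficient names in the order of (14); the naming of this list is ours.
COEFF_NAMES = ["a01", "a02", "a10", "a11", "a12", "a20", "a21", "a22",
               "b1", "b2", "c1", "c2", "d1", "d2", "H0"]

def random_particle(low=-3.0, high=3.0):
    # One candidate solution: 15 coefficients drawn uniformly from [-3, 3].
    return {name: random.uniform(low, high) for name in COEFF_NAMES}
```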
Journal of Optimization 5
[Figure 2: Flowchart of the proposed algorithm: parameter settings; particle initialization; fitness calculation for all particles and selection of the initial best; velocity and position updates; random selection of a particle from the updated population and fitness calculation; Metropolis-based acceptance of the new particle for the next generation (accepted if better than the old one, otherwise randomly via a ∈ [0, 1] against e^(−ΔE/T)); temperature reduction using the cooling schedule; stop-criterion check; design of the 2D filter using the best particle.]
Table 3: Best results of 2D filter coefficients obtained by different approaches for ρ = 2. Methods compared: NN [1], GA [2], GENETICA [3], TBIA [4], QPSO-DGM [6], ASA, and PSO-TVIW.
Step 1. Specify the desired filter (Md) specifications as given in (8). Assume the initial temperature (T0) and the minimum temperature level (Tmin), initialize ρ = 2, 4, or 8 and N1 = N2 = 50, and use (5) to calculate c_pq and s_pq for p, q = 0, 1, 2 before the optimization starts.

Step 2. Initialize the swarm size, the minimum value of the objective function (min E), a random population of coefficient vectors (x_i), and the maximum number of evaluations (k_max > 0).

Step 3. Set the iteration number k = 1, assign the best solution x_g = x, and use (2) to compute the fitness of the best solution p_best.

Step 4 (calculate fitness). Compute the fitness value of each particle in the swarm.

Step 5 (update p_best). The particle best position p_best is updated based on the Metropolis criterion (12) as follows: if the current fitness of the selected particle < the fitness of p_best, then the current position of the particle is accepted as the new p_best with probability 1; otherwise, the current particle is accepted according to ρ ≤ a, where ρ ∈ [0, 1] and a = exp(−(φ(p_current) − φ(p_best))/T).

Step 6 (update g_best). Now assign the best particle's p_best value to g_best and calculate the new velocity and position by (9) and (10) for all particles.

Step 7 (reduce temperature). Calculate the new temperature T specified by the cooling schedule (13). If T ≤ Tmin, then terminate the search for the best solution; otherwise go to Step 8.

Step 8 (update velocities and positions). Calculate the velocities V_i using (9) and the position of each particle by (10).

Step 9. Update k = k + 1 and check whether k ≥ k_max or the minimum value of the cost function has been achieved by any particle. If yes, then assign x = g_best; otherwise go back to Step 4.

Step 10. Design the 2D filter based on the coefficients of x = g_best and compute the magnitude response.
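Steps 1 to 10 can be sketched end to end as follows. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: `fitness` is a placeholder for the error measure of (2), the TVIW inertia and the constants c1 = c2 = 2.05, T0 = 50, C = Q = 0.01 follow Tables 1 and 2, and clamping positions to [−3, 3] matches the coefficient range of (14):

```python
import math
import random

def sa_pso(fitness, dim=15, swarm=30, k_max=200, bounds=(-3.0, 3.0),
           c1=2.05, c2=2.05, t0=50.0, t_min=1e-10, c=0.01, q=0.01):
    # Minimizes `fitness` following Steps 1-10 of the text.
    lo, hi = bounds
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    v = [[0.0] * dim for _ in range(swarm)]
    pbest = [xi[:] for xi in x]
    pcost = [fitness(xi) for xi in x]
    g = min(range(swarm), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    t = t0
    for k in range(1, k_max + 1):
        w = (0.9 - 0.4) * (k_max - k) / k_max + 0.4      # TVIW inertia (Table 1)
        for i in range(swarm):
            cost = fitness(x[i])                          # Step 4
            delta = cost - pcost[i]
            # Step 5: Metropolis criterion (12) for the pbest update.
            if delta <= 0 or random.random() <= math.exp(-delta / t):
                pbest[i], pcost[i] = x[i][:], cost
            if pcost[i] < gcost:                          # Step 6
                gbest, gcost = pbest[i][:], pcost[i]
        for i in range(swarm):                            # Step 8
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * random.random() * (pbest[i][d] - x[i][d])
                           + c2 * random.random() * (gbest[d] - x[i][d]))
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
        t = t0 * math.exp(-c * k ** (q / dim))            # Step 7: schedule (13)
        if t <= t_min:
            break
    return gbest, gcost
```

A sphere function stands in for the filter-error objective here; the global best is only ever replaced by a strictly better candidate, so the returned cost is monotone nonincreasing over iterations even while individual p_best values may temporarily worsen under the Metropolis rule.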
5. Simulation Results and Discussion

In order to validate the performance of the proposed SA-PSO, we implemented the algorithm in MATLAB 2009 on a Genuine Intel(R) Core 2 Duo CPU E7300 at 2.66 GHz with 2 GB RAM. All the associated algorithms are executed for 20 independent runs with 40,000 functional evaluations (FE). Table 2 shows the choice of the required parameters for GA and the SA-PSO based methods. For the proposed SA-PSO based methods the initial population size is chosen as 100; it is observed that further increases in population size leave the solution quality unchanged. In SA-PSO-TVIW the inertia weight is decreased from 0.9 to 0.4 to obtain the best possible solutions, whereas in SA-PSO-RANDIW it is varied randomly between 0.5 and 1. The starting temperature for annealing is chosen to be very high so that the search can escape from local optima easily. The parameters of the cooling schedule are chosen to be very small to assure global convergence; otherwise the search can skip the true global solutions [20, 21].

Table 3 shows an analysis of the experimental results of the proposed method against NN (2001), GA (2003), GENETICA (2006), TBIA (2008), QPSO-DGM (2010), ASA, and PSO-TVIW with respect to the best known solutions. The best results among the different methods are highlighted in bold.
[Figure 3: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 2: (a) SA-PSO-TVIW, (b) SA-PSO-RANDIW, (c) NN, (d) GA, (e) GENETICA, and (f) TBIA.]
From these results it can be easily verified that the experimental results of the proposed methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, are better than those of all other reported methods [1-6], for example, in the percentage improvement in φ2 provided by the proposed SA-PSO-TVIW method.

Further, in Table 4 the best possible solutions (best) and worst solutions (worst) are listed for the different methods for different values of ρ, which clearly reveals that the
[Figure 4: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 4: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.]
Table 4: Comparative results of different algorithms in terms of best and worst case solutions.
best/worst solutions obtained by the proposed SA-PSO based algorithm are the best for all runs. Furthermore, Table 5 presents the statistics for the average (Mean) and variance (VAR) of the optimal solutions for different values of ρ. From Table 5 it can be observed that the proposed SA-PSO algorithm exhibits better performance with less computational time (i.e., less than half the time required by GA or ASA) and produces optimal solutions with very low variance. Hence it can be implemented more efficiently in the real-time design of 2D filters.

Figures 3(a)-3(f) show the calculated amplitude response of the 2D filter for ρ = 2. The best results obtained by SA-PSO-TVIW and SA-PSO-RANDIW are shown in Figures 3(a) and 3(b), respectively. For comparison purposes, the results obtained by NN, GA, GENETICA, and TBIA are also included in Figures 3(c) to 3(f). A closer look at these figures reveals that the SA-PSO based methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, yield a better approximation to the desired response, and the stop-band ripple is much smaller than for the other competitive methods [1-6]. Figures 4(a) and 4(b) show the magnitude response obtained using the SA-PSO based methods for ρ = 4, whereas Figures 5(a) and 5(b) present the magnitude response for ρ = 8.

Figures 6(a) and 6(b) show the best convergence profiles of the SA-PSO based methods for ρ = 2 and 4, respectively. Among the 20 independent executions of the four algorithms (the two existing algorithms, GA and PSO-RANDIW, and the two proposed algorithms, SA-PSO-TVIW and SA-PSO-RANDIW), the best performing runs are plotted. Initially, for the first few thousand FEs, PSO-RANDIW converges faster than GA, SA-PSO-RANDIW, and SA-PSO-TVIW. After a certain number of iterations, PSO-RANDIW and GA both exhibit premature convergence and settle to near-optimal solutions. After 30,000 FEs the SA-PSO based methods fall below the curves of both GA and PSO, because SA helps particles jump out of local optima from the beginning of the search. Due to this, at the end of the search procedure the proposed hybrid method provides the best candidate solution. Hence the proposed hybrid algorithm produces better optimal solutions during the search process of 2D recursive filters.
6. Conclusions

In this paper a novel hybrid evolutionary algorithm based on PSO and SA is proposed for finding the globally best solution of second-order two-dimensional recursive digital filters. The proposed hybrid method integrates the global search capability of PSO with SA to escape from local minima. The experimental results are compared with earlier reported algorithms, namely NN, GA, GENETICA, and TBIA, in addition to different variants of PSO. The comparison indicates that the SA-PSO based methods exhibit better performance in all experiments and provide the best optimum solutions during the search. The proposed method also produces the best solutions with lower mean and variance. Hence the proposed approach could be an alternative for implementing two-dimensional digital filters in real-time applications.
Table 5: Comparative results of different algorithms in terms of Mean and VAR.

Algorithm | ρ = 2 (Mean / VAR) | ρ = 4 (Mean / VAR) | ρ = 8 (Mean / VAR) | Time (s)
GA | 4.6137 / 0.6160 | 0.4300 / 0.0688 | 0.0066 / 7.08E−06 | 89.80
ASA | 4.2475 / 0.0288 | 0.2679 / 3.14E−04 | 0.0043 / 3.14E−06 | 62.04
PSO-TVIW | 4.4858 / 0.2347 | 0.3044 / 0.0086 | 0.0039 / 1.60E−06 | 46.84
PSO-CIW | 4.2127 / 0.2778 | 0.2847 / 0.0022 | 0.0035 / 1.42E−06 | 39.02
PSO-RANDIW | 4.066 / 0.3453 | 0.2549 / 0.0013 | 0.0037 / 5.50E−07 | 40.54
SA-PSO (TVIW) | 3.8762 / 0.5053 | 0.3123 / 0.0029 | 0.0051 / 3.61E−06 | 35.68
SA-PSO (RANDIW) | 2.9059 / 0.0094 | 0.1845 / 1.58E−04 | 0.0023 / 7.28E−07 | 30.07
[Figure 5: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 8: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.]
[Figure 6: Best fitness profiles (fitness on a log scale versus number of iterations, ×10^4) for GA, PSO-RANDIW, SA-PSO-TVIW, and SA-PSO-RANDIW: (a) ρ = 2 and (b) ρ = 4.]
Emphasis on higher-order two-dimensional recursive filter design with guaranteed stability, and on further reduction of the computational complexity, is left as future work.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
References
[1] V. M. Mladenov and N. E. Mastorakis, "Design of two-dimensional recursive filters by using neural networks," IEEE Transactions on Neural Networks, vol. 12, no. 3, pp. 585–590, 2001.
[2] N. Mastorakis, I. F. Gonos, and M. N. S. Swamy, "Design of two-dimensional recursive filters using genetic algorithms," IEEE Transactions on Circuits and Systems, vol. 50, no. 5, pp. 634–639, 2003.
[3] I. F. Gonos, L. I. Virirakis, N. E. Mastorakis, and M. N. S. Swamy, "Evolutionary design of 2-dimensional recursive filters via the computer language GENETICA," IEEE Transactions on Circuits and Systems II, vol. 53, no. 4, pp. 254–258, 2006.
[4] J.-T. Tsai, W.-H. Ho, and J.-H. Chou, "Design of two-dimensional recursive filters by using Taguchi-based immune algorithm," IET Signal Processing, vol. 2, no. 2, pp. 110–117, 2008.
[5] D. T. Pham and E. Koc, "Design of a two-dimensional recursive filter using the bees algorithm," International Journal of Automation and Computing, vol. 7, no. 3, pp. 399–402, 2010.
[6] J. Sun, W. Fang, and W. Xu, "A quantum-behaved particle swarm optimization with diversity-guided mutation for the design of two-dimensional IIR digital filters," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 57, no. 2, pp. 141–145, 2010.
[7] F. Zhao, Q. Zhang, D. Yu, X. Chen, and Y. Yang, "A hybrid algorithm based on PSO and simulated annealing and its applications for partner selection in virtual enterprise," in Proceedings of the International Conference on Advances in Intelligent Computing, pp. 380–389, Springer, 2005.
[8] A. Jamili, M. A. Shafia, and R. Tavakkoli-Moghaddam, "A hybrid algorithm based on particle swarm optimization and simulated annealing for a periodic job shop scheduling problem," International Journal of Advanced Manufacturing Technology, vol. 54, no. 1–4, pp. 309–322, 2011.
[9] F. Zhao, Y. Hong, D. Yu, Y. Yang, Q. Zhang, and H. Yi, "A hybrid algorithm based on particle swarm optimization and simulated annealing to holon task allocation for holonic manufacturing system," International Journal of Advanced Manufacturing Technology, vol. 32, no. 9-10, pp. 1021–1032, 2007.
[10] L. Idoumghar, M. Melkemi, R. Schott, and M. I. Aouad, "Hybrid PSO-SA type algorithms for multimodal function optimization and reducing energy consumption in embedded systems," Applied Computational Intelligence and Soft Computing, vol. 2011, Article ID 138078, 12 pages, 2011.
[11] H.-L. Shieh, C.-C. Kuo, and C.-M. Chiang, "Modified particle swarm optimization algorithm with simulated annealing behavior and its numerical verification," Applied Mathematics and Computation, vol. 218, no. 8, pp. 4365–4383, 2011.
[12] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, 1995.
[13] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 3, pp. 101–106, 1999.
[14] M. Clerc and J. Kennedy, "The particle swarm – explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[15] A. Nickabadi, M. M. Ebadzadeh, and R. Safabakhsh, "A novel particle swarm optimization algorithm with adaptive inertia weight," Applied Soft Computing Journal, vol. 11, no. 4, pp. 3658–3670, 2011.
[16] J. C. Bansal, P. K. Singh, M. Saraswat, A. Verma, S. S. Jadon, and A. Abraham, "Inertia weight strategies in particle swarm optimization," in Proceedings of the 3rd World Congress on Nature and Biologically Inspired Computing (NaBIC '11), pp. 633–640, Salamanca, Spain, October 2011.
[17] Y. Feng, G.-F. Teng, A.-X. Wang, and Y.-M. Yao, "Chaotic inertia weight in particle swarm optimization," in Proceedings of the 2nd International Conference on Innovative Computing, Information and Control (ICICIC '07), p. 475, Kumamoto, Japan, September 2007.
[18] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[19] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[20] L. Ingber, "Adaptive simulated annealing (ASA): lessons learned," Control and Cybernetics, vol. 25, no. 1, pp. 33–54, 1996.
[21] S. Dhabal and P. Venkateswaran, "An efficient nonuniform cosine modulated filter bank design using simulated annealing," Journal of Signal and Information Processing, vol. 3, no. 3, pp. 330–338, 2012.
[4] J-T Tsai W-H Ho and J-H Chou ldquoDesign of two-dimensional recursive filters by using Taguchi-based immunealgorithmrdquo IET Signal Processing vol 2 no 2 pp 110ndash117 2008
[5] D T Pham and E Koc ldquoDesign of a two-dimensional recur-sive filter using the bees algorithmrdquo International Journal ofAutomation and Computing vol 7 no 3 pp 399ndash402 2010
[6] J SunW Fang andWXu ldquoA quantum-behaved particle swarmoptimization with diversity-guided mutation for the designof two-dimensional IIR digital filtersrdquo IEEE Transactions onCircuits and Systems II Express Briefs vol 57 no 2 pp 141ndash1452010
[7] F Zhao Q Zhang D Yu X Chen and Y Yang ldquoA hybridalgorithm based on PSO and simulated annealing and itsapplications for partner selection in virtual enterpriserdquo inProceedings of the International Conference on Advances inIntelligent Computing pp 380ndash389 Springer 2005
[8] A Jamili M A Shafia and R Tavakkoli-Moghaddam ldquoAhybrid algorithm based on particle swarm optimization andsimulated annealing for a periodic job shop scheduling prob-lemrdquo International Journal of AdvancedManufacturing Technol-ogy vol 54 no 1ndash4 pp 309ndash322 2011
[9] F Zhao Y Hong D Yu Y Yang Q Zhang and H Yi ldquoA hybridalgorithm based on particle swarm optimization and simulatedannealing to holon task allocation for holonic manufacturingsystemrdquo International Journal of AdvancedManufacturing Tech-nology vol 32 no 9-10 pp 1021ndash1032 2007
[10] L IdoumgharMMelkemi R Schott andM I Aouad ldquoHybridPSO-SA type algorithms for multimodal function optimizationand reducing energy consumption in embedded systemsrdquoApplied Computational Intelligence and Soft Computing vol2011 Article ID 138078 12 pages 2011
[11] H-L Shieh C-C Kuo and C-M Chiang ldquoModified parti-cle swarm optimization algorithm with simulated annealingbehavior and its numerical verificationrdquo Applied Mathematicsand Computation vol 218 no 8 pp 4365ndash4383 2011
[12] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks vol 4 pp 1942ndash1948 1995
[13] Y Shi and R C Eberhart ldquoEmpirical study of particle swarmoptimizationrdquo in Proceedings of the IEEE International Congresson Evolutionary Computation vol 3 pp 101ndash106 1999
[14] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002
[15] A Nickabadi M M Ebadzadeh and R Safabakhsh ldquoA novelparticle swarm optimization algorithm with adaptive inertiaweightrdquoApplied Soft Computing Journal vol 11 no 4 pp 3658ndash3670 2011
[16] J C Bansal P K Singh M Saraswat A Verma S S Jadonand A Abraham ldquoInertia weight strategies in particle swarmoptimizationrdquo in Proceedings of the 3rd World Congress onNature and Biologically Inspired Computing (NaBIC 11) pp633ndash640 Salamanca Spain October 2011
[17] Y Feng G-F Teng A-XWang andY-M Yao ldquoChaotic inertiaweight in particle swarmoptimizationrdquo inProceedings of the 2ndInternational Conference on Innovative Computing Informationand Control (ICICIC 07) p 475 Kumamoto Japan September2007
[18] S Kirkpatrick J Gelatt and M P Vecchi ldquoOptimization bysimulated annealingrdquo Science vol 220 no 4598 pp 671ndash6801983
[19] N Metropolis A W Rosenbluth M N Rosenbluth A HTeller and E Teller ldquoEquation of state calculations by fastcomputing machinesrdquo The Journal of Chemical Physics vol 21no 6 pp 1087ndash1092 1953
[20] L Ingber ldquoAdaptive simulated annealing (ASA) lessonslearnedrdquo Journal of Control and Cybernetics vol 25 no 1 pp33ndash54 1996
[21] S Dhabal and P Venkateswaran ldquoAn efficient nonuniformcosinemodulated filter bank design using simulated annealingrdquoJournal of Signal and Information Processing vol 3 no 3 pp330ndash338 2012
Step 1. Specify the desired filter specifications M_d as given in (8). Assume an initial temperature T_0 and a minimum temperature T_min; initialize ρ = 2, 4, or 8 and N1 = N2 = 50, and use (5) to calculate c_pq and s_pq for p, q = 0, 1, 2 before optimization starts.
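Step 1 fixes the desired amplitude response M_d on an N1 x N2 grid. Equation (8) is not reproduced in this excerpt, so the sketch below samples the circularly symmetric lowpass benchmark commonly used in the cited 2D design papers [1, 2]; the grid convention (ω sampled over [0, π] in steps of π/50) and the cutoff radii 0.08π and 0.12π are assumptions for illustration only.

```python
import numpy as np

# Hypothetical Step 1: sample a desired amplitude response M_d(w1, w2)
# on a 51 x 51 grid. The piecewise-constant circularly symmetric lowpass
# spec below is the classic benchmark of [1, 2], assumed here since (8)
# is not reproduced in this excerpt.
N1 = N2 = 50
w1 = np.pi * np.arange(N1 + 1) / N1   # frequency grid over [0, pi]
w2 = np.pi * np.arange(N2 + 1) / N2
W1, W2 = np.meshgrid(w1, w2, indexing="ij")
R = np.sqrt(W1**2 + W2**2)            # radial frequency

Md = np.where(R <= 0.08 * np.pi, 1.0,            # passband
     np.where(R <= 0.12 * np.pi, 0.5, 0.0))      # transition / stopband
```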
Step 2. Initialize the swarm size, the minimum value of the objective function (min E), a random population of coefficient vectors x_i, and the maximum number of evaluations (k_max > 0).
Step 3. Set the iteration number k = 1, assign the best solution x_g = x, and use (2) to compute the fitness of the best solution p_best.
Step 4 (calculate fitness). Compute the fitness value of each particle in the swarm.
Step 5 (update p_best). The particle best position p_best is updated based on the Metropolis criterion (12) as follows: if the current fitness of the selected particle < fitness of p_best, then the current position of the particle is accepted as the new p_best with probability 1; otherwise, the current particle is accepted if ρ ≤ a, where ρ ∈ [0, 1] is a uniform random number and a = exp(-(φ(p_current) - φ(p_best))/T).
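The Metropolis rule of Step 5 can be written as a small helper. This is a sketch of the acceptance logic only; the fitness φ and the exact form of criterion (12) follow the paper, and the function names are ours.

```python
import math
import random

def accept_prob(phi_current, phi_best, temperature):
    # Step 5: an improving particle replaces p_best with probability 1;
    # a worsening one is accepted with the Metropolis probability
    # a = exp(-(phi(p_current) - phi(p_best)) / T).
    if phi_current < phi_best:
        return 1.0
    return math.exp(-(phi_current - phi_best) / temperature)

def update_pbest(p_current, phi_current, p_best, phi_best, temperature):
    # Draw a uniform number in [0, 1] and accept p_current if it does
    # not exceed the acceptance probability; otherwise keep p_best.
    if random.random() <= accept_prob(phi_current, phi_best, temperature):
        return p_current, phi_current
    return p_best, phi_best
```

At high temperature the second branch accepts almost everything, which is what keeps the swarm diverse early in the search.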
Step 6 (update g_best). Assign the best particle's p_best value to g_best, and calculate the new velocity and position of every particle by (9) and (10).
Step 7 (reduce temperature). Calculate the new temperature T specified by the cooling schedule (13). If T ≤ T_min, terminate the search for the best solution; otherwise go to Step 8.
Step 8 (update velocities and positions). Calculate the velocity V_i of each particle using (9) and its position using (10).
Step 9. Update k = k + 1 and check whether k ≥ k_max or the minimum value of the cost function has been achieved by any particle. If so, assign x = g_best; otherwise go back to Step 4.
Step 10. Design the 2D filter from the coefficients x = g_best and compute its magnitude response.
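The ten steps above can be sketched end to end. This is a minimal illustration, not the paper's implementation: the generic objective `phi` stands in for the fitness (2), the standard PSO updates play the role of (9) and (10), the Metropolis rule stands in for (12), a geometric schedule is assumed for the cooling rule (13), and all constants are placeholders (the paper's settings are in its Table 2).

```python
import numpy as np

rng = np.random.default_rng(42)

def sa_pso(phi, dim, n_particles=30, t0=100.0, t_min=1e-3, alpha=0.95,
           w=0.7, c1=1.5, c2=1.5, k_max=5000):
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))        # Step 2: random population
    v = np.zeros_like(x)
    pbest = x.copy()                                      # Step 3
    pbest_f = np.array([phi(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    g_f = float(pbest_f.min())
    T, k = t0, 1
    while T > t_min and k < k_max:                        # Steps 7 and 9: stop rules
        f = np.array([phi(p) for p in x])                 # Step 4: fitness
        for i in range(n_particles):                      # Step 5: Metropolis p_best
            if f[i] < pbest_f[i] or rng.random() <= np.exp(-(f[i] - pbest_f[i]) / T):
                pbest[i], pbest_f[i] = x[i].copy(), f[i]
        j = int(np.argmin(pbest_f))                       # Step 6: g_best kept elitist
        if pbest_f[j] < g_f:
            g, g_f = pbest[j].copy(), float(pbest_f[j])
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)   # Step 8: (9)
        x = x + v                                               # Step 8: (10)
        T *= alpha                                        # Step 7: assumed geometric cooling
        k += 1
    return g, g_f                                         # Step 10: best coefficients

# Toy run on a sphere objective as a stand-in for the filter-design fitness.
best, best_f = sa_pso(lambda p: float(np.sum(p * p)), dim=5)
```

Note that only p_best is allowed to move uphill; g_best stays monotone, so diversity is injected without losing the best solution found so far.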
5. Simulation Results and Discussion
In order to validate the performance of the proposed SA-PSO, we implemented the algorithm in MATLAB 2009 on a Genuine Intel(R) Core 2 Duo CPU E7300 (2.66 GHz, 2 GB RAM). All the associated algorithms are executed for 20 independent runs of 40,000 functional evaluations (FEs). Table 2 shows the choice of the required parameters for the GA and SA-PSO based methods. For the proposed SA-PSO based methods, the initial population size is chosen as 100; it is observed that a further increase in population size leaves the solution quality unchanged. In SA-PSO-TVIW the inertia weight is decreased from 0.9 to 0.4 to obtain the best possible solutions, whereas in SA-PSO-RANDIW it is varied randomly between 0.5 and 1. The starting temperature for annealing is chosen to be very high so that the search can escape from local optima easily. The parameters of the cooling schedule are chosen to be very small to assure global convergence; otherwise the search can skip the true global solutions [20, 21].
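The two inertia-weight strategies named in the paragraph above can be written out directly. The linear TVIW schedule follows the stated 0.9-to-0.4 decrease [13]; for RANDIW, the classic rule 0.5 + rand()/2 [16] is assumed to realize "varied randomly between 0.5 and 1".

```python
import random

def tviw(k, k_max, w_max=0.9, w_min=0.4):
    # SA-PSO-TVIW: inertia weight decreased linearly from 0.9 to 0.4
    # over the run, as stated in the text (cf. [13]).
    return w_max - (w_max - w_min) * k / k_max

def randiw():
    # SA-PSO-RANDIW: inertia weight drawn uniformly between 0.5 and 1
    # each iteration; the classic 0.5 + rand()/2 rule [16] is assumed.
    return 0.5 + random.random() / 2.0
```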
Table 3 shows an analysis of the experimental results of the proposed method against NN (2001), GA (2003), GENETICA (2006), TBIA (2008), QPSO-DGM (2010), ASA, and PSO-TVIW with respect to the best known solutions. The best results among the different methods are highlighted in bold.
Journal of Optimization 7
[Surface plots omitted; axes ω1 and ω2, vertical axis |M| from 0 to 1.]
Figure 3: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 2: (a) SA-PSO-TVIW, (b) SA-PSO-RANDIW, (c) NN, (d) GA, (e) GENETICA, and (f) TBIA.
It can be easily verified that the experimental results of the proposed methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, are better than those of all other reported methods [1–6]. For example, the percentage improvement in φ_2 provided by the proposed SA-PSO-TVIW method in comparison to …
Further, in Table 4, for different values of ρ, the best possible solutions (best) and worst solutions (worst) are listed for the different methods.
[Surface plots omitted; axes ω1 and ω2, vertical axis |M| from 0 to 1.]
Figure 4: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 4: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.
Table 4: Comparative results of different algorithms in terms of best and worst case solutions.
The best/worst solutions obtained by the proposed SA-PSO based algorithms are the best for all runs. Furthermore, Table 5 presents the average (mean) and variance (VAR) of the optimal solutions for different values of ρ. From Table 5 it can be observed that the proposed SA-PSO algorithm exhibits better performance with less computational time (i.e., less than half the time of GA or ASA) and produces optimal solutions with much lower variance. Hence it can be implemented more efficiently in the real-time design of 2D filters.
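The mean and VAR entries of Table 5 are per-algorithm statistics over the 20 independent runs. The snippet below shows how the two statistics are computed; the fitness values are placeholder numbers, not the paper's data.

```python
import numpy as np

# Hypothetical final-fitness values from 20 independent runs (placeholder
# numbers only); Table 5 reports the mean and variance of such a sample.
final_fitness = np.array([2.91, 2.93, 2.89, 2.92, 2.90] * 4)  # 20 runs
mean = final_fitness.mean()
var = final_fitness.var(ddof=1)   # sample variance across runs
```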
Figures 3(a)–3(f) show the calculated amplitude response of the 2D filter for ρ = 2. The best results obtained by SA-PSO-TVIW and SA-PSO-RANDIW are shown in Figures 3(a) and 3(b), respectively. For comparison purposes, the results obtained by NN, GA, GENETICA, and TBIA are also included in Figures 3(c) to 3(f). A closer look at these figures reveals that the SA-PSO based methods, that is, SA-PSO-TVIW and SA-PSO-RANDIW, yield a better approximation to the desired response, and the stop-band ripple is much lower than with the other competitive methods [1–6]. Figures 4(a) and 4(b) show the magnitude response obtained using the SA-PSO based methods for ρ = 4, whereas Figures 5(a) and 5(b) present the magnitude response for ρ = 8.

Figures 6(a) and 6(b) show the best convergence profiles of the SA-PSO based methods for ρ = 2 and 4, respectively. Among the 20 independent executions of the four algorithms (the two existing algorithms, GA and PSO-RANDIW, and the two proposed algorithms, SA-PSO-TVIW and SA-PSO-RANDIW), the best-performing runs are plotted. Initially, for a few thousand FEs, PSO-RANDIW converges faster than GA, SA-PSO-RANDIW, and SA-PSO-TVIW. After a certain number of iterations, both PSO-RANDIW and GA exhibit premature convergence and settle to near-optimal solutions. After 30,000 FEs the SA-PSO based methods fall below the curves of both GA and PSO, because SA helps particles jump out of local optima from the beginning of the search. Consequently, at the end of the search procedure, the proposed hybrid method provides the best candidate solution. Hence the proposed hybrid algorithm produces better optimal solutions during the search process for 2D recursive filters.
6. Conclusions
In this paper, a novel hybrid evolutionary algorithm based on PSO and SA is proposed for finding the globally best solution of second-order two-dimensional recursive digital filters. The proposed hybrid method integrates the global search capability of PSO with SA to escape from local minima. The experimental results are compared with earlier reported algorithms, namely NN, GA, GENETICA, and TBIA, in addition to different variants of PSO. The comparison indicates that the SA-PSO based methods exhibit better performance in all experiments and provide the best optimum solution during the search. The proposed method also produces the best solution with lower mean and variance. Hence the proposed approach could be an alternative for implementing two-dimensional digital filters in real-time applications.
Table 5: Comparative results of different algorithms in terms of mean and VAR (decimal points restored from the source layout; best VAR values were set in bold in the original).

Algorithm          ρ = 2: Mean / VAR       ρ = 4: Mean / VAR       ρ = 8: Mean / VAR       Time (s)
GA                 4.6137 / 0.6160         0.4300 / 0.0688         0.0066 / 7.08E-06       89.80
ASA                4.2475 / 0.0288         0.2679 / 3.14E-04       0.0043 / 3.14E-06       62.04
PSO-TVIW           4.4858 / 0.2347         0.3044 / 0.0086         0.0039 / 1.60E-06       46.84
PSO-CIW            4.2127 / 0.2778         0.2847 / 0.0022         0.0035 / 1.42E-06       39.02
PSO-RANDIW         4.0660 / 0.3453         0.2549 / 0.0013         0.0037 / 5.50E-07       40.54
SA-PSO (TVIW)      3.8762 / 0.5053         0.3123 / 0.0029         0.0051 / 3.61E-06       35.68
SA-PSO (RANDIW)    2.9059 / 0.0094         0.1845 / 1.58E-04       0.0023 / 7.28E-07       30.07
[Surface plots omitted; axes ω1 and ω2, vertical axis |M| from 0 to 1.]
Figure 5: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 8: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.
[Convergence plots omitted; x-axis: number of iterations (x10^4, 0.5 to 4), y-axis: fitness in log scale, with curves for GA, PSO-RANDIW, SA-PSO-TVIW, and SA-PSO-RANDIW.]
Figure 6: Best fitness profile for GA, PSO-RANDIW, SA-PSO-TVIW, and SA-PSO-RANDIW: (a) ρ = 2 and (b) ρ = 4.
Emphasis on higher-order two-dimensional recursive filter design with guaranteed stability, along with further reduction of computational complexity, is left as future work.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
References

[1] V. M. Mladenov and N. E. Mastorakis, "Design of two-dimensional recursive filters by using neural networks," IEEE Transactions on Neural Networks, vol. 12, no. 3, pp. 585–590, 2001.
[2] N. Mastorakis, I. F. Gonos, and M. N. S. Swamy, "Design of two-dimensional recursive filters using genetic algorithms," IEEE Transactions on Circuits and Systems, vol. 50, no. 5, pp. 634–639, 2003.
[3] I. F. Gonos, L. I. Virirakis, N. E. Mastorakis, and M. N. S. Swamy, "Evolutionary design of 2-dimensional recursive filters via the computer language GENETICA," IEEE Transactions on Circuits and Systems II, vol. 53, no. 4, pp. 254–258, 2006.
[4] J.-T. Tsai, W.-H. Ho, and J.-H. Chou, "Design of two-dimensional recursive filters by using Taguchi-based immune algorithm," IET Signal Processing, vol. 2, no. 2, pp. 110–117, 2008.
[5] D. T. Pham and E. Koc, "Design of a two-dimensional recursive filter using the bees algorithm," International Journal of Automation and Computing, vol. 7, no. 3, pp. 399–402, 2010.
[6] J. Sun, W. Fang, and W. Xu, "A quantum-behaved particle swarm optimization with diversity-guided mutation for the design of two-dimensional IIR digital filters," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 57, no. 2, pp. 141–145, 2010.
[7] F. Zhao, Q. Zhang, D. Yu, X. Chen, and Y. Yang, "A hybrid algorithm based on PSO and simulated annealing and its applications for partner selection in virtual enterprise," in Proceedings of the International Conference on Advances in Intelligent Computing, pp. 380–389, Springer, 2005.
[8] A. Jamili, M. A. Shafia, and R. Tavakkoli-Moghaddam, "A hybrid algorithm based on particle swarm optimization and simulated annealing for a periodic job shop scheduling problem," International Journal of Advanced Manufacturing Technology, vol. 54, no. 1–4, pp. 309–322, 2011.
[9] F. Zhao, Y. Hong, D. Yu, Y. Yang, Q. Zhang, and H. Yi, "A hybrid algorithm based on particle swarm optimization and simulated annealing to holon task allocation for holonic manufacturing system," International Journal of Advanced Manufacturing Technology, vol. 32, no. 9-10, pp. 1021–1032, 2007.
[10] L. Idoumghar, M. Melkemi, R. Schott, and M. I. Aouad, "Hybrid PSO-SA type algorithms for multimodal function optimization and reducing energy consumption in embedded systems," Applied Computational Intelligence and Soft Computing, vol. 2011, Article ID 138078, 12 pages, 2011.
[11] H.-L. Shieh, C.-C. Kuo, and C.-M. Chiang, "Modified particle swarm optimization algorithm with simulated annealing behavior and its numerical verification," Applied Mathematics and Computation, vol. 218, no. 8, pp. 4365–4383, 2011.
[12] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, 1995.
[13] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 3, pp. 101–106, 1999.
[14] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[15] A. Nickabadi, M. M. Ebadzadeh, and R. Safabakhsh, "A novel particle swarm optimization algorithm with adaptive inertia weight," Applied Soft Computing Journal, vol. 11, no. 4, pp. 3658–3670, 2011.
[16] J. C. Bansal, P. K. Singh, M. Saraswat, A. Verma, S. S. Jadon, and A. Abraham, "Inertia weight strategies in particle swarm optimization," in Proceedings of the 3rd World Congress on Nature and Biologically Inspired Computing (NaBIC '11), pp. 633–640, Salamanca, Spain, October 2011.
[17] Y. Feng, G.-F. Teng, A.-X. Wang, and Y.-M. Yao, "Chaotic inertia weight in particle swarm optimization," in Proceedings of the 2nd International Conference on Innovative Computing, Information and Control (ICICIC '07), p. 475, Kumamoto, Japan, September 2007.
[18] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[19] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[20] L. Ingber, "Adaptive simulated annealing (ASA): lessons learned," Journal of Control and Cybernetics, vol. 25, no. 1, pp. 33–54, 1996.
[21] S. Dhabal and P. Venkateswaran, "An efficient nonuniform cosine modulated filter bank design using simulated annealing," Journal of Signal and Information Processing, vol. 3, no. 3, pp. 330–338, 2012.
Figure 3 Amplitude response |119872(1205961 1205962)| of 2D filter for 120588 = 2 (a) SA-PSO-TVIW (b) SA-PSO-RANDIW (c) NN (d) GA (e) GENETICA
and (f) TBIA
and it can be easily verified that the experimental results ofthe proposed method that is SA-PSO-TVIW and SA-PSO-RANDIW are better than all other reported methods [1ndash6]For example the percentage improvement in 120601
2provided
by the proposed SA-PSO-TVIW method in comparison to
Further in Table 4 for different values of 120588 the bestpossible solutions (best) and worst solutions (worst) arelisted for different methods which clearly reveal that the
8 Journal of Optimization
020
2
0
02
04
06
08
1
minus2 minus2
12059611205962
Abs(M
)
(a)
020
2
0
02
04
06
08
1
minus2 minus2
12059611205962
Abs(M
)
(b)
Figure 4 Amplitude response |119872(1205961 1205962)| of 2D filter for 120588 = 4 (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW
Table 4 Comparative results of different algorithms in terms of bestand worst case solution
bestworst solutions obtained by the proposed SA-PSO basedalgorithm are best for all runs Furthermore Table 5 presentsthe statistics for average (mean) and variance (VAR) of theoptimal solutions for different values of 120588 From Table 5 itcan be observed that the proposed SA-PSOalgorithmexhibitsbetter performance with less computational time (ie lessthan half time as compared to GA or ASA) and producesoptimal solution with very lower variance Hence it can beimplemented more efficiently in real-time designing of 2Dfilters
Figures 3(a)ndash3(f) show the calculated amplitude responseof 2D filter for 120588 = 2 The best results obtained by SA-PSO-TVIW and SA-PSO-RANDIW are shown in Figures3(a) and 3(b) respectively For comparison purpose theresults obtained by NN GA GENETICA and TBIA are alsoincluded in Figure 3(c) to Figure 3(f) A closer look at thesefigures reveals that SA-PSO based methods that is SA-PSO-TVIW and SA-PSO RANDIW yield a better approximationto the desired response and stop-band ripple is much lesswith respect to other competitivemethods [1ndash6] Figures 4(a)
and 4(b) show the magnitude response obtained using SA-PSO based methods for 120588 = 4 whereas Figures 5(a) and 5(b)represents the magnitude response for 120588 = 8
Figures 6(a) and 6(b) show the best convergence profilesof SA-PSO based methods for 120588 = 2 and 4 respectivelyAmong the 20 independent executions of four different algo-rithms (the existing two algorithms GA and PSO-RANDIWand the proposed two algorithms SA-PSO-TVIW and SA-PSO-RANDIW) the best performed runs are plotted hereInitially for few thousands FEs PSO-RANDIW convergesfaster than GA SA-PSO-RANDIW and SA-PSO-TVIWAfter certain number of iterations the PSO-RANDIW andGA both exhibit premature convergence and settle to nearoptimal solutions After 30000 FEs SA-PSO based methodsfall below the curves of both GA and PSO because SA helpsparticles to jump out from local optima from the beginningof a search Due to this at the end of search procedure theproposed hybrid method provides best candidate solutionHence the proposed hybrid algorithm produces better opti-mal solutions during the search process of 2D recursive filters
6 Conclusions
In this paper a novel hybrid evolutionary algorithm based onPSO and SA is proposed for finding the global best solutionof second-order two-dimensional recursive digital filtersThe proposed hybrid method integrates the global searchcapability of PSO with SA to escape from local minima Theexperimental results are comparedwith earlier reported algo-rithms namely NN GA GENETICA and TBIA in additionto different variants of PSO The comparison results indicatethat SA-PSO based methods exhibit better performance inall experiments and provide best optimum solution duringsearch mechanism Also the proposed method producesthe best solution with lower mean and variance Hence theproposed approach could be an alternative to implementthe two-dimensional digital filters for real-time applications
Journal of Optimization 9
Table 5 Comparative results of different algorithms in terms of mean and VAR
Algorithms 120588 = 2 120588 = 4 120588 = 8 Time (S)Mean VAR Mean VAR Mean VAR
GA 46137 06160 04300 00688 00066 708119864 minus 06 8980ASA 42475 00288 02679 314119864 minus 04 00043 314119864 minus 06 6204PSO-TVIW 44858 02347 03044 00086 00039 160119864 minus 06 4684PSO-CIW 42127 02778 02847 00022 00035 142119864 minus 06 3902PSO-RANDIW 4066 03453 02549 00013 00037 550119864 minus 07 4054SA-PSO (TVIW) 38762 05053 03123 00029 00051 361119864 minus 06 3568SA-PSO (RANDIW) 29059 00094 01845 158E minus 04 00023 728E minus 07 3007
020
2
0
02
04
06
08
1
minus2 minus2
12059611205962
Abs(M
)
(a)
020
20
02
04
06
08
1
minus2 minus2
1205961
1205962
Abs(M
)
(b)
Figure 5 Amplitude response |119872(1205961 1205962)| of 2D filter for 120588 = 8 (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW
05 1 15 2 25 3 35 4Number of iterations
Fitn
ess i
n lo
g sc
ale
times104
101
GAPSO-RANDIW
SA-PSO-TVIWSA-PSO-RANDIW
(a)
05 1 15 2 25 3 35 4
Fitn
ess i
n lo
g sc
ale
times104
101
100
GAPSO-RANDIW
SA-PSO-TVIWSA-PSO-RANDIW
Number of iterations
(b)
Figure 6 Best fitness profile for GA PSO-RANDIW SA-PSO-TVIW and SA-PSO-RANDIW (a) 120588 = 2 and (b) 120588 = 4
10 Journal of Optimization
Emphasis on higher order two-dimensional recursive filterdesign with guaranteed stability and further reduction incomputational complexity is left as future work
Conflict of Interests
The authors declare that there is no conflict of interestsregarding the publication of this paper
References
[1] V M Mladenov and N E Mastorakis ldquoDesign of two-dimensional recursive filters by using neural networksrdquo IEEETransactions on Neural Networks vol 12 no 3 pp 585ndash5902001
[2] NMastorakis I F Gonos andM N S Swamy ldquoDesign of two-dimensional recursive filters using genetic algorithmsrdquo IEEETransactions on Circuits and Systems vol 50 no 5 pp 634ndash6392003
[3] I F Gonos L I Virirakis N EMastorakis andMN S SwamyldquoEvolutionary design of 2-dimensional recursive filters via thecomputer language GENETICArdquo IEEE Transactions on Circuitsand Systems II vol 53 no 4 pp 254ndash258 2006
[4] J-T Tsai W-H Ho and J-H Chou ldquoDesign of two-dimensional recursive filters by using Taguchi-based immunealgorithmrdquo IET Signal Processing vol 2 no 2 pp 110ndash117 2008
[5] D T Pham and E Koc ldquoDesign of a two-dimensional recur-sive filter using the bees algorithmrdquo International Journal ofAutomation and Computing vol 7 no 3 pp 399ndash402 2010
[6] J SunW Fang andWXu ldquoA quantum-behaved particle swarmoptimization with diversity-guided mutation for the designof two-dimensional IIR digital filtersrdquo IEEE Transactions onCircuits and Systems II Express Briefs vol 57 no 2 pp 141ndash1452010
[7] F Zhao Q Zhang D Yu X Chen and Y Yang ldquoA hybridalgorithm based on PSO and simulated annealing and itsapplications for partner selection in virtual enterpriserdquo inProceedings of the International Conference on Advances inIntelligent Computing pp 380ndash389 Springer 2005
[8] A Jamili M A Shafia and R Tavakkoli-Moghaddam ldquoAhybrid algorithm based on particle swarm optimization andsimulated annealing for a periodic job shop scheduling prob-lemrdquo International Journal of AdvancedManufacturing Technol-ogy vol 54 no 1ndash4 pp 309ndash322 2011
[9] F Zhao Y Hong D Yu Y Yang Q Zhang and H Yi ldquoA hybridalgorithm based on particle swarm optimization and simulatedannealing to holon task allocation for holonic manufacturingsystemrdquo International Journal of AdvancedManufacturing Tech-nology vol 32 no 9-10 pp 1021ndash1032 2007
[10] L IdoumgharMMelkemi R Schott andM I Aouad ldquoHybridPSO-SA type algorithms for multimodal function optimizationand reducing energy consumption in embedded systemsrdquoApplied Computational Intelligence and Soft Computing vol2011 Article ID 138078 12 pages 2011
[11] H-L Shieh C-C Kuo and C-M Chiang ldquoModified parti-cle swarm optimization algorithm with simulated annealingbehavior and its numerical verificationrdquo Applied Mathematicsand Computation vol 218 no 8 pp 4365ndash4383 2011
[12] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks vol 4 pp 1942ndash1948 1995
[13] Y Shi and R C Eberhart ldquoEmpirical study of particle swarmoptimizationrdquo in Proceedings of the IEEE International Congresson Evolutionary Computation vol 3 pp 101ndash106 1999
[14] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002
[15] A Nickabadi M M Ebadzadeh and R Safabakhsh ldquoA novelparticle swarm optimization algorithm with adaptive inertiaweightrdquoApplied Soft Computing Journal vol 11 no 4 pp 3658ndash3670 2011
[16] J C Bansal P K Singh M Saraswat A Verma S S Jadonand A Abraham ldquoInertia weight strategies in particle swarmoptimizationrdquo in Proceedings of the 3rd World Congress onNature and Biologically Inspired Computing (NaBIC 11) pp633ndash640 Salamanca Spain October 2011
[17] Y Feng G-F Teng A-XWang andY-M Yao ldquoChaotic inertiaweight in particle swarmoptimizationrdquo inProceedings of the 2ndInternational Conference on Innovative Computing Informationand Control (ICICIC 07) p 475 Kumamoto Japan September2007
[18] S Kirkpatrick J Gelatt and M P Vecchi ldquoOptimization bysimulated annealingrdquo Science vol 220 no 4598 pp 671ndash6801983
[19] N Metropolis A W Rosenbluth M N Rosenbluth A HTeller and E Teller ldquoEquation of state calculations by fastcomputing machinesrdquo The Journal of Chemical Physics vol 21no 6 pp 1087ndash1092 1953
[20] L Ingber ldquoAdaptive simulated annealing (ASA) lessonslearnedrdquo Journal of Control and Cybernetics vol 25 no 1 pp33ndash54 1996
[21] S Dhabal and P Venkateswaran ldquoAn efficient nonuniformcosinemodulated filter bank design using simulated annealingrdquoJournal of Signal and Information Processing vol 3 no 3 pp330ndash338 2012
bestworst solutions obtained by the proposed SA-PSO basedalgorithm are best for all runs Furthermore Table 5 presentsthe statistics for average (mean) and variance (VAR) of theoptimal solutions for different values of 120588 From Table 5 itcan be observed that the proposed SA-PSOalgorithmexhibitsbetter performance with less computational time (ie lessthan half time as compared to GA or ASA) and producesoptimal solution with very lower variance Hence it can beimplemented more efficiently in real-time designing of 2Dfilters
Figures 3(a)ndash3(f) show the calculated amplitude responseof 2D filter for 120588 = 2 The best results obtained by SA-PSO-TVIW and SA-PSO-RANDIW are shown in Figures3(a) and 3(b) respectively For comparison purpose theresults obtained by NN GA GENETICA and TBIA are alsoincluded in Figure 3(c) to Figure 3(f) A closer look at thesefigures reveals that SA-PSO based methods that is SA-PSO-TVIW and SA-PSO RANDIW yield a better approximationto the desired response and stop-band ripple is much lesswith respect to other competitivemethods [1ndash6] Figures 4(a)
and 4(b) show the magnitude response obtained using SA-PSO based methods for 120588 = 4 whereas Figures 5(a) and 5(b)represents the magnitude response for 120588 = 8
Figures 6(a) and 6(b) show the best convergence profilesof SA-PSO based methods for 120588 = 2 and 4 respectivelyAmong the 20 independent executions of four different algo-rithms (the existing two algorithms GA and PSO-RANDIWand the proposed two algorithms SA-PSO-TVIW and SA-PSO-RANDIW) the best performed runs are plotted hereInitially for few thousands FEs PSO-RANDIW convergesfaster than GA SA-PSO-RANDIW and SA-PSO-TVIWAfter certain number of iterations the PSO-RANDIW andGA both exhibit premature convergence and settle to nearoptimal solutions After 30000 FEs SA-PSO based methodsfall below the curves of both GA and PSO because SA helpsparticles to jump out from local optima from the beginningof a search Due to this at the end of search procedure theproposed hybrid method provides best candidate solutionHence the proposed hybrid algorithm produces better opti-mal solutions during the search process of 2D recursive filters
6 Conclusions
In this paper a novel hybrid evolutionary algorithm based onPSO and SA is proposed for finding the global best solutionof second-order two-dimensional recursive digital filtersThe proposed hybrid method integrates the global searchcapability of PSO with SA to escape from local minima Theexperimental results are comparedwith earlier reported algo-rithms namely NN GA GENETICA and TBIA in additionto different variants of PSO The comparison results indicatethat SA-PSO based methods exhibit better performance inall experiments and provide best optimum solution duringsearch mechanism Also the proposed method producesthe best solution with lower mean and variance Hence theproposed approach could be an alternative to implementthe two-dimensional digital filters for real-time applications
Journal of Optimization 9
Table 5: Comparative results of different algorithms in terms of mean and VAR.

Algorithms        | ρ = 2: Mean, VAR  | ρ = 4: Mean, VAR    | ρ = 8: Mean, VAR    | Time (s)
GA                | 4.6137, 0.6160    | 0.4300, 0.0688      | 0.0066, 7.08E−06    | 89.80
ASA               | 4.2475, 0.0288    | 0.2679, 3.14E−04    | 0.0043, 3.14E−06    | 62.04
PSO-TVIW          | 4.4858, 0.2347    | 0.3044, 0.0086      | 0.0039, 1.60E−06    | 46.84
PSO-CIW           | 4.2127, 0.2778    | 0.2847, 0.0022      | 0.0035, 1.42E−06    | 39.02
PSO-RANDIW        | 4.066, 0.3453     | 0.2549, 0.0013      | 0.0037, 5.50E−07    | 40.54
SA-PSO (TVIW)     | 3.8762, 0.5053    | 0.3123, 0.0029      | 0.0051, 3.61E−06    | 35.68
SA-PSO (RANDIW)   | 2.9059, 0.0094    | 0.1845, 1.58E−04    | 0.0023, 7.28E−07    | 30.07
[Figure 5 appears here: two surface plots over the (ω1, ω2) plane with amplitude axis Abs(M) from 0 to 1.]
Figure 5: Amplitude response |M(ω1, ω2)| of the 2D filter for ρ = 8: (a) SA-PSO-TVIW and (b) SA-PSO-RANDIW.
[Figure 6 appears here: two plots of fitness in log scale versus number of iterations (×10^4), with curves for GA, PSO-RANDIW, SA-PSO-TVIW, and SA-PSO-RANDIW.]
Figure 6: Best fitness profile for GA, PSO-RANDIW, SA-PSO-TVIW, and SA-PSO-RANDIW: (a) ρ = 2 and (b) ρ = 4.
Emphasis on higher-order two-dimensional recursive filter design with guaranteed stability, along with further reduction in computational complexity, is left as future work.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
References
[1] V. M. Mladenov and N. E. Mastorakis, "Design of two-dimensional recursive filters by using neural networks," IEEE Transactions on Neural Networks, vol. 12, no. 3, pp. 585–590, 2001.
[2] N. Mastorakis, I. F. Gonos, and M. N. S. Swamy, "Design of two-dimensional recursive filters using genetic algorithms," IEEE Transactions on Circuits and Systems, vol. 50, no. 5, pp. 634–639, 2003.
[3] I. F. Gonos, L. I. Virirakis, N. E. Mastorakis, and M. N. S. Swamy, "Evolutionary design of 2-dimensional recursive filters via the computer language GENETICA," IEEE Transactions on Circuits and Systems II, vol. 53, no. 4, pp. 254–258, 2006.
[4] J.-T. Tsai, W.-H. Ho, and J.-H. Chou, "Design of two-dimensional recursive filters by using Taguchi-based immune algorithm," IET Signal Processing, vol. 2, no. 2, pp. 110–117, 2008.
[5] D. T. Pham and E. Koc, "Design of a two-dimensional recursive filter using the bees algorithm," International Journal of Automation and Computing, vol. 7, no. 3, pp. 399–402, 2010.
[6] J. Sun, W. Fang, and W. Xu, "A quantum-behaved particle swarm optimization with diversity-guided mutation for the design of two-dimensional IIR digital filters," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 57, no. 2, pp. 141–145, 2010.
[7] F. Zhao, Q. Zhang, D. Yu, X. Chen, and Y. Yang, "A hybrid algorithm based on PSO and simulated annealing and its applications for partner selection in virtual enterprise," in Proceedings of the International Conference on Advances in Intelligent Computing, pp. 380–389, Springer, 2005.
[8] A. Jamili, M. A. Shafia, and R. Tavakkoli-Moghaddam, "A hybrid algorithm based on particle swarm optimization and simulated annealing for a periodic job shop scheduling problem," International Journal of Advanced Manufacturing Technology, vol. 54, no. 1–4, pp. 309–322, 2011.
[9] F. Zhao, Y. Hong, D. Yu, Y. Yang, Q. Zhang, and H. Yi, "A hybrid algorithm based on particle swarm optimization and simulated annealing to holon task allocation for holonic manufacturing system," International Journal of Advanced Manufacturing Technology, vol. 32, no. 9-10, pp. 1021–1032, 2007.
[10] L. Idoumghar, M. Melkemi, R. Schott, and M. I. Aouad, "Hybrid PSO-SA type algorithms for multimodal function optimization and reducing energy consumption in embedded systems," Applied Computational Intelligence and Soft Computing, vol. 2011, Article ID 138078, 12 pages, 2011.
[11] H.-L. Shieh, C.-C. Kuo, and C.-M. Chiang, "Modified particle swarm optimization algorithm with simulated annealing behavior and its numerical verification," Applied Mathematics and Computation, vol. 218, no. 8, pp. 4365–4383, 2011.
[12] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, 1995.
[13] Y. Shi and R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the IEEE International Congress on Evolutionary Computation, vol. 3, pp. 101–106, 1999.
[14] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[15] A. Nickabadi, M. M. Ebadzadeh, and R. Safabakhsh, "A novel particle swarm optimization algorithm with adaptive inertia weight," Applied Soft Computing Journal, vol. 11, no. 4, pp. 3658–3670, 2011.
[16] J. C. Bansal, P. K. Singh, M. Saraswat, A. Verma, S. S. Jadon, and A. Abraham, "Inertia weight strategies in particle swarm optimization," in Proceedings of the 3rd World Congress on Nature and Biologically Inspired Computing (NaBIC '11), pp. 633–640, Salamanca, Spain, October 2011.
[17] Y. Feng, G.-F. Teng, A.-X. Wang, and Y.-M. Yao, "Chaotic inertia weight in particle swarm optimization," in Proceedings of the 2nd International Conference on Innovative Computing, Information and Control (ICICIC '07), p. 475, Kumamoto, Japan, September 2007.
[18] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[19] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[20] L. Ingber, "Adaptive simulated annealing (ASA): lessons learned," Journal of Control and Cybernetics, vol. 25, no. 1, pp. 33–54, 1996.
[21] S. Dhabal and P. Venkateswaran, "An efficient nonuniform cosine modulated filter bank design using simulated annealing," Journal of Signal and Information Processing, vol. 3, no. 3, pp. 330–338, 2012.