This paper introduces a hybrid scheme for multi-objective optimization problems that combines two established heuristic algorithms: the Nelder-Mead simplex algorithm (SA) and the Non-dominated Sorting Genetic Algorithm II (NSGA II). In the proposed scheme, after NSGA II sorts the population for the optimum points, the SA searches the optimum set for the local optimal points and thereby localizes a promising area that is likely to contain the global minimum. This is especially helpful since the SA is an efficient algorithm that can accurately and quickly exploit the promising area for the optimum point. The proposed hybrid scheme is applied to the multi-objective optimization of several benchmark functions, and its performance is compared against those of the classical NSGA II as well as the Multi-Objective Particle Swarm Optimization (MOPSO). The numerical results show that the proposed hybrid scheme provides competitive results that outperform those of the existing algorithms.
Keywords: Heuristic hybrid algorithm; Multi-objective optimization problem; Simplex algorithm; Genetic algorithm
In many engineering applications it is required to find the best approximate solution of multi-objective optimization problems quickly and with good accuracy. Multi-objective optimization problems are very common and important in real-world applications, and many researchers are still devising novel heuristic and mathematical approaches for their solution. Mathematical techniques were more common in earlier times, but since many conventional, and especially deterministic, pure mathematical optimization schemes suffer from stagnation in local optima, there is a demand for innovative heuristic methods [1]. The history of heuristic algorithms as a revolutionary idea for stochastic optimization techniques goes back to 1984 [1,2]. Stochastic optimization techniques are readily applicable to real problems because of their gradient-free mechanism and their avoidance of local optima traps. Stochastic and multi-objective optimization is in use in different fields of study such as mechanical engineering [3], civil engineering [4], chemistry [5,6] and others [7]. Several heuristic algorithms have also been suggested to solve challenging problems in power systems, such as the cuckoo search algorithm [8], Pseudo-Gradient based Particle Swarm Optimization with Constriction Factor (PG-PSOCF) [9] and Hybrid Particle Swarm Optimization (HPSO) [10].
Within the past decade, a number of multi-objective heuristic algorithms have been suggested [11–14] that represent various trade-offs between the objectives, including the optimal solutions of a multi-objective problem [15]. In contrast to classical optimization methods, which convert the multi-objective optimization problem into a single-objective one by emphasizing one particular Pareto optimal solution at a time, multi-objective heuristic algorithms yield a set of optimal solutions called Pareto optimal solutions (PAS). A key point in this regard is to find an accurate approximation of the true Pareto optimal solutions with the highest diversity [16]. PAS help decision makers select certain optimal points from a diverse range of design options based on their experience and design requirements.

Among the most well-known stochastic optimization techniques are the Non-dominated Sorting Genetic Algorithm (NSGA) [17], the Strength-Pareto Evolutionary Algorithm (SPEA) [18,19], the Pareto Archived Evolution Strategy (PAES) [20], Pareto frontier Differential Evolution (PDE) [21], Efficient Ant Colony Optimization [22], the Hybridization of the Fruit Fly Optimization Algorithm and the Firefly Algorithm (FOA-FA) [23], the Non-dominated Sorting Genetic Algorithm II (NSGA II) [24] and Multi-Objective Particle Swarm Optimization (MOPSO) [25]. MOPSO is one of the most popular multi-objective meta-heuristic algorithms and is conceptually similar to PSO. In more recent MOPSO studies [26–28], extra mechanisms such as diversity estimation are utilized to reach better exploration characteristics. In this sense, MOPSO utilizes the concept of Pareto dominance to select leaders from a non-dominated external archive, where the leaders that guide the particles toward the Pareto front are selected from the top portion of the archive at each iteration. A dynamic fitness inheritance technique [29] has also been applied to reduce the computational cost of the optimization process. Another well-known multi-objective meta-heuristic is NSGA II [24], a multi-objective version of the well-regarded Genetic Algorithm (GA) [30,31]. NSGA II was proposed to remove several deficiencies of its first version [17]: the high computational cost of non-dominated sorting, the lack of elitism and the need for a user-specified sharing parameter. In the second version, the first population is generated randomly and each individual is grouped based on the non-dominated sorting method; a new population is then generated by genetic operators such as selection, crossover and mutation. The new population is next sorted again and the new elements of the final population are selected according to their non-domination level. The process of selecting the non-dominated individuals is repeated until a population of the same size as the initial population is reached.
More recently, over the past three years, a number of additional multi-objective optimization algorithms have been proposed. These new studies include the Multi-objective Grey Wolf Optimizer [1], Multi-objective Cat Swarm Optimization [32], Multi-objective Differential Evolution [33], the Multi-objective Gravitational Search Algorithm [34], Multi-objective Ant Colony Optimization [35], the Multi-objective Artificial Bee Colony algorithm [36] and the Multi-objective Teaching-Learning-Based Optimization algorithm [37]. These studies show the ability of meta-heuristics in handling multiple objectives. Although all of these algorithms are able to approximate the true Pareto optimal front of a given problem, none is able to solve all optimization problems accurately. Therefore, it is still very likely that new hybrid algorithms will be proposed that can solve problems not solved before, and/or with better accuracy than the existing techniques.
In this respect, a new hybrid heuristic optimization algorithm for multi-objective problems is proposed in the current work. The proposed algorithm hybridizes the Nelder-Mead simplex algorithm with the Non-dominated Sorting Genetic Algorithm II (NSGA II) to find the best global point. The performance of the new algorithm is demonstrated on some complex benchmark functions.
The organization of this paper is as follows: Section 2 deals with the mathematical representation of multi-objective optimization. Section 3 describes the simplex method. Section 4 is devoted to the non-dominated sorting genetic algorithm II. The proposed hybrid algorithm is introduced in Section 5. Section 6 provides the numerical simulation results. Finally, conclusions are drawn in Section 7.
The problem is to minimize the fitness function of a multi-objective problem. The model is described as follows:

$$\begin{aligned} \min_{X} \quad & F(X) = \left[ f_1(X), f_2(X), \ldots, f_M(X) \right] \\ \text{subject to} \quad & g_j(X) \le 0, \quad j = 1, 2, \ldots, J \\ & h_k(X) = 0, \quad k = 1, 2, \ldots, K \end{aligned}$$

where $X = (x_1, x_2, \ldots, x_N)$ is the N-dimensional vector of decision variables that represents the possible solutions, M is the number of objectives, $f_m$ is the m-th objective function, and $g_j$ and $h_k$ are the inequality and equality constraint functions, respectively.
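For concreteness, the following minimal Python sketch encodes a toy instance of this formulation; the two objective functions here are illustrative assumptions of ours, not problems taken from the paper:

```python
import numpy as np

def evaluate(x):
    """Toy bi-objective problem (M = 2, unconstrained, illustrative only):
    f1 pulls x toward the origin, f2 toward the point (2, ..., 2), so the
    two objectives conflict and a Pareto set exists between them."""
    x = np.asarray(x, dtype=float)
    f1 = np.sum(x ** 2)
    f2 = np.sum((x - 2.0) ** 2)
    return np.array([f1, f2])

print(evaluate([0.5, 1.0, -0.3]))  # objective vector [f1, f2] for an N = 3 solution
```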
To clarify the concept of Pareto optimality, consider two objective vectors $F_a, F_b \in \mathbb{R}^M$. $F_a$ dominates $F_b$ (denoted $F_a \prec F_b$) if and only if

$$f_{a,i} \le f_{b,i} \;\; \forall i \in \{1, \ldots, M\} \quad \text{and} \quad \exists i \in \{1, \ldots, M\} : f_{a,i} < f_{b,i}.$$

The dominance relations for a two-objective problem are indicated by labelled circles in Figure 1. Hence, a vector of decision variables $X_a$ is a non-dominated solution if and only if there is no other solution $X_b$ such that $F(X_b) \prec F(X_a)$.
The non-dominated solution is also known as the Pareto optimal solution. The set of non-dominated solutions of a multi-objective optimization problem is known as the Pareto optimal set (P). The set of objective vectors corresponding to (P) is known as the Pareto Front (PF).
The PF for a two-objective problem is illustrated by the black circles in Figure 1.
A multi-objective optimization algorithm should find as many non-dominated solutions as possible given the different and conflicting objective functions and constraints. The Pareto front corresponding to the non-dominated set should be as close to, and as well distributed over, the true Pareto front as possible. However, it is possible for different solutions to map to the same fitness value in objective space.
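The dominance test and the extraction of a non-dominated set translate directly into code. The following Python sketch (assuming minimization, as in the formulation above; the function names are ours) illustrates both:

```python
import numpy as np

def dominates(fa, fb):
    """True if objective vector fa Pareto-dominates fb:
    no worse in every objective and strictly better in at least one."""
    return np.all(fa <= fb) and np.any(fa < fb)

def non_dominated(F):
    """Return the non-dominated subset of an N x M array of objective vectors."""
    keep = [i for i, fi in enumerate(F)
            if not any(dominates(fj, fi) for j, fj in enumerate(F) if j != i)]
    return F[keep]

F = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(non_dominated(F))  # [3, 3] is dominated by [2, 2] and is removed
```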
The simplex method is a heuristic optimization algorithm. It is a direct search method that does not use the derivatives of the objective function. A simplex is a geometrical object produced by n+1 points $(X_0, \ldots, X_n)$ in an n-dimensional space [38,39]; for example, the simplex in a two-dimensional space is a triangle. The basic idea of the simplex algorithm is to compare the values of the objective function at the n+1 vertices of the simplex and move the simplex gradually toward the optimum point via an iterative process. The following equations can be used to generate the vertices of a regular simplex of size a (an equilateral triangle in two-dimensional space) within the n-dimensional space [39]:

$$X_s = X_0 + p\,u_s + q \sum_{\substack{r=1 \\ r \ne s}}^{n} u_r, \quad s = 1, 2, \ldots, n$$

where $X_0$ is the initial base point, $u_s$ is the unit vector along coordinate axis s, and p and q are defined as follows:

$$p = \frac{a}{n\sqrt{2}}\left(\sqrt{n+1} + n - 1\right), \qquad q = \frac{a}{n\sqrt{2}}\left(\sqrt{n+1} - 1\right)$$
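A short Python sketch of this regular-simplex construction follows; the helper name regular_simplex is ours, and the p, q formulas are the standard construction attributed to [39]:

```python
import numpy as np

def regular_simplex(x0, a):
    """Generate the n+1 vertices of a regular simplex of side a around base point x0."""
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    p = a * (np.sqrt(n + 1) + n - 1) / (n * np.sqrt(2))
    q = a * (np.sqrt(n + 1) - 1) / (n * np.sqrt(2))
    vertices = [x0]
    for s in range(n):
        v = x0 + q * np.ones(n)   # offset q along every axis ...
        v[s] = x0[s] + p          # ... except axis s, which gets offset p
        vertices.append(v)
    return np.array(vertices)

V = regular_simplex([0.0, 0.0], a=1.0)
# For a regular simplex every edge has length a:
print(np.linalg.norm(V[0] - V[1]), np.linalg.norm(V[1] - V[2]))  # 1.0 1.0
```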
Through a sequence of elementary geometric transformations (reflection, contraction and expansion), the simplex, shown in Figure 2, moves toward the optimum point; after each transformation, the current worst vertex is replaced by a better one.
The reflected, expanded and contracted points, $X_r$, $X_e$ and $X_c$ respectively, are given by:

$$X_r = \bar{X} + \alpha\left(\bar{X} - X_h\right), \qquad X_e = \bar{X} + \gamma\left(X_r - \bar{X}\right), \qquad X_c = \bar{X} + \beta\left(X_h - \bar{X}\right)$$

Here, $\bar{X}$ is the centroid of all vertices except the worst one:

$$\bar{X} = \frac{1}{n} \sum_{\substack{i=0 \\ i \ne h}}^{n} X_i$$

where h is the index of the worst point. The parameters α, β and γ are called the reflection, contraction and expansion coefficients, respectively.
Reflection is the operation by which the worst vertex, called the high point, is reflected with respect to the centroid $\bar{X}$. If the reflected point is better than all other points, the method expands the simplex in the reflection direction; otherwise, if it is at least better than the worst, the algorithm performs the reflection again with the new worst point [39]. Contraction is performed when the worst point is at least as good as the reflected point.
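One iteration of these transformations can be sketched in Python as follows. This is a minimal single-objective illustration: the acceptance rules paraphrase the description above, and the coefficient values follow Table 2:

```python
import numpy as np

def simplex_step(vertices, f, alpha=0.9, gamma=2.0, beta=0.005):
    """One reflection/expansion/contraction step; vertices is an (n+1) x n array
    and f is a scalar objective to be minimized."""
    values = np.array([f(v) for v in vertices])
    h = int(np.argmax(values))                               # index of the worst vertex
    centroid = (vertices.sum(axis=0) - vertices[h]) / (len(vertices) - 1)
    xr = centroid + alpha * (centroid - vertices[h])         # reflection
    if f(xr) < values.min():                                 # better than all others
        xe = centroid + gamma * (xr - centroid)              # try expansion
        vertices[h] = xe if f(xe) < f(xr) else xr
    elif f(xr) < values[h]:                                  # at least better than the worst
        vertices[h] = xr
    else:                                                    # worst as good as reflected
        vertices[h] = centroid + beta * (vertices[h] - centroid)  # contraction
    return vertices

V = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
for _ in range(50):
    V = simplex_step(V, lambda x: np.sum((x - 1.0) ** 2))
print(V.mean(axis=0))  # the simplex drifts toward the minimum at (1, 1)
```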
The NSGA II algorithm [24] is one of the most efficient and famous algorithms for multi-objective optimization. It uses fast non-dominated sorting to rank the population into fronts, and a parameter called the crowding distance is calculated within each front. Subsequently, a tournament selection is made between two individuals randomly selected from the parent population: the individual with the lower front number is selected if the two individuals come from different fronts, while the individual with the higher crowding distance is selected if the two individuals are from the same front. Next, the crossover and mutation operators are used to generate a new offspring population. Finally, the parent and offspring populations are combined, a fast non-dominated sorting and crowding distance assignment procedure is used to rank the combined population, and only the best N individuals are selected as the new parent population [40].
NSGA II was proposed on the basis of the non-dominated sorting genetic algorithm (NSGA) [17], and its main advantages compared with NSGA are as follows: (1) it uses a fast non-dominated sorting approach, (2) it requires no sharing parameter and (3) it uses an elitist strategy [40]. NSGA II has good global search ability and well-distributed non-dominated solutions on the Pareto front. Nevertheless, the convergence capability of NSGA II is limited and its elitist strategy may result in the loss of non-dominated solutions from the Pareto front [40].
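The crowding distance computation and the crowded binary tournament described above can be sketched as follows; this is a simplified illustration and the function names are ours:

```python
import numpy as np

def crowding_distance(front):
    """Crowding distance of the solutions in one front (front: N x M objectives).
    Boundary solutions get infinite distance so they are always preserved."""
    n, m = front.shape
    dist = np.zeros(n)
    for k in range(m):
        order = np.argsort(front[:, k])
        dist[order[0]] = dist[order[-1]] = np.inf
        span = front[order[-1], k] - front[order[0], k]
        if span > 0:  # normalized distance between each point's two neighbours
            dist[order[1:-1]] += (front[order[2:], k] - front[order[:-2], k]) / span
    return dist

def crowded_tournament(rank, dist, i, j, rng):
    """Binary tournament: lower front rank wins; ties broken by larger crowding distance."""
    if rank[i] != rank[j]:
        return i if rank[i] < rank[j] else j
    if dist[i] != dist[j]:
        return i if dist[i] > dist[j] else j
    return i if rng.random() < 0.5 else j

rng = np.random.default_rng(0)
front = rng.random((6, 2))                 # six solutions, two objectives
d = crowding_distance(front)
print(crowded_tournament(np.zeros(6, dtype=int), d, 0, 1, rng))
```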
In this section, the proposed hybrid algorithm is introduced as a tool for multi-objective optimization problems. First, the basic structure of the proposed algorithm is presented, followed by a detailed description of its constituent modules.
General layout of the algorithm
The block diagram of Simplex-NSGA II is shown in Figure 3. The parameters of Simplex-NSGA II are set during initialization. Simplex-NSGA II utilizes two heuristic algorithms, the simplex optimization method and NSGA II, to find the optimal solution. In this approach, simplex optimization is utilized for exploration, improving diversity within the search space, while NSGA II is used for exploitation, arranging the members of the Pareto optimal set as in the standard form of a multi-objective optimization algorithm. The proposed Simplex-NSGA II consists of two loops. The outer loop generates a new population using the simplex method at each generation. The inner loop propagates the vertices of the simplex using the reflection, expansion and contraction operations to reshape and move the simplex toward promising regions of the search space, and terminates when the maximum number of iterations (imax) is reached. Table 1 presents the pseudo-code of Simplex-NSGA II; a schematic, runnable sketch of the two-loop structure is also given below.
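The following self-contained Python sketch illustrates the two-loop control flow described above. It is an illustration, not the authors' implementation: the toy objective functions, the specific crossover/mutation operators, the dominance-based acceptance rule for simplex moves and all helper names are assumptions made to keep the example small and runnable (parameter values are scaled down from Table 2):

```python
import numpy as np

rng = np.random.default_rng(0)
NPOP, NGEN, IMAX = 20, 10, 5           # small values for illustration (paper: 50, 100, 60)
ALPHA, GAMMA, BETA = 0.9, 2.0, 0.005   # reflection/expansion/contraction (Table 2)

def F(x):  # toy bi-objective function (not a CEC 2009 benchmark)
    return np.array([np.sum(x ** 2), np.sum((x - 1.0) ** 2)])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def ranks(pop):
    """Non-domination rank, 0 = best (a simple O(N^2) stand-in for fast sorting)."""
    objs = [F(x) for x in pop]
    rank = np.zeros(len(pop), dtype=int)
    remaining, r = set(range(len(pop))), 0
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)}
        for i in front:
            rank[i] = r
        remaining -= front
        r += 1
    return rank

def offspring(pop, rank):
    """Binary tournament on rank, blend crossover and Gaussian mutation (Pmut = 0.01)."""
    children = []
    for _ in range(len(pop)):
        i, j = rng.integers(len(pop), size=2)
        p1 = pop[i] if rank[i] <= rank[j] else pop[j]
        i, j = rng.integers(len(pop), size=2)
        p2 = pop[i] if rank[i] <= rank[j] else pop[j]
        w = rng.random()
        child = w * p1 + (1.0 - w) * p2    # always recombine here; the paper uses Pcrs = 0.8
        if rng.random() < 0.01:
            child = child + rng.normal(scale=0.1, size=child.size)
        children.append(child)
    return children

def simplex_inner(pop):
    """Inner loop: move the worst-ranked individual by reflection, expansion
    or contraction relative to the centroid of the remaining individuals."""
    X = np.array(pop)
    for _ in range(IMAX):
        h = int(np.argmax(ranks(list(X))))
        c = (X.sum(axis=0) - X[h]) / (len(X) - 1)
        xr = c + ALPHA * (c - X[h])                  # reflection
        if dominates(F(xr), F(X[h])):
            xe = c + GAMMA * (xr - c)                # expansion
            X[h] = xe if dominates(F(xe), F(xr)) else xr
        else:
            X[h] = c + BETA * (X[h] - c)             # contraction
    return list(X)

pop = [rng.uniform(-2.0, 2.0, size=2) for _ in range(NPOP)]
for gen in range(NGEN):                              # outer loop (NSGA II side)
    combined = simplex_inner(pop + offspring(pop, ranks(pop)))
    order = np.argsort(ranks(combined), kind="stable")
    pop = [combined[i] for i in order[:NPOP]]        # elitist truncation
print("non-dominated points:", int(np.sum(ranks(pop) == 0)))
```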
Simplex-NSGA II has nine parameters that must be set before the execution of the algorithm. The parameters are listed in Table 2 and are introduced in the following subsections.
Generation of the initial population
In the proposed method, the initial parent population (P0, of size Npop) is generated randomly with a uniform distribution. The population is sorted based on non-domination and each solution is assigned a fitness value equal to its non-domination level. An initial offspring population (Q0), of the same size as the parent population, is created through recombination based on binary tournament selection, with variation induced via mutation operators.
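A minimal sketch of this initialization step follows; Npop = 50 comes from Table 2 and the 30 decision variables from Section 6, while the [0, 1] bounds are an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def initial_population(n_pop, lower, upper):
    """P0: uniform random parent population within the decision-variable bounds."""
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    return rng.uniform(lower, upper, size=(n_pop, lower.size))

P0 = initial_population(50, np.zeros(30), np.ones(30))
print(P0.shape)  # (50, 30)
```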
Population update
From the first generation onward, a combined population ($R_t = P_t \cup Q_t$) is formed and sorted according to the fast non-dominated sorting procedure. The new parent population is then formed by the inner loop: during each iteration of the inner loop, the population is updated using the reflection, expansion and contraction operators for the next generation. In this respect, the individuals generated by genetic operators such as selection, crossover and mutation are utilized as the vertices of the initial simplex used to create the new population.
Stop condition
The inner loop stops when the maximum number of iterations (imax) is reached. The outer loop stops when the maximum number of generations (Ngen) is reached.
In this section, the performance of Simplex-NSGA II is evaluated using ten benchmarks proposed in CEC 2009 [41] and listed in Tables 3 and 4. The results are compared to the well-known Non-dominated Sorting Genetic Algorithm II (NSGA II) [24] and Multi-Objective Particle Swarm Optimization (MOPSO) [28]. As performance metrics [32], the Inverted Generational Distance (IGD) [42], Spacing (SP) [43] and Maximum Spread (MS) [18] criteria are employed to measure convergence, distribution uniformity and coverage, respectively. The mathematical formulation of IGD is as follows:

$$\mathrm{IGD} = \frac{\sqrt{\sum_{i=1}^{N} d_i^2}}{N}$$

where N is the number of true Pareto optimal solutions and $d_i$ indicates the Euclidean distance between the i-th true Pareto optimal solution and the closest obtained Pareto optimal solution in the reference set. It should be noted that IGD = 0 indicates that all members of the non-dominated solutions are on the true Pareto front.
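A direct Python transcription of this metric, assuming the aggregation form given above, might look as follows:

```python
import numpy as np

def igd(true_front, obtained):
    """Inverted generational distance: for each true Pareto point, the Euclidean
    distance to the closest obtained point, aggregated as sqrt(sum d_i^2) / N."""
    d = np.array([np.min(np.linalg.norm(obtained - p, axis=1)) for p in true_front])
    return np.sqrt(np.sum(d ** 2)) / len(true_front)

true_pf = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
approx  = np.array([[0.1, 0.9], [0.6, 0.5], [1.0, 0.1]])
print(igd(true_pf, approx))  # 0 would mean every true point is matched exactly
```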
The mathematical formulations of SP and MS are as follows:

$$\mathrm{SP} = \sqrt{\frac{1}{N-1} \sum_{i=1}^{N} \left(\bar{d} - d_i\right)^2}, \qquad \mathrm{MS} = \sqrt{\sum_{i=1}^{M} \max\big(d(a_i, b_i)\big)}$$

where $\bar{d}$ is the average of all $d_i$, N is the number of Pareto optimal solutions obtained, and

$$d_i = \min_{j \ne i} \sum_{m=1}^{M} \left| f_m(X_i) - f_m(X_j) \right| \quad \text{for all } i, j = 1, \ldots, N$$
Parameter | Description | Value |
---|---|---|
Npop | Population size | 50 |
Ngen | Maximum generation | 100 |
Pcrs | Probability of crossover | 0.80 |
Pmut | Probability of mutation | 0.01 |
a | Side of simplex | 22 |
α | Reflection coefficient | 0.9 |
γ | Expansion coefficient | 2 |
β | Contraction coefficient | 0.005 |
imax | Maximum iteration of simplex method | 60 |
Table 2: Parameters of Simplex-NSGA II.
Table 3: Bi-objective benchmark functions (UF1-UF7); the mathematical formulations are as defined in CEC 2009 [41].
where d is a function to calculate the Euclidean distance, $a_i$ is the maximum value in the i-th objective, $b_i$ is the minimum value in the i-th objective, and M is the number of objectives.
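The two metrics can be sketched in Python as follows. The normalized variant of MS is included because the values of exactly 1.0 reported in Table 7 suggest that the extents are normalized by the true front's extents; that reading is an assumption on our part:

```python
import numpy as np

def spacing(F):
    """Schott's spacing metric over an obtained front F (N x M), with d_i the
    nearest-neighbour distance under the L1 norm, as defined above."""
    D = np.abs(F[:, None, :] - F[None, :, :]).sum(axis=2)  # pairwise sums |f_m^i - f_m^j|
    np.fill_diagonal(D, np.inf)
    d = D.min(axis=1)
    return np.sqrt(np.sum((d.mean() - d) ** 2) / (len(F) - 1))

def maximum_spread(F, true_front=None):
    """MS per the formulation above: extent of the obtained front in each objective
    (a_i = maximum, b_i = minimum), optionally normalized by the true front."""
    extent = F.max(axis=0) - F.min(axis=0)
    if true_front is not None:  # assumed normalized variant; MS = 1 when extents match
        extent = extent / (true_front.max(axis=0) - true_front.min(axis=0))
        return np.sqrt(np.mean(extent ** 2))
    return np.sqrt(np.sum(extent))

front = np.array([[0.0, 1.0], [0.4, 0.6], [1.0, 0.0]])
print(spacing(front), maximum_spread(front, true_front=front))
```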
In addition to the performance metrics, the best set of Pareto optimal solutions obtained by Simplex-NSGA II in both the parameter space and the objective space is shown in Figures 4-6. These figures show the performance of Simplex-NSGA II in comparison with the true Pareto front. For a comparative evaluation, all the algorithms are run 30 times on each test problem and the statistical results of the 30 runs are provided in Tables 5-7. Note that 900,000 function evaluations are used for each algorithm and the number of parameters for each benchmark is 30.
Statistical results of the algorithms for IGD, SP and MS are provided in Tables 5-7, respectively. The IGD results show that the proposed hybrid algorithm (Simplex-NSGA II) provides the best results on all statistical measures for the bi-objective test problems. IGD is the performance metric that reflects the accuracy and convergence of an algorithm; thus, it can be stated that the proposed Simplex-NSGA II algorithm provides superior convergence on the bi-objective benchmarks.
The resulting Pareto optimal solutions of Simplex-NSGA II on each benchmark are also depicted in Figures 4 and 5. These figures demonstrate that the convergence and coverage characteristics (Tables 6 and 7) of Simplex-NSGA II are also better than those of the other algorithms. In addition, the Pareto optimal fronts of Simplex-NSGA II are closer to the true Pareto front and highly distributed along both objectives. The numerical results show that the performance of the Simplex-NSGA II algorithm is more stable than that of MOPSO and NSGA II.
Table 4: Tri-objective benchmark functions (UF8-UF10); the mathematical formulations are as defined in CEC 2009 [41].
Problems UF8, UF9 and UF10 are three-objective problems, which makes them more challenging than the two-objective problems. The results for the first three-objective problem show that Simplex-NSGA II has competitive, and even better, performance in solving these problems as compared to NSGA II.
The Pareto optimal fronts obtained by the proposed algorithm for UF8 are shown in Figure 5. The Pareto optimal solutions have converged toward the true Pareto optimal front despite their low diversity. Therefore, the proposed Simplex-NSGA II algorithm shows high convergence but low coverage for UF8 in the best case.
The UF9 test problem has a disconnected Pareto front, and the pertinent results indicate that the proposed algorithm is effective on this challenging test function. The statistical results, both quantitative and qualitative, indicate that the coverage of the proposed Simplex-NSGA II is very good for all objectives. Finally, the results for the last test problem, UF10, are similar to those of UF8: NSGA II provides better statistical results for the average, standard deviation and worst values of IGD. However, the proposed Simplex-NSGA II provides the best coverage among the 30 independent runs.
To get a better picture of the Pareto optimal sets obtained via the proposed Simplex-NSGA II, the first three variables of the true Pareto optimal set and the determined sets are illustrated in Figure 6, showing very good convergence and coverage characteristics.
IGD | UF1 | | | UF2 | | | UF3 | |
---|---|---|---|---|---|---|---|---|---
 | Simplex-NSGAII | MOPSO | MOEA/D | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II
Average | 0.04695 | 0.137005 | 0.18713 | 0.01359 | 0.06040 | 0.12224 | 0.07361 | 0.31399 | 0.28875
Median | 0.03886 | 0.13174 | 0.18285 | 0.01225 | 0.04835 | 0.1231 | 0.07265 | 0.30802 | 0.28989
STD. Dev. | 0.03319 | 0.04407 | 0.05073 | 0.00289 | 0.02762 | 0.01040 | 0.01108 | 0.04473 | 0.015931
Worst | 0.09301 | 0.22786 | 0.24642 | 0.01845 | 0.13051 | 0.14389 | 0.09167 | 0.37773 | 0.31284
Best | 0.00674 | 0.08990 | 0.12652 | 0.01060 | 0.03699 | 0.10456 | 0.05764 | 0.25648 | 0.26322
 | UF4 | | | UF5 | | | UF6 | |
 | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II
Average | 0.02434 | 0.13604 | 0.06813 | 0.45289 | 2.20238 | 1.29155 | 0.15713 | 0.64752 | 0.688412
Median | 0.02415 | 0.13432 | 0.06846 | 0.45486 | 2.12574 | 1.33731 | 0.13523 | 0.55073 | 0.69811
STD. Dev. | 0.00138 | 0.00739 | 0.00214 | 0.08449 | 0.55304 | 0.13479 | 0.09019 | 0.26612 | 0.05543
Worst | 0.02760 | 0.15189 | 0.07037 | 0.53701 | 3.03836 | 1.46726 | 0.35279 | 1.24281 | 0.74031
Best | 0.02295 | 0.12733 | 0.06466 | 0.25741 | 1.46479 | 1.12326 | 0.07954 | 0.37933 | 0.55255
 | UF7 | | | UF8 | | UF9 | | UF10 |
 | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | NSGA II | Simplex-NSGAII | NSGA II | Simplex-NSGAII | NSGA II
Average | 0.02669 | 0.35395 | 0.45534 | 0.19992 | 0.53681 | 0.31183 | 0.48840 | 1.70149 | 1.63729
Median | 0.02641 | 0.38730 | 0.43776 | 0.20740 | 0.53650 | 0.32593 | 0.41441 | 1.54063 | 1.59123
STD. Dev. | 0.01986 | 0.20442 | 0.18993 | 0.06217 | 0.18267 | 0.08044 | 0.14459 | 0.55163 | 0.29849
Worst | 0.06600 | 0.61512 | 0.67711 | 0.33433 | 0.79647 | 0.48522 | 0.72230 | 3.02835 | 2.16250
Best | 0.00558 | 0.05402 | 0.02910 | 0.12553 | 0.24530 | 0.22217 | 0.33365 | 1.12806 | 1.22028
Table 5: Statistical results for IGD on UF1 to UF10.
SP | UF1 | | | UF2 | | | UF3 | |
---|---|---|---|---|---|---|---|---|---
 | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II
Average | 0.03811 | 0.00898 | 0.00384 | 0.02770 | 0.00829 | 0.00876 | 0.01991 | 0.00699 | 0.28875
Median | 0.03761 | 0.00855 | 0.00382 | 0.02654 | 0.00814 | 0.00859 | 0.02007 | 0.00677 | 0.28939
STD. Dev. | 0.01516 | 0.00247 | 0.00151 | 0.01289 | 0.00168 | 0.00076 | 0.00276 | 0.00170 | 0.01582
Worst | 0.06146 | 0.01464 | 0.00665 | 0.06199 | 0.01245 | 0.01042 | 0.02369 | 0.01007 | 0.31294
Best | 0.01399 | 0.00670 | 0.00213 | 0.01640 | 0.00624 | 0.00797 | 0.01461 | 0.00476 | 0.26342
 | UF4 | | | UF5 | | | UF6 | |
 | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II
Average | 0.02543 | 0.00666 | 0.00730 | 0.12049 | 0.00479 | 0.00278 | 0.06150 | 0.02084 | 0.00640
Median | 0.02386 | 0.00662 | 0.00728 | 0.12758 | 0.00487 | 0.00007 | 0.05482 | 0.01235 | 0.00000
STD. Dev. | 0.00475 | 0.00091 | 0.00059 | 0.02505 | 0.00408 | 0.00553 | 0.04126 | 0.03258 | 0.01227
Worst | 0.03384 | 0.00809 | 0.00836 | 0.15826 | 0.01206 | 0.01615 | 0.13353 | 0.11140 | 0.03010
Best | 0.02055 | 0.00546 | 0.00610 | 0.07457 | 0.00006 | 0.00000 | 0.01222 | 0.00215 | 0.00000
 | UF7 | | | UF8 | | UF9 | | UF10 |
 | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | NSGA II | Simplex-NSGAII | NSGA II | Simplex-NSGAII | NSGA II
Average | 0.02541 | 0.00670 | 0.00540 | 0.23364 | 0.02682 | 0.25104 | 0.02343 | 1.06788 | 0.01994
Median | 0.02379 | 0.00655 | 0.00441 | 0.23699 | 0.02639 | 0.24302 | 0.02349 | 0.89524 | 0.02066
STD. Dev. | 0.01286 | 0.00285 | 0.00301 | 0.03859 | 0.00827 | 0.03695 | 0.00405 | 0.41661 | 0.00348
Worst | 0.05257 | 0.01240 | 0.01168 | 0.27492 | 0.04473 | 0.30891 | 0.03087 | 1.81766 | 0.02665
Best | 0.00351 | 0.00325 | 0.00084 | 0.16389 | 0.01531 | 0.19916 | 0.01716 | 0.67219 | 0.01536
Table 6: Statistical results for SP on UF1 to UF10.
In summary, the numerical results demonstrate that the proposed Simplex-NSGA II has better performance on the bi-objective benchmark functions with regard to convergence and coverage. However, for some tri-objective benchmark functions, even though the proposed algorithm shows high convergence, it has low coverage. Thus, the key advantages of the proposed Simplex-NSGA II algorithm compared to NSGA II and MOPSO are its high convergence and coverage characteristics. In addition, the results of the proposed Simplex-NSGA II were in almost all cases better than those of MOPSO. The proposed Simplex-NSGA II outperforms MOPSO because it utilizes the simplex operators to generate the new population from the non-dominated solutions. Moreover, MOPSO updates gBest in each iteration, so all particles are attracted by the same (or a similar set of) gBest(s) in each iteration [44], while the individuals of Simplex-NSGA II are updated in each generation, which helps the search agents explore the search space more extensively.
MS | UF1 | | | UF2 | | | UF3 | |
---|---|---|---|---|---|---|---|---|---
 | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II
Average | 0.96922 | 0.64538 | 0.51774 | 0.89222 | 0.91205 | 0.87201 | 0.99659 | 0.61030 | 0.23994
Median | 0.91629 | 0.66322 | 0.59541 | 0.88328 | 0.91636 | 0.87437 | 1.00000 | 0.61612 | 0.22943
STD. Dev. | 0.13919 | 0.19292 | 0.16609 | 0.05448 | 0.02560 | 0.00560 | 0.08165 | 0.10575 | 0.12129
Worst | 0.79407 | 0.26592 | 0.31478 | 0.82385 | 0.86654 | 0.85986 | 0.87763 | 0.38172 | 0.08975
Best | 0.99536 | 0.95226 | 0.74128 | 0.93883 | 0.95301 | 0.87794 | 1.00000 | 0.77145 | 0.47863
 | UF4 | | | UF5 | | | UF6 | |
 | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | MOPSO | NSGA II
Average | 0.95759 | 0.81275 | 0.88320 | 0.83746 | 0.27926 | 0.29215 | 0.98393 | 0.27435 | 0.09677
Median | 0.96323 | 0.81321 | 0.88131 | 0.83556 | 0.28654 | 0.29165 | 1.00000 | 0.22917 | 0.00005
STD. Dev. | 0.01762 | 0.01367 | 0.01812 | 0.06986 | 0.09575 | 0.03470 | 0.21202 | 0.11285 | 0.20715
Worst | 0.92252 | 0.79441 | 0.85324 | 0.69914 | 0.15574 | 0.23834 | 0.79702 | 0.15436 | 0.00000
Best | 0.97665 | 0.83449 | 0.91394 | 0.95996 | 0.43827 | 0.34380 | 1.00000 | 0.52516 | 0.59484
 | UF7 | | | UF8 | | UF9 | | UF10 |
 | Simplex-NSGAII | MOPSO | NSGA II | Simplex-NSGAII | NSGA II | Simplex-NSGAII | NSGA II | Simplex-NSGAII | NSGA II
Average | 0.98737 | 0.42928 | 0.56317 | 0.35198 | 0.50810 | 0.87641 | 0.19160 | 0.27643 | 0.13025
Median | 1.00000 | 0.29520 | 0.63267 | 0.34883 | 0.50601 | 0.94795 | 0.16566 | 0.14774 | 0.10933
STD. Dev. | 0.09838 | 0.27553 | 0.24209 | 0.44928 | 0.16136 | 0.39021 | 0.16351 | 0.38569 | 0.06243
Worst | 0.80289 | 0.14458 | 0.14963 | 0.27879 | 0.22723 | 0.30638 | 0.06771 | 0.03183 | 0.06479
Best | 1.00000 | 0.87714 | 0.99152 | 0.73412 | 0.71476 | 0.96641 | 0.64242 | 0.93133 | 0.25414
Table 7: Statistical results for MS on UF1 to UF10.
A new hybrid heuristic optimization algorithm has been proposed for multi-objective optimization problems. The proposed hybrid algorithm formulates the optimization problem via a stochastic search method. The new hybrid algorithm, based on the Nelder-Mead simplex and the Non-dominated Sorting Genetic Algorithm II (NSGA II), was developed to find the global optimum. The proposed hybrid scheme, denoted Simplex-NSGA II, was evaluated using a series of bi-objective and tri-objective benchmark test functions. The results were compared with those of other multi-objective optimization methods, namely MOPSO and NSGA II. The numerical results demonstrate that the proposed hybrid algorithm has greater performance in solving multi-objective optimization problems, with much better convergence and coverage capability.