
Research Article - (2018) Volume 7, Issue 3


**Keywords:** Optimization; Algorithm; Intelligence; Distribution

In science and engineering fields, many complex optimization problems involve constraint conditions [1-3]. That is to say, the optimal solutions of such practical problems are restricted by the problems' constraint conditions. For example, valves in chemical process control need a maximum and a minimum displacement, and for safety or other operational reasons it is usual to impose limits on allowable temperatures, levels, and pressures [4]. When solving these optimization problems, it is difficult to deal with the constraints and find the optimal solution of the nonlinear constrained problem.

Most often, the constraint-handling techniques used in classical optimization methods can be classified into two types. The first is generic methods that do not exploit the mathematical structure of the constraints, such as the penalty function method [5], the Lagrange multiplier method [6], and intelligent heuristic search algorithms, e.g., the enhanced grey wolf optimization algorithm [7], the surrogate-assisted evolutionary optimization method [8], the modified butterfly optimization algorithm [9], the chaotic grey wolf optimization algorithm [10], and the enhanced grey wolf optimisation algorithm [11]. The second is special methods designed for problems with specific types of constraints, such as the cutting plane method [12], the gradient projection method [13], the quasi-Newton method [14], and the steepest descent method [15].

As far as generic methods are concerned, since these algorithms do not exploit problem structure, their performance can be unsatisfactory in some cases. Special methods, on the other hand, are applicable only to optimization problems with a convex search region, or to problems whose objective and constraint functions are differentiable. In fact, among the generic methods, the most popular approach for handling the constraints of an optimization problem is the penalty function method, which involves a number of penalty parameters that must be set correctly for the algorithm to reach the optimal solution; this sensitivity to the penalty parameters has led many researchers to devise sophisticated penalty function methods. These methods can be divided into three categories:

a) Multi-level penalty functions [16]

b) Dynamic penalty functions based on adaptive and coevolutionary penalty approaches [17]; and

c) Hybrid penalty functions that combine the advantages of evolutionary computation [18,19].
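As a concrete illustration of the penalty idea that these generic methods build on, the sketch below implements a simple static penalty. The function names and the quadratic penalty form are our own choices for illustration, not a method from this paper:

```python
# Illustrative static-penalty sketch: the constrained problem is replaced by
# an unconstrained one whose objective adds a penalty proportional to the
# total constraint violation. The weight mu must be tuned by hand, which is
# exactly the sensitivity discussed above.

def penalized_objective(f, inequality_constraints, x, mu=1e3):
    """Return f(x) plus a static penalty mu * (sum of squared violations).

    `f` is the objective; `inequality_constraints` is a list of functions
    g_i with the convention g_i(x) <= 0 for feasible x.
    """
    violation = sum(max(0.0, g(x)) ** 2 for g in inequality_constraints)
    return f(x) + mu * violation

# Example: minimize x^2 subject to x >= 1 (i.e., g(x) = 1 - x <= 0).
f = lambda x: x * x
g = lambda x: 1.0 - x
assert penalized_objective(f, [g], 2.0) == 4.0      # feasible: no penalty
assert penalized_objective(f, [g], 0.0) == 1000.0   # infeasible: penalized
```

A feasible point pays no penalty, while an infeasible point is pushed back toward the feasible region as `mu` grows.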

Evolutionary algorithms are generally inspired by the modelling of natural processes, especially human evolution; genetic algorithms lie in this category. By contrast, the Imperialist Competitive Algorithm (ICA), proposed by Atashpaz-Gargari et al. [20], uses the socio-political evolution of humans as a source of inspiration for a strong optimization strategy. ICA has been widely and successfully applied to many real-world optimization problems in recent years: Mahdi et al. [21] introduced an imperialist competitive algorithm for solving systems of nonlinear equations; Mohammadi et al. [22] designed a multi-objective imperialist competitive algorithm to solve a capacitated hub covering location problem; and Shokrollahpour et al. [23] proposed a novel imperialist competitive algorithm for bi-criteria scheduling of the assembly flow-shop problem.

Moreover, finding a balance between exploration and exploitation is very important for an excellent generic algorithm. Joshi et al. [11] proposed an Enhanced Grey Wolf Optimization (EGWO) algorithm with a better hunting mechanism, which focuses on the proper balance between exploration and exploitation, leading to optimal performance of the algorithm and the generation of promising candidate solutions; Long et al. [24] introduced a nonlinear control parameter strategy and a new position-update equation in order to balance the exploration and exploitation of the algorithm.

In this paper, we propose a new multi-objective optimization method based on ICA to solve the nonlinear constrained optimization problem. Firstly, the nonlinear constrained optimization problem concerned is transformed into a bi-objective unconstrained optimization problem, so that no penalty function or other mechanism for handling the constraints needs to be introduced. Then, in order to improve the diversity of the evolution country swarm and help it approach or enter the feasible region, three different methods of colonies moving toward their relevant imperialist are presented. Also, a new operator for exchanging the positions of the imperialist and a colony is given, playing the role of the recombination operator in a genetic algorithm, to achieve a better balance between the exploration and exploitation behaviors of the proposed algorithm. Moreover, a new local search method is integrated in order to increase the convergence speed of the proposed algorithm. Finally, the new method is tested on 13 well-known NP-hard nonlinear constrained optimization functions, and the experimental results suggest that the proposed algorithm is robust, efficient, and generic when solving nonlinear constrained optimization problems. Compared with some other state-of-the-art algorithms, the proposed algorithm has a remarkable advantage in terms of the best, mean, and worst objective function values and the standard deviation, i.e., it is indicated that the proposed algorithm can effectively solve the nonlinear constrained optimization problem. The paper is organized as follows. In section 2, the related concepts of the nonlinear constrained optimization problem are given. The main steps of the proposed imperialist competitive algorithm for solving the nonlinear constrained optimization problem are designed in section 3. The flowchart of the proposed algorithm is described in section 4. After the simulation results are presented in section 5, the conclusion and acknowledgment are given in sections 6 and 7, respectively.

Without loss of generality, the general Nonlinear Constrained Optimization Problem (NCOP) that we are interested in can be formulated as

min *f(x)*, s.t. *g _{i}(x)* ≤ 0, *i* = 1, 2, …, q; *h _{j}(x)* = 0, *j* = q + 1, …, m; *x* Є *[L,U]* (1)

where *x* = (*x _{1}*, *x _{2}*, …, *x _{n}*) is the *n*-dimension decision vector, *g _{i}(x)* ≤ 0 is the i-th inequality constraint for *i* = 1, 2, …, q, and *h _{j}(x)* = 0 is the j-th equality constraint for *j* = q + 1, …, m (in both cases, constraints could be linear or non-linear), and *D* = {*x* Є *[L,U]* | *g _{i}(x)* ≤ 0, *h _{j}(x)* = 0} is the feasible region.

**Definition 1**

For every point *x* Є *D*, if *f(x*^{*}*)* ≤ *f(x)* holds, then the point *x*^{*} Є *D* is called the optimal solution and *f(x*^{*}*)* is the optimal value of problem (1). Let

*f _{1}(x)* = *f(x)*, *f _{2}(x)* = Σ_{i=1}^{q} max{0, *g _{i}(x)*} + Σ_{j=q+1}^{m} |*h _{j}(x)*|

where *f _{1}(x)* is the objective function of problem (1) and *f _{2}(x)* is the optimization function defined by the constraint conditions of problem (1), measuring the degree of constraint violation. Then we can transform the nonlinear constrained optimization problem (1) into the following bi-objective optimization problem:

min *F(x)* = (*f _{1}(x)*, *f _{2}(x)*), *x* Є *[L,U]* (4)

For the bi-objective optimization problem (4), minimizing the first objective function *f _{1}(x)* means finding a point that becomes the optimal solution of problem (1), while minimizing the second objective function *f _{2}(x)* means driving the point into the feasible region, since *f _{2}(x)* = 0 if and only if *x* Є *D*.
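The transformation above can be sketched in a few lines. This is a minimal sketch under the assumption that *f _{2}* is the usual total-constraint-violation measure (sum of positive inequality violations plus absolute equality violations); the helper name `make_bi_objective` is ours:

```python
# Bi-objective reformulation sketch: f1 is the original objective, f2 the
# total constraint violation, which is zero exactly on the feasible region.

def make_bi_objective(f, ineqs, eqs):
    """Return F(x) = (f1(x), f2(x)) for the bi-objective problem (4)."""
    def F(x):
        f1 = f(x)
        f2 = sum(max(0.0, g(x)) for g in ineqs) + sum(abs(h(x)) for h in eqs)
        return (f1, f2)
    return F

# Toy problem: min x^2 s.t. x >= 1, i.e., g(x) = 1 - x <= 0.
F = make_bi_objective(lambda x: x * x, [lambda x: 1.0 - x], [])
assert F(2.0) == (4.0, 0.0)   # feasible point: f2 = 0
assert F(0.5) == (0.25, 0.5)  # infeasible point: f2 > 0
```

No penalty weight appears anywhere: feasibility is a separate objective rather than a term folded into the cost.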

**Definition 2**

A two-dimension vector *u* = (*u _{1}*, *u _{2}*) is said to weakly dominate another two-dimension vector *v* = (*v _{1}*, *v _{2}*) if *u _{i}* < *v _{i}* holds for *i* = 1, 2.

**Definition 3 **

A point *x* Є *[L,U]* is said to be a weakly Pareto optimal solution for problem (4) if there does not exist another point *y* Є *[L,U]* such that *F(y)* weakly dominates *F(x)*. The set of all the weakly Pareto optimal solutions is called the weakly Pareto optimal set, and the set of all the weakly Pareto optimal solutions' objective vectors is called the weakly Pareto front. Suppose that *E*_{w}*(F,x)* is the weakly Pareto optimal solution set of problem (4). Then, the optimal solution of problem (1) and the weakly Pareto optimal solutions of problem (4) have the following relation:
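Definitions 2 and 3 can be checked mechanically. A minimal sketch, assuming the convention above that weak dominance means strictly smaller in every component (function names are ours):

```python
# Weak dominance and weak Pareto filtering for bi-objective vectors (f1, f2).

def weakly_dominates(u, v):
    """u weakly dominates v if every component of u is strictly smaller."""
    return all(ui < vi for ui, vi in zip(u, v))

def weak_pareto_front(points):
    """Keep the points that no other point weakly dominates (Definition 3)."""
    return [p for p in points
            if not any(weakly_dominates(q, p) for q in points if q != p)]

pts = [(1.0, 3.0), (2.0, 2.0), (3.0, 3.0), (0.5, 4.0)]
# (3, 3) is weakly dominated by (2, 2); the other three survive.
assert weak_pareto_front(pts) == [(1.0, 3.0), (2.0, 2.0), (0.5, 4.0)]
```

This filter is what maintains the external set of weakly Pareto optimal countries in the algorithm described later.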

**Theorem 1**

A solution x* is the optimal solution of problem (1) if and only if x* Є *D* ∩ *E*_{w}*(F,x)* and *f _{1}(x*)* = min{*f _{1}(x)* | *x* Є *D* ∩ *E*_{w}*(F,x)*}.

Proof: Sufficiency is obvious. The necessity is proved as follows. Since x* is the optimal solution of problem (1), then x* Є *D* and *f _{1}(x*)* ≤ *f _{1}(x)* for every *x* Є *D*. Furthermore, we have *f _{2}(x*)* = 0 ≤ *f _{2}(x)* for every *x* Є *[L,U]*.

If x* ∉ *E*_{w}*(F,x)*, then there exists at least one other solution *y* Є *[L,U]* that makes *f _{i}(y)* < *f _{i}(x*)* for *i* = 1, 2 hold, i.e., *f _{2}(y)* < *f _{2}(x*)* = 0; this is a contradiction to the definition of the function *f _{2}*, for *f _{2}(x)* ≥ 0 holds for every *x*. So x* Є *D* ∩ *E*_{w}*(F,x)*, and since x* minimizes *f _{1}* over *D*, it also minimizes *f _{1}* over *D* ∩ *E*_{w}*(F,x)*.

The conclusion of the Theorem 1 demonstrates that the optimal solution of problem (1) can be obtained from the intersection of the feasible region of problem (1) and the weakly Pareto optimal solution set of problem (4), and the optimal solution makes the first objective function minimum.

In order to solve the Nonlinear Constrained Optimization Problem (NCOP) posed in section 2, a new imperialist competitive algorithm is designed in sections 3 and 4. Firstly, we briefly introduce the idea of the Imperialist Competitive Algorithm (ICA), proposed by Atashpaz-Gargari et al. [20]. ICA, similar to evolutionary algorithms, the particle swarm algorithm, and so on, is a kind of swarm intelligence algorithm. ICA is inspired by imperialistic competition. All the countries are divided into two types: imperialist states and colonies. Imperialistic competition is the main evolution operator and, ideally, causes the colonies to approach the global optimal solution. Based on this idea, we design the main operators of the proposed algorithm as follows.

**The creation of initial empires**

During the operation of ICA, the initial evolution country swarm should be generated first. Among the initial country swarm, some of the best countries are selected to form the initial imperialists, and the rest of the countries are divided among the initial imperialists as colonies. In this section, we randomly generate the initial countries in the search space *[L,U]* and define the cost of each country by formula (5), where *f _{1}(x)* is the first objective function and *f _{2}(x)* is the second objective function of problem (4).

**Step 1:** The normalized power of each imperialist is defined by

*p _{j}* = | *C _{j}* / Σ_{i=1}^{N} *C _{i}* |

where *p _{j}* is the normalized power of the j-th imperialist, *C _{j}* = *c _{j}* − max_{i}{*c _{i}*} is the normalized cost of the j-th imperialist, and *c _{j}* is the cost of the j-th imperialist, for *j* = 1, 2, …, N.

**Step 2:** Generate the initial number of the colonies belonging to each empire based on the following formula:

*N.C. _{j}* = round(*p _{j}* · *N*_{col})

where *N.C. _{j}* is the number of initial colonies of the j-th empire and *N*_{col} is the total number of all initial colonies.

**Step 3: **Select *N.C. _{j}* colonies according to the roulette wheel selection and join them to the j-th imperialist. These colonies along with the imperialist together will form the j-th empire (denote empire j,j=1,2,…,N).
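Steps 1-2 above can be sketched in a few lines. This is a minimal sketch under our assumption of the standard ICA normalization from [20] (C_j = c_j − max_i c_i, p_j = |C_j / Σ_i C_i|); the function and variable names are ours:

```python
# Empire initialization sketch: the N best countries become imperialists,
# and the remaining countries are shared among them in proportion to the
# normalized imperialist power.

def initial_colony_counts(imp_costs, n_col):
    """Given imperialist costs (lower is better), return the number of
    colonies N.C._j assigned to each initial empire."""
    worst = max(imp_costs)
    normalized = [c - worst for c in imp_costs]      # all values <= 0
    total = sum(normalized)
    powers = [abs(c / total) for c in normalized]    # normalized powers, sum to 1
    counts = [round(p * n_col) for p in powers]
    counts[0] += n_col - sum(counts)                 # absorb rounding drift
    return counts

counts = initial_colony_counts([1.0, 2.0, 4.0], n_col=10)
assert sum(counts) == 10
assert counts[0] >= counts[1] >= counts[2]  # cheaper imperialists get more colonies
```

The actual assignment of which colony goes to which empire then follows the roulette wheel selection of Step 3.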

In Atashpaz-Gargari et al. [20], each colony moves toward its imperialist by x units in the direction of the vector from the colony to the imperialist, where x is a random variable with uniform distribution, i.e.,

x ~ U(0, β × d)

where β > 1 and d is the distance between the colony and the imperialist; the parameter β causes the colony to get closer to the imperialist from both sides. However, for a constrained optimization problem, the designed algorithm must not only make infeasible solutions approach the feasible region and satisfy the constraint conditions, but also minimize the objective function. Based on this, we propose a new method of colonies moving toward their relevant imperialist as follows. Suppose that we move colony *j* toward imperialist *i* (*i* = 1, 2, …, N); then:
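For reference, the original ICA assimilation step described above can be sketched as follows (a sketch of the scheme in [20]; the function name is ours, and the step is sampled as a fraction of the colony-imperialist vector, which is equivalent to x ~ U(0, β × d)):

```python
import random

# Original ICA assimilation: the colony moves a uniformly random distance
# along the vector toward its imperialist; beta > 1 lets it overshoot to
# the far side of the imperialist.

def move_colony(colony, imperialist, beta=2.0):
    direction = [imp - col for col, imp in zip(colony, imperialist)]
    x = random.uniform(0.0, beta)  # step length as a fraction of the distance
    return [col + x * di for col, di in zip(colony, direction)]

random.seed(0)
new_pos = move_colony([0.0, 0.0], [1.0, 1.0], beta=2.0)
# The new position lies on the (possibly extended) segment toward the imperialist.
assert new_pos[0] == new_pos[1]
assert 0.0 <= new_pos[0] <= 2.0
```

The three cases below replace this single rule with feasibility-aware moves.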

**Case 1**

If both imperialist *i* and colony *j* are feasible, i.e., imperialist *i* and colony *j* Є *D*, we generate a circle whose diameter is the straight-line segment d joining imperialist *i* and colony *j*; the new position (denoted as colony *k*) of the j-th colony after moving toward its relevant imperialist is shown in a gray colour in the corresponding figure.

Parameters *r* and *A* cause colony *j* to get closer to imperialist *i* from its neighbourhood rather than move far away from it.

**Case 2**

If both imperialist *i* and colony *j* are infeasible, i.e., imperialist *i* and colony *j* Є *[L,U]*\*D*, we randomly select a colony *s* from the feasible region *D* and compute the barycenter of the three countries colony *j*, imperialist *i*, and colony *s*; this barycenter (denoted by colony *k* in **Figure 3**) is regarded as the new position to which colony *j* moves toward imperialist *i*. Using this method, the colony not only moves toward the imperialist but also approaches the feasible region.

**Case 3**

If exactly one of colony *j* and imperialist *i* is feasible, say colony *j* Є *D* and imperialist *i* Є *[L,U]*\*D* (or vice versa), then we generate a circle whose centre is colony *j* and whose radius is the length L of the straight-line segment joining colony *j* and imperialist *i*; the new position colony *k* of colony *j* after moving toward imperialist *i* is shown in a gray colour in **Figure 4**, where B is a random number with uniform distribution.

Here, a parameter adjusts the deviation from the direction of the vector from colony *j* to imperialist *i*, and the parameter τ is arbitrary. In most of our implementations, the chosen values give good convergence to the global minimum and keep the feasible colony *j* from moving far away from the feasible region.

Based on the method of colonies moving toward their relevant imperialist in subsection 3.2, the operator of exchanging the positions of the imperialist and the colony can be described as follows:

(1) If both colony *j* and imperialist *i* are feasible and the colony has a lower cost than the imperialist does, i.e., *f _{1}(colony j)* < *f _{1}(imperialist i)*, then we use colony *j* to replace imperialist *i* and form the new imperialist; otherwise, imperialist *i* is kept.

(2) If both colony *j* and imperialist *i* are infeasible, then we always choose the one with the smaller cost as the new imperialist, i.e., if *f _{2}(colony j)* > *f _{2}(imperialist i)*, keep imperialist *i* invariable; otherwise, if *f _{2}(colony j)* < *f _{2}(imperialist i)*, use colony *j* to replace imperialist *i* and form the new imperialist.

(3) If exactly one of colony *j* and imperialist *i* is feasible, we always use the feasible country as the new imperialist, in order to make the evolution country swarm approach the feasible region and converge quickly to the minimum.
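The three exchange rules above can be sketched as a single comparison function (a minimal sketch; the function names and the way feasibility is tested via *f _{2}* = 0 are our assumptions):

```python
# Imperialist/colony exchange rule: feasibility status decides which
# objective is compared, and a feasible country always beats an infeasible one.

def better(a, b, f1, f2, feasible):
    """Return whichever of a, b should act as imperialist."""
    fa, fb = feasible(a), feasible(b)
    if fa and fb:
        return a if f1(a) <= f1(b) else b   # both feasible: lower objective wins
    if not fa and not fb:
        return a if f2(a) <= f2(b) else b   # both infeasible: lower violation wins
    return a if fa else b                   # mixed: the feasible country wins

# Toy problem: min x^2 s.t. x >= 1, so f2(x) = max(0, 1 - x).
f1 = lambda x: x * x
f2 = lambda x: max(0.0, 1.0 - x)
feasible = lambda x: f2(x) == 0.0
assert better(1.5, 2.0, f1, f2, feasible) == 1.5  # both feasible, lower f1
assert better(0.2, 0.8, f1, f2, feasible) == 0.8  # both infeasible, lower f2
assert better(0.2, 1.5, f1, f2, feasible) == 1.5  # feasible country wins
```

These are the same comparisons the algorithm applies after every colony move to decide whether the imperialist and colony should swap roles.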

In order to accelerate the convergence speed, we add the following local search operator to the proposed algorithm. Suppose imperialist _{1}, imperialist _{2}, …, imperialist _{k} are the *k* imperialists obtained by the proposed algorithm in the current evolution country swarm, where *k* is the number of imperialists; *k* decreases as the imperialist competition proceeds.

Compute the approximate value of the gradient for each imperialist, where *e _{j}* is an *n*-dimension unit vector whose j-th component is 1 and whose remaining components are 0.

**Case 1:** Carry out a one-dimensional search along the descent direction; if the resulting point satisfies the following formula, then use it to replace imperialist *i* and form the new imperialist.

As mentioned in reference [20], during the evolution of countries, all the empires try to possess the colonies of the other empires and control them. As a result, the power of the weaker empires gradually decreases while the power of the more powerful empires increases. This process can be described as follows:

• Compute the total power of the j-th empire, depending on its own imperialist and colonies, as follows:

*T.C. _{j}* = cost(imperialist _{j}) + ξ · mean{cost(colonies of empire j)}

where ξ is a positive coefficient, imperialist _{j} is the imperialist of the j-th empire, *N.C. _{j}* is the number of colonies of the j-th empire, and cost(·) is the normalized cost function defined in formula (5). Use the following formula (16) to compute the possession probability:

*P _{j}* = | *N.T.C. _{j}* / Σ_{i=1}^{N} *N.T.C. _{i}* | (16)

where *N.T.C. _{j}* = *T.C. _{j}* − max_{i}{*T.C. _{i}*} is the normalized total power and *T.C. _{j}* is the total power of the j-th empire, respectively. Randomly select some colonies from the current evolution country swarm (e.g., select only one, the selected colony), let *P* = (*P _{1}*, *P _{2}*, …, *P _{N}*), and generate an N-dimension vector *V* with uniformly distributed elements, i.e., *V* = (*v _{1}*, *v _{2}*, …, *v _{N}*) with *v _{j}* ~ U(0,1); let *θ* = *P* − *V*.

Then we divide the selected colony into the index-th empire, where index is the subscript of the maximum component of vector *θ*.
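The possession step can be sketched as follows, under our assumption of the standard ICA normalization (*N.T.C. _{j}* = *T.C. _{j}* − max_{i} *T.C. _{i}*) and with function names of our own:

```python
import random

# Imperialistic competition sketch: the contested colony goes to the empire
# whose theta_j = P_j - v_j is largest, where v_j ~ U(0, 1). Total powers
# are costs here, so lower means stronger.

def competing_empire(total_powers):
    worst = max(total_powers)
    ntc = [t - worst for t in total_powers]          # normalized total costs, <= 0
    denom = sum(ntc)
    p = [abs(n / denom) if denom else 1.0 / len(ntc) for n in ntc]
    v = [random.random() for _ in p]
    theta = [pj - vj for pj, vj in zip(p, v)]
    return theta.index(max(theta))                   # index of the winning empire

random.seed(1)
winners = [competing_empire([10.0, 20.0, 30.0]) for _ in range(200)]
# The strongest (lowest-cost) empire should win the colony most often.
assert winners.count(0) > winners.count(2)
```

Because *θ* subtracts fresh uniform noise from the possession probabilities, weak empires occasionally win a colony, which keeps the competition stochastic rather than greedy.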

Through the imperialistic competition, powerless empires collapse, and the number of their colonies becomes smaller and smaller. When an empire loses all of its colonies, we consider that the empire has collapsed, and its imperialist becomes one of the remaining colonies.

The main difference between the proposed Multi-objective Imperialist Competitive evolutionary Algorithm (denoted as MICA) and the original imperialist competitive algorithm is the method of colonies moving toward their relevant imperialist, illustrated in **Figures 2 and 4**. In order to make the evolution country swarm approach or enter the feasible region, three different methods of colonies moving toward their relevant imperialist are given. In addition, a new operator for exchanging the positions of the imperialist and the colony is designed to achieve a better balance between the exploration and exploitation behaviours of MICA, and a new local search method is embedded in order to increase the convergence speed of the proposed algorithm. The flowchart of the proposed algorithm is shown as follows:

**Step 1:** Choose the proper parameters and the initial country size; randomly generate the initial country swarm in the search space *[L,U]* and denote it as the set *pop(0)*; find the weakly Pareto optimal countries (i.e., weakly Pareto optimal solutions) in *pop(0)* according to Definition 3 and denote them as an external set *C(0)*; let *t* = 0.

**Step 2:** Generate the initial empires, i.e., select the N most powerful countries from *pop(t)* and divide the rest of the countries among them.

**Step 3:** Make each colony move toward its relevant imperialist based on the method of colonies moving toward their relevant imperialist in subsection 3.2, and exchange the positions of the imperialist and the colony according to subsection 3.3.

**Step 4: **Carry out the local search operator and imperialistic competition, and form the next evolution country swarm *pop(t+1).*

**Step 5:** Find the weakly Pareto optimal countries in the new swarm together with *C(t)*, and use them to replace the countries in *C(t)* to form the new external set *C(t+1)*.

**Step 6:** If the maximum number of cycles has been reached, the algorithm stops and outputs the optimal solution of problem (1). Otherwise, let *t* = *t* + 1 and go to Step 2.

To evaluate the efficiency of the proposed algorithm, thirteen nonlinear constrained optimization test problems, g01-g13, were solved by six evolutionary optimization algorithms: OICA [20], SAEA [25], SMEA [26], RCEA [27], ISEA [28], and the proposed algorithm MICA. These benchmark functions are described in [27]; they are summarized here for completeness, and the original sources of the functions are also cited. Test functions g02, g03, and g12 are maximization problems; they were transformed into minimization problems using the objective function min(−f(x)).

In order to estimate how difficult it is to generate feasible countries through a purely random process, we use the ρ-metric [29], which measures the ratio between the size of the feasible search space and that of the entire search space, i.e.,

ρ = |*F*| / |*S*|

where |*S*| is the number of countries randomly generated from the search space and |*F*| is the number of feasible countries found among them; in our algorithm, these are the initial countries. Each algorithm was implemented in MATLAB 7.0 on an Intel Pentium IV 2.8-GHz personal computer and was executed for 30 independent runs on each test problem. In the simulation, the initial country size and the ratio of the most powerful countries are fixed, and the maximum number of cycles is 1500 [30-33].
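The ρ-metric is straightforward to estimate by Monte Carlo sampling. A minimal sketch (names, sample count, and the test region are our choices):

```python
import random

# rho-metric sketch: the fraction of uniformly random points in [L, U]
# that are feasible estimates |F| / |S|.

def rho_metric(feasible, lower, upper, n_samples=10000, seed=42):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in zip(lower, upper)]
        if feasible(x):
            hits += 1
    return hits / n_samples

# Sanity check: the unit disc inside the square [-1, 1]^2 has true
# feasibility ratio pi/4 ~ 0.785.
rho = rho_metric(lambda x: x[0] ** 2 + x[1] ** 2 <= 1.0, [-1, -1], [1, 1])
assert abs(rho - 0.7854) < 0.02
```

A small ρ (as for g05 or g13, which have equality constraints) signals that random initialization alone will rarely produce feasible countries, which is why the movement rules above actively steer the swarm toward *D*.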

Test problem | g01 | g02 | g03 | g04 | g05 | g06 | g07 | g08 | g09 | g10 | g11 | g12 | g13 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Average percentage | 100 | 95 | 59 | 89 | 100 | 62 | 78 | 92 | 85 | 78 | 86 | 98 | 57 |

**Table 1:** Average percentage of feasible countries in the final country swarm with 30 independent runs.

**Table 1** summarizes the average percentage of the feasible countries in the final evolution country swarm over 30 independent runs for each test problem. Moreover, in order to illustrate the convergence rate of the proposed algorithm, we record the average distance from the best individual of the imperialist swarm to the boundaries of the feasible region every 1500 generations over the 30 runs; the results are presented in **Table 2**. We also list, in **Table 3**, the known optimal solution and the best, mean, and worst objective function values together with the standard deviation (std.) after 30 independent runs of MICA and of the Original Imperialist Competitive Algorithm (denoted as OICA). The results of the four compared algorithms SAEA, SMEA, RCEA, and ISEA were taken from the original references. In **Table 2**, "I.N." represents the iteration number, and in **Table 3**, "NA" indicates that no results were reported in the reference.

Function \ I.N. | 1500 | 3000 | 4500 | 6000 | 7500 | 9000 | 10500 | 12000 | 13500 | 15000 |
---|---|---|---|---|---|---|---|---|---|---|
g01 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g02 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g03 | 6.21E-03 | 4.42E-01 | 5.92E-13 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g04 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g05 | 3.28E-06 | 1.12E-04 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g06 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g07 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g09 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g11 | 2.86E-14 | 1.85E-05 | 2.72E-17 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
g13 | 3.24E-12 | 5.32E-11 | 1.54E-07 | 2.34E-14 | 3.35E-06 | 0 | 0 | 0 | 0 | 0 |

**Table 2:** Average distance from the best individual of the imperialist swarm to the boundaries of the feasible region at every 1500 generations for the 30 runs.

Function | Optimal | Status | OICA [20] | SAEA [25] | SMEA [26] | RCEA [27] | ISEA [28] | MICA |
---|---|---|---|---|---|---|---|---|
g01 | -15 | best | -15.000 | -15.000 | -15.000 | -15.000 | -15.000 | -15.000 |
 | | mean | -15 | -15 | -15 | -15 | -14.494 | -15 |
 | | worst | -15 | -15 | -15 | -15 | -12.446 | -15 |
 | | std. | -15 | 0 | 0 | 0.0E+00 | 9.30E-01 | 1.30E-11 |
g02 | -0.80362 | best | -0.80342 | -0.80297 | -0.803601 | -0.803515 | -0.803376 | -0.803619 |
 | | mean | -0.79212 | -0.7901 | -0.785238 | -0.781975 | -0.798231 | -0.793421 |
 | | worst | -0.76213 | -0.76043 | -0.751322 | -0.726288 | -0.768291 | -0.783461 |
 | | std. | 1.50E-03 | 1.20E-02 | 1.70E-02 | 2.00E-02 | 9.00E-03 | 2.50E-02 |
g03 | -1 | best | -1.0000 | -1 | -1.0000 | -1.0000 | -1.0000 | -1.0000 |
 | | mean | -1 | -1 | -1 | -1 | -1 | -1 |
 | | worst | -1 | -1 | -1 | -1 | -1 | -1 |
 | | std. | 6.40E-03 | 7.5E-05 | 2.10E-04 | 1.90E-04 | 9.70E-05 | 2.30E-12 |
g04 | -30665.5 | best | -30665.405 | -30665.500 | -30665.539 | -30665.539 | -30665.539 | -30665.539 |
 | | mean | -30665.531 | -30665.2 | -30665.539 | -30665.539 | -30665.539 | -30665.539 |
 | | worst | -30665.523 | -30665.3 | -30665.539 | -30665.539 | -30665.539 | -30665.539 |
 | | std. | 0 | 4.90E-01 | 0 | 2.00E-05 | 0 | 7.20E-10 |
g05 | 5126.498 | best | 5126.964 | 5126.989 | 5126.599 | 5126.497 | NA | 5126.4981 |
 | | mean | 5432.08 | 5432.08 | 5174.492 | 5128.881 | NA | 5126.4981 |
 | | worst | 5883.95 | 6089.43 | 5304.167 | 5142.472 | NA | 5126.4981 |
 | | std. | 3.30E+05 | 3.90E+03 | 5.00E+01 | 3.50E+00 | NA | 1.51E-10 |
g06 | -6961.81 | best | -6961.800 | -6961.800 | -6961.814 | -6961.814 | -6961.814 | -6961.814 |
 | | mean | -6961.8 | -6961.8 | -6961.284 | -6875.94 | -6961.813 | -6961.814 |
 | | worst | -6961.8 | -6961.8 | -6952.482 | -6350.262 | -6961.812 | -6961.814 |
 | | std. | 0 | 0 | 1.90E+00 | 1.60E+02 | 8.50E-05 | 1.21E-10 |
g07 | 24.30621 | best | 24.47 | 24.48 | 24.327 | 24.307 | 24.338 | 24.3062 |
 | | mean | 25.38 | 26.58 | 24.475 | 24.374 | 24.527 | 24.3457 |
 | | worst | 28.32 | 28.4 | 24.843 | 24.642 | 24.995 | 24.3812 |
 | | std. | 1.20E+01 | 1.10E+00 | 1.30E-01 | 6.60E-02 | 1.70E-01 | 2.53E-09 |
g08 | 0.095825 | best | 0.095825 | 0.095825 | 0.095825 | 0.095825 | 0.095825 | 0.095825 |
 | | mean | 0.095825 | 0.095825 | 0.095825 | 0.095825 | 0.095825 | 0.095825 |
 | | worst | 0.095825 | 0.095825 | 0.095825 | 0.095825 | 0.095825 | 0.095825 |
 | | std. | 0 | 0 | 0 | 2.60E-17 | 0 | 3.21E-14 |
g09 | 680.6301 | best | 680.64 | 680.64 | 680.632 | 680.630 | 680.630 | 680.630 |
 | | mean | 680.7 | 680.72 | 680.643 | 680.656 | 680.631 | 680.63 |
 | | worst | 680.83 | 680.87 | 680.719 | 680.763 | 680.634 | 680.63 |
 | | std. | 5.30E+00 | 5.90E-02 | 1.60E-02 | 3.40E-02 | 8.10E-04 | 4.20E-09 |
g10 | 7049.331 | best | 7051.31 | 7061.34 | 7051.903 | 7054.316 | 7062.019 | 7049.330 |
 | | mean | 7625.87 | 7627.89 | 7253.047 | 7559.192 | 7342.944 | 7049.33 |
 | | worst | 8187.54 | 8288.79 | 7638.366 | 8835.655 | 7588.054 | 7049.33 |
 | | std. | 3.40E+01 | 3.70E+02 | 7638.366 | 5.30E+02 | 1.40E+02 | 1.10E-09 |
g11 | 0.75 | best | 0.750 | 0.750 | 0.750 | 0.750 | 0.750 | 0.750 |
 | | mean | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 |
 | | worst | 0.75 | 0.75 | 0.75 | 0.75 | 0.751 | 0.75 |
 | | std. | 0 | 0 | 1.50E-04 | 8.00E-05 | 2.60E-04 | 5.31E-08 |
g12 | -1 | best | -1.0000 | -1.0000 | -1.0000 | -1.0000 | NA | -1.0000 |
 | | mean | -1 | -1 | -1 | -1 | NA | -1 |
 | | worst | -1 | -1 | -1 | -1 | NA | -1 |
 | | std. | 0 | 0 | 0 | 0.00E+00 | NA | 6.80E-12 |
g13 | 0.05395 | best | 0.053997 | NA | 0.053986 | 0.053957 | 0.05517 | 0.053949 |
 | | mean | 0.066531 | NA | 0.166385 | 0.067543 | 0.28184 | 0.025432 |
 | | worst | 0.097569 | NA | 0.468294 | 0.216915 | 0.471 | 0.043957 |
 | | std. | 1.60E-02 | NA | 1.80E-01 | 3.10E-02 | 1.80E-01 | 1.50E-01 |

**Table 3:** Comparison of the proposed algorithm MICA with OICA [20], SAEA [25], SMEA [26], RCEA [27], and ISEA [28] on the 13 benchmark functions. "NA" indicates that no results were reported.

The detailed formulations of the benchmark functions g01-g13 (objective functions, constraints, bounds, and known optima) are given in [27,30-33] and are omitted here; we only note the following. For g01, the global minimum and the optimum value are known, and the constraints *g _{1}*, *g _{2}*, *g _{3}*, *g _{6}*, and *g _{8}* are active. For g02, the global maximum is unknown; the best value we found is better than any reported value, to the best of our knowledge, and the constraint *g _{1}* is active. For g12, the feasible region of the search space consists of 9^{3} disjointed spheres; a point is feasible if the corresponding inequality holds for some sphere, and the optimum is located at x* = (5,5,5) within the feasible region with f(x*) = 1. For g13, the bounds include −2.3 ≤ *x _{i}* ≤ 2.3 for the first variables.

As can be seen from **Table 2**, for the test problems without equality constraints (g01, g02, g04, g06, g07, g08, g09, g10, and g12), the proposed algorithm MICA enters the feasible region within 1500 generations, while for test functions g05 and g11 it enters the feasible region within 4500 and 6000 generations, respectively. Although for functions g03 and g13 MICA needs 6000 and 9000 generations, respectively, to enter the feasible region, after 3000 generations the best individual of the imperialist swarm is already very close to the boundaries of the feasible region.

It can be seen from **Table 3** that our algorithm MICA finds a better "best" result than the other five algorithms OICA, SAEA, SMEA, RCEA, and ISEA on four functions: g02, g07, g10, and g13. In addition, MICA finds a similar best solution on the five problems g01, g03, g06, g08, g11, and g12 (ISEA did not report results for g12). Slightly better best results are found by MICA on the remaining functions g04, g05, and g09 (in fact, our algorithm obtains a best solution similar to those of the three compared algorithms SMEA, RCEA, and ISEA on g04 and g06). Our approach finds better mean and worst results on seven test functions, g02, g05, g06, g07, g09, g10, and g13, except that ISEA achieves a better mean on g02 [34]. It also provides similar mean and worst results on six functions: g01, g03, g04, g08, g11, and g12. Also, the proposed algorithm obtains a slightly worse std. than RCEA on g01, g08, and g12, and a slightly worse mean than SMEA and ISEA on g02.

The compared results in **Table 2** verify that MICA converges quickly, and the compared results in **Table 3** reflect the fact that our algorithm is capable of performing a robust and stable search. Furthermore, feasible solutions are consistently found for all test problems. The above observations validate that the proposed algorithm MICA has substantial potential in coping with various nonlinear constrained optimization problems.

This paper introduces a new imperialist competitive algorithm (MICA) for solving nonlinear constrained optimization problems. The proposed algorithm has three important characteristics:

1) Combining multi-objective optimization with local search models;

2) Achieving a better balance between exploration and exploitation through the method of exchanging the positions of the imperialist and the colony;

3) Speeding up the convergence by taking advantage of a new local search method.

Based on the comparison between the proposed algorithm and the five compared algorithms, it is concluded that our algorithm MICA has shown its potential to handle various nonlinear constrained optimization problems, and its performance is better than that of the other state-of-the-art evolutionary algorithms referred to in this paper in terms of the selected performance metrics.

An important subject of ongoing work is applying our approach to real-world optimization problems. Additionally, we will try to design different global and local search models, since a suitable search model can remarkably improve the capability of the algorithm. Last, we aim to explore the possibility of decreasing the computational cost after reaching the feasible region.

This work is partially supported by The Planning Fund for the Humanities and Social Sciences of the Ministry of Education (No. 18YJA790053). The author also gratefully acknowledges the helpful comments and suggestions of the reviewers, which have greatly improved the presentation.

- Homaifar A, Qi XC, Lai HS (1994) Constrained optimization via genetic algorithms. Simulation 62: 242-254.
- Coello CAC (2002) Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: A survey of the state of the art. Comput. Methods App Mech Eng 191: 1245-1287.
- Hamida SB, Schoenauer M (2002) ASCHEA: new results using adaptive segregational constraint handling, in Proc. Congr Evol Comput, IEEE Press.
- Goodwin GC, Seron MM, de Dona JA (2005) Constrained control and estimation: an optimisation approach. Communications and Control Engineering 1-385.
- Yeniay O (2005) Penalty function methods for constrained optimization with genetic algorithms. Math Comput Appl 10: 45-56.
- Bertsekas DP (1982) Constrained optimization and Lagrange multiplier methods. Academic Press 1-400.
- Joshi H, Arora S (2017) Enhanced grey wolf optimization algorithm for global optimization. Fundamenta Informaticae 153: 235-264.
- Miranda-Varela ME, Mezura-Montes E (2018) Constraint-handling techniques in surrogate-assisted evolutionary optimization: An empirical study. Appl Soft Comput 73: 215-229.
- Arora S, Singh S, Yetilmezsoy K (2018) A modified butterfly optimization algorithm for mechanical design optimization problems. J Braz Soc Mech Sci & Eng 40: 173-185.
- Kohli M, Arora S (2018) Chaotic grey wolf optimization algorithm for constrained optimization problems. Journal of Computational Design and Engineering 5: 458-472.
- Joshi H, Arora S (2017) Enhanced grey wolf optimisation algorithm for constrained optimisation problems. International Journal of Swarm Intelligence 3: 126-151.
- Ion N (2009) A cutting plane method for solving convex optimization problems over the cone of nonnegative polynomials. WSEAS Transactions on Mathematics 8:269-279.
- Paul HC, Jorge JM (1987) Projected gradient methods for linearly constrained problems. Mathematical Programming 39: 93-16.
- Philipp H, Martin K (2013) Quasi-Newton methods: a new direction. J Mach Learn Res 14: 843-865.
- Yuan YX (2006) A new stepsize for the steepest descent method. J Comput Math 24: 149-156.
- Hovd M (2011) Multi-level programming for designing penalty functions for MPC controllers. Proc. of the 18th IFAC World Congress 44: 6098-6103.
- Huang FZ, Wang L, He Q (2007) An effective co-evolutionary differential evolution for constrained optimization. Appl Math Comput 186: 240-256.
- He Q, Wang L (2007) A hybrid particle swarm optimization with a feasibility-based rule for constrained optimization. Appl Math Comput 186:1407-1422.
- Lin YC, Wang FS, Hwang KS (1999) A hybrid method of evolutionary algorithms for mixed-integer nonlinear optimization problems. Proceedings of the 1999 Congress on Evolutionary Computation 633904.
- Atashpaz-Gargari E, Lucas C (2007) Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In: IEEE Congress on EvolutionaryComputation 4661-4667.
- Mahdi A, Ayaz I, Davoud A (2013) Imperialist competitive algorithm for solving systems of nonlinear equations. Comput Math Appl 65: 1894-1908.
- Mohammadi M, Tavakkoli-Moghaddam R, Rostamib H (2011) A multi-objective imperialist competitive algorithm for a capacitated hub covering location problem. International Journal of Industrial Engineering Computations 2: 671-688.
- Shokrollahpour E, Zandieh M, Dorri B (2011) A novel imperialist competitive algorithm for bi-criteria scheduling of the assembly flow shop problem. Int J Prod Res 49: 3087-3103.
- Long W, Jiao J, Liang X, Tang M (2018) An exploration-enhanced grey wolf optimizer to solve high-dimensional numerical optimization. Eng Appl Artif Intel 68: 63-80.
- Farmani R, Wright JA (2003) Self-adaptive fitness formulation for constrained Optimization. IEEE Trans Evol Comput 7: 445-455.
- Mezura-Montes E, Coello CAC (2005) A simple multimembered evolution strategy to solve constrained optimization problems. IEEE Trans Evol Comput 9: 1-17.
- Runarsson TP, Yao X (2000) Stochastic ranking for constrained evolutionary optimization. IEEE Trans. Evol Comput 4: 284-294.
- Aguirre AH, Rionda SB, Coello CAC, Lizaraga GL, Montes EM (2004) Handling constraints using multiobjective optimization concepts. Int J Numerical Methods in Eng 59: 1989-2017.
- Datta R, Regis R (2016) A surrogate-assisted evolution strategy for constrained multiobjective optimization. Expert Syst Appl 57: 270-284.
- Floudas CA, Pardalos PM (1987) A collection of test problems for constrained global optimization algorithms. Lecture Notes in Comput Sci 1-158.
- Koziel S, Michalewicz Z (1999) Evolutionary algorithms, homomorphous mappings, and constrained parameter optimization. Evol Comput 7: 19-44.
- Michalewicz Z, Nazhiyath G, Michalewicz M (1996) A note on usefulness of geometrical crossover for numerical optimization problems. Proceedings of the 5th Annual Conference on Evolutionary Programming 305-312.
- Himmelblau D (1972) Applied nonlinear programming. Mcgraw-Hill, IEEE Press.
- Hock W, Schittkowski K (1981) Test examples for nonlinear programming codes. Lecture Notes in Econ Math Syst 1-127.