NSGA-II multi-objective genetic algorithm: could anyone give me a "simple explanation"?

Here's an explanation of how NSGA-II works (a rough Python sketch of the loop follows the list):

  1. First, it randomly initializes the population.
  2. Chromosomes are sorted into fronts based on Pareto non-domination. Within a front, chromosomes are ranked by crowding distance (called i-dist in the NSGA-II paper), an estimate of how far a solution is from its neighbours in objective space. Solutions that are far away from others (not crowded) are given higher preference during selection, in order to obtain a diverse solution set and avoid a crowded one.
  3. The best N chromosomes (where N is the population size) are picked from the current population and put into a mating pool.
  4. In the mating pool, tournament selection, crossover and mutation are performed.
  5. The mating pool and the current population are combined. The resulting set is sorted, and the best N chromosomes make it into the new population.
  6. Go to step 2, unless the maximum number of generations has been reached.
  7. The solution set is the highest-ranked Pareto non-dominated front of the final population.
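
To make the steps above concrete, here is a rough, self-contained Python sketch of the NSGA-II loop on a toy two-objective minimisation problem. All names (`evaluate`, `make_child`, `POP_SIZE`, `N_GEN`) and the toy problem itself are placeholders of my own, not from the paper; production implementations (e.g. in pymoo or jMetal) handle operators and constraints far more carefully.

```python
import random

POP_SIZE, N_GEN, N_VARS = 40, 50, 5

def evaluate(x):
    # Toy bi-objective problem: both objectives are to be minimised.
    f1 = sum(v * v for v in x)
    f2 = sum((v - 2.0) ** 2 for v in x)
    return (f1, f2)

def dominates(a, b):
    # a dominates b: no worse in every objective, strictly better in at least one.
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def non_dominated_sort(objs):
    # Step 2: split the population into Pareto fronts (lists of indices).
    fronts, dominated_by, dom_count = [[]], [set() for _ in objs], [0] * len(objs)
    for i in range(len(objs)):
        for j in range(len(objs)):
            if dominates(objs[i], objs[j]):
                dominated_by[i].add(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

def crowding_distance(front, objs):
    # Step 2: within a front, prefer solutions in sparse (non-crowded) regions.
    dist = {i: 0.0 for i in front}
    for m in range(len(objs[0])):
        order = sorted(front, key=lambda i: objs[i][m])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = (objs[order[-1]][m] - objs[order[0]][m]) or 1.0
        for k in range(1, len(order) - 1):
            dist[order[k]] += (objs[order[k + 1]][m] - objs[order[k - 1]][m]) / span
    return dist

def make_child(pop, rank, dist):
    # Step 4: binary tournament (rank first, then crowding), one-point crossover, mutation.
    def tourney():
        a, b = random.sample(range(len(pop)), 2)
        return pop[min(a, b, key=lambda i: (rank[i], -dist[i]))]
    p1, p2 = tourney(), tourney()
    cut = random.randrange(1, N_VARS)
    child = p1[:cut] + p2[cut:]
    return [v + random.gauss(0, 0.1) if random.random() < 0.2 else v for v in child]

# Step 1: random initial population.
pop = [[random.uniform(-5, 5) for _ in range(N_VARS)] for _ in range(POP_SIZE)]
for _ in range(N_GEN):
    objs = [evaluate(x) for x in pop]
    fronts = non_dominated_sort(objs)
    rank, dist = {}, {}
    for r, f in enumerate(fronts):
        for i in f:
            rank[i] = r
        dist.update(crowding_distance(f, objs))
    # Steps 3-5: build offspring, combine with parents, keep the best POP_SIZE.
    children = [make_child(pop, rank, dist) for _ in range(POP_SIZE)]
    combined = pop + children
    c_objs = [evaluate(x) for x in combined]
    new_pop = []
    for f in non_dominated_sort(c_objs):
        d = crowding_distance(f, c_objs)
        for i in sorted(f, key=lambda i: -d[i]):
            if len(new_pop) < POP_SIZE:
                new_pop.append(combined[i])
    pop = new_pop

# Step 7: the answer is the first (best) non-dominated front of the final population.
final_objs = [evaluate(x) for x in pop]
for i in non_dominated_sort(final_objs)[0][:5]:
    print(final_objs[i])
```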

I recommend reading the papers on these algorithms, which explain the functionality quite well:

  • Deb, Pratap, Agarwal, Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6(2), pp. 182-197, 2002.
  • Zitzler, Laumanns, Thiele. SPEA2: Improving the Strength Pareto Evolutionary Algorithm. Technical Report (TIK-103), Swiss Federal Institute of Technology (ETH), 2001.

I'm sure you'll be able to locate the PDFs of these publications on the web.

About the difference between steady-state and generational GAs: with generational replacement you create a whole new population of the same size as the old one, using only the genes of the old population, and then replace the old population as a whole. With steady-state replacement you create just one new individual, which then replaces a single individual in the population. Steady-state GAs usually converge faster, but they are less likely to find good local optima because they do not explore the fitness landscape as much as generational replacement does. It depends on the problem, of course, and sometimes you can choose how much of the old generation to replace, which gives you a sliding scale between the two extremes.
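
To illustrate the contrast, here is a minimal sketch of the two replacement schemes for a plain single-objective GA. `fitness`, `select`, `breed` and the parameter values are placeholders of my own, chosen only for illustration.

```python
import random

def fitness(x):
    # Maximise (i.e. minimise the sum of squares).
    return -sum(v * v for v in x)

def select(pop):
    # Binary tournament selection on fitness.
    a, b = random.sample(pop, 2)
    return a if fitness(a) > fitness(b) else b

def breed(pop):
    # One-point crossover of two tournament-selected parents, plus Gaussian mutation.
    p1, p2 = select(pop), select(pop)
    cut = random.randrange(1, len(p1))
    return [v + random.gauss(0, 0.1) for v in p1[:cut] + p2[cut:]]

def generational_step(pop):
    # Generational replacement: build a whole new population, then swap it in.
    return [breed(pop) for _ in range(len(pop))]

def steady_state_step(pop):
    # Steady-state replacement: create one child and overwrite the worst individual.
    child = breed(pop)
    worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
    pop[worst] = child
    return pop

pop = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(20)]
for _ in range(200):
    pop = steady_state_step(pop)   # or: pop = generational_step(pop)
print(max(fitness(x) for x in pop))
```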

There are further multiobjective algorithms such as AbYSS and PAES.