Pymoo

Specialized Pymoo automation and integration for multi-objective optimization problems

Pymoo is a community skill for multi-objective optimization using the pymoo Python framework, covering problem definition, evolutionary algorithms, constraint handling, Pareto front analysis, and decision making for engineering optimization problems.

What Is This?

Overview

Pymoo provides tools for solving optimization problems with multiple competing objectives using evolutionary algorithms. It covers problem definition (objective functions, constraints, and variable bounds for the optimization task), evolutionary algorithms including NSGA-II, NSGA-III, and MOEA/D for multi-objective search, constraint handling that incorporates equality and inequality constraints into the optimization process, Pareto front analysis that identifies the set of non-dominated solutions representing optimal trade-offs, and decision making that selects preferred solutions from the Pareto set based on stated criteria. The skill enables engineers to find optimal trade-offs in complex design problems.

Who Should Use This

This skill serves engineers solving design optimization problems with conflicting objectives, researchers benchmarking evolutionary algorithms on multi-objective test problems, and data scientists optimizing machine learning hyperparameters across multiple metrics.

Why Use It?

Problems It Solves

Single-objective optimization cannot capture trade-offs between competing design goals such as cost versus performance. Weighted-sum approaches to multi-objective problems require specifying weights before the trade-off landscape is understood. Grid search over parameter spaces scales exponentially with the number of dimensions, making exhaustive search impractical. Constrained optimization problems with nonlinear constraints are difficult for gradient-based methods.

Core Highlights

Problem definer specifies objectives, constraints, and variable bounds. Algorithm library provides NSGA-II, NSGA-III, and other evolutionary optimizers. Constraint handler incorporates equality and inequality conditions. Pareto analyzer identifies non-dominated solutions on the optimal front.
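The Pareto analysis mentioned above boils down to non-dominated filtering: a solution is kept only if no other solution is at least as good in every objective and strictly better in one. Pymoo ships its own non-dominated sorting utilities; the sketch below shows the idea in plain NumPy, with `non_dominated` as an illustrative helper name rather than a pymoo API.

```python
import numpy as np

def non_dominated(F):
    """Boolean mask of non-dominated rows of F (minimization of all objectives)."""
    n = len(F)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if not mask[i]:
            continue
        # Point j dominates i if j is <= in every objective and < in at least one
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominates_i.any():
            mask[i] = False
    return mask

F = np.array([[1.0, 4.0], [2.0, 3.0], [3.0, 3.0], [4.0, 1.0]])
print(F[non_dominated(F)])  # keeps [1,4], [2,3], [4,1]; [3,3] is dominated by [2,3]
```

The pairwise check is O(n^2) and fine for typical population sizes; pymoo uses faster specialized sorting internally.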

How to Use It?

Basic Usage

import numpy as np
from pymoo.core.problem import Problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class MyProblem(Problem):
    def __init__(self):
        # Two variables, two objectives, no constraints, bounds [0, 1]
        super().__init__(n_var=2, n_obj=2, n_constr=0,
                         xl=np.array([0.0, 0.0]), xu=np.array([1.0, 1.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        # Vectorized evaluation: x has shape (pop_size, n_var)
        f1 = x[:, 0] ** 2 + x[:, 1] ** 2
        f2 = (x[:, 0] - 1) ** 2 + (x[:, 1] - 1) ** 2
        out["F"] = np.column_stack([f1, f2])

algo = NSGA2(pop_size=100)
result = minimize(MyProblem(), algo, ("n_gen", 200), seed=42, verbose=True)

print(f"Solutions: {len(result.F)}")

Real-World Examples

import numpy as np
from pymoo.core.problem import Problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class DesignProblem(Problem):
    def __init__(self):
        # Three design variables, two objectives, two inequality constraints
        super().__init__(n_var=3, n_obj=2, n_constr=2,
                         xl=np.zeros(3), xu=np.ones(3))

    def _evaluate(self, x, out, *args, **kwargs):
        cost = x[:, 0] + x[:, 1] + x[:, 2]
        # Performance is maximized, so negate it for minimization
        perf = -(x[:, 0] * x[:, 1] + x[:, 2] ** 2)
        # Constraints in pymoo's g(x) <= 0 convention
        g1 = x[:, 0] + x[:, 1] - 1.5   # x0 + x1 <= 1.5
        g2 = 0.2 - x[:, 2]             # x2 >= 0.2
        out["F"] = np.column_stack([cost, perf])
        out["G"] = np.column_stack([g1, g2])

result = minimize(DesignProblem(), NSGA2(pop_size=200), ("n_gen", 300), seed=42)
print(f"Pareto size: {len(result.F)}")

Advanced Tips

Use NSGA-III for problems with three or more objectives since it maintains better diversity on high-dimensional Pareto fronts than NSGA-II. Define custom termination criteria based on convergence metrics rather than fixed generation counts to avoid premature stopping or wasted computation. Normalize objective values before analysis when objectives have different scales to ensure balanced trade-off visualization.
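The normalization tip above can be done in a few lines of NumPy before plotting or comparing fronts. The helper name `normalize_objectives` is illustrative, not a pymoo API; it min-max scales each objective column of a result's F matrix to [0, 1] so objectives with very different units get equal visual weight.

```python
import numpy as np

def normalize_objectives(F):
    """Min-max normalize each objective column of F to [0, 1]."""
    f_min = F.min(axis=0)
    f_max = F.max(axis=0)
    # Guard against a constant objective column (zero range)
    span = np.where(f_max > f_min, f_max - f_min, 1.0)
    return (F - f_min) / span

# Example: cost in the hundreds, error rate below one
F = np.array([[100.0, 0.1], [300.0, 0.5], [500.0, 0.9]])
print(normalize_objectives(F))  # each column rescaled to span [0, 1]
```

In practice this would be applied to `result.F` from a `minimize` call before plotting the Pareto front.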

When to Use It?

Use Cases

Optimize an engineering design for minimum cost and maximum performance with manufacturing constraints. Tune machine learning hyperparameters to balance accuracy and inference speed on the Pareto front. Solve a supply chain problem minimizing delivery time and cost simultaneously.

Related Topics

Pymoo, multi-objective optimization, evolutionary algorithms, NSGA-II, Pareto optimization, constraint handling, and design optimization.

Important Notes

Requirements

Pymoo Python package with NumPy for numerical operations. Problem definition with callable objective and constraint functions. Sufficient computational budget for evolutionary algorithm convergence.

Usage Recommendations

Do: start with NSGA-II for two-objective problems and switch to NSGA-III for higher dimensions; visualize the Pareto front to understand trade-offs before selecting final solutions; and run the optimization multiple times with different seeds to assess solution consistency.

Don't: use very small population sizes, since evolutionary algorithms need sufficient diversity to explore the objective space; compare Pareto fronts from different runs without a quality indicator such as hypervolume; or terminate optimization too early, since evolutionary algorithms need many generations to converge.
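For the hypervolume comparison recommended above, pymoo provides an indicator in its performance-metrics module; the sketch below shows the underlying idea for the two-objective case in plain NumPy. `hypervolume_2d` is an illustrative helper, not a pymoo API: for a non-dominated minimization front, it sums the rectangles swept between consecutive points and a reference point that every solution must dominate.

```python
import numpy as np

def hypervolume_2d(F, ref):
    """Hypervolume of a 2-objective non-dominated set F (minimization),
    measured against a reference point ref with ref >= every point in F."""
    F = F[np.argsort(F[:, 0])]  # sort by f1; on a front, f2 then decreases
    hv = 0.0
    prev_f2 = ref[1]
    for f1, f2 in F:
        hv += (ref[0] - f1) * (prev_f2 - f2)  # rectangle added by this point
        prev_f2 = f2
    return hv

front = np.array([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0]])
print(hypervolume_2d(front, ref=np.array([4.0, 4.0])))  # larger is better
```

A front from one run can then be judged better than another when its hypervolume against a shared reference point is larger; the reference point must be fixed across runs for the comparison to be meaningful.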

Limitations

Evolutionary algorithms require many function evaluations, which becomes expensive when the objective functions are computationally costly. Solution quality depends on population size and generation count, parameters that need tuning. Problems with many local optima may require specialized operators or larger populations to find the global Pareto front.