
Optimize multi-objective problems with desirability functions

In data science, it is not uncommon to encounter competing objectives. Whether designing a product, tuning an algorithm, or optimizing a portfolio, we usually need to balance several metrics to reach the best overall result. Sometimes maximizing one metric comes at the expense of another, which makes a single globally optimal solution hard to find.

Among the existing approaches to multi-objective optimization, I find desirability functions both elegant and easy to explain to non-technical audiences, which makes them an interesting choice. Desirability functions combine several metrics into a standardized score that can then be optimized as a whole.

In this article, we will explore:

  • The mathematical foundations of desirability functions
  • How to implement these functions in Python
  • How to optimize a multi-objective problem with desirability functions
  • How to visualize and interpret the results

To ground these concepts in a real example, we will apply desirability functions to optimize bread baking: a toy problem with several interconnected parameters and competing quality objectives, which will let us explore different optimization choices.

By the end of this article, you will have a powerful new tool in your data science toolkit for tackling multi-objective optimization problems across numerous domains, along with fully functional code available on GitHub.

What are desirability functions?

Harrington (1965) first formalized desirability functions, which were later extended by Derringer and Suich (1980). The idea is to:

  • Convert each response into a performance score between 0 (absolutely unacceptable) and 1 (ideal value)
  • Combine all scores into a single metric to maximize

Let’s explore the types of desirability functions and then how to combine the resulting scores.

Different types of desirability functions

There are three different desirability functions, which cover most situations.

  • Smaller-is-better: Used when the response should be minimized
def desirability_smaller_is_better(x: float, x_min: float, x_max: float) -> float:
    """Calculate desirability function value where smaller values are better.

    Args:
        x: Input parameter value
        x_min: Minimum acceptable value
        x_max: Maximum acceptable value

    Returns:
        Desirability score between 0 and 1
    """
    if x <= x_min:
        return 1.0
    elif x >= x_max:
        return 0.0
    else:
        return (x_max - x) / (x_max - x_min)
  • Larger-is-better: Used when the response should be maximized
def desirability_larger_is_better(x: float, x_min: float, x_max: float) -> float:
    """Calculate desirability function value where larger values are better.

    Args:
        x: Input parameter value
        x_min: Minimum acceptable value
        x_max: Maximum acceptable value

    Returns:
        Desirability score between 0 and 1
    """
    if x <= x_min:
        return 0.0
    elif x >= x_max:
        return 1.0
    else:
        return (x - x_min) / (x_max - x_min)
  • Target-is-best: Used when a specific target value is optimal
def desirability_target_is_best(x: float, x_min: float, x_target: float, x_max: float) -> float:
    """Calculate two-sided desirability function value with target value.

    Args:
        x: Input parameter value
        x_min: Minimum acceptable value
        x_target: Target (optimal) value
        x_max: Maximum acceptable value

    Returns:
        Desirability score between 0 and 1
    """
    if x_min <= x <= x_target:
        return (x - x_min) / (x_target - x_min)
    elif x_target < x <= x_max:
        return (x_max - x) / (x_max - x_target)
    else:
        return 0.0

Each input parameter can be mapped through one of these three desirability functions before the results are combined into a single desirability score.
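As a quick sanity check, the three shapes can be exercised on a few sample values. The compact one-line versions below mirror the functions above so the snippet runs standalone:

```python
# Compact restatements of the three desirability functions above
def d_smaller(x, lo, hi):
    return 1.0 if x <= lo else 0.0 if x >= hi else (hi - x) / (hi - lo)

def d_larger(x, lo, hi):
    return 0.0 if x <= lo else 1.0 if x >= hi else (x - lo) / (hi - lo)

def d_target(x, lo, t, hi):
    if lo <= x <= t:
        return (x - lo) / (t - lo)
    if t < x <= hi:
        return (hi - x) / (hi - t)
    return 0.0

print(d_smaller(30, 0, 60))      # 0.5: halfway between ideal and unacceptable
print(d_larger(180, 30, 180))    # 1.0: at or beyond the ideal value
print(d_target(75, 65, 75, 85))  # 1.0: exactly on target
print(d_target(64, 65, 75, 85))  # 0.0: outside the acceptable range
```

Note how each function saturates at 0 or 1 outside its acceptable range, which is what lets scores from very different metrics live on the same scale.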

Combining into an overall desirability

Once individual metrics are translated into desirability scores, they need to be combined into an overall desirability. The most common approach is the weighted geometric mean:

D = (d₁^w₁ × d₂^w₂ × … × dₙ^wₙ)^(1 / Σᵢ wᵢ)

where dᵢ is the individual desirability score and the weight wᵢ reflects the relative importance of each metric.

The geometric mean has an important property: if any single desirability is 0 (i.e., completely unacceptable), the overall desirability is 0 regardless of the other values. This enforces that every requirement must be met to at least some extent.

import numpy as np

def overall_desirability(desirabilities, weights=None):
    """Compute overall desirability using geometric mean
    
    Parameters:
    -----------
    desirabilities : list
        Individual desirability scores
    weights : list
        Weights for each desirability
        
    Returns:
    --------
    float
        Overall desirability score
    """
    if weights is None:
        weights = [1] * len(desirabilities)
        
    # Convert to numpy arrays
    d = np.array(desirabilities)
    w = np.array(weights)
    
    # Calculate geometric mean
    return np.prod(d ** w) ** (1 / np.sum(w))

The weights are hyperparameters that influence the final result and provide room for customization.
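To see these properties concretely, here is a small numerical check of the weighted geometric mean, computed inline with NumPy so the snippet is standalone:

```python
import numpy as np

d = np.array([0.8, 0.6, 0.9])

# Equal weights: plain geometric mean of the three scores
w = np.array([1.0, 1.0, 1.0])
print(np.prod(d ** w) ** (1 / w.sum()))  # ≈ 0.756

# A single zero score zeroes out the whole result
d_bad = np.array([0.8, 0.0, 0.9])
print(np.prod(d_bad ** w) ** (1 / w.sum()))  # 0.0

# Upweighting the weakest metric pulls the overall score toward it
w2 = np.array([1.0, 3.0, 1.0])
print(np.prod(d ** w2) ** (1 / w2.sum()))  # ≈ 0.69
```

The second case illustrates the veto property discussed above, and the third shows how the weights let you express preferences between metrics.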

A practical optimization example: Bread baking

To demonstrate desirability functions in action, let’s apply them to a toy problem: optimizing bread baking.

Parameters and quality metrics

Let’s play with the following parameters:

  1. Fermentation time (30–180 minutes)
  2. Fermentation temperature (20–30°C)
  3. Hydration level (60–85%)
  4. Kneading time (0–20 minutes)
  5. Baking temperature (180–250°C)

Let’s try to optimize these metrics:

  1. Texture quality: the texture of the bread
  2. Flavor profile: the taste of the bread
  3. Practicality: the practicality of the whole process

Of course, each of these metrics depends on several parameters. This makes the next step one of the most critical: mapping parameters to quality metrics.

For each quality metric, we need to define how the parameters affect it:

from typing import List

def compute_flavor_profile(params: List[float]) -> float:
    """Compute flavor profile score based on input parameters.

    Args:
        params: List of parameter values [fermentation_time, ferment_temp, hydration,
               kneading_time, baking_temp]

    Returns:
        Weighted flavor profile score between 0 and 1
    """
    # Flavor mainly affected by fermentation parameters
    fermentation_d = desirability_larger_is_better(params[0], 30, 180)
    ferment_temp_d = desirability_target_is_best(params[1], 20, 24, 28)
    hydration_d = desirability_target_is_best(params[2], 65, 75, 85)

    # Baking temperature has minimal effect on flavor
    weights = [0.5, 0.3, 0.2]
    return np.average([fermentation_d, ferment_temp_d, hydration_d],
                      weights=weights)

Here, for example, the flavor is affected by:

  • Fermentation time, with zero desirability below 30 minutes and maximum desirability above 180 minutes
  • Fermentation temperature, with desirability peaking at 24°C
  • Hydration level, with desirability peaking at 75%

These individual desirabilities are then combined in a weighted average to return the flavor desirability. Similar computations are done for texture quality and practicality.
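For completeness, here is what the texture and practicality mappings could look like. The parameter dependencies and weights below are illustrative guesses rather than the exact ones from the article’s repository, and the compact helpers restate the desirability functions from earlier so the snippet runs standalone:

```python
import numpy as np

# Compact restatements of the earlier desirability helpers
def d_smaller(x, lo, hi):
    return 1.0 if x <= lo else 0.0 if x >= hi else (hi - x) / (hi - lo)

def d_target(x, lo, t, hi):
    if lo <= x <= t:
        return (x - lo) / (t - lo)
    if t < x <= hi:
        return (hi - x) / (hi - t)
    return 0.0

# Hypothetical sketch: texture driven mostly by hydration,
# kneading and baking temperature
def compute_texture_quality(params):
    hydration_d = d_target(params[2], 60, 70, 85)
    kneading_d = d_target(params[3], 0, 12, 20)
    baking_d = d_target(params[4], 180, 220, 250)
    return np.average([hydration_d, kneading_d, baking_d],
                      weights=[0.4, 0.3, 0.3])

# Hypothetical sketch: practicality favors short fermentation
# and little kneading
def compute_practicality(params):
    ferment_d = d_smaller(params[0], 30, 180)
    kneading_d = d_smaller(params[3], 0, 20)
    return np.average([ferment_d, kneading_d], weights=[0.6, 0.4])
```

The parameter order is the same as in compute_flavor_profile: [fermentation_time, ferment_temp, hydration, kneading_time, baking_temp].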

Objective function

Following the desirability function approach, we will use the overall desirability as our objective function. The goal is to maximize this total score, which means finding the parameters that best meet all three of our requirements at the same time:

def objective_function(params: List[float], weights: List[float]) -> float:
    """Compute overall desirability score based on individual quality metrics.

    Args:
        params: List of parameter values
        weights: Weights for texture, flavor and practicality scores

    Returns:
        Negative overall desirability score (for minimization)
    """
    # Compute individual desirability scores
    texture = compute_texture_quality(params)
    flavor = compute_flavor_profile(params)
    practicality = compute_practicality(params)

    # Ensure weights sum up to one
    weights = np.array(weights) / np.sum(weights)

    # Calculate overall desirability using geometric mean
    overall_d = overall_desirability([texture, flavor, practicality], weights)

    # Return negative value since we want to maximize desirability
    # but optimization functions typically minimize
    return -overall_d

Once the individual desirabilities for texture, flavor and practicality are computed, the overall desirability is simply calculated as their weighted geometric mean. The function finally returns the negative overall desirability, so that minimizing it maximizes the desirability.

Optimizing with SciPy

Finally, we use SciPy’s minimize function to find the best parameters. Since the objective function returns the negative overall desirability, minimizing it maximizes the overall desirability:

from scipy.optimize import minimize

def optimize(weights: list[float]) -> list[float]:
    # Define parameter bounds
    bounds = {
        'fermentation_time': (30, 180),
        'fermentation_temp': (20, 30),
        'hydration_level': (60, 85),
        'kneading_time': (0, 20),
        'baking_temp': (180, 250)
    }

    # Initial guess (middle of bounds)
    x0 = [(b[0] + b[1]) / 2 for b in bounds.values()]

    # Run optimization
    result = minimize(
        objective_function,
        x0,
        args=(weights,),
        bounds=list(bounds.values()),
        method='SLSQP'
    )

    return result.x

In this function, after the bounds of each parameter are defined, the initial guess is computed as the middle of each bound and passed, together with the objective function and the bounds, to SciPy’s minimize function, which returns the optimal parameters.

The weights are also an input to the optimizer and are a good way to customize the output. For example, with a larger weight on practicality, the optimized solution will favor practicality over flavor and texture.

Now, let’s visualize the results for several sets of weights.
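To build intuition for how the weights steer the optimizer before looking at the full results, here is a stripped-down, self-contained version of the same mechanism. The “texture” and “practicality” shapes below are toy stand-ins over a single knob (think kneading time), not the article’s real quality functions:

```python
import numpy as np
from scipy.optimize import minimize

def toy_objective(x, weights):
    # One knob x[0] in [0, 20]: texture likes large values,
    # practicality likes small ones
    texture = x[0] / 20.0
    practicality = (20.0 - x[0]) / 20.0
    d = np.array([texture, practicality])
    w = np.array(weights) / np.sum(weights)
    # Negative weighted geometric mean, for minimization
    return -np.prod(d ** w)

for weights, label in [([1, 3], "practicality-heavy"),
                       ([3, 1], "texture-heavy")]:
    res = minimize(toy_objective, x0=[10.0], args=(weights,),
                   bounds=[(0.0, 20.0)], method='SLSQP')
    # practicality-heavy weights pull x down, texture-heavy push it up
    print(f"{label}: optimal x = {res.x[0]:.1f}")
```

With this toy objective the optimum lands at x = Σw_texture / Σw, so the practicality-heavy run settles near 5 and the texture-heavy run near 15, mirroring the shift in kneading time seen in the real results below.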

Visualization of results

Let’s see how the optimizer handles different preference profiles, demonstrating the flexibility of desirability functions given various input weights.

Let’s first look at the results with weights favoring practicality:

The optimized parameters with weights favoring practicality. Image by the author.

Since the weights mainly favor practicality, the achieved overall desirability is 0.69, and the kneading time is 5 minutes, because higher values would negatively impact practicality.

Now, if we optimize with a focus on texture, the results are slightly different:

The optimized parameters with weights favoring texture. Image by the author.

In this case, the achieved overall desirability is 0.85, significantly higher. The kneading time is now 12 minutes, because higher values positively impact texture without being penalized too heavily on practicality.

Conclusion: Practical applications of desirability functions

Although we focused on the example of bread baking, the same approach can be applied to various domains, such as product formulation in cosmetics or resource allocation in portfolio optimization.

Desirability functions provide a powerful mathematical framework for solving multi-objective optimization problems in many data science applications. By converting raw metrics into standardized desirability scores, we can effectively combine and optimize disparate goals.

Key advantages of this approach include:

  • Standardized scales make different metrics comparable and easy to merge into a single objective
  • Flexible handling of different types of goals: minimizing, maximizing, or hitting a target
  • Clear communication of preferences through mathematical functions

The code presented here provides a starting point for your own experiments. Whether you are optimizing industrial processes, machine learning models, or product formulations, desirability functions offer a systematic approach to finding the best compromise between competing goals.
