
Objective Soup Recipes: Formal Methods & Optimization

Updated 14 August 2025
  • Objective soup recipes are defined over a high-dimensional configuration space that captures ingredient proportions, cooking times, and process parameters.
  • The approach integrates surrogate machine learning models and Bayesian optimization to handle noisy, subjective taste evaluations and costly physical trials.
  • Graphical formalism represents recipes as bipartite graphs, enabling systematic comparison, modular composition, and structured substitution for recipe innovation.

Objective soup recipes are formally defined and algorithmically suggested using frameworks that combine machine learning, Bayesian optimization, and graph-theoretic reasoning. These methodologies model both the combinatorial search space of ingredients and processes and the subjective/expensive evaluations typical of real-world cooking and expert judgment. Two principal lines of work provide complementary perspectives: the optimization-based construction of “most objective” recipes through simulation and learning (Garrido-Merchán et al., 2018), and the rigorous graphical formalization of recipes for equivalence, decomposition, and substitution tasks (Bikakis et al., 2023). Together, these approaches enable both principled generation and formal reasoning over soup recipes as structured objects.

1. Formalizing the Recipe Configuration Space

Objective soup recipes are parametrized over a configuration space $\Theta$, which is the product of all variables that define a recipe’s instantiation. Typical parameters include ingredient quantities, preparation and cooking times, temperature, types of broth, seasoning levels, kitchen tools, and even categorical descriptors such as brand or location. The configuration for a specific recipe is thus denoted by $\theta \in \Theta$.

Example parametrization for a soup might be:

  • $\theta = [\mathrm{carrot\ grams},\ \mathrm{boiling\ time},\ \mathrm{broth\ type},\ \mathrm{salt\ level}]$
  • Parameter bounds, e.g., boiling time $\in [10, 40]$ minutes, carrot grams $\in [50, 300]$, seasoning $\in \{\mathrm{low}, \mathrm{medium}, \mathrm{high}\}$.

Formally, the recipe configuration space enables systematic search and optimization across a potentially high-dimensional domain, supporting both enumeration and sampling methods for exploration.
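As a concrete illustration, such a configuration space can be encoded as a dictionary of numeric bounds and categorical choices, with uniform random sampling for exploration. This is a minimal sketch; all parameter names and ranges here are illustrative, not taken from the cited work.

```python
import random

# A hypothetical soup configuration space Θ: each entry is either a pair of
# numeric bounds (continuous parameter) or a list of categories.
THETA = {
    "carrot_grams": (50.0, 300.0),                     # continuous, grams
    "boiling_time": (10.0, 40.0),                      # continuous, minutes
    "broth_type":   ["chicken", "vegetable", "beef"],  # categorical
    "salt_level":   ["low", "medium", "high"],         # categorical
}

def sample_configuration(space, rng=random):
    """Draw one configuration θ ∈ Θ uniformly at random."""
    theta = {}
    for name, domain in space.items():
        if isinstance(domain, tuple):   # numeric bounds
            lo, hi = domain
            theta[name] = rng.uniform(lo, hi)
        else:                           # categorical choices
            theta[name] = rng.choice(domain)
    return theta

theta = sample_configuration(THETA)
```

Enumerating categorical parameters and sampling continuous ones in this way supports both the enumeration and sampling modes of exploration mentioned above.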

2. Modeling the Subjective Objective Function

Recipe quality is quantified as a scalar value $\lambda \in [0, 10]$, typically assigned via human expert ratings. This is modeled as

$y(\theta) = f(\theta) + \epsilon, \quad \epsilon \sim \mathcal{N}(\mu, \sigma)$

where $f(\theta)$ is the latent "true" quality function over recipe configurations, and $\epsilon$ represents the noise inherent in subjective judgment. In practice, this means that repeated evaluations of the same configuration may yield different scores, and the mapping from configuration to quality is inaccessible to closed-form gradient-based optimization.

A plausible implication is that optimizing soup recipes requires frameworks that handle noise, non-analytic objectives, and costly evaluations.
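This evaluation model can be sketched in a few lines, assuming a made-up latent quality function (`latent_quality` below stands in for the unknown $f(\theta)$) and Gaussian rating noise; the shape of the latent function and the noise level are purely illustrative.

```python
import random

def latent_quality(theta):
    """Hypothetical stand-in for the latent f(θ); unknown in practice."""
    score = 8.0
    score -= 0.1 * abs(theta["boiling_time"] - 25.0)    # prefer ~25 minutes
    score -= 0.01 * abs(theta["carrot_grams"] - 150.0)  # prefer ~150 grams
    return max(0.0, min(10.0, score))

def rate(theta, sigma=1.0, rng=random):
    """One observed rating y(θ) = f(θ) + ε with Gaussian noise ε."""
    y = latent_quality(theta) + rng.gauss(0.0, sigma)
    return max(0.0, min(10.0, y))  # clip to the [0, 10] rating scale
```

Calling `rate` repeatedly on the same configuration returns different scores, mirroring the noisy-evaluation setting that motivates the Bayesian approach below.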

3. Simulating and Learning from Recipe Data

Given the high cost of physical recipe evaluations, construction of an adequate dataset involves both real cooked trials and simulated data:

  • Real data $D_{\mathrm{real}}$: Recipes cooked and rated by experts.
  • Simulated data $D_{\mathrm{sim}}$: Synthetic ratings generated from expert-encoded probability distributions (e.g., Gamma or Gaussian), capturing expectations such as "higher oven temperature slightly improves broth clarity".

An ML model $m$ is fit to the combined data $D = D_{\mathrm{real}} \cup D_{\mathrm{sim}}$. The recommended practice is to use support vector regression (SVR), with hyperparameters tuned via grid search and 10-fold cross-validation to minimize mean squared error (MSE). This surrogate model $m(\theta)$ serves to predict recipe quality for arbitrary configurations without the need for physical trials.

In cited experiments, optimal SVR hyperparameters were $C = 1$, $\gamma = 0.01$, with prediction errors between 2.56 and 2.6 points (on a $[0, 10]$ scale), indicating reasonable performance given the subjective noise present in evaluations (Garrido-Merchán et al., 2018).
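The surrogate-fitting step can be sketched with scikit-learn's `SVR` and `GridSearchCV` on a synthetic dataset. The feature encoding (two numeric parameters), the data-generating function, and the grid values are all illustrative, and 5-fold CV is used here to keep the toy run fast (the cited work uses 10-fold).

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

# Hypothetical encoded dataset: each row is a configuration θ
# (carrot grams, boiling minutes); targets are noisy 0-10 ratings.
rng = np.random.default_rng(0)
X = rng.uniform([50.0, 10.0], [300.0, 40.0], size=(60, 2))
y = 8.0 - 0.1 * np.abs(X[:, 1] - 25.0) - 0.01 * np.abs(X[:, 0] - 150.0)
y = y + rng.normal(0.0, 0.5, size=60)   # subjective rating noise

# Grid search over (C, gamma), minimizing MSE via cross-validation.
search = GridSearchCV(
    SVR(kernel="rbf"),
    {"C": [0.1, 1, 10], "gamma": [0.001, 0.01, 0.1]},
    scoring="neg_mean_squared_error",
    cv=5,
)
search.fit(X, y)
surrogate = search.best_estimator_                  # m(θ)
predicted = surrogate.predict([[150.0, 25.0]])[0]   # quality without cooking
```

Once fitted, `surrogate.predict` plays the role of $m(\theta)$: a cheap proxy for a physical trial that the optimization loop below can query freely.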

4. Bayesian Optimization for Recipe Suggestion

Bayesian Optimization (BO) is deployed for systematic exploration of the configuration space $\Theta$ to identify optimal soup recipes under subjective, noisy, and costly evaluations. The key components of this framework are:

  • Surrogate Model: A Gaussian Process (GP) provides mean $\mu$ and covariance function $k(\cdot, \cdot)$ over observations:

$f(x) \sim GP(\mu, K)$

Predictive distribution formulas:

$\mu = k_*^T (K + \sigma_n^2 I)^{-1} y$

$\sigma^2 = k(\theta_t, \theta_t) - k_*^T (K + \sigma_n^2 I)^{-1} k_*$

where $y$ are the observed ratings, $k_*$ is the covariance vector for candidate $\theta_t$, $K$ is the covariance matrix, and $\sigma_n^2$ is the evaluation noise.

  • Acquisition Function: Measures utility of evaluating a candidate. Criteria such as expected improvement and Predictive Entropy Search (PES) balance sampling between high-quality predicted regions (exploitation) and uncertain areas (exploration).

At each iteration, BO selects the configuration maximizing the acquisition function:

$x^* = \arg\max_x \alpha(x)$

For soup recipes, physical evaluation is amortized by using the ML model $m(\theta)$, reducing experimental cost. Empirical results show BO (with Matérn kernel and PES averaging over 10 GP samples) rapidly identifies high-quality configurations, outperforming random search and baseline expert strategies. Repeated replications converge to similar "most voted" parameter values, forming robust prototype recipes.
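The loop can be sketched end to end for a single parameter (boiling time), using a hand-rolled GP posterior that implements the predictive-mean and predictive-variance formulas, and an upper-confidence-bound acquisition as a simple stand-in for EI/PES. The rating function, kernel length scale, and all constants are hypothetical, not from the cited work.

```python
import numpy as np

def rbf(a, b, length=5.0):
    """Squared-exponential covariance between 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

def posterior(X, y, cand, noise=0.1, length=5.0):
    """GP predictive mean and std at candidate points (prior mean = ȳ)."""
    mu0 = y.mean()
    K = rbf(X, X, length) + noise * np.eye(len(X))    # K + σ_n² I
    Ks = rbf(X, cand, length)                         # columns are k_*
    mean = mu0 + Ks.T @ np.linalg.solve(K, y - mu0)   # k_*ᵀ (K + σ_n²I)⁻¹ y
    var = 1.0 - np.einsum("ij,ij->j", Ks, np.linalg.solve(K, Ks))
    return mean, np.sqrt(np.maximum(var, 1e-12))

def rate(t):
    """Stand-in for a costly, noisy rating of boiling time t (hypothetical)."""
    noise = np.random.default_rng(int(t * 100)).normal(0.0, 0.3)
    return 8.0 - 0.1 * abs(t - 25.0) + noise

rng = np.random.default_rng(1)
X = np.array([12.0, 35.0])               # two initial cooked trials
y = np.array([rate(t) for t in X])
for _ in range(8):                       # BO loop over boiling time ∈ [10, 40]
    cand = rng.uniform(10.0, 40.0, 200)
    mean, std = posterior(X, y, cand)
    alpha = mean + 1.5 * std             # UCB acquisition (EI/PES: alternatives)
    t_next = cand[np.argmax(alpha)]      # x* = argmax_x α(x)
    X = np.append(X, t_next)
    y = np.append(y, rate(t_next))
best_time = X[np.argmax(y)]
```

The UCB rule makes the exploration/exploitation trade-off explicit: high posterior mean rewards exploitation, high posterior standard deviation rewards exploration of unsampled regions.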

5. Graphical Formalism for Recipe Representation and Reasoning

Recipes, including soup recipes, can be represented as labelled bipartite graphs $(C, A, E)$:

  • $C$: nodes for comestibles (ingredients, intermediates, final products)
  • $A$: nodes for actions (chop, boil, mix, simmer, etc.)
  • $E$: edges linking comestibles to actions (inputs) and actions to comestibles (outputs)

Formal conditions guarantee well-formedness: connectedness, acyclicity, every action having both incoming and outgoing edges, and each comestible produced at most once (Bikakis et al., 2023). A typing function annotates each node with types from a hierarchy (e.g., "boiling" vs "boil for 20 minutes"). This encoding enables systematic comparison, composition, and reasoning over recipes.

Definition (verbatim as provided): \begin{definition} A {\bf recipe graph} is a tuple $(C, A, E)$ where: (1) $\emptyset \subset C \subseteq {\cal C}$ and $\emptyset \subset A \subseteq {\cal A}$; (2) $E \subseteq (C \times A) \cup (A \times C)$; (3) $(C \cup A, E)$ is connected and acyclic; (4) Every $a \in A$ has an incoming edge from some $c \in C$ and an outgoing edge to some $c' \in C$; (5) For every comestible node $c$, if it has two incoming edges $(a, c)$ and $(a', c)$, then $a = a'$. \end{definition}
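This definition translates almost directly into code. Below is a minimal encoding with an illustrative carrot-soup graph (the node names are invented for the example) and a checker for the five well-formedness conditions, reading acyclicity as directed acyclicity plus weak connectedness.

```python
# Illustrative recipe graph (C, A, E) for a simple carrot soup.
C = {"water", "carrot", "broth", "chopped_carrot", "soup"}
A = {"boil", "chop", "simmer"}
E = {
    ("water", "boil"), ("boil", "broth"),
    ("carrot", "chop"), ("chop", "chopped_carrot"),
    ("broth", "simmer"), ("chopped_carrot", "simmer"), ("simmer", "soup"),
}

def is_recipe_graph(C, A, E):
    """Check conditions (1)-(5) of the recipe-graph definition."""
    # (1) non-empty comestible and action sets
    if not C or not A:
        return False
    # (2) bipartite: edges run comestible -> action or action -> comestible
    if any(not ((u in C and v in A) or (u in A and v in C)) for u, v in E):
        return False
    nodes = C | A
    # (3a) weakly connected: all nodes reachable ignoring edge direction
    adj = {n: set() for n in nodes}
    for u, v in E:
        adj[u].add(v)
        adj[v].add(u)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(adj[n])
    if seen != nodes:
        return False
    # (3b) acyclic: Kahn's algorithm must consume every node
    indeg = {n: 0 for n in nodes}
    for _, v in E:
        indeg[v] += 1
    queue = [n for n in nodes if indeg[n] == 0]
    done = 0
    while queue:
        n = queue.pop()
        done += 1
        for u, v in E:
            if u == n:
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)
    if done != len(nodes):
        return False
    # (4) every action has an input and an output comestible
    if any(not any(v == a for _, v in E) or not any(u == a for u, _ in E)
           for a in A):
        return False
    # (5) each comestible is produced by at most one action
    return all(sum(1 for _, v in E if v == c) <= 1 for c in C)
```

A typing function (node to type label) would be layered on top of this structure; it is omitted here for brevity.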

6. Structured Recipe Comparison, Composition, and Substitution

Rigorous definitions for comparing soup recipes include:

  • Isomorphism: Graph bijection that preserves structure.
  • Subrecipe/Equivalence: Subsetting nodes/arcs and consistent labels. Equivalence $R_1 \equiv R_2$ iff graphs are isomorphic and labels agree.

Composition merges atomic recipes (single action node subgraphs) into complete recipes, provided output/input nodes align in type and there is no node label conflict. Decomposition extracts atomic steps, enabling structural analysis and modular recipe engineering.

Substitution operates at:

  • Type Level: Relabeling nodes to alternate types via $F \otimes T$, with $T$ as a set of bindings. For instance, substituting "butter" for "olive oil" or modifying a broth component.
  • Structural Level: Replacing subgraphs subject to interface alignment and parallel connectivity, denoted $R[R_1/R_2]$.

A cost function $d(t_1, t_2)$ (e.g., word-embedding distance or an ontology-based measure) quantifies the impact of a substitution, guiding minimal-change substitutions. The primary/secondary distinction separates substitutions forced by necessity (an ingredient is absent) from restorations (adaptations of the process).
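Type-level substitution $F \otimes T$ can be sketched as relabeling a typing function under a set of bindings; the labels and bindings below are illustrative.

```python
def substitute_types(labels, bindings):
    """Apply type-level substitution: relabel each node whose type appears
    in the bindings (old type -> new type), leaving other nodes unchanged."""
    return {node: bindings.get(t, t) for node, t in labels.items()}

# Illustrative typing function over a small recipe graph.
labels = {"c1": "olive oil", "c2": "carrot", "a1": "boil"}

# Substitute butter for olive oil; carrot and boil are untouched.
new_labels = substitute_types(labels, {"olive oil": "butter"})
```

A cost function over types would then score each candidate binding set, so that the minimal-change substitution can be selected among alternatives.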

7. Practical Application and Extension to Soup Recipes

The described frameworks are generic and readily extendible to soup recipes. Implementation follows:

  • Enumeration of all applicable parameters in $\Theta$ (ingredients, timings, process steps, seasoning, etc.).
  • Acquisition of a foundational dataset via expert ratings or controlled syntheses.
  • Encoding expert culinary understanding as probabilistic expectations to augment sparse data.
  • Learning a predictive model (SVR) for recipe quality, followed by iterative suggestion of candidates through BO.
  • Representation of resultant recipes as bipartite graphs, supporting formal comparison, modular composition, and substitution.

Examples provided for pasta and salad demonstrate generality; a plausible implication is that these methods enable structured innovation, adaptation to constraints (e.g., ingredient unavailability), and systematic optimization in the soup domain.

Summary

The synthesis of Bayesian optimization over learned surrogates (Garrido-Merchán et al., 2018) and formal graph representation (Bikakis et al., 2023) yields a comprehensive methodology for producing and reasoning about objective soup recipes. The approach enables principled recipe suggestion and adaptation under subjective evaluation and resource constraints, and supports rigorous comparison and modular recombination at multiple granularities. This provides a foundation for both computational and human-assisted culinary innovation, leveraging formalism to navigate the high-dimensional and noisy landscape of real-world cooking.
