<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://optimization.cbe.cornell.edu/index.php?action=history&amp;feed=atom&amp;title=Bayesian_optimization</id>
	<title>Bayesian optimization - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://optimization.cbe.cornell.edu/index.php?action=history&amp;feed=atom&amp;title=Bayesian_optimization"/>
	<link rel="alternate" type="text/html" href="https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;action=history"/>
	<updated>2026-05-02T10:51:01Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.0</generator>
	<entry>
		<id>https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=7680&amp;oldid=prev</id>
		<title>Fall2024 Team4 Wiki: Minor changes to punctuation around reference numbers and removing a couple of unwanted symbols</title>
		<link rel="alternate" type="text/html" href="https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=7680&amp;oldid=prev"/>
		<updated>2024-12-16T00:52:09Z</updated>

		<summary type="html">&lt;p&gt;Minor changes to punctuation around reference numbers and removing a couple of unwanted symbols&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 20:52, 15 December 2024&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l4&quot;&gt;Line 4:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 4:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Introduction ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Introduction ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Bayesian Optimization (BO) is a probabilistic model-based method for the optimization of objective functions that are computationally expensive and prevent thorough evaluation. It has found its place in optimization by handling objectives that are costly to compute and function as “black boxes” without an efficient mechanism to estimate their gradient&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;. &lt;/del&gt;[1]  &lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Bayesian Optimization (BO) is a probabilistic model-based method for the optimization of objective functions that are computationally expensive and prevent thorough evaluation. It has found its place in optimization by handling objectives that are costly to compute and function as “black boxes” without an efficient mechanism to estimate their gradient [1]&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;. &lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The origins of BO trace back to the 1960s, with foundational work from H.J. Kushner and his introduction of a GP Model for noisy, one-dimensional optimization and leveraging Bayesian decision theory to derive optimization policies. V.R. Šaltenis expanded these ideas in the 1970s by introducing the Expected Improvement (EI) acquisition function and demonstrating its effectiveness through empirical studies on high-dimensional problems. J. Mockus further developed the field by endorsing the one-step lookahead approach with the knowledge gradient, emphasizing its practical use in resource-constrained settings where it aligned with EI&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;. &lt;/del&gt;[1]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The origins of BO trace back to the 1960s, with foundational work from H.J. Kushner and his introduction of a GP Model for noisy, one-dimensional optimization and leveraging Bayesian decision theory to derive optimization policies. V.R. Šaltenis expanded these ideas in the 1970s by introducing the Expected Improvement (EI) acquisition function and demonstrating its effectiveness through empirical studies on high-dimensional problems. J. Mockus further developed the field by endorsing the one-step lookahead approach with the knowledge gradient, emphasizing its practical use in resource-constrained settings where it aligned with EI [1]&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;BO has experienced a modern resurgence driven by advancements in machine learning, computational power, and the increasing demand to optimize complex systems across fields like artificial intelligence, robotics, and materials discovery, particularly excelling in scenarios with costly function evaluations, such as hyperparameter tuning, engineering design, and experimental sciences.  &lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;BO has experienced a modern resurgence driven by advancements in machine learning, computational power, and the increasing demand to optimize complex systems across fields like artificial intelligence, robotics, and materials discovery, particularly excelling in scenarios with costly function evaluations, such as hyperparameter tuning, engineering design, and experimental sciences.  &lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The significance of Bayesian Optimization lies in its ability to solve problems where traditional optimization techniques fail due to limited resources or the black-box nature of the objective function. For example, tuning the hyperparameters of a neural network involves computationally intensive evaluations, often requiring substantial time and resources. BO provides a systematic and efficient framework to minimize the number of evaluations while still identifying near-optimal solutions&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;. &lt;/del&gt;[1]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The significance of Bayesian Optimization lies in its ability to solve problems where traditional optimization techniques fail due to limited resources or the black-box nature of the objective function. For example, tuning the hyperparameters of a neural network involves computationally intensive evaluations, often requiring substantial time and resources. BO provides a systematic and efficient framework to minimize the number of evaluations while still identifying near-optimal solutions [1]&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Algorithm Discussion ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Algorithm Discussion ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l62&quot;&gt;Line 62:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 62:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;At a new point &amp;#039;&amp;#039;x*&amp;#039;&amp;#039;, given the observations, a posterior probability distribution can be derived. By Bayes’ rule, it can be shown that [4]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;At a new point &amp;#039;&amp;#039;x*&amp;#039;&amp;#039;, given the observations, a posterior probability distribution can be derived. By Bayes’ rule, it can be shown that [4]&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;math display=&quot;block&quot;&amp;gt;f(x^*) \mid f(x_1,\ldots,x_k) \sim N(\mu_k(x^*),\sigma_k^2(x^*)), where   &amp;lt;/math&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;math display=&quot;block&quot;&amp;gt;f(x^*) \mid f(x_1,\ldots,x_k) \sim N(\mu_k(x^*),\sigma_k^2(x^*)), where   &amp;lt;/math&amp;gt;&amp;lt;math display=&quot;block&quot;&amp;gt;\mu_k(x^*) = \Sigma_0(x^{*},\,x_1,\ldots,x_k) \Sigma_0(x_1,\ldots,x_k, x_1,\,\ldots,x_k)^{-1}(f(x_1,\ldots,x_k) - \mu_0(x_1,\ldots,x_k)) + \mu_0(x^{*})    &amp;lt;/math&amp;gt;&amp;lt;math display=&quot;block&quot;&amp;gt;\sigma_k^2(x^*) = \Sigma_0(x^{*},\,x^{*})-\Sigma_0(x^*, x_1,\,\ldots,x_k)\Sigma_0(x_1,\ldots,x_k,\,x_1,\ldots,x_k)^{-1} \Sigma_0(x_1,\,\ldots,x_k,x^*)   &amp;lt;/math&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt; &lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-added&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;math display=&quot;block&quot;&amp;gt;\mu_k(x^*) = \Sigma_0(x^{*},\,x_1,\ldots,x_k) \Sigma_0(x_1,\ldots,x_k, x_1,\,\ldots,x_k)^{-1}(f(x_1,\ldots,x_k) - \mu_0(x_1,\ldots,x_k)) + \mu_0(x^{*})    &amp;lt;/math&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-added&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt; &lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-added&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;lt;math display=&quot;block&quot;&amp;gt;\sigma_k^2(x^*) = \Sigma_0(x^{*},\,x^{*})-\Sigma_0(x^*, x_1,\,\ldots,x_k)\Sigma_0(x_1,\ldots,x_k,\,x_1,\ldots,x_k)^{-1} \Sigma_0(x_1,\,\ldots,x_k,x^*)   &amp;lt;/math&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-added&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;For the mean and the covariance function, &amp;lt;math&amp;gt;\mu_0&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;For the mean and the covariance function, &amp;lt;math&amp;gt;\mu_0&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l137&quot;&gt;Line 137:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 133:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Application ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Application ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;BO has become vital to optimization of functions in a variety of fields. It is the most useful in scenarios where each iteration of the evaluation is costly in terms of time, resources, or computation. The most prominent application of BO comes in tuning hyperparameters of machine learning models. The problem lies in the high-dimensional and non-convex nature of hyperparameters such as learning rates and regularization coefficients, most commonly. Their values can drastically change the model, so it is important to optimize them appropriately. Snoek et al. (2012) demonstrated the effectiveness of BO in tuning neural network hyperparameters that resulted in state-of-the-art performance on benchmark datasets. Several hyperparameters were targeted including learning rate, momentum, weight decay, number of layers, number of units per layer, and other parameters specific to the architecture of the models. GPs were used to model the validation error as a function of the hyperparameters selected, and the EI criterion was used to select the next set of hyperparameters. BO found hyperparameter combinations that gave lower validation errors compared to traditional tuning techniques and had fewer function evaluations, reducing time complexity&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;. 
&lt;/del&gt;[8]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;BO has become vital to optimization of functions in a variety of fields. It is the most useful in scenarios where each iteration of the evaluation is costly in terms of time, resources, or computation. The most prominent application of BO comes in tuning hyperparameters of machine learning models. The problem lies in the high-dimensional and non-convex nature of hyperparameters such as learning rates and regularization coefficients, most commonly. Their values can drastically change the model, so it is important to optimize them appropriately. Snoek et al. (2012) demonstrated the effectiveness of BO in tuning neural network hyperparameters that resulted in state-of-the-art performance on benchmark datasets. Several hyperparameters were targeted including learning rate, momentum, weight decay, number of layers, number of units per layer, and other parameters specific to the architecture of the models. GPs were used to model the validation error as a function of the hyperparameters selected, and the EI criterion was used to select the next set of hyperparameters. BO found hyperparameter combinations that gave lower validation errors compared to traditional tuning techniques and had fewer function evaluations, reducing time complexity [8]&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Another major area of application is the optimization of engineering design where BO aids in optimizing design parameters that would otherwise require costly simulations or physical experiments. This is applicable to all forms of engineering where design is crucial such as aerospace, automotive design, and structural engineering. According to Edgar et al. (2001), optimization plays a crucial role in enhancing the efficiency and safety of chemical processes by systematically adjusting design parameters. Edgar et al. (2001) present a case study on the optimization of distillation column design. The basic design parameters are the number of stages, reflux ratio, feed stage location, and column diameter. The objective function is the total annual cost, and while the complex and nonlinear relationship between the parameters and the function poses a challenge, BO is well-suited to handle such a problem. By fitting a GP to initial design points, and selecting an appropriate acquisition function, optimal hyperparameters can be determined through iteration&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;. &lt;/del&gt;[9]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Another major area of application is the optimization of engineering design where BO aids in optimizing design parameters that would otherwise require costly simulations or physical experiments. This is applicable to all forms of engineering where design is crucial such as aerospace, automotive design, and structural engineering. According to Edgar et al. (2001), optimization plays a crucial role in enhancing the efficiency and safety of chemical processes by systematically adjusting design parameters. Edgar et al. (2001) present a case study on the optimization of distillation column design. The basic design parameters are the number of stages, reflux ratio, feed stage location, and column diameter. The objective function is the total annual cost, and while the complex and nonlinear relationship between the parameters and the function poses a challenge, BO is well-suited to handle such a problem. By fitting a GP to initial design points, and selecting an appropriate acquisition function, optimal hyperparameters can be determined through iteration [9]&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Similarly, in fields like chemistry and material science, experiments can be costly and time consuming. In this case, Bayesian Optimization can be utilized by selecting apt experimental conditions to conduct experiments under which are analogous to selecting hyperparameters in machine learning models. For instance, Gomez-Bombarelli et al (2018) proposed using BO to design novel organic molecules with specific electronic properties by altering and optimizing molecular structures. In this case, the molecular structure, the functional groups, and the bond angles and lengths were chosen as the parameters. Since electronic properties are nonlinearly dependent on molecular structures, and quantum mechanical calculations are computationally intensive, BO is appropriate. By defining a particular electronic property as the objective function, initializing a surrogate model, and selecting an acquisition function, the parameters can be found through iterative optimization&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;. &lt;/del&gt;[10]&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Similarly, in fields like chemistry and material science, experiments can be costly and time consuming. In this case, Bayesian Optimization can be utilized by selecting apt experimental conditions to conduct experiments under which are analogous to selecting hyperparameters in machine learning models. For instance, Gomez-Bombarelli et al (2018) proposed using BO to design novel organic molecules with specific electronic properties by altering and optimizing molecular structures. In this case, the molecular structure, the functional groups, and the bond angles and lengths were chosen as the parameters. Since electronic properties are nonlinearly dependent on molecular structures, and quantum mechanical calculations are computationally intensive, BO is appropriate. By defining a particular electronic property as the objective function, initializing a surrogate model, and selecting an acquisition function, the parameters can be found through iterative optimization [10]&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Conclusion ==&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Conclusion ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Fall2024 Team4 Wiki</name></author>
	</entry>
	<entry>
		<id>https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=7660&amp;oldid=prev</id>
		<title>Fall2024 Team4 Wiki: Made some grammar changes and renumbered the references</title>
		<link rel="alternate" type="text/html" href="https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=7660&amp;oldid=prev"/>
		<updated>2024-12-15T23:12:51Z</updated>

		<summary type="html">&lt;p&gt;Made some grammar changes and renumbered the references&lt;/p&gt;
&lt;a href=&quot;https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;amp;diff=7660&amp;amp;oldid=7542&quot;&gt;Show changes&lt;/a&gt;</summary>
		<author><name>Fall2024 Team4 Wiki</name></author>
	</entry>
	<entry>
		<id>https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=7542&amp;oldid=prev</id>
		<title>Fall2024 Team4 Wiki: Inserted text from the initial deadline into the Bayesian Optimization Wiki. This text includes improvements suggested by the TAs.</title>
		<link rel="alternate" type="text/html" href="https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=7542&amp;oldid=prev"/>
		<updated>2024-12-15T16:40:17Z</updated>

		<summary type="html">&lt;p&gt;Inserted text from the initial deadline into the Bayesian Optimization Wiki. This text includes improvements suggested by the TAs.&lt;/p&gt;
&lt;a href=&quot;https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;amp;diff=7542&amp;amp;oldid=6834&quot;&gt;Show changes&lt;/a&gt;</summary>
		<author><name>Fall2024 Team4 Wiki</name></author>
	</entry>
	<entry>
		<id>https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=6834&amp;oldid=prev</id>
		<title>SYSEN5800TAs at 18:35, 10 December 2024</title>
		<link rel="alternate" type="text/html" href="https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=6834&amp;oldid=prev"/>
		<updated>2024-12-10T18:35:00Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 14:35, 10 December 2024&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Author: Rithvik Duggireddy, Brad Edgerton, Sochetra Sor, Jerry Feng, James Chen (ChemE 6800 Fall 2024)&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Author: Rithvik Duggireddy &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;(rd532)&lt;/ins&gt;, Brad Edgerton &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;(bme44)&lt;/ins&gt;, Sochetra Sor &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;(ss3437)&lt;/ins&gt;, Jerry Feng &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;(zf245)&lt;/ins&gt;, James Chen &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;(jzc28) &lt;/ins&gt;(ChemE 6800 Fall 2024)&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;br&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Stewards: Nathan Preuss, Wei-Han Chen, Tianqi Xiao, Guoqing Hu&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Stewards: Nathan Preuss, Wei-Han Chen, Tianqi Xiao, Guoqing Hu&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>SYSEN5800TAs</name></author>
	</entry>
	<entry>
		<id>https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=6823&amp;oldid=prev</id>
		<title>SYSEN5800TAs at 18:24, 10 December 2024</title>
		<link rel="alternate" type="text/html" href="https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=6823&amp;oldid=prev"/>
		<updated>2024-12-10T18:24:01Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 14:24, 10 December 2024&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Fall 2024 &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;new page creation&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Author: Rithvik Duggireddy, Brad Edgerton, Sochetra Sor, Jerry Feng, James Chen (ChemE 6800 &lt;/ins&gt;Fall 2024&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;)&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt; &lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Stewards: Nathan Preuss, Wei-Han Chen, Tianqi Xiao, Guoqing Hu&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>SYSEN5800TAs</name></author>
	</entry>
	<entry>
		<id>https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=6737&amp;oldid=prev</id>
		<title>Youfengqi: Created page with &quot;Fall 2024 new page creation&quot;</title>
		<link rel="alternate" type="text/html" href="https://optimization.cbe.cornell.edu/index.php?title=Bayesian_optimization&amp;diff=6737&amp;oldid=prev"/>
		<updated>2024-12-04T20:46:34Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;Fall 2024 new page creation&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;Fall 2024 new page creation&lt;/div&gt;</summary>
		<author><name>Youfengqi</name></author>
	</entry>
</feed>