
Journal articles on the topic 'Bayesian Optimization'


Consult the top 50 journal articles for your research on the topic 'Bayesian Optimization.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Nguyen, Thanh Dai, Sunil Gupta, Santu Rana, and Svetha Venkatesh. "Stable Bayesian optimization." International Journal of Data Science and Analytics 6, no. 4 (April 9, 2018): 327–39. http://dx.doi.org/10.1007/s41060-018-0119-9.

2

Ahmed, Mohamed Osama, Sharan Vaswani, and Mark Schmidt. "Combining Bayesian optimization and Lipschitz optimization." Machine Learning 109, no. 1 (January 2020): 79–102. http://dx.doi.org/10.1007/s10994-019-05833-y.

3

Motoyama, Yuichi, Ryo Tamura, Kazuyoshi Yoshimi, Kei Terayama, Tsuyoshi Ueno, and Koji Tsuda. "Bayesian optimization package: PHYSBO." Computer Physics Communications 278 (September 2022): 108405. http://dx.doi.org/10.1016/j.cpc.2022.108405.

4

Ewerhart, Christian. "Bayesian optimization and genericity." Operations Research Letters 21, no. 5 (January 1997): 243–48. http://dx.doi.org/10.1016/s0167-6377(97)00050-3.

5

Cochran, James J., Martin S. Levy, and Jeffrey D. Camm. "Bayesian coverage optimization models." Journal of Combinatorial Optimization 19, no. 2 (June 29, 2008): 158–73. http://dx.doi.org/10.1007/s10878-008-9172-y.

6

Shapiro, Alexander, Enlu Zhou, and Yifan Lin. "Bayesian Distributionally Robust Optimization." SIAM Journal on Optimization 33, no. 2 (June 26, 2023): 1279–304. http://dx.doi.org/10.1137/21m1465548.

7

Klepac, Goran. "Particle Swarm Optimization Algorithm as a Tool for Profile Optimization." International Journal of Natural Computing Research 5, no. 4 (October 2015): 1–23. http://dx.doi.org/10.4018/ijncr.2015100101.

Abstract:
A complex analytical environment is a challenging setting for finding customer profiles. Where a predictive model such as a Bayesian network exists, the challenge becomes even greater because of combinatorial explosion. Complexity can arise from the multiple modalities of the output variable, from the fact that each node of a Bayesian network can potentially be the target variable for profiling, and from big data environments, where sheer data quantity adds complexity. To illustrate the presented concept, a particle swarm optimization algorithm is used as a tool to find profiles from a developed Bayesian network predictive model. This paper shows how the particle swarm optimization algorithm can be a powerful tool for finding optimal customer profiles given target conditions as evidence within Bayesian networks.
8

Hickish, Bob, David I. Fletcher, and Robert F. Harrison. "Investigating Bayesian Optimization for rail network optimization." International Journal of Rail Transportation 8, no. 4 (October 14, 2019): 307–23. http://dx.doi.org/10.1080/23248378.2019.1669500.

9

Dogan, Vedat, and Steven Prestwich. "Multi-Objective BiLevel Optimization by Bayesian Optimization." Algorithms 17, no. 4 (March 30, 2024): 146. http://dx.doi.org/10.3390/a17040146.

Abstract:
In a multi-objective optimization problem, a decision maker has more than one objective to optimize. In a bilevel optimization problem, there are the following two decision-makers in a hierarchy: a leader who makes the first decision and a follower who reacts, each aiming to optimize their own objective. Many real-world decision-making processes have various objectives to optimize at the same time while considering how the decision-makers affect each other. When both features are combined, we have a multi-objective bilevel optimization problem, which arises in manufacturing, logistics, environmental economics, defence applications and many other areas. Many exact and approximation-based techniques have been proposed, but because of the intrinsic nonconvexity and conflicting multiple objectives, their computational cost is high. We propose a hybrid algorithm based on batch Bayesian optimization to approximate the upper-level Pareto-optimal solution set. We also extend our approach to handle uncertainty in the leader’s objectives via a hypervolume improvement-based acquisition function. Experiments show that our algorithm is more efficient than other current methods while successfully approximating Pareto-fronts.
10

Muzayanah, Rini, Dwika Ananda Agustina Pertiwi, Muazam Ali, and Much Aziz Muslim. "Comparison of gridsearchcv and bayesian hyperparameter optimization in random forest algorithm for diabetes prediction." Journal of Soft Computing Exploration 5, no. 1 (April 2, 2024): 86–91. http://dx.doi.org/10.52465/joscex.v5i1.308.

Abstract:
Diabetes Mellitus (DM) is a chronic disease whose complications have a significant impact on patients and the wider community. In its early stages, diabetes mellitus usually does not cause significant symptoms, but if it is detected too late and not handled properly, it can cause serious health problems. Diabetes detection is one solution to this problem. In this research, diabetes detection was carried out using Random Forest with GridSearchCV and Bayesian hyperparameter optimization. The research proceeded through a literature study, model development in a Kaggle Notebook, model testing, and analysis of the results. This study aims to compare GridSearchCV and Bayesian hyperparameter optimization and to analyze the advantages and disadvantages of each when applied to diabetes prediction with the Random Forest algorithm. The research found that GridSearchCV and Bayesian hyperparameter optimization each have advantages and disadvantages. GridSearchCV achieved the higher accuracy of 0.74, although it took longer, at 338.416 seconds. Bayesian hyperparameter optimization reached an accuracy 0.01 lower than GridSearchCV, at 0.73, but took less time, at 177.085 seconds.
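To make the comparison concrete, the sketch below contrasts exhaustive grid search with Bayesian search for a Random Forest, in the spirit of this study. It assumes scikit-learn and scikit-optimize; the synthetic data and parameter ranges are illustrative stand-ins, not the paper's setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from skopt import BayesSearchCV  # scikit-optimize

# Synthetic stand-in for the diabetes data (768 rows, 8 features).
X, y = make_classification(n_samples=768, n_features=8, random_state=0)

# Exhaustive grid search: every combination is evaluated.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    {"n_estimators": [50, 100, 200], "max_depth": [4, 8, 16]},
    cv=5,
).fit(X, y)

# Bayesian search: a surrogate model proposes promising settings, so far
# fewer configurations need to be evaluated than in the full grid.
bayes = BayesSearchCV(
    RandomForestClassifier(random_state=0),
    {"n_estimators": (50, 200), "max_depth": (4, 16)},
    n_iter=15, cv=5, random_state=0,
).fit(X, y)

print("grid:  ", grid.best_score_, grid.best_params_)
print("bayes: ", bayes.best_score_, dict(bayes.best_params_))
```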
11

He, Xiang Dong, Jun Yan Huang, and Shu Tian Liu. "Bayesian Reliability-Based Optimization Design of Torsion Bar." Advanced Materials Research 538-541 (June 2012): 3085–88. http://dx.doi.org/10.4028/www.scientific.net/amr.538-541.3085.

Abstract:
Based on Bayesian statistics theory and reliability-based optimization design, this research presents a new reliability-based optimization design approach for the case of finite test samples. The Bayesian reliability-based optimization mathematical model is established, and a Bayesian reliability-based optimization approach for a torsion bar is proposed. The method adopts a Bayesian inference technique to estimate reliability and gives a definition of Bayesian reliability. The results illustrate that the presented method is an efficient and practical reliability-based optimization approach for torsion bars.
12

Fang, Yihao, Mu Niu, Pokman Cheung, and Lizhen Lin. "Extrinsic Bayesian Optimization on Manifolds." Algorithms 16, no. 2 (February 15, 2023): 117. http://dx.doi.org/10.3390/a16020117.

Abstract:
We propose an extrinsic Bayesian optimization (eBO) framework for general optimization problems on manifolds. Bayesian optimization algorithms build a surrogate of the objective function by employing Gaussian processes and utilizing the uncertainty in that surrogate by deriving an acquisition function. This acquisition function represents the probability of improvement based on the kernel of the Gaussian process, which guides the search in the optimization process. The critical challenge for designing Bayesian optimization algorithms on manifolds lies in the difficulty of constructing valid covariance kernels for Gaussian processes on general manifolds. Our approach is to employ extrinsic Gaussian processes by first embedding the manifold onto some higher dimensional Euclidean space via equivariant embeddings and then constructing a valid covariance kernel on the image manifold after the embedding. This leads to efficient and scalable algorithms for optimization over complex manifolds. Simulation study and real data analyses are carried out to demonstrate the utilities of our eBO framework by applying the eBO to various optimization problems over manifolds such as the sphere, the Grassmannian, and the manifold of positive definite matrices.
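A toy sketch of the embedding idea from this abstract, under the simplest possible assumption: for the unit sphere, the equivariant embedding into Euclidean space is just the coordinate representation, so a standard kernel applied to the embedded points already yields a valid covariance. This is an illustration, not the authors' eBO implementation.

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF

# Sample points on the unit sphere S^2, represented by their embedding in R^3.
def random_sphere_points(n, rng):
    x = rng.normal(size=(n, 3))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = random_sphere_points(5, rng)

# Apply an ordinary RBF kernel to the embedded coordinates: this is a valid
# covariance on the image manifold (general manifolds need an equivariant
# embedding first; for the sphere it is the identity on coordinates).
K = RBF(length_scale=1.0)(X)
print(np.all(np.linalg.eigvalsh(K) > -1e-10))  # positive semidefinite
```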
13

Katakami, Shun, Hirotaka Sakamoto, and Masato Okada. "Bayesian Hyperparameter Estimation using Gaussian Process and Bayesian Optimization." Journal of the Physical Society of Japan 88, no. 7 (July 15, 2019): 074001. http://dx.doi.org/10.7566/jpsj.88.074001.

14

Nguyen, Quoc Phong, Sebastian Tay, Bryan Kian Hsiang Low, and Patrick Jaillet. "Top-k Ranking Bayesian Optimization." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 10 (May 18, 2021): 9135–43. http://dx.doi.org/10.1609/aaai.v35i10.17103.

Abstract:
This paper presents a novel approach to top-k ranking Bayesian optimization (top-k ranking BO) which is a practical and significant generalization of preferential BO to handle top-k ranking and tie/indifference observations. We first design a surrogate model that is not only capable of catering to the above observations, but is also supported by a classic random utility model. Another equally important contribution is the introduction of the first information-theoretic acquisition function in BO with preferential observation called multinomial predictive entropy search (MPES) which is flexible in handling these observations and optimized for all inputs of a query jointly. MPES possesses superior performance compared with existing acquisition functions that select the inputs of a query one at a time greedily. We empirically evaluate the performance of MPES using several synthetic benchmark functions, CIFAR-10 dataset, and SUSHI preference dataset.
15

Akella, Ravi Tej, Kamyar Azizzadenesheli, Mohammad Ghavamzadeh, Animashree Anandkumar, and Yisong Yue. "Deep Bayesian Quadrature Policy Optimization." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 8 (May 18, 2021): 6600–6608. http://dx.doi.org/10.1609/aaai.v35i8.16817.

Abstract:
We study the problem of obtaining accurate policy gradient estimates using a finite number of samples. Monte-Carlo methods have been the default choice for policy gradient estimation, despite suffering from high variance in the gradient estimates. On the other hand, more sample efficient alternatives like Bayesian quadrature methods have received little attention due to their high computational complexity. In this work, we propose deep Bayesian quadrature policy gradient (DBQPG), a computationally efficient high-dimensional generalization of Bayesian quadrature, for policy gradient estimation. We show that DBQPG can substitute Monte-Carlo estimation in policy gradient methods, and demonstrate its effectiveness on a set of continuous control benchmarks. In comparison to Monte-Carlo estimation, DBQPG provides (i) more accurate gradient estimates with a significantly lower variance, (ii) a consistent improvement in the sample complexity and average return for several deep policy gradient algorithms, and, (iii) the uncertainty in gradient estimation that can be incorporated to further improve the performance.
16

Deshwal, Aryan, Syrine Belakaria, Janardhan Rao Doppa, and Dae Hyun Kim. "Bayesian Optimization over Permutation Spaces." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 6 (June 28, 2022): 6515–23. http://dx.doi.org/10.1609/aaai.v36i6.20604.

Abstract:
Optimizing expensive-to-evaluate black-box functions over an input space consisting of all permutations of d objects is an important problem with many real-world applications, such as placing functional blocks in hardware design to optimize performance via simulations. The overall goal is to minimize the number of function evaluations needed to find high-performing permutations. The key challenge in solving this problem within the Bayesian optimization (BO) framework is to trade off the complexity of the statistical model against the tractability of acquisition function optimization. In this paper, we propose and evaluate two algorithms for BO over Permutation Spaces (BOPS). First, BOPS-T employs a Gaussian process (GP) surrogate model with Kendall kernels and a tractable acquisition function optimization approach to select the sequence of permutations for evaluation. Second, BOPS-H employs a GP surrogate model with Mallows kernels and a heuristic search approach to optimize the acquisition function. We theoretically analyze the performance of BOPS-T to show that its regret grows sub-linearly. Our experiments on multiple synthetic and real-world benchmarks show that both BOPS-T and BOPS-H perform better than the state-of-the-art BO algorithm for combinatorial spaces. To drive future research on this important problem, we make new resources and real-world benchmarks available to the community.
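As a hedged illustration of the Kendall kernel that BOPS-T builds on (not the authors' code): the Kendall tau correlation between two permutations is known to be a positive-definite kernel on the symmetric group, so a GP surrogate can measure similarity between candidate permutations directly.

```python
import numpy as np
from scipy.stats import kendalltau

# Kendall kernel between two permutations: the Kendall tau correlation,
# i.e. 1 minus twice the fraction of discordant pairs.
def kendall_kernel(p, q):
    tau, _ = kendalltau(p, q)
    return tau

p = np.array([0, 1, 2, 3])
q = np.array([1, 0, 2, 3])  # one adjacent swap away from p
print(kendall_kernel(p, p), kendall_kernel(p, q))  # 1.0, then slightly less
```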
17

Deshwal, Aryan, Cory M. Simon, and Janardhan Rao Doppa. "Bayesian optimization of nanoporous materials." Molecular Systems Design & Engineering 6, no. 12 (2021): 1066–86. http://dx.doi.org/10.1039/d1me00093d.

Abstract:
In Bayesian optimization, we efficiently search for an optimal material by iterating between (i) conducting an experiment on a material, (ii) updating our knowledge, and (iii) selecting the next material for an experiment.
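The iterate-update-select loop described in this abstract can be sketched in a few lines. The following minimal example, with a toy one-dimensional "experiment" standing in for a materials measurement, uses a scikit-learn Gaussian process surrogate and an expected-improvement acquisition over a fixed candidate set; everything here is assumed for illustration, not taken from the review.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def experiment(x):
    # Toy stand-in for an expensive experiment on a "material" x.
    return -(x - 0.3) ** 2

candidates = np.linspace(0, 1, 200).reshape(-1, 1)  # candidate "materials"
X, y = [[0.0], [1.0]], [experiment(0.0), experiment(1.0)]  # initial data

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(10):
    gp.fit(np.asarray(X), np.asarray(y))                  # (ii) update knowledge
    mu, sigma = gp.predict(candidates, return_std=True)
    best = max(y)
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    x_next = float(candidates[np.argmax(ei)][0])          # (iii) select next material
    X.append([x_next])
    y.append(experiment(x_next))                          # (i) run the experiment

print("best observed value:", max(y))
```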
18

Elsas, J. H., N. A. G. Casaprima, P. H. S. Cardoso, and I. F. M. Menezes. "Bayesian optimization of riser configurations." Ocean Engineering 236 (September 2021): 109402. http://dx.doi.org/10.1016/j.oceaneng.2021.109402.

19

Toscano-Palmerin, Saul, and Peter I. Frazier. "Bayesian Optimization with Expensive Integrands." SIAM Journal on Optimization 32, no. 2 (April 18, 2022): 417–44. http://dx.doi.org/10.1137/19m1303125.

20

Sabbatella, Antonio, Francesco Archetti, Andrea Ponti, Ilaria Giordani, and Antonio Candelieri. "Bayesian Optimization for Instruction Generation." Applied Sciences 14, no. 24 (December 19, 2024): 11865. https://doi.org/10.3390/app142411865.

Abstract:
The performance of Large Language Models (LLMs) strongly depends on the selection of the best instructions for different downstream tasks, especially in the case of black-box LLMs. This study introduces BOInG (Bayesian Optimization for Instruction Generation), a method leveraging Bayesian Optimization (BO) to efficiently generate instructions while addressing the combinatorial nature of instruction search. Over the last decade, BO has emerged as a highly effective optimization method in various domains due to its flexibility and sample efficiency. At its core, BOInG employs Bayesian search in a low-dimensional continuous space, projecting solutions into a high-dimensional token embedding space to retrieve discrete tokens. These tokens act as seeds for the generation of human-readable, task-relevant instructions. Experimental results demonstrate that BOInG achieves comparable or superior performance to state-of-the-art methods, such as InstructZero and Instinct, with substantially lower resource requirements while also enabling the use of both white-box and black-box models. This approach offers both theoretical and practical benefits without requiring specialized hardware.
21

Vitsas, Nick, Iordanis Evangelou, Georgios Papaioannou, and Anastasios Gkaravelis. "Opening Design using Bayesian Optimization." Virtual Reality & Intelligent Hardware 5, no. 6 (December 2023): 550–64. http://dx.doi.org/10.1016/j.vrih.2023.06.001.

22

Arbanas, Goran, Jinghua Feng, Zia J. Clifton, Andrew M. Holcomb, Marco T. Pigni, Dorothea Wiarda, Christopher W. Chapman, Vladimir Sobes, Li Emily Liu, and Yaron Danon. "Bayesian optimization of generalized data." EPJ Nuclear Sciences & Technologies 4 (2018): 30. http://dx.doi.org/10.1051/epjn/2018038.

Abstract:
Direct application of Bayes' theorem to generalized data yields a posterior probability distribution function (PDF) that is a product of a prior PDF of generalized data and a likelihood function, where generalized data consists of model parameters, measured data, and model defect data. The prior PDF of generalized data is defined by prior expectation values and a prior covariance matrix of generalized data that naturally includes covariance between any two components of generalized data. A set of constraints imposed on the posterior expectation values and covariances of generalized data via a given model is formally solved by the method of Lagrange multipliers. Posterior expectation values of the constraints and their covariance matrix are conventionally set to zero, leading to a likelihood function that is a Dirac delta function of the constraining equation. It is shown that setting constraints to values other than zero is analogous to introducing a model defect. Since posterior expectation values of any function of generalized data are integrals of that function over all generalized data weighted by the posterior PDF, all elements of generalized data may be viewed as nuisance parameters marginalized by this integration. One simple form of posterior PDF is obtained when the prior PDF and the likelihood function are normal PDFs. For linear models without a defect this PDF becomes equivalent to constrained least squares (CLS) method, that is, the χ2 minimization method.
23

Venkatesh, Prasana K., Morrel H. Cohen, Robert W. Carr, and Anthony M. Dean. "Bayesian method for global optimization." Physical Review E 55, no. 5 (May 1, 1997): 6219–32. http://dx.doi.org/10.1103/physreve.55.6219.

24

Schultz, Laura, and Vadim Sokolov. "Bayesian Optimization for Transportation Simulators." Procedia Computer Science 130 (2018): 973–78. http://dx.doi.org/10.1016/j.procs.2018.04.098.

25

Shi, Zhuanghua, Russell M. Church, and Warren H. Meck. "Bayesian optimization of time perception." Trends in Cognitive Sciences 17, no. 11 (November 2013): 556–64. http://dx.doi.org/10.1016/j.tics.2013.09.009.

26

Archetti, F., A. Gaivoronski, and F. Stella. "Stochastic optimization on Bayesian nets." European Journal of Operational Research 101, no. 2 (September 1997): 360–73. http://dx.doi.org/10.1016/s0377-2217(96)00403-1.

27

Galuzio, Paulo Paneque, Emerson Hochsteiner de Vasconcelos Segundo, Leandro dos Santos Coelho, and Viviana Cocco Mariani. "MOBOpt — multi-objective Bayesian optimization." SoftwareX 12 (July 2020): 100520. http://dx.doi.org/10.1016/j.softx.2020.100520.

28

Jiménez, José, and Josep Ginebra. "pyGPGO: Bayesian Optimization for Python." Journal of Open Source Software 2, no. 19 (November 2, 2017): 431. http://dx.doi.org/10.21105/joss.00431.

29

Betrò, Bruno. "Bayesian methods in global optimization." Journal of Global Optimization 1, no. 1 (1991): 1–14. http://dx.doi.org/10.1007/bf00120661.

30

Weissteiner, Jakob, Jakob Heiss, Julien Siems, and Sven Seuken. "Bayesian Optimization-Based Combinatorial Assignment." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 5 (June 26, 2023): 5858–66. http://dx.doi.org/10.1609/aaai.v37i5.25726.

Abstract:
We study the combinatorial assignment domain, which includes combinatorial auctions and course allocation. The main challenge in this domain is that the bundle space grows exponentially in the number of items. To address this, several papers have recently proposed machine learning-based preference elicitation algorithms that aim to elicit only the most important information from agents. However, the main shortcoming of this prior work is that it does not model a mechanism's uncertainty over values for not yet elicited bundles. In this paper, we address this shortcoming by presenting a Bayesian optimization-based combinatorial assignment (BOCA) mechanism. Our key technical contribution is to integrate a method for capturing model uncertainty into an iterative combinatorial auction mechanism. Concretely, we design a new method for estimating an upper uncertainty bound that can be used to define an acquisition function to determine the next query to the agents. This enables the mechanism to properly explore (and not just exploit) the bundle space during its preference elicitation phase. We run computational experiments in several spectrum auction domains to evaluate BOCA's performance. Our results show that BOCA achieves higher allocative efficiency than state-of-the-art approaches.
31

Hoffer, J. G., S. Ranftl, and B. C. Geiger. "Robust Bayesian target value optimization." Computers & Industrial Engineering 180 (June 2023): 109279. http://dx.doi.org/10.1016/j.cie.2023.109279.

32

Guo, Jeff, Bojana Ranković, and Philippe Schwaller. "Bayesian Optimization for Chemical Reactions." CHIMIA 77, no. 1/2 (February 22, 2023): 31. http://dx.doi.org/10.2533/chimia.2023.31.

Abstract:
Reaction optimization is challenging and traditionally delegated to domain experts who iteratively propose increasingly optimal experiments. Problematically, the reaction landscape is complex and often requires hundreds of experiments to reach convergence, representing an enormous resource sink. Bayesian optimization (BO) is an optimization algorithm that recommends the next experiment based on previous observations and has recently gained considerable interest in the general chemistry community. The application of BO for chemical reactions has been demonstrated to increase efficiency in optimization campaigns and can recommend favorable reaction conditions amidst many possibilities. Moreover, its ability to jointly optimize desired objectives such as yield and stereoselectivity makes it an attractive alternative or at least complementary to domain expert-guided optimization. With the democratization of BO software, the barrier of entry to applying BO for chemical reactions has drastically lowered. The intersection between the paradigms will see advancements at an ever-rapid pace. In this review, we discuss how chemical reactions can be transformed into machine-readable formats which can be learned by machine learning (ML) models. We present a foundation for BO and how it has already been applied to optimize chemical reaction outcomes. The important message we convey is that realizing the full potential of ML-augmented reaction optimization will require close collaboration between experimentalists and computational scientists.
33

Jenkins, William F., Peter Gerstoft, and Yongsung Park. "Bayesian optimization for geoacoustic inversion." Journal of the Acoustical Society of America 155, no. 3_Supplement (March 1, 2024): A213. http://dx.doi.org/10.1121/10.0027343.

Abstract:
Geoacoustic inversion over high-dimensional parameter spaces is a computationally intensive procedure: methods such as Markov chain Monte Carlo sampling often require thousands of forward model evaluations to accurately estimate the geoacoustic environment. This study introduces Bayesian optimization (BO), an efficient global optimization technique, to estimate geoacoustic parameters with significantly fewer evaluations, typically on the order of hundreds. BO involves an iterative search within the parameter space to locate the global optimum of an objective function; in this study, the Bartlett power is used. BO consists of fitting a Gaussian process surrogate model to existing evaluations of the objective function, followed by selecting a new data point for evaluation using a heuristic acquisition function. The effectiveness of BO is showcased through its application to both simulated and real-world data from a shallow-water environment, over a multidimensional parameter space encompassing source location, array tilt, and seabed properties.
34

Oliver, Lindsay D., Jerrold Jeyachandra, Erin W. Dickie, Colin Hawco, Salim Mansour, Stephanie M. Hare, Robert W. Buchanan, et al. "Bayesian Optimization Of NeuroStimulation (BOONStim)." Brain Stimulation 18, no. 2 (March 2025): 112–15. https://doi.org/10.1016/j.brs.2025.01.020.

35

Roopa, Maria D., and Nimitha John. "Bayesian Optimization Phase I Design of Experiment Models." Scientific Temper 15, no. 02 (June 24, 2024): 2380–84. http://dx.doi.org/10.58414/scientifictemper.2024.15.2.54.

Abstract:
This paper offers a concise overview of a novel model for clinical trials, focusing on measurable phase I outcomes from a Bayesian perspective. It outlines hypothetical Bayesian criteria standards and discusses utilization model techniques, including sampler Bayesian models. Bayesian methodologies are increasingly popular in clinical research for their ability to incorporate prior information and adapt trial designs based on accumulating data. Phase I trials are vital for assessing new treatment safety, making them ideal for Bayesian approaches. The model leverages Bayesian principles to guide trial decisions, like dose escalation and maximum tolerated dose determination. By merging prior knowledge with observed data, Bayesian methods provide a framework for informed decisions, especially in scenarios with small sample sizes or historical data. Additionally, the paper explores various Bayesian model techniques, including samplers for posterior inference, enhancing decision-making in clinical trials. Overall, it contributes to Bayesian methodologies by outlining a tailored model for phase I trials and offering practical implementation guidance to improve early-phase trial efficiency and reliability.
36

Dwi Utami, Fathoni Dwiatmoko, and Nuari Anisa Sivi. "Analisis Pengaruh Bayesian Optimization Terhadap Kinerja SVM Dalam Prediksi Penyakit Diabetes" [Analysis of the Effect of Bayesian Optimization on SVM Performance in Diabetes Prediction]. Infotek: Jurnal Informatika dan Teknologi 8, no. 1 (January 20, 2025): 140–50. https://doi.org/10.29408/jit.v8i1.28468.

Abstract:
Diabetes is a prevalent and serious chronic illness that impacts millions of individuals globally. Early detection of diabetes is essential to mitigate severe health complications. This study investigates the application of a Support Vector Machine (SVM) enhanced by Bayesian Optimization for the early prediction of diabetes. While SVM is a robust machine learning algorithm, its performance heavily depends on the proper selection of parameters. Bayesian Optimization is an efficient approach to fine-tune SVM parameters, such as the regularization parameter (C) and the kernel parameter (gamma). The research utilizes a Kaggle dataset that includes various diabetes risk factors. The study compares the performance of SVM optimized using Bayesian Optimization against SVM without optimization. The findings reveal that SVM with Bayesian Optimization achieves an accuracy of 95%, surpassing the 94% accuracy of the unoptimized SVM. These results highlight that Bayesian Optimization enhances SVM's effectiveness in predicting diabetes early.
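A minimal sketch of the kind of tuning the study describes, assuming scikit-optimize's gp_minimize and synthetic stand-in data (the actual study used a Kaggle diabetes dataset):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from skopt import gp_minimize  # scikit-optimize

# Synthetic stand-in for the Kaggle diabetes dataset used in the study.
X, y = make_classification(n_samples=768, n_features=8, random_state=0)

def objective(params):
    # Negative cross-validated accuracy of an RBF-kernel SVM at (C, gamma).
    C, gamma = params
    return -cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()

res = gp_minimize(
    objective,
    dimensions=[(1e-2, 1e3, "log-uniform"),   # C
                (1e-4, 1e1, "log-uniform")],  # gamma
    n_calls=25, random_state=0,
)
print("best CV accuracy:", -res.fun, "at (C, gamma) =", res.x)
```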
37

Awal, A., J. Hetzel, R. Gebel, V. Kamerdzhiev, and J. Pretz. "Optimization of the injection beam line at the Cooler Synchrotron COSY using Bayesian Optimization." Journal of Instrumentation 18, no. 04 (April 1, 2023): P04010. http://dx.doi.org/10.1088/1748-0221/18/04/p04010.

Abstract:
The complex non-linear processes in multi-dimensional parameter spaces that are typical for an accelerator are a natural application for machine learning algorithms. This paper reports on the use of Bayesian optimization for the optimization of the Injection Beam Line (IBL) of the Cooler Synchrotron storage ring COSY at the Forschungszentrum Jülich, Germany. Bayesian optimization is a machine learning method that optimizes a continuous objective function using limited observations. The IBL is composed of 15 quadrupoles and 28 steerers. The goal is to increase the beam intensity inside the storage ring. The results showed the effectiveness of Bayesian optimization in achieving better and faster results compared to manual optimization.
38

Kitahara, Masaru, Chao Dang, and Michael Beer. "Bayesian updating with two-step parallel Bayesian optimization and quadrature." Computer Methods in Applied Mechanics and Engineering 403 (January 2023): 115735. http://dx.doi.org/10.1016/j.cma.2022.115735.

39

Cai, Xu, and Jonathan Scarlett. "Kernelized Normalizing Constant Estimation: Bridging Bayesian Quadrature and Bayesian Optimization." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 10 (March 24, 2024): 11150–58. http://dx.doi.org/10.1609/aaai.v38i10.28992.

Abstract:
In this paper, we study the problem of estimating the normalizing constant through queries to the black-box function f, which is the integration of the exponential function of f scaled by a problem parameter lambda. We assume f belongs to a reproducing kernel Hilbert space (RKHS), and show that to estimate the normalizing constant within a small relative error, the level of difficulty depends on the value of lambda: When lambda approaches zero, the problem is similar to Bayesian quadrature (BQ), while when lambda approaches infinity, the problem is similar to Bayesian optimization (BO). More generally, the problem varies between BQ and BO. We find that this pattern holds true even when the function evaluations are noisy, bringing new aspects to this topic. Our findings are supported by both algorithm-independent lower bounds and algorithmic upper bounds, as well as simulation studies conducted on a variety of benchmark functions.
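Written out (with notation assumed from the abstract's wording rather than taken from the paper), the target quantity is

$Z(\lambda) = \int_{\mathcal{X}} e^{\lambda f(x)} \, dx$,

so small $\lambda$ weights the integrand almost uniformly (a Bayesian quadrature regime), while large $\lambda$ concentrates the integral near the maximizer of $f$ (a Bayesian optimization regime).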
40

Tian, Qiaoyu, Wen Xu, and Jin Xu. "Optimization of Bayesian algorithms for multi-threshold image segmentation." Journal of Computational Methods in Sciences and Engineering 24, no. 4-5 (August 14, 2024): 2863–77. http://dx.doi.org/10.3233/jcm-247522.

Abstract:
The Bayesian optimization algorithm uses Bayesian networks as the probability model of its solution space. Although research on this algorithm has developed steadily, problems remain in its application, such as excessive computational complexity. To address these problems, reduce computational complexity, and better support image segmentation, the study improves the Bayesian algorithm on the basis of an immune algorithm, reducing the number of Bayesian network constructions and thereby improving the individual fitness of the population. Simulation experiments show that the improved Bayesian algorithm reaches the optimal value 30 times on average, more often than the traditional algorithm's 20 times. Its strong optimization ability is used to search for the optimal threshold to complete image segmentation. The improved Bayesian optimization algorithm based on the immune algorithm can effectively reduce computational complexity, shorten computation time, and improve convergence. Applying the Bayesian algorithm to image segmentation also broadens the algorithm's field of application and opens new directions for image segmentation.
41

Agasiev, Taleh, and Anatoly Karpenko. "Exploratory Landscape Validation for Bayesian Optimization Algorithms." Mathematics 12, no. 3 (January 28, 2024): 426. http://dx.doi.org/10.3390/math12030426.

Abstract:
Bayesian optimization algorithms are widely used for solving problems with a high computational complexity in terms of objective function evaluation. The efficiency of Bayesian optimization is strongly dependent on the quality of the surrogate models of an objective function, which are built and refined at each iteration. The quality of surrogate models, and hence the performance of an optimization algorithm, can be greatly improved by selecting the appropriate hyperparameter values of the approximation algorithm. The common approach to finding good hyperparameter values for each iteration of Bayesian optimization is to build surrogate models with different hyperparameter values and choose the best one based on some estimation of the approximation error, for example, a cross-validation score. Building multiple surrogate models for each iteration of Bayesian optimization is computationally demanding and significantly increases the time required to solve an optimization problem. This paper suggests a new approach, called exploratory landscape validation, to find good hyperparameter values with less computational effort. Exploratory landscape validation metrics can be used to predict the best hyperparameter values, which can improve both the quality of the solutions found by Bayesian optimization and the time needed to solve problems.
42

Garces, Santiago Ramos, Ivan De Boi, João Pedro Ramos, Marc Dierckx, Lucia Popescu, and Stijn Derammelaere. "Efficient Tuning of an Isotope Separation Online System Through Safe Bayesian Optimization with Simulation-Informed Gaussian Process for the Constraints." Mathematics 12, no. 23 (November 25, 2024): 3696. http://dx.doi.org/10.3390/math12233696.

Abstract:
Optimizing process outcomes by tuning parameters through an automated system is common in industry. Ideally, this optimization is performed as efficiently as possible, using the minimum number of steps to achieve an optimal configuration. However, care must often be taken to ensure that, in pursuing the optimal solution, the process does not enter an “unsafe” state (for the process itself or its surroundings). Safe Bayesian optimization is a viable method in such contexts, as it guarantees constraint fulfillment during the optimization process, ensuring the system remains safe. This method assumes the constraints are real-valued and continuous functions. However, in some cases, the constraints are binary (true/false) or classification-based (safe/unsafe), limiting the direct application of safe Bayesian optimization. Therefore, a slight modification of safe Bayesian optimization allows for applying the method using a probabilistic classifier for learning classification constraints. However, violation of constraints may occur during the optimization process, as the theoretical guarantees of safe Bayesian optimization do not apply to discontinuous functions. This paper addresses this limitation by introducing an enhanced version of safe Bayesian optimization incorporating a simulation-informed Gaussian process (GP) for handling classification constraints. The simulation-informed GP transforms the classification constraint into a piece-wise function, enabling the application of safe Bayesian optimization. We applied this approach to optimize the parameters of a computational model for the isotope separator online (ISOL) at the MYRRHA facility (Multipurpose Hybrid Research Reactor for High-Tech Applications). The results revealed a significant reduction in constraint violations—approximately 80%—compared to safe Bayesian optimization methods that directly learn the classification constraints using Laplace approximation and expectation propagation. The sensitivity to the accuracy of the simulation model was analyzed to determine the extent to which it is advantageous to use the proposed method. These findings suggest that incorporating available information into the optimization process is valuable for reducing the number of unsafe outcomes in constrained optimization scenarios.
43

Sun, Xingping, Chang Chen, Lu Wang, Hongwei Kang, Yong Shen, and Qingyi Chen. "Hybrid Optimization Algorithm for Bayesian Network Structure Learning." Information 10, no. 10 (September 24, 2019): 294. http://dx.doi.org/10.3390/info10100294.

Abstract:
Since the beginning of the 21st century, research on artificial intelligence has made great progress, and Bayesian networks have gradually become one of the hotspots and important achievements of artificial intelligence research. Establishing an effective Bayesian network structure is the foundation and core of the learning and application of Bayesian networks. In Bayesian network structure learning, the traditional method of constructing the network structure from expert knowledge is gradually being replaced by learning the structure from data. However, because of the large number of possible network structures, the search space is very large. Methods that learn Bayesian networks from training data usually suffer from low precision or high complexity, which makes the learned structure differ greatly from the real one and strongly affects the reasoning and practical application of Bayesian networks. To solve this problem, a hybrid-optimization artificial bee colony algorithm is discretized and applied to structure learning, and a hybrid optimization technique for Bayesian network structure learning is proposed. Experimental simulation results show that the proposed hybrid optimization structure learning algorithm yields better structures and better convergence.
44

Ji, Hualin, Liangliang Qi, Mingxin Lyu, Yanhua Lai, and Zhen Dong. "Improved Bayesian Optimization Framework for Inverse Thermal Conductivity Based on Transient Plane Source Method." Entropy 25, no. 4 (March 27, 2023): 575. http://dx.doi.org/10.3390/e25040575.

Abstract:
In order to reduce the errors caused by the idealization of the conventional analytical model in the transient plane source (TPS) method, a finite element model that more closely represents the actual heat transfer process was constructed. The average error of the established model was kept below 1%, a significantly better result than the roughly 5% average error of the analytical model. Based on probabilistic and heuristic optimization algorithms, an optimization model of the inverse thermal conductivity problem with partial differential equation constraints was constructed. A Bayesian optimization algorithm with an adaptive initial population (BOAAIP) was proposed by analyzing the factors that influence Bayesian optimization in inversion. The improved Bayesian optimization algorithm is not affected by the range and individuals of the initial population and thus has better adaptability and stability. To further verify its superiority, the Bayesian optimization algorithm was compared with a genetic algorithm. The results show that the inversion accuracy of the two algorithms is around 3% when the thermal conductivity of the material is below 100 W·m⁻¹·K⁻¹, and the calculation speed of the improved Bayesian optimization algorithm is three to four times faster than that of the genetic algorithm.
45

Chen, Ming, Xinhu Zhang, Kechun Shen, and Guang Pan. "Optimization of composite cylinder shell via a data-driven intelligent optimization algorithm." Journal of Physics: Conference Series 2181, no. 1 (January 1, 2022): 012019. http://dx.doi.org/10.1088/1742-6596/2181/1/012019.

Abstract:
While composite materials provide huge design flexibility, the design optimization of composite structures is time-consuming and inefficient. This work combines finite element analysis of a composite cylinder shell with a data-driven intelligent optimization algorithm (Bayesian optimization) and aims to maximize the eigenvalue buckling load. By minimizing the number of iterations as a derivative-free global optimization algorithm, Bayesian optimization is versatile and can be further applied to the design of advanced composite structures in more complicated scenarios, such as complex geometries and load conditions.
46

Kim, Jungtaek, Michael McCourt, Tackgeun You, Saehoon Kim, and Seungjin Choi. "Bayesian optimization with approximate set kernels." Machine Learning 110, no. 5 (March 22, 2021): 857–79. http://dx.doi.org/10.1007/s10994-021-05949-0.

47

Inman, Matthew J., John M. Earwood, Atef Z. Elsherbeni, and Charles E. Smith. "Bayesian Optimization Techniques for Antenna Design." Progress In Electromagnetics Research 49 (2004): 71–86. http://dx.doi.org/10.2528/pier04021302.

48

Nakano, Yuichiro, Yusuke Fujimoto, and Toshiharu Sugie. "Controller Tuning Based on Bayesian Optimization." Transactions of the Society of Instrument and Control Engineers 55, no. 4 (2019): 269–74. http://dx.doi.org/10.9746/sicetr.55.269.

49

Abbas, M., A. Ilin, A. Solonen, J. Hakkarainen, E. Oja, and H. Järvinen. "Bayesian optimization for tuning chaotic systems." Nonlinear Processes in Geophysics Discussions 1, no. 2 (August 4, 2014): 1283–312. http://dx.doi.org/10.5194/npgd-1-1283-2014.

Abstract:
In this work, we consider the Bayesian optimization (BO) approach for tuning parameters of complex chaotic systems. Such problems arise, for instance, in tuning the sub-grid scale parameterizations in weather and climate models. For such problems, the tuning procedure is generally based on a performance metric which measures how well the tuned model fits the data. This tuning is often a computationally expensive task. We show that BO, as a tool for finding the extrema of computationally expensive objective functions, is suitable for such tuning tasks. In the experiments, we consider tuning parameters of two systems: a simplified atmospheric model and a low-dimensional chaotic system. We show that BO is able to tune the parameters of both systems with a low number of objective function evaluations and without the need of any gradient information.
50

Cui, Jiaxu, Bo Yang, and Xia Hu. "Deep Bayesian Optimization on Attributed Graphs." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 1377–84. http://dx.doi.org/10.1609/aaai.v33i01.33011377.

Abstract:
Attributed graphs, which contain rich contextual features beyond just network structure, are ubiquitous and have been observed to benefit various network analytics applications. Graph structure optimization, aiming to find the optimal graphs in terms of some specific measures, has become an effective computational tool in complex network analysis. However, traditional model-free methods suffer from the expensive computational cost of evaluating graphs; existing vectorial Bayesian optimization methods cannot be directly applied to attributed graphs and have the scalability issue due to the use of Gaussian processes (GPs). To bridge the gap, in this paper, we propose a novel scalable Deep Graph Bayesian Optimization (DGBO) method on attributed graphs. The proposed DGBO prevents the cubical complexity of the GPs by adopting a deep graph neural network to surrogate black-box functions, and can scale linearly with the number of observations. Intensive experiments are conducted on both artificial and real-world problems, including molecular discovery and urban road network design, and demonstrate the effectiveness of the DGBO compared with the state-of-the-art.