See https://arxiv.org/pdf/1602.05149.pdf for additional background on, and motivation for, parallel Bayesian optimization. By the way, if you're only interested in using Bayesian optimization to solve your problem (and not in the details under the hood or the research aspects), you may also want to try Ax (https://ax.dev/). Ax uses BoTorch under the hood but provides a simpler interface and convenience functions, and it hides a lot of the complexity that you don't need if you just want to apply Bayesian optimization.
Hi, botorch users and developers!
I'm a BoTorch beginner and I'm wondering about the effect of the "q" argument.
Here is a sample of my code:
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import qExpectedImprovement
from botorch.sampling import SobolQMCNormalSampler
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)  # current name of fit_gpytorch_model
sampler = SobolQMCNormalSampler(sample_shape=torch.Size([512]))
# pass the sampler explicitly; otherwise it is never used
qEI = qExpectedImprovement(model=gp, best_f=train_Y.max(), sampler=sampler)
candidate, acq_value = optimize_acqf(
    acq_function=qEI,
    bounds=bounds,  # 2 x d tensor of box bounds, required by optimize_acqf
    q=10,
    num_restarts=20,
    raw_samples=100,
    sequential=True,
)
As a result, I got 10 candidates and 10 acquisition values, and the values differed from one another.
I understand that because I set "sequential" to True in optimize_acqf, the candidates were optimized one at a time, each producing its own scalar acquisition value, which is why I got 10 values. But in practice, should I select only the candidate with the largest acquisition value?
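To illustrate my understanding, here is a toy, library-free sketch of the kind of sequential (greedy) selection that sequential=True performs. The acquisition function and the penalty around already-selected points are made up for illustration; this is not BoTorch's internal algorithm:

```python
import math

# Toy sketch of sequential (greedy) q-point selection: pick q points one
# at a time, each maximizing the acquisition *conditioned on* the points
# already picked. This illustrates why the q returned acquisition values
# differ and why all q points together form one batch to evaluate.

def toy_acquisition(x, selected):
    # Made-up 1-D acquisition: best near 0.5, with a smooth penalty
    # around already-selected points (mimicking conditioning on pending points).
    base = -(x - 0.5) ** 2
    penalty = sum(0.5 * math.exp(-((x - s) / 0.1) ** 2) for s in selected)
    return base - penalty

def select_batch(q, grid):
    selected, values = [], []
    for _ in range(q):
        # Greedily pick the best remaining point given the prior picks.
        best_x = max(grid, key=lambda x: toy_acquisition(x, selected))
        values.append(toy_acquisition(best_x, selected))
        selected.append(best_x)
    return selected, values

grid = [i / 100 for i in range(101)]
batch, acq_values = select_batch(q=4, grid=grid)
# batch holds 4 distinct points; all of them are meant to be evaluated
# together as one batch, not just the one with the largest value.
```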
The num_restarts argument helps avoid getting stuck in local optima; does the "q" argument have a similar effect?
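My understanding of num_restarts is plain multi-start optimization: run a local optimizer from several random starting points and keep the best finisher. A toy sketch, with simple hill climbing standing in for the gradient-based optimizer BoTorch actually uses:

```python
import random

def local_maximize(f, x0, step=0.01, iters=200):
    # Simple hill climbing from x0 (a stand-in for gradient-based L-BFGS):
    # repeatedly move to the best of {x - step, x, x + step}.
    x = x0
    for _ in range(iters):
        x = max([x - step, x, x + step], key=f)
    return x

def multistart_maximize(f, num_restarts, seed=0):
    # Run the local optimizer from num_restarts random starts and keep
    # the best finisher; more restarts lower the risk of a bad local optimum.
    rng = random.Random(seed)
    starts = [rng.uniform(0.0, 1.0) for _ in range(num_restarts)]
    finishers = [local_maximize(f, x0) for x0 in starts]
    return max(finishers, key=f)

# A multimodal toy objective: local peak near 0.2, global peak near 0.8.
f = lambda x: 0.5 * max(0.0, 0.2 - abs(x - 0.2)) + max(0.0, 0.2 - abs(x - 0.8))

best = multistart_maximize(f, num_restarts=20)
# With enough restarts, at least one start lands in the basin of the
# global peak near 0.8, so the returned point is close to it.
```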
Any help is appreciated.
Shin