<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Schonlau, M.</style></author><author><style face="normal" font="default" size="100%">Welch, William J.</style></author><author><style face="normal" font="default" size="100%">Jones, Donald R.</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Global versus Local Search in Constrained Optimization of Computer Models</style></title><secondary-title><style face="normal" font="default" size="100%">Lecture Notes-Monograph Series</style></secondary-title></titles><keywords><keyword><style face="normal" font="default" size="100%">Bayesian global optimization</style></keyword><keyword><style face="normal" font="default" size="100%">computer code</style></keyword><keyword><style face="normal" font="default" size="100%">sequential design</style></keyword><keyword><style face="normal" font="default" size="100%">stochastic process</style></keyword></keywords><dates><year><style face="normal" font="default" size="100%">1998</style></year></dates><volume><style face="normal" font="default" size="100%">34</style></volume><pages><style face="normal" font="default" size="100%">11-25</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;Engineering systems are now frequently optimized via computer models. The input-output relationships in these models are often highly nonlinear deterministic functions that are expensive to compute. Thus, when searching for the global optimum, it is desirable to minimize the number of function evaluations. Bayesian global optimization methods are well suited to this task because they make use of all previous evaluations in selecting the next search point.
A statistical model is fit to the sampled points, allowing predictions to be made elsewhere along with a measure of possible prediction error (uncertainty). The next point is chosen to maximize a criterion that balances searching where the predicted value of the function is good (local search) against searching where the uncertainty of prediction is large (global search). We extend this methodology in several ways. First, we introduce a parameter that controls the local-global balance. Second, we propose a method for handling nonlinear inequality constraints arising from additional response variables. Finally, we adapt the sequential algorithm to proceed in stages rather than one point at a time. The extensions are illustrated using a shape optimization problem from the automotive industry.&lt;/p&gt;
</style></abstract></record></records></xml>