Abstract:
Computer models or codes are widespread in science and engineering. Often, the output y is deterministic, i.e., running the code twice with the same values for the inputs or explanatory variables, x, would give the same output. To construct a predictor, the deterministic function y(x) can be treated as a realization from a stochastic process,
$$Y(x) = \beta^{\top} f(x) + Z(x),$$
where $\beta^{\top} f(x)$ is a polynomial regression function, and $Z(\cdot)$ is a random function with mean zero and correlation function $R(Z(w), Z(x)) = \exp(-\theta \lVert w - x \rVert^{2})$ for two runs of the code at inputs $w$ and $x$. Given $n$ observations of the computer code, a best linear unbiased predictor (BLUP) follows. As $\theta \to 0$, we show that the asymptotic coefficients in the BLUP are weighted combinations of Lagrange interpolation polynomials. Even if there are no explicit regression terms $f(x)$ in the model, the estimation procedure can asymptotically include an implicit polynomial trend in the inputs. We consider the integrated mean squared error (IMSE) of prediction when there are no regression terms in the model. The asymptotic IMSE over the design region is expressed as a quadratic form, which leads to a criterion for numerically optimizing the design.
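As a rough illustration of the model and the design criterion described above, the sketch below computes the BLUP with the Gaussian correlation function and a brute-force grid average of the prediction mean squared error, not the paper's asymptotic quadratic form. It assumes $\theta$ is known and uses a constant regression term by default; the helper names `gauss_corr`, `blup_predict`, and `imse_criterion` are illustrative, not from the paper.

```python
import numpy as np

def gauss_corr(W, X, theta):
    """Correlation matrix with entries exp(-theta * ||w_i - x_j||^2)."""
    d2 = ((W[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-theta * d2)

def blup_predict(x_new, X, y, theta, f=lambda x: np.ones((len(x), 1))):
    """BLUP of y(x_new) from n runs (X, y); f(x) gives the regression basis
    (default: a constant term only). theta is treated as known here."""
    R = gauss_corr(X, X, theta)               # n x n correlations among runs
    F = f(X)                                  # n x p regression matrix
    r = gauss_corr(x_new, X, theta)           # correlations of new points with runs
    beta_hat = np.linalg.solve(F.T @ np.linalg.solve(R, F),
                               F.T @ np.linalg.solve(R, y))   # GLS estimate of beta
    resid = y - F @ beta_hat
    return f(x_new) @ beta_hat + r @ np.linalg.solve(R, resid)

def imse_criterion(X, theta, grid):
    """Grid average of the scaled prediction MSE 1 - r(x)' R^{-1} r(x)
    for the no-regression-term model; smaller indicates a better design."""
    R = gauss_corr(X, X, theta)
    r = gauss_corr(grid, X, theta)
    mse = 1.0 - np.einsum('ij,ij->i', r, np.linalg.solve(R, r.T).T)
    return mse.mean()

# Toy 1-D illustration: interpolate a deterministic code output, then
# compare the IMSE proxy for two candidate 6-run designs on [0, 1].
code = lambda x: np.sin(2 * np.pi * x[:, 0])
X = np.linspace(0.0, 1.0, 6)[:, None]
print(blup_predict(np.array([[0.25], [0.60]]), X, code(X), theta=5.0))

grid = np.linspace(0.0, 1.0, 201)[:, None]
X_alt = np.random.default_rng(0).uniform(0.0, 1.0, (6, 1))
print(imse_criterion(X, theta=5.0, grid=grid),
      imse_criterion(X_alt, theta=5.0, grid=grid))
```

In this sketch the equally spaced design typically gives a smaller grid-averaged MSE than the random one, which is the kind of comparison the numerical design-optimization criterion formalizes.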
Keywords:
Best linear unbiased prediction, Computer code, Integrated mean squared error, Interpolation, Optimal design, Stochastic process.
