Hong, X. (ORCID: https://orcid.org/0000-0002-6832-2298), Gao, J., Jiang, X. and Harris, C. J. (2014) Fast identification algorithms for Gaussian process model. Neurocomputing, 133, pp. 25-31. ISSN 0925-2312. doi: 10.1016/j.neucom.2013.11.035
Abstract/Summary
A class of identification algorithms is introduced for Gaussian process (GP) models. The fundamental approach is to propose a new kernel function which leads to a covariance matrix with low rank, a property that is then exploited for computational efficiency in both model parameter estimation and model prediction. The objective of maximizing either the marginal likelihood or the Kullback–Leibler (K–L) divergence between the estimated output probability density function (pdf) and the true pdf is used as the respective cost function. For each cost function, an efficient coordinate descent algorithm is proposed to estimate the kernel parameters using a one-dimensional derivative-free search, and the noise variance using a fast gradient descent algorithm. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
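The computational gain described in the abstract comes from the covariance matrix having low rank, which lets the usual O(n^3) inversion in GP training and prediction be replaced by operations on a small m x m system via the matrix inversion (Woodbury) identity. The sketch below illustrates that general idea only; the feature map `phi`, the random Fourier features used for it, and all parameter values are illustrative assumptions and not the kernel proposed in the paper.

```python
import numpy as np

def low_rank_gp_predict(X, y, X_star, phi, noise_var):
    """GP predictive mean with a low-rank kernel K = Phi @ Phi.T.

    Applies the Woodbury identity so the cost is O(n m^2) for an
    (n, m) feature matrix Phi, instead of O(n^3) for a full kernel.
    `phi` is an assumed feature map returning an (n, m) array; it
    stands in for the paper's low-rank kernel, which is not shown here.
    """
    Phi = phi(X)              # (n, m) low-rank features at training inputs
    Phi_s = phi(X_star)       # (n_star, m) features at test inputs
    m = Phi.shape[1]
    # (K + noise_var * I)^{-1} y via Woodbury:
    #   = y / noise_var - Phi @ A^{-1} @ Phi.T @ y / noise_var,
    # with A = noise_var * I_m + Phi.T @ Phi  (only m x m to solve).
    A = noise_var * np.eye(m) + Phi.T @ Phi
    alpha = (y - Phi @ np.linalg.solve(A, Phi.T @ y)) / noise_var
    # Predictive mean: k(X_star, X) @ alpha = Phi_s @ (Phi.T @ alpha)
    return Phi_s @ (Phi.T @ alpha)

# Illustrative usage with a random Fourier feature map (an assumption,
# not the kernel from the paper):
rng = np.random.default_rng(0)
n_features = 20
W = rng.normal(size=(1, n_features))
b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
phi = lambda X: np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)
X_star = np.linspace(-3, 3, 50).reshape(-1, 1)
mean = low_rank_gp_predict(X, y, X_star, phi, noise_var=0.01)
```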
Item Type | Article |
URI | https://reading-clone.eprints-hosting.org/id/eprint/36487 |
Refereed | Yes |
Divisions | Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science |
Uncontrolled Keywords | Gaussian process; Optimization; Kullback–Leibler divergence |
Publisher | Elsevier |