
Journal articles on the topic 'Gaussian process'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Gaussian process.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1. Fearn, Tom. "Gaussian Process Regression." NIR news 24, no. 6 (2013): 23–24. http://dx.doi.org/10.1255/nirn.1392.

2. Daemi, Atefeh, Hariprasad Kodamana, and Biao Huang. "Gaussian process modelling with Gaussian mixture likelihood." Journal of Process Control 81 (September 2019): 209–20. http://dx.doi.org/10.1016/j.jprocont.2019.06.007.

3. O’Callaghan, Simon T., and Fabio T. Ramos. "Gaussian process occupancy maps." International Journal of Robotics Research 31, no. 1 (2012): 42–62. http://dx.doi.org/10.1177/0278364911421039.

4. Luthi, Marcel, Thomas Gerig, Christoph Jud, and Thomas Vetter. "Gaussian Process Morphable Models." IEEE Transactions on Pattern Analysis and Machine Intelligence 40, no. 8 (2018): 1860–73. http://dx.doi.org/10.1109/tpami.2017.2739743.

5. MacKay, D. J. C., and M. N. Gibbs. "Variational Gaussian process classifiers." IEEE Transactions on Neural Networks 11, no. 6 (2000): 1458–64. http://dx.doi.org/10.1109/72.883477.

6. Deisenroth, Marc Peter, Carl Edward Rasmussen, and Jan Peters. "Gaussian process dynamic programming." Neurocomputing 72, no. 7-9 (2009): 1508–24. http://dx.doi.org/10.1016/j.neucom.2008.12.019.

7. Chatzis, S. P., and Y. Demiris. "Echo State Gaussian Process." IEEE Transactions on Neural Networks 22, no. 9 (2011): 1435–45. http://dx.doi.org/10.1109/tnn.2011.2162109.

8. Jin, Zhehao, Andong Liu, Wen-an Zhang, Li Yu, and Chenguang Yang. "Gaussian process movement primitive." Automatica 155 (September 2023): 111120. http://dx.doi.org/10.1016/j.automatica.2023.111120.

9. Ou, Xiaoling, Julian Morris, and Elaine Martin. "Gaussian Process Regression for Batch Process Modelling." IFAC Proceedings Volumes 37, no. 9 (2004): 817–22. http://dx.doi.org/10.1016/s1474-6670(17)31910-9.
10. Subramanian, Sandya, Riccardo Barbieri, and Emery N. Brown. "Point process temporal structure characterizes electrodermal activity." Proceedings of the National Academy of Sciences 117, no. 42 (2020): 26422–28. http://dx.doi.org/10.1073/pnas.2004403117.

Abstract: Electrodermal activity (EDA) is a direct readout of the body’s sympathetic nervous system measured as sweat-induced changes in the skin’s electrical conductance. There is growing interest in using EDA to track physiological conditions such as stress levels, sleep quality, and emotional states. Standardized EDA data analysis methods are readily available. However, none considers an established physiological feature of EDA: the sympathetically mediated pulsatile changes in skin sweat measured as EDA resemble an integrate-and-fire process. An integrate-and-fire process modeled as a Gaussian random…
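The integrate-and-fire mechanism this abstract describes can be illustrated with a short simulation. This is a hedged sketch under assumed parameter values, not the authors' model: a Gaussian random walk accumulates until it crosses a threshold, fires, and resets, so the inter-pulse intervals are first-passage times.

```python
import numpy as np

def simulate_integrate_and_fire(n_pulses=1000, drift=0.1, sigma=0.05,
                                threshold=1.0, seed=0):
    """Simulate first-passage (inter-pulse) times of a Gaussian random walk.

    The walk accumulates N(drift, sigma^2) increments per step and fires
    (then resets to zero) whenever it crosses `threshold`.  All parameter
    values here are illustrative, not taken from the cited paper.
    """
    rng = np.random.default_rng(seed)
    times = []
    for _ in range(n_pulses):
        level, steps = 0.0, 0
        while level < threshold:
            level += rng.normal(drift, sigma)
            steps += 1
        times.append(steps)
    return np.array(times)

ipi = simulate_integrate_and_fire()
# With positive drift, the mean first-passage time is roughly
# threshold / drift steps (a standard property of such walks).
print(ipi.mean())
```

The histogram of such first-passage times is skewed with a long right tail, which is the temporal structure the paper exploits for EDA pulses.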
11. Küper, Armin, and Steffen Waldherr. "Numerical Gaussian process Kalman filtering." IFAC-PapersOnLine 53, no. 2 (2020): 11416–21. http://dx.doi.org/10.1016/j.ifacol.2020.12.577.

12. Xia, Zhanguo, Ling Wan, Shiyu Cai, and Shixiong Xia. "Research Progress of Gaussian Process." International Journal of Digital Content Technology and its Applications 6, no. 14 (2012): 369–78. http://dx.doi.org/10.4156/jdcta.vol6.issue14.45.
13. Zhong, Guoqiang, Wu-Jun Li, Dit-Yan Yeung, Xinwen Hou, and Cheng-Lin Liu. "Gaussian Process Latent Random Field." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (2010): 679–84. http://dx.doi.org/10.1609/aaai.v24i1.7697.

Abstract: In this paper, we propose a novel supervised extension of the Gaussian process latent variable model (GPLVM), called Gaussian process latent random field (GPLRF), by enforcing the latent variables to be a Gaussian Markov random field with respect to a graph constructed from the supervisory information.
14. Song, Andrew, Bahareh Tolooshams, and Demba Ba. "Gaussian Process Convolutional Dictionary Learning." IEEE Signal Processing Letters 29 (2022): 95–99. http://dx.doi.org/10.1109/lsp.2021.3127471.

15. Gogolashvili, Davit, Bogdan Kozyrskiy, and Maurizio Filippone. "Locally Smoothed Gaussian Process Regression." Procedia Computer Science 207 (2022): 2717–26. http://dx.doi.org/10.1016/j.procs.2022.09.330.

16. Bastos, Leonardo S., and Anthony O’Hagan. "Diagnostics for Gaussian Process Emulators." Technometrics 51, no. 4 (2009): 425–38. http://dx.doi.org/10.1198/tech.2009.08019.

17. Murata, Noboru, and Yusuke Kuroda. "A Gaussian Process Robust Regression." Progress of Theoretical Physics Supplement 157 (2005): 280–83. http://dx.doi.org/10.1143/ptps.157.280.

18. Platanios, Emmanouil A., and Sotirios P. Chatzis. "Gaussian Process-Mixture Conditional Heteroscedasticity." IEEE Transactions on Pattern Analysis and Machine Intelligence 36, no. 5 (2014): 888–900. http://dx.doi.org/10.1109/tpami.2013.183.

19. Gu, Mengyang, Xiaojing Wang, and James O. Berger. "Robust Gaussian stochastic process emulation." Annals of Statistics 46, no. 6A (2018): 3038–66. http://dx.doi.org/10.1214/17-aos1648.

20. Pruher, Jakub, and Ondrej Straka. "Gaussian Process Quadrature Moment Transform." IEEE Transactions on Automatic Control 63, no. 9 (2018): 2844–54. http://dx.doi.org/10.1109/tac.2017.2774444.

21. Yiu, Simon, and Kai Yang. "Gaussian Process Assisted Fingerprinting Localization." IEEE Internet of Things Journal 3, no. 5 (2016): 683–90. http://dx.doi.org/10.1109/jiot.2015.2481932.

22. Pensoneault, Andrew, Xiu Yang, and Xueyu Zhu. "Nonnegativity-enforced Gaussian process regression." Theoretical and Applied Mechanics Letters 10, no. 3 (2020): 182–87. http://dx.doi.org/10.1016/j.taml.2020.01.036.

23. Blitvić, Natasha. "The (q,t)-Gaussian process." Journal of Functional Analysis 263, no. 10 (2012): 3270–305. http://dx.doi.org/10.1016/j.jfa.2012.08.006.

24. Chen, Tao, and Jianghong Ren. "Bagging for Gaussian process regression." Neurocomputing 72, no. 7-9 (2009): 1605–10. http://dx.doi.org/10.1016/j.neucom.2008.09.002.

25. Dalbey, Keith R., and Laura Swiler. "Gaussian Process Adaptive Importance Sampling." International Journal for Uncertainty Quantification 4, no. 2 (2014): 133–49. http://dx.doi.org/10.1615/int.j.uncertaintyquantification.2013006330.

26. Guenther, John, and Herbert K. H. Lee. "An Improved Treed Gaussian Process." Applied Mathematics 11, no. 7 (2020): 613–38. http://dx.doi.org/10.4236/am.2020.117042.

27. Gao, Tingran, Shahar Z. Kovalsky, and Ingrid Daubechies. "Gaussian Process Landmarking on Manifolds." SIAM Journal on Mathematics of Data Science 1, no. 1 (2019): 208–36. http://dx.doi.org/10.1137/18m1184035.

28. Ažman, Kristjan, and Juš Kocijan. "Fixed-structure Gaussian process model." International Journal of Systems Science 40, no. 12 (2009): 1253–62. http://dx.doi.org/10.1080/00207720903038028.

29. Gregorčič, Gregor, and Gordon Lightbody. "Gaussian process internal model control." International Journal of Systems Science 43, no. 11 (2012): 2079–94. http://dx.doi.org/10.1080/00207721.2011.564326.

30. Kumar, Arun, and Palaniappan Vellaisamy. "Fractional Normal Inverse Gaussian Process." Methodology and Computing in Applied Probability 14, no. 2 (2010): 263–83. http://dx.doi.org/10.1007/s11009-010-9201-z.

31. Mair, Sebastian, and Ulf Brefeld. "Distributed robust Gaussian Process regression." Knowledge and Information Systems 55, no. 2 (2017): 415–35. http://dx.doi.org/10.1007/s10115-017-1084-7.
32. Zhang, Wei, Brian Barr, and John Paisley. "Gaussian Process Neural Additive Models." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 15 (2024): 16865–72. http://dx.doi.org/10.1609/aaai.v38i15.29628.

Abstract: Deep neural networks have revolutionized many fields, but their black-box nature also occasionally prevents their wider adoption in fields such as healthcare and finance, where interpretable and explainable models are required. The recent development of Neural Additive Models (NAMs) is a major step in the direction of interpretable deep learning for tabular datasets. In this paper, we propose a new subclass of NAMs that utilize a single-layer neural network construction of the Gaussian process via random Fourier features, which we call Gaussian Process Neural Additive Models (GP-NAM). GP-NAM…
33. Yang, Zewen, Xiaobing Dai, and Sandra Hirche. "Asynchronous Distributed Gaussian Process Regression." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 21 (2025): 22065–73. https://doi.org/10.1609/aaai.v39i21.34359.

Abstract: In this paper, we address a practical distributed Bayesian learning problem with asynchronous measurements and predictions due to diverse computational conditions. To this end, asynchronous distributed Gaussian process (AsyncDGP) regression is proposed, which is the first effective online distributed Gaussian processes (GPs) approach to improve the prediction accuracy in real-time learning tasks. By leveraging the devised evaluation criterion and established prediction error bounds, AsyncDGP enables the distinction of contributions of each model for prediction ensembling using aggregation strategies…
34. Wang, Bo, and Jian Qing Shi. "Generalized Gaussian Process Regression Model for Non-Gaussian Functional Data." Journal of the American Statistical Association 109, no. 507 (2014): 1123–33. http://dx.doi.org/10.1080/01621459.2014.889021.
35. Visser, Emile, Corné E. van Daalen, and J. C. Schoeman. "Lossy compression of observations for Gaussian process regression." MATEC Web of Conferences 370 (2022): 07006. http://dx.doi.org/10.1051/matecconf/202237007006.

Abstract: This paper proposes a novel approach to Gaussian process observation set compression based on a squared difference measure. It is used to discard observations to speed up Gaussian process prediction while retaining the information encoded in the full set of observations. Furthermore, this paper compares the regression performance of a compressed Gaussian process to its uncompressed version and to a randomly downsampled Gaussian process for a standard two-dimensional test function. The empirical results of this paper show that this is an effective algorithm for Gaussian process compression…
36. Savitsky, Terrance, and Marina Vannucci. "Spiked Dirichlet Process Priors for Gaussian Process Models." Journal of Probability and Statistics 2010 (2010): 1–14. http://dx.doi.org/10.1155/2010/201489.

Abstract: We expand a framework for Bayesian variable selection for Gaussian process (GP) models by employing spiked Dirichlet process (DP) prior constructions over set partitions containing covariates. Our approach results in a nonparametric treatment of the distribution of the covariance parameters of the GP covariance matrix that in turn induces a clustering of the covariates. We evaluate two prior constructions: the first one employs a mixture of a point-mass and a continuous distribution as the centering distribution for the DP prior, therefore clustering all covariates. The second one employs a m…
37. Finamore, Weiler, Marcelo Pinho, Manish Sharma, and Moises Ribeiro. "Modeling Noise as a Bernoulli-Gaussian Process." Journal of Communication and Information Systems 38 (2023): 175–86. http://dx.doi.org/10.14209/jcis.2023.20.

Abstract: A transmission medium is always perturbed by noise of a random nature, which can be characterized by taking a sequence of noise samples and, after analyzing the sequence, attributing a probabilistic model to represent the randomness of the noise. If thermal noise (receiver generated) is the only noise impairing the transmission (our only focus is digital transmission), the memoryless stationary discrete-time Gaussian process is the best model to probabilistically represent the noise. The mathematical representation of the transmission medium in such a situation yields the well-known Gaussian channel…
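The Bernoulli-Gaussian noise model named in this abstract (a memoryless Gaussian background occasionally struck by impulsive events) can be sketched as follows; the probability and variance values below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def bernoulli_gaussian_noise(n, p=0.05, sigma_bg=1.0, sigma_imp=10.0, seed=0):
    """Sample n points of Bernoulli-Gaussian noise.

    Each sample is background Gaussian noise N(0, sigma_bg^2); with
    probability p an impulsive Gaussian component N(0, sigma_imp^2) is
    added on top.  Parameter values are illustrative choices only.
    """
    rng = np.random.default_rng(seed)
    background = rng.normal(0.0, sigma_bg, n)
    # Bernoulli gate: 1 with probability p, selecting where impulses occur.
    impulses = rng.binomial(1, p, n) * rng.normal(0.0, sigma_imp, n)
    return background + impulses

noise = bernoulli_gaussian_noise(10000)
```

Setting p = 0 recovers the memoryless stationary Gaussian process the abstract cites as the thermal-noise-only case; p > 0 adds the impulsive component.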
38. Bozkurt, Ferhat, Mete Yağanoğlu, and Faruk Baturalp Günay. "Effective Gaussian Blurring Process on Graphics Processing Unit with CUDA." International Journal of Machine Learning and Computing 5, no. 1 (2015): 57–61. http://dx.doi.org/10.7763/ijmlc.2015.v5.483.
39. Rius Carretero, David, and Salvador Torra Porras. "Aplicaciones actuariales mediante Gaussian Process Regression: Vida y No Vida" [Actuarial applications of Gaussian process regression: life and non-life]. Anales del Instituto de Actuarios Españoles, no. 28 (December 2022): 67–100. http://dx.doi.org/10.26360/2022_3.

Abstract: This paper gives a brief introduction to the Gaussian process regression (GPR) methodology and presents two applications in the actuarial field. First, an interpolation exercise was carried out on the PASEM Unisex 2020 mortality tables, concluding that GPR is an excellent interpolation tool that enables more accurate pricing in the life branch. Second, GPR was integrated as a method for predicting reserves in the non-life branches, obtaining promising results. Finally, it is concluded that a GPR can be…
40. Opper, Manfred, and Cédric Archambeau. "The Variational Gaussian Approximation Revisited." Neural Computation 21, no. 3 (2009): 786–92. http://dx.doi.org/10.1162/neco.2008.08-07-592.

Abstract: The variational approximation of posterior distributions by multivariate gaussians has been much less popular in the machine learning community compared to the corresponding approximation by factorizing distributions. This is for a good reason: the gaussian approximation is in general plagued by an O(N²) number of variational parameters to be optimized, N being the number of random variables. In this letter, we discuss the relationship between the Laplace and the variational approximation, and we show that for models with gaussian priors and factorizing likelihoods, the number of…
41. Zhao, Yunxin, Xinhua Zhuang, and Sheu-Jen Ting. "Gaussian mixture density modeling of non-Gaussian source for autoregressive process." IEEE Transactions on Signal Processing 43, no. 4 (1995): 894–903. http://dx.doi.org/10.1109/78.376842.

42. Polyak, Iakov, Gareth W. Richings, Scott Habershon, and Peter J. Knowles. "Direct quantum dynamics using variational Gaussian wavepackets and Gaussian process regression." Journal of Chemical Physics 150, no. 4 (2019): 041101. http://dx.doi.org/10.1063/1.5086358.
43. Konstant, D. G., and V. I. Piterbarg. "Extreme values of the cyclostationary Gaussian random process." Journal of Applied Probability 30, no. 1 (1993): 82–97. http://dx.doi.org/10.2307/3214623.

Abstract: In this paper the class of cyclostationary Gaussian random processes is studied. Basic asymptotics are given for the class of Gaussian processes that are centered and differentiable in mean square. Then, under certain conditions on the non-degeneration of the centered cyclostationary Gaussian process with integrable covariance functions, the Gnedenko-type limit formula is established for all x > 0.
44. Konstant, D. G., and V. I. Piterbarg. "Extreme values of the cyclostationary Gaussian random process." Journal of Applied Probability 30, no. 1 (1993): 82–97. http://dx.doi.org/10.1017/s0021900200044016. Duplicate record of entry 43 under a second DOI; the abstract is identical.
45. Lian, Yingzhao, and Colin N. Jones. "On Gaussian Process Based Koopman Operators." IFAC-PapersOnLine 53, no. 2 (2020): 449–55. http://dx.doi.org/10.1016/j.ifacol.2020.12.217.
46. Hadjakos, Aristotelis. "Gaussian Process Synthesis of Artificial Sounds." Applied Sciences 10, no. 5 (2020): 1781. http://dx.doi.org/10.3390/app10051781.

Abstract: In this paper, we propose Gaussian Process (GP) sound synthesis. A GP is used to sample random continuous functions, which are then used for wavetable or waveshaping synthesis. The shape of the sampled functions is controlled with the kernel function of the GP. Sampling multiple times from the same GP generates perceptually similar but non-identical sounds. Since there are many ways to choose the kernel function and its parameters, an interface aids the user in sound selection. The interface is based on a two-dimensional visualization of the sounds grouped by their similarity as judged by a t-…
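The core sampling step this abstract relies on, drawing random continuous functions from a GP whose shape is set by the kernel, can be sketched in a few lines. The squared-exponential kernel and its length-scale here are illustrative choices, not the paper's exact interface:

```python
import numpy as np

def sample_gp(x, lengthscale=0.2, n_samples=3, seed=0):
    """Draw sample functions from a zero-mean GP with an RBF kernel.

    Shorter length-scales give wigglier functions; per the abstract, the
    kernel is the knob that shapes the synthesized sounds.  Kernel choice
    and parameter values here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    # Squared-exponential (RBF) covariance matrix over the grid x.
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / lengthscale) ** 2)
    # Small jitter keeps the Cholesky factorization numerically stable.
    L = np.linalg.cholesky(K + 1e-6 * np.eye(len(x)))
    return L @ rng.standard_normal((len(x), n_samples))

x = np.linspace(0.0, 1.0, 200)
waves = sample_gp(x)  # shape (200, 3): three candidate wavetables
```

Calling `sample_gp` repeatedly with different seeds but the same kernel yields the "perceptually similar but non-identical" draws the abstract describes.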
47. Cabaña, Enrique M. "A Gaussian process with parabolic covariances." Journal of Applied Probability 28, no. 4 (1991): 898–902. http://dx.doi.org/10.2307/3214693.

Abstract: The centred, periodic, stationary Gaussian process X(z), 0 ≤ z ≤ 1, with parabolic covariances appears when one studies the solutions of the vibrating string equation forced by noise, corresponding to the case of a finite string with the extremes tied together. The close relationship between this process and a Brownian bridge permits us to compute the distribution of the maximum excursion of the string at particular times.
48. Iba, Yukito, and Shotaro Akaho. "Gaussian Process Regression with Measurement Error." IEICE Transactions on Information and Systems E93-D, no. 10 (2010): 2680–89. http://dx.doi.org/10.1587/transinf.e93.d.2680.
49. Kandasamy, Kirthevasan, Gautam Dasarathy, Junier Oliva, Jeff Schneider, and Barnabás Póczos. "Multi-fidelity Gaussian Process Bandit Optimisation." Journal of Artificial Intelligence Research 66 (2019): 151–96. http://dx.doi.org/10.1613/jair.1.11288.

Abstract: In many scientific and engineering applications, we are tasked with the maximisation of an expensive-to-evaluate black-box function f. Traditional settings for this problem assume just the availability of this single function. However, in many cases, cheap approximations to f may be obtainable. For example, the expensive real-world behaviour of a robot can be approximated by a cheap computer simulation. We can use these approximations to eliminate low-function-value regions cheaply and use the expensive evaluations of f in a small but promising region to speedily identify the optimum. We form…
50. Liang, Junjie, Yanting Wu, Dongkuan Xu, and Vasant G. Honavar. "Longitudinal Deep Kernel Gaussian Process Regression." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 10 (2021): 8556–64. http://dx.doi.org/10.1609/aaai.v35i10.17038.

Abstract: Gaussian processes offer an attractive framework for predictive modeling from longitudinal data, i.e., irregularly sampled, sparse observations from a set of individuals over time. However, such methods have two key shortcomings: (i) they rely on ad hoc heuristics or expensive trial and error to choose the effective kernels, and (ii) they fail to handle multilevel correlation structure in the data. We introduce longitudinal deep kernel Gaussian process regression (L-DKGPR) to overcome these limitations by fully automating the discovery of complex multilevel correlation structure from longitudinal…