Academic literature on the topic 'Kernel regression'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Kernel regression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Kernel regression"

1

Yulianto, Fendy, Wayan Firdaus Mahmudy, and Arief Andy Soebroto. "Comparison of Regression, Support Vector Regression (SVR), and SVR-Particle Swarm Optimization (PSO) for Rainfall Forecasting." Journal of Information Technology and Computer Science 5, no. 3 (2020): 235. http://dx.doi.org/10.25126/jitecs.20205374.

Full text
Abstract:
Rainfall is one of the factors that influence climate change in an area and is very difficult to predict, yet rainfall information is very important for the community. Forecasting can be done using existing historical data with the help of mathematical modelling. The Support Vector Regression (SVR) method is one method that can be used to predict non-linear rainfall data using a regression function. In calculations using the regression function, choosing the right SVR parameters is necessary to produce forecasts with high accuracy. The Particle Swarm Optimization (PSO) method is one m
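Yulianto et al. tune SVR hyperparameters with particle swarm optimization. As a rough, self-contained sketch of the idea only (not the authors' SVR-PSO pipeline), the toy code below runs a minimal PSO loop to pick the bandwidth of a simple Gaussian-kernel smoother by leave-one-out error; the synthetic "rainfall" data and all swarm coefficients are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic nonlinear series standing in for rainfall data (an assumption).
x = np.linspace(0.0, 4.0, 80)
y = np.sin(2.0 * x) + 0.1 * rng.standard_normal(x.size)

def nw_predict(x_train, y_train, x_query, h):
    """Nadaraya-Watson smoother with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

def loo_mse(h):
    """Leave-one-out mean squared error of the smoother at bandwidth h."""
    err = 0.0
    for i in range(x.size):
        mask = np.arange(x.size) != i
        pred = nw_predict(x[mask], y[mask], x[i:i + 1], h)
        err += (pred[0] - y[i]) ** 2
    return err / x.size

# Minimal particle swarm: positions are candidate bandwidths in (0, 2].
n_particles, n_iters = 10, 30
pos = rng.uniform(0.05, 2.0, n_particles)
vel = np.zeros(n_particles)
pbest = pos.copy()
pbest_val = np.array([loo_mse(h) for h in pos])
gbest = pbest[pbest_val.argmin()]

for _ in range(n_iters):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.01, 2.0)
    vals = np.array([loo_mse(h) for h in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()]

print(f"PSO-selected bandwidth: {gbest:.3f}, LOO MSE: {loo_mse(gbest):.4f}")
```

The same loop extends to several parameters at once (e.g. C, epsilon, and gamma in SVR) by making each particle a vector.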
APA, Harvard, Vancouver, ISO, and other styles
2

Farooq, Tahir, Aziz Guergachi, and Sridhar Krishnan. "Knowledge-Based Green's Kernel for Support Vector Regression." Mathematical Problems in Engineering 2010 (2010): 1–16. http://dx.doi.org/10.1155/2010/378652.

Full text
Abstract:
This paper presents a novel prior knowledge-based Green's kernel for support vector regression (SVR). After reviewing the correspondence between support vector kernels used in support vector machines (SVMs) and regularization operators used in regularization networks and the use of Green's function of their corresponding regularization operators to construct support vector kernels, a mathematical framework is presented to obtain the domain knowledge about magnitude of the Fourier transform of the function to be predicted and design a prior knowledge-based Green's kernel that exhibits optimal r
3

Zhou, Xiaojian, Qianqian Geng, and Ting Jiang. "Boosting RBFNN performance in regression tasks with quantum kernel methods." Journal of Statistical Mechanics: Theory and Experiment 2025, no. 6 (2025): 063101. https://doi.org/10.1088/1742-5468/add0a3.

Full text
Abstract:
Quantum and classical machine learning are fundamentally connected through kernel methods, with kernels serving as inner products of feature vectors in high-dimensional spaces, forming their foundation. Among commonly used kernels, the Gaussian kernel plays a prominent role in radial basis function neural networks (RBFNN) for regression tasks. Nonetheless, the localized response property of the Gaussian kernel, which emphasizes relationships between nearby data points, limits its capacity to model interactions among more distant data points. As a result, it may potentially overlook the
4

Ma, Xiaoyan, Yanbin Zhang, Hui Cao, Shiliang Zhang, and Yan Zhou. "Nonlinear Regression with High-Dimensional Space Mapping for Blood Component Spectral Quantitative Analysis." Journal of Spectroscopy 2018 (2018): 1–8. http://dx.doi.org/10.1155/2018/2689750.

Full text
Abstract:
Accurate and fast determination of blood component concentration is essential for the efficient diagnosis of patients. This paper proposes a nonlinear regression method with high-dimensional space mapping for blood component spectral quantitative analysis. Kernels are introduced to map the input data into a high-dimensional space for nonlinear regression. As the most famous kernel, the Gaussian kernel is usually adopted by researchers. More kernels need to be studied, for each kernel describes its own high-dimensional feature-space mapping, which affects regression performance. In this paper, eig
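To make the point about different kernels inducing different feature-space mappings concrete, here is an illustrative sketch with three standard kernels; these are common textbook choices (with assumed toy parameters and data), not necessarily the kernels studied in the paper above.

```python
import numpy as np

# A few common kernels; each implicitly maps inputs into a different
# high-dimensional feature space (parameter values below are toy assumptions).
def gaussian(a, b, sigma=1.0):
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma ** 2))

def polynomial(a, b, degree=3, c=1.0):
    return (a @ b + c) ** degree

def sigmoid(a, b, alpha=0.5, c=-1.0):
    return np.tanh(alpha * (a @ b) + c)

def gram(kernel, X):
    """Gram matrix K[i, j] = kernel(x_i, x_j) over the rows of X."""
    n = X.shape[0]
    return np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])

rng = np.random.default_rng(4)
X = rng.standard_normal((5, 3))  # five toy "spectra" with three features

for k in (gaussian, polynomial, sigmoid):
    K = gram(k, X)
    # All three Gram matrices are symmetric; positive semidefiniteness,
    # and hence a valid feature-space interpretation, depends on the kernel.
    print(k.__name__, np.allclose(K, K.T))
```

Any kernel-based regressor then works from such a Gram matrix alone, so swapping kernels means swapping feature spaces without changing the rest of the algorithm.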
5

Zhang, Chao, and Shaogao Lv. "An Efficient Kernel Learning Algorithm for Semisupervised Regression Problems." Mathematical Problems in Engineering 2015 (2015): 1–9. http://dx.doi.org/10.1155/2015/451947.

Full text
Abstract:
Kernel selection is a central issue in kernel methods of machine learning. In this paper, we investigate the regularized learning schemes based on kernel design methods. Our ideal kernel is derived from a simple iterative procedure using large scale unlabeled data in a semisupervised framework. Compared with most of existing approaches, our algorithm avoids multioptimization in the process of learning kernels and its computation is as efficient as the standard single kernel-based algorithms. Moreover, large amounts of information associated with input space can be exploited, and thus generaliz
6

Caraka, Rezzy Eko, Hasbi Yasin, and Adi Waridi Basyiruddin. "Peramalan Crude Palm Oil (CPO) Menggunakan Support Vector Regression Kernel Radial Basis [Forecasting Crude Palm Oil (CPO) Using Radial Basis Kernel Support Vector Regression]." Jurnal Matematika 7, no. 1 (2017): 43. http://dx.doi.org/10.24843/jmat.2017.v07.i01.p81.

Full text
Abstract:
Recently, instead of selecting a single kernel, approaches using SVR have been proposed in which the weight of each kernel is optimized during training. Along this line of research, many pioneering kernel learning algorithms have been proposed. The use of kernels provides a powerful and principled approach to modeling nonlinear patterns through linear patterns in a feature space. Another benefit is that the design of kernels and linear methods can be decoupled, which greatly facilitates the modularity of machine learning methods. We perform experiments on real data sets of crude palm oil prices for application and
7

Mackenzie, M., and A. K. Tieu. "Asymmetric Kernel Regression." IEEE Transactions on Neural Networks 15, no. 2 (2004): 276–82. http://dx.doi.org/10.1109/tnn.2004.824414.

Full text
8

Chen, Xin, Xuejun Ma, and Wang Zhou. "Kernel density regression." Journal of Statistical Planning and Inference 205 (March 2020): 318–29. http://dx.doi.org/10.1016/j.jspi.2019.09.001.

Full text
9

Fagundes, Roberta A. A., Renata M. C. R. de Souza, and Francisco José A. Cysneiros. "Interval kernel regression." Neurocomputing 128 (March 2014): 371–88. http://dx.doi.org/10.1016/j.neucom.2013.08.029.

Full text
10

Lee, Myung Hee, and Yufeng Liu. "Kernel continuum regression." Computational Statistics & Data Analysis 68 (December 2013): 190–201. http://dx.doi.org/10.1016/j.csda.2013.06.016.

Full text
More sources

Dissertations / Theses on the topic "Kernel regression"

1

Dharmasena, Tibbotuwa Deniye Kankanamge Lasitha Sandamali. "Sequential Procedures for Nonparametric Kernel Regression." RMIT University, Mathematical and Geospatial Sciences, 2008. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20090119.134815.

Full text
Abstract:
In a nonparametric setting, the functional form of the relationship between the response variable and the associated predictor variables is unspecified; however it is assumed to be a smooth function. The main aim of nonparametric regression is to highlight an important structure in data without any assumptions about the shape of an underlying regression function. In regression, the random and fixed design models should be distinguished. Among the variety of nonparametric regression estimators currently in use, kernel type estimators are most popular. Kernel type estimators provide a flexible c
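The kernel-type estimators this abstract refers to are, in their most common form, Nadaraya-Watson estimators: locally weighted averages whose weights come from a kernel K and bandwidth h. A minimal NumPy sketch (the data and bandwidth below are illustrative assumptions, not from the thesis):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, h):
    """m_hat(x) = sum_i K((x - x_i)/h) y_i / sum_i K((x - x_i)/h),
    here with a Gaussian kernel K."""
    u = (x_query[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * u ** 2)          # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)

# Toy fixed-design example: noisy samples of a smooth function.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 100)
y = np.cos(3.0 * x) + 0.05 * rng.standard_normal(x.size)

grid = np.linspace(0.0, 1.0, 9)
fit = nadaraya_watson(x, y, grid, h=0.08)
print(np.round(fit, 2))
```

The bandwidth h controls the bias-variance trade-off the thesis literature is concerned with: larger h smooths more, smaller h tracks the data more closely.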
2

Brault, Romain. "Large-scale operator-valued kernel regression." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLE024/document.

Full text
Abstract:
Many machine learning problems can be modelled with vector-valued functions. Operator-valued kernels and their associated vector-valued reproducing kernel Hilbert spaces provide a theoretical and practical framework for learning such functions, extending the existing literature on scalar kernels. However, when data are abundant, these methods are of limited use: they do not scale, because they require an amount of memory that grows quadratically and a computation time that grows cubically with respect to
3

Zheng, Qi. "Local adaptive smoothing in kernel regression estimation." Connect to this title online, 2009.

Find full text
4

Hibraj, Feliks <1995>. "Efficient tensor kernel methods for sparse regression." Master's Degree Thesis, Università Ca' Foscari Venezia, 2020. http://hdl.handle.net/10579/16921.

Full text
Abstract:
Recently, classical kernel methods have been extended by the introduction of suitable tensor kernels so as to promote sparsity in the solution of the underlying regression problem. Indeed, they solve an lp-norm regularization problem, with p = m/(m-1) and m an even integer, which happens to be close to a lasso problem. However, a major drawback of the method is that storing tensors requires a considerable amount of memory, ultimately limiting its applicability. In this work we address this problem by proposing two advances. First, we directly reduce the memory requirement, by introducing a new and mo
5

Ren, Haobo. "Functional inverse regression and reproducing kernel Hilbert space." Diss., Texas A&M University, 2005. http://hdl.handle.net/1969.1/4203.

Full text
Abstract:
The basic philosophy of Functional Data Analysis (FDA) is to think of the observed data functions as elements of a possibly infinite-dimensional function space. Most of the current research topics on FDA focus on advancing theoretical tools and extending existing multivariate techniques to accommodate the infinite-dimensional nature of data. This dissertation reports contributions on both fronts, where a unifying inverse regression theory for both the multivariate setting (Li 1991) and functional data from a Reproducing Kernel Hilbert Space (RKHS) perspective is developed. We proposed a functi
6

Farooq, Muhammad [Verfasser], and Ingo [Akademischer Betreuer] Steinwart. "Kernel-based expectile regression / Muhammad Farooq ; Betreuer: Ingo Steinwart." Stuttgart : Universitätsbibliothek der Universität Stuttgart, 2017. http://d-nb.info/1148426337/34.

Full text
7

Natarajan, Balasubramaniam. "Asymptotic Properties of Non-parametric Regression with Beta Kernels." Diss., Kansas State University, 2017. http://hdl.handle.net/2097/38554.

Full text
Abstract:
Doctor of Philosophy, Department of Statistics, Weixing Song. Kernel-based non-parametric regression is a popular statistical tool to identify the relationship between response and predictor variables when standard parametric regression models are not appropriate. The efficacy of kernel-based methods depends both on the kernel choice and the smoothing parameter. With insufficient smoothing, the resulting regression estimate is too rough, and with excessive smoothing, important features of the underlying relationship are lost. While the choice of the kernel has been shown to have less of an
8

DiPaolo, Conner. "Randomized Algorithms for Preconditioner Selection with Applications to Kernel Regression." Scholarship @ Claremont, 2019. https://scholarship.claremont.edu/hmc_theses/230.

Full text
Abstract:
The task of choosing a preconditioner M to use when solving a linear system Ax = b with iterative methods is often tedious, and most methods remain ad hoc. This thesis presents a randomized algorithm to make this chore less painful through use of randomized algorithms for estimating traces. In particular, we show that the preconditioner stability ||I - M^-1 A||_F, known to forecast preconditioner quality, can be computed in the time it takes to run a constant number of iterations of conjugate gradients through use of sketching methods. This is in spite of folklore which suggests the quantity is im
9

Chan, Nigel Hiu Ngai. "Uniform convergence on cointegrating regression." Thesis, The University of Sydney, 2013. http://hdl.handle.net/2123/9807.

Full text
Abstract:
Nonlinear cointegration models have been a popular tool for applied econometric modelling. There are numerous real-life time-series examples that demonstrate a nonlinear response to another nonstationary time series in the field of macro-economics. The nonparametric estimation methods for the nonlinear link function have been studied extensively in the literature. Most of the existing studies concentrate on establishing point-wise convergence, and there is little research on uniform convergence. The uniform convergence of the nonparametric estimator of the nonlinear function is an important theoreti
More sources

Books on the topic "Kernel regression"

1

Cleophas, Ton J., and Aeilko H. Zwinderman. Kernel Ridge Regression in Clinical Research. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-10717-7.

Full text
2

Brockmann, Michael. Local bandwidth selection in nonparametric kernel regression. Verlag Shaker, 1993.

Find full text
3

Racine, Jean. A nonparametric variable kernel method for local adaptive smoothing of regression functions and associated response coefficients. Dept. of Economics, York University, 1991.

Find full text
4

Ferraty, Frédéric, and Philippe Vieu. Kernel Regression Estimation for Functional Data. Edited by Frédéric Ferraty and Yves Romain. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199568444.013.4.

Full text
Abstract:
This article provides an overview of recent nonparametric and semiparametric advances in kernel regression estimation for functional data. In particular, it considers the various statistical techniques based on kernel smoothing ideas that have recently been developed for functional regression estimation problems. The article first examines nonparametric functional regression modelling before discussing three popular functional regression estimates constructed by means of kernel ideas, namely: the Nadaraya-Watson convolution kernel estimate, the kNN functional estimate, and the local linear fun
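The estimates named in this abstract have familiar scalar counterparts. As a hedged illustration of the local linear idea in the scalar setting (the functional-data versions in the chapter replace scalar distances with semi-metrics on function spaces; the data and bandwidth below are assumptions):

```python
import numpy as np

def local_linear(x_train, y_train, x_query, h):
    """Local linear kernel regression: at each query point x0, fit a
    weighted least-squares line with Gaussian kernel weights and return
    its intercept (the fitted value at x0)."""
    preds = np.empty(x_query.size)
    for j, x0 in enumerate(x_query):
        w = np.exp(-0.5 * ((x_train - x0) / h) ** 2)
        X = np.column_stack([np.ones_like(x_train), x_train - x0])
        # Solve the weighted normal equations (X^T W X) b = X^T W y.
        A = X.T @ (w[:, None] * X)
        b = np.linalg.solve(A, X.T @ (w * y_train))
        preds[j] = b[0]
    return preds

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 1.0, 150))
y = x ** 2 + 0.05 * rng.standard_normal(x.size)

grid = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
print(np.round(local_linear(x, y, grid, h=0.1), 3))
```

Compared with the Nadaraya-Watson average, the local line removes first-order bias at the boundaries, which is one reason the local linear estimate is singled out in this literature.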
5

Cleophas, Ton J., and Aeilko H. Zwinderman. Kernel Ridge Regression in Clinical Research. Springer International Publishing AG, 2022.

Find full text
6

Kernel Ridge Regression in Clinical Research. Springer International Publishing AG, 2023.

Find full text
7

Wolberg, John R. Expert Trading Systems: Modeling Financial Markets with Kernel Regression. John Wiley & Sons, Incorporated, 2008.

Find full text
8

Expert Trading Systems: Modeling Financial Markets with Kernel Regression. Wiley, 2000.

Find full text

Book chapters on the topic "Kernel regression"

1

Wand, M. P., and M. C. Jones. "Kernel regression." In Kernel Smoothing. Springer US, 1995. http://dx.doi.org/10.1007/978-1-4899-4493-1_5.

Full text
2

Sarda, Pascal, and Philippe Vieu. "Kernel Regression." In Smoothing and Regression. John Wiley & Sons, Inc., 2012. http://dx.doi.org/10.1002/9781118150658.ch3.

Full text
3

Ahlawat, Samit. "Kernel Regression." In Statistical Quantitative Methods in Finance. Apress, 2025. https://doi.org/10.1007/979-8-8688-0962-0_4.

Full text
4

Hirukawa, Masayuki. "Regression Estimation." In Asymmetric Kernel Smoothing. Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-5466-2_4.

Full text
5

Montesinos López, Osval Antonio, Abelardo Montesinos López, and Jose Crossa. "Reproducing Kernel Hilbert Spaces Regression and Classification Methods." In Multivariate Statistical Machine Learning Methods for Genomic Prediction. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89010-0_8.

Full text
Abstract:
The fundamentals for Reproducing Kernel Hilbert Spaces (RKHS) regression methods are described in this chapter. We first point out the virtues of RKHS regression methods and why these methods are gaining a lot of acceptance in statistical machine learning. Key elements for the construction of RKHS regression methods are provided, the kernel trick is explained in some detail, and the main kernel functions for building kernels are provided. This chapter explains some loss functions under a fixed model framework with examples of Gaussian, binary, and categorical response variables. We ill
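The kernel trick described in this chapter can be made concrete with kernel ridge regression, where fitting and prediction touch the data only through kernel evaluations. A minimal closed-form sketch on assumed toy data (not the book's genomic examples; the kernel parameters are assumptions too):

```python
import numpy as np

def rbf_kernel(A, B, gamma=10.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def krr_fit(X, y, lam=1e-3, gamma=10.0):
    """Closed-form kernel ridge regression: alpha = (K + lam*I)^{-1} y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(X.shape[0]), y)

def krr_predict(X_train, alpha, X_query, gamma=10.0):
    """Prediction is a kernel-weighted combination of the training data."""
    return rbf_kernel(X_query, X_train, gamma) @ alpha

rng = np.random.default_rng(3)
X = rng.uniform(-1.0, 1.0, (120, 1))
y = np.sin(np.pi * X[:, 0]) + 0.05 * rng.standard_normal(120)

alpha = krr_fit(X, y)
grid = np.linspace(-1.0, 1.0, 5)[:, None]
print(np.round(krr_predict(X, alpha, grid), 3))
```

Because only the Gram matrix enters the solve, swapping `rbf_kernel` for any other valid kernel changes the implicit feature space without changing the estimator.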
6

Vovk, Vladimir. "Kernel Ridge Regression." In Empirical Inference. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41136-6_11.

Full text
7

Yamamoto, Masaharu, and Koichiro Yamauchi. "Swap Kernel Regression." In Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-30487-4_18.

Full text
8

Lückehe, Daniel. "Unüberwachte Kernel-Regression [Unsupervised Kernel Regression]." In Hybride Optimierung für Dimensionsreduktion. Springer Fachmedien Wiesbaden, 2015. http://dx.doi.org/10.1007/978-3-658-10738-3_3.

Full text
9

Cleophas, Ton J., and Aeilko H. Zwinderman. "Traditional Kernel Regression." In Kernel Ridge Regression in Clinical Research. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-10717-7_1.

Full text
10

Cleophas, Ton J., and Aeilko H. Zwinderman. "Kernel Regression Versus Quantile Regression." In Quantile Regression in Clinical Research. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-82840-0_25.

Full text

Conference papers on the topic "Kernel regression"

1

Marich, Elizaveta, Andrea Galeazzi, Steven Sachio, Foteini Michalopoulou, and Maria M. Papathanasiou. "Design Space Exploration via Gaussian Process Regression and Alpha Shape Visualization." In The 35th European Symposium on Computer Aided Process Engineering. PSE Press, 2025. https://doi.org/10.69997/sct.192990.

Full text
Abstract:
This study introduces a novel methodology that combines Gaussian process regression (GPR) with alpha shape design space reconstruction to visualize multi-dimensional design spaces. The proposed GPR surrogate approach incorporates a kernel optimization step, employing a greedy tree search strategy to identify the optimal combinatorial kernel from a selection of base kernels. This approach efficiently evaluates design spaces around specific points of interest, enabling alpha shape reconstruction. The methodology's adaptability is demonstrated through its application to both lower-dimensional (2D
2

Gu, Jiuxiang, Yingyu Liang, Zhizhou Sha, Zhenmei Shi, and Zhao Song. "Differential Privacy Mechanisms in Neural Tangent Kernel Regression." In 2025 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV). IEEE, 2025. https://doi.org/10.1109/wacv61041.2025.00234.

Full text
3

Gonen, Mehmet, and Ethem Alpaydin. "Localized Multiple Kernel Regression." In 2010 20th International Conference on Pattern Recognition (ICPR). IEEE, 2010. http://dx.doi.org/10.1109/icpr.2010.352.

Full text
4

Wang, Meng, Xian-sheng Hua, Yan Song, Li-rong Dai, and Hong-jiang Zhang. "Semi-Supervised Kernel Regression." In Sixth International Conference on Data Mining (ICDM'06). IEEE, 2006. http://dx.doi.org/10.1109/icdm.2006.143.

Full text
5

Liu, Fanghui, Xiaolin Huang, and Jie Yang. "Indefinite Kernel Logistic Regression." In MM '17: ACM Multimedia Conference. ACM, 2017. http://dx.doi.org/10.1145/3123266.3123295.

Full text
6

Zheng, Yan, and Jeff M. Phillips. "Coresets for Kernel Regression." In KDD '17: The 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, 2017. http://dx.doi.org/10.1145/3097983.3098000.

Full text
7

Sahoo, Doyen, Steven C. H. Hoi, and Bin Li. "Online multiple kernel regression." In KDD '14: The 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, 2014. http://dx.doi.org/10.1145/2623330.2623712.

Full text
8

He, Jinrong, Lixin Ding, Lei Jiang, and Ling Ma. "Kernel ridge regression classification." In 2014 International Joint Conference on Neural Networks (IJCNN). IEEE, 2014. http://dx.doi.org/10.1109/ijcnn.2014.6889396.

Full text
9

Yan, Shuicheng, Xi Zhou, Ming Liu, Mark Hasegawa-Johnson, and Thomas S. Huang. "Regression from patch-kernel." In 2008 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2008. http://dx.doi.org/10.1109/cvpr.2008.4587405.

Full text
10

Burnaev, Evgeny, and Ivan Nazarov. "Conformalized Kernel Ridge Regression." In 2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA). IEEE, 2016. http://dx.doi.org/10.1109/icmla.2016.0017.

Full text

Reports on the topic "Kernel regression"

1

Chong, Alberto E., and José Galdo. Training Quality and Earnings: The Effects of Competition on the Provision of Public-Sponsored Training Programs. Inter-American Development Bank, 2006. http://dx.doi.org/10.18235/0011257.

Full text
Abstract:
This paper evaluates the effectiveness of market-based approaches in the provision of public-sponsored training programs. In particular, we study the link between training quality and labor earnings using a Peruvian program that targets disadvantaged youths. Multiple proxies for training quality are identified from bidding processes in which public and private training institutions that operate for profit compete for limited public funding. Using difference-in-differences kernel matching and standard regression-based approaches, we find that beneficiaries attending high-quality training course
2

Zhao, L. C. Exponential Bounds of Mean Error for the Kernel Estimates of Regression Functions. Defense Technical Information Center, 1985. http://dx.doi.org/10.21236/ada167345.

Full text