
Journal articles on the topic 'Kernel regression'


Consult the top 50 journal articles for your research on the topic 'Kernel regression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Yulianto, Fendy, Wayan Firdaus Mahmudy, and Arief Andy Soebroto. "Comparison of Regression, Support Vector Regression (SVR), and SVR-Particle Swarm Optimization (PSO) for Rainfall Forecasting." Journal of Information Technology and Computer Science 5, no. 3 (2020): 235. http://dx.doi.org/10.25126/jitecs.20205374.

Abstract:
Rainfall is one of the factors that influence climate change in an area and is very difficult to predict, while rainfall information is very important for the community. Forecasting can be done using existing historical data with the help of mathematical computing in modeling. The Support Vector Regression (SVR) method is one method that can be used to predict non-linear rainfall data using a regression function. In calculations using the regression function, choosing the right SVR parameters is needed to produce forecasting with high accuracy. Particle Swarm Optimization (PSO) method is one m
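A minimal sketch of the SVR tuning idea this abstract describes, using scikit-learn with an ordinary grid search standing in for the paper's PSO optimizer; the data and feature names below are synthetic placeholders, not the study's dataset.

    # Minimal sketch: RBF-kernel SVR with a hyperparameter search.
    # Grid search stands in for the PSO tuning used in the paper; the data
    # and feature names are synthetic, hypothetical placeholders.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(200, 3))      # e.g. humidity, temperature, wind speed
    y = 50.0 * np.sin(2.0 * np.pi * X[:, 0]) + 10.0 * X[:, 1] + rng.normal(0.0, 2.0, 200)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
    param_grid = {                                # the parameters the paper tunes with PSO
        "svr__C": [1, 10, 100],
        "svr__epsilon": [0.01, 0.1, 1.0],
        "svr__gamma": ["scale", 0.1, 1.0],
    }
    search = GridSearchCV(model, param_grid, cv=5, scoring="neg_root_mean_squared_error")
    search.fit(X_train, y_train)
    rmse = np.sqrt(np.mean((search.predict(X_test) - y_test) ** 2))
    print("best parameters:", search.best_params_, "test RMSE:", rmse)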
2

Farooq, Tahir, Aziz Guergachi, and Sridhar Krishnan. "Knowledge-Based Green's Kernel for Support Vector Regression." Mathematical Problems in Engineering 2010 (2010): 1–16. http://dx.doi.org/10.1155/2010/378652.

Abstract:
This paper presents a novel prior knowledge-based Green's kernel for support vector regression (SVR). After reviewing the correspondence between support vector kernels used in support vector machines (SVMs) and regularization operators used in regularization networks and the use of Green's function of their corresponding regularization operators to construct support vector kernels, a mathematical framework is presented to obtain the domain knowledge about magnitude of the Fourier transform of the function to be predicted and design a prior knowledge-based Green's kernel that exhibits optimal r
3

Zhou, Xiaojian, Qianqian Geng, and Ting Jiang. "Boosting RBFNN performance in regression tasks with quantum kernel methods." Journal of Statistical Mechanics: Theory and Experiment 2025, no. 6 (2025): 063101. https://doi.org/10.1088/1742-5468/add0a3.

Abstract:
Quantum and classical machine learning are fundamentally connected through kernel methods, with kernels serving as inner products of feature vectors in high-dimensional spaces, forming their foundation. Among commonly used kernels, the Gaussian kernel plays a prominent role in radial basis function neural network (RBFNN) for regression tasks. Nonetheless, the localized response property of the Gaussian kernel, which emphasizes relationships between nearby data points, limits its capacity to model interactions among more distant data points. As a result, it may potentially overlook the
4

Ma, Xiaoyan, Yanbin Zhang, Hui Cao, Shiliang Zhang, and Yan Zhou. "Nonlinear Regression with High-Dimensional Space Mapping for Blood Component Spectral Quantitative Analysis." Journal of Spectroscopy 2018 (2018): 1–8. http://dx.doi.org/10.1155/2018/2689750.

Abstract:
Accurate and fast determination of blood component concentration is essential for the efficient diagnosis of patients. This paper proposes a nonlinear regression method with high-dimensional space mapping for blood component spectral quantitative analysis. Kernels are introduced to map the input data into high-dimensional space for nonlinear regression. As the most famous kernel, the Gaussian kernel is usually adopted by researchers. More kernels need to be studied, since each kernel describes its own high-dimensional feature space mapping, which affects regression performance. In this paper, eig
5

Zhang, Chao, and Shaogao Lv. "An Efficient Kernel Learning Algorithm for Semisupervised Regression Problems." Mathematical Problems in Engineering 2015 (2015): 1–9. http://dx.doi.org/10.1155/2015/451947.

Abstract:
Kernel selection is a central issue in kernel methods of machine learning. In this paper, we investigate the regularized learning schemes based on kernel design methods. Our ideal kernel is derived from a simple iterative procedure using large scale unlabeled data in a semisupervised framework. Compared with most of existing approaches, our algorithm avoids multioptimization in the process of learning kernels and its computation is as efficient as the standard single kernel-based algorithms. Moreover, large amounts of information associated with input space can be exploited, and thus generaliz
6

Caraka, Rezzy Eko, Hasbi Yasin, and Adi Waridi Basyiruddin. "Peramalan Crude Palm Oil (CPO) Menggunakan Support Vector Regression Kernel Radial Basis." Jurnal Matematika 7, no. 1 (2017): 43. http://dx.doi.org/10.24843/jmat.2017.v07.i01.p81.

Abstract:
Recently, instead of selecting a single kernel, approaches have been proposed for SVR in which the weight of each kernel is optimized during training. Along this line of research, many pioneering kernel learning algorithms have been proposed. The use of kernels provides a powerful and principled approach to modeling nonlinear patterns through linear patterns in a feature space. Another benefit is that the design of kernels and linear methods can be decoupled, which greatly facilitates the modularity of machine learning methods. We perform experiments on real data sets of crude palm oil prices for application and
7

Mackenzie, M., and A. K. Tieu. "Asymmetric Kernel Regression." IEEE Transactions on Neural Networks 15, no. 2 (2004): 276–82. http://dx.doi.org/10.1109/tnn.2004.824414.

8

Chen, Xin, Xuejun Ma, and Wang Zhou. "Kernel density regression." Journal of Statistical Planning and Inference 205 (March 2020): 318–29. http://dx.doi.org/10.1016/j.jspi.2019.09.001.

9

Fagundes, Roberta A. A., Renata M. C. R. de Souza, and Francisco José A. Cysneiros. "Interval kernel regression." Neurocomputing 128 (March 2014): 371–88. http://dx.doi.org/10.1016/j.neucom.2013.08.029.

10

Lee, Myung Hee, and Yufeng Liu. "Kernel continuum regression." Computational Statistics & Data Analysis 68 (December 2013): 190–201. http://dx.doi.org/10.1016/j.csda.2013.06.016.

11

Yang, Shuyuan, Min Wang, and Licheng Jiao. "Ridgelet kernel regression." Neurocomputing 70, no. 16-18 (2007): 3046–55. http://dx.doi.org/10.1016/j.neucom.2006.05.015.

12

Braun, W. John, and Li-Shan Huang. "Kernel spline regression." Canadian Journal of Statistics 33, no. 2 (2005): 259–78. http://dx.doi.org/10.1002/cjs.5550330207.

13

Olanrewaju, Rasaki Olawale, Toyin Omoyeni Oguntola, Lukman Abiodun Nafiu, and Sodiq Adejare Olanrewaju. "On the robustness of the Olanrewaju-Olanrewaju regression kernel-based to nonparametric kernels for Support Vector Regressor (SVR)." Engineering and Applied Science Letters 7, no. 2 (2024): 42–54. https://doi.org/10.30538/psrp-easl2024.0101.

Abstract:
In this article we studied and juxtaposed nonparametric Least Square and the Olanrewaju-Olanrewaju regression-type \({L_{(O - O){\lambda _{\gamma (\left| \theta \right|)}}}}\) kernels for supervised Support Vector Regressor (SVR) machine learning of hyperplane regression in a bivariate setting. The nonparametric kernels used to expound the SVR were Bisquare, Gaussian, Triweight, Uniform, Epanechnikov, and Triangular. Lagrangian multiplier estimation technique was adopted in estimating the involved SVR hyperplane regression coefficients as well as other embedded coefficients in each of the stat
14

Liang, Junjie, Yanting Wu, Dongkuan Xu, and Vasant G. Honavar. "Longitudinal Deep Kernel Gaussian Process Regression." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 10 (2021): 8556–64. http://dx.doi.org/10.1609/aaai.v35i10.17038.

Abstract:
Gaussian processes offer an attractive framework for predictive modeling from longitudinal data, i.e., irregularly sampled, sparse observations from a set of individuals over time. However, such methods have two key shortcomings: (i) They rely on ad hoc heuristics or expensive trial and error to choose the effective kernels, and (ii) They fail to handle multilevel correlation structure in the data. We introduce Longitudinal deep kernel Gaussian process regression (L-DKGPR) to overcome these limitations by fully automating the discovery of complex multilevel correlation structure from longitudina
15

Verma, Neetu, Sujoy Das, and Namita Srivastava. "Multiple kernel support vector regression for pricing nifty option." International Journal of Applied Mathematical Research 4, no. 4 (2015): 488. http://dx.doi.org/10.14419/ijamr.v4i4.5023.

Abstract:
The goal of present experiments is to investigate the use of multiple kernel learning as a tool for pricing options in the context of Indian stock market for Nifty index options. In this paper, fair price of an option is predicted by Multiple Kernel Support Vector Regression (MKLSVR) using linear combinations of kernels and Single Kernel Support Vector Regression (SKSVR). Prices of option highly depend on different money market conditions like deep-in-the-money, in-the-money, at-the-money, out-of-money and deep-out-of-money condition. The experimental study attempts to identify the fo
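A rough sketch of the multiple-kernel idea in this abstract: base kernels are combined into a single Gram matrix and passed to SVR as a precomputed kernel. The fixed equal weights and synthetic option-style features below are illustrative assumptions; the paper learns the kernel combination from data.

    # Sketch of multiple-kernel SVR: a convex combination of base kernels is
    # supplied to SVR as a precomputed Gram matrix. Equal weights stand in for
    # learned MKL weights; the option-style features are synthetic placeholders.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel

    rng = np.random.default_rng(1)
    X_train = rng.normal(size=(150, 4))   # e.g. moneyness, maturity, volatility, rate
    y_train = np.sin(X_train[:, 0]) + 0.5 * X_train[:, 1] + rng.normal(0.0, 0.1, 150)
    X_test = rng.normal(size=(30, 4))

    def combined_kernel(A, B, w=(1 / 3, 1 / 3, 1 / 3)):
        """Convex combination of linear, degree-2 polynomial and RBF kernels."""
        return (w[0] * linear_kernel(A, B)
                + w[1] * polynomial_kernel(A, B, degree=2)
                + w[2] * rbf_kernel(A, B, gamma=0.5))

    svr = SVR(kernel="precomputed", C=10.0, epsilon=0.05)
    svr.fit(combined_kernel(X_train, X_train), y_train)
    # Prediction uses the Gram matrix between test and training points.
    y_pred = svr.predict(combined_kernel(X_test, X_train))
    print(y_pred[:5])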
16

Indra Septiawati, Eka Suryani, Elvia Budianita, Fitri Insani, and Lola Oktavia. "Prediksi Jumlah Perceraian Menggunakan Metode Support Vector Regression (SVR)." Journal of Computer System and Informatics (JoSYC) 5, no. 1 (2023): 208–17. http://dx.doi.org/10.47065/josyc.v5i1.4613.

Abstract:
The increasing number of divorces poses an increasingly significant social challenge in Indonesia, including in the city of Pekanbaru. The impact of these divorces on the adolescent population can have negative effects on their emotional and psychological well-being, as well as their ability to interact socially and engage in the learning process. This study utilizes monthly divorce data from 2015 to April 2023 to conduct time series analysis and applies the Support Vector Regression (SVR) method to predict the number of divorces in the city of Pekanbaru. Three types of SVR kernels, namely lin
17

Sakinah, Nur, Nurfitra, Nurmasyita Ihlasia, and Lilies Handayani. "Modeling of Poverty Level in Central Sulawesi Using Nonparametric Kernel Regression Analysis Approach." Parameter: Journal of Statistics 2, no. 3 (2022): 11–16. http://dx.doi.org/10.22487/27765660.2022.v2.i3.15743.

Abstract:
Poverty is defined as a person's inability to meet their basic needs. The level of poverty can be used to assess how well or poorly a country's economy is doing. The kernel regression method is used in this study to model the poverty rate in Central Sulawesi in 2020. According to the findings of this study, comparing poverty rate predictions for the Gaussian kernel function and the Epanechnikov kernel function shows that different kernel functions, each with its own optimal bandwidth, will produce the same curve estimate. So, in kern
18

Liu, Bing, Shixiong Xia, and Yong Zhou. "Multiple Kernel Spectral Regression for Dimensionality Reduction." Journal of Applied Mathematics 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/427462.

Abstract:
Traditional manifold learning algorithms, such as locally linear embedding, Isomap, and Laplacian eigenmap, only provide the embedding results of the training samples. To solve the out-of-sample extension problem, spectral regression (SR) solves the problem of learning an embedding function by establishing a regression framework, which can avoid eigen-decomposition of dense matrices. Motivated by the effectiveness of SR, we incorporate multiple kernel learning (MKL) into SR for dimensionality reduction. The proposed approach (termed MKL-SR) seeks an embedding function in the Reproducing Kernel
19

Aseel Sameer Mohammed. "Suggested Method for Prediction Using Gaussian Process Regression Kernel Regression." Journal of Information Systems Engineering and Management 10, no. 35s (2025): 721–34. https://doi.org/10.52783/jisem.v10i35s.6280.

Abstract:
Accurately predicting children's weight is challenging due to measurement inconsistencies. To address this, a hybrid kernel function, combining the squared exponential kernel and the Gaussian kernel with a mixture parameter, is proposed for developing a fuzzy Gaussian process regression model. The integration of fuzzy set theory and a triangular membership function helps handle weight measurement inaccuracies by determining the degrees of membership for each element in the weight vector. The model is estimated using the spider monkey optimization (SMO) algorithm and implemented in MATLAB Ver.
20

Zhao, Lv, Yi Dan Su, Hua Qin, and Pian Pian Ma. "Study of Multiple-Kernel Relevance Vector Machine Based on Kernel Alignment." Applied Mechanics and Materials 239-240 (December 2012): 1308–12. http://dx.doi.org/10.4028/www.scientific.net/amm.239-240.1308.

Abstract:
The relevance vector machine (RVM) was a Bayesian framework for learning sparse regression models and classifiers, it used single kernel function to map training data from low dimension sample space to high dimension feature space. The prediction accuracy and generalization of traditional single-kernel RVM (sRVM) were not ideal both in classification and regression, so we constructed homogeneous and heterogeneous multiple kernels function (MKF) by kernel function combination in which we testified the validity of basic kernel function (BKF) and its parameters we employed by kernel alignment (KA
21

Alida, Mufni, and Metty Mustikasari. "Rupiah Exchange Prediction of US Dollar Using Linear, Polynomial, and Radial Basis Function Kernel in Support Vector Regression." Jurnal Online Informatika 5, no. 1 (2020): 53–60. http://dx.doi.org/10.15575/join.v5i1.537.

Abstract:
As a developing country, Indonesia is affected by fluctuations in foreign exchange rates, especially the US Dollar. Determination of foreign exchange rates must be profitable so a country can run its economy well. The prediction of the exchange rate is done to find out the large exchange rates that occur in the future and the government can take the right policy. Prediction is done by one of the Machine Learning methods, namely the Support Vector Regression (SVR) algorithm. The prediction model is made using three kernels in SVR. Each kernel has the best model and, the accuracy and error value
22

Cai, Jia. "Coefficient-Based Regression with Non-Identical Unbounded Sampling." Abstract and Applied Analysis 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/134727.

Abstract:
We investigate a coefficient-based least squares regression problem with indefinite kernels from non-identical unbounded sampling processes. Here non-identical unbounded sampling means the samples are drawn independently but not identically from unbounded sampling processes. The kernel is not necessarily symmetric or positive semi-definite. This leads to additional difficulty in the error analysis. By introducing a suitable reproducing kernel Hilbert space (RKHS) and a suitable intermediate integral operator, elaborate analysis is presented by means of a novel technique for the sample error. T
23

Ramadhan, Riza F., and Robert Kurniawan. "PEMODELAN DATA KEMATIAN BAYI DENGAN GEOGRAPHICALLY WEIGHTED NEGATIVE BINOMIAL REGRESSION." MEDIA STATISTIKA 9, no. 2 (2017): 95. http://dx.doi.org/10.14710/medstat.9.2.95-106.

Abstract:
The overdispersion phenomenon and the influence of location or spatial aspects on data are handled using Geographically Weighted Negative Binomial Regression (GWNBR). GWNBR is the best solution to form a regression analysis that is specific to each observation’s location. The analysis results in parameter values that differ from one observation location to another. Weighting matrix selection is done before the GWNBR modeling; different weightings will result in different models. Thus this study aims to investigate the best fit model using infant mortality data that is produced by some
24

Teng, Tong, Jie Chen, Yehong Zhang, and Bryan Kian Hsiang Low. "Scalable Variational Bayesian Kernel Selection for Sparse Gaussian Process Regression." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (2020): 5997–6004. http://dx.doi.org/10.1609/aaai.v34i04.6061.

Abstract:
This paper presents a variational Bayesian kernel selection (VBKS) algorithm for sparse Gaussian process regression (SGPR) models. In contrast to existing GP kernel selection algorithms that aim to select only one kernel with the highest model evidence, our VBKS algorithm considers the kernel as a random variable and learns its belief from data such that the uncertainty of the kernel can be interpreted and exploited to avoid overconfident GP predictions. To achieve this, we represent the probabilistic kernel as an additional variational variable in a variational inference (VI) framework for SG
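For orientation, a much simpler baseline than the VBKS algorithm described here is to fit a Gaussian process with each candidate kernel and compare log marginal likelihoods; the short sketch below does exactly that on synthetic data and is not the paper's variational method.

    # Baseline kernel selection for GP regression: compare candidate kernels by
    # log marginal likelihood (model evidence). The cited VBKS algorithm instead
    # treats the kernel as a random variable; this is only an illustrative baseline.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic, WhiteKernel

    rng = np.random.default_rng(2)
    X = np.sort(rng.uniform(0.0, 10.0, size=(60, 1)), axis=0)
    y = np.sin(X).ravel() + rng.normal(0.0, 0.2, 60)

    candidates = {
        "RBF": 1.0 * RBF(length_scale=1.0),
        "Matern-3/2": 1.0 * Matern(length_scale=1.0, nu=1.5),
        "RationalQuadratic": 1.0 * RationalQuadratic(length_scale=1.0, alpha=1.0),
    }
    for name, kernel in candidates.items():
        gp = GaussianProcessRegressor(kernel=kernel + WhiteKernel(0.05), normalize_y=True)
        gp.fit(X, y)
        # Higher log marginal likelihood = more support for this kernel on the data.
        print(f"{name:>18s}  log evidence = {gp.log_marginal_likelihood_value_:.2f}")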
25

Keddy, Dickson, Elias Nii, and Bright Bediako-Kyeremeh. "Coupled Kernel Ensemble Regression." International Journal of Computer Applications 181, no. 34 (2018): 1–8. http://dx.doi.org/10.5120/ijca2018918278.

26

Koo, Ja-Yong, Kwi Wook Park, Byung Won Kim, Kwang-Rae Kim, and Changyi Park. "Structured kernel quantile regression." Journal of Statistical Computation and Simulation 83, no. 1 (2013): 179–90. http://dx.doi.org/10.1080/00949655.2011.631923.

27

Han, Weiwei. "Local Reweighted Kernel Regression." International Journal of Engineering and Manufacturing 1, no. 1 (2011): 20–26. http://dx.doi.org/10.5815/ijem.2011.01.04.

28

Zheng, Qi, Colin Gallagher, and K. B. Kulasekera. "Adaptively weighted kernel regression." Journal of Nonparametric Statistics 25, no. 4 (2013): 855–72. http://dx.doi.org/10.1080/10485252.2013.813511.

29

Jianke Zhu, S. Hoi, and M. R. T. Lyu. "Robust Regularized Kernel Regression." IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 38, no. 6 (2008): 1639–44. http://dx.doi.org/10.1109/tsmcb.2008.927279.

30

Gao, Chao, and Xiao-jun Wu. "Kernel Support Tensor Regression." Procedia Engineering 29 (2012): 3986–90. http://dx.doi.org/10.1016/j.proeng.2012.01.606.

31

Eichner, Gerrit, and Winfried Stute. "Kernel adjusted nonparametric regression." Journal of Statistical Planning and Inference 142, no. 9 (2012): 2537–44. http://dx.doi.org/10.1016/j.jspi.2012.03.011.

32

Di Marzio, Marco, and Charles C. Taylor. "On boosting kernel regression." Journal of Statistical Planning and Inference 138, no. 8 (2008): 2483–98. http://dx.doi.org/10.1016/j.jspi.2007.10.005.

33

Cawley, Gavin C., Nicola L. C. Talbot, Robert J. Foxall, Stephen R. Dorling, and Danilo P. Mandic. "Heteroscedastic kernel ridge regression." Neurocomputing 57 (March 2004): 105–24. http://dx.doi.org/10.1016/j.neucom.2004.01.005.

34

Bang, Sungwan, Soo-Heang Eo, Myoungshic Jhun, and Hyung Jun Cho. "Composite kernel quantile regression." Communications in Statistics - Simulation and Computation 46, no. 3 (2015): 2228–40. http://dx.doi.org/10.1080/03610918.2015.1039133.

35

Kramer, Oliver, and Fabian Gieseke. "Evolutionary kernel density regression." Expert Systems with Applications 39, no. 10 (2012): 9246–54. http://dx.doi.org/10.1016/j.eswa.2012.02.080.

36

Kumar, M., N. K. Tiwari, and S. Ranjan. "Kernel function based regression approaches for estimating the oxygen transfer performance of plunging hollow jet aerator." Journal of Achievements in Materials and Manufacturing Engineering 2, no. 95 (2019): 74–84. http://dx.doi.org/10.5604/01.3001.0013.7917.

Abstract:
Purpose: To evaluate the capability of various kernels employed with support vector regression (SVR) and Gaussian process regression (GPR) techniques in estimating the volumetric oxygen transfer coefficient of plunging hollow jets. Design/methodology/approach: In this study, a data set of 81 observations is acquired from laboratory experiments of hollow jets plunging on the surface of water in the tank. The jet variables: jet velocity, jet thickness, jet length, and water depth are varied accordingly and the values of the volumetric oxygen transfer coefficient are computed. An empirical relationshi
37

John, Lija, and Vani V. Prakash. "Survey Kernel Optimized Regression Model for Product Recommendation." International Journal of Trend in Scientific Research and Development 2, no. 3 (2018): 1877–83. http://dx.doi.org/10.31142/ijtsrd11306.

38

Yu, Yang, Chenlong Fan, Qibin Li, et al. "Research on Mass Prediction of Maize Kernel Based on Machine Vision and Machine Learning Algorithm." Processes 13, no. 2 (2025): 346. https://doi.org/10.3390/pr13020346.

Abstract:
The yield assessment process during maize harvesting is a necessary means to ensure farmers’ economic benefits and stable agricultural production. Predicting the mass of maize kernels is an important condition for yield detection. This study proposes a maize kernel mass prediction model based on machine vision and machine learning algorithms to determine whether the kernels are broken. By extracting the geometric features of maize kernels, a phenotypic feature dataset of maize kernels was constructed. Subsequently, popular machine learning algorithms were used to establish regression models fo
39

Wu, Qiang, Feng Liang, and Sayan Mukherjee. "Kernel Sliced Inverse Regression: Regularization and Consistency." Abstract and Applied Analysis 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/540725.

Abstract:
Kernel sliced inverse regression (KSIR) is a natural framework for nonlinear dimension reduction using the mapping induced by kernels. However, there are numeric, algorithmic, and conceptual subtleties in making the method robust and consistent. We apply two types of regularization in this framework to address computational stability and generalization performance. We also provide an interpretation of the algorithm and prove consistency. The utility of this approach is illustrated on simulated and real data.
40

Fadillah, Nur, Priliany Audina Dariah, Anisa Anggraeni, Nur Cahyani, and Lilies Handayani. "Comparison of Gaussian and Epancehnikov Kernels." Tadulako Social Science and Humaniora Journal 3, no. 1 (2022): 13–22. http://dx.doi.org/10.22487/sochum.v3i1.15745.

Abstract:
Kernel regression is a nonparametric analysis with smoothing method. Smoothing has become synonymous with nonparametric methods used to estimate functions. The purpose of smoothing is to remove variability from data that has no effect so that the characteristics of the data will appear clear. Kernel regression has a flexible form and the mathematical calculations are easy to adjust. In kernel regression, an estimator is known which is usually used to estimate the regression function, namely the Nadaraya-Watson estimator. This study aims to show how to estimate data using nonparametric regressi
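For reference, the Nadaraya-Watson estimator and the two kernels compared in this abstract have the standard forms below (standard definitions, not the authors' own notation):

    \[
      \hat{m}_h(x) \;=\; \frac{\sum_{i=1}^{n} K\!\left(\tfrac{x - X_i}{h}\right) Y_i}
                              {\sum_{i=1}^{n} K\!\left(\tfrac{x - X_i}{h}\right)},
      \qquad
      K_{\mathrm{Gauss}}(u) = \tfrac{1}{\sqrt{2\pi}}\, e^{-u^2/2},
      \qquad
      K_{\mathrm{Epa}}(u) = \tfrac{3}{4}\,(1 - u^2)\,\mathbf{1}\{|u| \le 1\},
    \]

    where h is the bandwidth over which each kernel's estimate is tuned.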
41

Salzo, Saverio, and Johan A. K. Suykens. "Generalized support vector regression: Duality and tensor-kernel representation." Analysis and Applications 18, no. 01 (2019): 149–83. http://dx.doi.org/10.1142/s0219530519410069.

Abstract:
In this paper, we study the variational problem associated to support vector regression in Banach function spaces. Using the Fenchel–Rockafellar duality theory, we give an explicit formulation of the dual problem as well as of the related optimality conditions. Moreover, we provide a new computational framework for solving the problem which relies on a tensor-kernel representation. This analysis overcomes the typical difficulties connected to learning in Banach spaces. We finally present a large class of tensor-kernels to which our theory fully applies: power series tensor kernels. This type o
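As background for the duality discussed here, the classical Hilbert-space dual of epsilon-insensitive SVR (which the paper generalizes to Banach function spaces with tensor kernels) is the standard quadratic program:

    \[
      \max_{\alpha,\,\alpha^*} \;
      -\tfrac{1}{2}\sum_{i,j}(\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*)\,k(x_i, x_j)
      \;-\; \varepsilon \sum_i (\alpha_i + \alpha_i^*)
      \;+\; \sum_i y_i (\alpha_i - \alpha_i^*)
    \]

    subject to \( \sum_i (\alpha_i - \alpha_i^*) = 0 \) and \( 0 \le \alpha_i, \alpha_i^* \le C \).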
42

Sadek, Amjed Mohammed, and Lekaa Ali Mohammed. "Evaluation of the Performance of Kernel Non-parametric Regression and Ordinary Least Squares Regression." JOIV : International Journal on Informatics Visualization 8, no. 3 (2024): 1352. http://dx.doi.org/10.62527/joiv.8.3.2430.

Abstract:
Researchers need to understand the differences between parametric and nonparametric regression models and how they work with available information about the relationship between response and explanatory variables and the distribution of random errors. This paper proposes a new nonparametric regression function for the kernel and employs it with the Nadaraya-Watson kernel estimator method and the Gaussian kernel function. The proposed kernel function (AMS) is then compared to the Gaussian kernel and the traditional parametric method, the ordinary least squares method (OLS). The objective of thi
43

Widiarni, Adyah, and Mustakim Mustakim. "Penerapan Algoritma Support Vector Regression dalam Memprediksi Produksi dan Produktivitas Kelapa Sawit." JURNAL MEDIA INFORMATIKA BUDIDARMA 7, no. 2 (2023): 864. http://dx.doi.org/10.30865/mib.v7i2.6089.

Abstract:
Palm oil is a plantation crop that provides the highest economic value in Indonesia. Riau is currently the highest palm oil producing province in Indonesia with a state-run palm oil company, PTPN V. However, palm oil production is not always stable every month, experiencing ups and downs in the amount of production and productivity due to several factors including irregular rainfall, climate, soil fertility and, most importantly, fruit bunches that are not ready to harvest. So the data mining process is carried out by predicting the amount of production and productivity of oil pal
44

Puspitasari, Chasandra, Nur Rokhman, and Wahyono. "PREDICTION OF OZONE (O3) VALUES USING SUPPORT VECTOR REGRESSION METHOD." Jurnal Informatika Polinema 7, no. 4 (2021): 81–88. http://dx.doi.org/10.33795/jip.v7i4.777.

Abstract:
A large number of motor vehicles that cause congestion is a major factor in the poor air quality in big cities. Ozone (O3) is one of the main indicators in measuring the level of air pollution in the city of Surabaya to find out how air quality. Prediction of Ozone (O3) value is important as a support for the community and government in efforts to improve the air quality. This study aims to predict the value of Ozone (O3) in the form of time series data using the Support Vector Regression (SVR) method with the Linear, Polynomial, RBF, and ANOVA kernels. The data used in this study are 549 prim
45

DACHAPAK, Chooleewan, Shunshoku KANAE, Zi-Jiang YANG, and Kiyoshi WADA. "Kernel Principal Component Regression in Reproducing Kernel Hilbert Space." Proceedings of the ISCIE International Symposium on Stochastic Systems Theory and its Applications 2003 (May 5, 2003): 213–18. http://dx.doi.org/10.5687/sss.2003.213.

46

Nakarmi, Janet, Hailin Sang, and Lin Ge. "Variable bandwidth kernel regression estimation." ESAIM: Probability and Statistics 25 (2021): 55–86. http://dx.doi.org/10.1051/ps/2021003.

Abstract:
In this paper we propose a variable bandwidth kernel regression estimator for i.i.d. observations in ℝ² to improve the classical Nadaraya-Watson estimator. The bias is improved to the order of O(h_n^4) under the condition that the fifth order derivative of the density function and the sixth order derivative of the regression function are bounded and continuous. We also establish the central limit theorems for the proposed ideal and true variable kernel regression estimators. The simulation study confirms our results and demonstrates the advantage of the variable bandwidth kernel method over the
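A rough sketch of the construction (not necessarily the authors' exact estimator): the fixed bandwidth h_n of the Nadaraya-Watson estimator is replaced by a data-dependent local factor,

    \[
      \hat{m}(x) \;=\; \frac{\sum_{i=1}^{n} K\!\left(\tfrac{x - X_i}{h_n\, b(X_i)}\right) Y_i}
                            {\sum_{i=1}^{n} K\!\left(\tfrac{x - X_i}{h_n\, b(X_i)}\right)},
    \]

    where b(·) is a data-driven bandwidth factor (typically tied to a pilot density estimate); under the smoothness conditions stated in the abstract, the leading bias term improves from the usual O(h_n^2) of the fixed-bandwidth estimator to O(h_n^4).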
47

Fitriah, Ma’idatul, Inggih Permana, Febi Nur Salisah, Medyantiwi Rahmawita Munzir, and Megawati Megawati. "Peramalan Jumlah Kedatangan Wisatawan Menggunakan Support Vector Regression Berbasis Sliding Window." JURNAL MEDIA INFORMATIKA BUDIDARMA 8, no. 3 (2024): 1366. http://dx.doi.org/10.30865/mib.v8i3.7408.

Abstract:
As a developing city, Pekanbaru has the potential for attractive tourist attractions for tourists. The arrival of tourists has had a big positive impact on the economy of Pekanbaru City. The number of tourist arrivals can experience ups and downs every month, for this reason it is necessary to forecast the number of tourists in the future. This research aims to apply the Orange Data Mining application in predicting the number of tourist arrivals by comparing the kernels in the Support Vector Regression (SVR) method and applying Sliding Window size 3 to window size 13 to transform into time ser
48

Moon, Young-Il, Sung-Jin Cho, and Si-Young Chun. "Nonparametic Kernel Regression model for Rating curve." Journal of Korea Water Resources Association 36, no. 6 (2003): 1025–33. http://dx.doi.org/10.3741/jkwra.2003.36.6.1025.

49

Park, B. U., Y. K. Lee, and S. Ha. "$L_2$ boosting in kernel regression." Bernoulli 15, no. 3 (2009): 599–613. http://dx.doi.org/10.3150/08-bej160.

50

Halconruy, H., and N. Marie. "Kernel Selection in Nonparametric Regression." Mathematical Methods of Statistics 29, no. 1 (2020): 32–56. http://dx.doi.org/10.3103/s1066530720010044.
