To see the other types of publications on this topic, follow the link: Support vector regression.

Journal articles on the topic 'Support vector regression'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Support vector regression.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Sabzekar, Mostafa, and Seyed Mohammad Hossein Hasheminejad. "Robust regression using support vector regressions." Chaos, Solitons & Fractals 144 (March 2021): 110738. http://dx.doi.org/10.1016/j.chaos.2021.110738.

2

Jun, Sung-Hae. "An Outlier Data Analysis using Support Vector Regression." Journal of Korean Institute of Intelligent Systems 18, no. 6 (December 25, 2008): 876–80. http://dx.doi.org/10.5391/jkiis.2008.18.6.876.

3

Jun, Sung-Hae, Jung-Eun Park, and Kyung-Whan Oh. "A Sparse Data Preprocessing Using Support Vector Regression." Journal of Korean Institute of Intelligent Systems 14, no. 6 (October 1, 2004): 789–92. http://dx.doi.org/10.5391/jkiis.2004.14.6.789.

4

Lee, Hyoung-Ro, and Hyun-Jung Shin. "Electricity Demand Forecasting based on Support Vector Regression." IE interfaces 24, no. 4 (December 1, 2011): 351–61. http://dx.doi.org/10.7232/ieif.2011.24.4.351.

5

Kenesei, Tamás, and János Abonyi. "Interpretable support vector regression." Artificial Intelligence Research 1, no. 2 (October 9, 2012): 11. http://dx.doi.org/10.5430/air.v1n2p11.

Abstract:
This paper deals with transforming support vector regression (SVR) models into fuzzy inference systems (FIS). It is highlighted that trained support vector based models can be used for the construction of fuzzy rule-based regression models. However, the transformed support vector model does not automatically result in an interpretable fuzzy model: training a support vector model results in a complex rule base, where the number of rules is approximately 40–60% of the number of training data, so reduction of the fuzzy model initialized from the support vector model is an essential task. For this purpose, a three-step reduction algorithm is used, based on a combination of previously published model reduction techniques: the reduced set method first decreases the number of kernel functions; then, after the reduced support vector model is transformed into a fuzzy rule base, similarity-measure-based rule merging and orthogonal least-squares methods are applied. The proposed approach is applied to nonlinear system identification; the identification of a Hammerstein system is used to demonstrate the accuracy of the technique while fulfilling the criteria of interpretability.
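To make the rule-count claim above concrete: in kernel SVR, each support vector seeds one kernel-based rule, so the size of the initial fuzzy rule base tracks the number of support vectors. A minimal sketch, assuming scikit-learn's SVR and synthetic data (illustrative only, not the paper's three-step reduction algorithm):

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X).ravel() + rng.normal(scale=0.1, size=200)

# Fit an epsilon-SVR; before any reduction step, every support vector
# would seed one kernel-based fuzzy rule.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
ratio = len(model.support_) / len(X)
print(f"support vectors: {len(model.support_)} ({ratio:.0%} of training data)")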
6

Lv, Yuan, and Zhong Gan. "Robust ε-Support Vector Regression." Mathematical Problems in Engineering 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/373571.

Abstract:
Spheroid disturbance of input data poses great challenges to support vector regression; thus it is essential to study a robust regression model. This paper is dedicated to establishing a robust regression model that makes the regression function robust against disturbances of the data and system parameters. First, two theorems are given to show that the robust linear ε-support vector regression problem can be settled by solving its dual problems. Second, the focus is on the development of a robust support vector regression algorithm, which is extended from the linear domain to the nonlinear domain. Finally, the numerical experiment results demonstrate the effectiveness of the models and algorithms proposed in this paper.
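For reference, the standard (non-robust) linear ε-support vector regression primal that the robust formulation above extends is the textbook problem:

\min_{w,\, b,\, \xi,\, \xi^*} \quad \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \left( \xi_i + \xi_i^* \right)

\text{s.t.} \quad y_i - (w^\top x_i + b) \le \varepsilon + \xi_i, \qquad (w^\top x_i + b) - y_i \le \varepsilon + \xi_i^*, \qquad \xi_i,\, \xi_i^* \ge 0,

where ε is the width of the insensitive tube and the slack variables ξ absorb deviations beyond it; the robust variant additionally guards this problem against the data disturbances described above.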
7

Lingras, P., and C. J. Butz. "Rough support vector regression." European Journal of Operational Research 206, no. 2 (October 2010): 445–55. http://dx.doi.org/10.1016/j.ejor.2009.10.023.

8

Chu, Wei, and S. Sathiya Keerthi. "Support Vector Ordinal Regression." Neural Computation 19, no. 3 (March 2007): 792–815. http://dx.doi.org/10.1162/neco.2007.19.3.792.

Abstract:
In this letter, we propose two new support vector approaches for ordinal regression, which optimize multiple thresholds to define parallel discriminant hyperplanes for the ordinal scales. Both approaches guarantee that the thresholds are properly ordered at the optimal solution. The size of these optimization problems is linear in the number of training samples. The sequential minimal optimization algorithm is adapted for the resulting optimization problems; it is extremely easy to implement and scales efficiently as a quadratic function of the number of examples. The results of numerical experiments on some benchmark and real-world data sets, including applications of ordinal regression to information retrieval, verify the usefulness of these approaches.
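A small sketch of the prediction rule described in the abstract: the ordered thresholds cut the latent score w·x into consecutive rank bins. The weights and thresholds below are hypothetical placeholders, not values learned by the authors' SMO-based solver:

import numpy as np

w = np.array([0.8, -0.3])                # latent direction (hypothetical)
thresholds = np.array([-1.0, 0.0, 1.5])  # properly ordered: b1 <= b2 <= b3

def predict_rank(x):
    # rank r is assigned when thresholds[r-2] < w.x <= thresholds[r-1]
    return int(np.searchsorted(thresholds, w @ x)) + 1  # ranks 1..4

print(predict_rank(np.array([2.0, 1.0])))  # score 1.3 lies in (0.0, 1.5] -> rank 3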
9

Harrington, Peter de B. "Automated support vector regression." Journal of Chemometrics 31, no. 4 (December 28, 2016): e2867. http://dx.doi.org/10.1002/cem.2867.

10

Panagopoulos, Orestis P., Petros Xanthopoulos, Talayeh Razzaghi, and Onur Şeref. "Relaxed support vector regression." Annals of Operations Research 276, no. 1-2 (April 11, 2018): 191–210. http://dx.doi.org/10.1007/s10479-018-2847-6.

11

Muthukrishnan, R., and S. Kalaivani. "Robust Weighted Support Vector Regression Approach for Predictive Modeling." Indian Journal Of Science And Technology 16, no. 30 (August 14, 2023): 2287–96. http://dx.doi.org/10.17485/ijst/v16i30.1180.

12

Lee, Dongju, and Sujin Choi. "Generalized Support Vector Quantile Regression." Journal of Society of Korea Industrial and Systems Engineering 43, no. 4 (December 30, 2020): 107–15. http://dx.doi.org/10.11627/jkise.2020.43.4.107.

13

Shi, Yingzhong, Shitong Wang, Yizhang Jiang, and Peiling Liu. "Transfer learning support vector regression." Journal of Computer Applications 33, no. 11 (November 26, 2013): 3084–89. http://dx.doi.org/10.3724/sp.j.1087.2013.03084.

14

Seok, Kyungha. "Semisupervised support vector quantile regression." Journal of the Korean Data and Information Science Society 26, no. 2 (March 31, 2015): 517–24. http://dx.doi.org/10.7465/jkdi.2015.26.2.517.

15

Musicant, D. R., and A. Feinberg. "Active Set Support Vector Regression." IEEE Transactions on Neural Networks 15, no. 2 (March 2004): 268–75. http://dx.doi.org/10.1109/tnn.2004.824259.

16

Shim, Jooyong, Kyungha Seok, and Changha Hwang. "Monotone support vector quantile regression." Communications in Statistics - Theory and Methods 46, no. 10 (May 31, 2016): 5180–93. http://dx.doi.org/10.1080/03610926.2015.1096395.

17

Hong, Dug Hun, and Changha Hwang. "Support vector fuzzy regression machines." Fuzzy Sets and Systems 138, no. 2 (September 2003): 271–81. http://dx.doi.org/10.1016/s0165-0114(02)00514-6.

18

Parastalooi, Nafiseh, Ali Amiri, and Parisa Aliheidari. "Modified twin support vector regression." Neurocomputing 211 (October 2016): 84–97. http://dx.doi.org/10.1016/j.neucom.2016.01.105.

19

Singh, Mittul, Jivitej Chadha, Puneet Ahuja, Jayadeva, and Suresh Chandra. "Reduced twin support vector regression." Neurocomputing 74, no. 9 (April 2011): 1474–77. http://dx.doi.org/10.1016/j.neucom.2010.11.003.

20

Zhao, Yongping, and Jianguo Sun. "Rough ν-support vector regression." Expert Systems with Applications 36, no. 6 (August 2009): 9793–98. http://dx.doi.org/10.1016/j.eswa.2009.02.007.

21

Zhao, Yong-Ping, and Jian-Guo Sun. "Robust truncated support vector regression." Expert Systems with Applications 37, no. 7 (July 2010): 5126–33. http://dx.doi.org/10.1016/j.eswa.2009.12.082.

22

Balasundaram, S., and Kapil. "On Lagrangian support vector regression." Expert Systems with Applications 37, no. 12 (December 2010): 8784–92. http://dx.doi.org/10.1016/j.eswa.2010.06.028.

23

Joki, Kaisa, Adil M. Bagirov, Napsu Karmitsa, Marko M. Mäkelä, and Sona Taheri. "Clusterwise support vector linear regression." European Journal of Operational Research 287, no. 1 (November 2020): 19–35. http://dx.doi.org/10.1016/j.ejor.2020.04.032.

24

Chen, Xiaobo, Jian Yang, Jun Liang, and Qiaolin Ye. "Smooth twin support vector regression." Neural Computing and Applications 21, no. 3 (October 10, 2010): 505–13. http://dx.doi.org/10.1007/s00521-010-0454-9.

25

Wang, Huadong, Yong Shi, Lingfeng Niu, and Yingjie Tian. "Nonparallel Support Vector Ordinal Regression." IEEE Transactions on Cybernetics 47, no. 10 (October 2017): 3306–17. http://dx.doi.org/10.1109/tcyb.2017.2682852.

26

Han, Xixuan, and Line Clemmensen. "On Weighted Support Vector Regression." Quality and Reliability Engineering International 30, no. 6 (June 9, 2014): 891–903. http://dx.doi.org/10.1002/qre.1654.

27

Carrasco, Miguel, Julio López, and Sebastián Maldonado. "Epsilon-nonparallel support vector regression." Applied Intelligence 49, no. 12 (May 28, 2019): 4223–36. http://dx.doi.org/10.1007/s10489-019-01498-1.

28

Maia, Mateus, Jonatha Sousa Pimentel, Raydonal Ospina, and Anderson Ara. "Wavelet Support Vector Censored Regression." Analytics 2, no. 2 (May 4, 2023): 410–25. http://dx.doi.org/10.3390/analytics2020023.

Abstract:
Learning methods in survival analysis have the ability to handle censored observations. The Cox model is a prevalent predictive statistical technique for survival analysis, but its use rests on the strong assumption of hazard proportionality, which can be challenging to verify, particularly when working with non-linearity and high-dimensional data. Therefore, it may be necessary to consider a more flexible and generalizable approach, such as support vector machines. This paper proposes a new method, namely wavelet support vector censored regression, and compares the Cox model with traditional support vector regression and with support vector regression for censored data, i.e., survival models based on support vector machines. In addition, to evaluate the effectiveness of different kernel functions in the support vector censored regression approach to survival data, we conducted a series of simulations with varying numbers of observations and ratios of censored data. Based on the simulation results, we found that wavelet support vector censored regression outperformed the other methods in terms of the C-index. The evaluation was performed on simulations, survival benchmarking datasets, and a biomedical real-world application.
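The C-index used for the comparison above follows directly from its definition; here is a plain O(n^2) sketch of Harrell's concordance index (survival libraries such as lifelines and scikit-survival ship optimized versions):

import numpy as np

def c_index(time, event, risk):
    # A pair (i, j) is comparable when i's observed time is shorter and i had
    # an event; it is concordant when i also received the higher predicted risk.
    concordant, comparable = 0.0, 0
    for i in range(len(time)):
        for j in range(len(time)):
            if time[i] < time[j] and event[i] == 1:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

time = np.array([2.0, 5.0, 3.0, 8.0])
event = np.array([1, 0, 1, 1])         # 1 = event observed, 0 = censored
risk = np.array([0.9, 0.2, 0.7, 0.1])  # higher risk = predicted earlier failure
print(c_index(time, event, risk))      # 1.0: perfectly concordant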
29

Luo, Lin-Kai, Chao-Jie Xu, Ling-Jun Ye, and Hong Peng. "Some Support Vector Regression Machines with Given Empirical Risks Partly." 電腦學刊 (Journal of Computers) 33, no. 5 (October 2022): 061–72. http://dx.doi.org/10.53106/199115992022103305006.

Abstract:
There are often prior requirements on empirical risk in regression problems. To meet these requirements, this paper first proposes two novel support vector regression machine models in which part of the empirical risks are given. One is a support vector regression machine in which partial empirical risks are given (PSVR), and the other is a model in which unilateral partial empirical risks are given (UPSVR). For the samples with given empirical risk levels, PSVR meets the requirements through inequality constraints on the empirical risk levels, while for the other samples without an empirical risk requirement, PSVR uses the same strategy as traditional support vector regression (SVR). UPSVR is similar to PSVR, except that the inequality constraints on empirical risks are unilateral. Second, the dual problems and the solving methods of PSVR and UPSVR are given. Finally, the effectiveness and superiority of PSVR and UPSVR are verified by experiments on four artificial datasets; both achieve better regression performance than the traditional models. At the same time, PSVR is less sensitive than SVR to the trade-off coefficient C between empirical risk and confidence risk, so PSVR can select the parameter C faster and more conveniently. PSVR and UPSVR are extensions of the traditional models: when the set of samples with given empirical risks is empty, they degenerate into the traditional models. PSVR and UPSVR are suitable for scenarios with prior requirements on empirical risk.
30

Hur, Jin, and BeomJun Park. "The Development of the Short-Term Wind Power Forecasting System using Support Vector Regression." Journal of the Korean Institute of Illuminating and Electrical Installation Engineers 31, no. 9 (September 30, 2017): 104–10. http://dx.doi.org/10.5207/jieie.2017.31.9.104.

31

Qin, Li-Tang, Shu-Shen Liu, Hai-Ling Liu, and Yong-Hong Zhang. "Support vector regression and least squares support vector regression for hormetic dose–response curves fitting." Chemosphere 78, no. 3 (January 2010): 327–34. http://dx.doi.org/10.1016/j.chemosphere.2009.10.029.

32

Wang, Jian Guo, Liang Wu Cheng, Wen Xing Zhang, and Bo Qin. "A Modified Incremental Support Vector Machine for Regression." Applied Mechanics and Materials 135-136 (October 2011): 63–69. http://dx.doi.org/10.4028/www.scientific.net/amm.135-136.63.

Abstract:
The support vector machine (SVM) has been shown to exhibit superior predictive power compared to traditional approaches in many studies, such as mechanical equipment monitoring and diagnosis. However, SVM training is very costly in terms of time and memory consumption due to the enormous amounts of training data and the quadratic programming problem. In order to improve SVM training speed and accuracy, we propose a modified incremental support vector machine (MISVM) for regression problems in this paper. The main concept is to use the distance from the margin vectors that violate the Karush-Kuhn-Tucker (KKT) condition to the final decision hyperplane to evaluate the importance of each margin vector; the margin vectors whose distance is below a specified value are preserved, and the others are eliminated. Then the original support vectors and the remaining margin vectors are used to train a new SVM. The proposed MISVM can not only eliminate unimportant samples, such as noise samples, but also preserve the important ones. The effectiveness of the proposed MISVM is demonstrated on two UCI data sets. These experiments also show that the proposed MISVM is competitive with previously published methods.
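A rough sketch of the selection rule as we read the abstract: among new samples that violate the ε-tube (the KKT condition) of a trained SVR, keep only those whose residual stays below a cut-off, then retrain on the old support vectors plus the kept samples. The threshold delta and the use of scikit-learn's SVR are our assumptions, not the authors' implementation:

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X_old = rng.uniform(-3, 3, size=(150, 1))
y_old = np.sin(X_old).ravel()
X_new = rng.uniform(-3, 3, size=(50, 1))
y_new = np.sin(X_new).ravel() + rng.normal(scale=0.3, size=50)

eps, delta = 0.1, 0.5
svr = SVR(kernel="rbf", C=10.0, epsilon=eps).fit(X_old, y_old)

# Tube violators that remain close to the regression function are kept;
# samples farther than delta are treated as noise and discarded.
resid = np.abs(y_new - svr.predict(X_new))
keep = (resid > eps) & (resid < delta)

X_inc = np.vstack([X_old[svr.support_], X_new[keep]])
y_inc = np.concatenate([y_old[svr.support_], y_new[keep]])
svr_new = SVR(kernel="rbf", C=10.0, epsilon=eps).fit(X_inc, y_inc)
print(f"retrained on {len(X_inc)} samples instead of {len(X_old) + len(X_new)}")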
33

Peng, Xinjun, and Dong Xu. "Projection support vector regression algorithms for data regression." Knowledge-Based Systems 112 (November 2016): 54–66. http://dx.doi.org/10.1016/j.knosys.2016.08.030.

34

Melki, Gabriella, Alberto Cano, Vojislav Kecman, and Sebastián Ventura. "Multi-target support vector regression via correlation regressor chains." Information Sciences 415-416 (November 2017): 53–69. http://dx.doi.org/10.1016/j.ins.2017.06.017.

35

Widiyanti, Emilia, and Sukmawati Nur Endah. "Pengenalan Emosi dalam Musik Berdasarkan Musical Features Menggunakan Support Vector Regression" [Emotion Recognition in Music Based on Musical Features Using Support Vector Regression]. JURNAL MASYARAKAT INFORMATIKA 11, no. 2 (November 17, 2020): 1–14. http://dx.doi.org/10.14710/jmasif.11.2.34875.

Abstract:
Music is created to convey emotion and is often used in a variety of daily activities. Music Emotion Recognition has become a research field that has grown along with the development of musical genres and uses of music. This study presents the results of emotion recognition in music from musical features using Support Vector Regression, with the ɛ-Support Vector Regression and ʋ-Support Vector Regression training variants, and identifies the feature combinations that yield the best models. The data comprise 165 pieces of instrumental soundtrack music. The study produced the two best models using ʋ-SVR training: an angle-recognition model whose best input features are Pitch and Energy, and a distance-recognition model whose best input features are Zero Crossing Rate and Beat. The models were trained with parameter values cost = 2^7, gamma = 2^-7, and nu = 2^-2 for the angle model, and cost = 2^7, gamma = 2^-8, and nu = 2^-2 for the distance model. Recognition with these two models achieved an accuracy of 37.75%.
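For reference, the best configuration reported above maps directly onto a ν-SVR implementation; a minimal sketch assuming scikit-learn's NuSVR, with random placeholders standing in for the paper's extracted musical features and targets:

import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(2)
X = rng.normal(size=(165, 2))      # placeholder for the Pitch and Energy features
y = rng.uniform(-1, 1, size=165)   # placeholder for the "angle" target

# Parameters reported for the angle model: cost = 2^7, gamma = 2^-7, nu = 2^-2
model = NuSVR(C=2**7, gamma=2**-7, nu=2**-2).fit(X, y)
print(model.predict(X[:3]))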
36

Lee, Kang-In, Sang-Hoon Jung, Hong-Kyun Ryu, Young-Joong Yoon, Sang-Wook Nam, and Young-Seek Chung. "Study on Beamforming of Conformal Array Antenna Using Support Vector Regression." Journal of Korean Institute of Electromagnetic Engineering and Science 29, no. 11 (November 2018): 868–77. http://dx.doi.org/10.5515/kjkiees.2018.29.11.868.

37

Guo, Hu-Sheng, and Wen-Jian Wang. "Dynamical Granular Support Vector Regression Machine." Journal of Software 24, no. 11 (January 3, 2014): 2535–47. http://dx.doi.org/10.3724/sp.j.1001.2013.04472.

38

Richasdy, Donni, and Saiful Akbar. "Path Smoothing With Support Vector Regression." JOURNAL OF INFORMATICS AND TELECOMMUNICATION ENGINEERING 4, no. 1 (July 20, 2020): 142–50. http://dx.doi.org/10.31289/jite.v4i1.3856.

Abstract:
One of the problems with moving-object data is the incomplete data acquired by geo-tracking technology. This phenomenon can be found in ground-based aircraft tracking, with data losses approaching 5 minutes. A path-smoothing process is needed to complete the data. One solution for path smoothing uses the physics of motion, while this research performs the path-smoothing process with a machine learning algorithm, Support Vector Regression (SVR). The study optimizes the SVR configuration parameters such as the kernel, cost, gamma, epsilon, and degree. Support Vector Regression predicts the values of the data lost from the aircraft tracking data. We use a combination of mean absolute error (MAE) and mean absolute percentage error (MAPE) to assess accuracy: MAE expresses the average error that occurs, while MAPE expresses the error as a percentage of the data. In the experiments, the best error values were an MAE of 0.52 and a MAPE of 2.07, meaning the data error is ±0.52, equal to 2.07% of the overall data value.
Keywords: Moving Object, Path Smoothing, Support Vector Regression, MAE
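The two error measures combined above are simple to compute; a minimal sketch with made-up values:

import numpy as np

def mae(y_true, y_pred):
    # average absolute error, in the units of the target
    return np.mean(np.abs(y_true - y_pred))

def mape(y_true, y_pred):
    # average absolute error as a percentage of the true values
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100

y_true = np.array([25.0, 26.0, 24.5, 25.5])
y_pred = np.array([25.4, 25.5, 25.0, 26.1])
print(f"MAE = {mae(y_true, y_pred):.2f}, MAPE = {mape(y_true, y_pred):.2f}%")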
39

Baser, Furkan, and Aysen Apaydin. "Hybrid fuzzy support vector regression analysis." Journal of Intelligent & Fuzzy Systems 28, no. 5 (June 23, 2015): 2037–45. http://dx.doi.org/10.3233/ifs-141482.

40

Dewi, K. E., and N. I. Widiastuti. "Support Vector Regression for GPA Prediction." IOP Conference Series: Materials Science and Engineering 879 (August 7, 2020): 012112. http://dx.doi.org/10.1088/1757-899x/879/1/012112.

41

Chen, Heng, Dirong Chen, and Wei Huang. "Support vector regression for functional data." SCIENTIA SINICA Mathematica 48, no. 3 (January 26, 2018): 409. http://dx.doi.org/10.1360/n012016-00143.

42

Sun, Shaochao, and Dao Huang. "Flatheaded Support Vector Machine for Regression." Advanced Science Letters 19, no. 8 (August 1, 2013): 2293–99. http://dx.doi.org/10.1166/asl.2013.4907.

43

Tanveer, M. "Linear programming twin support vector regression." Filomat 31, no. 7 (2017): 2123–42. http://dx.doi.org/10.2298/fil1707123t.

Abstract:
In this paper, a new linear programming formulation of a 1-norm twin support vector regression is proposed, whose solution is obtained by solving a pair of dual exterior penalty problems as unconstrained minimization problems using the Newton method. The idea of our formulation is to reformulate TSVR as a strongly convex problem by incorporating a regularization technique, and then derive a new 1-norm linear programming formulation for TSVR to improve robustness and sparsity. Our approach has the advantage that a pair of matrix equations, of order equal to the number of input examples, is solved at each iteration of the algorithm. The algorithm converges from any starting point and can be easily implemented in MATLAB without using any optimization packages. The efficiency of the proposed method is demonstrated by experimental results on a number of interesting synthetic and real-world datasets.
44

Mangasarian, O. L., and D. R. Musicant. "Robust linear and support vector regression." IEEE Transactions on Pattern Analysis and Machine Intelligence 22, no. 9 (2000): 950–55. http://dx.doi.org/10.1109/34.877518.

45

Ma, Junshui, James Theiler, and Simon Perkins. "Accurate On-line Support Vector Regression." Neural Computation 15, no. 11 (November 1, 2003): 2683–703. http://dx.doi.org/10.1162/089976603322385117.

Abstract:
Batch implementations of support vector regression (SVR) are inefficient when used in an on-line setting because they must be retrained from scratch every time the training set is modified. Following an incremental support vector classification algorithm introduced by Cauwenberghs and Poggio (2001), we have developed an accurate on-line support vector regression (AOSVR) that efficiently updates a trained SVR function whenever a sample is added to or removed from the training set. The updated SVR function is identical to that produced by a batch algorithm. Applications of AOSVR in both on-line and cross-validation scenarios are presented. In both scenarios, numerical experiments indicate that AOSVR is faster than batch SVR algorithms with both cold and warm start.
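To see the inefficiency AOSVR removes, here is the batch baseline the abstract compares against: refitting from scratch for every arriving sample. This sketch uses scikit-learn's SVR as the batch learner; AOSVR itself is not part of scikit-learn:

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)
X_stream = rng.uniform(-3, 3, size=(100, 1))
y_stream = np.sin(X_stream).ravel()

X_seen, y_seen = X_stream[:10], y_stream[:10]
for i in range(10, len(X_stream)):
    X_seen = np.vstack([X_seen, X_stream[i:i + 1]])
    y_seen = np.append(y_seen, y_stream[i])
    # Batch SVR retrains the whole model at every step; AOSVR instead
    # updates the trained function incrementally for the same result.
    model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X_seen, y_seen)
print(f"final model trained on {len(X_seen)} samples")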
46

Xu, Yitian, and Laisheng Wang. "A weighted twin support vector regression." Knowledge-Based Systems 33 (September 2012): 92–101. http://dx.doi.org/10.1016/j.knosys.2012.03.013.

47

Zhao, Yong-Ping, Jing Zhao, and Min Zhao. "Twin least squares support vector regression." Neurocomputing 118 (October 2013): 225–36. http://dx.doi.org/10.1016/j.neucom.2013.03.005.

48

Chen, Yongqi. "Least Squares Support Vector Fuzzy Regression." Energy Procedia 17 (2012): 711–16. http://dx.doi.org/10.1016/j.egypro.2012.02.160.

49

Balasundaram, S., and Subhash Chandra Prasad. "On pairing Huber support vector regression." Applied Soft Computing 97 (December 2020): 106708. http://dx.doi.org/10.1016/j.asoc.2020.106708.

50

Huang, Xiaolin, Lei Shi, Kristiaan Pelckmans, and Johan A. K. Suykens. "Asymmetric ν-tube support vector regression." Computational Statistics & Data Analysis 77 (September 2014): 371–82. http://dx.doi.org/10.1016/j.csda.2014.03.016.
