
Journal articles on the topic 'Support Vector Machine Regression'

Consult the top 50 journal articles for your research on the topic 'Support Vector Machine Regression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

GUO, Hu-Sheng, and Wen-Jian WANG. "Dynamical Granular Support Vector Regression Machine." Journal of Software 24, no. 11 (January 3, 2014): 2535–47. http://dx.doi.org/10.3724/sp.j.1001.2013.04472.

2

Sun, Shaochao, and Dao Huang. "Flatheaded Support Vector Machine for Regression." Advanced Science Letters 19, no. 8 (August 1, 2013): 2293–99. http://dx.doi.org/10.1166/asl.2013.4907.

3

Wang, Jian Guo, Liang Wu Cheng, Wen Xing Zhang, and Bo Qin. "A Modified Incremental Support Vector Machine for Regression." Applied Mechanics and Materials 135-136 (October 2011): 63–69. http://dx.doi.org/10.4028/www.scientific.net/amm.135-136.63.

Abstract:
The support vector machine (SVM) has been shown to exhibit superior predictive power compared to traditional approaches in many studies, such as mechanical equipment monitoring and diagnosis. However, SVM training is very costly in time and memory because of the enormous amount of training data and the quadratic programming problem involved. To improve SVM training speed and accuracy, this paper proposes a modified incremental support vector machine (MISVM) for regression problems. The main idea is to use the distance from each margin vector that violates the Karush-Kuhn-Tucker (KKT) condition to the final decision hyperplane to evaluate that vector's importance: margin vectors whose distance is below a specified value are preserved, and the others are eliminated. The original support vectors and the remaining margin vectors are then used to train a new SVM. The proposed MISVM not only eliminates unimportant samples such as noise samples but also preserves the important ones. Its effectiveness is demonstrated on two UCI data sets, where it is competitive with previously published methods.
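The pruning rule described in this abstract can be sketched with scikit-learn's SVR (a reconstruction for illustration, not the authors' code; `keep_dist` and all parameter values are assumptions):

```python
import numpy as np
from sklearn.svm import SVR

def misvm_update(model, X_old, y_old, X_new, y_new, eps=0.1, keep_dist=0.5):
    """Sketch of the MISVM pruning rule: new samples that violate the
    KKT condition (fall outside the eps-tube of the current model) by
    less than keep_dist are kept; heavier violators are treated as
    noise and dropped. The retraining set is the old support vectors
    plus the kept samples. keep_dist is an assumed threshold."""
    violation = np.abs(model.predict(X_new) - y_new) - eps
    keep = (violation > 0) & (violation < keep_dist)
    X_kept = np.vstack([X_old[model.support_], X_new[keep]])
    y_kept = np.concatenate([y_old[model.support_], y_new[keep]])
    return SVR(kernel="rbf", C=10.0, epsilon=eps).fit(X_kept, y_kept)

rng = np.random.default_rng(1)
X1 = rng.uniform(-3, 3, (100, 1)); y1 = np.sin(X1).ravel()
X2 = rng.uniform(-3, 3, (50, 1));  y2 = np.sin(X2).ravel()
y2[:5] += 3.0                       # gross outliers that should be dropped
m0 = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X1, y1)
m1 = misvm_update(m0, X1, y1, X2, y2)
print(len(m1.support_), "support vectors after the incremental step")
```

The heavy violators (the shifted samples above) fall outside the `keep_dist` band and never enter the retraining set.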
4

ZHENG, SHENG, YUQIU SUN, JINWEN TIAN, and JAIN LIU. "MAPPED LEAST SQUARES SUPPORT VECTOR MACHINE REGRESSION." International Journal of Pattern Recognition and Artificial Intelligence 19, no. 03 (May 2005): 459–75. http://dx.doi.org/10.1142/s0218001405004058.

Abstract:
This paper describes a novel version of regression SVM (Support Vector Machines) that is based on the least-squares error. We show that the solution of this optimization problem can be obtained easily once the inverse of a certain matrix is computed. This matrix depends only on the input vectors, not on the labels. Thus, if many learning problems with the same set of input vectors but different sets of labels have to be solved, it makes sense to compute the inverse of the matrix just once and then use it for computing all subsequent models. The computational complexity of training a regression SVM can then be reduced to O(N^2), just a matrix multiplication, and is thus likely faster than known SVM training algorithms that perform O(N^2) work inside loops. We describe applications from image processing, where the input points are usually of the form {(x0 + dx, y0 + dy) : |dx| < m, |dy| < n}, and every such set of points can be translated to the same set {(dx, dy) : |dx| < m, |dy| < n} by subtracting (x0, y0) from all the vectors. The experimental results demonstrate that the proposed approach is faster than processing each learning problem separately.
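The reuse of a single matrix inverse across label sets is easy to sketch for a standard least-squares SVM in plain NumPy (a minimal illustration under an assumed RBF kernel and assumed parameter values, not the paper's mapped variant):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared distances -> RBF kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_factor(X, C=100.0, gamma=1.0):
    """Precompute the inverse of the LS-SVM system matrix. The matrix
    depends only on the inputs X, so it is inverted once and reused
    for any number of label vectors y."""
    N = X.shape[0]
    M = np.zeros((N + 1, N + 1))
    M[0, 1:] = 1.0          # bias row
    M[1:, 0] = 1.0          # bias column
    M[1:, 1:] = rbf_kernel(X, X, gamma) + np.eye(N) / C
    return np.linalg.inv(M)

def lssvm_solve(M_inv, y):
    # One matrix-vector product per new label vector.
    sol = M_inv @ np.concatenate(([0.0], y))
    return sol[0], sol[1:]                    # bias b, coefficients alpha

def lssvm_predict(X_train, alpha, b, X_test, gamma=1.0):
    return rbf_kernel(X_test, X_train, gamma) @ alpha + b

# Two learning problems share the same inputs, hence one inverse.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
M_inv = lssvm_factor(X)
for target in (np.sin, np.cos):
    y = target(X).ravel()
    b, alpha = lssvm_solve(M_inv, y)
    pred = lssvm_predict(X, alpha, b, X)
    print("max training error:", round(float(np.abs(pred - y).max()), 3))
```

Each additional label set costs only the O(N^2) product in `lssvm_solve`, which is the point the abstract makes.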
5

Khemchandani, Reshma, Keshav Goyal, and Suresh Chandra. "TWSVR: Regression via Twin Support Vector Machine." Neural Networks 74 (February 2016): 14–21. http://dx.doi.org/10.1016/j.neunet.2015.10.007.

6

Arjmandzadeh, Ameneh, Sohrab Effati, and Mohammad Zamirian. "Interval Support Vector Machine In Regression Analysis." Journal of Mathematics and Computer Science 02, no. 03 (April 15, 2011): 565–71. http://dx.doi.org/10.22436/jmcs.02.03.19.

7

熊, 令纯. "Five Understandings on Support Vector Machine Regression." Hans Journal of Data Mining 09, no. 02 (2019): 52–59. http://dx.doi.org/10.12677/hjdm.2019.92007.

8

Rastogi (nee Khemchandani), Reshma, Pritam Anand, and Suresh Chandra. "-norm Twin Support Vector Machine-based Regression." Optimization 66, no. 11 (August 21, 2017): 1895–911. http://dx.doi.org/10.1080/02331934.2017.1364739.

9

Seok, Kyungha, Changha Hwang, and Daehyeon Cho. "PREDICTION INTERVALS FOR SUPPORT VECTOR MACHINE REGRESSION." Communications in Statistics - Theory and Methods 31, no. 10 (January 12, 2002): 1887–98. http://dx.doi.org/10.1081/sta-120014918.

10

Xu, Qifa, Jinxiu Zhang, Cuixia Jiang, Xue Huang, and Yaoyao He. "Weighted quantile regression via support vector machine." Expert Systems with Applications 42, no. 13 (August 2015): 5441–51. http://dx.doi.org/10.1016/j.eswa.2015.03.003.

11

Chen, Xiaobo, Jian Yang, and Jun Liang. "A flexible support vector machine for regression." Neural Computing and Applications 21, no. 8 (May 25, 2011): 2005–13. http://dx.doi.org/10.1007/s00521-011-0623-5.

12

Guenther, Nick, and Matthias Schonlau. "Support Vector Machines." Stata Journal: Promoting communications on statistics and Stata 16, no. 4 (December 2016): 917–37. http://dx.doi.org/10.1177/1536867x1601600407.

Abstract:
Support vector machines are statistical and machine-learning techniques with the primary goal of prediction. They can be applied to continuous, binary, and categorical outcomes, analogous to Gaussian, logistic, and multinomial regression. We introduce a new command for this purpose, svmachines. This package is a thin wrapper for the widely deployed libsvm (Chang and Lin, 2011, ACM Transactions on Intelligent Systems and Technology 2(3): Article 27). We illustrate svmachines with two examples.
13

Besalatpour, A., M. Hajabbasi, S. Ayoubi, A. Gharipour, and A. Jazi. "Prediction of soil physical properties by optimized support vector machines." International Agrophysics 26, no. 2 (April 1, 2012): 109–15. http://dx.doi.org/10.2478/v10247-012-0017-7.

Abstract:
The potential use of support vector machines optimized with a simulated annealing algorithm in developing prediction functions for estimating soil aggregate stability and soil shear strength was evaluated. The predictive capabilities of support vector machines were also compared with traditional regression prediction functions. The support vector machines achieved greater accuracy in predicting both soil shear strength and soil aggregate stability than traditional multiple-linear regression. The coefficient of correlation (R) between the measured and predicted soil shear strength values was 0.98 for the support vector machine model, versus 0.52 for the multiple-linear regression model. Furthermore, a lower mean square error of 0.06 was obtained with the support vector machine model in predicting soil shear strength. The ERROR% value for soil aggregate stability prediction was 14.59% with the multiple-linear regression model, while a lower ERROR% value of 4.29% was observed for the support vector machine model. The mean square errors for soil aggregate stability prediction with the multiple-linear regression and support vector machine models were 0.001 and 0.012, respectively. It appears that the optimized support vector machine approach with a simulated annealing algorithm could be a suitable alternative to commonly used regression methods for developing soil property prediction functions.
14

ZHANG, GAI-YING, GAO GUO, and JIANG-SHE ZHANG. "SVR+RVR: A ROBUST SPARSE KERNEL METHOD FOR REGRESSION." International Journal on Artificial Intelligence Tools 19, no. 05 (October 2010): 627–45. http://dx.doi.org/10.1142/s0218213010000340.

Abstract:
Support vector machine (SVM) and relevance vector machine (RVM) are two state-of-the-art kernel learning methods, but both have disadvantages: although SVM is very robust against outliers, it makes unnecessarily liberal use of basis functions, since the number of support vectors required typically grows linearly with the size of the training set; on the other hand, the solution of RVM is astonishingly sparse, but its performance deteriorates significantly when the observations are contaminated by outliers. In this paper, we present a combination of SVM and RVM for regression problems in which the two methods are concatenated: first, we train a support vector regression (SVR) machine on the full training set; then a relevance vector regression (RVR) machine is trained only on the subset consisting of the support vectors, with their target values replaced by the predictions of the SVR. This combination overcomes the drawbacks of both SVR and RVR. Experiments demonstrate that SVR+RVR is both very sparse and robust.
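The cascade can be sketched with scikit-learn, which has no relevance vector machine; `ARDRegression` over RBF basis functions stands in for the RVR stage, so this is only an approximation of the paper's method, with all parameter values assumed:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import ARDRegression

def svr_then_rvr(X, y, gamma=1.0, C=10.0, epsilon=0.1):
    """Sketch of the SVR+RVR cascade: fit SVR on everything, then fit a
    sparse Bayesian regressor only on the support vectors, with targets
    replaced by the SVR's (outlier-resistant) predictions. ARDRegression
    stands in for the RVR stage, one RBF basis per support vector."""
    svr = SVR(kernel="rbf", gamma=gamma, C=C, epsilon=epsilon).fit(X, y)
    sv = X[svr.support_]
    targets = svr.predict(sv)              # denoised targets, not raw y
    d2 = ((sv[:, None, :] - sv[None, :, :]) ** 2).sum(-1)
    rvr = ARDRegression().fit(np.exp(-gamma * d2), targets)
    return svr, rvr

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, (120, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.05, 120)
y[:6] += 4.0                               # outliers for the SVR to absorb
svr, rvr = svr_then_rvr(X, y)
# Relevance vectors are the basis functions ARD kept (nonzero weights).
print(int((np.abs(rvr.coef_) > 1e-3).sum()), "of", len(svr.support_))
```

The final model typically keeps far fewer basis functions than the SVR has support vectors, which is the sparsity gain the abstract claims.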
15

Seok, Kyungha. "Semi-supervised regression based on support vector machine." Journal of the Korean Data and Information Science Society 25, no. 2 (March 31, 2014): 447–54. http://dx.doi.org/10.7465/jkdi.2014.25.2.447.

16

Zhang, Hang, and Yangmei Lei. "BSP-Based Support Vector Regression Machine Parallel Framework." International Journal of Networked and Distributed Computing 1, no. 3 (2013): 134. http://dx.doi.org/10.2991/ijndc.2013.1.3.2.

17

Wauters, Mathieu, and Mario Vanhoucke. "Support Vector Machine Regression for project control forecasting." Automation in Construction 47 (November 2014): 92–106. http://dx.doi.org/10.1016/j.autcon.2014.07.014.

18

Ban, Bo, Junjie Yang, Pengguang Chen, Jianbin Xiong, and Qinruo Wang. "Ship Track Regression Based on Support Vector Machine." IEEE Access 5 (2017): 18836–46. http://dx.doi.org/10.1109/access.2017.2749260.

19

Li, Guoqi, Changyun Wen, Guang-Bin Huang, and Yan Chen. "Error tolerance based support vector machine for regression." Neurocomputing 74, no. 5 (February 2011): 771–82. http://dx.doi.org/10.1016/j.neucom.2010.10.002.

20

Dindarloo, Saeid R. "Support vector machine regression analysis of LHD failures." International Journal of Mining, Reclamation and Environment 30, no. 1 (November 3, 2014): 64–69. http://dx.doi.org/10.1080/17480930.2014.973637.

21

Xue, Zhenxia, Roxin Zhang, Chuandong Qin, and Xiaoqing Zeng. "A rough ν-twin support vector regression machine." Applied Intelligence 48, no. 11 (May 23, 2018): 4023–46. http://dx.doi.org/10.1007/s10489-018-1185-3.

22

Safari, Amir. "An e–E-insensitive support vector regression machine." Computational Statistics 29, no. 6 (May 11, 2014): 1447–68. http://dx.doi.org/10.1007/s00180-014-0500-7.

23

Shao, Yuan-Hai, Chun-Hua Zhang, Zhi-Min Yang, Ling Jing, and Nai-Yang Deng. "An ε-twin support vector machine for regression." Neural Computing and Applications 23, no. 1 (April 8, 2012): 175–85. http://dx.doi.org/10.1007/s00521-012-0924-3.

24

Lu, Yumao, and Vwani Roychowdhury. "Parallel randomized sampling for support vector machine (SVM) and support vector regression (SVR)." Knowledge and Information Systems 14, no. 2 (June 30, 2007): 233–47. http://dx.doi.org/10.1007/s10115-007-0082-6.

25

Wang, De Cheng, Er Hao Liu, and Hui Lin. "One Approach for Direct Torque Control Switching Voltage Vector Selection." Applied Mechanics and Materials 246-247 (December 2012): 867–71. http://dx.doi.org/10.4028/www.scientific.net/amm.246-247.867.

Abstract:
Direct torque control selects the switching voltage vector according to the torque hysteresis comparator output, the flux hysteresis comparator output, and the sector. A switching voltage vector selection approach is proposed that uses a support vector regression machine to carry out the selection. Selecting among the eight switching voltage vectors is an eight-class classification problem, which is converted into a regression problem via support vector regression. The nonlinear function used for switching voltage vector selection is obtained by training the support vector regression machine. Simulation results for asynchronous motor direct torque control show the feasibility and effectiveness of the proposed method.
26

Liu, Jie Fang, Pu Mei Gao, and Bao Lin Ma. "CWT-Support Vector Regression Model and Its Application." Advanced Materials Research 113-116 (June 2010): 207–10. http://dx.doi.org/10.4028/www.scientific.net/amr.113-116.207.

Abstract:
Near-infrared spectroscopy (NIR) is a simple, fast, and low-cost analytical technique that causes neither pollution nor damage to the samples and can determine many components simultaneously. Continuous wavelet transform (CWT), an application of wavelet analysis, is more sensitive to slight changes in the signal. The support vector machine (SVM) is based on the principle of structural risk minimization, which gives it better generalization ability than traditional learning machines based on the principle of empirical risk minimization. In this paper, we use a CWT-SVM model to predict meat components. Compared with partial least squares (PLS) and plain SVR, we obtain more satisfactory results.
27

Wang, Kuaini, Jingjing Zhang, Yanyan Chen, and Ping Zhong. "Least Absolute Deviation Support Vector Regression." Mathematical Problems in Engineering 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/169575.

Abstract:
Least squares support vector machine (LS-SVM) is a powerful tool for pattern classification and regression estimation. However, LS-SVM is sensitive to large noise and outliers since it employs the squared loss function. To solve this problem, we propose an absolute deviation loss function to reduce the effect of outliers and derive a robust regression model termed least absolute deviation support vector regression (LAD-SVR). The proposed loss function is not differentiable; we approximate it by constructing a smooth function and develop a Newton algorithm to solve the robust model. Numerical experiments on both artificial and benchmark datasets demonstrate the robustness and effectiveness of the proposed method.
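The smoothed absolute loss is simple to state. The sketch below minimizes it for a linear model with iteratively reweighted least squares rather than the paper's Newton algorithm, and all parameter values are assumptions:

```python
import numpy as np

def lad_svr_irls(X, y, lam=1.0, delta=1e-3, iters=50):
    """Sketch of the LAD-SVR idea: the squared loss is replaced by the
    smoothed absolute loss sqrt(r^2 + delta^2), minimized here with
    iteratively reweighted least squares (the paper instead develops a
    Newton algorithm for the same smoothed model; linear case only)."""
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])          # absorb the bias term
    D = np.diag(np.r_[np.ones(d), 0.0])           # don't penalize the bias
    w = np.linalg.lstsq(Xb, y, rcond=None)[0]     # least-squares start
    for _ in range(iters):
        s = np.sqrt((y - Xb @ w) ** 2 + delta ** 2)
        W = Xb / s[:, None]                       # rows reweighted by 1/s
        w = np.linalg.solve(lam * D + W.T @ Xb, W.T @ y)
    return w

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0]) + 0.5
y[:10] += 20.0            # gross outliers that squared loss would chase
w = lad_svr_irls(X, y)
print(np.round(w, 2))     # recovers roughly [2, -1, 0.5] despite outliers
```

Each reweighted solve downweights large residuals by 1/|r|, which is exactly how the absolute loss blunts the pull of outliers.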
28

Hong, Dug Hun, and Changha Hwang. "Support vector fuzzy regression machines." Fuzzy Sets and Systems 138, no. 2 (September 2003): 271–81. http://dx.doi.org/10.1016/s0165-0114(02)00514-6.

29

Díaz-Vico, David, Jesús Prada, Adil Omari, and José Dorronsoro. "Deep support vector neural networks." Integrated Computer-Aided Engineering 27, no. 4 (September 11, 2020): 389–402. http://dx.doi.org/10.3233/ica-200635.

Abstract:
Kernel-based Support Vector Machines (SVMs), among the most popular machine learning models, usually achieve top performance in two-class classification and regression problems. However, their training cost is at least quadratic in sample size, making them unsuitable for large-sample problems. Deep Neural Networks (DNNs), in contrast, with a cost linear in sample size, can solve big-data problems relatively easily. In this work we propose to combine the advanced representations that DNNs achieve in their last hidden layers with the hinge and ϵ-insensitive losses used in two-class SVM classification and regression. We can thus obtain much better scalability while achieving performance comparable to that of SVMs. Moreover, we show that the resulting Deep SVM models are competitive with standard DNNs in two-class classification problems and have an edge in regression ones.
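The ϵ-insensitive loss this abstract attaches to a deep network's output is easy to state in NumPy (in practice it would be written in the network framework's autodiff; this is just the forward computation, with illustrative values):

```python
import numpy as np

def eps_insensitive(y_true, y_pred, eps=0.1):
    """The SVR regression loss: errors inside the eps-tube cost
    nothing, larger ones grow linearly beyond the tube."""
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

print(eps_insensitive(np.array([1.0, 1.0, 1.0]),
                      np.array([1.05, 1.3, 0.6])))
```

Only the second and third predictions leave the tube, so only they incur loss; this tolerance to small errors is what the Deep SVM inherits from SVR.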
30

Lang, Rongling, Fei Zhao, and Yongtang Shi. "A Support Vector Machine for Regression in Complex Field." Informatica 28, no. 4 (January 1, 2017): 651–64. http://dx.doi.org/10.15388/informatica.2017.150.

31

Hwang, Changha, Sang-Il Choi, and Jooyong Shim. "Deep multiple kernel least squares support vector regression machine." Journal of the Korean Data And Information Science Sociaty 29, no. 4 (July 31, 2018): 895–902. http://dx.doi.org/10.7465/jkdi.2018.29.4.895.

32

Liu, Kun, and Bing-Yu Sun. "Least Squares Support Vector Machine Regression with Equality Constraints." Physics Procedia 24 (2012): 2227–30. http://dx.doi.org/10.1016/j.phpro.2012.02.327.

33

Peng, Xinjun. "TSVR: An efficient Twin Support Vector Machine for regression." Neural Networks 23, no. 3 (April 2010): 365–72. http://dx.doi.org/10.1016/j.neunet.2009.07.002.

34

Bo, Li, Li Xinjun, and Zhao Zhiyan. "Novel algorithm for constructing support vector machine regression ensemble." Journal of Systems Engineering and Electronics 17, no. 3 (September 2006): 541–45. http://dx.doi.org/10.1016/s1004-4132(06)60093-5.

35

Gao, J. B., S. R. Gunn, and C. J. Harris. "Mean field method for the support vector machine regression." Neurocomputing 50 (January 2003): 391–405. http://dx.doi.org/10.1016/s0925-2312(02)00573-8.

36

Peng, Xinjun, Dong Xu, and Jindong Shen. "A twin projection support vector machine for data regression." Neurocomputing 138 (August 2014): 131–41. http://dx.doi.org/10.1016/j.neucom.2014.02.028.

37

Peng, Xinjun, and De Chen. "PTSVRs: Regression models via projection twin support vector machine." Information Sciences 435 (April 2018): 1–14. http://dx.doi.org/10.1016/j.ins.2018.01.002.

38

Shim, Joo-Yong, Jong-Sig Bae, and Chang-Ha Hwang. "Multiclass Classification via Least Squares Support Vector Machine Regression." Communications for Statistical Applications and Methods 15, no. 3 (May 30, 2008): 441–50. http://dx.doi.org/10.5351/ckss.2008.15.3.441.

39

Dug Hun Hong and Changha Hwang. "Interval regression analysis using quadratic loss support vector machine." IEEE Transactions on Fuzzy Systems 13, no. 2 (April 2005): 229–37. http://dx.doi.org/10.1109/tfuzz.2004.840133.

40

Richasdy, Donni, and Saiful Akbar. "Path Smoothing With Support Vector Regression." JOURNAL OF INFORMATICS AND TELECOMMUNICATION ENGINEERING 4, no. 1 (July 20, 2020): 142–50. http://dx.doi.org/10.31289/jite.v4i1.3856.

Abstract:
One problem with moving objects is the incomplete data acquired by geo-tracking technology. This phenomenon occurs in ground-based aircraft tracking, with data losses approaching 5 minutes, so a path-smoothing process is needed to complete the data. One solution for path smoothing uses the physics of motion, while this research performs path smoothing with a machine learning algorithm, Support Vector Regression (SVR). The study optimizes the SVR configuration parameters such as the kernel, C, gamma, epsilon, and degree. Support Vector Regression predicts the values of the data lost from aircraft tracking. We use a combination of mean absolute error (MAE) and mean absolute percentage error (MAPE) to assess accuracy: MAE gives the average error that occurs, while MAPE gives the error as a percentage of the data. In the experiment, the best error values were an MAE of 0.52 and a MAPE of 2.07, meaning an error of ±0.52, equal to 2.07% of the overall data value.
Keywords: Moving Object, Path Smoothing, Support Vector Regression, MAE
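The two error measures combined in this study are standard; a minimal NumPy sketch with illustrative values (not the paper's data):

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean absolute error: average size of the error, in data units.
    return float(np.mean(np.abs(y_true - y_pred)))

def mape(y_true, y_pred):
    # Mean absolute percentage error: error relative to the data, in %.
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

y_true = np.array([25.0, 24.0, 26.0, 25.5])
y_pred = np.array([25.5, 23.5, 26.5, 25.0])
print(mae(y_true, y_pred), round(mape(y_true, y_pred), 2))
```

MAE reports the error in the data's own units while MAPE normalizes it, which is why the paper quotes both.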
41

Gâlmeanu, Honorius, Lucian Mircea Sasu, and Razvan Andonie. "Incremental and Decremental SVM for Regression." International Journal of Computers Communications & Control 11, no. 6 (October 17, 2016): 755. http://dx.doi.org/10.15837/ijccc.2016.6.2744.

Abstract:
Training a support vector machine (SVM) for regression (function approximation) in an incremental/decremental way consists essentially of migrating input vectors in and out of the support vector set with specific modifications of the associated thresholds. We introduce such a method in full detail, which allows the exact increments or decrements associated with the thresholds to be defined before vector migrations take place. Two delicate issues are especially addressed: the variation of the regularization parameter (for tuning model performance) and the extreme situation where the support vector set becomes empty. We experimentally compare our method with several regression methods: the multilayer perceptron, two standard SVM implementations, and two models based on adaptive resonance theory.
42

Qu, Qiang, Ming Qi Chang, Lei Xu, Yue Wang, and Shao Hua Lu. "Support Vector Machine-Based Aqueduct Safety Assessment." Advanced Materials Research 368-373 (October 2011): 531–36. http://dx.doi.org/10.4028/www.scientific.net/amr.368-373.531.

Abstract:
Based on the hydraulic, structural, and foundation conditions of aqueducts, an aqueduct safety assessment indicator system and standards are established. Grounded in statistical learning theory, the support vector machine converts learning problems into a convex quadratic programming problem under the structural risk minimization criterion, which yields the global optimal solution and is applicable to small-sample, nonlinear classification and regression problems. To evaluate the safety condition of aqueducts, an aqueduct safety assessment model based on the support vector machine is established, with safety grades divided into normal, basically normal, abnormal, and dangerous. According to the aqueduct safety assessment standards and the respective evaluation levels, a sample set is generated randomly and used to build a set of classifiers with many support vectors. The results show that the method is feasible and has good application prospects in the safety assessment of irrigation district canal structures.
43

Shi, Xu Chao, Qi Xia Liu, and Xiu Juan Lv. "Application of SVM in Predicting the Strength of Cement Stabilized Soil." Applied Mechanics and Materials 160 (March 2012): 313–17. http://dx.doi.org/10.4028/www.scientific.net/amm.160.313.

Abstract:
The support vector machine is a powerful machine learning technique based on statistical learning theory. This paper investigates the potential of a support vector machine regression approach to model the strength of cement stabilized soil from test data. A support vector machine model is proposed to predict the compressive strength of cement stabilized soil, and the effect of the choice of kernel function on the model is also analyzed. The results show that the support vector machine is more precise in estimating the strength of cement stabilized soil than traditional methods. The method has the advantages of a simple structure, excellent learning capability, and good application prospects, and it provides a novel way of estimating the strength of cement stabilized soil.
44

SONMEZ, Rifat, and Burak SÖZGEN. "A support vector machine method for bid/no bid decision making." JOURNAL OF CIVIL ENGINEERING AND MANAGEMENT 23, no. 5 (May 24, 2017): 641–49. http://dx.doi.org/10.3846/13923730.2017.1281836.

Abstract:
The bid/no-bid decision is an important and complex process, impacted by numerous variables related to the contractor, project, client, competitors, tender, and market conditions. Despite this complexity, in the construction industry the majority of bid/no-bid decisions are made informally, based on experience, judgment, and perception. In this paper, a procedure based on support vector machines and backward elimination regression is presented for improving existing bid decision making methods. The method takes advantage of the strong generalization properties of support vector machines and attempts to further enhance generalization performance by eliminating insignificant input variables. The method is implemented for bid/no-bid decision making on offshore oil and gas platform fabrication projects to achieve a parsimonious support vector machine classifier. The performance of the support vector machine classifier is compared with those of the worth evaluation model, linear regression, and neural network classifiers. The results show that the support vector machine classifier significantly outperforms the existing methods, and the proposed procedure provides a powerful tool for bid/no-bid decision making. The results also reveal that eliminating the insignificant input variables improves the generalization performance of the support vector machines.
45

Yang, Xianfei, Xiang Yu, and Hui Lu. "Dual possibilistic regression models of support vector machines and application in power load forecasting." International Journal of Distributed Sensor Networks 16, no. 5 (May 2020): 155014772092163. http://dx.doi.org/10.1177/1550147720921636.

Abstract:
Power load forecasting is an important guarantee of the safe, stable, and economic operation of power systems, and interval data are appropriate for representing fuzzy information in such forecasting. Dual possibilistic regression models approximate the observed interval data from the outside and inside directions, respectively, and so can estimate well the inherent uncertainty of a given fuzzy phenomenon. In this article, efficient dual possibilistic regression models of support vector machines, based on solving a group of quadratic programming problems, are proposed. Each quadratic programming problem contains fewer optimization variables, which makes the training of the proposed approach fast. Compared with other interval regression approaches based on support vector machines, such as the quadratic loss support vector machine approach and the two smaller quadratic programming problem support vector machine approach, the proposed approach is more efficient on several artificial datasets and a power load dataset.
46

Giustolisi, Orazio. "Using a multi-objective genetic algorithm for SVM construction." Journal of Hydroinformatics 8, no. 2 (March 1, 2006): 125–39. http://dx.doi.org/10.2166/hydro.2006.016b.

Abstract:
Support Vector Machines are kernel machines useful for classification and regression problems. In this paper, they are used for non-linear regression of environmental data. From a structural point of view, Support Vector Machines are particular Artificial Neural Networks, and their training paradigm has some positive implications. In fact, the original training approach is useful for overcoming the curse of dimensionality and overly strict assumptions on the statistics of the errors in the data. Support Vector Machines and Radial Basis Function Regularised Networks are presented within a common structural framework for non-linear regression in order to emphasise the training strategy for support vector machines and to better explain the multi-objective approach to support vector machine construction. A support vector machine's performance depends on the kernel parameter, input selection, and the optimal dimension of the ε-tube. These are used as decision variables for an evolutionary strategy based on a Genetic Algorithm, whose objective functions are the number of support vectors, for the capacity of the machine, and the fit to a validation subset, for the model's accuracy in mapping the underlying physical phenomena. The strategy is tested on a case study in groundwater modelling, based on time series (past measured rainfalls and levels) for level prediction at variable time horizons.
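The search this abstract describes can be caricatured with a toy evolutionary loop over two SVR hyperparameters, scalarizing the two objectives for brevity (the paper uses a proper multi-objective GA; the population size, mutation scheme, objective weight, and data here are all assumptions):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, (150, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 150)
X_tr, y_tr, X_va, y_va = X[:100], y[:100], X[100:], y[100:]

def fitness(params):
    """Two objectives from the paper, scalarized here for brevity:
    validation error (accuracy) plus support-vector count (capacity)."""
    gamma, eps = params
    m = SVR(kernel="rbf", gamma=gamma, epsilon=eps, C=10.0).fit(X_tr, y_tr)
    err = np.mean((m.predict(X_va) - y_va) ** 2)
    return err + 1e-3 * len(m.support_)

# A bare-bones evolutionary loop standing in for the paper's GA:
# keep the best few, refill with mutated copies, repeat.
pop = rng.uniform([0.1, 0.01], [5.0, 0.5], size=(12, 2))
for _ in range(10):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[:4]]
    children = parents[rng.integers(0, 4, 8)] * rng.normal(1, 0.2, (8, 2))
    pop = np.vstack([parents, np.abs(children) + 1e-3])
best = pop[np.argmin([fitness(p) for p in pop])]
print("selected (gamma, epsilon):", np.round(best, 3))
```

The capacity penalty pushes the search toward wider ε-tubes (fewer support vectors) as long as validation accuracy does not suffer, mirroring the paper's trade-off.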
47

Fearn, Tom. "Support Vector Machines IV: Penalised Regression." NIR news 16, no. 2 (March 2005): 11–12. http://dx.doi.org/10.1255/nirn.811.

48

Grinblat, Guillermo L., Lucas C. Uzal, Pablo F. Verdes, and Pablo M. Granitto. "Nonstationary regression with support vector machines." Neural Computing and Applications 26, no. 3 (October 7, 2014): 641–49. http://dx.doi.org/10.1007/s00521-014-1742-6.

49

Tong, Hongzhi, Di-Rong Chen, and Lizhong Peng. "Analysis of Support Vector Machines Regression." Foundations of Computational Mathematics 9, no. 2 (April 4, 2008): 243–57. http://dx.doi.org/10.1007/s10208-008-9026-0.

50

Chen, Zhan-bo. "Research on Application of Regression Least Squares Support Vector Machine on Performance Prediction of Hydraulic Excavator." Journal of Control Science and Engineering 2014 (2014): 1–4. http://dx.doi.org/10.1155/2014/686130.

Abstract:
To improve the accuracy of performance prediction for hydraulic excavators, the regression least squares support vector machine is applied. First, its mathematical model is studied; then its algorithm is designed. Finally, a performance prediction simulation of a hydraulic excavator based on the regression least squares support vector machine is carried out, and the results show that the method can correctly predict the performance trends of a hydraulic excavator.