Academic literature on the topic 'Support vector regression'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Support vector regression.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Support vector regression"

1. Sabzekar, Mostafa, and Seyed Mohammad Hossein Hasheminejad. "Robust regression using support vector regressions." Chaos, Solitons & Fractals 144 (March 2021): 110738. http://dx.doi.org/10.1016/j.chaos.2021.110738.

2. Jun, Sung-Hae. "An Outlier Data Analysis using Support Vector Regression." Journal of Korean Institute of Intelligent Systems 18, no. 6 (December 25, 2008): 876–80. http://dx.doi.org/10.5391/jkiis.2008.18.6.876.

3. Jun, Sung-Hae, Jung-Eun Park, and Kyung-Whan Oh. "A Sparse Data Preprocessing Using Support Vector Regression." Journal of Korean Institute of Intelligent Systems 14, no. 6 (October 1, 2004): 789–92. http://dx.doi.org/10.5391/jkiis.2004.14.6.789.

4. Lee, Hyoung-Ro, and Hyun-Jung Shin. "Electricity Demand Forecasting based on Support Vector Regression." IE interfaces 24, no. 4 (December 1, 2011): 351–61. http://dx.doi.org/10.7232/ieif.2011.24.4.351.

5. Kenesei, Tamás, and János Abonyi. "Interpretable support vector regression." Artificial Intelligence Research 1, no. 2 (October 9, 2012): 11. http://dx.doi.org/10.5430/air.v1n2p11.

Abstract:
This paper deals with transforming support vector regression (SVR) models into fuzzy inference systems (FIS). It highlights that trained support vector models can be used to construct fuzzy rule-based regression models. However, the transformed support vector model does not automatically yield an interpretable fuzzy model. Training a support vector model results in a complex rule base, in which the number of rules is approximately 40-60% of the number of training samples; reducing the fuzzy model initialized from the support vector model is therefore an essential task. For this purpose, a three-step reduction algorithm is used that combines previously published model-reduction techniques: the reduced-set method decreases the number of kernel functions; then, after the reduced support vector model is transformed into a fuzzy rule base, similarity-measure-based merging and orthogonal least-squares methods are applied. The proposed approach is applied to nonlinear system identification; the identification of a Hammerstein system demonstrates the accuracy of the technique while fulfilling the criteria of interpretability.
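The rule-base growth described in this abstract tracks the number of support vectors the trained model retains. As a minimal sketch of that starting point (synthetic data and hyperparameters are assumptions, not the authors' setup), one can fit an ε-SVR with scikit-learn and report the fraction of training points kept as support vectors:

```python
# Minimal sketch (assumed synthetic data and hyperparameters): fit an
# epsilon-SVR and report the fraction of training points retained as
# support vectors -- the quantity that drives the rule-base size above.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)
ratio = len(model.support_) / len(X)
print(f"support vectors: {len(model.support_)} ({ratio:.0%} of training data)")
```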
6. Lv, Yuan, and Zhong Gan. "Robust ε-Support Vector Regression." Mathematical Problems in Engineering 2014 (2014): 1–5. http://dx.doi.org/10.1155/2014/373571.

Abstract:
Spheroid disturbance of input data brings great challenges to support vector regression; thus it is essential to study a robust regression model. This paper is dedicated to establishing a robust regression model that makes the regression function robust against disturbances of the data and system parameters. First, two theorems show that the robust linear ε-support vector regression problem can be settled by solving the dual problems. Second, the focus is on the development of a robust support vector regression algorithm that is extended from the linear domain to the nonlinear domain. Finally, numerical experiments demonstrate the effectiveness of the models and algorithms proposed in the paper.
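For orientation, the classical linear ε-SVR primal that such robust variants perturb is the standard textbook formulation (not the robust model proposed in the paper):

```latex
\min_{w,\,b,\,\xi,\,\xi^{*}} \quad \frac{1}{2}\lVert w \rVert^{2}
  + C \sum_{i=1}^{n} \left( \xi_i + \xi_i^{*} \right)
\qquad \text{subject to} \qquad
\begin{cases}
  y_i - \langle w, x_i \rangle - b \le \varepsilon + \xi_i, \\
  \langle w, x_i \rangle + b - y_i \le \varepsilon + \xi_i^{*}, \\
  \xi_i,\ \xi_i^{*} \ge 0, \qquad i = 1, \dots, n.
\end{cases}
```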
7. Lingras, P., and C. J. Butz. "Rough support vector regression." European Journal of Operational Research 206, no. 2 (October 2010): 445–55. http://dx.doi.org/10.1016/j.ejor.2009.10.023.

8. Chu, Wei, and S. Sathiya Keerthi. "Support Vector Ordinal Regression." Neural Computation 19, no. 3 (March 2007): 792–815. http://dx.doi.org/10.1162/neco.2007.19.3.792.

Abstract:
In this letter, we propose two new support vector approaches for ordinal regression, which optimize multiple thresholds to define parallel discriminant hyperplanes for the ordinal scales. Both approaches guarantee that the thresholds are properly ordered at the optimal solution. The size of these optimization problems is linear in the number of training samples. The sequential minimal optimization algorithm is adapted for the resulting optimization problems; it is extremely easy to implement and scales efficiently as a quadratic function of the number of examples. The results of numerical experiments on some benchmark and real-world data sets, including applications of ordinal regression to information retrieval, verify the usefulness of these approaches.
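The thresholds in this abstract induce a simple prediction rule: an example's rank is one plus the number of ordered thresholds its latent score exceeds. A hypothetical sketch (the score values and thresholds are assumptions, not output of the paper's models):

```python
# Hypothetical sketch of the ordinal prediction rule: an example's rank is
# one plus the number of ordered thresholds its latent score f(x) exceeds.
import numpy as np

def predict_rank(scores: np.ndarray, thresholds: np.ndarray) -> np.ndarray:
    thresholds = np.sort(thresholds)  # enforce the proper ordering
    return 1 + np.sum(scores[:, None] > thresholds[None, :], axis=1)

scores = np.array([-1.2, 0.3, 2.5])       # latent values f(x)
thresholds = np.array([-0.5, 1.0])        # two thresholds -> three ranks
print(predict_rank(scores, thresholds))   # -> [1 2 3]
```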
9. Harrington, Peter de B. "Automated support vector regression." Journal of Chemometrics 31, no. 4 (December 28, 2016): e2867. http://dx.doi.org/10.1002/cem.2867.

10. Panagopoulos, Orestis P., Petros Xanthopoulos, Talayeh Razzaghi, and Onur Şeref. "Relaxed support vector regression." Annals of Operations Research 276, no. 1-2 (April 11, 2018): 191–210. http://dx.doi.org/10.1007/s10479-018-2847-6.


Dissertations / Theses on the topic "Support vector regression"

1. Shah, Rohan Shiloh. "Support vector machines for classification and regression." Thesis, McGill University, 2007. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=100247.

Abstract:
In the last decade, Support Vector Machines (SVMs) have emerged as an important learning technique for solving classification and regression problems in various fields, most notably in computational biology, finance, and text categorization. This is due in part to built-in mechanisms that ensure good generalization and lead to accurate prediction, the use of kernel functions to model non-linear distributions, the ability to train relatively quickly on large data sets using novel mathematical optimization techniques, and, most significantly, the possibility of theoretical analysis using computational learning theory. In this thesis, we discuss the theoretical basis of, and computational approaches to, Support Vector Machines.
2. Lee, Keun Joo. "Geometric Tolerancing of Cylindricity Utilizing Support Vector Regression." Scholarly Repository, 2009. http://scholarlyrepository.miami.edu/oa_theses/233.

Abstract:
In an age when quick turnaround times and high-speed manufacturing methods are becoming more important, quality assurance is a consistent bottleneck in production. With the development of cheap and fast computer hardware, it has become viable to use machine vision to collect data points from a machined part. The generation of these large sets of sample points has necessitated a comprehensive algorithm that provides accurate results while being computationally efficient. The established methods are least squares (LSQ) and non-linear programming (NLP). The LSQ method is often deemed too inaccurate and prone to giving bad results, while the NLP method is computationally taxing. A novel method of using support vector regression (SVR) to solve the NP-hard problem of evaluating the cylindricity of machined parts is proposed. This method was evaluated against LSQ and NLP in both accuracy and CPU processing time. An open-source, user-modifiable programming package was developed to test the model. Analysis of the test results shows the novel SVR algorithm to be a viable alternative for evaluating cylindricity in real-world manufacturing.
3. Nayeri, Negin. "Option strategies using hybrid Support Vector Regression - ARIMA." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-275719.

Abstract:
In this thesis, the use of machine learning in option strategies is evaluated, with focus on the S&P 500 Index. The first part of the thesis tests the forecasting power of the Support Vector Regression (SVR) method for historical realized volatility with a window of 20 days; the prediction window is one month ahead (approximately 20 trading days). The second part creates an ARIMA model that forecasts the error, based on the difference between the predicted and true values, in order to build the hybrid SVR-ARIMA model. The new model thus consists of a realized-volatility value derived from the SVR model plus the error forecast obtained from the ARIMA model. Lastly, the two methods, single SVR and hybrid SVR-ARIMA, are compared, and the model that exhibits the best result is used within two option strategies. The results showcase the promising forecasting power of the SVR method, which for this dataset had an accuracy level of 67% for realized volatility. The ARIMA model also exhibits successful forecasting ability for the next lag. For this dataset, however, the hybrid SVR-ARIMA model outperforms the single SVR model. It is debatable whether the success of these methods may be due to the fact that the dataset only covers the years 2010-2018, omitting the highly volatile environment of the 2008 financial crisis. Nonetheless, the hybrid SVR-ARIMA model used within the two option strategies gives average payoffs of 0.37% and 1.68%. It should be noted that the costs of trading options are not included in the payoff, nor is the premium due when buying options, as these costs vary depending on the origin of the purchase. This thesis was performed in collaboration with Crescit Asset Management in Stockholm, Sweden.
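As a rough illustration of the hybrid construction described above (toy data; the kernel, ARIMA order, and hyperparameters are assumptions, not the thesis settings), an SVR is fit to lagged realized volatility, an ARIMA(1,0,0) is fit to its in-sample errors, and the two next-step forecasts are summed:

```python
# Hedged sketch of a hybrid SVR-ARIMA forecast: SVR predicts realized
# volatility from a 20-day window; ARIMA forecasts the SVR residual.
import numpy as np
from sklearn.svm import SVR
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
vol = np.abs(np.cumsum(rng.standard_normal(300))) * 0.01  # toy volatility

lags = 20                                    # 20-day window, as in the thesis
X = np.array([vol[i - lags:i] for i in range(lags, len(vol))])
y = vol[lags:]

svr = SVR(kernel="rbf", C=1.0, epsilon=0.001).fit(X, y)
residuals = y - svr.predict(X)

arima = ARIMA(residuals, order=(1, 0, 0)).fit()
next_svr = svr.predict(vol[-lags:].reshape(1, -1))[0]
print(f"hybrid next-step forecast: {next_svr + arima.forecast(steps=1)[0]:.4f}")
```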
4. Ericson, Johan. "Lastprediktering: Med Neuralt Nätverk och Support Vector Regression [Load forecasting with neural networks and support vector regression]." Thesis, Karlstads universitet, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-73371.

5. Wu, Zhili. "Regularization methods for support vector machines." HKBU Institutional Repository, 2008. http://repository.hkbu.edu.hk/etd_ra/912.

6. Yaohao, Peng. "Support Vector Regression aplicado à previsão de taxas de câmbio [Support vector regression applied to exchange rate forecasting]." Repositório Institucional da UnB, 2016. http://repositorio.unb.br/handle/10482/23270.

Abstract:
Master's dissertation, Universidade de Brasília, Faculdade de Economia, Administração, Contabilidade e Gestão Pública, Programa de Pós-Graduação em Administração, 2016.
This paper forecasts the spot exchange rate of 15 currency pairs by applying a machine learning algorithm, Support Vector Regression, based on a fundamentalist model composed of 13 explanatory variables. The predictions were estimated by applying 9 different kernel functions extracted from the scientific literature, resulting in a total of 135 models verified. The predictions were compared to the random walk benchmark and evaluated for the directional accuracy of the exchange rate predictions and the error metrics RMSE (root mean square error) and MAE (mean absolute error). The statistical significance of the explanatory power gained by the SVR models with respect to the random walk was checked by applying White (2000)'s Reality Check Test. The results show that the SVR models achieved satisfactory predictive performance relative to the benchmark, with several of the proposed models showing strong statistical significance of predictive superiority. Furthermore, the results showed that mainstream kernel functions commonly used in the scientific literature failed to outperform the random walk, indicating a possible gap in the state of the art of machine learning methods applied to exchange rate forecasting. Finally, the paper discusses the implications of the obtained results for the future development of related research agendas.
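A minimal sketch of the kernel sweep underlying those 135 models (synthetic stand-in data; the study's nine kernels and thirteen fundamentalist variables are not reproduced here):

```python
# Minimal sketch of a kernel comparison for SVR under cross-validation;
# data and hyperparameters are assumptions, not the thesis setup.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = rng.standard_normal((150, 13))                 # 13 explanatory variables
y = X @ rng.standard_normal(13) + 0.1 * rng.standard_normal(150)

for kernel in ("linear", "poly", "rbf", "sigmoid"):
    mae = -cross_val_score(SVR(kernel=kernel), X, y, cv=5,
                           scoring="neg_mean_absolute_error").mean()
    print(f"{kernel:8s} MAE: {mae:.3f}")
```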
7. Beltrami, Monica. "Precificação de opções sobre ações por modelos de Support Vector Regression [Pricing stock options with Support Vector Regression models]." Repositório Institucional da UFPR, 2012. http://hdl.handle.net/1884/27334.

Abstract:
Derivatives are financial instruments whose market value derives from the price of another asset. Among the existing types of derivatives, options stand out. In the options market, traded contracts grant their holder the future right to buy or sell an underlying asset at a price fixed in the present, called the strike price. Options are traded upon payment of a premium, corresponding to the monetary value of the contract. The value of this premium is influenced by several factors and fluctuates according to market expectations. Determining theoretical option prices with mathematical models allows the investor to check whether the prices set by the market are overestimated or underestimated. This information directly influences the operations carried out by the investor. The model employed must therefore be highly reliable and consistent with the reality of the market for which it is intended. Accordingly, this dissertation aims to establish an option-pricing model based on the Support Vector Regression (SVR) technique that captures the reality of the Brazilian market. SVR is based on statistical supervised learning and determines a pricing function by recognizing market patterns and trends. The research used data on American call options on Petrobras PN shares traded on the São Paulo Stock Exchange from November 2008 to May 2009. To validate the proposed model, the SVR results were compared with the values calculated by the Black & Scholes model, one of the most widely used models in finance. The comparisons showed that the SVR model outperformed B&S in pricing in-the-money, at-the-money, and out-of-the-money options, and that the SVR model is able to capture market price movements.
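For context, the Black & Scholes benchmark against which the SVR prices were compared is the closed-form European call price (for a non-dividend-paying stock, the American call coincides with it); the inputs below are illustrative assumptions:

```python
# Sketch of the Black & Scholes benchmark price; inputs are assumptions.
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """European call price under Black & Scholes."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(f"B&S call price: {bs_call(S=100, K=95, T=0.5, r=0.10, sigma=0.3):.2f}")
```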
8. Wise, John Nathaniel. "Optimization of a low speed wind turbine using support vector regression." Thesis, Stellenbosch: University of Stellenbosch, 2009. http://hdl.handle.net/10019.1/2737.

Abstract:
Thesis (MScEng (Mechanical and Mechatronic Engineering))--University of Stellenbosch, 2009.
Numerical design optimization provides a powerful tool that assists designers in improving their products. Design optimization automatically modifies important design parameters to obtain the best product that satisfies all the design requirements. This thesis explores the use of Support Vector Regression (SVR) and demonstrates its usefulness in the numerical optimization of a low-speed wind turbine for the power coefficient, Cp. The design problem is the three-dimensional optimization of a wind turbine blade by making use of four two-dimensional radial stations. The candidate airfoils at these stations are selected from the 4-digit NACA range. A metamodel of the lift and drag coefficients of the NACA 4-digit series is created with SVR by using training points evaluated with the XFOIL software. These SVR approximations are used in conjunction with Blade Element Momentum (BEM) theory to calculate and optimize the Cp value for the entire blade. The high accuracy attained with the SVR metamodels makes them a viable alternative to using XFOIL directly, with the advantages of being faster and easier to couple with the optimizer. The technique developed allows the optimization procedure the freedom to select profiles, angles of attack, and chord lengths from the 4-digit NACA series to find an optimal Cp value. Because every radial blade station consists of a NACA 4-digit profile, the same lift and drag metamodels are used for each station. This technique also makes it simple to evaluate the entire blade as one set of design variables. The thesis contains a detailed description of the design and optimization problem, the implementation of the SVR algorithm, the creation of the lift and drag metamodels with SVR and an alternative methodology, the BEM theory, and a summary of the results.
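A minimal sketch of the metamodel-based optimization pattern described above, with a cheap analytic stand-in for the XFOIL lift/drag evaluations (objective, sampling, and hyperparameters are assumptions):

```python
# Hedged sketch of surrogate-based optimization: an SVR metamodel is trained
# on sampled evaluations of an expensive solver, then searched with a
# standard optimizer. The objective is an analytic stand-in, not XFOIL data.
import numpy as np
from scipy.optimize import minimize
from sklearn.svm import SVR

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(100, 2))        # sampled design points
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2     # stand-in for solver output

surrogate = SVR(kernel="rbf", C=100.0).fit(X, y)

result = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
                  x0=np.zeros(2), method="Nelder-Mead")
print("surrogate optimum at:", result.x)
```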
9. Hasanov, Ilgar. "A Comparison between Support Vector Machines and Logistic Regression for Classification." Master's Degree Thesis, Università Ca' Foscari Venezia, 2022. http://hdl.handle.net/10579/20753.

Abstract:
This dissertation is about classification methods for binary data developed in Computer Science and Statistics. The research focuses on two main algorithms called support vector machines and logistic regression. The thesis consists of three chapters. The first chapter provides a general discussion of classification algorithms used in Statistical and Machine Learning with special emphasis on logistic regression and support vector machines. The second chapter includes some simulation studies to compare the classification methods. The third chapter concludes the thesis with an application to a real dataset.
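A minimal sketch of the comparison the thesis carries out, on an assumed synthetic binary task:

```python
# 10-fold cross-validated accuracy of an SVM versus logistic regression
# on synthetic binary data (a stand-in for the thesis's simulations).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

for name, clf in [("SVM (RBF)", SVC(kernel="rbf")),
                  ("logistic regression", LogisticRegression(max_iter=1000))]:
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"{name}: accuracy = {acc:.3f}")
```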
10. Shen, Judong. "Fusing support vector machines and soft computing for pattern recognition and regression." Search for this dissertation online, 2005. http://wwwlib.umi.com/cr/ksu/main.


Books on the topic "Support vector regression"

1. Drezet, P. Directly optimized support vector machines for classification and regression. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1998.

2. Görgülü, O., and A. Akilli. Egg production curve fitting using least square support vector machines and nonlinear regression analysis. Verlag Eugen Ulmer, 2018. http://dx.doi.org/10.1399/eps.2018.235.

3. Perez, C. Statistics and Data Analysis with MATLAB: Support Vector Machine, Logistic Regression, Discriminant Analysis, and Decision Trees. Independently Published, 2019.

4. Buriticá Mejía, Julián Andrés. Modelo Black-Litterman con Support Vector Regression: Una Alternativa para los Fondos de Pensiones Obligatorios Colombianos [A Black-Litterman model with Support Vector Regression: an alternative for Colombian mandatory pension funds]. Independently Published, 2021.

5. Pérez López, César. Data Mining and Machine Learning. Predictive Techniques: Regression, Generalized Linear Models, Support Vector Machine and Neural Networks. Examples with MATLAB. Lulu Press, 2021.

6. Data Mining and Machine Learning. Classification Predictive Techniques: Support Vector Machine, Logistic Regression, Discriminant Analysis and Decision Trees. Examples with MATLAB. Lulu Press, 2021.


Book chapters on the topic "Support vector regression"

1. Awad, Mariette, and Rahul Khanna. "Support Vector Regression." In Efficient Learning Machines, 67–80. Berkeley, CA: Apress, 2015. http://dx.doi.org/10.1007/978-1-4302-5990-9_4.

2. Montesinos López, Osval Antonio, Abelardo Montesinos López, and Jose Crossa. "Support Vector Machines and Support Vector Regression." In Multivariate Statistical Machine Learning Methods for Genomic Prediction, 337–78. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89010-0_9.

Abstract:
In this chapter, support vector machine (SVM) methods are studied. We first point out the origin and popularity of these methods and then define the hyperplane concept, which is the key to building them. We derive methods related to the SVM: the maximum margin classifier and the support vector classifier. We describe the derivation of the SVM along with some kernel functions that are fundamental for building the different kernel methods allowed in the SVM. We explain how the SVM for binary response variables can be extended to categorical response variables and give examples of the SVM for binary and categorical response variables with plant breeding data for genomic selection. Finally, general issues for adopting the SVM methodology for continuous response variables are discussed, and some examples of the SVM for continuous response variables for genomic prediction are described.
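Since kernel functions are the building blocks the chapter emphasizes, the sketch below passes an explicit RBF Gram-matrix kernel to scikit-learn's SVR (data and gamma are assumptions):

```python
# Minimal sketch: an explicit RBF kernel supplied as a callable that
# returns the Gram matrix, illustrating the kernel functions on which
# the chapter's methods are built.
import numpy as np
from sklearn.svm import SVR

def rbf_kernel(A, B, gamma=0.5):
    """k(a, b) = exp(-gamma * ||a - b||^2), returned as a Gram matrix."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

rng = np.random.default_rng(4)
X = rng.standard_normal((80, 3))
y = X[:, 0] ** 2 + X[:, 1]

model = SVR(kernel=rbf_kernel).fit(X, y)  # callable kernels are supported
print(model.predict(X[:3]))
```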
3. Berk, Richard A. "Support Vector Machines." In Statistical Learning from a Regression Perspective, 291–310. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-44048-4_7.

4. Berk, Richard A. "Support Vector Machines." In Statistical Learning from a Regression Perspective, 339–59. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-40189-4_7.

5. Orchel, Marcin. "Balanced Support Vector Regression." In Artificial Intelligence and Soft Computing, 727–38. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-19369-4_64.

6. Zhu, Wentao, Jun Miao, and Laiyun Qing. "Extreme Support Vector Regression." In Adaptation, Learning, and Optimization, 25–34. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-04741-6_3.

7. Bellocchio, Francesco, N. Alberto Borghese, Stefano Ferrari, and Vincenzo Piuri. "Hierarchical Support Vector Regression." In 3D Surface Reconstruction, 111–42. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-5632-2_6.

8. Jiang, Haochuan, Kaizhu Huang, and Rui Zhang. "Field Support Vector Regression." In Neural Information Processing, 699–708. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70087-8_72.

9. Schleif, Frank-Michael. "Indefinite Support Vector Regression." In Artificial Neural Networks and Machine Learning – ICANN 2017, 313–21. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68612-7_36.

10. Jayadeva, Reshma Khemchandani, and Suresh Chandra. "TWSVR: Twin Support Vector Machine Based Regression." In Twin Support Vector Machines, 63–101. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-46186-1_4.


Conference papers on the topic "Support vector regression"

1. Jap, Dirmanto, Marc Stöttinger, and Shivam Bhasin. "Support vector regression." In ISCA '15: The 42nd Annual International Symposium on Computer Architecture. New York, NY, USA: ACM, 2015. http://dx.doi.org/10.1145/2768566.2768568.

2. Brudnak, M. "Vector-Valued Support Vector Regression." In The 2006 IEEE International Joint Conference on Neural Network Proceedings. IEEE, 2006. http://dx.doi.org/10.1109/ijcnn.2006.246619.

3. Forghani, Yahya, Reza Sigari Tabrizi, Hadi Sadoghi Yazdi, and Mohammad-R. Akbarzadeh-T. "Fuzzy support vector regression." In 2011 International eConference on Computer and Knowledge Engineering (ICCKE). IEEE, 2011. http://dx.doi.org/10.1109/iccke.2011.6413319.

4. Bouboulis, Pantelis, Sergios Theodoridis, and Charalampos Mavroforakis. "Complex support vector regression." In 2012 3rd International Workshop on Cognitive Information Processing (CIP). IEEE, 2012. http://dx.doi.org/10.1109/cip.2012.6232895.

5. Eleuteri, A. "Support vector survival regression." In 4th IET International Conference on Advances in Medical, Signal and Information Processing (MEDSIP 2008). IEE, 2008. http://dx.doi.org/10.1049/cp:20080436.

6. George, Jose, and K. Rajeev. "Hybrid wavelet Support Vector Regression." In 2008 7th IEEE International Conference on Cybernetic Intelligent Systems (CIS). IEEE, 2008. http://dx.doi.org/10.1109/ukricis.2008.4798920.

7. Su, Wei-Han, and Chih-Hung Wu. "Support Vector Regression for GDOP." In 2008 Eighth International Conference on Intelligent Systems Design and Applications (ISDA). IEEE, 2008. http://dx.doi.org/10.1109/isda.2008.196.

8. Stoean, Ruxandra, D. Dumitrescu, Mike Preuss, and Catalin Stoean. "Evolutionary Support Vector Regression Machines." In 2006 Eighth International Symposium on Symbolic and Numeric Algorithms for Scientific Computing. IEEE, 2006. http://dx.doi.org/10.1109/synasc.2006.39.

9. Ferrari, Stefano, Francesco Bellocchio, Vincenzo Piuri, and N. Alberto Borghese. "Multi-scale Support Vector Regression." In 2010 International Joint Conference on Neural Networks (IJCNN). IEEE, 2010. http://dx.doi.org/10.1109/ijcnn.2010.5596630.

10. Ruta, Dymitr, Ling Cen, and Quang Hieu Vu. "Greedy Incremental Support Vector Regression." In 2019 Federated Conference on Computer Science and Information Systems. IEEE, 2019. http://dx.doi.org/10.15439/2019f364.


Reports on the topic "Support vector regression"

1. Puttanapong, Nattapong, Arturo M. Martinez Jr, Mildred Addawe, Joseph Bulan, Ron Lester Durante, and Marymell Martillan. Predicting Poverty Using Geospatial Data in Thailand. Asian Development Bank, December 2020. http://dx.doi.org/10.22617/wps200434-2.

Abstract:
This study examines an alternative approach in estimating poverty by investigating whether readily available geospatial data can accurately predict the spatial distribution of poverty in Thailand. It also compares the predictive performance of various econometric and machine learning methods such as generalized least squares, neural network, random forest, and support vector regression. Results suggest that intensity of night lights and other variables that approximate population density are highly associated with the proportion of population living in poverty. The random forest technique yielded the highest level of prediction accuracy among the methods considered, perhaps due to its capability to fit complex association structures even with small and medium-sized datasets.
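A minimal sketch of the kind of model comparison the report performs, with synthetic stand-in data in place of the geospatial features:

```python
# Random forest versus SVR under cross-validation; data, metric choice,
# and hyperparameters are assumptions, not the report's setup.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

X, y = make_regression(n_samples=400, n_features=8, noise=10.0, random_state=0)

for name, model in [("random forest", RandomForestRegressor(random_state=0)),
                    ("SVR", SVR(kernel="rbf", C=100.0))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: R^2 = {r2:.3f}")
```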
2. Alwan, Iktimal, Dennis D. Spencer, and Rafeed Alkawadri. Comparison of Machine Learning Algorithms in Sensorimotor Functional Mapping. Progress in Neurobiology, December 2023. http://dx.doi.org/10.60124/j.pneuro.2023.30.03.

Abstract:
Objective: To compare the performance of popular machine learning (ML) algorithms in mapping the sensorimotor (SM) cortex and identifying the anterior lip of the central sulcus (CS). Methods: We evaluated support vector machines (SVMs), random forest (RF), decision trees (DT), single-layer perceptron (SLP), and multilayer perceptron (MLP) against standard logistic regression (LR) in identifying the SM cortex, employing validated features from six minutes of NREM-sleep icEEG data and applying standard common hyperparameters and 10-fold cross-validation. Each algorithm was tested using vetted features based on the statistical significance of classical univariate analysis (p<0.05) and an extended set of 17 features representing power/coherence of different frequency bands, entropy, and interelectrode distance. The analysis was performed before and after weight adjustment for imbalanced data (w). Results: 7 subjects and 376 contacts were included. Before optimization, the ML algorithms performed comparably employing conventional features (median CS accuracy: 0.89, IQR [0.88-0.9]). After optimization, neural networks outperformed the others in terms of accuracy (MLP: 0.86), area under the curve (AUC) (SLPw, MLPw, MLP: 0.91), recall (SLPw: 0.82, MLPw: 0.81), precision (SLPw: 0.84), and F1-scores (SLPw: 0.82). SVM achieved the best specificity. Extending the number of features and adjusting the weights improved recall, precision, and F1-scores by 48.27%, 27.15%, and 39.15%, respectively, with gains or no significant losses in specificity and AUC across CS and Function (correlation r=0.71 between the two clinical scenarios in all performance metrics, p<0.001). Interpretation: Computational passive sensorimotor mapping is feasible and reliable. Feature extension and weight adjustment improve performance and counterbalance the accuracy paradox. Optimized neural networks outperform other ML algorithms even in binary classification tasks. The best-performing models and the MATLAB® routine employed in signal processing are available to the public at (Link 1).
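A minimal sketch of the evaluation protocol described above: 10-fold cross-validation with and without class-weight adjustment for imbalanced data (synthetic stand-in for the icEEG features; the sample size mirrors the 376 contacts reported):

```python
# Cross-validated metrics with and without class-weight adjustment;
# the data and the SVM hyperparameters are assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_validate
from sklearn.svm import SVC

X, y = make_classification(n_samples=376, weights=[0.85], random_state=0)

for weight in (None, "balanced"):
    cv = cross_validate(SVC(kernel="rbf", class_weight=weight), X, y, cv=10,
                        scoring=("accuracy", "recall", "precision", "f1"))
    label = "weighted" if weight else "unweighted"
    print(label, {m: round(cv[f"test_{m}"].mean(), 2)
                  for m in ("accuracy", "recall", "precision", "f1")})
```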