
Journal articles on the topic "Matrix regression"


Consult the top 50 journal articles for your research on the topic "Matrix regression."

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1. Zhou, Hua, and Lexin Li. "Regularized matrix regression." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 76, no. 2 (August 12, 2013): 463–83. http://dx.doi.org/10.1111/rssb.12031.

2. Viroli, Cinzia. "On matrix-variate regression analysis." Journal of Multivariate Analysis 111 (October 2012): 296–309. http://dx.doi.org/10.1016/j.jmva.2012.04.005.

3. Luo, Changtong, and Shao-Liang Zhang. "Parse-matrix evolution for symbolic regression." Engineering Applications of Artificial Intelligence 25, no. 6 (September 2012): 1182–93. http://dx.doi.org/10.1016/j.engappai.2012.05.015.

4. Koláček, Jan, and Ivana Horová. "Bandwidth matrix selectors for kernel regression." Computational Statistics 32, no. 3 (January 16, 2017): 1027–46. http://dx.doi.org/10.1007/s00180-017-0709-3.

5. Mukha, V. S. "The best polynomial multidimensional-matrix regression." Cybernetics and Systems Analysis 43, no. 3 (May 2007): 427–32. http://dx.doi.org/10.1007/s10559-007-0065-3.

6. Zhang, Jianguang, and Jianmin Jiang. "Rank-Optimized Logistic Matrix Regression toward Improved Matrix Data Classification." Neural Computation 30, no. 2 (February 2018): 505–25. http://dx.doi.org/10.1162/neco_a_01038.

Abstract: While existing logistic regression suffers from overfitting and often fails to consider structural information, we propose a novel matrix-based logistic regression to overcome these weaknesses. In the proposed method, 2D matrices are directly used to learn two groups of parameter vectors along each dimension without vectorization, which allows the proposed method to fully exploit the underlying structural information embedded inside the 2D matrices. Further, we add a joint [Formula: see text]-norm on two parameter matrices, which are organized by aligning each group of parameter vectors in columns. This added co-regularization term has two roles: enhancing the effect of regularization and optimizing the rank during the learning process. With our proposed fast iterative solution, we carried out extensive experiments. The results show that, in comparison to both traditional tensor-based methods and vector-based regression methods, our proposed solution achieves better performance for matrix data classification.
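To make the bilinear idea concrete, here is a minimal numpy sketch of a rank-1 matrix-based logistic regression: each 2D sample X is scored as u^T X v instead of being vectorized. The rank-1 form, the joint gradient updates, the plain ridge penalty standing in for the paper's joint-norm co-regularization, and the synthetic data are all illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)
n, p, q = 200, 8, 6
Xs = rng.normal(size=(n, p, q))            # n matrix-valued samples
y = (rng.random(n) > 0.5).astype(float)    # binary labels

u = rng.normal(scale=0.1, size=p)          # left parameter vector
v = rng.normal(scale=0.1, size=q)          # right parameter vector
lr, lam = 0.1, 1e-2                        # step size and ridge penalty

for _ in range(500):
    scores = np.einsum("i,nij,j->n", u, Xs, v)            # u^T X_n v per sample
    probs = 1.0 / (1.0 + np.exp(-scores))
    resid = probs - y                                     # logistic-loss residual
    grad_u = np.einsum("n,nij,j->i", resid, Xs, v) / n + lam * u
    grad_v = np.einsum("n,nij,i->j", resid, Xs, u) / n + lam * v
    u -= lr * grad_u
    v -= lr * grad_v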
7. Zeebari, Zangin, B. M. Golam Kibria, and Ghazi Shukur. "Seemingly unrelated regressions with covariance matrix of cross-equation ridge regression residuals." Communications in Statistics - Theory and Methods 47, no. 20 (November 13, 2017): 5029–53. http://dx.doi.org/10.1080/03610926.2017.1383431.

8. Chitsaz, Shabnam, and S. Ejaz Ahmed. "Shrinkage estimation for the regression parameter matrix in multivariate regression model." Journal of Statistical Computation and Simulation 82, no. 2 (February 2012): 309–23. http://dx.doi.org/10.1080/00949655.2011.648938.

9. Chitsaz, S., and S. Ejaz Ahmed. "An Improved Estimation in Regression Parameter Matrix in Multivariate Regression Model." Communications in Statistics - Theory and Methods 41, no. 13-14 (July 2012): 2305–20. http://dx.doi.org/10.1080/03610926.2012.664672.

10. Turner, David L. "Matrix Calculator and Stepwise Interactive Regression Programs." American Statistician 41, no. 4 (November 1987): 329. http://dx.doi.org/10.2307/2684760.

11. Srivastava, Muni S., and Dietrich von Rosen. "Regression models with unknown singular covariance matrix." Linear Algebra and its Applications 354, no. 1-3 (October 2002): 255–73. http://dx.doi.org/10.1016/s0024-3795(02)00342-7.

12. Hou, Chenping, Yuanyuan Jiao, Feiping Nie, Tingjin Luo, and Zhi-Hua Zhou. "2D Feature Selection by Sparse Matrix Regression." IEEE Transactions on Image Processing 26, no. 9 (September 2017): 4255–68. http://dx.doi.org/10.1109/tip.2017.2713948.

13. Gimenez-Febrer, Pere, Alba Pages-Zamora, and Georgios B. Giannakis. "Matrix Completion and Extrapolation via Kernel Regression." IEEE Transactions on Signal Processing 67, no. 19 (October 1, 2019): 5004–17. http://dx.doi.org/10.1109/tsp.2019.2932875.

14. Mi, Jian-Xun, Quanwei Zhu, and Zhiheng Luo. "Matrix regression-based classification with block-norm." Pattern Recognition Letters 125 (July 2019): 654–60. http://dx.doi.org/10.1016/j.patrec.2019.07.007.

15. Cook, R. D., B. Li, and F. Chiaromonte. "Dimension reduction in regression without matrix inversion." Biometrika 94, no. 3 (August 5, 2007): 569–84. http://dx.doi.org/10.1093/biomet/asm038.

16. Bu, Shanshan. "A Matrix-based Method for Ordinal Regression." Journal of Information and Computational Science 11, no. 17 (November 20, 2014): 6209–20. http://dx.doi.org/10.12733/jics20104940.

17. James, Alan T., and William N. Venables. "Matrix Weighting of Several Regression Coefficient Vectors." Annals of Statistics 21, no. 2 (June 1993): 1093–114. http://dx.doi.org/10.1214/aos/1176349166.

18. Fiszeder, Piotr, and Witold Orzeszko. "Covariance matrix forecasting using support vector regression." Applied Intelligence 51, no. 10 (February 23, 2021): 7029–42. http://dx.doi.org/10.1007/s10489-021-02217-5.

Abstract: Support vector regression is a promising method for time-series prediction, as it has good generalisability and an overall stable behaviour. Recent studies have shown that it can describe the dynamic characteristics of financial processes and make more accurate forecasts than other machine learning techniques. The first main contribution of this paper is to propose a methodology for dynamic modelling and forecasting of covariance matrices based on support vector regression using the Cholesky decomposition. The procedure is applied to range-based covariance matrices of returns, which are estimated on the basis of low and high prices. Such prices are most often available with closing prices for many financial series and contain more information about volatility and relationships between returns. The methodology guarantees the positive definiteness of the forecasted covariance matrices and is flexible, as it can be applied to different dependence patterns. The second contribution of the paper is to show, with an example of exchange rates from the forex market, that the covariance matrix forecasts calculated using the proposed approach are more accurate than the forecasts from the benchmark dynamic conditional correlation model. The advantage of the suggested procedure is higher during turbulent periods, i.e., when forecasting is the most difficult and accurate forecasts matter most.
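The recipe in this abstract lends itself to a short, hedged sketch: estimate rolling covariance matrices, model each element of their Cholesky factors with a support vector regression, and rebuild the forecast as L L^T, which is positive semidefinite by construction. The toy returns, window length, lag structure, and kernel settings below are assumptions for illustration, not the paper's range-based setup.

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
T, k, window, lags = 300, 3, 50, 5
returns = rng.normal(size=(T, k))          # toy return series

# Rolling covariance matrices and their lower-triangular Cholesky factors.
covs = [np.cov(returns[t - window:t].T) for t in range(window, T)]
chols = np.stack([np.linalg.cholesky(c) for c in covs])
tril = np.tril_indices(k)
series = chols[:, tril[0], tril[1]]        # one time series per factor element

# One-step-ahead SVR per element, using lagged values as features.
X = np.stack([series[t - lags:t].ravel() for t in range(lags, len(series))])
preds = []
for j in range(series.shape[1]):
    yj = series[lags:, j]
    model = SVR(kernel="rbf", C=1.0).fit(X[:-1], yj[:-1])
    preds.append(model.predict(X[-1:])[0])

L = np.zeros((k, k))
L[tril] = preds
forecast = L @ L.T                         # positive semidefinite by construction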
19. Nkurunziza, Sévérien, and S. Ejaz Ahmed. "Estimation strategies for the regression coefficient parameter matrix in multivariate multiple regression." Statistica Neerlandica 65, no. 4 (May 26, 2011): 387–406. http://dx.doi.org/10.1111/j.1467-9574.2011.00491.x.

20. Parente, Paulo M. D. C., and João M. C. Santos Silva. "Quantile Regression with Clustered Data." Journal of Econometric Methods 5, no. 1 (January 1, 2016): 1–15. http://dx.doi.org/10.1515/jem-2014-0011.

Abstract: We study the properties of the quantile regression estimator when data are sampled from independent and identically distributed clusters, and show that the estimator is consistent and asymptotically normal even when there is intra-cluster correlation. A consistent estimator of the covariance matrix of the asymptotic distribution is provided, and we propose a specification test capable of detecting the presence of intra-cluster correlation. A small simulation study illustrates the finite-sample performance of the test and of the covariance matrix estimator.
21. Xie, Luofeng, Ming Yin, Ling Wang, Feng Tan, and Guofu Yin. "Matrix regression preserving projections for robust feature extraction." Knowledge-Based Systems 161 (December 2018): 35–46. http://dx.doi.org/10.1016/j.knosys.2018.07.028.

22. Xiang, Shiming, Gaofeng Meng, Ying Wang, Chunhong Pan, and Changshui Zhang. "Image deblurring with matrix regression and gradient evolution." Pattern Recognition 45, no. 6 (June 2012): 2164–79. http://dx.doi.org/10.1016/j.patcog.2011.11.026.

23. Nakonechnyi, Alexander G., Grigoriy I. Kudin, Petr N. Zinko, and Taras P. Zinko. "Perturbation Method in Problems of Linear Matrix Regression." Journal of Automation and Information Sciences 52, no. 1 (2020): 1–12. http://dx.doi.org/10.1615/jautomatinfscien.v52.i1.10.

24. Xie, Jianchun, Jian Yang, Jianjun Qian, and Lei Luo. "Bi-weighted robust matrix regression for face recognition." Neurocomputing 237 (May 2017): 375–87. http://dx.doi.org/10.1016/j.neucom.2017.01.028.

25. Lai, Yeuntyng, Morihiro Hayashida, and Tatsuya Akutsu. "Survival Analysis by Penalized Regression and Matrix Factorization." Scientific World Journal 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/632030.

Abstract: Because every disease has its unique survival pattern, it is necessary to find a suitable model to simulate follow-ups. DNA microarray is a useful technique to detect thousands of gene expressions at one time and is usually employed to classify different types of cancer. We propose combination methods of penalized regression models and nonnegative matrix factorization (NMF) for predicting survival. We tried L1 (lasso), L2 (ridge), and L1-L2 combined (elastic net) penalized regression for diffuse large B-cell lymphoma (DLBCL) patients' microarray data and found that the L1-L2 combined method predicts survival best, with the smallest log-rank P value. Furthermore, 80% of the selected genes have been reported to correlate with carcinogenesis or lymphoma. Through NMF we found that DLBCL patients can be divided clearly into 4 groups, which implies that DLBCL may have 4 subtypes with slightly different survival patterns. Next we excluded some patients who were indicated as hard to classify in NMF and ran the three penalized regression models again. We found that the performance of survival prediction improved, with lower log-rank P values. Therefore, we conclude that after preselection of patients by NMF, penalized regression models can predict DLBCL patients' survival successfully.
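A rough sketch of the two-stage pipeline described above, under loudly stated assumptions: sklearn's NMF stands in for the paper's NMF subgrouping, ambiguous patients are dropped by a simple weight-ratio rule, and a plain elastic net on a stand-in continuous outcome replaces the penalized survival regression (sklearn itself has no Cox model).

import numpy as np
from sklearn.decomposition import NMF
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(2)
n_patients, n_genes = 120, 500
expr = rng.random((n_patients, n_genes))   # nonnegative expression matrix
outcome = rng.normal(size=n_patients)      # stand-in for a survival outcome

# Stage 1: NMF; each patient joins the component with the largest weight.
W = NMF(n_components=4, init="nndsvda", random_state=0).fit_transform(expr)
group = W.argmax(axis=1)

# Keep the half of the patients whose assignment is least ambiguous.
ratio = W.max(axis=1) / (np.sort(W, axis=1)[:, -2] + 1e-12)
confident = ratio > np.median(ratio)

# Stage 2: L1-L2 combined (elastic net) regression on the retained patients.
model = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(expr[confident], outcome[confident])
selected = np.flatnonzero(model.coef_)     # genes kept by the penalty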
26. McNeil, Sue, and Chris Hendrickson. "A Regression Formulation of the Matrix Estimation Problem." Transportation Science 19, no. 3 (August 1985): 278–92. http://dx.doi.org/10.1287/trsc.19.3.278.

27. Jauffret, Claude. "Observability and Fisher information matrix in nonlinear regression." IEEE Transactions on Aerospace and Electronic Systems 43, no. 2 (April 2007): 756–59. http://dx.doi.org/10.1109/taes.2007.4285368.

28. Lemonte, Artur J. "Covariance matrix formula for Birnbaum–Saunders regression models." Journal of Statistical Computation and Simulation 81, no. 7 (July 2011): 899–908. http://dx.doi.org/10.1080/00949650903555288.

29. Li, Mei, and Lingchen Kong. "Double fused Lasso penalized LAD for matrix regression." Applied Mathematics and Computation 357 (September 2019): 119–38. http://dx.doi.org/10.1016/j.amc.2019.03.051.

30. Zhang, Jiawei, Peng Wang, and Ning Zhang. "Distribution Network Admittance Matrix Estimation With Linear Regression." IEEE Transactions on Power Systems 36, no. 5 (September 2021): 4896–99. http://dx.doi.org/10.1109/tpwrs.2021.3090250.

31. Lipovetsky, Stan. "Meaningful Regression Coefficients Built by Data Gradients." Advances in Adaptive Data Analysis 2, no. 4 (October 2010): 451–62. http://dx.doi.org/10.1142/s1793536910000574.

Abstract: Multiple regression coefficients define the change in the dependent variable due to a predictor's change while all other predictors are constant. Rearranging the data into paired differences of observations and keeping only the biggest changes yields a matrix of single-variable changes, which is close to an orthogonal design, so there is no impact of multicollinearity on the regression. A similar approach is used for meaningful coefficients of nonlinear regressions, with coefficients of half-elasticity, elasticity, and odds elasticity due to the gradients in each predictor. In contrast to regular linear and nonlinear regressions, the suggested technique produces interpretable coefficients not prone to multicollinearity effects.
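The data-gradient construction reads almost like pseudocode, so a small numpy sketch may help: form all paired differences of observations, keep the pairs in which a single predictor dominates the change, and regress the response differences on them. The 0.8 dominance threshold and the synthetic data are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)
n, p = 100, 3
X = rng.normal(size=(n, p))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=n)

# All paired differences of observations.
i, j = np.triu_indices(n, k=1)
dX, dy = X[i] - X[j], y[i] - y[j]

# Keep pairs where one predictor accounts for most of the movement, so the
# retained difference matrix is close to an orthogonal design.
share = np.abs(dX) / np.abs(dX).sum(axis=1, keepdims=True)
keep = share.max(axis=1) > 0.8

coef, *_ = np.linalg.lstsq(dX[keep], dy[keep], rcond=None)   # approx. (2, -1, 0.5)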
32. Hahn, Jinyong. "Bootstrapping Quantile Regression Estimators." Econometric Theory 11, no. 1 (February 1995): 105–21. http://dx.doi.org/10.1017/s0266466600009051.

Abstract: The asymptotic variance matrix of the quantile regression estimator depends on the density of the error. For both deterministic and random regressors, the bootstrap distribution is shown to converge weakly to the limit distribution of the quantile regression estimator in probability. Thus, the confidence intervals constructed by the bootstrap percentile method have asymptotically correct coverage probabilities.
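The percentile bootstrap analyzed here is easy to state in code. A brief sketch using statsmodels' QuantReg, with illustrative data and replication count: resample (x, y) pairs, refit the median regression, and read confidence limits off the empirical quantiles of the bootstrap coefficients.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=n)   # heavy-tailed errors
X = sm.add_constant(x)

fit = sm.QuantReg(y, X).fit(q=0.5)                 # median regression
boot = np.empty((500, 2))
for b in range(boot.shape[0]):
    idx = rng.integers(0, n, size=n)               # resample (x, y) pairs
    boot[b] = sm.QuantReg(y[idx], X[idx]).fit(q=0.5).params

lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)  # percentile-method interval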
33. Shang, Pan, and Lingchen Kong. "On the Degrees of Freedom of Mixed Matrix Regression." Mathematical Problems in Engineering 2017 (2017): 1–8. http://dx.doi.org/10.1155/2017/6942865.

Abstract: With the increasing prominence of big data in modern science, data of interest are more complex and stochastic. To deal with complex matrix and vector data, this paper focuses on the mixed matrix regression model. We mainly establish the degrees of freedom of the underlying stochastic model, one of the important ingredients in constructing adaptive selection criteria for efficiently selecting the optimal model fit. Under some mild conditions, we prove that the degrees of freedom of the mixed matrix regression model are the sum of the degrees of freedom of the Lasso and of regularized matrix regression. Moreover, we establish the degrees of freedom of nuclear-norm regularized multivariate regression. Furthermore, we prove that the estimates of the degrees of freedom of the underlying models possess the consistency property.
34. Zheng, Sheng, Yuqiu Sun, Jinwen Tian, and Jain Liu. "Mapped Least Squares Support Vector Machine Regression." International Journal of Pattern Recognition and Artificial Intelligence 19, no. 3 (May 2005): 459–75. http://dx.doi.org/10.1142/s0218001405004058.

Abstract: This paper describes a novel version of regression SVM (Support Vector Machines) that is based on the least-squares error. We show that the solution of this optimization problem can be obtained easily once the inverse of a certain matrix is computed. This matrix, however, depends only on the input vectors, not on the labels. Thus, if many learning problems with the same set of input vectors but different sets of labels have to be solved, it makes sense to compute the inverse of the matrix just once and then use it for computing all subsequent models. The computational complexity to train a regression SVM can be reduced to O(N^2), just a matrix multiplication operation, and is thus likely faster than known SVM training algorithms that have O(N^2) work within loops. We describe applications from image processing, where the input points are usually of the form {(x0 + dx, y0 + dy) : |dx| < m, |dy| < n}, and every such set of points can be translated to the same set {(dx, dy) : |dx| < m, |dy| < n} by subtracting (x0, y0) from all the vectors. The experimental results demonstrate that the proposed approach is faster than processing each learning problem separately.
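The reuse trick deserves a concrete sketch. In a simplified, bias-free least-squares SVM (equivalent to kernel ridge regression), the matrix to factorize depends only on the inputs, so one factorization serves every label vector defined on the same inputs; the kernel and regularization choices below are assumptions.

import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(5)
n = 150
X = rng.normal(size=(n, 2))

# RBF kernel matrix: depends only on the inputs, never on the labels.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-0.5 * sq)
factor = cho_factor(K + 1e-2 * np.eye(n))          # factorize once

# Several learning problems sharing the same inputs reuse the factorization.
labels = [np.sin(X[:, 0]), np.cos(X[:, 1]), X[:, 0] * X[:, 1]]
alphas = [cho_solve(factor, y) for y in labels]    # dual coefficients per problem
preds = [K @ a for a in alphas]                    # in-sample predictions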
35. Mahaboob, B., J. P. Praveen, B. V. A. Rao, Y. Harnath, C. Narayana, and G. B. Prakash. "A Study on Multiple Linear Regression Using Matrix Calculus." Advances in Mathematics: Scientific Journal 9, no. 7 (August 2, 2020): 4863–72. http://dx.doi.org/10.37418/amsj.9.7.52.

36. Foucart, Thierry. "Stability of the inverse correlation matrix. Partial ridge regression." Journal of Statistical Planning and Inference 77, no. 1 (February 1999): 141–54. http://dx.doi.org/10.1016/s0378-3758(98)00195-5.

37. Mi, Jian-Xun, Zhiheng Luo, Li-Fang Zhou, and Fujin Zhong. "Bilateral structure based matrix regression classification for face recognition." Neurocomputing 348 (July 2019): 107–19. http://dx.doi.org/10.1016/j.neucom.2018.05.123.

38. Yuan, Shi-Fang, Yi-Bin Yu, Ming-Zhao Li, and Hua Jiang. "A direct method to Frobenius norm-based matrix regression." International Journal of Computer Mathematics 97, no. 9 (September 26, 2019): 1767–80. http://dx.doi.org/10.1080/00207160.2019.1668558.

39. Piribauer, Philipp, and Manfred M. Fischer. "Model Uncertainty in Matrix Exponential Spatial Growth Regression Models." Geographical Analysis 47, no. 3 (September 17, 2014): 240–61. http://dx.doi.org/10.1111/gean.12057.

40. Aubin, Elisete da Conceição Q., and Gauss M. Cordeiro. "Bias in linear regression models with unknown covariance matrix." Communications in Statistics - Simulation and Computation 26, no. 3 (January 1997): 813–28. http://dx.doi.org/10.1080/03610919708813413.

41. Kalivas, John H. "Cyclic subspace regression with analysis of the hat matrix." Chemometrics and Intelligent Laboratory Systems 45, no. 1-2 (January 1999): 215–24. http://dx.doi.org/10.1016/s0169-7439(98)00106-3.

42. Lipovetsky, Stan, and W. Michael Conklin. "Dual- and triple-mode matrix approximation and regression modelling." Applied Stochastic Models in Business and Industry 19, no. 4 (2003): 291–301. http://dx.doi.org/10.1002/asmb.503.

43. Bargiela, Andrzej, and Joanna K. Hartley. "Orthogonal linear regression algorithm based on augmented matrix formulation." Computers & Operations Research 20, no. 8 (October 1993): 829–36. http://dx.doi.org/10.1016/0305-0548(93)90104-q.

44. Zhang, Jianguang, Jianmin Jiang, and Yahong Han. "Semisupervised Regression With Optimized Rank for Matrix Data Classification." IEEE Transactions on Cybernetics 49, no. 9 (September 2019): 3443–56. http://dx.doi.org/10.1109/tcyb.2018.2844860.

45. Sinay, Marick S., and John S. J. Hsu. "Bayesian Inference of a Multivariate Regression Model." Journal of Probability and Statistics 2014 (2014): 1–13. http://dx.doi.org/10.1155/2014/673657.

Abstract: We explore Bayesian inference of a multivariate linear regression model with the use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior: a multivariate normal distribution for the regression coefficients and an inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation among the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
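The matrix-logarithm parameterization at the heart of this prior can be sketched in a few lines: any symmetric matrix maps through the matrix exponential to a valid positive definite covariance, and the map is invertible. This only illustrates the parameterization; the prior itself and the Metropolis-Hastings-within-Gibbs machinery are omitted.

import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(6)
k = 3
A = rng.normal(size=(k, k))
S = (A + A.T) / 2.0                    # unconstrained symmetric "log-covariance"

cov = expm(S)                          # always symmetric positive definite
assert np.all(np.linalg.eigvalsh(cov) > 0)

back = logm(cov)                       # invert the map to recover S
assert np.allclose(back, S, atol=1e-6)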
46. Yang, Ming, Ying-ming Li, and Zhongfei Zhang. "Scientific articles recommendation with topic regression and relational matrix factorization." Journal of Zhejiang University SCIENCE C 15, no. 11 (November 2014): 984–98. http://dx.doi.org/10.1631/jzus.c1300374.

47. Mao, Shangqin, Xinhan Huang, and Min Wang. "Image Jacobian Matrix Estimation Based on Online Support Vector Regression." International Journal of Advanced Robotic Systems 9, no. 4 (October 2012): 111. http://dx.doi.org/10.5772/51833.

48. Deng, Yang-Jun, Heng-Chao Li, Qi Wang, and Qian Du. "Nuclear norm-based matrix regression preserving embedding for face recognition." Neurocomputing 311 (October 2018): 279–90. http://dx.doi.org/10.1016/j.neucom.2018.05.078.

49. Ohsaki, Miho, Peng Wang, Kenji Matsuda, Shigeru Katagiri, Hideyuki Watanabe, and Anca Ralescu. "Confusion-Matrix-Based Kernel Logistic Regression for Imbalanced Data Classification." IEEE Transactions on Knowledge and Data Engineering 29, no. 9 (September 1, 2017): 1806–19. http://dx.doi.org/10.1109/tkde.2017.2682249.

50. Wu, JiTao, DiRong Chen, and Heng Chen. "Semi-supervised learning for regression based on the diffusion matrix." SCIENTIA SINICA Mathematica 44, no. 4 (March 1, 2014): 399–408. http://dx.doi.org/10.1360/n012013-00116.

