Journal articles on the topic "Low-Rank matrix approximation"

Check out the top 50 scholarly journal articles on the topic "Low-Rank matrix approximation".

An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, provided the relevant details are available in the metadata.

Browse journal articles from a wide range of disciplines and compile your bibliography accordingly.

1

Liu, Ting, Mingjian Sun, Naizhang Feng, Minghua Wang, Deying Chen, and Yi Shen. "Sparse photoacoustic microscopy based on low-rank matrix approximation." Chinese Optics Letters 14, no. 9 (2016): 091701–091705. http://dx.doi.org/10.3788/col201614.091701.

2

Parekh, Ankit, and Ivan W. Selesnick. "Enhanced Low-Rank Matrix Approximation." IEEE Signal Processing Letters 23, no. 4 (2016): 493–97. http://dx.doi.org/10.1109/lsp.2016.2535227.

3

Fomin, Fedor V., Petr A. Golovach, and Fahad Panolan. "Parameterized low-rank binary matrix approximation." Data Mining and Knowledge Discovery 34, no. 2 (2020): 478–532. http://dx.doi.org/10.1007/s10618-019-00669-5.

4

Fomin, Fedor V., Petr A. Golovach, Daniel Lokshtanov, Fahad Panolan, and Saket Saurabh. "Approximation Schemes for Low-rank Binary Matrix Approximation Problems." ACM Transactions on Algorithms 16, no. 1 (2020): 1–39. http://dx.doi.org/10.1145/3365653.

5

Jia, Yuheng, Hui Liu, Junhui Hou, and Qingfu Zhang. "Clustering Ensemble Meets Low-rank Tensor Approximation." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 9 (2021): 7970–78. http://dx.doi.org/10.1609/aaai.v35i9.16972.

Abstract:
This paper explores the problem of clustering ensemble, which aims to combine multiple base clusterings to produce better performance than that of the individual one. The existing clustering ensemble methods generally construct a co-association matrix, which indicates the pairwise similarity between samples, as the weighted linear combination of the connective matrices from different base clusterings, and the resulting co-association matrix is then adopted as the input of an off-the-shelf clustering algorithm, e.g., spectral clustering. However, the co-association matrix may be dominated by po
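
For readers unfamiliar with the co-association construction sketched in this abstract, the following minimal Python example (not the authors' tensor-based method) builds the co-association matrix as the average of the connective matrices of several base k-means clusterings and hands it to spectral clustering; the toy data, the number of base clusterings, and all parameter values are illustrative assumptions.

import numpy as np
from sklearn.cluster import KMeans, SpectralClustering

def co_association(base_labels):
    # Entry (i, j) is the fraction of base clusterings that place samples i and j
    # in the same cluster, i.e. the average of the binary connective matrices.
    n = len(base_labels[0])
    co = np.zeros((n, n))
    for labels in base_labels:
        labels = np.asarray(labels)
        co += (labels[:, None] == labels[None, :]).astype(float)
    return co / len(base_labels)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])  # two toy clusters
base = [KMeans(n_clusters=2, n_init=5, random_state=s).fit_predict(X) for s in range(10)]

co = co_association(base)  # pairwise similarities in [0, 1]
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(co)
print(labels[:10])
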
6

Zhang, Zhenyue, and Keke Zhao. "Low-Rank Matrix Approximation with Manifold Regularization." IEEE Transactions on Pattern Analysis and Machine Intelligence 35, no. 7 (2013): 1717–29. http://dx.doi.org/10.1109/tpami.2012.274.

7

Xu, An-Bao, and Dongxiu Xie. "Low-rank approximation pursuit for matrix completion." Mechanical Systems and Signal Processing 95 (October 2017): 77–89. http://dx.doi.org/10.1016/j.ymssp.2017.03.024.

8

Barlow, Jesse L., and Hasan Erbay. "Modifiable low-rank approximation to a matrix." Numerical Linear Algebra with Applications 16, no. 10 (2009): 833–60. http://dx.doi.org/10.1002/nla.651.

9

Zhang, Jiani, Jennifer Erway, Xiaofei Hu, Qiang Zhang, and Robert Plemmons. "Randomized SVD Methods in Hyperspectral Imaging." Journal of Electrical and Computer Engineering 2012 (2012): 1–15. http://dx.doi.org/10.1155/2012/409357.

Abstract:
We present a randomized singular value decomposition (rSVD) method for the purposes of lossless compression, reconstruction, classification, and target detection with hyperspectral (HSI) data. Recent work in low-rank matrix approximations obtained from random projections suggests that these approximations are well suited for randomized dimensionality reduction. Approximation errors for the rSVD are evaluated on HSI, and comparisons are made to deterministic techniques and as well as to other randomized low-rank matrix approximation methods involving compressive principal component analysis. Nu
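
As background for the randomized SVD (rSVD) mentioned in this abstract, here is a generic sketch of the random-projection recipe (Gaussian sketch, thin QR, exact SVD of the small projected matrix); it is not the authors' hyperspectral pipeline, and the matrix, target rank, and oversampling parameter are arbitrary assumptions.

import numpy as np

def randomized_svd(A, rank, oversample=10, seed=0):
    # Sketch the range of A with a Gaussian test matrix, orthonormalize the sketch,
    # then compute a deterministic SVD of the small projected matrix.
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(A @ Omega)           # orthonormal basis for the sketched range
    Ub, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

rng = np.random.default_rng(1)
A = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 200))  # rank-20 target
A += 0.01 * rng.standard_normal(A.shape)                             # plus a little noise
U, s, Vt = randomized_svd(A, rank=20)
print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))          # small relative error
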
10

Soto-Quiros, Pablo. "Error analysis of the generalized low-rank matrix approximation." Electronic Journal of Linear Algebra 37 (July 23, 2021): 544–48. http://dx.doi.org/10.13001/ela.2021.5961.

Abstract:
In this paper, we propose an error analysis of the generalized low-rank approximation, which is a generalization of the classical approximation of a matrix $A\in\mathbb{R}^{m\times n}$ by a matrix of a rank at most $r$, where $r\leq\min\{m,n\}$.
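
For context, the classical problem that the paper generalizes is the best approximation of $A\in\mathbb{R}^{m\times n}$ by a matrix of rank at most $r$, whose error is given by the Eckart-Young-Mirsky theorem (stated here as background only; the generalized approximation analyzed in the paper is not reproduced):

\[
\min_{\operatorname{rank}(B)\le r}\|A-B\|_F=\Big(\sum_{i=r+1}^{\min\{m,n\}}\sigma_i^2\Big)^{1/2},
\qquad
\min_{\operatorname{rank}(B)\le r}\|A-B\|_2=\sigma_{r+1},
\]

with both minima attained at the truncated SVD $A_r=\sum_{i=1}^{r}\sigma_i u_i v_i^{\top}$, where $\sigma_1\ge\sigma_2\ge\cdots$ are the singular values of $A$.
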
11

Tropp, Joel A., Alp Yurtsever, Madeleine Udell, and Volkan Cevher. "Practical Sketching Algorithms for Low-Rank Matrix Approximation." SIAM Journal on Matrix Analysis and Applications 38, no. 4 (2017): 1454–85. http://dx.doi.org/10.1137/17m1111590.

12

Liu, Huafeng, Liping Jing, Yuhua Qian, and Jian Yu. "Adaptive Local Low-rank Matrix Approximation for Recommendation." ACM Transactions on Information Systems 37, no. 4 (2019): 1–34. http://dx.doi.org/10.1145/3360488.

13

Persson, David, and Daniel Kressner. "Randomized Low-Rank Approximation of Monotone Matrix Functions." SIAM Journal on Matrix Analysis and Applications 44, no. 2 (2023): 894–918. http://dx.doi.org/10.1137/22m1523923.

14

Amini, Arash, Amin Karbasi, and Farokh Marvasti. "Low-Rank Matrix Approximation Using Point-Wise Operators." IEEE Transactions on Information Theory 58, no. 1 (2012): 302–10. http://dx.doi.org/10.1109/tit.2011.2167714.

15

Hou, Junhui, Lap-Pui Chau, Nadia Magnenat-Thalmann, and Ying He. "Sparse Low-Rank Matrix Approximation for Data Compression." IEEE Transactions on Circuits and Systems for Video Technology 27, no. 5 (2017): 1043–54. http://dx.doi.org/10.1109/tcsvt.2015.2513698.

16

Zhang, Zhenyue, and Lixin Wu. "Optimal low-rank approximation to a correlation matrix." Linear Algebra and its Applications 364 (May 2003): 161–87. http://dx.doi.org/10.1016/s0024-3795(02)00551-7.

17

Gillis, Nicolas, and Yaroslav Shitov. "Low-rank matrix approximation in the infinity norm." Linear Algebra and its Applications 581 (November 2019): 367–82. http://dx.doi.org/10.1016/j.laa.2019.07.017.

18

Song, Guang-Jing, and Michael K. Ng. "Nonnegative low rank matrix approximation for nonnegative matrices." Applied Mathematics Letters 105 (July 2020): 106300. http://dx.doi.org/10.1016/j.aml.2020.106300.

19

van der Veen, Alle-Jan. "A Schur Method for Low-Rank Matrix Approximation." SIAM Journal on Matrix Analysis and Applications 17, no. 1 (1996): 139–60. http://dx.doi.org/10.1137/s0895479893261340.

20

Sun, Dongxia, and Lihong Zhi. "Structured Low Rank Approximation of a Bezout Matrix." Mathematics in Computer Science 1, no. 2 (2007): 427–37. http://dx.doi.org/10.1007/s11786-007-0014-6.

21

Mena, Hermann, Alexander Ostermann, Lena-Maria Pfurtscheller, and Chiara Piazzola. "Numerical low-rank approximation of matrix differential equations." Journal of Computational and Applied Mathematics 340 (October 2018): 602–14. http://dx.doi.org/10.1016/j.cam.2018.01.035.

22

Zhu, E., M. Xu, and D. Pi. "A Novel Robust Principal Component Analysis Algorithm of Nonconvex Rank Approximation." Mathematical Problems in Engineering 2020 (September 30, 2020): 1–17. http://dx.doi.org/10.1155/2020/9356935.

Abstract:
Noise exhibits low rank or no sparsity in the low-rank matrix recovery, and the nuclear norm is not an accurate rank approximation of low-rank matrix. In the present study, to solve the mentioned problem, a novel nonconvex approximation function of the low-rank matrix was proposed. Subsequently, based on the nonconvex rank approximation function, a novel model of robust principal component analysis was proposed. Such model was solved with the alternating direction method, and its convergence was verified theoretically. Subsequently, the background separation experiments were performed on the W
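
For orientation, the sketch below implements the classical convex robust PCA model (nuclear norm plus an l1 penalty, M = L + S) with a basic fixed-penalty ADMM loop built from singular value thresholding and soft-thresholding; the nonconvex rank surrogate proposed in the paper is not implemented here, and the default parameters and toy data are illustrative assumptions.

import numpy as np

def shrink(X, tau):
    # Soft-thresholding: proximal operator of the l1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    # Singular value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def rpca_admm(M, lam=None, mu=None, n_iter=300):
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / (np.abs(M).sum() + 1e-12)
    L, S, Y = np.zeros_like(M), np.zeros_like(M), np.zeros_like(M)
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)        # low-rank update
        S = shrink(M - L + Y / mu, lam / mu)     # sparse update
        Y = Y + mu * (M - L - S)                 # dual (multiplier) update
    return L, S

rng = np.random.default_rng(0)
low_rank = rng.standard_normal((80, 5)) @ rng.standard_normal((5, 60))
outliers = (rng.random((80, 60)) < 0.05) * rng.normal(0, 10, (80, 60))
L, S = rpca_admm(low_rank + outliers)
print(np.linalg.norm(L - low_rank) / np.linalg.norm(low_rank))  # recovery error
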
23

Fernández-Val, Iván, Hugo Freeman, and Martin Weidner. "Low-rank approximations of nonseparable panel models." Econometrics Journal 24, no. 2 (2021): C40–C77. http://dx.doi.org/10.1093/ectj/utab007.

Abstract:
We provide estimation methods for nonseparable panel models based on low-rank factor structure approximations. The factor structures are estimated by matrix-completion methods to deal with the computational challenges of principal component analysis in the presence of missing data. We show that the resulting estimators are consistent in large panels, but suffer from approximation and shrinkage biases. We correct these biases using matching and difference-in-differences approaches. Numerical examples and an empirical application to the effect of election day registration on voter turnou
24

Chen, Zhilong, Peng Wang, and Detong Zhu. "Approximation Conjugate Gradient Method for Low-Rank Matrix Recovery." Symmetry 16, no. 5 (2024): 547. http://dx.doi.org/10.3390/sym16050547.

Abstract:
Large-scale symmetric and asymmetric matrices have emerged in predicting the relationship between genes and diseases. The emergence of large-scale matrices increases the computational complexity of the problem. Therefore, using low-rank matrices instead of original symmetric and asymmetric matrices can greatly reduce computational complexity. In this paper, we propose an approximation conjugate gradient method for solving the low-rank matrix recovery problem, i.e., the low-rank matrix is obtained to replace the original symmetric and asymmetric matrices such that the approximation error is the
25

Chang, Xiangyu, Yan Zhong, Yao Wang, and Shaobo Lin. "Unified Low-Rank Matrix Estimate via Penalized Matrix Least Squares Approximation." IEEE Transactions on Neural Networks and Learning Systems 30, no. 2 (2019): 474–85. http://dx.doi.org/10.1109/tnnls.2018.2844242.

26

Matveev, Sergey, and Stanislav Budzinskiy. "Sketching for a low-rank nonnegative matrix approximation: Numerical study." Russian Journal of Numerical Analysis and Mathematical Modelling 38, no. 2 (2023): 99–114. http://dx.doi.org/10.1515/rnam-2023-0009.

Abstract:
We propose new approximate alternating projection methods, based on randomized sketching, for the low-rank nonnegative matrix approximation problem: find a low-rank approximation of a nonnegative matrix that is nonnegative, but whose factors can be arbitrary. We calculate the computational complexities of the proposed methods and evaluate their performance in numerical experiments. The comparison with the known deterministic alternating projection methods shows that the randomized approaches are faster and exhibit similar convergence properties.
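
A bare-bones deterministic alternating-projection loop for the problem stated in this abstract, alternating between the closest matrix of rank at most r (truncated SVD) and the nonnegative orthant (entrywise clipping), might look as follows; the randomized sketching that is the paper's contribution is not shown, and the matrix sizes, rank, and iteration count are arbitrary assumptions.

import numpy as np

def alternating_projections(A, r, n_iter=50):
    X = A.copy()
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :r] * s[:r]) @ Vt[:r]   # project onto the set of rank-<=r matrices
        X = np.maximum(X, 0.0)            # project onto the nonnegative orthant
    return X                              # nonnegative and approximately rank r

rng = np.random.default_rng(0)
A = np.abs(rng.standard_normal((100, 80)))        # a nonnegative target matrix
X = alternating_projections(A, r=10)
print(np.linalg.norm(A - X) / np.linalg.norm(A), X.min() >= 0)
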
27

Horasan, Fahrettin, Hasan Erbay, Fatih Varçın, and Emre Deniz. "Alternate Low-Rank Matrix Approximation in Latent Semantic Analysis." Scientific Programming 2019 (February 3, 2019): 1–12. http://dx.doi.org/10.1155/2019/1095643.

Abstract:
The latent semantic analysis (LSA) is a mathematical/statistical way of discovering hidden concepts between terms and documents or within a document collection (i.e., a large corpus of text). Each document of the corpus and terms are expressed as a vector with elements corresponding to these concepts to form a term-document matrix. Then, the LSA uses a low-rank approximation to the term-document matrix in order to remove irrelevant information, to extract more important relations, and to reduce the computational time. The irrelevant information is called as “noise” and does not have a notewort
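
To make the low-rank step of LSA concrete, here is a minimal generic example: a rank-2 truncated SVD of a tiny document-term count matrix built with scikit-learn. It illustrates standard LSA rather than the alternate low-rank approximation the authors propose, and the toy corpus and the number of concepts are made-up assumptions.

import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "low rank matrix approximation",
    "singular value decomposition of a matrix",
    "latent semantic analysis of text documents",
    "text mining and semantic indexing",
]
X = CountVectorizer().fit_transform(docs)   # document-term count matrix (sparse)

svd = TruncatedSVD(n_components=2, random_state=0)
doc_concepts = svd.fit_transform(X)         # documents expressed in a 2-concept space
print(np.round(doc_concepts, 2))
print("explained variance:", np.round(svd.explained_variance_ratio_, 2))
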
28

Nie, Feiping, Zhanxuan Hu, and Xuelong Li. "Matrix Completion Based on Non-Convex Low-Rank Approximation." IEEE Transactions on Image Processing 28, no. 5 (2019): 2378–88. http://dx.doi.org/10.1109/tip.2018.2886712.

29

Zheng, Jianwei, Mengjie Qin, Xiaolong Zhou, Jiafa Mao, and Hongchuan Yu. "Efficient Implementation of Truncated Reweighting Low-Rank Matrix Approximation." IEEE Transactions on Industrial Informatics 16, no. 1 (2020): 488–500. http://dx.doi.org/10.1109/tii.2019.2916986.

30

Pitaval, Renaud-Alexandre, Wei Dai, and Olav Tirkkonen. "Convergence of Gradient Descent for Low-Rank Matrix Approximation." IEEE Transactions on Information Theory 61, no. 8 (2015): 4451–57. http://dx.doi.org/10.1109/tit.2015.2448695.

31

Chen, Pei. "Heteroscedastic Low-Rank Matrix Approximation by the Wiberg Algorithm." IEEE Transactions on Signal Processing 56, no. 4 (2008): 1429–39. http://dx.doi.org/10.1109/tsp.2007.909353.

32

Duan, Xuefeng, Jiaofen Li, Qingwen Wang, and Xinjun Zhang. "Low rank approximation of the symmetric positive semidefinite matrix." Journal of Computational and Applied Mathematics 260 (April 2014): 236–43. http://dx.doi.org/10.1016/j.cam.2013.09.080.

33

Mohd Sagheer, Sameera V., and Sudhish N. George. "Ultrasound image despeckling using low rank matrix approximation approach." Biomedical Signal Processing and Control 38 (September 2017): 236–49. http://dx.doi.org/10.1016/j.bspc.2017.06.011.

34

Luo, Yu, and Jie Ling. "Single-image de-raining using low-rank matrix approximation." Neural Computing and Applications 32, no. 11 (2019): 7503–14. http://dx.doi.org/10.1007/s00521-019-04271-0.

35

Li, Chi-Kwong, and Gilbert Strang. "An elementary proof of Mirsky's low rank approximation theorem." Electronic Journal of Linear Algebra 36 (2020): 694–97. http://dx.doi.org/10.13001/ela.2020.5551.

36

Shi, Chengfei, Zhengdong Huang, Li Wan, and Tifan Xiong. "Low-Rank Tensor Completion Based on Log-Det Rank Approximation and Matrix Factorization." Journal of Scientific Computing 80, no. 3 (2019): 1888–912. http://dx.doi.org/10.1007/s10915-019-01009-x.

37

Lebedeva, O. S., A. I. Osinsky, and S. V. Petrov. "Low-Rank Approximation Algorithms for Matrix Completion with Random Sampling." Computational Mathematics and Mathematical Physics 61, no. 5 (2021): 799–815. http://dx.doi.org/10.1134/s0965542521050122.

38

Huang, Zhi-Long, and Hsu-Feng Hsiao. "Inter-frame Prediction with Fast Weighted Low-rank Matrix Approximation." International Journal of Electronics and Telecommunications 59, no. 1 (2013): 9–16. http://dx.doi.org/10.2478/eletel-2013-0001.

Abstract:
In the field of video coding, inter-frame prediction plays an important role in improving compression efficiency. The improved efficiency is achieved by finding predictors for video blocks such that the residual data can be close to zero as much as possible. For recent video coding standards, motion vectors are required for a decoder to locate the predictors during video reconstruction. Block matching algorithms are usually utilized in the stage of motion estimation to find such motion vectors. For decoder-side motion derivation, proper templates are defined and template matching algo
39

Kirsteins, I. P., and D. W. Tufts. "Adaptive detection using low rank approximation to a data matrix." IEEE Transactions on Aerospace and Electronic Systems 30, no. 1 (1994): 55–67. http://dx.doi.org/10.1109/7.250406.

40

Hutchings, Matthew, and Bertrand Gauthier. "Energy-Based Sequential Sampling for Low-Rank PSD-Matrix Approximation." SIAM Journal on Mathematics of Data Science 6, no. 4 (2024): 1055–77. http://dx.doi.org/10.1137/23m162449x.

41

Xu, Fei, Jingqi Han, Yongli Wang, et al. "Dynamic Magnetic Resonance Imaging via Nonconvex Low-Rank Matrix Approximation." IEEE Access 5 (2017): 1958–66. http://dx.doi.org/10.1109/access.2017.2657645.

42

Zhou, Guoxu, Andrzej Cichocki, and Shengli Xie. "Fast Nonnegative Matrix/Tensor Factorization Based on Low-Rank Approximation." IEEE Transactions on Signal Processing 60, no. 6 (2012): 2928–40. http://dx.doi.org/10.1109/tsp.2012.2190410.

43

Nechepurenko, Yuri M., and Miloud Sadkane. "A Low-Rank Approximation for Computing the Matrix Exponential Norm." SIAM Journal on Matrix Analysis and Applications 32, no. 2 (2011): 349–63. http://dx.doi.org/10.1137/100789774.

44

Shen, Haipeng, and Jianhua Z. Huang. "Sparse principal component analysis via regularized low rank matrix approximation." Journal of Multivariate Analysis 99, no. 6 (2008): 1015–34. http://dx.doi.org/10.1016/j.jmva.2007.06.007.

45

Feng, Xingdong, and Xuming He. "Statistical inference based on robust low-rank data matrix approximation." Annals of Statistics 42, no. 1 (2014): 190–210. http://dx.doi.org/10.1214/13-aos1186.

46

Gillard, J. W., and A. A. Zhigljavsky. "Stochastic algorithms for solving structured low-rank matrix approximation problems." Communications in Nonlinear Science and Numerical Simulation 21, no. 1-3 (2015): 70–88. http://dx.doi.org/10.1016/j.cnsns.2014.08.023.

47

Chang, Haixia. "Constrained Low Rank Approximation of the Hermitian Nonnegative-Definite Matrix." Advances in Linear Algebra & Matrix Theory 10, no. 02 (2020): 22–33. http://dx.doi.org/10.4236/alamt.2020.102003.

48

Chen, Yongyong, Yanwen Guo, Yongli Wang, Dong Wang, Chong Peng, and Guoping He. "Denoising of Hyperspectral Images Using Nonconvex Low Rank Matrix Approximation." IEEE Transactions on Geoscience and Remote Sensing 55, no. 9 (2017): 5366–80. http://dx.doi.org/10.1109/tgrs.2017.2706326.

49

Inoue, Kohei, and Kiichi Urahama. "Dimensionality reduction by simultaneous low-rank approximation of matrix data." Electronics and Communications in Japan (Part II: Electronics) 90, no. 9 (2007): 42–49. http://dx.doi.org/10.1002/ecjb.20379.

50

Chang, Haixia, Chunmei Li, and Longsheng Liu. "Generalized low-rank approximation to the symmetric positive semidefinite matrix." AIMS Mathematics 10, no. 4 (2025): 8022–35. https://doi.org/10.3934/math.2025368.
