Selection of scientific literature on the topic "Low-Rank matrix approximation"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles


Consult the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Low-Rank matrix approximation."

Next to every entry in the bibliography you will find an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read its online abstract, provided the relevant parameters are available in the metadata.

Journal articles on the topic "Low-Rank matrix approximation"

1

Liu, Ting, Mingjian Sun, Naizhang Feng, Minghua Wang, Deying Chen, and Yi Shen. "Sparse photoacoustic microscopy based on low-rank matrix approximation." Chinese Optics Letters 14, no. 9 (2016): 091701–91705. http://dx.doi.org/10.3788/col201614.091701.

2

Parekh, Ankit, and Ivan W. Selesnick. "Enhanced Low-Rank Matrix Approximation." IEEE Signal Processing Letters 23, no. 4 (2016): 493–97. http://dx.doi.org/10.1109/lsp.2016.2535227.

3

Fomin, Fedor V., Petr A. Golovach, and Fahad Panolan. "Parameterized low-rank binary matrix approximation." Data Mining and Knowledge Discovery 34, no. 2 (2020): 478–532. http://dx.doi.org/10.1007/s10618-019-00669-5.

4

Fomin, Fedor V., Petr A. Golovach, Daniel Lokshtanov, Fahad Panolan, and Saket Saurabh. "Approximation Schemes for Low-rank Binary Matrix Approximation Problems." ACM Transactions on Algorithms 16, no. 1 (2020): 1–39. http://dx.doi.org/10.1145/3365653.

5

Jia, Yuheng, Hui Liu, Junhui Hou, and Qingfu Zhang. "Clustering Ensemble Meets Low-rank Tensor Approximation." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 9 (2021): 7970–78. http://dx.doi.org/10.1609/aaai.v35i9.16972.

Abstract:
This paper explores the problem of clustering ensemble, which aims to combine multiple base clusterings to produce better performance than that of the individual one. The existing clustering ensemble methods generally construct a co-association matrix, which indicates the pairwise similarity between samples, as the weighted linear combination of the connective matrices from different base clusterings, and the resulting co-association matrix is then adopted as the input of an off-the-shelf clustering algorithm, e.g., spectral clustering. However, the co-association matrix may be dominated by po…
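
The co-association construction described in this abstract can be illustrated with a small, generic sketch (an editorial illustration, not the authors' code): each entry of the matrix accumulates, over the base clusterings, a weight whenever two samples fall into the same cluster.

```python
import numpy as np

def co_association(base_labelings, weights=None):
    """Weighted co-association matrix: S[i, j] sums w_k over clusterings k that put i and j together."""
    n = len(base_labelings[0])
    weights = weights or [1.0 / len(base_labelings)] * len(base_labelings)
    S = np.zeros((n, n))
    for w, labels in zip(weights, base_labelings):
        labels = np.asarray(labels)
        S += w * (labels[:, None] == labels[None, :])   # connective matrix of one base clustering
    return S   # similarity matrix, e.g. input to spectral clustering

# toy usage: three base clusterings of five samples
S = co_association([[0, 0, 1, 1, 1], [0, 0, 0, 1, 1], [1, 1, 0, 0, 0]])
```
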
6

Zhang, Zhenyue, and Keke Zhao. "Low-Rank Matrix Approximation with Manifold Regularization." IEEE Transactions on Pattern Analysis and Machine Intelligence 35, no. 7 (2013): 1717–29. http://dx.doi.org/10.1109/tpami.2012.274.

7

Xu, An-Bao, and Dongxiu Xie. "Low-rank approximation pursuit for matrix completion." Mechanical Systems and Signal Processing 95 (October 2017): 77–89. http://dx.doi.org/10.1016/j.ymssp.2017.03.024.

8

Barlow, Jesse L., and Hasan Erbay. "Modifiable low-rank approximation to a matrix." Numerical Linear Algebra with Applications 16, no. 10 (2009): 833–60. http://dx.doi.org/10.1002/nla.651.

9

Zhang, Jiani, Jennifer Erway, Xiaofei Hu, Qiang Zhang, and Robert Plemmons. "Randomized SVD Methods in Hyperspectral Imaging." Journal of Electrical and Computer Engineering 2012 (2012): 1–15. http://dx.doi.org/10.1155/2012/409357.

Abstract:
We present a randomized singular value decomposition (rSVD) method for the purposes of lossless compression, reconstruction, classification, and target detection with hyperspectral (HSI) data. Recent work in low-rank matrix approximations obtained from random projections suggests that these approximations are well suited for randomized dimensionality reduction. Approximation errors for the rSVD are evaluated on HSI, and comparisons are made to deterministic techniques as well as to other randomized low-rank matrix approximation methods involving compressive principal component analysis. Nu…
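
For readers unfamiliar with the approach, the random-projection idea behind the rSVD can be sketched in a few lines of NumPy (a generic textbook version, not the authors' implementation):

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, seed=0):
    """Approximate truncated SVD of A via a Gaussian random projection."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + oversample))   # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)                        # orthonormal basis for the sampled range
    B = Q.T @ A                                           # small (rank+oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ Ub[:, :rank], s[:rank], Vt[:rank]          # A ≈ U @ diag(s) @ Vt
```
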
10

Soto-Quiros, Pablo. "Error analysis of the generalized low-rank matrix approximation." Electronic Journal of Linear Algebra 37 (July 23, 2021): 544–48. http://dx.doi.org/10.13001/ela.2021.5961.

Abstract:
In this paper, we propose an error analysis of the generalized low-rank approximation, which is a generalization of the classical approximation of a matrix $A\in\mathbb{R}^{m\times n}$ by a matrix of rank at most $r$, where $r\leq\min\{m,n\}$.
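
For context, the classical approximation by a matrix of rank at most r referred to here is the truncated SVD, which by the Eckart–Young theorem is optimal in the Frobenius and spectral norms; a minimal sketch:

```python
import numpy as np

def best_rank_r(A, r):
    """Best rank-r approximation of A (Eckart-Young): keep the r leading singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r]

A = np.random.default_rng(0).standard_normal((8, 6))
A2 = best_rank_r(A, 2)
# Frobenius error equals the root of the sum of the squared discarded singular values
err = np.linalg.norm(A - A2)
```
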

Dissertations on the topic "Low-Rank matrix approximation"

1

Robeyns, Matthieu. "Mixed precision algorithms for low-rank matrix and tensor approximations." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG095.

Abstract:
Data is often handled through mathematical objects such as matrices and tensors, the latter being the generalization of matrices to more than two dimensions. Some application domains require storing too many elements, producing tensors that are too large; this problem is known as the curse of dimensionality. Mathematical methods such as low-rank approximations have been developed to reduce the dimensionality of these objects, despite a very high computational cost. Moreover, new computer architectures such as GP…
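
A deliberately simplified illustration of the memory-versus-precision trade-off mentioned above (an assumption-laden sketch, not the algorithms developed in the thesis): compute the low-rank factors in double precision, then store them in a lower precision such as float32.

```python
import numpy as np

def low_rank_factors_mixed(A, r, store_dtype=np.float32):
    """Rank-r truncated-SVD factors computed in float64, stored in a lower precision."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    L = (U[:, :r] * s[:r]).astype(store_dtype)   # m x r factor
    R = Vt[:r].astype(store_dtype)               # r x n factor
    return L, R                                  # A ≈ L @ R at roughly half the storage cost
```
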
2

Blanchard, Pierre. "Fast hierarchical algorithms for the low-rank approximation of matrices, with applications to materials physics, geostatistics and data analysis." Thesis, Bordeaux, 2017. http://www.theses.fr/2017BORD0016/document.

Abstract:
Advanced techniques for the low-rank approximation of matrices are fundamental dimension-reduction tools in many areas of scientific computing. Hierarchical approaches such as H2 matrices, and in particular the fast multipole method (FMM), exploit the block low-rank structure of certain matrices to reduce the computational cost of n-body interaction problems to O(n) operations instead of O(n²). To better handle complex interaction kernels of various kinds, so-called "kernel-independent" FMM formulations …
3

Lee, Joonseok. "Local approaches for collaborative filtering." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53846.

Abstract:
Recommendation systems are emerging as an important business application as the demand for personalized services in E-commerce increases. Collaborative filtering techniques are widely used for predicting a user's preference or generating a list of items to be recommended. In this thesis, we develop several new approaches for collaborative filtering based on model combination and kernel smoothing. Specifically, we start with an experimental study that compares a wide variety of CF methods under different conditions. Based on this study, we formulate a combination model similar to boosting but w…
4

Kim, Jingu. "Nonnegative matrix and tensor factorizations, least squares problems, and applications." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42909.

Abstract:
Nonnegative matrix factorization (NMF) is a useful dimension reduction method that has been investigated and applied in various areas. NMF is considered for high-dimensional data in which each element has a nonnegative value, and it provides a low-rank approximation formed by factors whose elements are also nonnegative. The nonnegativity constraints imposed on the low-rank factors not only enable natural interpretation but also reveal the hidden structure of data. Extending the benefits of NMF to multidimensional arrays, nonnegative tensor factorization (NTF) has been shown to be successful in…
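
To make the factorization concrete, here is a minimal NMF sketch using the classical Lee–Seung multiplicative updates (one of several standard algorithms; not necessarily those studied in the thesis):

```python
import numpy as np

def nmf(A, r, iters=200, eps=1e-10, seed=0):
    """Nonnegative factors W (m x r) and H (r x n) with A ≈ W @ H, Frobenius-loss multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(iters):
        H *= (W.T @ A) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (A @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H
```
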
5

Galvin, Timothy Matthew. "Faster streaming algorithms for low-rank matrix approximations." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/91810.

Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2014. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 53-55). Low-rank matrix approximations are used in a significant number of applications. We present new algorithms for generating such approximations in a streaming fashion that expand upon recently discovered matrix sketching techniques. We test our approaches on real and synthetic data to explore runtime and accuracy performance. We apply our algorithms to the technique of Latent Sema…
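
One well-known matrix sketching technique in this streaming setting is Frequent Directions; the compact sketch below illustrates the idea under the assumptions that rows arrive one at a time and the sketch size does not exceed the number of columns (an illustrative baseline, not necessarily the thesis' algorithms).

```python
import numpy as np

def frequent_directions(rows, ell):
    """Maintain an ell-row sketch B so that B.T @ B approximates A.T @ A for the streamed rows of A."""
    B = None
    for a in rows:
        a = np.asarray(a, dtype=float)
        if B is None:
            B = np.zeros((ell, a.size))       # assumes ell <= number of columns
        free = np.flatnonzero(~B.any(axis=1))
        if free.size == 0:
            # sketch full: shrink all singular values by the smallest one, freeing at least one row
            _, s, Vt = np.linalg.svd(B, full_matrices=False)
            s = np.sqrt(np.maximum(s**2 - s[-1]**2, 0.0))
            B = s[:, None] * Vt
            free = np.flatnonzero(~B.any(axis=1))
        B[free[0]] = a
    return B
```
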
6

Abbas, Kinan. "Dématriçage et démélange conjoints d'images multispectrales." Electronic Thesis or Diss., Littoral, 2024. http://www.theses.fr/2024DUNK0710.

Abstract:
In this thesis, we consider images captured by a miniaturized "snapshot" multispectral (MS) camera. Unlike conventional RGB cameras, MS imaging observes a scene at dozens of different wavelengths, allowing a much more precise analysis of the observed content. While most MS cameras require a scan to generate an image, snapshot MS cameras can deliver images, or even video, instantly. When the camera is miniaturized, instead of a 3D data cube it provides a 2D image, with each pixel …
7

Castorena, Juan. "Remote-Sensed LIDAR Using Random Impulsive Scans." International Foundation for Telemetering, 2012. http://hdl.handle.net/10150/581855.

Abstract:
Third generation full-waveform (FW) LIDAR systems image an entire scene by emitting laser pulses in particular directions and measuring the echoes. Each of these echoes provides range measurements about the objects intercepted by the laser pulse along a specified direction. By scanning through a specified region using a series of emitted pulses and observing their echoes, connected 1D profiles of 3D scenes can be readily obtained. This extra information has proven helpful in providing additional insight into the scene structure which can be used to construct effective characterizations and cla…
8

Vinyes, Marina. "Convex matrix sparsity for demixing with an application to graphical model structure estimation." Thesis, Paris Est, 2018. http://www.theses.fr/2018PESC1130/document.

Abstract:
In machine learning, the goal is to learn, from data, a model that is able to make predictions on new (previously unseen) data. To obtain a model that generalizes to new data and to avoid overfitting, we must constrain the model. These constraints generally encode prior knowledge about the structure of the model. The first approaches considered in the literature are Tikhonov regularization and, later, the Lasso to induce sparsity in the solution. Sparsity is part of …
9

Sadek, El Mostafa. "Méthodes itératives pour la résolution d'équations matricielles." Thesis, Littoral, 2015. http://www.theses.fr/2015DUNK0434/document.

Abstract:
In this thesis we are interested in iterative methods for solving large-scale matrix equations: Lyapunov, Sylvester, Riccati, and nonsymmetric Riccati equations. The objective is to find more efficient and faster iterative methods for solving large matrix equations. We propose projection-type iterative methods onto block Krylov subspaces $\mathcal{K}_m(A, V) = \operatorname{Range}\{V, AV, \ldots, A^{m-1}V\}$, or extended block Krylov subspaces $\mathcal{K}^e_m(A, V) = \operatorname{Range}\{V, A^{-1}V, AV, A^{-2}V, A^{2}V, \ldots, A^{m-1}V, A^{-m+1}V\}$. These methods are …
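
For readers unfamiliar with the notation, an orthonormal basis of the block Krylov subspace $\mathcal{K}_m(A, V)$ can be built with a block Arnoldi-style procedure; a rough, generic sketch (the solvers developed in the thesis are considerably more elaborate):

```python
import numpy as np

def block_krylov_basis(A, V, m):
    """Orthonormal basis of K_m(A, V) = Range{V, AV, ..., A^(m-1) V} via block Gram-Schmidt."""
    Q, _ = np.linalg.qr(V)                 # first block
    blocks = [Q]
    for _ in range(m - 1):
        W = A @ blocks[-1]
        for Qj in blocks:                  # orthogonalize against all previous blocks
            W = W - Qj @ (Qj.T @ W)
        Qk, _ = np.linalg.qr(W)
        blocks.append(Qk)
    return np.hstack(blocks)               # n x (m*p) basis, up to rank deficiencies
```
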
10

Winkler, Anderson M. "Widening the applicability of permutation inference." Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:ce166876-0aa3-449e-8496-f28bf189960c.

Abstract:
This thesis is divided into three main parts. In the first, we discuss that, although permutation tests can provide exact control of false positives under the reasonable assumption of exchangeability, there are common examples in which global exchangeability does not hold, such as in experiments with repeated measurements or tests in which subjects are related to each other. To allow permutation inference in such cases, we propose an extension of the well known concept of exchangeability blocks, allowing these to be nested in a hierarchical, multi-level definition. This definition allows permu…

Book chapters on the topic "Low-Rank matrix approximation"

1

Kannan, Ramakrishnan, Mariya Ishteva, Barry Drake, and Haesun Park. "Bounded Matrix Low Rank Approximation." In Signals and Communication Technology. Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-48331-2_4.

2

Friedland, Shmuel, and Venu Tammali. "Low-Rank Approximation of Tensors." In Numerical Algebra, Matrix Theory, Differential-Algebraic Equations and Control Theory. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-15260-8_14.

3

Dewilde, Patrick, and Alle-Jan van der Veen. "Low-Rank Matrix Approximation and Subspace Tracking." In Time-Varying Systems and Computations. Springer US, 1998. http://dx.doi.org/10.1007/978-1-4757-2817-0_11.

4

Zhang, Huaxiang, Zhichao Wang, and Linlin Cao. "Fast Nyström for Low Rank Matrix Approximation." In Advanced Data Mining and Applications. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-35527-1_38.

5

Deshpande, Amit, and Santosh Vempala. "Adaptive Sampling and Fast Low-Rank Matrix Approximation." In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11830924_28.

6

Evensen, Geir, Femke C. Vossepoel, and Peter Jan van Leeuwen. "Localization and Inflation." In Springer Textbooks in Earth Sciences, Geography and Environment. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-96709-3_10.

Abstract:
Localization and inflation have become essential means of mitigating the effects of the low-rank approximation in ensemble methods. Localization increases the effective rank of the ensemble covariance matrix and allows it to fit a large number of independent observations. Thus, we use localization to reduce sampling errors, in combination with inflation, to reduce the underestimation of the ensemble variance caused by the low-rank approximation. These methods are essential for high-dimensional applications, and this chapter will give a general introduction to various formulations of lo…
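
As a toy illustration of the two devices described here (a generic sketch under simple assumptions, not the book's formulations): inflation rescales ensemble perturbations about their mean, and localization tapers the sample covariance entry-wise with a distance-based correlation matrix (a Schur product), which raises its effective rank.

```python
import numpy as np

def inflate_and_localize(ensemble, taper, inflation=1.05):
    """ensemble: n_members x n_state array; taper: n_state x n_state localization weights in [0, 1]."""
    mean = ensemble.mean(axis=0)
    inflated = mean + inflation * (ensemble - mean)     # multiplicative inflation of the spread
    P = np.cov(inflated, rowvar=False)                  # low-rank sample covariance
    return inflated, taper * P                          # Schur product with the taper localizes P

# example taper: Gaussian decay with distance between state indices (a common simple choice)
n = 50
idx = np.arange(n)
taper = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 5.0) ** 2)
```
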
7

Li, Chong-Ya, Wenzheng Bao, Zhipeng Li, Youhua Zhang, Yong-Li Jiang, and Chang-An Yuan. "Local Sensitive Low Rank Matrix Approximation via Nonconvex Optimization." In Intelligent Computing Methodologies. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-63315-2_67.

8

Wacira, Joseph Muthui, Dinna Ranirina, and Bubacarr Bah. "Low Rank Matrix Approximation for Imputing Missing Categorical Data." In Artificial Intelligence Research. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95070-5_16.

9

Wu, Jiangang, and Shizhong Liao. "Accuracy-Preserving and Scalable Column-Based Low-Rank Matrix Approximation." In Knowledge Science, Engineering and Management. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25159-2_22.

10

Mantzaflaris, Angelos, Bert Jüttler, B. N. Khoromskij, and Ulrich Langer. "Matrix Generation in Isogeometric Analysis by Low Rank Tensor Approximation." In Curves and Surfaces. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22804-4_24.


Conference papers on the topic "Low-Rank matrix approximation"

1

Kannan, Ramakrishnan, Mariya Ishteva, and Haesun Park. "Bounded Matrix Low Rank Approximation." In 2012 IEEE 12th International Conference on Data Mining (ICDM). IEEE, 2012. http://dx.doi.org/10.1109/icdm.2012.131.

2

Li, Chong-Ya, Lin Zhu, Wen-Zheng Bao, Yong-Li Jiang, Chang-An Yuan, and De-Shuang Huang. "Convex local sensitive low rank matrix approximation." In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965863.

3

van der Veen, Alle-Jan. "Schur method for low-rank matrix approximation." In SPIE's 1994 International Symposium on Optics, Imaging, and Instrumentation, edited by Franklin T. Luk. SPIE, 1994. http://dx.doi.org/10.1117/12.190848.

4

Nadakuditi, Raj Rao. "Exploiting random matrix theory to improve noisy low-rank matrix approximation." In 2011 45th Asilomar Conference on Signals, Systems and Computers. IEEE, 2011. http://dx.doi.org/10.1109/acssc.2011.6190110.

5

Tatsukawa, Manami, and Mirai Tanaka. "Box Constrained Low-rank Matrix Approximation with Missing Values." In 7th International Conference on Operations Research and Enterprise Systems. SCITEPRESS - Science and Technology Publications, 2018. http://dx.doi.org/10.5220/0006612100780084.

6

Zheng, Yinqiang, Guangcan Liu, S. Sugimoto, Shuicheng Yan, and M. Okutomi. "Practical low-rank matrix approximation under robust L1-norm." In 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2012. http://dx.doi.org/10.1109/cvpr.2012.6247828.

7

Alelyani, Salem, and Huan Liu. "Supervised Low Rank Matrix Approximation for Stable Feature Selection." In 2012 Eleventh International Conference on Machine Learning and Applications (ICMLA). IEEE, 2012. http://dx.doi.org/10.1109/icmla.2012.61.

8

Liu, Yang, Wenji Chen, and Yong Guan. "Monitoring Traffic Activity Graphs with low-rank matrix approximation." In 2012 IEEE 37th Conference on Local Computer Networks (LCN 2012). IEEE, 2012. http://dx.doi.org/10.1109/lcn.2012.6423680.

9

Wang, Hengyou, Ruizhen Zhao, Yigang Cen, and Fengzhen Zhang. "Low-rank matrix recovery based on smooth function approximation." In 2016 IEEE 13th International Conference on Signal Processing (ICSP). IEEE, 2016. http://dx.doi.org/10.1109/icsp.2016.7877928.

10

Kaloorazi, Maboud F., and Jie Chen. "Low-rank Matrix Approximation Based on Intermingled Randomized Decomposition." In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8683284.
