Academic literature on the topic 'Sparse Tensors'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Sparse Tensors.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Sparse Tensors"

1

Chou, Stephen, and Saman Amarasinghe. "Compilation of dynamic sparse tensor algebra." Proceedings of the ACM on Programming Languages 6, OOPSLA2 (2022): 1408–37. http://dx.doi.org/10.1145/3563338.

Full text
Abstract:
Many applications, from social network graph analytics to control flow analysis, compute on sparse data that evolves over the course of program execution. Such data can be represented as dynamic sparse tensors and efficiently stored in formats (data layouts) that utilize pointer-based data structures like block linked lists, binary search trees, B-trees, and C-trees among others. These specialized formats support fast in-place modification and are thus better suited than traditional, array-based data structures like CSR for storing dynamic sparse tensors. However, different dynamic sparse tens
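The contrast the abstract draws between array-based formats like CSR and pointer-based dynamic formats can be made concrete. Below is a minimal, illustrative CSR layout in Python (a sketch, not code from the paper); the `pos`/`crd`/`vals` names follow common sparse-compiler conventions, and the point is that a single insertion forces shifts across the whole structure:

```python
# A minimal CSR layout for the 3x4 matrix below (illustrative only):
#   [[0, 5, 0, 0],
#    [3, 0, 0, 2],
#    [0, 0, 7, 0]]
pos  = [0, 1, 3, 4]   # row i's nonzeros occupy indices pos[i]:pos[i+1]
crd  = [1, 0, 3, 2]   # column coordinate of each stored nonzero
vals = [5, 3, 2, 7]   # stored values

def csr_get(i, j):
    """Read entry (i, j) by scanning row i's stored coordinates."""
    for k in range(pos[i], pos[i + 1]):
        if crd[k] == j:
            return vals[k]
    return 0

def csr_insert(i, j, v):
    """Insert a new nonzero at (i, j): every later element shifts and
    every later row's pos entry is bumped, so each insertion is O(nnz)."""
    k = pos[i + 1]
    crd.insert(k, j)
    vals.insert(k, v)
    for r in range(i + 1, len(pos)):
        pos[r] += 1
```

A pointer-based structure such as a per-row search tree would make the same insertion logarithmic rather than linear in the number of stored entries, which is the motivation for the dynamic formats the paper compiles.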
2

Zhang, Genghan, Olivia Hsu, and Fredrik Kjolstad. "Compilation of Modular and General Sparse Workspaces." Proceedings of the ACM on Programming Languages 8, PLDI (2024): 1213–38. http://dx.doi.org/10.1145/3656426.

Full text
Abstract:
Recent years have seen considerable work on compiling sparse tensor algebra expressions. This paper addresses a shortcoming in that work, namely how to generate efficient code (in time and space) that scatters values into a sparse result tensor. We address this shortcoming through a compiler design that generates code that uses sparse intermediate tensors (sparse workspaces) as efficient adapters between compute code that scatters and result tensors that do not support random insertion. Our compiler automatically detects sparse scattering behavior in tensor expressions and inserts necessary in
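The role of a workspace can be illustrated with Gustavson-style row-by-row sparse matrix multiplication, a standard example in this literature. The sketch below is assumed, not the compiler's generated code; it uses a Python dict as a stand-in for a sparse workspace that absorbs scattered updates, so the CSR result is only ever appended to:

```python
def spgemm_csr(A_pos, A_crd, A_val, B_pos, B_crd, B_val):
    """Row-by-row (Gustavson) sparse-times-sparse matrix multiply in CSR."""
    C_pos, C_crd, C_val = [0], [], []
    for i in range(len(A_pos) - 1):
        w = {}  # workspace for row i of C: absorbs random scatter
        for ka in range(A_pos[i], A_pos[i + 1]):
            k, a = A_crd[ka], A_val[ka]
            for kb in range(B_pos[k], B_pos[k + 1]):
                w[B_crd[kb]] = w.get(B_crd[kb], 0) + a * B_val[kb]
        for j in sorted(w):  # compress the workspace into the result
            C_crd.append(j)
            C_val.append(w[j])
        C_pos.append(len(C_crd))
    return C_pos, C_crd, C_val
```

For example, multiplying [[1, 0], [0, 2]] by [[0, 3], [4, 0]] (both in CSR form) yields the CSR arrays of [[0, 3], [8, 0]].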
3

Fang, Jingzhi, Yanyan Shen, Yue Wang, and Lei Chen. "STile: Searching Hybrid Sparse Formats for Sparse Deep Learning Operators Automatically." Proceedings of the ACM on Management of Data 2, no. 1 (2024): 1–26. http://dx.doi.org/10.1145/3639323.

Full text
Abstract:
Sparse operators, i.e., operators that take sparse tensors as input, are of great importance in deep learning models. Due to the diverse sparsity patterns in different sparse tensors, it is challenging to optimize sparse operators by seeking an optimal sparse format, i.e., leading to the lowest operator latency. Existing works propose to decompose a sparse tensor into several parts and search for a hybrid of sparse formats to handle diverse sparse patterns. However, they often make a trade-off between search space and search time: their search spaces are limited in some cases, resulting in lim
4

Liu, Peiming, Alexander J. Root, Anlun Xu, Yinying Li, Fredrik Kjolstad, and Aart J. C. Bik. "Compiler Support for Sparse Tensor Convolutions." Proceedings of the ACM on Programming Languages 8, OOPSLA2 (2024): 275–303. http://dx.doi.org/10.1145/3689721.

Full text
Abstract:
This paper extends prior work on sparse tensor algebra compilers to generate asymptotically efficient code for tensor expressions with affine subscript expressions. Our technique enables compiler support for a wide range of sparse computations, including sparse convolutions and pooling that are widely used in ML and graphics applications. We propose an approach that gradually rewrites compound subscript expressions to simple subscript expressions with loops that exploit the sparsity pattern of the input sparse tensors. As a result, the time complexity of the generated kernels is bounded by the
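The kind of kernel the abstract describes, such as convolution where the subscript x[i + j] is compound, can be sketched by iterating over the nonzeros of the sparse input rather than the dense iteration space. This toy 1-D version is illustrative only, not the compiler's output, and its names and sizes are made up:

```python
def sparse_conv1d(x_nz, f, out_len):
    """y[i] = sum_j x[i + j] * f[j], iterating only over nonzeros of x.

    x_nz: dict mapping position -> value for the sparse input x."""
    y = [0] * out_len
    for p, v in x_nz.items():       # each stored nonzero of x ...
        for j, fv in enumerate(f):  # ... contributes to |f| outputs
            i = p - j               # solve i + j == p for the output index
            if 0 <= i < out_len:
                y[i] += v * fv
    return y
```

Work here is proportional to nnz(x) times the filter size rather than the full input length, which illustrates the kind of asymptotic saving the paper targets.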
5

Hackbusch, W. "A Note on Nonclosed Tensor Formats." Vietnam Journal of Mathematics 48, no. 4 (2019): 621–31. http://dx.doi.org/10.1007/s10013-019-00372-4.

Full text
Abstract:
Various tensor formats exist which allow a data-sparse representation of tensors. Some of these formats are not closed. The consequences are (i) possible non-existence of best approximations and (ii) divergence of the representing parameters when a tensor within the format tends to a border tensor outside. The paper tries to describe the nature of this divergence. A particular question is whether the divergence is uniform for all border tensors.
6

Ahrens, Willow, Teodoro Fields Collin, Radha Patel, Kyle Deeds, Changwan Hong, and Saman Amarasinghe. "Finch: Sparse and Structured Tensor Programming with Control Flow." Proceedings of the ACM on Programming Languages 9, OOPSLA1 (2025): 1042–72. https://doi.org/10.1145/3720473.

Full text
Abstract:
From FORTRAN to NumPy, tensors have revolutionized how we express computation. However, tensors in these, and almost all prominent systems, can only handle dense rectilinear integer grids. Real world tensors often contain underlying structure, such as sparsity, runs of repeated values, or symmetry. Support for structured data is fragmented and incomplete. Existing frameworks limit the tensor structures and program control flow they support to better simplify the problem. In this work, we propose a new programming language, Finch, which supports both flexible control flow and diverse data struc
7

Deeds, Kyle, Willow Ahrens, Magdalena Balazinska, and Dan Suciu. "Galley: Modern Query Optimization for Sparse Tensor Programs." Proceedings of the ACM on Management of Data 3, no. 3 (2025): 1–24. https://doi.org/10.1145/3725301.

Full text
Abstract:
The tensor programming abstraction is a foundational paradigm which allows users to write high performance programs via a high-level imperative interface. Recent work on sparse tensor compilers has extended this paradigm to sparse tensors (i.e., tensors where most entries are not explicitly represented). With these systems, users define the semantics of the program and the algorithmic decisions in a concise language that can be compiled to efficient low-level code. However, these systems still require users to make complex decisions about program structure and memory layouts to write efficient
8

Wang, Shuangyue, and Ziyan Luo. "Sparse Support Tensor Machine with Scaled Kernel Functions." Mathematics 11, no. 13 (2023): 2829. http://dx.doi.org/10.3390/math11132829.

Full text
Abstract:
As one of the supervised tensor learning methods, the support tensor machine (STM) for tensorial data classification is receiving increasing attention in machine learning and related applications, including remote sensing imaging, video processing, fault diagnosis, etc. Existing STM approaches lack consideration for support tensors in terms of data reduction. To address this deficiency, we built a novel sparse STM model to control the number of support tensors in the binary classification of tensorial data. The sparsity is imposed on the dual variables in the context of the feature space, whic
9

Tang, Tao, and Gangyao Kuang. "SAR Image Reconstruction of Vehicle Targets Based on Tensor Decomposition." Electronics 11, no. 18 (2022): 2859. http://dx.doi.org/10.3390/electronics11182859.

Full text
Abstract:
Due to the imaging mechanism of Synthetic Aperture Radars (SARs), the target shape on an SAR image is sensitive to the radar incidence angle and target azimuth, but there is strong correlation and redundancy between adjacent azimuth images of SAR targets. This paper studies multi-angle SAR image reconstruction based on non-negative Tucker decomposition using adjacent azimuth images reconstructed to form a sparse tensor. Sparse tensors are used to perform non-negative Tucker decomposition, resulting in non-negative core tensors and factor matrices. The reconstruction tensor is obtained by calcu
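For readers unfamiliar with Tucker decomposition, the reconstruction step the abstract mentions (a core tensor contracted with factor matrices) can be written in a few lines of NumPy. This is a generic dense sketch with made-up sizes, not the paper's non-negative SAR pipeline:

```python
import numpy as np

# Tucker reconstruction for a 3-way tensor:
# X[i,j,k] = sum_{a,b,c} G[a,b,c] * U1[i,a] * U2[j,b] * U3[k,c]
rng = np.random.default_rng(0)
G  = rng.random((2, 2, 2))   # core tensor (sizes are hypothetical)
U1 = rng.random((4, 2))      # factor matrix for mode 1
U2 = rng.random((5, 2))      # factor matrix for mode 2
U3 = rng.random((3, 2))      # factor matrix for mode 3
X  = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)
```

A non-negative Tucker decomposition, as used in the paper, additionally constrains G and the factor matrices to be elementwise non-negative.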
10

Friedland, Shmuel, Qun Li, and Dan Schonfeld. "Compressive Sensing of Sparse Tensors." IEEE Transactions on Image Processing 23, no. 10 (2014): 4438–47. http://dx.doi.org/10.1109/tip.2014.2348796.

Full text

Dissertations / Theses on the topic "Sparse Tensors"

1

Persu, Elena-Mădălina. "Tensors, sparse problems and conditional hardness." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/120418.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2018. Cataloged from PDF version of thesis. Includes bibliographical references (pages 101-108). In this thesis we study the interplay between theoretical computer science and machine learning in three different directions. First, we make a connection between two ubiquitous sparse problems: Sparse Principal Component Analysis (SPCA) and Sparse Linear Regression (SLR). We show how to efficiently transform a blackbox solver for SLR into an algorithm for SPCA. Assuming the SL
2

Xu, Helen Jiang. "Fill estimation for blocked sparse matrices and tensors." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/117816.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2018. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (pages 69-71). Many sparse matrices and tensors from a variety of applications, such as finite element methods and computational chemistry, have a natural aligned rectangular nonzero block structure. Researchers have designed high-
3

Ahrens, Peter (Peter James). "A parallel fill estimation algorithm for sparse matrices and tensors in blocked formats." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/121653.

Full text
Abstract:
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (pages 27-30). Many sparse matrices and tensors from a variety of applications, such as finite element methods and computational chemistry, have a natural aligned rectangular nonzero block structure. Researchers have designed high-pe
4

Sobral, Andrews Cordolino. "Robust low-rank and sparse decomposition for moving object detection: from matrices to tensors." Thesis, La Rochelle, 2017. http://www.theses.fr/2017LAROS007/document.

Full text
Abstract:
In this thesis manuscript, we introduce recent advances in low-rank and sparse matrix (and tensor) decomposition, as well as contributions that address the main problems in this field. We first present an overview of the most recent matrix and tensor methods and their applications to background modeling and foreground segmentation. Next, we address the problem of background model initialization as a process of reconstruction from missing or corrupted data. A new method
5

Reich, Nils Christopher. "Wavelet compression of anisotropic integrodifferential operators on sparse tensor product spaces." Zürich : ETH, 2008. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=17661.

Full text
6

Kjølstad, Fredrik Berg. "Sparse tensor algebra compilation." Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/128314.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2020. Cataloged from PDF of thesis. Includes bibliographical references (pages 118-128). This dissertation shows how to compile any sparse tensor algebra expression to CPU and GPU code that matches the performance of hand-optimized implementations. A tensor algebra expression is sparse if at least one of its tensor operands is sparse, and a tensor is sparse if most of its values are zero. If a tensor is sparse, then we can store its nonzero values in a compressed data struc
7

Mueller, Suzanne A. "Sparse tensor transpositions in the tensor algebra compiler." Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/129919.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, February 2020. Cataloged from student-submitted PDF of thesis. Includes bibliographical references (pages 89-90). The current state of the art for transposing sparse tensors involves converting the sparse tensor into a list of coordinates, sorting the list of coordinates, and finally packing the list of coordinates into the desired sparse tensor format. This thesis explores the potential for faster methodologies. Its main contributions are an algorithm that exploits partia
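The baseline pipeline the abstract describes (convert to coordinates, sort, pack) can be sketched directly. Below is an illustrative COO-to-COO version in Python; a real pipeline would pack the sorted coordinates into a compressed format as a final step:

```python
def transpose_coo(coords, vals, perm):
    """Transpose a COO tensor: permute each coordinate tuple by perm,
    then sort the entries lexicographically (the 'pack' step would follow)."""
    permuted = [tuple(c[p] for p in perm) for c in coords]
    order = sorted(range(len(vals)), key=lambda t: permuted[t])
    return [permuted[t] for t in order], [vals[t] for t in order]
```

Sorting dominates the cost of this baseline, which is why the thesis looks for faster alternatives.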
8

Tew, Parker Allen. "An investigation of sparse tensor formats for tensor libraries." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/113496.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016. Cataloged from PDF version of thesis. Includes bibliographical references (pages 52-53). Tensors provide a generalized structure to store arbitrary indexable data, which is applicable in fields such as chemometrics, physics simulations, and signal processing, and lies at the heart of machine learning. Many naturally occurring tensors are considered sparse as they contain mostly zero values. As with sparse matrices, various techniques can be employed to more efficiently s
9

Van, Zyl Augustinus Johannes. "Metrical aspects of the complexification of tensor products and tensor norms." Thesis, Pretoria : [s.n.], 2009. http://upetd.up.ac.za/thesis/available/etd-07142009-180520.

Full text
10

Chou, Stephen. "Unified sparse formats for tensor algebra compilers." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/115625.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2018. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (pages 99-105). Tensor algebra is a powerful tool for computing on multidimensional data and has applications in many fields. Practical applications often deal with tensors that are sparse, and there exists a wide variety of format

Books on the topic "Sparse Tensors"

1

Pulliam, Thomas H., and Research Institute for Advanced Computer Science (U.S.), eds. Tensor-GMRES method for large sparse systems of nonlinear equations. Research Institute for Advanced Computer Science, NASA Ames Research Center, 1994.

Find full text
2

Hackbusch, Wolfgang. Tensor Spaces and Numerical Tensor Calculus. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-28027-6.

Full text
3

Hackbusch, Wolfgang. Tensor Spaces and Numerical Tensor Calculus. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-35554-8.

Full text
4

Hackbusch, Wolfgang. Tensor Spaces and Numerical Tensor Calculus. Springer Berlin Heidelberg, 2012.

Find full text
5

Yokonuma, Takeo. Tensor spaces and exterior algebra. American Mathematical Society, 1992.

Find full text
6

Light, William Allan, and Elliott Ward Cheney. Approximation Theory in Tensor Product Spaces. Springer Berlin Heidelberg, 1985. http://dx.doi.org/10.1007/bfb0075391.

Full text
7

Cheney, E. W., ed. Approximation theory in tensor product spaces. Springer-Verlag, 1985.

Find full text
8

Pisier, Gilles. The operator Hilbert space OH, complex interpolation, and tensor norms. American Mathematical Society, 1996.

Find full text
9

Ryan, Raymond A. Introduction to Tensor Products of Banach Spaces. Springer London, 2002. http://dx.doi.org/10.1007/978-1-4471-3903-4.

Full text
10

Weatherburn, C. E. An introduction to Riemannian geometry and the tensor calculus. Cambridge University Press, 2008.

Find full text

Book chapters on the topic "Sparse Tensors"

1

Cavdar, Derya, Valeriu Codreanu, Can Karakus, et al. "Densifying Assumed-Sparse Tensors." In Lecture Notes in Computer Science. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20656-7_2.

Full text
2

Deussen, Jens, and Uwe Naumann. "Efficient Computation of Sparse Higher Derivative Tensors." In Lecture Notes in Computer Science. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-22734-0_1.

Full text
3

Friedland, Shmuel, Qun Li, Dan Schonfeld, and Edgar A. Bernal. "Two Algorithms for Compressed Sensing of Sparse Tensors." In Compressed Sensing and its Applications. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-16042-9_9.

Full text
4

Smith, Shaden, and George Karypis. "Accelerating the Tucker Decomposition with Compressed Sparse Tensors." In Lecture Notes in Computer Science. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-64203-1_47.

Full text
5

Shivakumar, Shruti, Jiajia Li, Ramakrishnan Kannan, and Srinivas Aluru. "Efficient Parallel Sparse Symmetric Tucker Decomposition for High-Order Tensors." In SIAM Conference on Applied and Computational Discrete Algorithms (ACDA21). Society for Industrial and Applied Mathematics, 2021. http://dx.doi.org/10.1137/1.9781611976830.18.

Full text
6

Moroz, Guillaume. "Sparse Tensors and Subdivision Methods for Finding the Zero Set of Polynomial Equations." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-69070-9_14.

Full text
7

Dimitrienko, Yu I. "Tensors in Riemannian Spaces and Affinely Connected Spaces." In Tensor Analysis and Nonlinear Tensor Functions. Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-017-3221-5_8.

Full text
8

Hackbusch, Wolfgang. "Banach Tensor Spaces." In Tensor Spaces and Numerical Tensor Calculus. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-28027-6_4.

Full text
9

Hackbusch, Wolfgang. "Banach Tensor Spaces." In Tensor Spaces and Numerical Tensor Calculus. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-35554-8_4.

Full text
10

Hackbusch, Wolfgang. "Tensor Spaces." In Hierarchical Matrices: Algorithms and Analysis. Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-47324-5_16.

Full text

Conference papers on the topic "Sparse Tensors"

1

Colucci, Alessio, Andreas Steininger, and Muhammad Shafique. "SBanTEM: A Novel Methodology for Sparse Band Tensors as Soft-Error Mitigation in Sparse Convolutional Neural Networks." In 2024 IEEE 30th International Symposium on On-Line Testing and Robust System Design (IOLTS). IEEE, 2024. http://dx.doi.org/10.1109/iolts60994.2024.10616070.

Full text
2

Koul, Kalhan, Maxwell Strange, Jackson Melchert, et al. "Onyx: A Programmable Accelerator for Sparse Tensor Algebra." In 2024 IEEE Hot Chips 36 Symposium (HCS). IEEE, 2024. http://dx.doi.org/10.1109/hcs61935.2024.10665150.

Full text
3

Peltekis, Christodoulos, Chrysostomos Nicopoulos, and Giorgos Dimitrakopoulos. "Periodic Online Testing for Sparse Systolic Tensor Arrays." In 2025 14th International Conference on Modern Circuits and Systems Technologies (MOCAST). IEEE, 2025. https://doi.org/10.1109/mocast65744.2025.11083733.

Full text
4

Ahn, Dawon, Jun-Gi Jang, and U. Kang. "Time-Aware Tensor Decomposition for Sparse Tensors." In 2021 IEEE 8th International Conference on Data Science and Advanced Analytics (DSAA). IEEE, 2021. http://dx.doi.org/10.1109/dsaa53316.2021.9564142.

Full text
5

Ahn, Dawon, Uday Singh Saini, Evangelos E. Papalexakis, and Ali Payani. "Neural Additive Tensor Decomposition for Sparse Tensors." In CIKM '24: The 33rd ACM International Conference on Information and Knowledge Management. ACM, 2024. http://dx.doi.org/10.1145/3627673.3679833.

Full text
6

Abo Khamis, Mahmoud, Hung Q. Ngo, XuanLong Nguyen, Dan Olteanu, and Maximilian Schleich. "In-Database Learning with Sparse Tensors." In SIGMOD/PODS '18: International Conference on Management of Data. ACM, 2018. http://dx.doi.org/10.1145/3196959.3196960.

Full text
7

Zhou, Shuo, Sarah Erfani, and James Bailey. "Online CP Decomposition for Sparse Tensors." In 2018 IEEE International Conference on Data Mining (ICDM). IEEE, 2018. http://dx.doi.org/10.1109/icdm.2018.00202.

Full text
8

Li, Jiajia, Jimeng Sun, and Richard Vuduc. "HiCOO: Hierarchical Storage of Sparse Tensors." In SC18: International Conference for High Performance Computing, Networking, Storage and Analysis. IEEE, 2018. http://dx.doi.org/10.1109/sc.2018.00022.

Full text
9

Jain, Swayambhoo, Alexander Gutierrez, and Jarvis Haupt. "Noisy tensor completion for tensors with a sparse canonical polyadic factor." In 2017 IEEE International Symposium on Information Theory (ISIT). IEEE, 2017. http://dx.doi.org/10.1109/isit.2017.8006910.

Full text
10

Baskaran, Muthu, Benoit Meister, Nicolas Vasilache, and Richard Lethin. "Efficient and scalable computations with sparse tensors." In 2012 IEEE Conference on High Performance Extreme Computing (HPEC). IEEE, 2012. http://dx.doi.org/10.1109/hpec.2012.6408676.

Full text

Reports on the topic "Sparse Tensors"

1

Bader, Brett William, and Tamara Gibson Kolda. Efficient MATLAB computations with sparse and factored tensors. Office of Scientific and Technical Information (OSTI), 2006. http://dx.doi.org/10.2172/897641.

Full text
2

Devine, Karen, and Grey Ballard. GentenMPI: Distributed Memory Sparse Tensor Decomposition. Office of Scientific and Technical Information (OSTI), 2020. http://dx.doi.org/10.2172/1656940.

Full text
3

Bouaricha, A. Tensor methods for large, sparse unconstrained optimization. Office of Scientific and Technical Information (OSTI), 1996. http://dx.doi.org/10.2172/409872.

Full text
4

Myers, Jeremy, Daniel Dunlavy, Keita Teranishi, and David Hollman. Parameter Sensitivity Analysis of the SparTen High Performance Sparse Tensor Decomposition Software: Extended Analysis. Office of Scientific and Technical Information (OSTI), 2020. http://dx.doi.org/10.2172/1706215.

Full text
5

Bouaricha, A., and R. B. Schnabel. Tensor methods for large sparse systems of nonlinear equations. Office of Scientific and Technical Information (OSTI), 1996. http://dx.doi.org/10.2172/434848.

Full text
6

Lopez, Oscar, Richard Lehoucq, and Daniel Dunlavy. Zero-Truncated Poisson Tensor Decomposition for Sparse Count Data. Office of Scientific and Technical Information (OSTI), 2022. http://dx.doi.org/10.2172/1841834.

Full text
7

Clayton, John D., David L. McDowell, and Douglas J. Bammann. Anholonomic Configuration Spaces and Metric Tensors in Finite Elastoplasticity. Defense Technical Information Center, 2006. http://dx.doi.org/10.21236/ada445112.

Full text
8

Bouaricha, A. STENMIN: A software package for large, sparse unconstrained optimization using tensor methods. Office of Scientific and Technical Information (OSTI), 1996. http://dx.doi.org/10.2172/399726.

Full text
9

Geronimo Anderson, Sean Isaac, and Daniel Dunlavy. Computing Sparse Tensor Decompositions via Chapel and C++/MPI Interoperability without Intermediate I/O. Office of Scientific and Technical Information (OSTI), 2023. http://dx.doi.org/10.2172/2430185.

Full text
10

Staples, Henry, Ozge Ozduzen, Vania Rolon, and Nelli Ferenczi. Spatial Aspects of De-Radicalisation Processes in London. Glasgow Caledonian University, 2025. https://doi.org/10.59019/akdbcf61.

Full text
Abstract:
This report investigates how Windrush Square, a public space in London (UK) is experienced by people from diverse demographic backgrounds, and how these everyday interactions and tensions shape experiences of social cohesion and alienation. This space was selected as a case study as it captures the underlying tensions of the UK’s colonial past and its continuing impact in the present day. We first conducted three expert interviews to shed light on the role of the Greater London Authority in public space governance, landscape design and wellbeing of migrant populations, and community-led neighb