Academic literature on the topic 'Dimensionality reduction'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, book chapters, conference papers, reports, and other scholarly sources on the topic 'Dimensionality reduction.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Dimensionality reduction"

1. Cheng, Long, Chenyu You, and Yani Guan. "Random Projections for Non-linear Dimensionality Reduction." International Journal of Machine Learning and Computing 6, no. 4 (2016): 220–25. http://dx.doi.org/10.18178/ijmlc.2016.6.4.601.

2. Marchette, David J., and Wendy L. Poston. "Local dimensionality reduction." Computational Statistics 14, no. 4 (1999): 469–89. http://dx.doi.org/10.1007/s001800050026.

3. Sun, Yu-Yin, Michael Ng, and Zhi-Hua Zhou. "Multi-Instance Dimensionality Reduction." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (2010): 587–92. http://dx.doi.org/10.1609/aaai.v24i1.7700.
Abstract: Multi-instance learning deals with problems that treat bags of instances as training examples. In single-instance learning problems, dimensionality reduction is an essential step for high-dimensional data analysis and has been studied for years. The curse of dimensionality also exists in multi-instance learning tasks, yet this difficult task has not been studied before. Direct application of existing single-instance dimensionality reduction objectives to multi-instance learning tasks may not work well since it ignores the characteristic of multi-instance learning that the labels of bags are known […] (A sketch of the naive single-instance baseline this abstract cautions against is given after this list.)

4. Koren, Y., and L. Carmel. "Robust linear dimensionality reduction." IEEE Transactions on Visualization and Computer Graphics 10, no. 4 (2004): 459–70. http://dx.doi.org/10.1109/tvcg.2004.17.

5. Lotlikar, R., and R. Kothari. "Fractional-step dimensionality reduction." IEEE Transactions on Pattern Analysis and Machine Intelligence 22, no. 6 (2000): 623–27. http://dx.doi.org/10.1109/34.862200.

6. Gottlieb, Lee-Ad, Aryeh Kontorovich, and Robert Krauthgamer. "Adaptive metric dimensionality reduction." Theoretical Computer Science 620 (March 2016): 105–18. http://dx.doi.org/10.1016/j.tcs.2015.10.040.

7. Pang, Rich, Benjamin J. Lansdell, and Adrienne L. Fairhall. "Dimensionality reduction in neuroscience." Current Biology 26, no. 14 (2016): R656–R660. http://dx.doi.org/10.1016/j.cub.2016.05.029.

8. Lovaglio, Pietro Giorgio, and Giorgio Vittadini. "Multilevel dimensionality-reduction methods." Statistical Methods & Applications 22, no. 2 (2012): 183–207. http://dx.doi.org/10.1007/s10260-012-0215-2.

9. Carter, Kevin, Raviv Raich, William Finn, and Alfred Hero, III. "Information-Geometric Dimensionality Reduction." IEEE Signal Processing Magazine 28, no. 2 (2011): 89–99. http://dx.doi.org/10.1109/msp.2010.939536.

10. Gonen, Mehmet. "Bayesian Supervised Dimensionality Reduction." IEEE Transactions on Cybernetics 43, no. 6 (2013): 2179–89. http://dx.doi.org/10.1109/tcyb.2013.2245321.
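
The abstract of entry 3 above notes that naively applying a single-instance dimensionality reduction objective to multi-instance (bag-of-instances) data ignores the fact that labels are attached to bags rather than instances. The sketch below shows only that naive baseline for contrast: instances from all bags are pooled, one PCA is fitted, and each bag is summarised by the mean of its projected instances. The bag sizes and dimensions are made up for illustration; this is not the MIDR method proposed by Sun, Ng, and Zhou.

```python
# Naive single-instance baseline for multi-instance data (illustrative only):
# pool all instances, fit one PCA, then summarise each bag by its mean projection.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
bags = [rng.normal(size=(rng.integers(3, 8), 50)) for _ in range(20)]  # 20 bags of 50-D instances

pooled = np.vstack(bags)               # ignore the bag structure entirely
pca = PCA(n_components=5).fit(pooled)  # single-instance objective on pooled instances

bag_features = np.array([pca.transform(b).mean(axis=0) for b in bags])
print(bag_features.shape)              # (20, 5): one 5-D vector per bag
```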

Dissertations / Theses on the topic "Dimensionality reduction"

1. Ariu, Kaito. "Online Dimensionality Reduction." Licentiate thesis, KTH, Reglerteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-290791.
Abstract: In this thesis, we investigate online dimensionality reduction methods, where the algorithms learn by sequentially acquiring data. We focus on two specific algorithm design problems in (i) recommender systems and (ii) heterogeneous clustering from binary user feedback. (i) For recommender systems, we consider a system consisting of m users and n items. In each round, a user, selected uniformly at random, arrives to the system and requests a recommendation. The algorithm observes the user id and recommends an item from the item set. A notable restriction here is that the same item cannot be recommended […]

2. Legramanti, Sirio. "Bayesian dimensionality reduction." Doctoral thesis, Università Bocconi, 2021. http://hdl.handle.net/11565/4035711.
Abstract: We are currently witnessing an explosion in the amount of available data. Such growth involves not only the number of data points but also their dimensionality. This poses new challenges to statistical modeling and computations, thus making dimensionality reduction more central than ever. In the present thesis, we provide methodological, computational and theoretical advancements in Bayesian dimensionality reduction via novel structured priors. Namely, we develop a new increasing shrinkage prior and illustrate how it can be employed to discard redundant dimensions in G[…]

3. Baldiwala, Aliakbar. "Dimensionality Reduction for Commercial Vehicle Fleet Monitoring." Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/38330.
Abstract: A variety of new features have been added in present-day vehicles, like pre-crash warning, vehicle-to-vehicle communication, semi-autonomous driving systems, telematics, and drive by wire. They demand very high bandwidth from in-vehicle networks. Various electronic control units present inside the automotive transmit useful information via automotive multiplexing. Automotive multiplexing allows sharing information among various intelligent modules inside an automotive electronic system. Optimum functionality is achieved by transmitting this data in real time. The high bandwidth and high-[…]

4. Bolelli, Maria Virginia. "Diffusion Maps for Dimensionality Reduction." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/18246/.
Abstract: In this thesis we present the diffusion maps, a framework based on diffusion processes for finding meaningful geometric descriptions of data sets. A diffusion process can be described via an iterative application of the heat kernel which has two main characteristics: it satisfies a Markov semigroup property and its level sets encode all geometric features of the space. This process, well known in regular manifolds, has been extended to general data sets by Coifman and Lafon. They define a diffusion kernel starting from the geometric properties of the data and their density properties. This kernel […] (A minimal numerical sketch of a diffusion-map embedding is given after this list.)

5. Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis." Griffith University, School of Engineering, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20061010.151217.
Abstract: In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to solve the problems of high dimensions is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, the dimensionality reduction process becomes the pre-processing stage of the pattern recognition system. In addition to this, probability density estimation with fewer variables […] (A sketch of this reduce-then-classify pipeline is given after this list.)

6. Vamulapalli, Harika Rao. "On Dimensionality Reduction of Data." ScholarWorks@UNO, 2010. http://scholarworks.uno.edu/td/1211.
Abstract: The random projection method is one of the important tools for the dimensionality reduction of data which can be made efficient with strong error guarantees. In this thesis, we focus on linear transforms of high-dimensional data to the low-dimensional space satisfying the Johnson-Lindenstrauss lemma. In addition, we also prove some theoretical results relating to the projections that are of interest when applying them in practical applications. We show how the technique can be applied to synthetic data with a probabilistic guarantee on the pairwise distance. The connection between dimensionality reduction […] (A minimal random-projection sketch is given after this list.)

7. Widemann, David P. "Dimensionality reduction for hyperspectral data." College Park, Md.: University of Maryland, 2008. http://hdl.handle.net/1903/8448.
Abstract: Thesis (Ph. D.) -- University of Maryland, College Park, 2008. Thesis research directed by: Dept. of Mathematics. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.

8. Khosla, Nitin. "Dimensionality Reduction Using Factor Analysis." Thesis, Griffith University, 2006. http://hdl.handle.net/10072/366058.
Abstract: In many pattern recognition applications, a large number of features are extracted in order to ensure an accurate classification of unknown classes. One way to solve the problems of high dimensions is to first reduce the dimensionality of the data to a manageable size, keeping as much of the original information as possible, and then feed the reduced-dimensional data into a pattern recognition system. In this situation, the dimensionality reduction process becomes the pre-processing stage of the pattern recognition system. In addition to this, probability density estimation with fewer variables […]

9. Sætrom, Jon. "Reduction of Dimensionality in Spatiotemporal Models." Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-11247.

10. Ghodsi, Boushehri Ali. "Nonlinear Dimensionality Reduction with Side Information." Thesis, University of Waterloo, 2006. http://hdl.handle.net/10012/1020.
Abstract: In this thesis, I look at three problems with important applications in data processing. Incorporating side information, provided by the user or derived from data, is a main theme of each of these problems. This thesis makes a number of contributions. The first is a technique for combining different embedding objectives, which is then exploited to incorporate side information expressed in terms of transformation invariants known to hold in the data. It also introduces two different ways of incorporating transformation invariants in order to make new similarity measures […]
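
The diffusion-maps abstract (entry 4) describes building a kernel from the data, normalising it into a Markov transition matrix, and reading the embedding off its leading eigenvectors, following Coifman and Lafon. Below is a minimal numerical sketch of that pipeline with a Gaussian kernel; the bandwidth, diffusion time, and toy data are arbitrary illustrative choices, not values taken from the thesis.

```python
# Minimal diffusion-map embedding: Gaussian kernel -> Markov matrix -> spectral embedding.
import numpy as np

def diffusion_map(X, n_components=2, epsilon=1.0, t=1):
    # Pairwise squared distances and Gaussian (heat) kernel.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / epsilon)
    # Row-normalise into a Markov transition matrix.
    P = K / K.sum(axis=1, keepdims=True)
    # Eigendecompose; the trivial eigenvalue 1 (constant eigenvector) is dropped.
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Diffusion coordinates: nontrivial eigenvectors scaled by eigenvalues^t.
    return vecs[:, 1:n_components + 1] * (vals[1:n_components + 1] ** t)

X = np.random.default_rng(1).normal(size=(100, 10))  # toy data
Y = diffusion_map(X, n_components=2, epsilon=2.0)
print(Y.shape)  # (100, 2)
```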
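
Entries 5 and 8 (the same factor-analysis thesis held in two repositories) describe dimensionality reduction as the pre-processing stage in front of a pattern recognition system. A minimal scikit-learn sketch of that reduce-then-classify pipeline follows; the synthetic data, number of factors, and choice of classifier are placeholders, not the setup used in the thesis.

```python
# Reduce-then-classify: factor analysis as the pre-processing stage of a classifier.
from sklearn.datasets import make_classification
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=300, n_features=60, n_informative=10, random_state=0)
pipe = make_pipeline(FactorAnalysis(n_components=10, random_state=0),
                     LogisticRegression(max_iter=1000))
print(cross_val_score(pipe, X, y, cv=5).mean())  # accuracy with 10 latent factors
```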
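
Entry 6 concerns random projections satisfying the Johnson-Lindenstrauss lemma: linear maps to a much lower dimension that preserve pairwise distances up to a small distortion with high probability. The sketch below draws a Gaussian projection matrix scaled by 1/sqrt(k) and measures the pairwise distance distortion on synthetic data; the dimensions and sample size are arbitrary illustrative values, not figures from the thesis.

```python
# Gaussian random projection (Johnson-Lindenstrauss style) and its pairwise distortion.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(42)
n, d, k = 200, 1000, 100                  # points, original dimension, target dimension
X = rng.normal(size=(n, d))               # synthetic high-dimensional data
R = rng.normal(size=(d, k)) / np.sqrt(k)  # scaling keeps norms preserved in expectation
Y = X @ R

ratios = pdist(Y) / pdist(X)              # projected vs. original pairwise distances; concentrates around 1
print(f"distortion in [{ratios.min():.3f}, {ratios.max():.3f}]")
```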

Books on the topic "Dimensionality reduction"

1. Lee, John A., and Michel Verleysen, eds. Nonlinear Dimensionality Reduction. Springer New York, 2007. http://dx.doi.org/10.1007/978-0-387-39351-3.

2. Lespinats, Sylvain, Benoit Colange, and Denys Dutykh. Nonlinear Dimensionality Reduction Techniques. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-81026-9.

3. Garzon, Max, Ching-Chi Yang, Deepak Venugopal, Nirman Kumar, Kalidas Jana, and Lih-Yuan Deng, eds. Dimensionality Reduction in Data Science. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-05371-9.

4. Paul, Arati, and Nabendu Chaki. Dimensionality Reduction of Hyperspectral Imagery. Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-42667-4.

5. Strange, Harry, and Reyer Zwiggelaar. Open Problems in Spectral Dimensionality Reduction. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-03943-5.

6. Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7.

7. Kramer, Oliver. Dimensionality Reduction with Unsupervised Nearest Neighbors. Springer Berlin Heidelberg, 2013.

8. Shaw, Blake. Graph Embedding and Nonlinear Dimensionality Reduction. [publisher not identified], 2011.

9. Ghojogh, Benyamin, Mark Crowley, Fakhri Karray, and Ali Ghodsi. Elements of Dimensionality Reduction and Manifold Learning. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-10602-6.

10. Wang, Jianzhong. Geometric Structure of High-Dimensional Data and Dimensionality Reduction. Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-27497-8.

Book chapters on the topic "Dimensionality reduction"

1. Herrera, Francisco, Francisco Charte, Antonio J. Rivera, and María J. del Jesus. "Dimensionality Reduction." In Multilabel Classification. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41111-8_7.

2. Kramer, Oliver. "Dimensionality Reduction." In Dimensionality Reduction with Unsupervised Nearest Neighbors. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38652-7_4.

3. Hull, Isaiah. "Dimensionality Reduction." In Machine Learning for Economics and Finance in TensorFlow 2. Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-6373-0_8.

4. Shen, Heng Tao. "Dimensionality Reduction." In Encyclopedia of Database Systems. Springer New York, 2017. http://dx.doi.org/10.1007/978-1-4899-7993-3_551-2.

5. Webb, Geoffrey I., Johannes Fürnkranz, et al. "Dimensionality Reduction." In Encyclopedia of Machine Learning. Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_216.

6. Dinov, Ivo D. "Dimensionality Reduction." In Data Science and Predictive Analytics. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-72347-1_6.

7. Shen, Heng Tao. "Dimensionality Reduction." In Encyclopedia of Database Systems. Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_551.

8. Mathar, Rudolf, Gholamreza Alirezaei, Emilio Balda, and Arash Behboodi. "Dimensionality Reduction." In Fundamentals of Data Analytics. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-56831-3_4.

9. Durstewitz, Daniel. "Dimensionality Reduction." In Advanced Data Analysis in Neuroscience. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59976-2_6.

10. Braga-Neto, Ulisses. "Dimensionality Reduction." In Fundamentals of Pattern Recognition and Machine Learning. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-27656-0_9.

Conference papers on the topic "Dimensionality reduction"

1. Bunte, Kerstin, Michael Biehl, and Barbara Hammer. "Dimensionality reduction mappings." In 2011 IEEE Symposium on Computational Intelligence and Data Mining (CIDM) - Part of 2011 SSCI. IEEE, 2011. http://dx.doi.org/10.1109/cidm.2011.5949443.

2. Schclar, Alon, and Amir Averbuch. "Diffusion Bases Dimensionality Reduction." In 7th International Conference on Neural Computation Theory and Applications. SCITEPRESS - Science and Technology Publications, 2015. http://dx.doi.org/10.5220/0005625301510156.

3. Bingham, Ella, Aristides Gionis, Niina Haiminen, Heli Hiisilä, Heikki Mannila, and Evimaria Terzi. "Segmentation and dimensionality reduction." In Proceedings of the 2006 SIAM International Conference on Data Mining. Society for Industrial and Applied Mathematics, 2006. http://dx.doi.org/10.1137/1.9781611972764.33.

4. Zhang, Daoqiang, Zhi-Hua Zhou, and Songcan Chen. "Semi-Supervised Dimensionality Reduction." In Proceedings of the 2007 SIAM International Conference on Data Mining. Society for Industrial and Applied Mathematics, 2007. http://dx.doi.org/10.1137/1.9781611972771.73.

5. Guo, Ce, and Wayne Luk. "Quantisation-aware Dimensionality Reduction." In 2020 International Conference on Field-Programmable Technology (ICFPT). IEEE, 2020. http://dx.doi.org/10.1109/icfpt51103.2020.00041.

6. Zhu, Xiaofeng, Cong Lei, Hao Yu, Yonggang Li, Jiangzhang Gan, and Shichao Zhang. "Robust Graph Dimensionality Reduction." In Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI-18). International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/452.
Abstract: In this paper, we propose conducting Robust Graph Dimensionality Reduction (RGDR) by learning a transformation matrix to map original high-dimensional data into their low-dimensional intrinsic space without the influence of outliers. To do this, we propose simultaneously 1) adaptively learning three variables, i.e., a reverse graph embedding of original data, a transformation matrix, and a graph matrix preserving the local similarity of original data in their low-dimensional intrinsic space; and 2) employing robust estimators to avoid outliers involving the processes of optimizing these three matrices […] (A sketch of the classical graph-based embedding this line of work builds on is given after this list.)

7. Gashler, Mike, and Tony Martinez. "Temporal nonlinear dimensionality reduction." In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033465.

8. Heylen, Rob, and Paul Scheunders. "Nonlinear barycentric dimensionality reduction." In 2010 17th IEEE International Conference on Image Processing (ICIP 2010). IEEE, 2010. http://dx.doi.org/10.1109/icip.2010.5653675.

9. Mosci, Sofia, Lorenzo Rosasco, and Alessandro Verri. "Dimensionality reduction and generalization." In Proceedings of the 24th International Conference on Machine Learning. ACM Press, 2007. http://dx.doi.org/10.1145/1273496.1273579.

10. Luo, Xianghui, and Robert J. Durrant. "Maximum Gradient Dimensionality Reduction." In 2018 24th International Conference on Pattern Recognition (ICPR). IEEE, 2018. http://dx.doi.org/10.1109/icpr.2018.8546198.
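
The RGDR abstract (entry 6) turns on a graph matrix that preserves local similarity in the low-dimensional space. The abstract does not give enough detail to reproduce the method, but the classical graph-based embedding it relates to, a Laplacian-eigenmaps-style spectral embedding of a k-nearest-neighbour graph, can be sketched as follows; the toy data and neighbourhood size are illustrative assumptions, and this is not the robust estimator proposed in the paper.

```python
# Spectral embedding of a k-NN similarity graph (Laplacian-eigenmaps flavour).
import numpy as np
from sklearn.manifold import SpectralEmbedding

X = np.random.default_rng(3).normal(size=(150, 20))      # toy high-dimensional data
emb = SpectralEmbedding(n_components=2, n_neighbors=10)  # k-NN graph + graph Laplacian eigenvectors
Y = emb.fit_transform(X)
print(Y.shape)  # (150, 2)
```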

Reports on the topic "Dimensionality reduction"

1. Jain, Anil K. Classification, Clustering and Dimensionality Reduction. Defense Technical Information Center, 2008. http://dx.doi.org/10.21236/ada483446.

2. Wolf, Lior, and Stanley Bileschi. Combining Variable Selection with Dimensionality Reduction. Defense Technical Information Center, 2005. http://dx.doi.org/10.21236/ada454990.

3. Jones, Michael J. Using Recurrent Networks for Dimensionality Reduction. Defense Technical Information Center, 1992. http://dx.doi.org/10.21236/ada259497.

4. León, Carlos. Detecting anomalous payments networks: A dimensionality reduction approach. Banco de la República de Colombia, 2019. http://dx.doi.org/10.32468/be.1098.

5. Sarwar, Badrul, George Karypis, Joseph Konstan, and John Riedl. Application of Dimensionality Reduction in Recommender System - A Case Study. Defense Technical Information Center, 2000. http://dx.doi.org/10.21236/ada439541.

6. Fukumizu, Kenji, Francis R. Bach, and Michael I. Jordan. Dimensionality Reduction for Supervised Learning With Reproducing Kernel Hilbert Spaces. Defense Technical Information Center, 2003. http://dx.doi.org/10.21236/ada446572.

7. Nichols, Jonathan M., Frank Bucholtz, and Joseph V. Michalowicz. Intelligent Data Fusion Using Sparse Representations and Nonlinear Dimensionality Reduction. Defense Technical Information Center, 2009. http://dx.doi.org/10.21236/ada507109.

8. Vales, C., Y. Choi, D. Copeland, and S. Cheung. Energy conserving quadrature based dimensionality reduction for nonlinear hydrodynamics problems. Office of Scientific and Technical Information (OSTI), 2023. http://dx.doi.org/10.2172/1995059.

9. Oskolkov, Nikolay. Dimension Reduction Methods for Life Sciences. Instats Inc., 2024. http://dx.doi.org/10.61700/gyxh9ued08xio1347.
Abstract: This seminar provides a comprehensive overview of dimension reduction techniques in R and Python for high-dimensional biological data, focusing on their practical applications in life sciences. Participants will gain both theoretical knowledge and practical experience in linear and nonlinear dimensionality reduction methods such as tSNE and UMAP, enhancing their ability to analyze complex datasets effectively. By the conclusion of the seminar, participants will understand the theoretical and practical foundations of these methods, with a wealth of examples that can be rapidly applied for their […] (A minimal t-SNE sketch is given after this list.)

10. Ho, Tu Bao. Methods of Sparse Modeling and Dimensionality Reduction to Deal with Big Data. Defense Technical Information Center, 2015. http://dx.doi.org/10.21236/ada623178.
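
The seminar abstract in entry 9 mentions nonlinear methods such as tSNE and UMAP for high-dimensional biological data. A minimal Python sketch of a t-SNE embedding with scikit-learn is given below (UMAP would be analogous via the separate umap-learn package); the digits dataset and perplexity value are illustrative choices, not material from the seminar.

```python
# 2-D t-SNE embedding of a small high-dimensional dataset with scikit-learn.
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)  # 1797 samples, 64 features
Y = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
print(Y.shape)  # (1797, 2); each row is the 2-D embedding of one digit image
```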