
Journal articles on the topic 'Subspace approach'


Consult the top 50 journal articles for your research on the topic 'Subspace approach.'

Next to each source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Vijendra, Singh, and Sahoo Laxman. "Subspace Clustering of High-Dimensional Data: An Evolutionary Approach." Applied Computational Intelligence and Soft Computing 2013 (2013): 1–12. http://dx.doi.org/10.1155/2013/863146.

Full text
Abstract:
Clustering high-dimensional data has been a major challenge due to the inherent sparsity of the points. Most existing clustering algorithms become substantially inefficient if the required similarity measure is computed between data points in the full-dimensional space. In this paper, we have presented a robust multi objective subspace clustering (MOSCL) algorithm for the challenging problem of high-dimensional clustering. The first phase of MOSCL performs subspace relevance analysis by detecting dense and sparse regions with their locations in data set. After detection of dense regions it eli
APA, Harvard, Vancouver, ISO, and other styles
2

Blajer, W. "A Projection Method Approach to Constrained Dynamic Analysis." Journal of Applied Mechanics 59, no. 3 (1992): 643–49. http://dx.doi.org/10.1115/1.2893772.

Full text
Abstract:
The paper presents a unified approach to the dynamic analysis of mechanical systems subject to (ideal) holonomic and/or nonholonomic constraints. The approach is based on the projection of the initial (constraint reaction-containing) dynamical equations into the orthogonal and tangent subspaces; the orthogonal subspace which is spanned by the constraint vectors, and the tangent subspace which complements the orthogonal subspace in the system’s configuration space. The tangential projection gives the reaction-free (or purely kinetic) equations of motion, whereas the orthogonal projection determ
3

Fatehi, Kavan, Mohsen Rezvani, Mansoor Fateh, and Mohammad-Reza Pajoohan. "Subspace Clustering for High-Dimensional Data Using Cluster Structure Similarity." International Journal of Intelligent Information Technologies 14, no. 3 (2018): 38–55. http://dx.doi.org/10.4018/ijiit.2018070103.

Full text
Abstract:
This article describes how recently, because of the curse of dimensionality in high dimensional data, a significant amount of research has been conducted on subspace clustering aiming at discovering clusters embedded in any possible attributes combination. The main goal of subspace clustering algorithms is to find all clusters in all subspaces. Previous studies have mostly been generating redundant subspace clusters, leading to clustering accuracy loss and also increasing the running time of the algorithms. A bottom-up density-based approach is suggested in this article, in which the cluster s
4

Wang, Xing, Jun Wang, Carlotta Domeniconi, Guoxian Yu, Guoqiang Xiao, and Maozu Guo. "Multiple Independent Subspace Clusterings." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5353–60. http://dx.doi.org/10.1609/aaai.v33i01.33015353.

Full text
Abstract:
Multiple clustering aims at discovering diverse ways of organizing data into clusters. Despite the progress made, it’s still a challenge for users to analyze and understand the distinctive structure of each output clustering. To ease this process, we consider diverse clusterings embedded in different subspaces, and analyze the embedding subspaces to shed light into the structure of each clustering. To this end, we provide a two-stage approach called MISC (Multiple Independent Subspace Clusterings). In the first stage, MISC uses independent subspace analysis to seek multiple and statistical ind
5

Sia, Florence, and Rayner Alfred. "Tree-based mining contrast subspace." International Journal of Advances in Intelligent Informatics 5, no. 2 (2019): 169. http://dx.doi.org/10.26555/ijain.v5i2.359.

Full text
Abstract:
All existing mining contrast subspace methods employ density-based likelihood contrast scoring function to measure the likelihood of a query object to a target class against other class in a subspace. However, the density tends to decrease when the dimensionality of subspaces increases causes its bounds to identify inaccurate contrast subspaces for the given query object. This paper proposes a novel contrast subspace mining method that employs tree-based likelihood contrast scoring function which is not affected by the dimensionality of subspaces. The tree-based scoring measure recursively bin
6

Ratilal, Purnima, Peter Gerstoft, and Joo Thiam Goh. "Subspace Approach to Inversion by Genetic Algorithms Involving Multiple Frequencies." Journal of Computational Acoustics 6, no. 1–2 (1998): 99–115. http://dx.doi.org/10.1142/s0218396x98000090.

Full text
Abstract:
Based on waveguide physics, a subspace inversion approach is proposed. It is observed that the ability to estimate a given parameter depends on its sensitivity to the acoustic wavefield, and this sensitivity depends on frequency. At low frequencies it is mainly the bottom parameters that are most sensitive and at high frequencies the geometric parameters are the most sensitive. Thus, the parameter vector to be determined is split into two subspaces, and only part of the data that is most influenced by the parameters in each subspace is used. The data sets from the Geoacoustic Inversion Worksho
7

Wang, Junpeng, Xiaotong Liu, and Han-Wei Shen. "High-dimensional data analysis with subspace comparison using matrix visualization." Information Visualization 18, no. 1 (2017): 94–109. http://dx.doi.org/10.1177/1473871617733996.

Full text
Abstract:
Due to the intricate relationship between different dimensions of high-dimensional data, subspace analysis is often conducted to decompose dimensions and give prominence to certain subsets of dimensions, i.e. subspaces. Exploring and comparing subspaces are important to reveal the underlying features of subspaces, as well as to portray the characteristics of individual dimensions. To date, most of the existing high-dimensional data exploration and analysis approaches rely on dimensionality reduction algorithms (e.g. principal component analysis and multi-dimensional scaling) to project high-di
8

Zhang, Yingwei, Lingjun Zhang, and Hailong Zhang. "Fault Detection for Industrial Processes." Mathematical Problems in Engineering 2012 (2012): 1–18. http://dx.doi.org/10.1155/2012/757828.

Full text
Abstract:
A new fault-relevant KPCA algorithm is proposed. Then the fault detection approach is proposed based on the fault-relevant KPCA algorithm. The proposed method further decomposes both the KPCA principal space and residual space into two subspaces. Compared with traditional statistical techniques, the fault subspace is separated based on the fault-relevant influence. This method can find fault-relevant principal directions and principal components of systematic subspace and residual subspace for process monitoring. The proposed monitoring approach is applied to Tennessee Eastman process and peni
9

Pang, Ying Han, Andrew Teoh Beng Jin, and Lim Heng Siong. "Eigenvector Weighting Function in Face Recognition." Discrete Dynamics in Nature and Society 2011 (2011): 1–15. http://dx.doi.org/10.1155/2011/521935.

Full text
Abstract:
Graph-based subspace learning is a class of dimensionality reduction technique in face recognition. The technique reveals the local manifold structure of face data that hidden in the image space via a linear projection. However, the real world face data may be too complex to measure due to both external imaging noises and the intra-class variations of the face images. Hence, features which are extracted by the graph-based technique could be noisy. An appropriate weight should be imposed to the data features for better data discrimination. In this paper, a piecewise weighting function, known as
10

Sun, Chengli, Jianxiao Xie, and Yan Leng. "A Signal Subspace Speech Enhancement Approach Based on Joint Low-Rank and Sparse Matrix Decomposition." Archives of Acoustics 41, no. 2 (2016): 245–54. http://dx.doi.org/10.1515/aoa-2016-0024.

Full text
Abstract:
Subspace-based methods have been effectively used to estimate enhanced speech from noisy speech samples. In the traditional subspace approaches, a critical step is splitting of two invariant subspaces associated with signal and noise via subspace decomposition, which is often performed by singular-value decomposition or eigenvalue decomposition. However, these decomposition algorithms are highly sensitive to the presence of large corruptions, resulting in a large amount of residual noise within enhanced speech in low signal-to-noise ratio (SNR) situations. In this paper, a joint low-r
11

Saxon, Stephen A., and William H. Ruckle. "Reducing the classical multipliers ℓ∞, c0 and bv0." Proceedings of the Edinburgh Mathematical Society 40, no. 2 (1997): 345–52. http://dx.doi.org/10.1017/s0013091500023786.

Full text
Abstract:
For R ∈ {bv0, c0, ℓ∞} a multiplier of FK spaces, the classical sectional convergence theorems permit the reduction of R to any of its dense barrelled subspaces as a simple consequence of the Closed Graph Theorem. (Cf. the Bachelis/Rosenthal reduction of R = ℓ∞ to its dense barrelled subspace m0.) A natural modern setting permits the reduction of R to any of the larger class of dense βφ subspaces. Bennett and Kalton's FK setting remarkably reduced R = ℓ∞ to any of its dense subspaces. This extreme reduction also obtains in the modern βφ setting since, surprisingly, every dense subspace of ℓ∞ is
12

Leng, Jinsong, and Zhihu Huang. "Outliers Detection with Correlated Subspaces for High Dimensional Datasets." International Journal of Wavelets, Multiresolution and Information Processing 9, no. 2 (2011): 227–36. http://dx.doi.org/10.1142/s0219691311004067.

Full text
Abstract:
Detecting outliers in high dimensional datasets is quite a difficult data mining task. Mining outliers in subspaces seems to be a promising solution, because outliers may be embedded in some interesting subspaces. Due to the existence of many irrelevant dimensions in high dimensional datasets, it is of great importance to eliminate the irrelevant or unimportant dimensions and identify outliers in interesting subspaces with strong correlation. Normally, the correlation among dimensions can be determined by traditional feature selection techniques and subspace-based clustering methods. The dimen
13

Wang, Jing, Atsushi Suzuki, Linchuan Xu, Feng Tian, Liang Yang, and Kenji Yamanishi. "Orderly Subspace Clustering." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5264–72. http://dx.doi.org/10.1609/aaai.v33i01.33015264.

Full text
Abstract:
Semi-supervised representation-based subspace clustering is to partition data into their underlying subspaces by finding effective data representations with partial supervisions. Essentially, an effective and accurate representation should be able to uncover and preserve the true data structure. Meanwhile, a reliable and easy-to-obtain supervision is desirable for practical learning. To meet these two objectives, in this paper we make the first attempt towards utilizing the orderly relationship, such as the data a is closer to b than to c, as a novel supervision. We propose an orderly subspace
14

Xu, Baoxun, Joshua Zhexue Huang, Graham Williams, Qiang Wang, and Yunming Ye. "Classifying Very High-Dimensional Data with Random Forests Built from Small Subspaces." International Journal of Data Warehousing and Mining 8, no. 2 (2012): 44–63. http://dx.doi.org/10.4018/jdwm.2012040103.

Full text
Abstract:
The selection of feature subspaces for growing decision trees is a key step in building random forest models. However, the common approach using randomly sampling a few features in the subspace is not suitable for high dimensional data consisting of thousands of features, because such data often contains many features which are uninformative to classification, and the random sampling often doesn’t include informative features in the selected subspaces. Consequently, classification performance of the random forest model is significantly affected. In this paper, the authors propose an improved r
15

Datta, Amitava, Amardeep Kaur, Tobias Lauer, and Sami Chabbouh. "Exploiting multi–core and many–core parallelism for subspace clustering." International Journal of Applied Mathematics and Computer Science 29, no. 1 (2019): 81–91. http://dx.doi.org/10.2478/amcs-2019-0006.

Full text
Abstract:
Finding clusters in high dimensional data is a challenging research problem. Subspace clustering algorithms aim to find clusters in all possible subspaces of the dataset, where a subspace is a subset of dimensions of the data. But the exponential increase in the number of subspaces with the dimensionality of data renders most of the algorithms inefficient as well as ineffective. Moreover, these algorithms have ingrained data dependency in the clustering process, which means that parallelization becomes difficult and inefficient. SUBSCALE is a recent subspace clustering algorithm which
16

Pan, Xiaoda, Hengliang Zhu, Fan Yang, and Xuan Zeng. "Subspace Trajectory Piecewise-Linear Model Order Reduction for Nonlinear Circuits." Communications in Computational Physics 14, no. 3 (2013): 639–63. http://dx.doi.org/10.4208/cicp.070512.051112a.

Full text
Abstract:
Despite the efficiency of trajectory piecewise-linear (TPWL) model order reduction (MOR) for nonlinear circuits, it needs large amount of expansion points for large-scale nonlinear circuits. This will inevitably increase the model size as well as the simulation time of the resulting reduced macromodels. In this paper, subspace TPWL-MOR approach is developed for the model order reduction of nonlinear circuits. By breaking the high-dimensional state space into several subspaces with much lower dimensions, the subspace TPWL-MOR has very promising advantages of reducing the number of expan
17

Menghi, Nicholas, Kemal Kacar, and Will Penny. "Multitask learning over shared subspaces." PLOS Computational Biology 17, no. 7 (2021): e1009092. http://dx.doi.org/10.1371/journal.pcbi.1009092.

Full text
Abstract:
This paper uses constructs from machine learning to define pairs of learning tasks that either shared or did not share a common subspace. Human subjects then learnt these tasks using a feedback-based approach and we hypothesised that learning would be boosted for shared subspaces. Our findings broadly supported this hypothesis with either better performance on the second task if it shared the same subspace as the first, or positive correlations over task performance for shared subspaces. These empirical findings were compared to the behaviour of a Neural Network model trained using sequential
18

Barrett, S. A., and G. C. Beroza. "An Empirical Approach to Subspace Detection." Seismological Research Letters 85, no. 3 (2014): 594–600. http://dx.doi.org/10.1785/0220130152.

Full text
19

Cadzow, J. A. "Direction finding: a signal subspace approach." IEEE Transactions on Systems, Man, and Cybernetics 21, no. 5 (1991): 1115–24. http://dx.doi.org/10.1109/21.120063.

Full text
20

Chui, N. L. C., and J. M. Maciejowski. "Subspace identification – a Markov parameter approach." International Journal of Control 78, no. 17 (2005): 1412–36. http://dx.doi.org/10.1080/00207170500362001.

Full text
21

Wang, Xiaodong, and H. V. Poor. "Blind multiuser detection: a subspace approach." IEEE Transactions on Information Theory 44, no. 2 (1998): 677–90. http://dx.doi.org/10.1109/18.661512.

Full text
22

Ganesan, Girish. "A Subspace Approach to Portfolio Analysis." IEEE Signal Processing Magazine 28, no. 5 (2011): 49–60. http://dx.doi.org/10.1109/msp.2011.941551.

Full text
23

Harikumar, Sandhya, and A. S. Akhil. "Semi supervised approach towards subspace clustering." Journal of Intelligent & Fuzzy Systems 34, no. 3 (2018): 1619–29. http://dx.doi.org/10.3233/jifs-169456.

Full text
24

Struski, Łukasz, Jacek Tabor, and Przemysław Spurek. "Lossy compression approach to subspace clustering." Information Sciences 435 (April 2018): 161–83. http://dx.doi.org/10.1016/j.ins.2017.12.056.

Full text
25

Kim, J. W., and C. K. Un. "Noise subspace approach for interference cancellation." Electronics Letters 25, no. 11 (1989): 712–13. http://dx.doi.org/10.1049/el:19890482.

Full text
26

Struski, Łukasz, Marek Śmieja, and Jacek Tabor. "Pointed Subspace Approach to Incomplete Data." Journal of Classification 37, no. 1 (2019): 42–57. http://dx.doi.org/10.1007/s00357-019-9304-3.

Full text
27

Lei, F., X. P. Xie, X. W. Wang, and Y. G. Wang. "Research on the efficiency of reduced-basis approach in computations of structural problems and its improvements." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 227, no. 10 (2012): 2143–56. http://dx.doi.org/10.1177/0954406212470895.

Full text
Abstract:
In this article, procedure and efficiency of the reduced-basis approach in structural design computation are studied. As a model order reduction approach, it provides fast evaluation of a structural system in explicitly parameterized formulation. Theoretically, the original structural system is reduced to obtain a reduced system by being projected onto a lower dimensional subspace. However, in practice, it is a time-consuming process due to the iterations of adaptive procedure in subspace construction. To improve the efficiency of the method, some characteristics are analyzed. First, the accur
28

Benner, Peter, and Christian Himpe. "Cross-Gramian-based dominant subspaces." Advances in Computational Mathematics 45, no. 5-6 (2019): 2533–53. http://dx.doi.org/10.1007/s10444-019-09724-7.

Full text
Abstract:
A standard approach for model reduction of linear input-output systems is balanced truncation, which is based on the controllability and observability properties of the underlying system. The related dominant subspaces projection model reduction method similarly utilizes these system properties, yet instead of balancing, the associated subspaces are directly conjoined. In this work, we extend the dominant subspace approach by computation via the cross Gramian for linear systems, and describe an a-priori error indicator for this method. Furthermore, efficient computation is discussed al
29

Luo, Xiaosuo, and Yongduan Song. "Adaptive Predictive Control: A Data-Driven Closed-Loop Subspace Identification Approach." Abstract and Applied Analysis 2014 (2014): 1–11. http://dx.doi.org/10.1155/2014/869879.

Full text
Abstract:
This paper presents a data-driven adaptive predictive control method using closed-loop subspace identification. As the predictor is the key element of the predictive controller, we propose to derive such predictor based on the subspace matrices which are obtained through the closed-loop subspace identification algorithm driven by input-output data. Taking advantage of transformational system model, the closed-loop data is effectively processed in this subspace algorithm. By combining the merits of receding window and recursive identification methods, an adaptive mechanism for online updating s
30

Guo, Qin Zhen, Zhi Zeng, and Shu Wu Zhang. "Uniform Variance Product Quantization." Applied Mechanics and Materials 651-653 (September 2014): 2224–27. http://dx.doi.org/10.4028/www.scientific.net/amm.651-653.2224.

Full text
Abstract:
Product quantization (PQ) is an efficient and effective vector quantization approach to fast approximate nearest neighbor (ANN) search especially for high-dimensional data. The basic idea of PQ is to decompose the original data space into the Cartesian product of some low-dimensional subspaces and then every subspace is quantized separately with the same number of codewords. However, the performance of PQ depends largely on the distribution of the original data. If the distributions of every subspace have larger difference, PQ will achieve bad results as shown in our experiments. In this paper
31

Zhu, Jianchen, Shengjie Zhao, and Di Wu. "Classification of Remote Sensing Images Through Reweighted Sparse Subspace Representation Using Compressed Data." Traitement du Signal 38, no. 1 (2021): 27–37. http://dx.doi.org/10.18280/ts.380103.

Full text
Abstract:
In many real-world scenarios, subspace clustering essentially aims to cluster unlabeled high-dimensional data into a union of finite-dimensional linear subspaces. The problem is that the data are always high-dimensional, with the increase of the computation, storge, and communication of various intelligent data-driven systems. This paper attempts to develop a method to cluster spectral images directly using the measurements of compressive coded aperture snapshot spectral imager (CASSI), eliminating the need to reconstruct the entire data cube. Assuming that compressed measurements are drawn fr
32

Liu, Zhi-Qiang. "Adaptive Subspace Self-Organizing Map and Its Applications in Face Recognition." International Journal of Image and Graphics 2, no. 4 (2002): 519–40. http://dx.doi.org/10.1142/s0219467802000834.

Full text
Abstract:
Recently Kohonen proposed the Adaptive Subspace Self-Organizing Map (ASSOM) for extracting subspace detectors from the input data. In the ASSOM, all subspaces represented by the neurons are constrained to intersect the origin in the feature space. As a result, it cannot compensate for the mean present in the data set. In this paper we propose affined subspaces for constructing a set of linear manifolds. This gives rise to a modified ASSOM known as the Adaptive Manifold Self-Organizing Map (AMSOM). In some cases, AMSOM performs many orders of magnitude better than ASSOM. We apply AMSOM to face
33

Narayanan, H., and H. Priyadarshan. "A subspace approach to linear dynamical systems." Linear Algebra and its Applications 438, no. 9 (2013): 3576–99. http://dx.doi.org/10.1016/j.laa.2012.12.039.

Full text
34

Prakash, M., and M. Narasimha Murty. "Hebbian learning subspace method: A new approach." Pattern Recognition 30, no. 1 (1997): 141–49. http://dx.doi.org/10.1016/s0031-3203(96)00054-4.

Full text
35

Cadzow, J. A. "Multiple source location-the signal subspace approach." IEEE Transactions on Acoustics, Speech, and Signal Processing 38, no. 7 (1990): 1110–25. http://dx.doi.org/10.1109/29.57540.

Full text
36

Zhang, Sheng, and Terence Sim. "Discriminant Subspace Analysis: A Fukunaga-Koontz Approach." IEEE Transactions on Pattern Analysis and Machine Intelligence 29, no. 10 (2007): 1732–45. http://dx.doi.org/10.1109/tpami.2007.1089.

Full text
37

Manfredi, Claudia, Massimo D'Aniello, and Piero Bruscaglioni. "A simple subspace approach for speech denoising." Logopedics Phoniatrics Vocology 26, no. 4 (2001): 179–92. http://dx.doi.org/10.1080/14015430127773.

Full text
38

Lee, Ki-Seung, Eun Suk Kim, Won Doh, and Dae Hee Youn. "Image enhancement based on signal subspace approach." IEEE Transactions on Image Processing 8, no. 8 (1999): 1129–34. http://dx.doi.org/10.1109/83.777094.

Full text
39

Ephraim, Y., and H. L. Van Trees. "A signal subspace approach for speech enhancement." IEEE Transactions on Speech and Audio Processing 3, no. 4 (1995): 251–66. http://dx.doi.org/10.1109/89.397090.

Full text
40

Li, Dandan, Chung-Ping Kwong, and Dik Lun Lee. "Unified linear subspace approach to semantic analysis." Journal of the American Society for Information Science and Technology 61, no. 1 (2009): 175–89. http://dx.doi.org/10.1002/asi.21219.

Full text
41

Takei, Y., H. Nanto, S. Kanae, Z. J. Yang, and K. Wada. "A Unified Approach for Subspace Identification Algorithms." Proceedings of the ISCIE International Symposium on Stochastic Systems Theory and its Applications 2004 (May 5, 2004): 193–98. http://dx.doi.org/10.5687/sss.2004.193.

Full text
42

Takei, Y., H. Nanto, S. Kanae, Z. J. Yang, and K. Wada. "Recursive Subspace Identification Using A Unified Approach." Proceedings of the ISCIE International Symposium on Stochastic Systems Theory and its Applications 2005 (May 5, 2005): 72–77. http://dx.doi.org/10.5687/sss.2005.72.

Full text
43

Srivastava, A. "A Bayesian approach to geometric subspace estimation." IEEE Transactions on Signal Processing 48, no. 5 (2000): 1390–400. http://dx.doi.org/10.1109/78.839985.

Full text
44

Mari, J., P. Stoica, and T. McKelvey. "Vector ARMA estimation: a reliable subspace approach." IEEE Transactions on Signal Processing 48, no. 7 (2000): 2092–104. http://dx.doi.org/10.1109/78.847793.

Full text
45

Iupikov, O. A., C. Craeye, R. Maaskant, and M. V. Ivashina. "Domain-Decomposition Approach to Krylov Subspace Iteration." IEEE Antennas and Wireless Propagation Letters 15 (2016): 1414–17. http://dx.doi.org/10.1109/lawp.2015.2511195.

Full text
46

Song, Junquan, Yujian Ye, Danda Zhang, and Jun Zhang. "Conditional Lie-Bäcklund Symmetries and Reductions of the Nonlinear Diffusion Equations with Source." Abstract and Applied Analysis 2014 (2014): 1–17. http://dx.doi.org/10.1155/2014/898032.

Full text
Abstract:
Conditional Lie-Bäcklund symmetry approach is used to study the invariant subspace of the nonlinear diffusion equations with source u_t = e^(−qx)(e^(px)P(u)(u_x)^m)_x + Q(x,u), m ≠ 1. We obtain a complete list of canonical forms for such equations that admit multidimensional invariant subspaces determined by higher order conditional Lie-Bäcklund symmetries. The resulting equations are either solved exactly or reduced to some finite-dimensional dynamic systems.
47

Fan, Yunpeng, Shipeng Li, and Yingwei Zhang. "Monitoring of Multimode Processes Based on Quality-Related Common Subspace Separation." Mathematical Problems in Engineering 2014 (2014): 1–7. http://dx.doi.org/10.1155/2014/487582.

Full text
Abstract:
A new monitoring approach for multimode processes based on quality-related common subspace separation is proposed. In the model, the data set forms a larger space when the correlation between process variables and quality variables is considered. And then the whole space is decomposed: quality-related common subspace, quality-related specific subspace, and the residual subspace. Monitoring method is performed in every subspace, respectively. The simulation results show the proposed method is effective.
48

Hegde, G. P., M. Seetha, and Nagaratna Hegde. "Symmetrical Weighted Subspace Holistic Approach for Expression Recognition." International Journal of Computer Science and Information Technology 7, no. 4 (2015): 93–103. http://dx.doi.org/10.5121/ijcsit.2015.7408.

Full text
49

Bajeux-Besnainou, Isabelle, Wachindra Bandara, and Efstathia Bura. "A Krylov subspace approach to large portfolio optimization." Journal of Economic Dynamics and Control 36, no. 11 (2012): 1688–99. http://dx.doi.org/10.1016/j.jedc.2012.04.009.

Full text
50

Bagheri, Mohammad Ali, Gholam Ali Montazer, and Ehsanollah Kabir. "A subspace approach to error correcting output codes." Pattern Recognition Letters 34, no. 2 (2013): 176–84. http://dx.doi.org/10.1016/j.patrec.2012.09.010.

Full text