Academic literature on the topic 'Feature based Principal Component Analysis (FPCA)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Feature based Principal Component Analysis (FPCA).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Feature based Principal Component Analysis (FPCA)"

1

Salvatore, Stefania, Jørgen G. Bramness, and Jo Røislien. "Exploring functional data analysis and wavelet principal component analysis on ecstasy (MDMA) wastewater data." BMC Medical Research Methodology 16, no. 1 (2016): 81. https://doi.org/10.1186/s12874-016-0179-2.

Abstract:
Background: Wastewater-based epidemiology (WBE) is a novel approach in drug use epidemiology which aims to monitor the extent of use of various drugs in a community. In this study, we investigate functional principal component analysis (FPCA) as a tool for analysing WBE data and compare it to traditional principal component analysis (PCA) and to wavelet principal component analysis (WPCA), which is more flexible temporally. Methods: We analysed temporal wastewater data from 42 European cities collected daily over one week in March 2013. The main temporal features of ecstasy (MDMA) were extracted using FPCA with both Fourier and B-spline basis functions and three different smoothing parameters, along with PCA and WPCA with different mother wavelets and shrinkage rules. The stability of FPCA was explored through bootstrapping and analysis of sensitivity to missing data. Results: The first three principal components (PCs), functional principal components (FPCs) and wavelet principal components (WPCs) explained 87.5–99.6% of the temporal variation between cities, depending on the choice of basis and smoothing. The extracted temporal features from PCA, FPCA and WPCA were consistent. FPCA using a Fourier basis and common-optimal smoothing was the most stable and least sensitive to missing data. Conclusion: FPCA is a flexible and analytically tractable method for analysing temporal changes in wastewater data, and is robust to missing data. WPCA did not reveal any rapid temporal changes in the data not captured by FPCA. Overall the results suggest FPCA with Fourier basis functions and a common-optimal smoothing parameter as the most accurate approach when analysing WBE data.
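The FPCA-with-Fourier-basis approach described in this abstract can be approximated in a few lines: expand each city's weekly series in a small Fourier basis and run ordinary PCA on the fitted coefficients. The sketch below uses synthetic data, numpy and scikit-learn, omits the smoothing-parameter selection the authors perform, and ignores the basis Gram-matrix weighting (acceptable here because the Fourier basis is nearly orthogonal on an equally spaced grid); it is an illustration, not the paper's implementation.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_cities, n_days = 42, 7
t = np.arange(n_days)                       # daily sampling over one week
X = rng.lognormal(size=(n_cities, n_days))  # synthetic MDMA loads (placeholder data)

# Fourier basis on the weekly interval: constant term plus two harmonics
B = np.column_stack([
    np.ones(n_days),
    np.sin(2 * np.pi * t / n_days),
    np.cos(2 * np.pi * t / n_days),
    np.sin(4 * np.pi * t / n_days),
    np.cos(4 * np.pi * t / n_days),
])

# Least-squares basis coefficients for every city (unpenalized, i.e. no smoothing)
coefs, *_ = np.linalg.lstsq(B, X.T, rcond=None)   # shape: (n_basis, n_cities)

# PCA on the coefficients approximates FPCA in the Fourier-coefficient space
fpca = PCA(n_components=3).fit(coefs.T)
scores = fpca.transform(coefs.T)            # functional PC scores per city
harmonics = B @ fpca.components_.T          # FPC curves evaluated on the daily grid
print(fpca.explained_variance_ratio_, scores.shape, harmonics.shape)
```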
2

Pesaresi, Simone, Adriano Mancini, Giacomo Quattrini, and Simona Casavecchia. "Mapping Mediterranean Forest Plant Associations and Habitats with Functional Principal Component Analysis Using Landsat 8 NDVI Time Series." Remote Sensing 12, no. 7 (2020): 1132. http://dx.doi.org/10.3390/rs12071132.

Abstract:
The classification of plant associations and their mapping play a key role in defining habitat biodiversity management, monitoring, and conservation strategies. In this work we present a methodological framework to map Mediterranean forest plant associations and habitats that relies on the application of the Functional Principal Component Analysis (FPCA) to the remotely sensed Normalized Difference Vegetation Index (NDVI) time series. FPCA, considering the chronological order of the data, reduced the NDVI time series data complexity and provided (as FPCA scores) the main seasonal NDVI phenological variations of the forests. We performed a supervised classification of the FPCA scores combined with topographic and lithological features of the study area to map the forest plant associations. The supervised mapping achieved an overall accuracy of 87.5%. The FPCA scores contributed to the global accuracy of the map much more than the topographic and lithological features. The results showed that (i) the main seasonal phenological variations (FPCA scores) are effective spatial predictors to obtain accurate plant associations and habitat maps; (ii) the FPCA is a suitable solution to simultaneously express the relationships between remotely sensed and ecological field data, since it allows us to integrate these two different perspectives about plant associations in a single graph. The proposed approach based on the FPCA is useful for forest habitat monitoring, as it can contribute to produce periodically detailed vegetation-based habitat maps that reflect the “current” status of vegetation and habitats, also supporting the study of plant associations.
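A hedged sketch of the mapping workflow this abstract outlines: approximate FPCA scores of per-pixel NDVI trajectories with PCA, stack them with topographic/lithological covariates, and train a supervised classifier. All data are synthetic placeholders, and the random forest stands in for whichever supervised classifier the authors actually used.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_pixels, n_dates = 500, 23                     # e.g. one year of Landsat 8 NDVI observations
ndvi = rng.random((n_pixels, n_dates))          # placeholder NDVI trajectories
topo = rng.random((n_pixels, 3))                # placeholder elevation, slope, lithology code
labels = rng.integers(0, 4, n_pixels)           # placeholder plant-association classes

# FPCA scores approximated by PCA on the NDVI trajectories (main phenological variations)
scores = PCA(n_components=3).fit_transform(ndvi)

# Stack seasonal-phenology scores with topographic/lithological predictors
features = np.hstack([scores, topo])
X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("overall accuracy:", clf.score(X_te, y_te))
```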
3

K, N. Kusuma, and Lakshmi Ram Prasath H. "Application of Feature Based Principal Component Analysis (FPCA) technique on Landsat8 OLI multispectral data to map Kimberlite pipes." Indian Journal of Science and Technology 14, no. 4 (2021): 361–72. https://doi.org/10.17485/IJST/v14i4.1741.

Abstract:
Objectives: To map the kimberlite pipes emplaced in parts of Anantpur District, India using Landsat-8 OLI multispectral data. Kimberlites are considered the primary host of natural diamond. Kimberlite pipes have very limited exposure and are altered; therefore the indirect surface indicators associated with kimberlite, such as ferric iron bearing minerals (hematite, goethite), hydroxyl (clay) and carbonate (calcrete) minerals, were mapped to trace kimberlite pipes. Methods: Feature based Principal Component Analysis (FPCA) was applied over the OLI bands 2, 4, 5 and 6, and 2, 5, 6 and 7 to generate ferric iron (F image) and hydroxyl/carbonate (H/C) images. The color composite was generated by assigning RGB colours to the F, H/C and F+H/C images. Findings: When matched with the pre-explored kimberlite pipe locations, it was observed that the kimberlite pipes display different colours in the above colour composite. Hence, Isodata clustering was carried out to segregate the classes, which resulted in 12 unique classes. Of these, the kimberlite pipes fall in 4 classes. However, due to the moderate resolution of OLI, false positive areas were also noted. Further, the target area was reduced by incorporating the structural control (lineaments) over the emplacement of kimberlite pipes. Novelty: The present work highlights the usefulness of moderate resolution multispectral imagery in mapping kimberlite pipes in a semiarid region, in the absence of a hyperspectral sensor. Keywords: Kimberlite; Landsat8 OLI; feature based Principal Component Analysis (FPCA); Lineaments; Dharwar Craton
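Feature-based (selective) PCA works by running PCA on a small, deliberately chosen band subset and reading the eigenvector loadings to find the component in which the diagnostic absorption and reflection bands load with opposite signs. The following numpy sketch illustrates that selection rule on a synthetic four-band scene; the band indices and the scene itself are placeholders, not the paper's OLI processing chain.

```python
import numpy as np

def feature_pc(bands, absorption_idx, reflection_idx):
    """PCA on a small band subset; return the component whose loadings contrast
    the diagnostic absorption and reflection bands most strongly with opposite
    signs (the 'feature-oriented' component, e.g. an F image for ferric iron)."""
    X = bands.reshape(bands.shape[0], -1).astype(float)      # (n_bands, n_pixels)
    cov = np.cov(X)                                          # band-to-band covariance
    eigval, eigvec = np.linalg.eigh(cov)                     # ascending eigenvalues
    eigvec = eigvec[:, ::-1]                                 # descending variance order
    contrast = eigvec[absorption_idx] * eigvec[reflection_idx]
    k = int(np.argmin(contrast))                             # most opposite-signed loading pair
    loadings = eigvec[:, k]
    pc_image = (loadings @ (X - X.mean(axis=1, keepdims=True))).reshape(bands.shape[1:])
    return k, loadings, pc_image

# Placeholder scene: four OLI bands (e.g. the 2, 4, 5, 6 ferric-iron subset), 100 x 100 pixels
scene = np.random.rand(4, 100, 100)
k, loadings, f_image = feature_pc(scene, absorption_idx=0, reflection_idx=1)
print("selected PC:", k, "loadings:", np.round(loadings, 3))
```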
4

Montanino, Andrea, Gianluca Alaimo, and Ettore Lanzarone. "A gradient-based optimization method with functional principal component analysis for efficient structural topology optimization." Structural and Multidisciplinary Optimization 64, no. 1 (2021): 177–88. http://dx.doi.org/10.1007/s00158-021-02872-9.

Abstract:
Structural topology optimization (STO) is usually treated as a constrained minimization problem, which is iteratively addressed by solving the equilibrium equations for the problem under consideration. To reduce the computational effort, several reduced basis approaches that solve the equilibrium equations in a reduced space have been proposed. In this work, we apply functional principal component analysis (FPCA) to generate the reduced basis, and we couple FPCA with a gradient-based optimization method for the first time in the literature. The proposed algorithm has been tested on a large STO problem with 4.8 million degrees of freedom. Results show that the proposed algorithm achieves significant computational time savings with negligible loss of accuracy. Indeed, the density maps obtained with the proposed algorithm capture the larger features of maps obtained without reduced basis, but in significantly lower computational times, and are associated with similar values of the minimized compliance.
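The reduced-basis idea this abstract relies on can be shown on a toy linear system: build a PCA (POD) basis from solution snapshots, project the equilibrium equations onto it, and lift the reduced solution back to full space. The sketch below uses random matrices purely for illustration (with genuinely correlated snapshots the reduced solve reproduces the full solution far more closely); it is not the authors' gradient-based STO algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
n_dof, n_snap, n_modes = 200, 30, 5

# Toy symmetric positive-definite "stiffness" matrix and load vector
A = rng.standard_normal((n_dof, n_dof))
K = A @ A.T + n_dof * np.eye(n_dof)
f = rng.standard_normal(n_dof)

# Snapshot matrix of previously computed displacement fields (placeholder data)
snapshots = rng.standard_normal((n_dof, n_snap))

# PCA/POD basis: leading left singular vectors of the centred snapshots
U, _, _ = np.linalg.svd(snapshots - snapshots.mean(axis=1, keepdims=True),
                        full_matrices=False)
Phi = U[:, :n_modes]                       # reduced basis, n_dof x n_modes

# Solve the equilibrium equations in the reduced space and lift back
u_reduced = np.linalg.solve(Phi.T @ K @ Phi, Phi.T @ f)
u_approx = Phi @ u_reduced
u_full = np.linalg.solve(K, f)
print("relative error:", np.linalg.norm(u_approx - u_full) / np.linalg.norm(u_full))
```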
5

Hael, Mohanned A., Hai Qiang Ma, Hamas A. AL-kuhali, and Zeinab Rizk. "Quantile-based Clustering for Functional Data via Modelling Functional Principal Components Scores." Journal of Physics: Conference Series 2449, no. 1 (2023): 012016. http://dx.doi.org/10.1088/1742-6596/2449/1/012016.

Abstract:
Clustering tasks of functional data arise naturally in many applications, and efficient classification approaches are needed to find groups. The current paper combines the quantile-based model with the principal component analysis of functional data (FPCA). In our proposed procedures, the projection of functional data is first approximated based on (rotated) FPCA. The quantile-based model is then implemented on the space of rotated scores to identify the potential features of underlying clusters. The proposed method overcomes the limitation of using direct basis function expansion such as Fourier, B-spline, or linear fitting, besides representing a nonparametric clustering alternative based on a quantile approach. The proposed method’s performance has been evaluated in a comprehensive simulation study and afterward compared with existing functional and non-functional clustering methods. The simulation study results showed that the proposed method performs well in terms of correct classification rate and computing time average. Finally, a real-world application concerning temporal wind speed data has been analyzed to demonstrate the proposed method’s advantages and usefulness.
6

P., Gopinath. "Raspberry PI-Based Finger Vein Recognition System Using PCA NET." International Research Journal of Computer Science 10, no. 06 (2023): 414–18. http://dx.doi.org/10.26562/irjcs.2023.v1006.24.

Abstract:
Most finger vein feature extraction algorithms achieve satisfactory performance thanks to their ability to represent texture, while ignoring the intensity distribution formed by the finger tissue and, in some instances, treating it as background noise. This project makes use of two-directional two-dimensional Fisher Principal Component Analysis, also known as (2D)² FPCA, for feature extraction. This type of "noise" is presented as a novel soft biometric trait for improving finger vein recognition performance. In order to demonstrate that the intensity distribution formed by the finger tissue in the background can be extracted as a soft biometric trait for recognition, a comprehensive analysis of the finger vein imaging principle and the characteristics of the image is first presented.
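Plain two-directional 2DPCA, the backbone of the (2D)² FPCA descriptor mentioned above (minus its Fisher discriminant step), can be sketched as follows: compute row- and column-direction image scatter matrices, keep their leading eigenvectors, and project each image from both sides. Data and dimensions are placeholders.

```python
import numpy as np

def two_directional_2dpca(images, p, q):
    """Two-directional 2DPCA: project h x w images to p x q feature matrices.
    (Plain 2DPCA; the Fisher/(2D)^2 FPCA variant adds a discriminant criterion.)"""
    A = np.asarray(images, dtype=float)          # (n, h, w)
    D = A - A.mean(axis=0)
    # Row-direction scatter (w x w) and column-direction scatter (h x h)
    G_row = np.einsum('nij,nik->jk', D, D) / len(A)
    G_col = np.einsum('nij,nkj->ik', D, D) / len(A)
    _, X = np.linalg.eigh(G_row)
    _, Z = np.linalg.eigh(G_col)
    X = X[:, ::-1][:, :q]                        # top q right-projection vectors
    Z = Z[:, ::-1][:, :p]                        # top p left-projection vectors
    features = np.einsum('hp,nhw,wq->npq', Z, D, X)   # Z^T (A_i - mean) X per image
    return features, Z, X

veins = np.random.rand(20, 64, 96)               # placeholder finger vein images
feats, Z, X = two_directional_2dpca(veins, p=8, q=8)
print(feats.shape)                               # (20, 8, 8)
```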
7

Kusuma, K. N. "Application of Feature Based Principal Component Analysis (FPCA) technique on Landsat8 OLI multispectral data to map Kimberlite pipes." Indian Journal of Science and Technology 14, no. 4 (2021): 361–72. http://dx.doi.org/10.17485/ijst/v14i4.1741.

8

Zhou, Xin, and Xianqing Lei. "Fault Diagnosis Method of the Construction Machinery Hydraulic System Based on Artificial Intelligence Dynamic Monitoring." Mobile Information Systems 2021 (July 15, 2021): 1–10. http://dx.doi.org/10.1155/2021/1093960.

Abstract:
This paper aims to study the fault diagnosis method of the mechanical hydraulic system based on artificial intelligence dynamic monitoring. Given the characteristics of functional principal component analysis (FPCA) and neural networks in the feature extraction process, a fault diagnosis method combining functional principal component analysis and a BP neural network is studied and applied to fault diagnosis of the coordinator hydraulic system. This article mainly completed the following tasks: analyzing the structure and working principle of the mechanical hydraulic system, studying the failure mechanism and failure modes of the mechanical hydraulic system, summarizing the common failures of the hydraulic system and the individual failures of the mechanical hydraulic system, and establishing a failure mode and effects analysis (FMEA) description of the mechanical hydraulic system. Then, a joint simulation model of the mechanical hydraulic system was established in ADAMS and AMESim, and the fault detection signal of the hydraulic system was determined and compared with the experimental data. At the same time, the simulation data of the cosimulation model were compared with the simulation data of the hydraulic model in MATLAB to further verify the correctness of the model. Functional principal component analysis is used to perform functional processing on the sample data and extract feature parameters, and the BP neural network is used to train the mapping relationship between feature parameters and fault parameters. The consistency is verified, and the fault diagnosis method is finally completed. The experimental results show that the diagnostic accuracy rates are 0.9848 and 0.9927, respectively; the reliability is significantly improved, close to 100%; and the uncertainty is basically 0, which significantly improves the accuracy of fault diagnosis.
9

Hou, Yunhui, Na Shen, and Yubin Lin. "A Classification Method for Multichannel MI-EEG Signal with FPCA and DNN." Journal of Physics: Conference Series 2891, no. 11 (2024): 112014. https://doi.org/10.1088/1742-6596/2891/11/112014.

Abstract:
A new accurate identification method has been proposed to address the lack of interpretability in current deep learning-based feature extraction methods for motor imagery electroencephalogram (MI-EEG) signals. This method combines functional principal component analysis (FPCA) and deep neural networks (DNN) for four-class classification of MI-EEG signals. The process involves preprocessing the acquired MI-EEG signals and obtaining power spectral density (PSD) versus frequency curves in the alpha band for multiple channels and samples through FIR filtering. All PSD-frequency curves are then functionally smoothed according to the theory of functional data analysis (FDA). Feature parameters are derived using FPCA, and the parameters of all samples are normalized. Training samples are selected randomly for clustering training with DNNs. Category prediction is carried out on the test data classification samples. This method is applied to 4×120 four-class MI-EEG samples, each from six channels, obtained with the Enobio wireless EEG system from Neuroelectrics (Spain), involving left hand, right hand, left foot, and right foot motor imagery at a sampling rate of 500 Hz. 80% of the samples were used for training, and the remaining 20% were used for testing. The prediction accuracy ranged from 84.3% to 91.66%. While this multivariate feature parameter extraction method has clear mathematical and physical significance, it does demand a high sampling rate of 500 Hz.
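A hedged sketch of the processing chain described here: Welch PSD estimates restricted to the alpha band, a PCA stand-in for the FPCA feature extraction, score normalisation, and a small neural network. scipy and scikit-learn are used with synthetic trials; the paper's FIR filtering, functional smoothing, and DNN architecture are not reproduced.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
fs, n_trials, n_channels, n_samples = 500, 480, 6, 2000
eeg = rng.standard_normal((n_trials, n_channels, n_samples))   # placeholder MI-EEG trials
labels = rng.integers(0, 4, n_trials)                          # four motor-imagery classes

# Power spectral density per trial/channel, restricted to the alpha band (8-13 Hz)
f, psd = welch(eeg, fs=fs, nperseg=256, axis=-1)
alpha = (f >= 8) & (f <= 13)
curves = psd[:, :, alpha].reshape(n_trials, -1)                # flatten channels x alpha bins

# PCA on the PSD curves as a stand-in for FPCA feature extraction, then normalise
scores = PCA(n_components=10).fit_transform(curves)
scores = (scores - scores.mean(axis=0)) / scores.std(axis=0)

X_tr, X_te, y_tr, y_te = train_test_split(scores, labels, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```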
10

B D, Mr Darshan, Vyshnavi Shekhar B S, Meghana M. Totiger, Priyanka N, and Spurthi A. "Classification of Emotion using Eeg Signals: an FPGA Based Implementation." International Journal of Recent Technology and Engineering (IJRTE) 12, no. 2 (2023): 102–9. http://dx.doi.org/10.35940/ijrte.b7808.0712223.

Abstract:
An electroencephalograph is a device that records the electrical activity of the human brain using wearable metal electrodes placed on the scalp. Electrical impulses connect brain cells and are always active, even at rest. This activity appears as a squiggly line in EEG recordings. The acquired data are pre-processed to a frequency range of 0 to 75 Hz. This creates a new matrix with a sample rate of 200 Hz and a range of 0–75 Hz. A finite-impulse-response low-pass filter was used because a bandpass filter would distort the EEG data after processing. Each pre-processed EEG signal has an output, which completes feature extraction. Principal Component Analysis (PCA) is applied in the feature reduction phase. PCA is an analytical process that uses singular value decomposition to transform a collection of correlated features into mutually uncorrelated features or principal components. Principal component analysis involves: (a) mean normalization of features, (b) the covariance matrix, (c) eigenvectors, and (d) reduced features or principal components. The resulting features are passed to the SVM classifier for the emotion output. The VHDL code and testbench for 2×2 matrices were written, and waveforms and RTL schematics were created in Xilinx 14.5. For the FPGA implementation, a Simulink model was designed, and the eigenvalues were pre-determined using a system generator.
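The four PCA steps enumerated in this abstract, (a) mean normalisation, (b) covariance matrix, (c) eigenvectors, (d) reduced features, map directly onto a few lines of linear algebra. This numpy sketch is a software illustration of those steps only, not the paper's VHDL/FPGA implementation.

```python
import numpy as np

def pca_reduce(X, k):
    """Steps (a)-(d) from the abstract: mean normalisation, covariance matrix,
    eigenvectors, and projection onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                    # (a) mean normalisation of features
    cov = np.cov(Xc, rowvar=False)             # (b) covariance matrix
    eigval, eigvec = np.linalg.eigh(cov)       # (c) eigenvectors (ascending eigenvalues)
    order = np.argsort(eigval)[::-1][:k]
    components = eigvec[:, order]
    return Xc @ components                     # (d) reduced features / principal components

features = np.random.rand(120, 32)             # placeholder EEG feature vectors
reduced = pca_reduce(features, k=8)
print(reduced.shape)                           # (120, 8)
```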

Dissertations / Theses on the topic "Feature based Principal Component Analysis (FPCA)"

1

Li, Xiaomeng. "Human Promoter Recognition Based on Principal Component Analysis." Thesis, The University of Sydney, 2008. http://hdl.handle.net/2123/3656.

Abstract:
This thesis presents an innovative human promoter recognition model HPR-PCA. Principal component analysis (PCA) is applied on context feature selection DNA sequences and the prediction network is built with the artificial neural network (ANN). A thorough literature review of all the relevant topics in the promoter prediction field is also provided. As the main technique of HPR-PCA, the application of PCA on feature selection is firstly developed. In order to find informative and discriminative features for effective classification, PCA is applied on the different n-mer promoter and exon combined frequency matrices, and principal components (PCs) of each matrix are generated to construct the new feature space. ANN built classifiers are used to test the discriminability of each feature space. Finally, the 3 and 5-mer feature matrix is selected as the context feature in this model. Two proposed schemes of HPR-PCA model are discussed and the implementations of sub-modules in each scheme are introduced. The context features selected by PCA are used to build three promoter and non-promoter classifiers. CpG-island modules are embedded into models in different ways. In the comparison, Scheme I obtains better prediction results on two test sets so it is adopted as the model for HPR-PCA for further evaluation. Three existing promoter prediction systems are used to compare to HPR-PCA on three test sets including the chromosome 22 sequence. The performance of HPR-PCA is outstanding compared to the other four systems.
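A minimal sketch of the context-feature idea behind HPR-PCA: count k-mer frequencies for a set of sequences and take principal components of the frequency matrix as the new feature space. The sequences, the 3-mer choice shown, and all names are illustrative placeholders; the thesis combines 3- and 5-mer matrices and feeds the PCs to ANN classifiers.

```python
import numpy as np
from itertools import product
from sklearn.decomposition import PCA

KMERS = [''.join(p) for p in product('ACGT', repeat=3)]      # all 64 possible 3-mers
INDEX = {k: i for i, k in enumerate(KMERS)}

def kmer_frequencies(seq, k=3):
    """Relative 3-mer frequencies of one DNA sequence."""
    counts = np.zeros(len(KMERS))
    for i in range(len(seq) - k + 1):
        counts[INDEX[seq[i:i + k]]] += 1
    return counts / max(counts.sum(), 1)

rng = np.random.default_rng(4)
sequences = [''.join(rng.choice(list('ACGT'), 250)) for _ in range(100)]   # placeholder sequences
freq_matrix = np.array([kmer_frequencies(s) for s in sequences])

# Principal components of the 3-mer frequency matrix form the new feature space
pcs = PCA(n_components=10).fit_transform(freq_matrix)
print(pcs.shape)        # (100, 10) context features to feed an ANN classifier
```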
2

Li, Xiaomeng. "Human Promoter Recognition Based on Principal Component Analysis." University of Sydney, 2008. http://hdl.handle.net/2123/3656.

Abstract:
Master of Engineering. This thesis presents an innovative human promoter recognition model HPR-PCA. Principal component analysis (PCA) is applied on context feature selection DNA sequences and the prediction network is built with the artificial neural network (ANN). A thorough literature review of all the relevant topics in the promoter prediction field is also provided. As the main technique of HPR-PCA, the application of PCA on feature selection is firstly developed. In order to find informative and discriminative features for effective classification, PCA is applied on the different n-mer promoter and exon combined frequency matrices, and principal components (PCs) of each matrix are generated to construct the new feature space. ANN built classifiers are used to test the discriminability of each feature space. Finally, the 3 and 5-mer feature matrix is selected as the context feature in this model. Two proposed schemes of HPR-PCA model are discussed and the implementations of sub-modules in each scheme are introduced. The context features selected by PCA are used to build three promoter and non-promoter classifiers. CpG-island modules are embedded into models in different ways. In the comparison, Scheme I obtains better prediction results on two test sets so it is adopted as the model for HPR-PCA for further evaluation. Three existing promoter prediction systems are used to compare to HPR-PCA on three test sets including the chromosome 22 sequence. The performance of HPR-PCA is outstanding compared to the other four systems.
3

Ergin, Emre. "Investigation Of Music Algorithm Based And Wd-pca Method Based Electromagnetic Target Classification Techniques For Their Noise Performances." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12611218/index.pdf.

Abstract:
Multiple Signal Classification (MUSIC) Algorithm based and Wigner Distribution-Principal Component Analysis (WD-PCA) based classification techniques are very recently suggested resonance region approaches for electromagnetic target classification. In this thesis, performances of these two techniques will be compared concerning their robustness for noise and their capacity to handle large number of candidate targets. In this context, classifier design simulations will be demonstrated for target libraries containing conducting and dielectric spheres and for dielectric coated conducting spheres. Small scale aircraft targets modeled by thin conducting wires will also be used in classifier design demonstrations.
4

Bird, Gregory David. "Linear and Nonlinear Dimensionality-Reduction-Based Surrogate Models for Real-Time Design Space Exploration of Structural Responses." BYU ScholarsArchive, 2020. https://scholarsarchive.byu.edu/etd/8653.

Abstract:
Design space exploration (DSE) is a tool used to evaluate and compare designs as part of the design selection process. While evaluating every possible design in a design space is infeasible, understanding design behavior and response throughout the design space may be accomplished by evaluating a subset of designs and interpolating between them using surrogate models. Surrogate modeling is a technique that uses low-cost calculations to approximate the outcome of more computationally expensive calculations or analyses, such as finite element analysis (FEA). While surrogates make quick predictions, accuracy is not guaranteed and must be considered. This research addressed the need to improve the accuracy of surrogate predictions in order to improve DSE of structural responses. This was accomplished by performing comparative analyses of linear and nonlinear dimensionality-reduction-based radial basis function (RBF) surrogate models for emulating various FEA nodal results. A total of four dimensionality reduction methods were investigated, namely principal component analysis (PCA), kernel principal component analysis (KPCA), isometric feature mapping (ISOMAP), and locally linear embedding (LLE). These methods were used in conjunction with surrogate modeling to predict nodal stresses and coordinates of a compressor blade. The research showed that using an ISOMAP-based dual-RBF surrogate model for predicting nodal stresses decreased the estimated mean error of the surrogate by 35.7% compared to PCA. Using nonlinear dimensionality-reduction-based surrogates did not reduce surrogate error for predicting nodal coordinates. A new metric, the manifold distance ratio (MDR), was introduced to measure the nonlinearity of the data manifolds. When applied to the stress and coordinate data, the stress space was found to be more nonlinear than the coordinate space for this application. The upfront training cost of the nonlinear dimensionality-reduction-based surrogates was larger than that of their linear counterparts but small enough to remain feasible. After training, all the dual-RBF surrogates were capable of making real-time predictions. This same process was repeated for a separate application involving the nodal displacements of mode shapes obtained from a FEA modal analysis. The modal assurance criterion (MAC) calculation was used to compare the predicted mode shapes, as well as their corresponding true mode shapes obtained from FEA, to a set of reference modes. The research showed that two nonlinear techniques, namely LLE and KPCA, resulted in lower surrogate error in the more complex design spaces. Using a RBF kernel, KPCA achieved the largest average reduction in error of 13.57%. The results also showed that surrogate error was greatly affected by mode shape reversal. Four different approaches of identifying reversed mode shapes were explored, all of which resulted in varying amounts of surrogate error. Together, the methods explored in this research were shown to decrease surrogate error when performing DSE of a turbomachine compressor blade. As surrogate accuracy increases, so does the ability to correctly make engineering decisions and judgements throughout the design process. Ultimately, this will help engineers design better turbomachines.
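The dual-surrogate construction described above (reduce the high-dimensional nodal response, then interpolate the reduced coordinates over the design variables with radial basis functions) can be sketched with scikit-learn and scipy. PCA is used here for the reduction step; KernelPCA, Isomap or LocallyLinearEmbedding could be swapped in. Data are synthetic and the setup is illustrative, not the thesis' compressor-blade pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(5)
n_designs, n_params, n_nodes = 60, 4, 5000
params = rng.random((n_designs, n_params))              # design variables of sampled designs
stresses = rng.random((n_designs, n_nodes))             # placeholder FEA nodal stresses

# Dimensionality reduction of the nodal responses (swap PCA for a nonlinear method as needed)
reducer = PCA(n_components=5).fit(stresses)
reduced = reducer.transform(stresses)

# RBF surrogate mapping design parameters -> reduced coordinates
surrogate = RBFInterpolator(params, reduced, kernel='thin_plate_spline')

new_design = rng.random((1, n_params))
predicted_field = reducer.inverse_transform(surrogate(new_design))   # back to nodal space
print(predicted_field.shape)                                         # (1, 5000)
```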
5

Ersoy, Mehmet Okan. "Application Of A Natural-resonance Based Feature Extraction Technique To Small-scale Aircraft Modeled By Conducting Wires For Electromagnetic Target Classification." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/12605522/index.pdf.

Abstract:
The problem studied in this thesis is the classification of small-scale aircraft targets using a natural resonance based electromagnetic feature extraction technique. The aircraft targets are modeled by perfectly conducting, thin wire structures. The electromagnetic back-scattered data used in the classification process are numerically generated for five aircraft models. A contemporary signal processing tool, the Wigner-Ville distribution, is employed in this study in addition to the principal components analysis technique to extract target features mainly from late-time target responses. The Wigner-Ville distribution (WD) is applied to the electromagnetic back-scattered responses from different aspects. Then, feature vectors are extracted from suitably chosen late-time portions of the WD outputs, which include natural-resonance-related information, for every target and aspect to decrease aspect dependency. The database of the classifier is constructed by the feature vectors extracted at only a few reference aspects. Principal components analysis is also used to fuse the feature vectors and/or late-time aircraft responses extracted from reference aspects of a given target into a single characteristic feature vector of that target to further reduce aspect dependency. Consequently, an almost aspect independent classifier is designed for small-scale aircraft targets, reaching a high correct classification rate.
6

Trahan, Patrick. "Classification of Carpiodes Using Fourier Descriptors: A Content Based Image Retrieval Approach." ScholarWorks@UNO, 2009. http://scholarworks.uno.edu/td/1085.

Abstract:
Taxonomic classification has always been important to the study of any biological system. Many biological species will go unclassified and become lost forever at the current rate of classification. The current state of computer technology makes image storage and retrieval possible on a global level. As a result, computer-aided taxonomy is now possible. Content based image retrieval techniques utilize visual features of the image for classification. By utilizing image content and computer technology, the gap between taxonomic classification and species destruction is shrinking. This content based study utilizes the Fourier Descriptors of fifteen known landmark features on three Carpiodes species: C.carpio, C.velifer, and C.cyprinus. Classification analysis involves both unsupervised and supervised machine learning algorithms. Fourier Descriptors of the fifteen known landmarks provide for strong classification power on image data. Feature reduction analysis indicates feature reduction is possible. This proves useful for increasing generalization power of classification.
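A minimal sketch of Fourier descriptors for a closed outline: represent the boundary points as complex numbers, take the FFT, and normalise the coefficients for translation and scale. The noisy ellipse stands in for a digitised fish outline; the thesis' fifteen-landmark scheme and classifier stages are not reproduced.

```python
import numpy as np

def fourier_descriptors(contour, n_coeffs=10):
    """Translation- and scale-normalised Fourier descriptors of a closed contour.
    `contour` is an (n, 2) array of ordered boundary points."""
    z = contour[:, 0] + 1j * contour[:, 1]     # complex representation of the outline
    coeffs = np.fft.fft(z)
    coeffs[0] = 0                              # drop DC term -> translation invariance
    coeffs = coeffs / np.abs(coeffs[1])        # scale by the first harmonic -> scale invariance
    return np.abs(coeffs[1:n_coeffs + 1])      # magnitudes -> rotation/start-point invariance

# Placeholder outline: a noisy ellipse standing in for a digitised specimen contour
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
outline = np.column_stack([3 * np.cos(t), np.sin(t)]) + 0.01 * np.random.randn(200, 2)
print(np.round(fourier_descriptors(outline), 4))
```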
7

Chen, Beichen, and Amy Jinxin Chen. "PCA based dimensionality reduction of MRI images for training support vector machine to aid diagnosis of bipolar disorder." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-259621.

Abstract:
This study aims to investigate how dimensionality reduction of neuroimaging data prior to training support vector machines (SVMs) affects the classification accuracy of bipolar disorder. This study uses principal component analysis (PCA) for dimensionality reduction. An open source data set of 19 bipolar and 31 control structural magnetic resonance imaging (sMRI) samples was used, part of the UCLA Consortium for Neuropsychiatric Phenomics LA5c Study funded by the NIH Roadmap Initiative aiming to foster breakthroughs in the development of novel treatments for neuropsychiatric disorders. The images underwent smoothing, feature extraction and PCA before they were used as input to train SVMs. 3-fold cross-validation was used to tune a number of hyperparameters for linear, radial, and polynomial kernels. Experiments were done to investigate the performance of SVM models trained using 1 to 29 principal components (PCs). Several PC sets reached 100% accuracy in the final evaluation, with the minimal set being the first two principal components. Accumulated variance explained by the PCs used did not have a correlation with the performance of the model. The choice of kernel and hyperparameters is of utmost importance as the performance obtained can vary greatly. The results support previous studies that SVM can be useful in aiding the diagnosis of bipolar disorder, and that the use of PCA as a dimensionality reduction method in combination with SVM may be appropriate for the classification of neuroimaging data for illnesses not limited to bipolar disorder. Due to the limitation of a small sample size, the results call for future research using larger collaborative data sets to validate the accuracies obtained.
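The workflow in this abstract maps naturally onto a scikit-learn pipeline: PCA followed by an SVM, with the number of components, kernel, and hyperparameters tuned by 3-fold cross-validation. The sketch below uses random data with the study's sample sizes (19 + 31) purely for illustration.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(6)
X = rng.standard_normal((50, 2000))        # placeholder voxel features (19 bipolar + 31 controls)
y = np.array([1] * 19 + [0] * 31)

pipe = Pipeline([("pca", PCA()), ("svc", SVC())])
param_grid = {
    "pca__n_components": [2, 5, 10, 20],
    "svc__kernel": ["linear", "rbf", "poly"],
    "svc__C": [0.1, 1, 10],
}
search = GridSearchCV(pipe, param_grid, cv=3)   # 3-fold cross-validation as in the study
search.fit(X, y)
print(search.best_params_, search.best_score_)
```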
8

Bie, Yifeng. "Functional principal component analysis based machine learning algorithms for spectral analysis." Thesis, 2021. http://hdl.handle.net/1828/13372.

Abstract:
The ability to probe molecular electronic and vibrational structures gives rise to optical absorption spectroscopy, which is a credible tool used in molecular quantification and classification with high sensitivity, low limit of detection (LoD), and immunity to electromagnetic noise. Spectra are sensitive to slight analyte variations, so they are often used to identify a sample’s components. This thesis proposes several methods for quick classification and quantification of analytes based on their absorbance spectra. Functional principal component analysis (fPCA) is employed for feature extraction and dimension reduction. For 1,000-pixel spectra data, fPCA can capture the majority of the variance with as few output scores as the number of expected analytes. This reduces the amount of calculation required for the following machine learning algorithms. Further, the output scores are fed into XGBoost and logistic regression for classification, and into XGBoost and linear regression for quantification. Our models were tested on both synthesized datasets and an experimentally acquired dataset. Our models demonstrated similar performance compared to deep learning but with much faster processing speeds. For the synthesized 30 dB dataset, our model XGBoost with fPCA could reach a micro-averaged f1 score of 0.9551 ± 0.0008, while FNN-OT [1] could obtain 0.940 ± 0.001. fPCA helped the algorithms extract the feature of each analyte; furthermore, the output scores nearly had a linear relationship with their concentrations. It was much easier for the algorithm to find the mapping function between the inputs and the outputs with fPCA, which shortened the training and testing time.
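A hedged sketch of the score-then-classify idea: reduce each spectrum to as many PCA scores as there are expected analytes, then classify the scores. Logistic regression (one of the classifiers the thesis pairs with the scores) is used here; XGBoost could be substituted. The simulated Beer-Lambert mixtures and all parameter choices are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_samples, n_pixels, n_analytes = 300, 1000, 3
pure = rng.random((n_analytes, n_pixels))                 # placeholder pure-component spectra
conc = rng.random((n_samples, n_analytes))
spectra = conc @ pure + 0.01 * rng.standard_normal((n_samples, n_pixels))   # simulated mixtures
labels = conc.argmax(axis=1)                              # dominant analyte as the class label

# As in the thesis, a handful of (f)PCA scores captures most of the spectral variance
scores = PCA(n_components=n_analytes).fit_transform(spectra)

clf = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(clf, scores, labels, cv=5).mean())
```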
9

Le, Duc Huy, and 黎德輝. "Comparing principal component analysis and similarity feature-based selection and classification algorithms for face recognition." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/839cpa.

Abstract:
Master's thesis. National Kaohsiung University of Applied Sciences, International Master Program in Manufacturing and Management, 102. Purpose: Nowadays, face recognition has many applications in the real world and has attracted a lot of attention from researchers. A face recognition system involves three main steps: face detection, feature extraction, and face classification. Feature extraction and feature selection are two important steps in facial image recognition problems; their objective is to reduce classification errors, and finding an efficient algorithm for these steps is a challenging task. In this research, we compared two methods based on the global feature approach: principal component analysis (PCA) and similarity feature-based selection and classification (SFSC). Materials and methods: The LBP operator was first introduced by Ojala et al. Local Binary Patterns were first used to describe ordinary textures and, since a face can be seen as a composition of micro-textures depending on the local situation, the operator is also useful for face description. It assigns a label to each pixel of an image by thresholding a 3 × 3 neighborhood with the center pixel value and considering the result as a binary number. Principal Component Analysis (PCA), also known as the Karhunen-Loève expansion, is one of the most successful techniques used to solve compression and recognition problems. Its purpose is to represent a picture of a face in terms of an optimal coordinate system and to reduce the large dimensionality of the data space (observed variables) to the smaller intrinsic dimensionality of the feature space (independent variables) needed to describe the data economically. The similarity feature-based selection and classification (SFSC) algorithm was initially proposed by Tran et al. and has proven an efficient tool for improving the performance of descriptor-based face recognition systems. The purpose of the algorithm is to retain similar features of the training images in a class so as to minimize within-class differences but maximize between-class differences, and to classify based on this similarity feature set. Results: The experiments were conducted on four databases: the ORL Database of Faces, Faces94, the Japanese Female Facial Expression database, and the Extended Yale Face Database B. The results indicated that the SFSC method achieved a better recognition rate than the PCA method on the same databases and was an efficient tool for improving the performance of descriptor-based face recognition. Additionally, when combined with a descriptor such as LBP, the SFSC algorithm also yields a higher recognition rate than the traditional PCA algorithm. We present this comparison so that readers can choose the most suitable face recognition method. Conclusions: A comparative study demonstrated the superiority of the proposed algorithm for face recognition in comparison with the traditional PCA algorithm. The obtained results indicated that the SFSC method was better than the PCA method in terms of recognition rate and combination with descriptors. We recommend the use of a holistic technique involving the SFSC algorithm to obtain better results. Keywords: face recognition, local binary patterns, PCA, SFSC, similarity feature.
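The 3 × 3 LBP operator described in the Materials and methods section (threshold the eight neighbours against the centre pixel and read the bits as a binary label) can be sketched directly in numpy. This is only the basic operator; the SFSC selection and classification stages are not shown.

```python
import numpy as np

def lbp_3x3(image):
    """Basic LBP: label each interior pixel with the 8-bit code obtained by
    thresholding its 3x3 neighbourhood against the centre value."""
    img = np.asarray(image, dtype=float)
    center = img[1:-1, 1:-1]
    # Eight neighbours in a fixed clockwise order, each contributing one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    return codes

face = np.random.randint(0, 256, (8, 8))   # placeholder image patch
print(lbp_3x3(face))
```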

Books on the topic "Feature based Principal Component Analysis (FPCA)"

1

Hilgurt, S. Ya, and O. A. Chemerys. Reconfigurable signature-based information security tools of computer systems. PH “Akademperiodyka”, 2022. http://dx.doi.org/10.15407/akademperiodyka.458.297.

Abstract:
The book is devoted to the research and development of methods for combining computational structures for reconfigurable signature-based information protection tools for computer systems and networks in order to increase their efficiency. Network security tools based, among others, on AI approaches such as deep neural networks still suffer from a nonzero recognition error probability, despite the great progress shown in recent years. Even a low probability of such an error in a critical infrastructure can be disastrous. Therefore, signature-based recognition methods with their theoretically exact matching feature are still relevant when creating information security systems such as network intrusion detection systems, antivirus, anti-spam, and worm-containment systems. The real-time multi-pattern string matching task has been a major performance bottleneck in such systems. To speed up the recognition process, developers use a reconfigurable hardware platform based on FPGA devices. Such a platform provides almost software-level flexibility and near-ASIC performance. The most important component of a signature-based information security system in terms of efficiency is the recognition module, in which the multipattern matching task is directly solved. It must not only check each byte of input data at speeds of tens and hundreds of gigabits/sec against hundreds of thousands or even millions of patterns in the signature database, but also change its structure every time a new signature appears or the operating conditions of the protected system change. As a result of the analysis of numerous examples of the development of reconfigurable information security systems, the three most promising approaches to the construction of hardware circuits of recognition modules were identified, namely content-addressable memory based on digital comparators, Bloom filters and Aho–Corasick finite automata. A method for fast quantification of the components of the recognition module and of the entire system was proposed. The method makes it possible to exclude resource-intensive procedures for synthesizing digital circuits on FPGAs when building complex reconfigurable information security systems and their components. To improve the efficiency of the systems under study, structural-level combinational methods are proposed, which allow several matching schemes built on different approaches and their modifications to be combined into a single recognition device in such a way that their advantages are enhanced and disadvantages are eliminated. In order to achieve the maximum efficiency of combining methods, optimization methods are used. The methods of parallel combining, sequential cascading and vertical junction have been formulated and investigated. The principle of multi-level combining of combining methods is also considered and researched. Algorithms for the implementation of the proposed combining methods have been developed. Software has been created that allows experiments to be conducted with the developed methods and tools. Quantitative estimates are obtained for increasing the efficiency of constructing recognition modules as a result of using combination methods. The issue of optimization of reconfigurable devices presented in hardware description languages is considered. A modification of the method of affine transformations, which allows the parallelization of loops that cannot be optimized by other methods, was presented.

In order to facilitate the practical application of the developed methods and tools, a web service using high-performance computer technologies of grid and cloud computing was considered. The proposed methods to increase the efficiency of the matching procedure can also be used to solve important problems in other fields of science, such as data mining, the analysis of DNA molecules, etc. Keywords: information security, signature, multi-pattern matching, FPGA, structural combining, efficiency, optimization, hardware description language.
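Of the three recognition-module approaches identified in the book (comparator-based content-addressable memory, Bloom filters, and Aho–Corasick automata), the Bloom filter is the easiest to illustrate in software: k hash functions set and test k bits, giving fast membership checks with false positives but no false negatives. The sketch below is a plain Python illustration of that logic (hardware versions evaluate the hashes in parallel), with hash positions derived from blake2b; it is not the book's FPGA design.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash functions set/check k bits; membership
    queries may yield false positives but never false negatives."""
    def __init__(self, n_bits=8192, n_hashes=4):
        self.n_bits, self.n_hashes = n_bits, n_hashes
        self.bits = bytearray(n_bits // 8)

    def _positions(self, item: bytes):
        for seed in range(self.n_hashes):
            digest = hashlib.blake2b(item, salt=seed.to_bytes(8, 'little')).digest()
            yield int.from_bytes(digest[:8], 'little') % self.n_bits

    def add(self, item: bytes):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: bytes):
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

signatures = BloomFilter()
for pattern in [b"malicious-payload", b"worm-signature-001"]:   # placeholder signatures
    signatures.add(pattern)
print(b"malicious-payload" in signatures, b"benign-traffic" in signatures)
```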

Book chapters on the topic "Feature based Principal Component Analysis (FPCA)"

1

Ananda, Ridho, Dina Rachmawaty, Budi Pratikno, Odai Amer Hamid, and Maifuza Binti Mohd Amin. "Improved Clustering-Based Feature Selection Using Feature Extraction Based on Principal Component Analysis." In Communications in Computer and Information Science. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-81065-7_2.

2

Yao, Biyuan, Jianhua Yin, Hui Li, Hui Zhou, and Wei Wu. "Channel Feature Extraction and Modeling Based on Principal Component Analysis." In Communications in Computer and Information Science. Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-1026-3_15.

3

Taguchi, Y.-h. "Principal Component Analysis-Based Unsupervised Feature Extraction Applied to Single-Cell Gene Expression Analysis." In Intelligent Computing Theories and Application. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-95933-7_90.

4

Ibrahim, Abdelhameed, Aboul Ella Hassanien, and Siddhartha Bhattacharyya. "3D Object Recognition Based on Data Fusion at Feature Level via Principal Component Analysis." In Recent Trends in Signal and Image Processing. Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-8863-6_18.

5

Wu, Yun, Qiang Wang, and Yu Shi. "Research on Principal Component Feature Extraction Method Based on Improved Pearson Correlation Coefficient Analysis." In Advances in Intelligent Information Hiding and Multimedia Signal Processing. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-33-6757-9_11.

6

Bhavani, D., N. Madhavi, V. Sireesha, M. Swetha, and P. Kishore Kumar. "Principal Component Analysis-Based Pre-Trained Neural Network for Liver Cancer Data Feature Extraction." In Recent Developments in Microbiology, Biotechnology and Pharmaceutical Sciences. CRC Press, 2025. https://doi.org/10.1201/9781003618140-171.

7

Taguchi, Y. H. "Sincle Cell RNA-seq Analysis Using Tensor Decomposition and Principal Component Analysis Based Unsupervised Feature Extraction." In Studies in Big Data. Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9158-4_11.

8

Taguchi, Y. H. "Multiomics Data Analysis of Cancers Using Tensor Decomposition and Principal Component Analysis Based Unsupervised Feature Extraction." In Studies in Big Data. Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-9158-4_1.

9

Chen, Chunjun, and Linshuying Huang. "Construction of Bearing Performance Degradation Indicators for Adaptive Improvement of Principal Component Analysis." In Lecture Notes in Mechanical Engineering. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-97-7887-4_101.

Abstract:
Rolling bearing health assessment relies on the constructed degradation indicators. In order to improve the monotonicity and trend of the indicators, a fusion indicator construction method that accounts for feature burr removal is proposed. For the feature burrs appearing in the rolling bearing degradation performance characterization features that deviate from the expected degradation trend, a criterion-based adaptive burr removal strategy is used to detect and remove the burrs existing in the characterization features and improve the performance of the degradation indicators. Then, principal component analysis (PCA) is used to fuse six kinds of time domain features, to remove the redundant information in the original state feature space and maintain the global structure of the bearing degradation data, and the Exponentially Weighted Moving Average (EWMA) algorithm is further used to smooth the fusion indicators to obtain high-quality degradation indicators. The experimental results verify the superiority of the proposed method in terms of monotonicity and trend.
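A hedged sketch of the fuse-and-smooth part of this method: extract six time-domain features per vibration window, fuse them into a single indicator with PCA, and smooth it with an EWMA filter. The burr-detection criterion is omitted, the feature set and λ value are illustrative choices, and the vibration data are simulated.

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.decomposition import PCA

def time_domain_features(window):
    """Six common time-domain features of one vibration window."""
    rms = np.sqrt(np.mean(window ** 2))
    peak = np.max(np.abs(window))
    return np.array([rms, peak, peak / rms, np.var(window), kurtosis(window), skew(window)])

def ewma(x, lam=0.2):
    """Exponentially weighted moving average smoothing of the fused indicator."""
    s = np.empty_like(x)
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = lam * x[t] + (1 - lam) * s[t - 1]
    return s

rng = np.random.default_rng(8)
n_windows, window_len = 300, 2048
severity = np.linspace(1.0, 3.0, n_windows)                    # simulated degradation growth
windows = severity[:, None] * rng.standard_normal((n_windows, window_len))

features = np.array([time_domain_features(w) for w in windows])
features = (features - features.mean(axis=0)) / features.std(axis=0)

indicator = PCA(n_components=1).fit_transform(features).ravel()  # fused degradation indicator
print(ewma(indicator)[:5])
```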
10

Li, Jia wei, and Ming Sun. "A New Palm-Print Image Feature Extraction Method Based on Wavelet Transform and Principal Component Analysis." In Computer and Computing Technologies in Agriculture IV. Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-18369-0_5.


Conference papers on the topic "Feature based Principal Component Analysis (FPCA)"

1

Routh, Bikky, Vikram Kumawat, Arijit Guha, Siddhartha Mukhopadhyay, and Amit Patra. "State-of-Health Estimation of Li-ion Batteries using Multiple Linear Regression and Optimized Feature Extraction based on Principal Component Analysis." In 2024 IEEE International Conference on Prognostics and Health Management (ICPHM). IEEE, 2024. http://dx.doi.org/10.1109/icphm61352.2024.10626868.

2

Liu, Heyuan, Yi Zhao, and François Maréchal. "On the role of artificial intelligence in feature oriented multi-criteria decision analysis." In The 35th European Symposium on Computer Aided Process Engineering. PSE Press, 2025. https://doi.org/10.69997/sct.175488.

Abstract:
Balancing economic and environmental goals in industrial applications is critical amid challenges like climate change. Multi-objective optimization (MOO) and multi-criteria decision analysis (MCDA) are key tools for addressing conflicting objectives. MOO generates viable solutions, while MCDA selects the optimal option based on key performance indicators such as profitability, environmental impact, safety, and efficiency. However, large datasets pose a challenge in selecting the preferred solution during the MCDA process. This study introduces a novel machine learning-enhanced MCDA framework and applies the method to analyze decarbonization solutions for a European refinery. A stage-wise dimensionality reduction method, combining AutoEncoders and Principal Component Analysis (PCA), is applied to simplify high-dimensional datasets while preserving key spatial features. Geometric analysis techniques, including Intrinsic Shape Signatures (ISS), are employed to refine the identification of typical configurations for baseline evaluations. Once typical configurations are identified, Large Language Models (LLMs) are utilized to enhance decision-making by providing contextual problem explanations and generating weight proposals for the weighted sum method, ensuring alignment with decision criteria. This framework is designed to support stakeholders in making informed, transparent decisions in complex, uncertain environments.
3

Jun-Ling Xu, Bao-Wen Xu, Wei-Feng Zhang, and Zi-Feng Cui. "Principal Component Analysis based Feature Selection for clustering." In 2008 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2008. http://dx.doi.org/10.1109/icmlc.2008.4620449.

4

Li, Zhangyu, and Yihui Qiu. "Feature selection based on improved principal component analysis." In CACML 2023: 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning. ACM, 2023. http://dx.doi.org/10.1145/3590003.3590036.

5

Yaicharoen, Auapong, Kotaro Hashikura, Md Abdus Samad Kamal, and Kou Yamada. "Principal Component Analysis-based Customizable Feature Selection Algorithm." In 2022 19th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON). IEEE, 2022. http://dx.doi.org/10.1109/ecti-con54298.2022.9795390.

6

Huibo, Zhao, Pan Quan, and Cheng Yongmei. "Feature Extraction Based on Mixture Probabilistic Kernel Principal Component Analysis." In 2009 International Forum on Information Technology and Applications (IFITA). IEEE, 2009. http://dx.doi.org/10.1109/ifita.2009.11.

7

Hasan, Hasmarina, and Nooritawati Md Tahir. "Feature selection of breast cancer based on Principal Component Analysis." In its Applications (CSPA). IEEE, 2010. http://dx.doi.org/10.1109/cspa.2010.5545298.

8

Lhazmir, Safae, Ismail El Moudden, and Abdellatif Kobbane. "Feature extraction based on principal component analysis for text categorization." In 2017 International Conference on Performance Evaluation and Modeling in Wired and Wireless Networks (PEMWN). IEEE, 2017. http://dx.doi.org/10.23919/pemwn.2017.8308030.

9

Gulati, Vineeta, Neeraj Raheja, and Rajneesh Kumar Gujral. "Pica-A Hybrid Feature Extraction Technique Based on Principal Component Analysis and Independent Component Analysis." In 2022 IEEE 3rd Global Conference for Advancement in Technology (GCAT). IEEE, 2022. http://dx.doi.org/10.1109/gcat55367.2022.9971838.

10

Yu, REN, HUI Ji-zhuang, SHI Ze, ZHANG Ze-yu, Zhang Xu-hui, and Fan Hong-wei. "Feature Extraction of Loader Operation Based on Kernel Principal Component Analysis." In 2021 6th International Conference on Intelligent Computing and Signal Processing (ICSP). IEEE, 2021. http://dx.doi.org/10.1109/icsp51882.2021.9408684.
