Academic literature on the topic 'False Rejection Rate (FRR)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'False Rejection Rate (FRR).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "False Rejection Rate (FRR)"

1

Assia, Toumi, Toumi Tarek, Adjoudj Réda, and Louchene Ibtissem. "The importance of parameter configuration in ECG-based biometric recognition models." STUDIES IN ENGINEERING AND EXACT SCIENCES 5, no. 3 (2024): e12998. https://doi.org/10.54021/seesv5n3-137.

Full text
Abstract:
This study examines the efficacy of an ECG-based biometric recognition model, concentrating specifically on the influence of iteration configurations and the quantity of nodes within the neural network. Through a series of experiments, we evaluate the false rejection rate (FRR) and false acceptance rate (FAR) in order to assess the model's efficacy. The results indicate that augmenting the number of nodes in the neural network markedly reduces the false rejection rate, decreasing from 0.5 (with 5 nodes) to 0.15 (with 15 nodes), demonstrating an improved capacity of the model to differentiate between individual ECG signals. In contrast, the number of iterations, assessed from 100 to 800, had negligible impact on performance, with the FRR remaining stable and exhibiting only a slight reduction as iterations increased. These findings underline the relevance of parameter configuration in ECG-based biometric recognition models. While an increased number of nodes proved effective in decreasing false rejections, the impact of iteration count on model optimization needs further exploration. Future research should investigate parameter modifications to maximize the performance and reliability of ECG-based biometric systems.
APA, Harvard, Vancouver, ISO, and other styles
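Metrics like the FRR and FAR evaluated in this study can be computed directly from genuine and impostor match scores at a chosen decision threshold. A minimal Python sketch, using made-up scores rather than data from the cited paper:

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """Compute (FAR, FRR) at a decision threshold.

    A genuine score below the threshold counts as a false rejection;
    an impostor score at or above it counts as a false acceptance.
    """
    frr = sum(1 for s in genuine_scores if s < threshold) / len(genuine_scores)
    far = sum(1 for s in impostor_scores if s >= threshold) / len(impostor_scores)
    return far, frr

# Hypothetical similarity scores in [0, 1]
genuine = [0.9, 0.8, 0.75, 0.4, 0.85]   # same-person comparisons
impostor = [0.2, 0.3, 0.55, 0.1, 0.25]  # different-person comparisons
far, frr = far_frr(genuine, impostor, threshold=0.5)
print(far, frr)  # 0.2 0.2
```

Raising the threshold trades a lower FAR for a higher FRR; varying model capacity, as the cited experiments do, shifts the score distributions themselves.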
2

Sunardi, Ariyawan, and Rezky Mahardika. "Studi Perbandingan Metode Wavelet Dalam Speech Recognition Pada Sistem Akses Personel." ELKHA 11, no. 1 (2019): 1. http://dx.doi.org/10.26418/elkha.v11i1.29343.

Full text
Abstract:
Research on speech recognition for personnel identification continues to develop. In this study, we carry out a comparison of wavelet methods in speech recognition, using a system based on wavelets and neuro-fuzzy techniques. The parameters used were voice samples with a sampling frequency of 8000 Hz at 8 bits per sample, filtered with a wavelet High Pass Filter (HPF); the decomposition levels used the Daubechies, Symlet, and Coiflet wavelets. At one threshold value, the wavelet filter achieved 57.72% personnel identification, a False Rejection Rate (FRR) of 40%, and a running time of 1.97 seconds. A threshold value yielding 100% personnel identification, 0% FRR, and a running time of 5.43 seconds was obtained at decomposition level 5 with the db1 wavelet. Ranked from best, the wavelet types were Coiflet, Symlet, and Daubechies, since coif2 at level 2 gave 60.00% identification, 40.00% FRR, and a running time of 1.97 seconds.
APA, Harvard, Vancouver, ISO, and other styles
3

BIRGALE, LENINA, and MANESH KOKARE. "COMPARISON OF COLOR AND TEXTURE FOR IRIS RECOGNITION." International Journal of Pattern Recognition and Artificial Intelligence 26, no. 03 (2012): 1256007. http://dx.doi.org/10.1142/s0218001412560071.

Full text
Abstract:
This paper proposes the use of texture and color for iris recognition systems. It contributes to improved system accuracy with a reduced feature vector size of just 1 × 3 and a reduction of the false acceptance rate (FAR) and false rejection rate (FRR). It avoids the iris normalization process traditionally used in iris recognition systems. The proposed method is compared with existing methods. Experimental results indicate that the proposed method using only color achieves 99.9993 accuracy, 0.0160 FAR, and 0.0813 FRR. A computational time of 947.7 ms is achieved.
APA, Harvard, Vancouver, ISO, and other styles
4

Modupe, Agagu. "Development of an Improved Multi-filtering Matching Model for Fingerprint Recognition." Aug-Sept 2023, no. 35 (September 14, 2023): 24–38. http://dx.doi.org/10.55529/jipirs.35.24.38.

Full text
Abstract:
Over the years, much research has been done on fingerprint recognition, in which the hybrid matching algorithm is one of the most common techniques; although hybrid algorithms perform well, they still face the challenge of false minutiae. This study formulated, simulated, and evaluated a multi-filtering matching model for fingerprint recognition. The method employed a multi-filtering model built from image pre-processing, minutiae feature extraction, post-processing, and false-minutiae cancellation algorithms applied to the processed images. The model was simulated using MATLAB and fingerprint images from the Fingerprint Verification Competition (FVC) 2002 database. Its performance was evaluated using the False Acceptance Rate (FAR), False Rejection Rate (FRR), and Equal Error Rate (EER). The results showed that the false-minutiae cancellation algorithm considerably reduced the false minutiae points in the thinned images, which reduced false acceptances when two different images were tested and false rejections when two images of the same finger were tested. The match score was below the threshold value of 50 for the false acceptance rate and above the threshold value of 50 for the false rejection rate. An EER of 0.076 was recorded. The study concluded that there was a significant reduction in the false minutiae points present in the thinned images and that high fingerprint matching accuracy was achieved even when the datasets included poor-quality fingerprint images.
APA, Harvard, Vancouver, ISO, and other styles
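The Equal Error Rate reported in this study is the operating point where the FAR and FRR coincide. A simple threshold-sweep approximation in Python (the scores below are illustrative, not data from the cited work):

```python
def equal_error_rate(genuine, impostor):
    """Approximate the EER by trying every observed score as a candidate
    threshold and taking the point where |FAR - FRR| is smallest."""
    best_gap, best_eer, best_t = None, None, None
    for t in sorted(set(genuine) | set(impostor)):
        frr = sum(s < t for s in genuine) / len(genuine)
        far = sum(s >= t for s in impostor) / len(impostor)
        gap = abs(far - frr)
        if best_gap is None or gap < best_gap:
            best_gap, best_eer, best_t = gap, (far + frr) / 2, t
    return best_eer, best_t

# Hypothetical similarity scores
genuine = [0.9, 0.8, 0.7, 0.6, 0.4]
impostor = [0.5, 0.3, 0.2, 0.1, 0.45]
eer, t = equal_error_rate(genuine, impostor)
print(eer, t)  # 0.2 0.5
```

Production evaluations interpolate the FAR/FRR curves rather than sweeping raw scores, but the idea is the same: a single summary number for the whole trade-off curve.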
5

Akhundjanov, Umidjon. "VERIFICATION OF STATIC SIGNATURE USING CONVOLUTIONAL NEURAL NETWORK." Al-Farg'oniy avlodlari 1, no. 4 (2023): 70–74. https://doi.org/10.5281/zenodo.10333369.

Full text
Abstract:
This article is devoted to the development of a method for verifying handwritten signatures based on real samples obtained by scanning at a resolution of 800 dpi. The handwritten signature remains one of the most common identification methods, and addressing the problems of this promising area contributes to the search for a solution. One of the main stages of recognition is classification. This article describes the results of handwritten signature recognition using a convolutional neural network. A database of handwritten signatures of 10 people was used for the experiments. The signatures are digitized as color images with a resolution of 850×550 pixels. There are 10 genuine and 10 forged signatures for each person. Experiments were carried out with the signatures reduced to sizes of 128×128, 256×256, and 512×512 pixels. The study of this model has shown its effectiveness and practical suitability for use in biometric identification systems.
APA, Harvard, Vancouver, ISO, and other styles
6

CARDOT, HUBERT, MARINETTE REVENU, BERNARD VICTORRI, and MARIE-JOSÈPHE REVILLET. "A STATIC SIGNATURE VERIFICATION SYSTEM BASED ON A COOPERATING NEURAL NETWORKS ARCHITECTURE." International Journal of Pattern Recognition and Artificial Intelligence 08, no. 03 (1994): 679–92. http://dx.doi.org/10.1142/s021800149400036x.

Full text
Abstract:
We are applying neural networks to the problem of handwritten signature verification. Our system is working on checks, so we can only use the static information (the image). This static information is used in three representations: geometrical parameters, outline and image. Our system is composed of several neural networks which cooperate together during the learning and decision phases. The performances in generalization, obtained with a large-scale database of 6000 signatures from real checks on random forgeries, are False Acceptance Rate (FAR)=2% and False Rejection Rate (FRR)=4%.
APA, Harvard, Vancouver, ISO, and other styles
7

Jain, Charu, Priti Singh, and Preeti Rana. "Offline Signature Verification System with Gaussian Mixture Models (GMM)." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 10, no. 6 (2013): 1700–1705. http://dx.doi.org/10.24297/ijct.v10i6.3196.

Full text
Abstract:
Gaussian Mixture Models (GMMs) have been proposed for off-line signature verification. The individual Gaussian components are shown to represent global features such as skewness, kurtosis, etc., that characterize various aspects of a signature and are effective for modeling its specificity. The learning phase uses the GMM technique to build a reference model for each signature sample of a particular user. The verification phase uses three layers of statistical techniques: the first layer computes a GMM-based log-likelihood probability match score; the second layer maps this score into soft boundary ranges of acceptance or rejection through z-score analysis and a normalization function; and the third layer applies a threshold to arrive at the final decision of accepting or rejecting a given signature sample. The focus of this work is on faster detection of authenticated signatures, as no vector analysis is done in the GMM. The experimental results show the new features to be more robust than related features used in earlier systems. The FAR (False Acceptance Rate) and FRR (False Rejection Rate) for the genuine samples are 0.15 and 0.19, respectively.
APA, Harvard, Vancouver, ISO, and other styles
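The z-score mapping described in the second verification layer can be sketched as follows. The band limits, reference scores, and three-way decision are illustrative assumptions for the sake of the example, not values or logic taken from the paper:

```python
import statistics

def zscore_decision(score, reference_scores, lo=-1.0, hi=1.0):
    """Map a raw log-likelihood match score into accept/reject/uncertain
    bands by z-score normalization against enrollment-time reference scores.

    lo/hi are the soft boundary limits (hypothetical values here).
    """
    mu = statistics.mean(reference_scores)
    sigma = statistics.stdev(reference_scores)
    z = (score - mu) / sigma
    if z >= hi:
        return "accept"
    if z <= lo:
        return "reject"
    return "uncertain"

reference = [10, 12, 11, 13, 14]           # hypothetical enrollment scores
print(zscore_decision(15, reference))      # accept
print(zscore_decision(9, reference))       # reject
print(zscore_decision(12.5, reference))    # uncertain
```

Scores that land in the uncertain band would then fall through to the third layer's hard threshold.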
8

Ramadhanti, Fitri, and Bahrianto Prakoso. "Key Generation Based on Face Image Processing Using Kanade-Lucas-Tomasi Algorithm." Info Kripto 18, no. 1 (2024): 1–8. http://dx.doi.org/10.56706/ik.v18i1.84.

Full text
Abstract:
Biometric systems offer the advantage of uniqueness, differing for every human being. Biometrics are also permanent, difficult to forge, difficult to hack or steal, and provide non-repudiation. Given these many advantages, this research discusses key generation based on facial biometrics. For a random number to be considered a good cryptographic key, it must pass randomness testing. This research uses the Kanade-Lucas-Tomasi (KLT) algorithm to track unique facial points and the Viola-Jones algorithm to detect faces in images. The SHA256 hash output of the facial-point extraction process was tested as a key using the NIST Statistical Test Suite (NIST STS) to determine whether the generated key can be used as a cryptographic key. The proposed system was simulated using MATLAB version 2017a. The randomness tests showed that the output key satisfied only 7 of the 15 NIST STS tests, so this method does not pass randomness testing. In addition, histogram testing and False Acceptance Rate (FAR) and False Rejection Rate (FRR) testing were performed. The histogram test showed a dominant value, indicating that the image pixels are not evenly distributed. The FAR and FRR tests showed that the FAR percentage decreases as the threshold value increases while the FRR increases, with the Equal Error Rate (EER) at T=4.
APA, Harvard, Vancouver, ISO, and other styles
9

CHIU, CHUANG-CHIEN, CHOU-MIN CHUANG, and CHIH-YU HSU. "DISCRETE WAVELET TRANSFORM APPLIED ON PERSONAL IDENTITY VERIFICATION WITH ECG SIGNAL." International Journal of Wavelets, Multiresolution and Information Processing 07, no. 03 (2009): 341–55. http://dx.doi.org/10.1142/s0219691309002957.

Full text
Abstract:
The main purpose of this study is to present a novel personal authentication approach with the electrocardiogram (ECG) signal. The electrocardiogram is a recording of the electrical activity of the heart and the recorded signals can be used for individual verification because ECG signals of one person are never the same as those of others. The discrete wavelet transform was applied for extracting features that are the wavelet coefficients derived from digitized signals sampled from one-lead ECG signal. By the proposed approach applied on 35 normal subjects and 10 arrhythmia patients, the verification rate was 100% for normal subjects and 81% for arrhythmia patients. Furthermore, the performance of the ECG verification system was evaluated by the false acceptance rate (FAR) and false rejection rate (FRR). The FAR was 0.83% and FRR was 0.86% for a database containing only 35 normal subjects. When 10 arrhythmia patients were added into the database, FAR was 12.50% and FRR was 5.11%. The experimental results demonstrated that the proposed approach worked well for normal subjects. For this reason, it can be concluded that ECG used as a biometric measure for personal identity verification is feasible.
APA, Harvard, Vancouver, ISO, and other styles
10

Hussian, Abdulrahman, Foud Murshed, Mohammed Nasser Alandoli, and Ghalib Aljafari. "A Hybrid Deep Learning Approach for Secure Biometric Authentication Using Fingerprint Data." Computers 14, no. 5 (2025): 178. https://doi.org/10.3390/computers14050178.

Full text
Abstract:
Despite significant advancements in fingerprint-based authentication, existing models still suffer from challenges such as high false acceptance and rejection rates, computational inefficiency, and vulnerability to spoofing attacks. Addressing these limitations is crucial for ensuring reliable biometric security in real-world applications, including law enforcement, financial transactions, and border security. This study proposes a hybrid deep learning approach that integrates Convolutional Neural Networks (CNNs) with Long Short-Term Memory (LSTM) networks to enhance fingerprint authentication accuracy and robustness. The CNN component efficiently extracts intricate fingerprint patterns, while the LSTM module captures sequential dependencies to refine feature representation. The proposed model achieves a classification accuracy of 99.42%, reducing the false acceptance rate (FAR) to 0.31% and the false rejection rate (FRR) to 0.27%, demonstrating a 12% improvement over traditional CNN-based models. Additionally, the optimized architecture reduces computational overheads, ensuring faster processing suitable for real-time authentication systems. These findings highlight the superiority of hybrid deep learning techniques in biometric security by providing a quantifiable enhancement in both accuracy and efficiency. This research contributes to the advancement of secure, adaptive, and high-performance fingerprint authentication systems, bridging the gap between theoretical advancements and real-world applications.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "False Rejection Rate (FRR)"

1

Benditkis, Julia. "Martingale Methods for Control of False Discovery Rate and Expected Number of False Rejections." Doctoral thesis, supervised by Arnold Janssen and Helmut Finner. Düsseldorf: Universitäts- und Landesbibliothek der Heinrich-Heine-Universität Düsseldorf, 2015. http://d-nb.info/1077295170/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "False Rejection Rate (FRR)"

1

Costa, Diogo, Eugénio M. Rocha, and Pedro Ramalho. "Minimizing False-Rejection Rates in Gas Leak Testing Using an Ensemble Multiclass Classifier for Unbalanced Data." In Communications in Computer and Information Science. Springer Nature Switzerland, 2022. http://dx.doi.org/10.1007/978-3-031-20319-0_32.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Katsriku, Ferdinand Apietu, Samuel Darko Abankwah, Edward Danso Ansong, and Winfred Yaokumah. "Enhancing Textual Password Authentication Using Typing Rhythm." In Advances in Information Security, Privacy, and Ethics. IGI Global, 2025. https://doi.org/10.4018/979-8-3693-8014-7.ch011.

Full text
Abstract:
This chapter aims to enhance the security of textual passwords by adding a new layer that involves an individual's typing rhythm without requiring additional devices. It discusses authentication methods and develops a textual password authentication system, which takes advantage of the fact that every user has a distinct way of typing. Twenty participants, including novice and expert users, assessed the proposed system. The measuring metrics used were the False Acceptance Rate (FAR) and False Rejection Rate (FRR). The flight time and typing rhythm were the biometric identification template. The results indicated low false rejection and acceptance rates, with the initial session registering 0.0 FRR and 0.1 FAR. Two weeks later, the second session recorded 0.25 FRR and 0.0 FAR. Using this model, users do not need to select a complex password they might forget. Instead, they can utilize a good rhythm different from their natural typing rhythm, making it challenging to guess.
APA, Harvard, Vancouver, ISO, and other styles
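Flight time, the keystroke feature used in this chapter, is the gap between releasing one key and pressing the next. A small sketch, assuming each keystroke is logged as a (press, release) millisecond pair; this event layout is an assumption for illustration, not the chapter's exact scheme:

```python
def flight_times(key_events):
    """Flight times between consecutive keystrokes.

    key_events: list of (press_ms, release_ms) tuples in typing order.
    Each flight time is the next key's press minus the previous key's release.
    """
    return [nxt_press - prev_release
            for (_, prev_release), (nxt_press, _) in zip(key_events, key_events[1:])]

# Hypothetical timestamps for typing three characters
events = [(0, 90), (200, 280), (450, 520)]
print(flight_times(events))  # [110, 170]
```

A user's template can then store the typical values and spread of these gaps, and a login attempt whose flight-time profile deviates too far is rejected, which is where FRR/FAR figures like those reported above come from.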
3

Sudha, L. R., and R. Bhavani. "Gait Based Biometric Authentication System with Reduced Search Space." In Emerging Technologies in Intelligent Applications for Image and Video Processing. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9685-3.ch012.

Full text
Abstract:
Deployment of human gait in developing new tools for security enhancement has received growing attention in the modern era. Since the efficiency of any algorithm depends on the size of the search space, the aim is to propose a novel approach to reduce the search space. To achieve this, the database is split into two based on gender, and the search is restricted to the identified gender database. Highly discriminant gait features are then selected by a forward sequential feature selection algorithm in the confined space. Experimental results evaluated on the benchmark CASIA B gait dataset with the newly proposed combined kNN-SVM classifier show a lower False Acceptance Rate (FAR) and a lower False Rejection Rate (FRR).
APA, Harvard, Vancouver, ISO, and other styles
4

Luna, Francisco, Julio César Martínez Romo, Miguel Mora-González, Evelia Martínez-Cano, and Valentín López Rivas. "Handwritten Signature Verification Using Multi Objective Optimization with Genetic Algorithms in a Forensic Architecture." In Logistics Management and Optimization through Hybrid Artificial Intelligence Systems. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-4666-0297-7.ch006.

Full text
Abstract:
This chapter presents the use of multi-objective optimization for on-line automatic verification of handwritten signatures. Functions of time and space of the pen's position on the paper are used as discriminating features of each signer; these functions feed directly into a multi-objective optimization task in order to obtain low values of the false positive indicator (FAR, False Acceptance Rate) and the false negative indicator (FRR, False Rejection Rate). Genetic algorithms are used to create a signer's model that optimally characterizes him, thus rejecting skilled forgeries and recognizing genuine signatures with large variation with respect to the training set.
APA, Harvard, Vancouver, ISO, and other styles
5

Favorskaya, Margarita, and Roman Baranov. "The Off-line Signature Verification Based on Structural Similarity." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2014. https://doi.org/10.3233/978-1-61499-405-3-421.

Full text
Abstract:
Off-line signature verification is in demand in business and marketing, bank transactions, security control, and document authentication, and it is a more difficult process than on-line verification. In this paper, an extended feature set including global, local, and special features for simple and cursive types of signatures is proposed. The global features are required to create a decision tree, which limits the field of the search. Pre-processing procedures remove noise and artifacts of scanning and luminance. The Structural Similarity Index Measure (SSIM) and the detection of special points are based on a skeleton signature representation. SSIM shows good verification results and demonstrates low values of the False Rejection Rate (FRR) and False Acceptance Rate (FAR) on the GPDS-100 signature corpus.
APA, Harvard, Vancouver, ISO, and other styles
6

Baragi, Shivakumar, and Nalini C. Iyer. "Face Recognition using Fast Fourier Transform." In Research Advances in the Integration of Big Data and Smart Computing. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-8737-0.ch017.

Full text
Abstract:
Biometrics refers to metrics related to human characteristics and traits. Face recognition is the process of identifying a person by their facial image. It has been an active area of research for several decades but remains a challenging problem because of the complexity of the human face. The objective is to authenticate a person while keeping the FAR and FRR very low. This project introduces a new approach to face recognition using the FFT algorithm. The database that contains the images is named the train database, and each test image stored in the test database is compared against it. For further processing, RGB data is converted into grayscale, which reduces the matrix dimension. The FFT is applied to the entire database and the mean value of the images is computed; the same is done for the test database. Face recognition is then decided from the threshold value of the test image. Performance evaluation of the biometrics is done for normal, skin-color, ageing, and blurred images using the False Acceptance Rate (FAR), False Rejection Rate (FRR), and Equal Error Rate (EER), and the accuracy on the different images is also calculated.
APA, Harvard, Vancouver, ISO, and other styles
7

Raj, Alex Noel Joseph, and Vijayalakshmi G. V. Mahesh. "Zernike-Moments-Based Shape Descriptors for Pattern Recognition and Classification Applications." In Advanced Image Processing Techniques and Applications. IGI Global, 2017. http://dx.doi.org/10.4018/978-1-5225-2053-5.ch004.

Full text
Abstract:
This chapter presents an analysis of Zernike Moments, from the class of orthogonal moments, which are invariant to rotation, translation, and scaling. The chapter initially reviews Zernike Moments as 1D, 2D, and 3D based on their dimension, and later investigates the construction and characteristics of Zernike Moments and their invariants, which can be used as shape descriptors to capture global and detailed information from the data, based on their order, to provide outstanding performance in various image processing applications. The chapter also presents an application of 2D Zernike Moment features for plant species recognition and classification using supervised learning techniques. The performance of the learned models was evaluated with the True Positive Rate, True Rejection Ratio, False Acceptance Rate, False Rejection Ratio, and Receiver Operating Characteristics. The simulation results indicate that the Zernike moments with their invariants were successful in recognising and classifying the images with the least FAR and a significant TRR.
APA, Harvard, Vancouver, ISO, and other styles
8

Suganthi Devi, S. "Automatic Biometric System for Finger Knuckle Using Sparse Encoder Approaches." In Advances in Parallel Computing Technologies and Applications. IOS Press, 2021. http://dx.doi.org/10.3233/apc210155.

Full text
Abstract:
Biometric recognition is one of the most effective authentication techniques and is utilized in various applications for individual identification. During verification and authentication, different biometric features such as the signature, ear, iris, face, palm, and finger knuckle are used. The finger knuckle was chosen for the biometric process in this work owing to the easy acceptance of the palm surface and its fine textures and stable feature characteristics. First, the finger biometric features are collected from the PolyU finger knuckle database. The noise present in the images is then eliminated using a weighted median filter, and the knuckle region is located with the help of a variational approach. Next, key point descriptors are extracted using a sparse autoencoder approach. Finally, the specific features are trained using compositional networks, and feature matching is performed with the Chebyshev distance. The matching process authenticates whether the user is an authorized person or not. The efficiency of the system is evaluated using MATLAB-based experimental results such as the false acceptance rate, equal error rate, and false rejection rate.
APA, Harvard, Vancouver, ISO, and other styles
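The Chebyshev distance used here for feature matching is the largest per-dimension difference between two vectors. A minimal sketch with hypothetical knuckle descriptors (the real system compares sparse-autoencoder features, not these toy values):

```python
def chebyshev(a, b):
    """Chebyshev (L-infinity) distance: the largest per-dimension gap
    between two feature vectors. A small distance means a close match."""
    return max(abs(x - y) for x, y in zip(a, b))

# Hypothetical 4-dimensional descriptors
probe = [0.12, 0.40, 0.33, 0.08]
template = [0.10, 0.45, 0.30, 0.07]
print(chebyshev(probe, template))
```

A verification decision then compares this distance against a threshold, with the FAR/FRR trade-off set by where that threshold is placed.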
9

Liu, Han, Shuai Li, Yifan Song, Qidi Jiao, and Xiaorui Cui. "Research on Intelligent Identification and Traceability Techniques for Cyber Attacks." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2025. https://doi.org/10.3233/faia250365.

Full text
Abstract:
To address the false alarm rate and identification efficiency problems of traditional network attack behavior identification methods, research on intelligent identification and traceability technology for network attack behavior is proposed. First, we analyze the principles of optical communication network attacks and collect data that describe the types of attack behavior. We then adopt a data-driven algorithm to design a classifier for optical communication network attack pattern recognition and classify the attack behaviors with it. Finally, we conduct control experiments on attack pattern recognition against classical methods. The experimental results show that the mean correct recognition rates of the multi-step attack detection method based on network communication anomaly identification and the complex multi-step attack recognition method for wireless intrusion detection systems are 84.96% and 87.96%, with rejection rates of 8.00% and 7.05% and false recognition rates of 7.04% and 6.19%, respectively. Conclusion: the method in this paper offers good real-time performance for optical communication network attack pattern recognition and can effectively intercept various optical communication network attacks online, giving it very wide application prospects.
APA, Harvard, Vancouver, ISO, and other styles
10

Amine Hmani, Mohamed, Dijana Petrovska-Delacretaz, and Bernadette Dorizzi. "Revocable Crypto-Biometric Key Regeneration Using Face Biometrics, Fuzzy Commitment, and a Cohort Bit Selection Process." In Biometrics - Recent Advances and New Perspectives [Working Title]. IntechOpen, 2024. http://dx.doi.org/10.5772/intechopen.1003710.

Full text
Abstract:
Security is a main concern nowadays. Cryptography and biometrics are the main pillars of security. Using biometrics to obtain cryptographic keys offers distinct advantages over traditional methods. Classical systems rely on passwords or tokens assigned by administrators, which can be stolen or shared, making them insufficient for identity verification. In contrast, biometric-based keys provide a better solution for proving a user’s identity. This chapter proposes an approach to regenerate crypto-biometric keys with high entropy, ensuring high security using facial biometrics. The keys are regenerated using a fuzzy commitment scheme, utilizing BCH error-correcting codes, and have a high entropy of 528 bits. To achieve this, we use an intra-inter variance strategy for the process of bit selection from our facial deep binary embeddings. The system is evaluated on the MOBIO dataset and gives 0% FAR and less than 1% FRR. The proposed crypto-biometric keys are resistant to quantum computing algorithms, provide non-repudiation, and are revocable and convenient with low false rejection rates.
APA, Harvard, Vancouver, ISO, and other styles
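The fuzzy commitment scheme at the core of this chapter binds a key to biometric bits with an XOR. A stripped-down sketch follows; the BCH error-correction layer, which lets a merely *close* probe biometric recover the key, is omitted, so this toy version only succeeds with an identical probe:

```python
import hashlib

def commit(key: bytes, biometric: bytes):
    """Fuzzy-commitment sketch: the helper data is key XOR biometric.
    A real system wraps the key in a BCH codeword first, so small bit
    differences in the probe biometric are corrected away (omitted here).
    The SHA-256 hash of the key lets us verify a later recovery."""
    helper = bytes(k ^ b for k, b in zip(key, biometric))
    return helper, hashlib.sha256(key).hexdigest()

def recover(helper: bytes, biometric: bytes) -> bytes:
    """XOR the helper data with a probe biometric to reconstruct the key."""
    return bytes(h ^ b for h, b in zip(helper, biometric))

key = bytes([0x4F, 0xA2, 0x13, 0x77])   # hypothetical 4-byte key
bio = bytes([0x1B, 0x5C, 0xE0, 0x09])   # hypothetical biometric bits
helper, check = commit(key, bio)
recovered = recover(helper, bio)
print(recovered == key)  # True
```

Revocability comes from discarding the helper data and committing a fresh key, since the helper alone reveals neither the key nor the biometric.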

Conference papers on the topic "False Rejection Rate (FRR)"

1

Jagtap, Shilpa, J. L. Mudegaonkar, Sanjay Patil, and Dinesh Bhoyar. "A Novel Approach for Diagnosis of Diabetes Using Iris Image Processing Technique and Evaluation Parameters." In National Conference on Relevance of Engineering and Science for Environment and Society. AIJR Publisher, 2021. http://dx.doi.org/10.21467/proceedings.118.37.

Full text
Abstract:
This paper deals with the study of an identification and verification approach for diabetes based on the human iris pattern. In the pre-processing stage of this work, a color-based region of interest (ROI) concept is used for iris localization, Daugman's rubber sheet model is used for normalization, and the Circular Hough Transform can be used for pupil and boundary detection. To extract features, the Gabor filter, Histogram of Oriented Gradients, and five-level decomposition of wavelet transforms such as the Haar, db2, db4, bior2.2, and bior6.8 wavelets can be used. A binary coding scheme binarizes the feature vector coefficients, and classifiers such as the Hamming distance, Support Vector Machine (SVM), Adaptive Boosting (AdaBoost), Neural Networks (NN), Random Forest (RF), and Linear Discriminant Analysis (LDA) with a shrinkage parameter can be used for template matching. Performance parameters such as computational time, Hamming distance variation, False Acceptance Rate (FAR), False Rejection Rate (FRR), accuracy, and match ratio can be calculated for comparison purposes.
APA, Harvard, Vancouver, ISO, and other styles
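Template matching with the Hamming distance, listed among the classifiers above, compares two binary iris codes bit by bit. A minimal sketch with hypothetical 8-bit codes (real iris codes run to thousands of bits):

```python
def hamming_distance(code_a, code_b):
    """Normalized Hamming distance between two equal-length binary codes:
    the fraction of bit positions that disagree (0.0 = identical templates)."""
    if len(code_a) != len(code_b):
        raise ValueError("codes must have equal length")
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

enrolled = [1, 0, 1, 1, 0, 0, 1, 0]
probe    = [1, 0, 0, 1, 0, 1, 1, 0]
print(hamming_distance(enrolled, probe))  # 0.25
```

A match is declared when the distance falls below a threshold; tightening the threshold lowers the FAR at the cost of a higher FRR.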
2

Shaik Riyaz, Noor Basha, and V. Parthipan. "A Novel Prediction Analysing the False Acceptance Rate and False Rejection Rate using CNN Model to Improve the Accuracy for Iris Recognition System for Biometric Security in Clouds Comparing with Traditional Inception Model." In 2022 4th International Conference on Advances in Computing, Communication Control and Networking (ICAC3N). IEEE, 2022. http://dx.doi.org/10.1109/icac3n56670.2022.10074026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ceschini, Giuseppe Fabio, Nicolò Gatta, Mauro Venturini, Thomas Hubauer, and Alin Murarasu. "Optimization of Statistical Methodologies for Anomaly Detection in Gas Turbine Dynamic Time Series." In ASME Turbo Expo 2017: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/gt2017-63409.

Full text
Abstract:
Statistical parametric methodologies are widely employed in the analysis of time series of gas turbine sensor readings. These methodologies identify outliers as a consequence of excessive deviation from a statistically-based model, derived from available observations. Among parametric techniques, the k-σ methodology demonstrates its effectiveness in the analysis of stationary time series. Furthermore, the simplicity and the clarity of this approach justify its direct application to industry. On the other hand, the k-σ methodology usually proves to be unable to adapt to dynamic time series, since it identifies observations in a transient as outliers. As this limitation is caused by the nature of the methodology itself, two improved approaches are considered in this paper in addition to the standard k-σ methodology. The two proposed methodologies maintain the same rejection rule of the standard k-σ methodology, but differ in the portions of the time series from which statistical parameters (mean and standard deviation) are inferred. The first approach performs statistical inference by considering all observations prior to the current one, which are assumed reliable, plus a forward window containing a specified number of future observations. The second approach proposed in this paper is based on a moving window scheme. Simulated data are used to tune the parameters of the proposed improved methodologies and to prove their effectiveness in adapting to dynamic time series. The moving window approach is found to be the best on simulated data in terms of True Positive Rate (TPR), False Negative Rate (FNR) and False Positive Rate (FPR). Therefore, the performance of the moving window approach is further assessed towards both different simulated scenarios and field data taken on a gas turbine.
4

de Raad, Jan A. "Reliability of Mechanised UT Systems to Inspect Girth Welds During Pipeline Construction." In 1998 2nd International Pipeline Conference. American Society of Mechanical Engineers, 1998. http://dx.doi.org/10.1115/ipc1998-2073.

Abstract:
As an alternative to radiography, a field-proven mechanised ultrasonic inspection system is discussed. Called Rotoscan, this system has been developed for the inspection of girth welds during construction of long-distance pipelines, both on- and offshore. It is characterized by high inspection speed and instant recording of results. Unlike prevailing radiography, it provides immediate feedback to the welders. Recent technical improvements in flaw sizing and recording have allowed the application of rejection/acceptance criteria for weld defects based on fracture mechanics principles. The development and actual use of such modern acceptance criteria, particularly in Canada, supported the introduction of mechanised ultrasonic inspection. Worldwide applications proved that, contrary to expectations, ultrasonic inspection does not lead to higher weld repair rates than radiography does. Between early 1989 and now, over 5,000 km of pipeline (300,000 welds) were inspected with Rotoscan, proving its reliability. The introduction of colour-enhanced transit-distance “C-scan mapping”, which produces a coherent picture based on the signal’s transit distance, enabled the system to cope with most existing ultrasonic procedures and acceptance criteria because of its capability to detect and quantify volumetric defects. Moreover, the integrated simultaneous Time Of Flight Diffraction (TOFD) function enables through-thickness sizing of defects. The present system is capable of achieving a high Probability Of Detection (POD) together with a low False Call Rate (FCR). In the meantime, Rotoscan has been qualified in various countries, for different customers, and for a variety of weld processes, pipe diameters, and wall thicknesses. Because of its features, the now mature system has also demonstrated its capabilities for use on lay barges as an alternative to high-speed radiography.
5

Mingote, Victoria, Antonio Miguel, Dayana Ribas, Alfonso Ortega, and Eduardo Lleida. "Optimization of False Acceptance/Rejection Rates and Decision Threshold for End-to-End Text-Dependent Speaker Verification Systems." In Interspeech 2019. ISCA, 2019. http://dx.doi.org/10.21437/interspeech.2019-2550.

6

Cullen, John J., and Hugh L. MacIntyre. "The case for using the Most Probable Number (MPN) method in ballast water management system type approval testing." In IMarEST Ballast Water Technology Conference. IMarEST, 2017. http://dx.doi.org/10.24868/bwtc6.2017.010.

Abstract:
Recently, the U.S. Coast Guard (USCG) rejected the Serial Dilution Culture-Most Probable Number (SDC-MPN) method for enumerating viable phytoplankton cells in ballast water discharge as an alternative to their prescribed method, the Environmental Technology Verification (ETV) Protocol. This method distinguishes living from dead organisms using vital stains and motility. Succinctly, the USCG position has been that the ETV Protocol is a reliable and repeatable efficacy test and the SDC-MPN method is not. New evidence and an expanded consideration of published research support a fundamentally different assessment. A peer-reviewed quantitative evaluation of ETV vital stains for 24 species of phytoplankton has conclusively established that the ETV Protocol, even with observations of motility, is not reliable for all species. In contrast, published results suggest that errors in the method were small for the limited number of locations studied to date. It is possible that the communities tested in these studies were dominated by species that can be classified accurately using vital stains. Even so, it must be acknowledged that the reliability and accuracy of vital stains are untested for thousands of species of phytoplankton. Introduced in 1951, the SDC-MPN method for phytoplankton is an established approach for use with multi-species communities. As applied to ballast water testing, SDC-MPN is much less vulnerable to methodological uncertainties than has been assumed. Notably, all species of phytoplankton need not be cultured in the conventional sense. Rather, a single viable cell in a dilution tube need grow only enough to be detected, a requirement known to have been met by otherwise uncultured species. Further, delayed restoration of viability after treatment with ultraviolet radiation (UV) is not a problem: organisms repair UV damage quickly or not at all, consistent with the assumptions of the test.
Two critical methodological failures could compromise protection of the environment in ballast water testing: living organisms that do not stain or move, and viable organisms that do not grow to detection in the MPN cultures. These can be assessed with complementary measurements, but importantly, the relative protection of each method can be evaluated by comparing counts of living cells from the ETV Protocol with counts of viable cells from SDC-MPN in untreated samples. Available evidence provides no basis for concluding that either method is consistently less protective. However, as applied in ballast water testing, the statistical estimate of MPN is less precise. On this basis, SDC-MPN is worse for a single test. But, counter-intuitively, it is more protective of the environment when five consecutive tests must be passed for type approval, because the likelihood of one false rejection out of five tests is higher and five false passes would be exceedingly rare. Addressing only the science, we conclude that both the ETV Protocol and the SDC-MPN method, though imperfect, are currently appropriate for assessing the efficacy of ballast water management systems in a type-approval testing regime. In closing, we show proof of concept for a rapid assay of viability, benchmarked against SDC-MPN, that could be well suited for routine assessment of treatment system performance.
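The counter-intuitive five-tests argument in this abstract is a straightforward probability calculation. The sketch below illustrates it with a hypothetical 10% per-test error rate in each direction; the specific numbers are assumptions for illustration, not values from the paper:

```python
def p_any_false_rejection(p_false_reject, n_tests=5):
    """Chance a compliant system is falsely rejected in at least one of
    n independent type-approval tests."""
    return 1.0 - (1.0 - p_false_reject) ** n_tests

def p_all_false_passes(p_false_pass, n_tests=5):
    """Chance a non-compliant system falsely passes all n tests."""
    return p_false_pass ** n_tests

# With a hypothetical 10% per-test error either way, at least one false
# rejection in the series is fairly likely (about 41%), while five false
# passes in a row are exceedingly rare (0.001%).
```

This is why a method with a higher single-test false-rejection probability can still be more protective under a five-consecutive-passes regime: false rejections compound toward near-certainty, while false passes compound toward near-impossibility.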
7

Araujo, Maria S., Heath A. Spidle, Shane P. Siebenaler, Samantha G. Blaisdell, and David W. Vickers. "Application of Machine Learning to Distributed Temperature Sensing (DTS) Systems." In 2018 12th International Pipeline Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/ipc2018-78640.

Abstract:
The timely detection of small leaks from liquid pipelines poses a significant challenge for pipeline operations. One technology considered for continual monitoring is distributed temperature sensing (DTS), which utilizes a fiber-optic cable to provide distributed temperature measurements along a pipeline segment. This measurement technique allows for a high accuracy of temperature determination over long distances. Unexpected deviations in temperature at any given location can indicate various physical changes in the environment, including contact with a heated hydrocarbon due to a pipeline leak. The signals stemming from pipeline leaks may not be significantly greater than the noise in the DTS measurements, so care must be taken to configure the system in a manner that can detect small leaks while rejecting non-leak temperature anomalies. There are many factors that influence the frequency and intensity of the backscattered optical signal, which can result in noise in the fine-grained temperature sensing data. Thus, the DTS system must be tuned to the nominal temperature profile along the pipe segment. This customization allows for significant sensitivity and can utilize different leak detection thresholds at various locations based on normal temperature patterns. However, this segment-specific tuning can require a significant amount of resources and time. Additionally, this configuration exercise may have to be repeated as pipeline operating conditions change over time. Thus, there is a significant need and interest in advancing existing DTS processing techniques to enable the detection of leaks that today go undetected by DTS, either because their signal response is too close to the noise floor or because significant resources are required to achieve positive results. This paper discusses recent work focused on using machine learning (ML) techniques to detect leak signatures.
Initial proof-of-concept results provide a more robust methodology for detecting leaks and allow for the detection of smaller leaks than are currently detectable by typical DTS systems, with low false alarm rates. A key advantage of ML approaches is that the system can “learn” about a given pipeline on its own, without the need to expend resources on pipeline segment-specific tuning. The potential for a self-taught system is a powerful concept, and this paper discusses some key initial findings from applying ML-based techniques to optimize the leak detection capabilities of an existing DTS system.
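One hypothetical reading of the “self-taught” idea is a system that fits location-specific nominal statistics from historical DTS scans rather than relying on manual segment-by-segment tuning. The sketch below is an assumed illustration of such a learned baseline, not the authors' actual ML model; all function names and thresholds are invented for this example:

```python
from statistics import mean, stdev

def fit_baseline(history):
    """Learn a per-location nominal profile from historical scans.
    history: list of temperature traces (one list per scan), all with
    the same number of fiber locations. Returns (mean, stdev) pairs."""
    n_loc = len(history[0])
    return [
        (mean(scan[j] for scan in history), stdev(scan[j] for scan in history))
        for j in range(n_loc)
    ]

def flag_anomalies(trace, baseline, k=4.0):
    """Return fiber locations whose reading deviates by more than
    k standard deviations from the learned nominal profile."""
    return [
        j for j, (mu, sigma) in enumerate(baseline)
        if sigma > 0 and abs(trace[j] - mu) > k * sigma
    ]
```

Because each location carries its own learned mean and spread, the effective alarm threshold automatically varies along the fiber, mirroring the location-specific tuning that the abstract says is otherwise done by hand.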