Academic literature on the topic 'False Reject Rate (FRR)'

Create an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'False Reject Rate (FRR).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "False Reject Rate (FRR)"

1. Popoola, O. P., and R. A. Lasisi. "A Biometric Fusion System of Face and Fingerprint for Enhanced Human Identification Using HOG-LBP Approach." Journal of Engineering Research 25, no. 2 (2020): 197–202. https://doi.org/10.52968/72011428.

Abstract:
This paper presents a biometric fusion system of fingerprint and face images for an Ergonomic-Based Enrolment and Verification System. Features from fingerprints and faces are extracted to create a new biometric template with enhanced performance and an extra level of assurance for identification. A fusion scheme combines the extracted Histogram of Oriented Gradients (HOG) and Local Binary Pattern (LBP) features from a subject's fingerprint and face images. Manhattan distance is used to compare the template in the database with the input data, and the difference between the two determines the decision to accept or reject. Different matching-score thresholds were set to evaluate the relationship between the False Reject Rate and the False Accept Rate, a common measure of system performance. From the experiments, and given the characteristics of this HOG-LBP algorithm, a threshold between 75% and 80% is determined to be moderate and close to the Equal Error Rate (EER) point, the intersection of the False Accept Rate (FAR) and False Reject Rate (FRR) curves. The system is robust enough to accommodate an increase in the threshold if a high level of system confidence is required.
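
To make the FAR/FRR threshold trade-off and the EER point described above concrete, here is a minimal Python sketch; it is not code from the paper, and the score arrays are invented for illustration:

```python
# Sweep a match-score threshold and locate the EER, where FAR and FRR cross.
import numpy as np

genuine = np.array([0.91, 0.88, 0.79, 0.95, 0.84, 0.73, 0.90])   # same-person scores
impostor = np.array([0.41, 0.62, 0.55, 0.30, 0.77, 0.48, 0.66])  # different-person scores

thresholds = np.linspace(0.0, 1.0, 101)
far = np.array([(impostor >= t).mean() for t in thresholds])  # impostors wrongly accepted
frr = np.array([(genuine < t).mean() for t in thresholds])    # genuine users wrongly rejected

i = int(np.argmin(np.abs(far - frr)))   # threshold where the two curves meet
print(f"threshold={thresholds[i]:.2f}  FAR={far[i]:.2f}  FRR={frr[i]:.2f}")
```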

2. Wu, Lianwen, and Qiansheng Cheng. "An Asymmetric Adaptive Classification Method." International Journal of Wavelets, Multiresolution and Information Processing 9, no. 1 (2011): 169–79. http://dx.doi.org/10.1142/s021969131100392x.

Abstract:
Classification applications should impose different requirements on the False Reject Rate and the False Accept Rate, and classifier learning should use an asymmetric factor to balance the two. A novel AdaBoost algorithm with asymmetric weights was developed. Moreover, we provide a theoretical analysis of its performance and derive an upper bound on the classification error.
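
The asymmetry idea is simple to illustrate. The sketch below is not the paper's algorithm (which derives a novel AdaBoost variant with a theoretical error bound); it only shows how an assumed cost factor can bias off-the-shelf AdaBoost toward fewer false rejects through sample weights:

```python
# Up-weight the genuine class before boosting so that false rejects cost more.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy two-class data

k = 3.0                                # assumed asymmetry factor (not from the paper)
w = np.where(y == 1, k, 1.0)           # rejecting a genuine sample costs k times more
clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(X, y, sample_weight=w)         # weights skew boosting toward a lower FRR
print(clf.score(X, y))
```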

3. Syukur, Arba Abdul. "Implementasi Webcam sebagai Pendeteksi Wajah pada Sistem Keamanan Perumahan menggunakan Image Processing." ELECTRICES 2, no. 1 (2020): 1–5. http://dx.doi.org/10.32722/ees.v2i1.2791.

Abstract:
Theft is a serious public concern; it often occurs in rooms and environments such as buildings, offices, and corridors, and even places of worship are targeted by thieves. The DKM (Dewan Kemakmuran Masjid, the mosque welfare council) has responded by urging visitors to look after their own valuables, yet a mosque should be a safe and comfortable place to visit. We therefore propose a way to deter theft in mosques and other theft-prone places. This study designs and builds a face recognition system as a solution to reduce the rate of theft. The system runs on a Raspberry Pi 3 Model B with an A4Tech webcam, and database software stores the user data. The aim of the research is to compare two leading face recognition methods, LBPH (Local Binary Pattern Histogram) and Eigenface. Face images were captured under three conditions: outdoors during the day, indoors during the day, and indoors at night. The parameters used to evaluate recognition performance are accuracy, FAR (False Accept Rate), and FRR (False Reject Rate). Of the two methods, Eigenface achieved the highest average accuracy and the lowest average FAR and FRR. The study concludes that lighting affects face recognition for both methods.

4. Mao, Rui, Xiaoyu Wang, and Heming Ji. "ACBM: attention-based CNN and Bi-LSTM model for continuous identity authentication." Journal of Physics: Conference Series 2352, no. 1 (2022): 012005. http://dx.doi.org/10.1088/1742-6596/2352/1/012005.

Abstract:
With the evolution of network attack methods, implicit continuous identity authentication has attracted growing attention. Among its modalities, keystroke dynamics is widely used because it needs no devices beyond the keyboard. In this paper, we propose a deep-learning keystroke-dynamics model for identity authentication that combines a convolutional neural network (CNN), bi-directional Long Short-Term Memory (Bi-LSTM), and an attention mechanism. Unlike most existing models, which use only keystroke timing as the feature vector, this model uses both keystroke content and keystroke timing. First, the CNN processes the feature vectors; the normalized vectors are then fed into the Bi-LSTM network for training. The model is tested on the Buffalo open dataset. The results show an FRR (False Reject Rate) of 3.09%, an FAR (False Accept Rate) of 3.03%, and an EER (Equal Error Rate) of 4.23%, demonstrating the validity and accuracy of the model for continuous identity authentication.
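
As a rough illustration of the architecture named in the abstract, here is a minimal PyTorch sketch of a CNN feeding a Bi-LSTM with attention pooling; the layer sizes and the toy input are assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

class ACBMSketch(nn.Module):
    def __init__(self, feat_dim=8, hidden=64):
        super().__init__()
        # 1-D convolution over the keystroke sequence captures local patterns.
        self.cnn = nn.Sequential(nn.Conv1d(feat_dim, 32, kernel_size=3, padding=1), nn.ReLU())
        self.lstm = nn.LSTM(32, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)   # scores each time step
        self.head = nn.Linear(2 * hidden, 2)   # genuine vs. impostor

    def forward(self, x):                      # x: (batch, time, feat_dim)
        h = self.cnn(x.transpose(1, 2)).transpose(1, 2)
        h, _ = self.lstm(h)                    # (batch, time, 2*hidden)
        a = torch.softmax(self.attn(h), dim=1) # attention weights over time
        return self.head((a * h).sum(dim=1))   # weighted pooling, then classify

logits = ACBMSketch()(torch.randn(4, 50, 8))   # toy batch of keystroke sequences
print(logits.shape)                            # torch.Size([4, 2])
```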

5. Rohim, Muhammad Imaduddin Abdur, Auliati Nisa, Muhammad Nurkhoiri Hindratno, et al. "Peningkatan Performa Pengenalan Wajah pada Gambar Low-Resolution Menggunakan Metode Super-Resolution." Jurnal Teknologi Informasi dan Ilmu Komputer 11, no. 1 (2024): 199–208. http://dx.doi.org/10.25126/jtiik.20241117947.

Abstract:
The electronic identity card (KTP-el) is mandatory for Indonesian residents. Its chip must store not only the holder's facial portrait but also other identity data such as biographical details, a signature, and left and right fingerprints. This constraint forces the portrait to be stored at low resolution (LR), which degrades face recognition. In this study, we use the Poznan University of Technology (PUT) Face database, which comprises 200 images of 100 individuals; the data were downsampled with bicubic interpolation to produce LR images. We investigate deep-learning super-resolution (SR) methods, including DFDNet, LapSRN, GFPGAN, Real-ESRGAN, Real-ESRGAN+GFPGAN, and FaceSPARNet, to improve the quality of the LR images. Performance is evaluated with the False Rejection Rate (FRR) metric at several False Acceptance Rate (FAR) levels. The results show that several SR methods, especially FaceSPARNet, improve face recognition performance by up to 2%, whereas GAN-based SR methods (GFPGAN, Real-ESRGAN, Real-ESRGAN+GFPGAN) tend to increase the false reject rate. The study shows that SR methods from the general basic CNN-based FSR category can improve face recognition on LR images such as those on the KTP-el.

6. Bajaber, Asrar, and Lamiaa Elrefaei. "Biometric Template Protection for Dynamic Touch Gestures Based on Fuzzy Commitment Scheme and Deep Learning." Mathematics 10, no. 3 (2022): 362. http://dx.doi.org/10.3390/math10030362.

Abstract:
Privacy plays an important role in biometric authentication systems, and touch authentication has been widely used since touch devices reached their current level of development. In this work, a fuzzy commitment scheme (FCS) based on deep learning (DL) is proposed to protect the touch-gesture template in a touch authentication system. A binary Bose–Chaudhuri–Hocquenghem (BCH) code is used with the FCS to deal with touch variations. The BCH code is described by the triplet (n, k, t), where n denotes the codeword length, k the key length, and t the error-correction capability; system performance is investigated for different key lengths k. A learning-based approach extracts touch features from raw touch data, using a recurrent neural network (RNN) on top of a convolutional neural network (CNN). The proposed system is evaluated on two touch datasets, Touchalytics and BioIdent. The best results were obtained with key length k = 99 and n = 255: a false accept rate (FAR) of 0.00 and a false reject rate (FRR) of 0.5854 on the Touchalytics dataset, and an FAR of 0.00 and an FRR of 0.5399 on the BioIdent dataset. The FCS shows its effectiveness in dynamic authentication systems, with good results compared with other works.
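
The fuzzy commitment mechanics are worth a small sketch. The toy example below substitutes a 5x repetition code for the paper's BCH(n, k, t) code, and all sizes and the noise pattern are invented; it only demonstrates the commit-and-recover idea:

```python
# Commit: XOR a codeword with the biometric template; store offset + key hash.
# Recover: XOR the offset with a noisy query, decode, and compare hashes.
import hashlib
import numpy as np

def rep_encode(bits):                  # toy ECC: repeat each key bit 5 times
    return np.repeat(bits, 5)

def rep_decode(bits):                  # majority vote within each 5-bit group
    return (bits.reshape(-1, 5).sum(axis=1) >= 3).astype(np.uint8)

rng = np.random.default_rng(0)
key = rng.integers(0, 2, 16, dtype=np.uint8)        # secret key k
template = rng.integers(0, 2, 80, dtype=np.uint8)   # enrolled touch template
offset = rep_encode(key) ^ template                 # stored commitment
key_hash = hashlib.sha256(key.tobytes()).hexdigest()

query = template.copy()
query[[0, 7, 23, 41]] ^= 1                          # touch variation: 4 flipped bits
recovered = rep_decode(offset ^ query)              # the ECC absorbs the noise
print("accepted" if hashlib.sha256(recovered.tobytes()).hexdigest() == key_hash else "rejected")
```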

7. Wang, Tawose, Jiang, and Zhao. "Toward Automatic Cardiomyocyte Clustering and Counting through Hesitant Fuzzy Sets." Applied Sciences 9, no. 14 (2019): 2875. http://dx.doi.org/10.3390/app9142875.

Abstract:
The isolation and observation of cardiomyocytes is a fundamental approach in cardiovascular research. The state of the practice relies on manual operation of the entire culture process, which not only incurs high human error but also takes a long time. This paper proposes a new computer-aided paradigm to automatically, accurately, and efficiently perform the clustering and counting of cardiomyocytes, a key procedure for evaluating the success rate of cardiomyocyte isolation and the quality of the culture medium. The key challenge lies in the unique, rod-like shape of cardiomyocytes, which has hardly been addressed in the literature. The proposed method employs a novel algorithm inspired by hesitant fuzzy sets and integrates an efficient implementation into the whole cardiomyocyte analysis process. The system, along with data extracted from adult rat cardiomyocytes, was evaluated experimentally in MATLAB, showing promising results: the false accept rate (FAR) and false reject rate (FRR) are as low as 1.46% and 1.97%, respectively; the accuracy rate is up to 98.7%, 20% higher than the manual approach; and the processing time is reduced from tens of seconds to 3–5 s, an order-of-magnitude performance improvement.

8. Fadlisyah, Fadlisyah. "Kualitas Unjuk Kerja Pendeteksian Citra Iris dengan Wavelet 2D." TECHSI - Jurnal Teknik Informatika 7, no. 1 (2019): 1–10. https://doi.org/10.29103/techsi.v7i1.176.

Abstract:
The performance of an iris recognition system can be undermined by poor-quality images, resulting in a high false reject rate (FRR) and failure-to-enroll (FTE) rate. In this paper, a wavelet-based quality measure for iris images is proposed. The merit of this approach lies in its ability to deliver good spatial adaptivity and determine local quality measures for different regions of an iris image. Our experiments demonstrate that the proposed quality index can reliably predict the matching performance of an iris recognition system. By incorporating local quality measures in the matching algorithm, we also observe relative matching-performance improvements of about 20% and 10% at the equal error rate (EER) on the CASIA and WVU iris databases, respectively.

9. Zhou, Yiwen, Jianbin Zheng, Huacheng Hu, and Yizhen Wang. "Handwritten Signature Verification Method Based on Improved Combined Features." Applied Sciences 11, no. 13 (2021): 5867. http://dx.doi.org/10.3390/app11135867.

Abstract:
As a behavioral biometric, handwritten signatures are widely used in financial and administrative institutions, and forged signatures can cause customers great property losses. This paper proposes a handwritten signature verification method based on improved combined features. With advanced smart-pen technology, offline images and online data of a signature can be obtained in real time as it is written; this is the first time the combination of offline and online verification has been realized. We extract the static and dynamic features of the signature and verify them with a support vector machine (SVM) and dynamic time warping (DTW), respectively. A small number of samples is used during the training stage, which mitigates the problem of insufficient samples to a certain extent. Verification yields two decision scores, and we propose a score fusion method based on accuracy (SF-A) that combines offline and online features through score fusion and effectively exploits the complementarity among the classifiers. Experimental results with different numbers of training samples on local datasets show that the false accept rate (FAR) and false reject rate (FRR) obtained are better than those of offline or online verification alone.
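
DTW, one of the two matchers used here, compares traces of unequal length by warping the time axis. A minimal textbook implementation on toy 1-D sequences (not the paper's code):

```python
# Classic O(n*m) dynamic time warping distance between two sequences.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])        # local distance
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

print(dtw_distance([0.0, 1.0, 2.0, 1.0], [0.0, 1.0, 1.0, 2.0, 1.0]))  # 0.0
```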

10. Ramírez Flores, Manuel, Gualberto Aguilar Torres, Gina Gallegos García, and Miguel Ángel García Licona. "Fingerprint verification using computational geometry." DYNA 83, no. 195 (2016): 128–37. http://dx.doi.org/10.15446/dyna.v83n195.46323.

Abstract:
This paper presents a robust minutiae-based method for fingerprint verification. The proposed method uses Delaunay triangulation to represent minutiae as nodes of a connected graph composed of triangles. Maximizing the minimum angle over all triangulations gives the constructed structures local stability against rotation and translation. Geometric thresholds and minutiae data are used to characterize the triangulations created from input and template fingerprint images. The effectiveness of the proposed method is confirmed by computing the false accept rate (FAR), false reject rate (FRR), and equal error rate (EER) over the FVC2002 databases and comparing them with the results of other approaches.
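
The representation step can be reproduced with standard tools. A minimal sketch using scipy.spatial.Delaunay on invented minutiae coordinates, deriving sorted triangle side lengths as a simple rotation- and translation-invariant descriptor (the paper's actual feature set differs):

```python
import numpy as np
from scipy.spatial import Delaunay

minutiae = np.array([[10, 12], [45, 20], [30, 55], [70, 40], [55, 75]])  # toy (x, y)
tri = Delaunay(minutiae)

for simplex in tri.simplices:           # each row holds the indices of one triangle
    p = minutiae[simplex]
    sides = sorted(np.linalg.norm(p[i] - p[(i + 1) % 3]) for i in range(3))
    print(simplex, [round(s, 1) for s in sides])   # invariant under rotation/translation
```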

Book chapters on the topic "False Reject Rate (FRR)"

1. Katsriku, Ferdinand Apietu, Samuel Darko Abankwah, Edward Danso Ansong, and Winfred Yaokumah. "Enhancing Textual Password Authentication Using Typing Rhythm." In Advances in Information Security, Privacy, and Ethics. IGI Global, 2025. https://doi.org/10.4018/979-8-3693-8014-7.ch011.

Abstract:
This chapter aims to enhance the security of textual passwords by adding a layer based on an individual's typing rhythm, without requiring additional devices. It discusses authentication methods and develops a textual password authentication system that exploits the fact that every user types in a distinct way. Twenty participants, including novice and expert users, assessed the proposed system. The metrics used were the False Acceptance Rate (FAR) and False Rejection Rate (FRR), with flight time and typing rhythm serving as the biometric identification template. The results indicated low false rejection and acceptance rates: the initial session registered an FRR of 0.0 and an FAR of 0.1, and a second session two weeks later recorded an FRR of 0.25 and an FAR of 0.0. With this model, users need not select a complex password they might forget; instead, they can adopt a good rhythm different from their natural typing rhythm, making it challenging to guess.
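
Flight time, the feature at the heart of this chapter, is simply the gap between releasing one key and pressing the next. A minimal sketch with invented timestamps:

```python
# Key-down/key-up times (seconds) for four consecutive keystrokes (toy data).
down = [0.000, 0.210, 0.395, 0.580]
up = [0.090, 0.300, 0.470, 0.660]

# Flight time: next key-down minus previous key-up; the list forms a rhythm template.
flight = [down[i + 1] - up[i] for i in range(len(down) - 1)]
print([round(f, 3) for f in flight])   # [0.12, 0.095, 0.11]
```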

2. Baragi, Shivakumar, and Nalini C. Iyer. "Face Recognition using Fast Fourier Transform." In Research Advances in the Integration of Big Data and Smart Computing. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-8737-0.ch017.

Abstract:
Biometrics refers to metrics related to human characteristics and traits. Face recognition is the process of identifying a person from a facial image. It has been an active area of research for several decades but remains a challenging problem because of the complexity of the human face. The objective is to authenticate a person while keeping the FAR and FRR very low. This project introduces a new approach to face recognition using an FFT algorithm. The database containing the enrolled images is called the train database, and each test image stored in the test database is compared against it. For further processing, RGB data is converted into grayscale, which reduces the matrix dimension. The FFT is applied to the entire train database and the mean value of the images is computed; the same is repeated on the test database. Face recognition is then decided by thresholding on the test image. Performance is evaluated for normal, skin-color, ageing, and blurred images using the False Acceptance Rate (FAR), False Rejection Rate (FRR), and Equal Error Rate (EER), and the accuracy for the different image types is also calculated.
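
A minimal sketch of the pipeline as the abstract describes it (grayscale conversion, 2-D FFT, mean-value comparison against a threshold); the random images and the tolerance are assumptions, not the chapter's data or parameters:

```python
import numpy as np

def fft_signature(rgb):
    gray = rgb.mean(axis=2)                    # RGB -> grayscale reduces the dimension
    return np.abs(np.fft.fft2(gray)).mean()    # mean FFT magnitude as a scalar feature

train = [np.random.rand(64, 64, 3) for _ in range(5)]   # stand-in train database
test = np.random.rand(64, 64, 3)                        # stand-in test image

train_mean = np.mean([fft_signature(img) for img in train])
tolerance = 0.05 * train_mean                           # assumed threshold
print("recognized" if abs(fft_signature(test) - train_mean) < tolerance else "rejected")
```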

3. Sudha, L. R., and R. Bhavani. "Gait Based Biometric Authentication System with Reduced Search Space." In Emerging Technologies in Intelligent Applications for Image and Video Processing. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9685-3.ch012.

Abstract:
Deployment of human gait in developing new tools for security enhancement has received growing attention in the modern era. Since the efficiency of any algorithm depends on the size of its search space, the aim is to propose a novel approach to reduce that space. To achieve this, the database is split in two by gender, and the search is restricted to the identified gender's database. Highly discriminant gait features are then selected by a forward sequential feature selection algorithm in the confined space. Experimental results on the benchmark CASIA B gait dataset with the newly proposed combined kNN-SVM classifier show a low False Acceptance Rate (FAR) and a low False Rejection Rate (FRR).
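
A toy two-stage sketch of the search-space reduction idea, with random stand-ins for gait features; note that the chapter's combined kNN-SVM classifier differs from this simple SVM-then-kNN cascade:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))        # stand-in gait feature vectors
gender = rng.integers(0, 2, 200)      # gender labels for stage 1
identity = rng.integers(0, 50, 200)   # subject IDs for stage 2

gender_clf = SVC().fit(X, gender)     # stage 1: predict gender to confine the search
probe = rng.normal(size=(1, 20))
g = gender_clf.predict(probe)[0]

mask = gender == g                    # stage 2: identify only within that gender
knn = KNeighborsClassifier(n_neighbors=1).fit(X[mask], identity[mask])
print("predicted subject:", knn.predict(probe)[0])
```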

4. Favorskaya, Margarita, and Roman Baranov. "The Off-line Signature Verification Based on Structural Similarity." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2014. https://doi.org/10.3233/978-1-61499-405-3-421.

Abstract:
Off-line signature verification is in demand for business and marketing, bank transactions, security control, and document authentication, and it is a more difficult process than on-line verification. In this paper, an extended feature set including global, local, and special features for simple and cursive types of signatures is proposed. The global features are used to create a decision tree that limits the field of the search, and pre-processing procedures remove noise and artifacts of scanning and luminance. The Structural Similarity Index Measure (SSIM) and the detection of special points are based on a skeleton representation of the signature. SSIM shows good verification results and demonstrates low values of the False Rejection Rate (FRR) and False Acceptance Rate (FAR) on the GPDS-100 signature corpus.
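
The SSIM comparison step can be sketched with scikit-image; the synthetic signature images and the accept threshold below are assumptions, not values from the chapter:

```python
import numpy as np
from skimage.metrics import structural_similarity

rng = np.random.default_rng(0)
reference = rng.random((128, 256))     # enrolled signature image (grayscale, toy data)
candidate = np.clip(reference + 0.05 * rng.random((128, 256)), 0.0, 1.0)  # noisy re-sample

score = structural_similarity(reference, candidate, data_range=1.0)
print(f"SSIM={score:.3f} ->", "genuine" if score > 0.9 else "forgery")   # assumed threshold
```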

5. Jamieson, Dale. "When Utilitarians Should Be Virtue Theorists." In Climate Ethics. Oxford University Press, 2010. http://dx.doi.org/10.1093/oso/9780195399622.003.0028.

Abstract:
1. I begin with an assumption that few would deny, but about which many are in denial: human beings are transforming earth in ways that are devastating for other forms of life, future human beings, and many of our human contemporaries. The epidemic of extinction now under way is an expression of this. So is the changing climate. Ozone depletion, which continues at a very high rate, is potentially the most lethal expression of these transformations, for without an ozone layer, no life on earth could exist. Call anthropogenic mass extinctions, climate change, and ozone depletion “the problem of global environmental change” (or “the problem” for short). 2. Philosophers in their professional roles have by and large remained silent about the problem. There are many reasons for this. I believe that one reason is that it is hard to know what to say from the perspective of the reigning moral theories: Kantianism, contractarianism, and commonsense pluralism. While I cannot fully justify this claim here, some background remarks may help to motivate my interest in exploring utilitarian approaches to the problem. 3. Consider first Kantianism. Christine Korsgaard writes that it is “nonaccidental” that utilitarians are “obsessed” with “population control” and “the preservation of the environment.” For “a basic feature of the consequentialist outlook still pervades and distorts our thinking: the view that the business of morality is to bring something about.” Korsgaard leaves the impression that a properly conceived moral theory would have little to say about the environment, for such a theory would reject this false picture of the “business of morality.” This impression is reinforced by the fact that her remark about the environmental obsessions of utilitarians is the only mention of the environment in a book of more than 400 pages. It is not surprising that a view that renounces as “the business of morality” the question of what we should bring about would be disabled when it comes to thinking about how to respond to global environmental change.

6. Favors, Jelani M. "Race Women." In Shelter in a Time of Storm. University of North Carolina Press, 2019. http://dx.doi.org/10.5149/northcarolina/9781469648330.003.0004.

Abstract:
This chapter examines the fascinating history of Bennett College, one of only two single-sex colleges dedicated to educating African American women. Although Bennett would not make that transition until 1926, the institution played a vital role in educating African American women in Greensboro, North Carolina, from the betrayal of the Nadir to the promises of a New Negro Era. The latter period witnessed Bennett, under the leadership of David Dallas Jones, mold scores of young girls into politically conscious race women who were encouraged to resist Jim Crow policies and reject the false principles of white supremacy. Their politicization led to a massive boycott of a theatre in downtown Greensboro and helped to set the tone for Greensboro's evolution into a critical launching point for the modern civil rights movement.

7. McPherson, Lionel K. "Framing a Racialist Placeholder." In The Afterlife of Race. Oxford University Press, 2024. http://dx.doi.org/10.1093/oso/9780197626849.003.0005.

Abstract:
The power of reason over ideology should not be overestimated. This section rejects the popular antiracist notion that false beliefs about race have been leaders in defense of color-conscious injustice—as if inherited slavery, settler colonialism, and enforced segregation could seek sober rationalization through racial stories. The "race" placeholder lent a guise of seriousness to reports about innate mental differences between Europeans vis-à-vis other (sub)continent-wide populations. This was amid Western commitments to acquiring natural resources, human labor, and larger markets, which called for extreme subjugation and exploitation of non-European peoples. From a deflationary critical perspective, the section shows there are no race concept puzzles that demand theoretical or empirical solutions. Any of the many impacts of race ideology-rhetoric can be studied without concern over a metaphysics or science of race.

8. Frazier, Jessica M. "Developing “Third World” Feminist Networks, 1970." In Women's Antiwar Diplomacy during the Vietnam War Era. University of North Carolina Press, 2017. http://dx.doi.org/10.5149/northcarolina/9781469631790.003.0004.

Abstract:
This chapter asks whether and how connections between American civil rights movements and socialist revolutions outside the United States shaped feminisms of women of color. Scholars have noted the domestic challenges women of color faced when they tried to fight for both race and gender issues—they were often expected to choose and were labeled "sell outs" if they worked in white women's groups. Vietnamese women, who fought for both their nation's sovereignty and women's rights, provided an important example to women of color—one they could use as an inspiration and as evidence of the inseparability of race and gender. Through the example of Vietnamese women, women of color rejected the false dichotomy of fighting for race or gender and insisted on struggling against all forms of oppression.

9. Whitworth, Brian. "Spam as a Symptom of Electronic Communication Technologies that Ignore Social Requirements." In E-Collaboration. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-652-5.ch107.

Abstract:
Spam, undesired and usually unsolicited e-mail, has been a growing problem for some time. A 2003 Sunbelt Software poll found spam (or junk mail) has surpassed viruses as the number-one unwanted network intrusion (Townsend & Taphouse, 2003). Time magazine reports that for major e-mail providers, 40 to 70% of all incoming mail is deleted at the server (Taylor, 2003), and AOL reports that 80% of its inbound e-mail, 1.5 to 1.9 billion messages a day, is spam the company blocks. Spam is the e-mail consumer's number-one complaint (Davidson, 2003). Despite Internet service provider (ISP) filtering, up to 30% of in-box messages are spam. While each of us may only take seconds (or minutes) to deal with such mail, over billions of cases the losses are significant. A Ferris Research report estimates spam 2003 costs for U.S. companies at $10 billion (Bekker, 2003). While improved filters send more spam to trash cans, ever more spam is sent, consuming an increasing proportion of network resources. Users shielded behind spam filters may notice little change, but the Internet transmitted-spam percentage has been steadily growing. It was 8% in 2001, grew from 20% to 40% in 6 months over 2002 to 2003, and continues to grow (Weiss, 2003). In May 2003, the amount of spam e-mail exceeded nonspam for the first time, that is, over 50% of transmitted e-mail is now spam (Vaughan-Nichols, 2003). Informal estimates for 2004 are over 60%, with some as high as 80%. In practical terms, an ISP needing one server for customers must buy another just for spam almost no one reads. This cost passes on to users in increased connection fees. Pretransmission filtering could reduce this waste, but creates another problem: spam false positives, that is, valid e-mail filtered as spam. If you accidentally use spam words, like enlarge, your e-mail may be filtered. Currently, receivers can recover false rejects from their spam filter's quarantine area, but filtering before transmission means the message never arrives at all, so neither sender nor receiver knows there is an error. Imagine if the postal mail system shredded unwanted mail and lost mail in the process. People could lose confidence that the mail will get through. If a communication environment cannot be trusted, confidence in it can collapse. Electronic communication systems sit on the horns of a dilemma. Reducing spam increases the delivery failure rate, while guaranteeing delivery increases spam rates. Either way, by social failure of confidence or technical failure of capability, spam threatens the transmission system itself (Weinstein, 2003). As the percentage of transmitted spam increases, both problems increase. If spam were 99% of sent mail, a small false-positive percentage becomes a much higher percentage of valid e-mail that failed. The growing spam problem is recognized ambivalently by IT writers who espouse new Bayesian spam filters but note, "The problem with spam is that it is almost impossible to define" (Vaughan-Nichols, 2003, p. 142), or who advocate legal solutions but say none have worked so far. The technical community seems to be in a state of denial regarding spam. Despite some successes, transmitted spam is increasing. Moral outrage, spam blockers, spamming the spammers, black and white lists, and legal responses have slowed but not stopped it. Spam blockers, by hiding the problem from users, may be making it worse, as a Band-Aid covers but does not cure a systemic sore. Asking for a technical tool to stop spam may be asking the wrong question.
If spam is a social problem, it may require a social solution, which in cyberspace means technical support for social requirements (Whitworth & Whitworth, 2004).

10. Whitworth, Brian. "Spam as a Symptom of Electronic Communication Technologies that Ignore Social Requirements." In Encyclopedia of Human Computer Interaction. IGI Global, 2006. http://dx.doi.org/10.4018/978-1-59140-562-7.ch083.


Conference papers on the topic "False Reject Rate (FRR)"

1. Araujo, Maria S., Shane P. Siebenaler, Edmond M. Dupont, Samantha G. Blaisdell, and Daniel S. Davila. "Near Real-Time Automated Detection of Small Hazardous Liquid Pipeline Leaks Using Remote Optical Sensing and Machine Learning." In 2016 11th International Pipeline Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/ipc2016-64218.

Abstract:
The prevailing leak detection systems used today on hazardous liquid pipelines (computational pipeline monitoring) do not have the sensitivity required to detect leaks smaller than 1% of the nominal flow rate. False alarms from any leak detection system are a major industry concern, as such events eventually lead to alarms being ignored, rendering the leak detection system ineffective [1]. This paper discusses recent work on an innovative remote sensing technology that can reliably and automatically detect small hazardous liquid leaks in near real time. The technology is suitable for airborne applications, including manned and unmanned aircraft, for ground applications, and for stationary applications such as monitoring of pipeline pump stations. While the development focused primarily on detecting liquid hydrocarbon leaks, the technology also shows promise for detecting gas leaks. It fuses inputs from various types of optical sensors and applies machine learning techniques to reliably detect the "fingerprints" of small hazardous liquid leaks. The optical sensors used include long-wave infrared, short-wave infrared, hyperspectral, and visual cameras. The use of these different imaging approaches raises the possibility of detecting spilled product from a past event even if the leak is no longer actively progressing. To thoroughly characterize leaks, tests were performed by imaging a variety of hazardous liquid compositions (e.g., crude oil, refined products, and crude oil mixed with common refined products) under several environmental conditions (e.g., lighting and temperature) and on various surfaces (e.g., grass, pavement, and gravel). Tests were also conducted to characterize non-leak events, with attention to the highly reflective and highly absorbent materials and conditions typically found near pipelines. Techniques were developed to extract a variety of features across the several spectral bands to identify unique attributes of different hazardous liquid compositions and environmental conditions as well as non-leak events. The characterization of non-leak events is crucial to significantly reducing false alarm rates. Classifiers were then trained to detect small leaks and reject non-leak events (false alarms), followed by system performance testing. The trial results of this work are discussed in this paper.

2. Ganeshkumar, M., Kai Keng Ang, and Rosa Q. So. "Reject option to reduce false prediction rates for EEG-motor imagery based BCI." In 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2017. http://dx.doi.org/10.1109/embc.2017.8037479.

3. Jagtap, Shilpa, J. L. Mudegaonkar, Sanjay Patil, and Dinesh Bhoyar. "A Novel Approach for Diagnosis of Diabetes Using Iris Image Processing Technique and Evaluation Parameters." In National Conference on Relevance of Engineering and Science for Environment and Society. AIJR Publisher, 2021. http://dx.doi.org/10.21467/proceedings.118.37.

Abstract:
The paper presented here deals with the study of an identification and verification approach for diabetes based on the human iris pattern. In pre-processing, a region of interest (ROI) selected by color is used for iris localization, Daugman's rubber sheet model is used for normalization, and the circular Hough transform is used for pupil and boundary detection. For feature extraction, a Gabor filter, Histogram of Oriented Gradients, and five-level wavelet decompositions (Haar, db2, db4, bior2.2, and bior6.8) can be used. A binary coding scheme binarizes the feature vector coefficients, and classifiers such as Hamming distance, Support Vector Machine (SVM), Adaptive Boosting (AdaBoost), Neural Networks (NN), Random Forest (RF), and Linear Discriminant Analysis (LDA) with a shrinkage parameter can be used for template matching. Performance parameters such as computation time, Hamming distance variation, False Acceptance Rate (FAR), False Rejection Rate (FRR), accuracy, and match ratio can be calculated for comparison purposes.
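
Hamming-distance matching of binarized iris codes, one of the classifiers listed above, in a minimal sketch; the code length, the noise level, and the 0.32 threshold are illustrative assumptions (0.32 is a common choice in the iris literature, not a value from this paper):

```python
import numpy as np

rng = np.random.default_rng(1)
enrolled = rng.integers(0, 2, 2048)                 # binarized iris feature vector
probe = enrolled.copy()
probe[rng.choice(2048, 150, replace=False)] ^= 1    # simulated intra-class noise

hd = np.mean(enrolled != probe)                     # normalized Hamming distance
print(f"HD={hd:.3f} ->", "match" if hd < 0.32 else "non-match")
```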

4. Cullen, John J., and Hugh L. MacIntyre. "The case for using the Most Probable Number (MPN) method in ballast water management system type approval testing." In IMarEST Ballast Water Technology Conference. IMarEST, 2017. http://dx.doi.org/10.24868/bwtc6.2017.010.

Abstract:
Recently, the U.S. Coast Guard (USCG) rejected the Serial Dilution Culture-Most Probable Number (SDC-MPN) method for enumerating viable phytoplankton cells in ballast water discharge as an alternative to its prescribed method, the Environmental Technology Verification (ETV) Protocol, which distinguishes living from dead organisms using vital stains and motility. Succinctly, the USCG position has been that the ETV Protocol is a reliable and repeatable efficacy test and the SDC-MPN method is not. New evidence and an expanded consideration of published research support a fundamentally different assessment. A peer-reviewed quantitative evaluation of ETV vital stains for 24 species of phytoplankton has conclusively established that the ETV Protocol, even with observations of motility, is not reliable for all species. In contrast, published results suggest that errors in the method were small for the limited number of locations studied to date; it is possible that the communities tested in these studies were dominated by species that can be classified accurately using vital stains. Even so, it must be acknowledged that the reliability and accuracy of vital stains is untested for thousands of species of phytoplankton. Introduced in 1951, the SDC-MPN method for phytoplankton is an established approach for use with multi-species communities. As applied to ballast water testing, SDC-MPN is much less vulnerable to methodological uncertainties than has been assumed. Notably, all species of phytoplankton need not be cultured in the conventional sense; rather, a single viable cell in a dilution tube need only grow enough to be detected, a requirement known to have been met by otherwise uncultured species. Further, delayed restoration of viability after treatment with ultraviolet radiation (UV) is not a problem: organisms repair UV damage quickly or not at all, consistent with the assumptions of the test. Two critical methodological failures could compromise protection of the environment in ballast water testing: living organisms that do not stain or move, and viable organisms that do not grow to detection in the MPN cultures. These can be assessed with complementary measurements, but importantly, the relative protection of each method can be evaluated by comparing counts of living cells from the ETV Protocol with counts of viable cells from SDC-MPN in untreated samples. Available evidence provides no basis for concluding that either method is consistently less protective. However, as applied in ballast water testing, the statistical estimate of MPN is less precise. On this basis, SDC-MPN is worse for a single test. But counter-intuitively, it is more protective of the environment when five consecutive tests must be passed for type approval, because the likelihood of one false rejection out of five tests is higher, while five false passes would be exceedingly rare. Addressing only the science, we conclude that both the ETV Protocol and the SDC-MPN method, though imperfect, are currently appropriate for assessing the efficacy of ballast water management systems in a type-approval testing regime. In closing, we show proof of concept for a rapid viability assay, benchmarked against SDC-MPN, that could be well suited for routine assessment of treatment system performance.
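
The five-consecutive-tests argument at the end of the abstract is simple arithmetic, sketched below with assumed per-test error probabilities:

```python
# With independent tests, false rejects compound across a 5-test regime,
# while five consecutive false passes become vanishingly unlikely.
p_false_reject = 0.05   # assumed chance one valid test falsely rejects
p_false_pass = 0.05     # assumed chance one test falsely passes

print(1 - (1 - p_false_reject) ** 5)  # P(>=1 false reject in 5) ~= 0.226
print(p_false_pass ** 5)              # P(5 false passes in a row) ~= 3.1e-07
```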

Reports on the topic "False Reject Rate (FRR)"

1. Baker, S. E., K. W. Bowyer, P. J. Flynn, and P. J. Phillips. Empirical Evidence for Increased False Reject Rate with Time Lapse in ICE 2006. National Institute of Standards and Technology, 2011. http://dx.doi.org/10.6028/nist.ir.7752.
