To see the other types of publications on this topic, follow the link: Random number generator.

Dissertations / Theses on the topic 'Random number generator'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Random number generator.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Karanam, Shashi Prashanth. "Tiny true random number generator." Fairfax, VA : George Mason University, 2009. http://hdl.handle.net/1920/4587.

Full text
Abstract:
Thesis (M.S.)--George Mason University, 2009.
Vita: p. 91. Thesis director: Jens-Peter Kaps. Submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Engineering. Title from PDF t.p. (viewed Oct. 12, 2009). Includes bibliographical references (p. 88-90). Also issued in print.
APA, Harvard, Vancouver, ISO, and other styles
2

Crunk, Anthony Wayne. "A portable C random number generator." Thesis, Virginia Tech, 1985. http://hdl.handle.net/10919/45720.

Full text
Abstract:
Proliferation of computers with varying word sizes has led to an increase in software applications that require random number generation. Several generation techniques have been developed. Criteria of randomness, portability, period, reproducibility, variety, speed, and storage are used to evaluate the developed methods. The Tausworthe method is the only one that meets the portability requirement and is therefore chosen for implementation. A C language implementation is proposed, and test results are presented to confirm the acceptability of the proposed code.
Master of Science
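The Tausworthe method named in the abstract is built on a binary linear feedback shift register. As a point of reference only (the thesis's actual polynomial and word-assembly scheme are not given in the abstract), a minimal C sketch using the primitive trinomial x^31 + x^3 + 1 could look like this:

#include <stdint.h>

/* Illustrative Tausworthe-style bit generator (Fibonacci LFSR).
 * The trinomial x^31 + x^3 + 1 is primitive over GF(2), so a non-zero
 * 31-bit state cycles through all 2^31 - 1 values.  Tap positions and
 * seed are illustrative, not the thesis's parameters. */
static uint32_t lfsr_state = 0x1u;              /* any non-zero 31-bit seed */

static int taus_bit(void)
{
    /* feedback bit = state bit 30 XOR state bit 2 (taps 31 and 3) */
    uint32_t fb = ((lfsr_state >> 30) ^ (lfsr_state >> 2)) & 1u;
    lfsr_state = ((lfsr_state << 1) | fb) & 0x7FFFFFFFu;   /* keep 31 bits */
    return (int)fb;
}

/* Assemble 31 successive bits into a uniform value in [0, 1). */
static double taus_u01(void)
{
    uint32_t w = 0;
    for (int i = 0; i < 31; ++i)
        w = (w << 1) | (uint32_t)taus_bit();
    return w / 2147483648.0;                    /* divide by 2^31 */
}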
APA, Harvard, Vancouver, ISO, and other styles
3

Franco, Juan. "Rapid Prototyping and Design of a Fast Random Number Generator." Thesis, University of North Texas, 2011. https://digital.library.unt.edu/ark:/67531/metadc115036/.

Full text
Abstract:
Information in the form of online multimedia, bank accounts, or password usage for diverse applications needs some form of security. The core feature of many security systems is the generation of true random or pseudorandom numbers; hence reliable generators of such numbers are indispensable. The fundamental hurdle is that digital computers cannot generate truly random numbers, because the states and transitions of digital systems are well understood and predictable. Nothing in a digital computer happens truly randomly. Digital computers are sequential machines that take a current state and move to the next state in a deterministic fashion. To generate any secure hash or encrypted word, a random number is needed. But since computers are not random, pseudorandom sequences are commonly used: algorithms that generate a pattern of values that appear to be random but after some time start repeating. This thesis implements a digital random number generator using MATLAB, FPGA prototyping, and custom silicon design. This random number generator is able to use a truly random CMOS source to generate the random number. Statistical benchmarks are used to test the results and to show that the design works. Thus the proposed random number generator will be useful for online encryption and security.
APA, Harvard, Vancouver, ISO, and other styles
4

Franco, Juan. "Rapid Prototyping and Design of a Fast Random Number Generator." Thesis, University of North Texas, 2012. https://digital.library.unt.edu/ark:/67531/metadc115040/.

Full text
Abstract:
Information in the form of online multimedia, bank accounts, or password usage for diverse applications needs some form of security. The core feature of many security systems is the generation of true random or pseudorandom numbers; hence reliable generators of such numbers are indispensable. The fundamental hurdle is that digital computers cannot generate truly random numbers, because the states and transitions of digital systems are well understood and predictable. Nothing in a digital computer happens truly randomly. Digital computers are sequential machines that take a current state and move to the next state in a deterministic fashion. To generate any secure hash or encrypted word, a random number is needed. But since computers are not random, pseudorandom sequences are commonly used: algorithms that generate a pattern of values that appear to be random but after some time start repeating. This thesis implements a digital random number generator using MATLAB, FPGA prototyping, and custom silicon design. This random number generator is able to use a truly random CMOS source to generate the random number. Statistical benchmarks are used to test the results and to show that the design works. Thus the proposed random number generator will be useful for online encryption and security.
APA, Harvard, Vancouver, ISO, and other styles
5

Mitchum, Sam. "Digital Implementation of a True Random Number Generator." VCU Scholars Compass, 2010. http://scholarscompass.vcu.edu/etd/2327.

Full text
Abstract:
Random numbers are important for gaming, simulation and cryptography. Random numbers have traditionally been generated using analog circuitry. Two problems exist with using analog circuits in a digital design: (1) analog components require an analog circuit designer to ensure proper structure and functionality, and (2) analog components are not easily migrated to a different fabrication technology. This paper proposes a class of random number generators that are constructed using only digital components and typical digital design methodology. The proposed classification is called divergent path, since the path of generated numbers through the range of possible values diverges at every sampling. One integrated circuit was fabricated and several models were synthesized into an FPGA. Test results are given.
APA, Harvard, Vancouver, ISO, and other styles
6

Yadav, Avantika. "Design and Analysis of Digital True Random Number Generator." VCU Scholars Compass, 2013. http://scholarscompass.vcu.edu/etd/3229.

Full text
Abstract:
A random number generator is a key component for strengthening and securing the confidentiality of electronic communications. Random number generators can be divided into pseudo random number generators and true random number generators. A pseudo random number generator produces a stream of numbers that appears to be random but actually follows a predefined sequence. A true random number generator produces a stream of unpredictable numbers that have no defined pattern. There has been growing interest in designing true random number generators in the past few years. Several Field Programmable Gate Array (FPGA) and Application Specific Integrated Circuit (ASIC) based approaches have been used to generate random data, but they require analog circuits. RNGs with analog circuits demand more power and area. These factors weaken hardware RNG systems based on analog circuits relative to completely digital hardware RNG systems. This thesis is focused on the design of a completely digital true random number generator ASIC.
APA, Harvard, Vancouver, ISO, and other styles
7

Pilcher, Martha Geraldine. "Development and validation of random cut test problem generator." Diss., Georgia Institute of Technology, 1985. http://hdl.handle.net/1853/24560.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Mattioli, Federico. "Testing a Random Number Generator: formal properties and automotive application." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/18187/.

Full text
Abstract:
This thesis analyses a method for validating the random number generators (RNGs) used to secure modern automotive systems. The first chapter gives an overview of the communication architecture of modern vehicles based on electronic control units (ECUs): the main access points to a car are listed together with possible types of hacking, and the use of random numbers in cryptography is then described, with particular reference to the cryptography used in vehicles. The second chapter covers the probability background needed to approach the statistical tests used for validation and presents the main theoretical approaches to the problem of randomness. The two central chapters describe the probabilistic and entropy-based methods used to analyse the real data in the tests, and then describe and study the 15 statistical tests proposed by the National Institute of Standards and Technology (NIST). After the first tests, based on very simple properties of random sequences, more sophisticated tests are presented, based on the Fourier transform (to detect periodic behaviour), on entropy (closely connected with the compressibility of the sequence), or on random paths. Two further tests make it possible to assess the proper functioning of the generator itself, and not only of the individual generated sequences. Finally, the fifth chapter is devoted to the implementation of the tests in order to test the TRNG of the ECUs.
APA, Harvard, Vancouver, ISO, and other styles
9

Gärtner, Joel. "Analysis of Entropy Usage in Random Number Generators." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-214567.

Full text
Abstract:
Cryptographically secure random number generators usually require an outside seed to be initialized. Other solutions instead use a continuous entropy stream to ensure that the internal state of the generator always remains unpredictable. This thesis analyses four such generators with entropy inputs. Furthermore, different ways to estimate entropy are presented and a new method useful for the generator analysis is developed. The developed entropy estimator performs well in tests and is used to analyse entropy gathered from the different generators. All the analysed generators exhibit some seemingly unintentional behaviour, but most should still be safe for use.
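The abstract does not spell out the estimator the thesis develops; purely as a baseline for comparison, the simplest frequency-based min-entropy estimate (the most conservative of the common measures) can be sketched in C as follows, with the byte granularity being an assumption made here for illustration:

#include <math.h>
#include <stddef.h>
#include <stdint.h>

/* Naive min-entropy estimate in bits per byte:
 * H_min = -log2(p_max), where p_max is the relative frequency
 * of the most common byte value in the sample. */
double min_entropy_per_byte(const uint8_t *buf, size_t n)
{
    size_t count[256] = {0};
    size_t max = 0;

    for (size_t i = 0; i < n; ++i)
        count[buf[i]]++;
    for (int v = 0; v < 256; ++v)
        if (count[v] > max)
            max = count[v];

    if (n == 0 || max == 0)
        return 0.0;
    return -log2((double)max / (double)n);
}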
APA, Harvard, Vancouver, ISO, and other styles
10

Saiprasert, Chalermpol. "Design exploration of an FPGA-based multivariate Gaussian random number generator." Thesis, Imperial College London, 2010. http://hdl.handle.net/10044/1/6212.

Full text
Abstract:
Monte Carlo simulation is one of the most widely used techniques for computationally intensive simulations in a variety of applications, including mathematical analysis, modeling and statistical physics. A multivariate Gaussian random number generator (MVGRNG) is one of the main building blocks of such a system. Field Programmable Gate Arrays (FPGAs) are gaining popularity as an alternative to traditional general purpose processors for accelerating the computationally expensive random number generator block, owing to their fine-grain parallelism, reconfigurability and lower power consumption. As well as achieving hardware designs with high throughput, it is desirable to produce designs with the flexibility to control resource usage in order to meet given resource constraints. This work proposes a novel approach for mapping an MVGRNG onto an FPGA by optimizing the computational path in terms of hardware resource usage, subject to an acceptable error in the approximation of the distribution of interest. The impact of the error due to truncation/rounding along the computational path is analysed, and an analytical expression for the error inserted into the system is presented. The proposed algorithm is further extended by a novel methodology for mapping many multivariate Gaussian random number generators onto a single FPGA. The effective resource-sharing techniques introduced in this thesis allow a further reduction in hardware resource usage. MVGRNGs are used in a wide range of applications, especially financial applications that involve many correlated assets. It is demonstrated that the choice of objective function employed for the hardware optimization of the MVGRNG core has a considerable impact on the final performance of the application of interest. Two of the most important financial applications, Value-at-Risk estimation and option pricing, are considered in this work.
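The hardware datapath is specific to the thesis, but the arithmetic underlying most MVGRNGs is the same standard construction: multiply a vector of independent standard normal samples by a lower-triangular factor of the target covariance. A software reference sketch in C, in which the Box-Muller step and the row-major factor argument are illustrative assumptions rather than the thesis's design, might read:

#include <math.h>
#include <stdlib.h>

#define TWO_PI 6.28318530717958647692

/* One standard normal sample via the Box-Muller transform;
 * rand() stands in for whatever uniform source is available. */
static double std_normal(void)
{
    double u1 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);   /* avoid log(0) */
    double u2 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    return sqrt(-2.0 * log(u1)) * cos(TWO_PI * u2);
}

/* out = L * z, where L (row-major, n x n) is the lower-triangular
 * Cholesky factor of the target covariance and z is i.i.d. N(0,1).
 * The result is one correlated Gaussian vector. */
void mvg_sample(const double *L, int n, double *out)
{
    double z[64];                      /* sketch assumes n <= 64 */
    for (int i = 0; i < n; ++i)
        z[i] = std_normal();
    for (int i = 0; i < n; ++i) {
        out[i] = 0.0;
        for (int j = 0; j <= i; ++j)
            out[i] += L[i * n + j] * z[j];
    }
}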
APA, Harvard, Vancouver, ISO, and other styles
11

Hörmann, Wolfgang, and Gerhard Derflinger. "A portable uniform random number generator well suited for the rejection method." Institut für Statistik und Mathematik, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 1992. http://epub.wu.ac.at/1288/1/document.pdf.

Full text
Abstract:
Up to now, all known efficient portable implementations of linear congruential random number generators with modulus 2^(31)-1 work only with multipliers that are small compared with the modulus. We show that for non-uniform distributions the rejection method may generate random numbers of bad quality if combined with a linear congruential generator with a small multiplier. Therefore a method is described that works for any multiplier smaller than 2^(30). It uses the decomposition of multiplier and seed into high-order and low-order bits to compute the upper and lower halves of the product. The sum of the two halves gives the product of multiplier and seed modulo 2^(31)-1. Coded in ANSI C and FORTRAN77, the method results in a portable implementation of the linear congruential generator that is as fast as or faster than other portable methods. (author's abstract)
Series: Preprint Series / Department of Applied Statistics and Data Processing
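The identity the abstract relies on is that 2^31 is congruent to 1 modulo 2^(31)-1, so the upper and lower halves of the product can simply be added. A C sketch of that reduction is given below; it uses 64-bit arithmetic for brevity, whereas the paper obtains the same result with 32-bit integers by also splitting multiplier and seed:

#include <stdint.h>

/* One modular step of a linear congruential generator with modulus
 * m = 2^31 - 1.  Because 2^31 = 1 (mod m), the 62-bit product a*s
 * splits into its upper and lower 31-bit halves, which are simply
 * added; one conditional subtraction completes the reduction. */
#define MOD 2147483647UL                /* 2^31 - 1 */

uint32_t lcg_step(uint32_t a, uint32_t s)
{
    uint64_t prod = (uint64_t)a * (uint64_t)s;
    uint64_t hi = prod >> 31;           /* upper half: weight 2^31 = 1 mod m */
    uint64_t lo = prod & MOD;           /* lower 31 bits */
    uint64_t sum = hi + lo;
    if (sum >= MOD)
        sum -= MOD;
    return (uint32_t)sum;
}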
APA, Harvard, Vancouver, ISO, and other styles
12

Shanmuga, Sundaram Prassanna. "Development of a FPGA-based True Random Number Generator for Space Applications." Thesis, Linköping University, Electronics System, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-54534.

Full text
Abstract:

Random numbers are required for cryptographic applications such as IT security products, smart cards etc. Hardware-based random number generators are widely employed. Cryptographic algorithms are implemented on Field Programmable Gate Arrays (FPGAs). In this work a True Random Number Generator (TRNG) employed for space applications was designed, investigated and evaluated. Several cryptographic requirements have to be satisfied for the random numbers. Two different noise sources were designed and implemented on the FPGA. The first design was based on ring oscillators as a noise source. The second design was based on astable oscillators developed on a separate hardware board and interfaced with the FPGA as another noise source. The main aim of the project was to analyse the important requirement of an independent noise source on a physical level. Jitter from the oscillators, being the source of the randomness, was analysed for both noise sources. The generated random sequences were finally subjected to statistical tests.

APA, Harvard, Vancouver, ISO, and other styles
13

Bakiri, Mohammed. "Hardware implementation of a pseudo random number generator based on chaotic iteration." Thesis, Bourgogne Franche-Comté, 2018. http://www.theses.fr/2018UBFCD014/document.

Full text
Abstract:
Security and cryptography are key elements in constrained devices such as IoT devices, smart cards, embedded systems, etc. Their hardware implementations represent a challenge in terms of limited physical resources, operating speed, memory capacity, etc. In this context, most protocols rely on the security of a good random number generator, considered an indispensable element of a lightweight security core. Therefore, this work proposes new pseudo-random generators based on chaotic iterations, designed to be deployed on hardware supports, namely FPGAs or ASICs. These hardware implementations can be described as post-processing of existing generators: they transform a non-uniform sequence of numbers into a uniform one. The dependency between input and output has been proven chaotic, notably according to the mathematical definitions of chaos provided by Devaney and Li-Yorke. Following that, we first develop a complete state of the art of hardware and physical implementations of pseudo-random number generators (PRNGs). We then propose new generators based on chaotic iterations (CIs), which are tested on our hardware platform. The initial idea was to start from the n-cube (or, equivalently, the vectorial negation in CIs), then remove a Hamiltonian cycle balanced enough to produce new functions to be iterated, to which an output permutation is added. The methods recommended for finding good functions are detailed, and the whole is implemented on our FPGA platform. The resulting generators generally have better statistical profiles than their inputs, while operating at high speed. Finally, we implement them on several hardware supports (a 65-nm ASIC circuit and a Zynq FPGA platform).
APA, Harvard, Vancouver, ISO, and other styles
14

Krempa, Peter. "Analysis of Entropy Levels in the Entropy Pool of Random Number Generator." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2013. http://www.nusl.cz/ntk/nusl-236179.

Full text
Abstract:
In computer science, entropy is usually understood as a random stream of data. This thesis briefly summarizes methods for generating random data and describes the random number generator contained in the Linux kernel. It then determines the bit rate at which this generator produces random data in virtualized environments provided by various hypervisors. The thesis describes the problems caused by the low performance of random data generators in virtual environments and proposes an approach to solving them. The implementation of the proposed approach is then outlined, subjected to tests, and its results are compared with the original system. When connected to a powerful random data generator, the entropy distribution system can further improve the amount of entropy in the system kernel by several orders of magnitude.
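For readers who want to reproduce this kind of measurement, the Linux kernel exposes its current entropy estimate through procfs; a minimal C sketch that polls it (the one-second interval and one-minute duration are arbitrary choices, not taken from the thesis) is:

#include <stdio.h>
#include <unistd.h>

/* Print the kernel's estimate of available entropy once per second,
 * as reported in /proc/sys/kernel/random/entropy_avail. */
int main(void)
{
    for (int i = 0; i < 60; ++i) {      /* observe for one minute */
        FILE *f = fopen("/proc/sys/kernel/random/entropy_avail", "r");
        int bits;
        if (f == NULL)
            return 1;
        if (fscanf(f, "%d", &bits) == 1)
            printf("entropy pool: %d bits\n", bits);
        fclose(f);
        sleep(1);
    }
    return 0;
}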
APA, Harvard, Vancouver, ISO, and other styles
15

Lian, Guinan. "Testing Primitive Polynomials for Generalized Feedback Shift Register Random Number Generators." Diss., CLICK HERE for online access, 2005. http://contentdm.lib.byu.edu/ETD/image/etd1131.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Dusitsin, Krid, and Kurt Kosbar. "Accuracy of Computer Simulations that use Common Pseudo-random Number Generators." International Foundation for Telemetering, 1998. http://hdl.handle.net/10150/609238.

Full text
Abstract:
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California
In computer simulations of communication systems, linear congruential generators and shift registers are typically used to model noise and data sources. These generators are often assumed to be close to ideal (i.e. delta correlated) and an insignificant source of error in the simulation results. The samples generated by these algorithms have non-ideal autocorrelation functions, which may cause a non-uniform distribution in the data or noise signals. This error may cause the simulation bit-error-rate (BER) to be artificially high or low. In this paper, the problem is described through the use of confidence intervals. Tests are performed on several pseudo-random generators to assess which ones are acceptable for computer simulation.
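The confidence-interval argument can be made concrete: if k errors are observed in n simulated bits, a textbook normal-approximation (Wald) interval for the BER is p plus or minus z*sqrt(p*(1-p)/n) with p = k/n. The following C helper is only a sketch of that formula and not necessarily the interval construction used in the paper:

#include <math.h>
#include <stdio.h>

/* Wald (normal-approximation) confidence interval for a simulated
 * bit-error-rate: k errors observed out of n transmitted bits. */
void ber_confidence(long k, long n, double z, double *lo, double *hi)
{
    double p = (double)k / (double)n;
    double half = z * sqrt(p * (1.0 - p) / (double)n);
    *lo = p - half;
    *hi = p + half;
}

int main(void)
{
    double lo, hi;
    ber_confidence(37, 1000000, 1.96, &lo, &hi);   /* 95% interval */
    printf("BER in [%.2e, %.2e]\n", lo, hi);
    return 0;
}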
APA, Harvard, Vancouver, ISO, and other styles
17

Botha, Roelof Cornelis. "The development of a hardware random number generator for gamma-ray astronomy / R.C. Botha." Thesis, North-West University, 2005. http://hdl.handle.net/10394/581.

Full text
Abstract:
Pulsars, rotating magnetised neutron stars, have received much attention in the 40 years since their discovery. Observations revealed them to be gamma-ray emitters with energies continuing up to the sub-100 GeV region. Better observation of this upper energy cut-off region will serve to enhance our theoretical understanding of pulsars and neutron stars. The H-test has been used most extensively in the latest periodicity searches, whereas other tests have limited applications and are unsuited for pulsar searches. If the probability distribution of a test statistic is not accurately known, it is possible that, after searching through many trials, a probability for uniformity is reported which is much smaller than the real value, possibly leading to false detections. The problem with the H-test is that its distribution must be obtained by simulation and cannot be derived analytically. For such simulations, random numbers are needed and are usually obtained with so-called pseudo-random number generators, which are not truly random. This immediately renders such generators useless for simulating the distribution of the H-test. Hardware random number generators exist as an alternative, but such devices, apart from being slow, are also expensive and large, and most still do not exhibit the true random nature required. This was the motivation behind the development of a hardware random number generator that provides truly random U(0,1) numbers at very high speed and at low cost. The development of such a generator and the results obtained with it are discussed. The device delivered statistically truly random numbers and was already used in a small simulation of the H-test distribution.
Thesis (M.Sc. (Physics))--North-West University, Potchefstroom Campus, 2005.
APA, Harvard, Vancouver, ISO, and other styles
18

Aponte, Erick. "A Study on Energy Harvesters for Physical Unclonable Functions and Random Number Generation." Thesis, Virginia Tech, 2017. http://hdl.handle.net/10919/78673.

Full text
Abstract:
As the implementation and use of wireless sensor nodes in Internet of Things (IoT) devices increase over the years, securing personal data becomes a growing issue. Physical unclonable functions (PUFs) and random number generators (RNGs) provide methods to generate security keys for data encryption. Transducers used in the energy harvesting systems of wireless sensor nodes can generate the PUFs and RNGs. These transducers include piezoelectric devices (piezos), thermoelectric generators (TEGs) and solar cells. This research studies the electrical properties of transducers at normal and low operating levels for electrical responses that can be used in PUF generation and random number generation, respectively. The PUF generation discussed in this study analyzes the resonance frequency of 10 piezos and the open-circuit voltages of 5 TEGs and 5 solar cells. The transducers are tested multiple times over a 10-day period to evaluate PUF reproducibility and reliability characteristics. The random number generation is accomplished by applying a low-level vibration, thermal or light excitation to each respective transducer. The generated electrical signals are amplified, digitally processed and analyzed using the National Institute of Standards and Technology (NIST) Statistical Test Suite. The experimental results for the PUF generation are promising and indicate that the piezos are the best choice due to their stable frequency output. Each transducer was able to produce random numbers and pass the NIST tests, but the TEGs passed the NIST tests more often than the other transducers. These results offer a preliminary basis for transducers to be used directly in security applications.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
19

Petura, Oto. "True random number generators for cryptography : Design, securing and evaluation." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSES053.

Full text
Abstract:
Random numbers are essential for modern cryptographic systems. They are used as cryptographic keys, nonces, initialization vectors and random masks for protection against side channel attacks. In this thesis, we deal with random number generators in logic devices (Field Programmable Gate Arrays, FPGAs, and Application Specific Integrated Circuits, ASICs). We present fundamental methods of generating random numbers in logic devices. Then, we discuss different types of TRNGs using clock jitter as a source of randomness. We provide a rigorous evaluation of various AIS-20/31 compliant TRNG cores implemented in three different FPGA families: Intel Cyclone V, Xilinx Spartan-6 and Microsemi SmartFusion2. We then present the implementation of selected TRNG cores in a custom ASIC and evaluate them. Next, we study the PLL-TRNG in depth in order to provide a secure design of this TRNG together with embedded tests. Finally, we study oscillator-based TRNGs. We compare different randomness extraction methods as well as different oscillator types and the behavior of the clock jitter inside each of them. We also propose methods of embedded jitter measurement for online testing of oscillator-based TRNGs.
APA, Harvard, Vancouver, ISO, and other styles
20

Zouhar, Petr. "Generátor náhodných čísel." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2010. http://www.nusl.cz/ntk/nusl-218290.

Full text
Abstract:
The thesis deals with random numbers, their generation and their use in cryptography. The introduction distinguishes between random number generators and pseudo-random number generators, and covers the commonly used division of generators into software and hardware ones. We mention the advantages and disadvantages of each type and their areas of use. We then describe examples of random and pseudorandom number generators, mainly hardware generators based on physical phenomena such as the decay of radioactive material or atmospheric noise. The following part is devoted to the design of our own random number generator and a description of its functionality. The second half of the work is devoted to the field of cryptography. We introduce the basic types of cryptographic systems, namely symmetric and asymmetric cryptosystems, together with typical representatives of each type and their properties. At the end of the work we return to our random number generator and verify the randomness of the generated numbers and of the obtained cryptograms.
APA, Harvard, Vancouver, ISO, and other styles
21

Grigaravičienė, Milda. "Pseudoatsitiktinių skaičių generatorių statistinių savybių tyrimas." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2006. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2006~D_20060605_212214-88087.

Full text
Abstract:
The statistical features of pseudorandom number generators were analyzed in this work. Pseudorandom numbers are applied in many fields, so it is important for them to satisfy the following requirements: to have a uniform distribution and to be uncorrelated. The hypothesis that the random numbers are distributed uniformly is checked with the Pearson test. The hypothesis that the autocorrelation function is equal to zero is checked with the Box-Ljung test. During the investigation it was noticed that the generated random numbers did not have a uniform distribution for any of the generation methods except the linear congruential generator. By applying different transformations it was established that, for the combinations v1-v8, the random numbers transformed with a parabola had a uniform distribution. The arcsine was the best transformation for the nonlinear congruential generator. While testing the hypothesis that the autocorrelation function equals zero, it was noticed that whether the null hypothesis is rejected or not depends on the random number generation algorithm and on the generated sample size. The generator that best satisfied the requirements was the linear congruential generator, but it is not suitable because it is too predictable. The nonlinear congruential generator is therefore chosen as the best one, because its statistical features are closest to those of the linear generator.
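For the uniformity hypothesis, the Pearson test mentioned above reduces to a chi-square statistic over equal-width bins; a minimal C sketch (the bin count and the cap of 256 bins are arbitrary choices) is:

#include <stddef.h>

/* Pearson chi-square statistic for uniformity of samples in [0,1),
 * using k equal-width bins; compare the result against a chi-square
 * quantile with k-1 degrees of freedom. */
double chi_square_uniform(const double *x, size_t n, int k)
{
    long count[256] = {0};             /* sketch assumes k <= 256 */
    double expected = (double)n / (double)k;
    double stat = 0.0;

    for (size_t i = 0; i < n; ++i) {
        int bin = (int)(x[i] * k);
        if (bin >= k) bin = k - 1;     /* guard against x == 1.0 */
        count[bin]++;
    }
    for (int b = 0; b < k; ++b) {
        double d = (double)count[b] - expected;
        stat += d * d / expected;
    }
    return stat;
}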
APA, Harvard, Vancouver, ISO, and other styles
22

Hörmann, Wolfgang. "A Universal Generator for Bivariate Log-Concave Distributions." Department of Statistics and Mathematics, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 1995. http://epub.wu.ac.at/1044/1/document.pdf.

Full text
Abstract:
Different universal (also called automatic or black-box) methods have been suggested to sample from univariate log-concave distributions. A description of a universal generator for bivariate distributions has not been published up to now. The new algorithm for bivariate log-concave distributions is based on the method of transformed density rejection. In order to construct a hat function for a rejection algorithm, the bivariate density is transformed by the logarithm into a concave function. It is then possible to construct a dominating function by taking the minimum of several tangent planes, which are transformed back into the original scale by exponentiation. The choice of the points of contact is automated using adaptive rejection sampling. This means that a point rejected by the rejection algorithm is used as an additional point of contact until the maximal number of points of contact is reached. The paper describes in detail how this main idea can be used to construct Algorithm ULC2D, which can generate random pairs from a bivariate log-concave distribution with a computable density. (author's abstract)
Series: Preprint Series / Department of Applied Statistics and Data Processing
APA, Harvard, Vancouver, ISO, and other styles
23

Weigl, Andrew. "Improving security for elliptic curve implementations on smart cards a random number generator test unit /." [S.l.] : [s.n.], 2006. http://deposit.ddb.de/cgi-bin/dokserv?idn=983478090.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Hörmann, Wolfgang. "A universal generator for discrete log-concave distributions." Institut für Statistik und Mathematik, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 1993. http://epub.wu.ac.at/1704/1/document.pdf.

Full text
Abstract:
We give an algorithm that can be used to sample from any discrete log-concave distribution (e.g. the binomial and hypergeometric distributions). It is based on rejection from a discrete dominating distribution that consists of parts of the geometric distribution. The algorithm is uniformly fast for all discrete log-concave distributions and not much slower than algorithms designed for a single distribution. (author's abstract)
Series: Preprint Series / Department of Applied Statistics and Data Processing
APA, Harvard, Vancouver, ISO, and other styles
25

Gautham, Smitha. "An Efficient Implementation of an Exponential Random Number Generator in a Field Programmable Gate Array (FPGA)." VCU Scholars Compass, 2010. http://scholarscompass.vcu.edu/etd/2173.

Full text
Abstract:
Many physical, biological, ecological and behavioral events occur at times and rates that are exponentially distributed. Modeling these systems requires simulators that can accurately generate a large quantity of exponentially distributed random numbers, which is a computationally intensive task. To improve the performance of these simulators, one approach is to move portions of the computationally inefficient simulation tasks from software to custom hardware implemented in Field Programmable Gate Arrays (FPGAs). In this work, we study efficient FPGA implementations of exponentially distributed random number generators to improve simulator performance. Our approach is to generate uniformly distributed random numbers using standard techniques and scale them using the inverse cumulative distribution function (CDF). Scaling is implemented by curve fitting piecewise linear, quadratic, cubic, and higher order functions to solve for the inverse CDF. As the complexity of the scaling function increases (in terms of order and the number of pieces), number accuracy increases and additional FPGA resources (logic cells and block RAMs) are consumed. We analyze these tradeoffs and show how a designer with particular accuracy requirements and FPGA resource constraints can implement an accurate and efficient exponentially distributed random number generator.
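For reference, the exact inverse-CDF scaling that the hardware approximates with piecewise polynomial fits is a one-line formula, x = -ln(1 - u) / lambda; a plain software version in C reads:

#include <math.h>

/* Exact inverse-CDF transform: map a uniform sample u in [0,1)
 * to an exponentially distributed sample with rate lambda.
 * Hardware versions approximate -log(1 - u) with piecewise
 * linear/quadratic/cubic fits instead of calling log(). */
double exp_from_uniform(double u, double lambda)
{
    return -log(1.0 - u) / lambda;
}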
APA, Harvard, Vancouver, ISO, and other styles
26

Križan, Viliam. "Generátor náhodných čísel." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-220413.

Full text
Abstract:
This master's thesis deals with the generation of random numbers and the implementation of the Fortuna generator in the Java language. The first part provides a theoretical introduction to the topic. Various entropy sources such as mouse movement, keyboard typing, microphone noise and web camera noise are described and analysed. The analysis focuses on the randomness, usability and volume of the gathered data. The Fortuna random number generator is also described from a theoretical point of view. The object analysis and implementation details are described in the last chapter of the document.
APA, Harvard, Vancouver, ISO, and other styles
27

Stewart, Robert Grisham. "A Statistical Evaluation of Algorithms for Independently Seeding Pseudo-Random Number Generators of Type Multiplicative Congruential (Lehmer-Class)." Digital Commons @ East Tennessee State University, 2007. https://dc.etsu.edu/etd/2049.

Full text
Abstract:
To be effective, a linear congruential random number generator (LCG) should produce values that are (a) uniformly distributed on the unit interval (0,1) excluding endpoints and (b) substantially free of serial correlation. It has been found that many statistical methods produce inflated Type I error rates for correlated observations. Theoretically, independently seeding an LCG under the following conditions attenuates serial correlation: (a) simple random sampling of seeds, (b) non-replicate streams, (c) non-overlapping streams, and (d) non-adjoining streams. Accordingly, 4 algorithms (each satisfying at least 1 condition) were developed: (a) zero-leap, (b) fixed-leap, (c) scaled random-leap, and (d) unscaled random-leap. Note that the latter satisfied all 4 independent seeding conditions. To assess serial correlation, univariate and multivariate simulations were conducted at 3 equally spaced intervals for each algorithm (N=24) and measured using 3 randomness tests: (a) the serial correlation test, (b) the runs up test, and (c) the white noise test. A one-way balanced multivariate analysis of variance (MANOVA) was used to test 4 hypotheses: (a) omnibus, (b) contrast of unscaled vs. others, (c) contrast of scaled vs. others, and (d) contrast of fixed vs. others. The MANOVA assumptions of independence, normality, and homogeneity were satisfied. In sum, the seeding algorithms did not differ significantly from each other (omnibus hypothesis). For the contrast hypotheses, only the fixed-leap algorithm differed significantly from all other algorithms. Surprisingly, the scaled random-leap offered the least difference among the algorithms (theoretically this algorithm should have produced the second largest difference). Although not fully supported by the research design used in this study, it is thought that the unscaled random-leap algorithm is the best choice for independently seeding the multiplicative congruential random number generator. Accordingly, suggestions for further research are proposed.
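The abstract does not define the leap algorithms themselves; one natural reading of 'fixed-leap' for a multiplicative (Lehmer-class) generator is to start stream i at a^(i*L) * s0 mod m for a fixed leap L, which can be computed by square-and-multiply exponentiation. The C sketch below follows that reading, with the multiplier, leap and seed values as placeholders only:

#include <stdint.h>

#define MOD  2147483647ULL          /* 2^31 - 1 (Lehmer modulus) */
#define MULT 16807ULL               /* placeholder multiplier, not from the study */

/* Multiply modulo 2^31 - 1 using 64-bit intermediates. */
static uint64_t mulmod(uint64_t a, uint64_t b) { return (a * b) % MOD; }

/* Seed for stream i under a fixed-leap scheme: jump i*leap steps
 * ahead of the base seed, i.e. seed_i = MULT^(i*leap) * s0 mod m,
 * computed with square-and-multiply exponentiation. */
uint32_t stream_seed(uint32_t s0, uint64_t i, uint64_t leap)
{
    uint64_t e = i * leap;          /* number of generator steps to skip */
    uint64_t base = MULT, acc = 1;
    while (e > 0) {
        if (e & 1)
            acc = mulmod(acc, base);
        base = mulmod(base, base);
        e >>= 1;
    }
    return (uint32_t)mulmod(acc, s0);
}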
APA, Harvard, Vancouver, ISO, and other styles
28

Marangon, Davide Giacomo. "Improving Quantum Key Distribution and Quantum Random Number Generation in presence of Noise." Doctoral thesis, Università degli studi di Padova, 2015. http://hdl.handle.net/11577/3424117.

Full text
Abstract:
The argument of this thesis might be summed up as the exploitation of noise to generate better noise. More specifically, this work is about the possibility of exploiting classical noise to effectively transmit quantum information, and of measuring quantum noise to generate better quantum randomness. What do I mean by exploiting classical noise to transmit quantum information effectively? In this case I refer to the task of sending quantum bits through the atmosphere in order to set up quantum key distribution (QKD) transmissions; this is the subject of Chapter 1 and Chapter 2. In the quantum communications framework, QKD presents challenging problems, both theoretical and experimental. In principle QKD offers unconditional security; however, practical realizations of it must face all the limitations of the real world. One of the main limitations is the losses introduced by real transmission channels. Losses cause errors, and errors make the protocol less secure because an eavesdropper could try to hide his activity behind the losses. When this problem is addressed from a purely theoretical point of view, one tries to model the effect of losses by means of unitary transforms which affect the qubits on average according to a fixed level of link attenuation. However, this approach is somewhat limiting, because if the background noise is high and the losses are assumed constant on average, the protocol might abort or not even start, the predicted QBER being too high. To address this problem and generate key material when it would not normally be possible, we have proposed an adaptive real-time selection (ARTS) scheme in which transmissivity peaks are instantaneously detected. An additional resource is introduced to estimate the link transmissivity on its intrinsic time scale, using an auxiliary classical laser beam co-propagating with the qubits but conveniently interleaved in time. In this way the link scintillation is monitored in real time, and the time intervals of high channel transmissivity corresponding to a QBER viable for positive key generation can be selected. In Chapter 2 we present a demonstration of this protocol under losses equivalent to long-distance and satellite links, and with a range of scintillation corresponding to moderate to severe weather. A useful criterion for the preselection of the low-QBER intervals is presented, employing a train of intense pulses propagating along the same path as the qubits, with parameters chosen such that its fluctuation in time reproduces that of the quantum communication. Concerning the content of Chapter 3, we describe a novel principle for a true random number generator (TRNG) based on the observation that a coherent beam of light crossing a very long path with atmospheric turbulence generates random and rapidly varying images. To implement our method in a proof-of-concept demonstrator, we have chosen a very long free-space channel used in recent years for quantum communication experiments at the Canary Islands. Here, after a propagation of 143 km with the terminals at an altitude of about 2400 m, the turbulence in the path is converted into a dynamical speckle pattern at the receiver. The source of entropy is then the atmospheric turbulence. Indeed, for such a long path, a solution of the Navier-Stokes equations for the atmospheric flow in which the beam propagates is out of reach.
Several models are based on the Kolmogorov statistical theory, which parametrizes the repartition of kinetic energy as the interaction of eddies of decreasing size. However, such models only provide a statistical description of the beam spot and its wandering, never an instantaneous prediction of the irradiance distribution. These are mainly ruled by temperature variations and by the wind, which cause fluctuations in the refractive index of the air. For this reason, when a laser beam is sent across the atmosphere, the latter may be considered a dynamic volumetric scatterer which distorts the beam wavefront. We evaluate the experimental data to ensure that the images are uniform and independent. Moreover, we show that our method for randomness extraction, based on combinatorial analysis, is optimal in the context of information theory. In Chapter 5 we present a new approach to the generation of random bits from quantum physical processes. Quantum mechanics has always been regarded as a possible and valuable source of randomness because of its intrinsically probabilistic nature. However, the typical paradigm employed to extract random numbers from a quantum system commonly assumes that the state of said system is pure. Only in theory would such an assumption lead to fully unpredictable randomness. The main issue is that in real implementations, such as in a laboratory or in a commercial device, it is hardly possible to forge a pure quantum state. One then has to deal with quantum states featuring some degree of mixedness. A mixed state, however, might be somehow correlated with some other system held by an adversary, a quantum eavesdropper. In the extreme case of a fully mixed state, one is in practice extracting random numbers from a classical state. We therefore show how important it is to shift from a classical randomness estimator, such as the classical min-entropy H_min(Z) of a random variable Z, to quantum ones, such as the min-entropy conditioned on quantum side information E. We have devised an effective protocol based on the entropic uncertainty principle for the estimation of the conditional min-entropy. The entropic uncertainty principle lets one take into account the information shared between multiple parties holding a multipartite quantum system and, more importantly, lets one bound the information a party has on the system state after it has been measured. We adapted this principle to the bipartite case where a user Alice, A, is supplied with a quantum system prepared by the provider Eve, E, who could be maliciously correlated with it. In principle, then, Eve might be able to predict all the outcomes of the measurements Alice performs in the basis Z in order to extract random numbers from the system. However, we show that if Alice randomly switches from the measurement basis to a basis X mutually unbiased with respect to Z, she can lower-bound the min-entropy conditioned on Eve's side information. In this way it is possible for Alice to expand a small initial random seed into a much larger amount of trusted numbers. We present the results of an experimental demonstration of the protocol in which random numbers passing the most rigorous classical tests of randomness were produced. In Chapter 6, we provide a secure generation scheme for a continuous-variable (CV) QRNG.
Since true random numbers are an invaluable resource for both classical information technology and the emerging quantum one, it is clear that, to sustain the present and future ever-growing fluxes of data to encrypt, it is necessary to devise quantum random number generators able to generate numbers at gigabit or terabit per second rates. Several examples of QRNG protocols which in theory could reach such limits are given in the literature. Typically, these are based on the exploitation of the quadratures of the electromagnetic field, regarded as an infinite bosonic quantum system. The quadratures of the field can be measured with a well-known measurement scheme, so-called homodyne detection, which in principle can yield infinite-band noise. Consequently, the band of the random signal is limited only by the passband of the devices used to measure it. Photodiode detectors commonly work in the GHz band, so if one samples the signal with a fast enough ADC, gigabit or terabit rates can easily be reached. However, as in the case of discrete-variable QRNGs, the protocols found in the literature do not properly consider the purity of the quantum state being measured. The idea has been to extend the discrete-variable protocol of the previous chapter to the continuous case. We show that in the CV framework one faces not only the problem of state purity but also the problem of the precision of the measurements used to extract the randomness.
L'argomento di questa tesi può essere riassunto nella frase utilizzare il rumore classico per generare un migliore rumore quantistico. In particolare questa tesi riguarda da una parte la possibilita di sfruttare il rumore classico per trasmettere in modo efficace informazione quantistica, e dall'altra la misurazione del rumore classico per generare una migliore casualita quantistica. Nel primo caso ci si riferisce all'inviare bit quantistici attraverso l'atmosfera per creare trasmissioni allo scopo di distribuire chiavi crittografiche in modo quantistico (QKD) e questo sara oggetto di Capitolo 1 e Capitolo 2. Nel quadro delle comunicazioni quantistiche, la QKD è caratterizzata da notevoli difficolta sperimentali. Infatti, in linea di principio la QKD offre sicurezza incondizionata ma le sue realizzazioni pratiche devono affrontare tutti i limiti del mondo reale. Uno dei limiti principali sono le perdite introdotte dai canali di trasmissione. Le perdite causano errori e gli errori rendono il protocollo meno sicuro perché un avversario potrebbe camuffare la sua attivita di intercettazione utilizzando le perdite. Quando questo problema viene affrontato da un punto di vista teorico, si cerca di modellare l'effetto delle perdite mediante trasformazioni unitarie che trasformano i qubits in media secondo un livello fisso di attenuazione del canale. Tuttavia questo approccio è in qualche modo limitante, perché se si ha ha un elevato livello di rumore di fondo e le perdite si assumono costanti in media, potrebbe accadere che il protocollo possa abortire o peggio ancora, non iniziare, essendo il quantum bit error rate (QBER) oltre il limite (11\%) per la distribuzione sicura. Tuttavia, studiando e caratterizzando un canale ottico libero, si trova che il livello di perdite è tutt'altro che stabile e che la turbolenza induce variazioni di trasmissivita che seguono una statistica log-normale. Il punto pertanto è sfruttare questo rumore classico per generare chiave anche quando normalmente non sarebbe possibile. Per far ciò abbiamo ideato uno schema adattativo per la selezione in tempo reale (ARTS) degli istanti a basse perdite in cui vengono istantaneamente rilevati picchi di alta trasmissivita. A tal scopo, si utilizza un fascio laser classico ausiliario co-propagantesi con i qubit ma convenientemente inframezzato nel tempo. In questo modo la scintillazione viene monitorata in tempo reale e vengono selezionati gli intervalli di tempo che daranno luogo ad un QBER praticabile per una generazione di chiavi. Verra quindi presentato un criterio utile per la preselezione dell'intervallo di QBER basso in cui un treno di impulsi intensi si propaga nello stesso percorso dei qubits, con i parametri scelti in modo tale che la sua oscillazione nel tempo riproduce quello della comunicazione quantistica. Nel Capitolo 2 presentiamo quindi una dimostrazione ed i risultati di tale protocollo che è stato implementato presso l'arcipelago delle Canarie, tra l'isola di La Palma e quella di Tenerife: tali isole essendo separate da 143 km, costituiscono un ottimo teatro per testare la validita del protocollo in quanto le condizioni di distanza sono paragonabili a quelle satellitari e la gamma di scintillazione corrisponde quella che si avrebbe in ambiente con moderato maltempo in uno scenario di tipo urbano. 
Regarding the content of Chapter 3, we describe an innovative method for the physical generation of random numbers, based on the observation that a coherent light beam, after traversing a long path through atmospheric turbulence, gives rise to random and rapidly varying images. This phenomenon was observed in several quantum communication experiments carried out in the Canary Islands, where the classical laser beam used to point the terminals showed, at the receiver, a wavefront completely distorted with respect to the typical Gaussian profile. In particular, what is observed is a set of bright and dark spots whose geometry evolves randomly, the so-called dynamic speckle pattern. The source of this entropy is therefore atmospheric turbulence. Indeed, for a channel of such length, a solution of the Navier-Stokes equations for the atmospheric flow in which the beam propagates is completely out of reach, both analytically and by computational methods. The various models of atmospheric dynamics are based on Kolmogorov's statistical theory, which parametrizes the distribution of kinetic energy as the interaction of air eddies of decreasing size. However, such models provide only a statistical description of the beam spot and of its possible deviations, never an instantaneous prediction of the irradiance distribution. For this reason, when a laser beam is sent through the atmosphere, the latter can be regarded as a dynamic volumetric diffuser that distorts the wavefront of the beam. Within the chapter, experimental data are presented showing that the beam images possess the unpredictability characteristics required to generate genuine random numbers. Moreover, the randomness extraction method, based on combinatorial analysis and optimal in the context of Information Theory, is also presented. In Chapter 5 we present a new approach to the generation of random bits from quantum physical processes. Quantum mechanics has always been considered the best source of randomness, because of its intrinsically probabilistic nature. However, the typical paradigm employed to extract random numbers from a quantum system assumes that the state of the system is pure. This assumption, in principle, yields a generation process in which the outcome of the measurements is completely unpredictable according to Born's rule. The main problem, however, is that in real implementations, such as in a laboratory or in a commercial device, it is hardly possible to create a pure quantum state. What is generally obtained is a mixed quantum state. A mixed state, however, could be somehow correlated with another quantum system possibly held by an adversary. In the extreme case of a completely mixed state, a quantum generator is practically equivalent to a generator employing a classical physical process, which is in principle predictable. The chapter therefore shows why it is necessary to move from a classical randomness estimator, such as the classical min-entropy $H_{\min}(Z)$ of a random variable $Z$, to an estimator that takes into account quantum side information $E$, namely the conditional min-entropy $H_{\min}(Z|E)$.
The conditional min-entropy is a fundamental quantity because it allows one to derive the minimum number of random bits that can be extracted from the system in the presence of a non-pure state. We devised an effective protocol, based on the entropic uncertainty principle, for estimating the conditional min-entropy. In general, the entropic uncertainty principle makes it possible to take into account the information shared among multiple parties holding a tripartite quantum system and, above all, to bound the information that one party has about the state of the system after it has been measured. We adapted this principle to the bipartite case in which a user Alice, $A$, holds a quantum system that, in the scenario under study, we assume to be prepared by the adversary herself, Eve, $E$, and which could therefore be correlated with her system. In theory, Eve could then be able to predict all the outcomes of the measurements Alice performs on her part of the system, that is, she could have maximal knowledge of the random variable $Z$ in which the results of the measurements in the basis $\mathcal{Z}$ are recorded. However, we show that if Alice randomly measures the system in a basis $\mathcal{X}$ maximally complementary to $\mathcal{Z}$, she can infer a lower bound on the entropy $H_{\min}(Z|E)$. In this way, using techniques from classical cryptography, Alice can expand the small initial random seed used to choose the measurement bases into a much larger amount of secure numbers. We present the results of an experimental demonstration of the protocol, in which random numbers were produced that pass the most rigorous randomness evaluation tests. In Chapter 6, an ultrafast random number generation system based on a continuous-variable (CV) QRNG is described. Since genuine random numbers are a precious resource for both classical and quantum information technology, it is clear that, to sustain the ever-growing data flows required for cryptography, generators capable of producing streams at gigabit or terabit per second rates are needed. A few examples of QRNG protocols that could reach such limits are reported in the literature. Typically, these are based on measuring the quadratures of the electromagnetic field, which can be regarded as an infinite-dimensional bosonic quantum system. The field quadratures can be measured with the so-called homodyne detection scheme which, in principle, can extract a noise signal of infinite bandwidth. Consequently, the bandwidth of the random signal is limited only by the bandwidth of the devices used for the measurement. Since photodiode detectors commonly operate in the band of tens of GHz, if the signal is sampled with a sufficiently fast ADC with a large number of digitization bits, gigabit or terabit rates are easily achievable. However, as in the case of discrete-variable QRNGs, the protocols available in the literature do not adequately consider the purity of the quantum state to be measured. The idea is to extend the discrete-variable protocol of the previous chapter to the continuous case. We show that in the CV setting there is not only the problem of the purity of the state but also the problem of the precision of the measurements performed on it.
We propose, and give experimental results for, a new protocol able to extract random numbers at a high rate and with a high degree of security.
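For readers unfamiliar with the estimators named in this abstract, the standard textbook definitions can be written as follows (a sketch using common quantum-information notation, not formulas quoted from the thesis):

    % Classical min-entropy of a random variable Z with distribution P_Z
    H_{\min}(Z) = -\log_2 \max_{z} P_Z(z)

    % Conditional min-entropy given quantum side information E,
    % expressed through the adversary's optimal guessing probability
    H_{\min}(Z|E) = -\log_2 p_{\mathrm{guess}}(Z|E)

    % Number of almost-uniform bits extractable from Z against E
    % (leftover hash lemma, security parameter \varepsilon)
    \ell \approx H_{\min}(Z|E) - 2\log_2\frac{1}{\varepsilon}

The protocol sketched in the abstract estimates a lower bound on $H_{\min}(Z|E)$ from measurements in the complementary basis, and the extractor then keeps roughly that many bits.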
APA, Harvard, Vancouver, ISO, and other styles
29

Julis, Guenaëlle de. "Analyse d'accumulateurs d'entropie pour les générateurs aléatoires cryptographiques." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENM075.

Full text
Abstract:
In cryptography, random numbers are used frequently (seeds, tokens, ...) and poor randomness generation can compromise the entire security of a protocol, as the news regularly shows. Random number generators for cryptographic use are components made of three modules: the raw source that produces randomness (an algorithm or a physical phenomenon), a postprocessing stage to correct the defects of the source, and a cryptographic postprocessing stage to obtain the final randomness. This thesis focuses on the analysis of generators based on a physical source, with the aim of identifying postprocessing schemes adapted to their properties and resistant to perturbations of their operating environment. Since the complexity of the devices often hinders the explicit formulation of a proven stochastic model, their evaluation relies mainly on statistical analysis. However, statistical tests, the main method recommended by governmental institutions (ANSSI, BSI, NIST) to certify these components, can detect anomalies but do not allow them to be identified and characterized. The work in this thesis structures the modelling of a randomness source, viewed as a sequence of random variables, refines the statistical tests, and adds a temporal analysis to detect and make explicit its anomalies at the global or local level. The results were implemented in a library composed of a perturbation simulator, the statistical and temporal tools obtained, the recommended test batteries (FIPS, AIS31, TestU01, SP800), and postprocessing schemes appropriate to certain anomalies. The framework put in place made it possible to extract families of pattern anomalies whose properties make certain tests unable to distinguish the anomalous source from an ideally random source. The analysis of the weaknesses inherent in statistical methods showed that interpreting a test through a rejection interval or a success rate is not suited to detecting certain transition faults. It also made it possible to study entropy estimation methods, in particular the estimators proposed in the SP800-90 standard. Moreover, the specification parameters of certain generators, including one derived from the AES encryption standard, turned out to be distinguishable through the test statistics. The temporal tools developed evaluate the structure of the anomalies and their evolution over time, and analyse the deviant patterns in the neighbourhood of a given pattern. This made it possible, on the one hand, to apply the statistical tests with relevant parameters and, on the other hand, to present postprocessing schemes whose validity rests on the structure of the anomalies rather than on their amplitude.
While random numbers are frequently used in cryptography (seeds, tokens, ...), the news regularly shows how poor randomness generation can compromise the whole security of a protocol. Random number generators for cryptography are components with three stages: a source (an algorithm or a physical phenomenon) produces raw numbers, which are then postprocessed twice to correct anomalies. This thesis focuses on the analysis of physical random bit generators in order to derive postprocessing adapted to the anomalies of the source. As the design of a physical random bit generator is complex, its evaluation is mainly a statistical analysis based on hypothesis testing. However, the current standards (AIS31, FIPS 140-2, TestU01, SP800) cannot provide the information needed to characterize anomalies. This thesis therefore refines several tests and adds a temporal analysis to identify global and local anomalies and make them explicit. A C library was developed, providing an anomaly simulator and tools for applying the statistical and temporal analysis results to random bit generators.
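To give a flavour of the entropy estimation discussed above, here is a minimal Python sketch of a most-common-value style min-entropy estimate in the spirit of the SP800-90 estimators (the 2.576 factor is a 99% upper-confidence bound; this is an illustration, not the thesis's library):

    import math
    from collections import Counter

    def mcv_min_entropy(samples):
        """Most-common-value min-entropy estimate (bits per sample),
        using a 99% upper confidence bound on the most likely value."""
        n = len(samples)
        p_hat = Counter(samples).most_common(1)[0][1] / n
        p_upper = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / n))
        return -math.log2(p_upper)

    # Example: a biased bit source (70% zeros) yields well under 1 bit per sample.
    import random
    bits = [0 if random.random() < 0.7 else 1 for _ in range(100_000)]
    print(round(mcv_min_entropy(bits), 3))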
APA, Harvard, Vancouver, ISO, and other styles
30

Xu, Jinzhong. "Stream Cipher Analysis Based on FCSRs." UKnowledge, 2000. http://uknowledge.uky.edu/gradschool_diss/320.

Full text
Abstract:
Cryptosystems are used to provide security in communications and data transmissions. Stream ciphers are private-key systems that are often used to transform large volumes of data. For such systems to be secure, the key streams used in stream ciphers must be fully analyzed so that they do not contain specific patterns, statistical information or structures with which attackers would be able to quickly recover the entire key stream and break the system. Based on different schemes for generating sequences and different ways of representing them, there are a variety of stream cipher analyses. The most important one is the linear analysis based on linear feedback shift registers (LFSRs), which have been extensively studied since the 1960s. Every sequence over a finite field has a well-defined linear complexity; if a sequence has small linear complexity, it can be efficiently recovered by the Berlekamp-Massey algorithm. Therefore, key streams must have large linear complexities, and a great deal of work has been done to generate and analyze sequences with large linear complexities. In the early 1990s, Klapper and Goresky introduced feedback with carry shift registers over Z/(p) (p-FCSRs), where p is prime. Based on p-FCSRs, they developed a stream cipher analysis with properties similar to those of linear analysis. For instance, every sequence over Z/(p) has a well-defined p-adic complexity, and key streams of small p-adic complexity are not secure for use in stream ciphers. This dissertation focuses on stream cipher analysis based on feedback with carry shift registers. The first objective is to develop a stream cipher analysis based on feedback with carry shift registers over Z/(N) (N-FCSRs), where N is any integer greater than 1, not necessarily prime. The core of the analysis is a new rational approximation algorithm that can be used to efficiently compute rational representations of eventually periodic N-adic sequences. This algorithm differs from the one used in p-adic sequence analysis given by Klapper and Goresky, which is a modification of De Weger's rational approximation algorithm. The second objective is to generalize the feedback-with-carry shift register architecture to more general algebraic settings, called algebraic feedback shift registers (AFSRs). By using algebraic operations and structures on certain rings, we are able not only to construct feedback with carry shift registers, but also to develop rational approximation algorithms that yield new analyses of stream ciphers. The cryptographic implication of this work is that any sequence used in a stream cipher must have a large N-adic complexity and a large AFSR-based complexity as well as a large linear complexity.
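As a concrete reference for the FCSR architecture discussed in this abstract, below is a minimal Python sketch of a Fibonacci-mode feedback-with-carry shift register over Z/(N), following the standard Klapper-Goresky description; the taps and initial values are arbitrary illustrations, not parameters from the dissertation.

    def fcsr_sequence(q_taps, init_cells, init_memory, N, length):
        """Fibonacci-mode feedback-with-carry shift register over Z/(N).

        q_taps     : tap coefficients q_1..q_r
        init_cells : initial register contents a_{n-1}..a_{n-r} (most recent first)
        init_memory: initial integer memory (carry) m
        Each step computes sigma = sum(q_i * a_{n-i}) + m, outputs sigma mod N,
        and keeps the carry m = sigma // N.
        """
        cells = list(init_cells)          # cells[0] is the most recent symbol
        m = init_memory
        out = []
        for _ in range(length):
            sigma = sum(q * a for q, a in zip(q_taps, cells)) + m
            a_new = sigma % N
            m = sigma // N
            out.append(a_new)
            cells = [a_new] + cells[:-1]  # shift the register
        return out

    # Binary example (N = 2) with taps q_1 = 1, q_2 = 0, q_3 = 1.
    print(fcsr_sequence([1, 0, 1], [1, 0, 1], 0, 2, 20))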
APA, Harvard, Vancouver, ISO, and other styles
31

Jíra, Roman. "Generování náhodných čísel pomocí magnetických nanostruktur." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2015. http://www.nusl.cz/ntk/nusl-232088.

Full text
Abstract:
Random number generation can be based on physical events with a probabilistic character, on algorithms that use complex or one-way functions, or on a combination of both approaches. A magnetic vortex is a ground state of the magnetization that forms in magnetic micro- and nanostructures of an appropriate shape, dimensions and material. The characteristic quantities of the magnetic vortex form randomly if the ambient conditions are chosen suitably. This thesis presents a concept for a true random number generator that uses random switching of the states of the magnetic vortex. The concept was realized, random numbers were experimentally generated, and the resulting sequences were statistically analysed.
APA, Harvard, Vancouver, ISO, and other styles
32

Novotný, Marek. "Programy pro výpočet nejistoty měření metodou Monte Carlo." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-221220.

Full text
Abstract:
The thesis deals with establishing the uncertainties of indirect measurements. It focuses primarily on the random number generators in software that enables the calculation of measurement uncertainties by the Monte Carlo method. It then focuses on calculating the uncertainty of an indirect measurement both by the Monte Carlo method and by the classical numerical method. The practical part deals with the verification of the randomness of the number generators contained in various software packages. It also deals with determining the uncertainties of indirect current measurements by both of the above-mentioned methods and then with comparing and evaluating the values obtained.
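As a small illustration of the two approaches compared in the thesis, the sketch below propagates the uncertainty of a hypothetical indirect measurement (R = U/I) both by Monte Carlo sampling and by the classical GUM formula; all numerical values are made up for the example.

    import numpy as np

    rng = np.random.default_rng(1)
    M = 1_000_000                       # number of Monte Carlo trials
    U = rng.normal(10.0, 0.05, M)       # 10 V, standard uncertainty 0.05 V
    I = rng.normal(2.0, 0.01, M)        # 2 A, standard uncertainty 0.01 A
    R = U / I                           # indirect measurement R = U / I

    print("Monte Carlo:  R = %.4f ohm, u(R) = %.4f ohm" % (R.mean(), R.std(ddof=1)))

    # Classical (GUM) propagation: u(R)^2 = (dR/dU)^2 u(U)^2 + (dR/dI)^2 u(I)^2
    u_gum = np.sqrt((1 / 2.0) ** 2 * 0.05 ** 2 + (10.0 / 2.0 ** 2) ** 2 * 0.01 ** 2)
    print("GUM formula:  u(R) = %.4f ohm" % u_gum)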
APA, Harvard, Vancouver, ISO, and other styles
33

Matějíček, Jaroslav. "Generátory náhodných čísel pro kryptografii." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2012. http://www.nusl.cz/ntk/nusl-236519.

Full text
Abstract:
The content of this thesis is the design and statistical testing of two different hardware random number generators. It also includes an overview of sources of entropy, the algorithms used to correct deviations from the normal distribution, and a description of the statistical tests.
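The abstract does not spell out which correction algorithms were used; a classic example of such a postprocessing step is the von Neumann corrector, sketched here in Python:

    def von_neumann_debias(bits):
        """Classic von Neumann corrector: consume non-overlapping bit pairs,
        map 01 -> 0 and 10 -> 1, and discard 00 and 11. Removes bias from
        independent but biased bits at the cost of throughput."""
        out = []
        for i in range(0, len(bits) - 1, 2):
            a, b = bits[i], bits[i + 1]
            if a != b:
                out.append(a)        # 10 -> 1, 01 -> 0
        return out

    # Example: a heavily biased input still yields (fewer) unbiased output bits.
    import random
    raw = [1 if random.random() < 0.8 else 0 for _ in range(10_000)]
    post = von_neumann_debias(raw)
    print(len(post), sum(post) / len(post))   # the ratio is roughly 0.5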
APA, Harvard, Vancouver, ISO, and other styles
34

Michálek, Tomáš. "Efektivní generátor náhodných čísel v nízko-výkonových zařízení." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2017. http://www.nusl.cz/ntk/nusl-317140.

Full text
Abstract:
This thesis addresses the problem of generating random numbers on low-power devices. The author describes possible generation methods and implements selected (pseudo)random number generators on the MSP430F5438A. Four generators were added by enhancing one of them, and a new generator was created that uses the phenomenon of temperature change in the surroundings. For each generator, test sequences were generated and tested with the Dieharder, STS-NIST, and Visual Test suites. The output of the thesis is a working implementation of the generators, their evaluation by statistical methods, and a comparison between them.
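As a rough, hypothetical illustration of the temperature-based idea (not the thesis's MSP430 code), one can sample the noisy least significant bit of a temperature ADC and condition the collected bits with a hash; read_temperature_lsb below is a stand-in for the real ADC access.

    import hashlib, random

    def read_temperature_lsb():
        """Hypothetical stand-in for reading the on-chip temperature ADC.
        On real hardware this would be a register read; here it is faked."""
        return random.getrandbits(12) & 1     # keep only the noisy least significant bit

    def harvest_block(n_raw_bits=1024):
        """Collect noisy LSBs and whiten them with SHA-256 (conditioning step)."""
        raw = 0
        for _ in range(n_raw_bits):
            raw = (raw << 1) | read_temperature_lsb()
        return hashlib.sha256(raw.to_bytes(n_raw_bits // 8, "big")).digest()

    print(harvest_block().hex())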
APA, Harvard, Vancouver, ISO, and other styles
35

Liu, Chengxin. "Jitter in oscillators with 1/f noise sources and application to true RNG for cryptography." Link to electronic dissertation, 2006. http://www.wpi.edu/Pubs/ETD/Available/etd-011006-221104/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

AMINO, ROBERT, and JONI BAITAR. "Probabilistic Pseudo-random Number Generators." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-157351.

Full text
Abstract:
Random numbers are essential in many computer applications and games. The goal of this report is to examine two of the most commonly used random number generators and try to determine some of their strengths and weaknesses. These generators are the Linear Congruential Generator (LCG) and the Mersenne Twister (MT). The main objective is to determine which of these is optimal for low-intensity usage and everyday work. Although some of the test results were inconclusive, there were indications that MT is the better pseudorandom number generator (PRNG) and therefore the preferred one. However, be aware that this is not a general guideline and some implementations may differ. The final verdict was thus that MT is the more favourable option (mainly due to its speed) for everyday work, both on a practical and a theoretical level, if a choice must be made between the two.
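To make the two candidates concrete, a small Python sketch that pits a textbook LCG (Numerical Recipes constants, chosen here only for illustration) against CPython's built-in Mersenne Twister:

    import random, time

    class LCG:
        """Textbook linear congruential generator (Numerical Recipes constants)."""
        def __init__(self, seed=1):
            self.state = seed & 0xFFFFFFFF
        def next_u32(self):
            self.state = (1664525 * self.state + 1013904223) & 0xFFFFFFFF
            return self.state
        def random(self):
            return self.next_u32() / 2**32

    lcg = LCG(42)
    mt = random.Random(42)          # CPython's random module is a Mersenne Twister

    for name, gen in (("LCG", lcg.random), ("MT", mt.random)):
        t0 = time.perf_counter()
        s = sum(gen() for _ in range(1_000_000))
        print(f"{name}: mean = {s / 1_000_000:.4f}, time = {time.perf_counter() - t0:.2f}s")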
APA, Harvard, Vancouver, ISO, and other styles
37

Xu, Xiaoke. "Benchmarking the power of empirical tests for random number generators." Click to view the E-thesis via HKUTO, 2008. http://sunzi.lib.hku.hk/hkuto/record/B41508464.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Pospíšilík, Šimon. "Optimalizace návrhových parametrů bezpečnostního přelivu." Master's thesis, Vysoké učení technické v Brně. Fakulta stavební, 2019. http://www.nusl.cz/ntk/nusl-391881.

Full text
Abstract:
The diploma thesis focuses on the development of a program aimed at finding the optimal design parameters of a two-pole emergency spillway. These parameters are the lengths of the spillway edges and their relative height arrangement. The program is based on simulating the transformation of a flood wave. A multi-gradient algorithm was used to optimize the design parameters of the two-pole safety spillway.
APA, Harvard, Vancouver, ISO, and other styles
39

Kasikara, Gulin. "Progresses In Parallel Random Number Generators." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606651/index.pdf.

Full text
Abstract:
Monte Carlo simulations are embarrassingly parallel in nature, so having a parallel and efficient random number generator becomes crucial. To obtain a parallel generator with uncorrelated processors, parallelization methods are implemented together with a binary tree mapping. Although this method has considerable advantages, the constraints arising from the binary tree structure lead to a situation described as the problem of falling off the tree. In this thesis, a new spawning method based on binary tree traversal and the appointment of new spawn processors is proposed for use when the falling-off-the-tree problem is encountered. With this method, the spawning operation becomes more costly, but the independence of the parallel processors is guaranteed. In Monte Carlo simulations, random number generation time should be imperceptible compared with the execution time of the whole simulation. That is why linear congruential generators with Mersenne prime moduli are used. In highly branching Monte Carlo simulations, the cost of parameterization also gains importance, and it becomes reasonable to consider other types of primes or other parallelization methods that provide a different balance between parameterization cost and random number generation cost. With this idea in mind, this thesis proposes two approaches for improving the performance of linear congruential generators. The first is to use Sophie Germain primes as moduli, and the second is a hybrid method combining parameterization and splitting techniques. The performance of Sophie Germain primes relative to Mersenne primes is shown in graphs, and it is observed that in some cases the proposed approaches perform better.
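For reference, the basic building block mentioned above is a multiplicative (Lehmer) congruential generator with a Mersenne prime modulus; a minimal Python sketch follows (the parallel spawning scheme and the Sophie Germain variants studied in the thesis are not reproduced here).

    class Lehmer:
        """Multiplicative LCG x_{k+1} = a * x_k mod m with the Mersenne prime
        modulus m = 2^31 - 1 and the classic MINSTD multiplier a = 16807."""
        M = 2**31 - 1
        A = 16807
        def __init__(self, seed):
            assert 0 < seed < self.M
            self.x = seed
        def next(self):
            self.x = (self.A * self.x) % self.M
            return self.x

    # A trivial way to run several streams is to give each processor its own seed;
    # the parameterization methods discussed in the thesis instead assign each
    # stream its own multiplier (a distinct primitive root mod M).
    streams = [Lehmer(seed) for seed in (1, 2, 3, 4)]
    print([s.next() for s in streams])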
APA, Harvard, Vancouver, ISO, and other styles
40

Narayanan, Ramaswamy Karthik. "ROLLBACK-ABLE RANDOM NUMBER GENERATORS FOR THE SYNCHRONOUS PARALLEL ENVIRONMENT FOR EMULATION AND DISCRETE-EVENT SIMULATION (SPEEDES)." Master's thesis, University of Central Florida, 2005. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4352.

Full text
Abstract:
Random numbers form the heart and soul of a discrete-event simulation system. There are few situations where the actions of the entities in the simulated process can be completely predicted in advance; real-world processes are more probabilistic than deterministic. Hence, such chance elements are represented in the system by various statistical models, such as random number generators, which can stand in for any number of factors, for example the length of a queue. However, simulations have grown in size and are sometimes required to run on multiple machines, which share the various methods or events of the simulation among themselves. These machines can be distributed across a LAN or even the Internet. In such cases, to preserve the validity of the simulation model, rollback-able random number generators are needed. This thesis is an effort to develop such rollback-able random number generators for the Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) developed by NASA. These rollback-able random number generators also add several statistical distribution models to the already rich SPEEDES library.
M.S.
Other
Engineering and Computer Science
Modeling and Simulation
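The abstract only names the requirement; as a rough illustration of what "rollback-able" means in an optimistic simulator (not SPEEDES's actual mechanism), a generator can checkpoint its state per event and restore it on rollback:

    import random

    class RollbackRNG:
        """Wraps a PRNG so that its state can be checkpointed per simulated event
        and restored if the optimistic simulation rolls an event back."""
        def __init__(self, seed):
            self._rng = random.Random(seed)
            self._checkpoints = {}              # event time -> saved generator state

        def draw(self, event_time):
            self._checkpoints[event_time] = self._rng.getstate()
            return self._rng.random()

        def rollback(self, event_time):
            self._rng.setstate(self._checkpoints[event_time])

    rng = RollbackRNG(7)
    a = rng.draw(10.0)            # event at t = 10
    b = rng.draw(11.0)            # event at t = 11
    rng.rollback(11.0)            # undo the t = 11 event
    assert rng.draw(11.0) == b    # re-executing the event reproduces the same draw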
APA, Harvard, Vancouver, ISO, and other styles
41

Bang, Jung Woong. "An Empirical Comparison of Random Number Generators: Period, Structure, Correlation, Density, and Efficiency." Thesis, University of North Texas, 1995. https://digital.library.unt.edu/ark:/67531/metadc277807/.

Full text
Abstract:
Random number generators (RNGs) are widely used in conducting Monte Carlo simulation studies, which are important in the field of statistics for comparing power, mean differences, or distribution shapes between statistical approaches. Statistical results, however, may differ when different random number generators are used. Often, older methods have been used blindly with no understanding of their limitations, and many random functions supplied with computers today have been found to be comparatively unsatisfactory. In this study, five multiplicative linear congruential generators (MLCGs) were chosen which are provided in the following statistical packages: RANDU (IBM), RNUN (IMSL), RANUNI (SAS), UNIFORM (SPSS), and RANDOM (BMDP). Using a personal computer (PC), an empirical investigation was performed using five criteria: period length before the random numbers repeat, distribution shape, correlation between adjacent numbers, density of the distributions, and the normal approach of the random number generator (RNG) in a normal function. All RNG FORTRAN programs were rewritten in Pascal, which is a more efficient language for the PC. Sets of random numbers were generated using different starting values. A good RNG should have the following properties: a long enough period; a well-structured pattern in its distribution; independence between random number sequences; a random and uniform distribution; and a good normal approach in the normal distribution. The findings of this study suggest that the above five criteria need to be examined when conducting a simulation study with large enough sample sizes and various starting values, because the RNG selected can affect the statistical results. Furthermore, a study intended to demonstrate reproducibility and validity should indicate the source of the RNG, the type of RNG used, the evaluation results of the RNG, and any pertinent information related to the computer used in the study. Recommendations for future research are suggested in the area of other RNGs and methods not used in this study, such as additive, combined, mixed and shifted RNGs.
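One reason such comparisons matter is illustrated by RANDU itself: its triples satisfy a fixed lattice relation, so consecutive points fall on only 15 planes in the unit cube. A standalone Python check (not code from the dissertation):

    def randu(seed, n):
        """IBM's RANDU: x_{k+1} = 65539 * x_k mod 2^31 (seed must be odd)."""
        xs, x = [], seed
        for _ in range(n):
            x = (65539 * x) % 2**31
            xs.append(x)
        return xs

    xs = randu(1, 1000)
    # The well-known defect: every triple satisfies x_{k+2} = 6*x_{k+1} - 9*x_k (mod 2^31).
    assert all((xs[k + 2] - 6 * xs[k + 1] + 9 * xs[k]) % 2**31 == 0
               for k in range(len(xs) - 2))
    print("RANDU triples satisfy the 6x - 9x lattice relation")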
APA, Harvard, Vancouver, ISO, and other styles
42

Abellán, Sánchez Carlos. "Quantum random number generators for industrial applications." Doctoral thesis, Universitat Politècnica de Catalunya, 2018. http://hdl.handle.net/10803/587190.

Full text
Abstract:
Randomness is one of the most intriguing, inspiring and debated topics in the history of the world. It appears every time we wonder about our existence, about the way we are, e.g. Do we have free will? Is evolution a result of chance? It is also present in any attempt to understand our anchoring to the universe, and about the rules behind the universe itself, e.g. Why are we here and when and why did all this start? Is the universe deterministic or does unpredictability exist? Remarkably, randomness also plays a central role in the information era and technology. Random digits are used in communication protocols like Ethernet, in search engines and in processing algorithms such as PageRank. Randomness is also widely used in so-called Monte Carlo methods in physics, biology, chemistry, finance and mathematics, as well as in many other disciplines. However, the most iconic use of random digits is found in cryptography. Random numbers are used to generate cryptographic keys, which are the most basic element to provide security and privacy to any form of secure communication. This thesis has been carried out with the following questions in mind: Does randomness exist in photonics? If so, how do we mine it and how do we mine it in a massively scalable manner so that everyone can easily use it? Addressing these two questions led us to combine tools from fundamental physics and engineering. The thesis starts with an in-depth study of the phase diffusion process in semiconductor lasers and its application to random number generation. In contrast to other physical processes based on deterministic laws of nature, the phase diffusion process has a pure quantum mechanical origin, and, as such, is an ideal source for generating truly unpredictable digits. First, we experimentally demonstrated the fastest quantum random number generation scheme ever reported (at the time), using components from the telecommunications industry only. Up to 40 Gb/s were demonstrated to be possible using a pulsed scheme. We then moved towards building prototypes and testing them with partners in supercomputation and fundamental research. In particular, the devices developed during this thesis were used in the landmark loophole-free Bell test experiments of 2015. In the process of building the technology, we started a new research focus as an attempt to answer the following question: How do we know that the digits that we generate are really coming from the phase diffusion process that we trust? As a result, we introduced the randomness metrology methodology, which can be used to derive quantitative bounds on the quality of any physical random number generation device. Finally, we moved towards miniaturisation of the technology by leveraging techniques from the photonic integrated circuits technology industry. The first fully integrated quantum random number generator was demonstrated using a novel two-laser scheme on an Indium Phosphide platform. In addition, we also demonstrated the integration of part of the technology on a Silicon Photonics platform, opening the door towards manufacturing in the most advanced semiconductor industry.
APA, Harvard, Vancouver, ISO, and other styles
43

Tso, Chi-wai, and 曹志煒. "Stringency of tests for random number generators." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B29748367.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Raffaelli, Francesco. "Quantum random number generators in integrated photonics." Thesis, University of Bristol, 2019. http://hdl.handle.net/1983/b20b0798-755d-4a57-843f-3951805e9f53.

Full text
Abstract:
Random numbers find applications in a range of different fields, from quantum key distribution and classical cryptography to fundamental science. They also find extensive use in gambling and lotteries. By exploiting the probabilistic nature of quantum mechanics, quantum random number generators (QRNGs) provide a secure and efficient means of producing random numbers. Most of the quantum random number generators demonstrated so far have been built in bulk optics, using either free-space or fibre-optic components. While showing good performance, most of these demonstrations are strongly limited in real-life applications due to issues such as size, cost and the manufacturing process. In this thesis I report the demonstration of three different QRNGs based on integrated photonics. First, I demonstrated a QRNG based on homodyne measurement of optical vacuum states on a silicon-on-insulator (SOI) chip. Second, I developed an SOI QRNG based on phase fluctuations from a laser diode. In these two schemes, all the optical and opto-electronic components, excluding the laser, were integrated onto a silicon-on-insulator device. These schemes, being built on a silicon-on-insulator chip, are potentially CMOS-compatible and pave the way for integration into other, more complex systems. These QRNGs showed Gbps generation rates and passed the statistical tests provided by NIST. Third, I report a preliminary study of a QRNG based on homodyne measurement of optical vacuum states on an Indium Phosphide (InP) chip. In this third experiment, all the components, including a laser diode, were monolithically integrated on the same chip, which provides a great advantage in terms of the overall size of the optics of the device.
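As a toy model of the vacuum-homodyne scheme described above (purely illustrative numbers, not the devices in the thesis), one can simulate Gaussian quadrature samples, quantize them with an ADC, and look at how much entropy the bin distribution carries per sample:

    import numpy as np

    rng = np.random.default_rng(0)
    samples = rng.normal(0.0, 1.0, 1_000_000)        # quadrature noise of the vacuum state
    adc_bits = 8
    edges = np.linspace(-4, 4, 2**adc_bits - 1)      # ADC range of +/- 4 sigma
    codes = np.digitize(samples, edges)              # 256 ADC output codes

    counts = np.bincount(codes, minlength=2**adc_bits)
    p = counts / counts.sum()
    shannon = -(p[p > 0] * np.log2(p[p > 0])).sum()
    min_entropy = -np.log2(p.max())
    print(f"Shannon entropy: {shannon:.2f} bits, min-entropy: {min_entropy:.2f} bits per sample")

The min-entropy figure is the relevant one for deciding how many bits a randomness extractor may keep from each digitized sample.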
APA, Harvard, Vancouver, ISO, and other styles
45

Ruhault, Sylvain. "Security analysis for pseudo-random number generators." Thesis, Paris, Ecole normale supérieure, 2015. http://www.theses.fr/2015ENSU0014/document.

Full text
Abstract:
In cryptography, randomness plays an important role in multiple applications. It is required in fundamental tasks such as key generation and initialization vector generation, or in key exchange. The security of these cryptographic algorithms and protocols relies on a source of unbiased and uniformly distributed random bits. Cryptography practitioners usually assume that parties have access to perfect randomness. However, quite often this assumption is not realizable in practice, and random bits are generated by a pseudo-random number generator. When this is done, the security of the scheme depends, of course, in a crucial way on the quality of the (pseudo-)randomness generated. However, only a few of the generators used in practice have been analyzed, and therefore practitioners and end users cannot easily assess their real security level. In this thesis we provide security models for the assessment of pseudo-random number generators and we propose secure constructions. In particular, we propose a new definition of robustness and we extend it to capture memory attacks and side-channel attacks. On the practical side, we provide a security assessment of generators used in practice, embedded in system kernels (Linux /dev/random) and cryptographic libraries (OpenSSL and Java SecureRandom), and we prove that these generators contain potential vulnerabilities.
APA, Harvard, Vancouver, ISO, and other styles
46

Epstein, Peter. Carleton University Dissertation, Computer Science. "Generating geometric objects at random." Ottawa, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
47

Грицак, Анатолій Васильович, and Anatoliy Hrytsak. "Методи побудови ефективних криптографічних функцій гешування." Thesis, Національний авіаційний університет, 2020. https://er.nau.edu.ua/handle/NAU/44653.

Full text
Abstract:
The dissertation is devoted to solving the topical scientific problem of developing and studying new effective hash functions that provide the required level of security at sufficiently high speed. An analysis of modern methods and algorithms for constructing and implementing effective cryptographic hashing functions was carried out, which made it possible to identify their shortcomings and formalize the research tasks. Methods of constructing hashing functions were developed that increase the speed of cryptographic data processing and provide resistance to cryptanalytic attacks. The method of constructing pseudorandom number generators was improved, which made it possible to form a statistically robust keystream for cryptographic applications. The method of cryptographic protection of information was also improved: by binding messages to the user ID, session ID, sending time, message length and serial number, and by using a new procedure for session key formation and encryption, it ensures the confidentiality and integrity of data in modern information and communication systems and technologies. The last chapter of the dissertation contains a study of the collision characteristics of the proposed hash functions using so-called "baby versions" of the hashing functions, based on an existing experimental technique relevant in cryptography. Specialized software was developed in the form of console applications in the C++ programming language (Microsoft Visual Studio 2013, Release build), together with a methodology that allowed the experiments to be conducted and the proposed methods to be verified. The results of the dissertation are used in the educational process of Vinnytsia National Technical University (to increase the efficiency of training specialists in the specialty 125 "Cybersecurity"), in the scientific process of the National Aviation University, and in the Educational & Research Complex "Information and Communication Systems", as confirmed by implementation acts.
APA, Harvard, Vancouver, ISO, and other styles
48

Korsbakke, Andreas, and Robin Ringsell. "Promestra Security compared with other random number generators." Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18238.

Full text
Abstract:
Background. Being able to trust cryptographic algorithms is a crucial part of society today, because of all the information that is gathered by companies all over the world. With this thesis, we want to help both Promestra AB and potential future customers to evaluate whether its random number generator can be trusted. Objectives. The main objective of the study is to compare the random number generator in Promestra Security with others with the help of the test suite made by the National Institute of Standards and Technology. The comparison is made with other random number generators such as Mersenne Twister, Blum-Blum-Shub and more. Methods. The selected method in this study was to gather a total of 100 million bits from each random number generator and use these in the National Institute of Standards and Technology test suite for 100 tests to get a fair evaluation of the algorithms. The test suite provides a statistical summary which was then analyzed. Results. The results show how many iterations out of 100 passed and also the distribution of the results. The obtained results show that some of the tested random number generators clearly struggle in many of the tests. They also show that half of the tested generators passed all of the tests. Conclusions. Promestra Security and Blum-Blum-Shub come close to passing all the tests, but in the end they cannot be considered the preferable random number generators. The five that passed and seem to have no clear limitations are: Random.org, Micali-Schnorr, Linear-Congruential, CryptGenRandom, and Mersenne Twister.
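The full NIST suite is large, but its simplest member, the frequency (monobit) test, shows what each of the 100 iterations mentioned above checks; a compact Python sketch:

    import math

    def monobit_p_value(bits):
        """NIST SP 800-22 frequency (monobit) test: the sum of +/-1 mapped bits,
        normalized by sqrt(n), should behave like a standard normal variable."""
        n = len(bits)
        s = sum(2 * b - 1 for b in bits)
        s_obs = abs(s) / math.sqrt(n)
        return math.erfc(s_obs / math.sqrt(2))

    import random
    bits = [random.getrandbits(1) for _ in range(1_000_000)]
    print(monobit_p_value(bits))          # should usually be well above 0.01
    print(monobit_p_value([1] * 1000))    # an all-ones stream fails with p near 0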
APA, Harvard, Vancouver, ISO, and other styles
49

Xu, Xiaoke, and 許小珂. "Benchmarking the power of empirical tests for random numbergenerators." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B41508464.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Tolunay, John. "Parallel gaming related algorithms for an embedded media processor." Thesis, Linköpings universitet, Informationskodning, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-86154.

Full text
Abstract:
A new type of computing architecture called ePUMA is under development by the ePUMA Research Team at the Department of Electrical Engineering at Linköping University. It contains several single instruction multiple data (SIMD) cores, called SIMD Units, in which up to 64 computations can be performed in parallel. The goal of the architecture is to create a low-power chip with good performance for embedded applications. One possible application is video games. In this work we have studied a selected set of video-game-related algorithms, including a Pseudo-Random Number Generator, Clipping, and Rasterization & Fragment Processing, analyzing how well they fit the ePUMA platform.
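The abstract does not say which PRNG was implemented; as an example of the branch-free style of generator that maps well onto wide SIMD lanes, here is Marsaglia's xorshift32 in Python (an illustration, not the ePUMA code):

    def xorshift32(state):
        """Marsaglia's xorshift32: three shift/XOR steps per output, with no
        branches or multiplies, which is why this family suits SIMD lanes."""
        state ^= (state << 13) & 0xFFFFFFFF
        state ^= state >> 17
        state ^= (state << 5) & 0xFFFFFFFF
        return state & 0xFFFFFFFF

    x = 2463534242          # any non-zero 32-bit seed
    for _ in range(4):
        x = xorshift32(x)
        print(x)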
APA, Harvard, Vancouver, ISO, and other styles