
Dissertations / Theses on the topic 'Analysis of Hash Functions'



Consult the top 50 dissertations / theses for your research on the topic 'Analysis of Hash Functions.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Kocak, Onur. "Design And Analysis Of Hash Functions." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12610769/index.pdf.

Full text
Abstract:
Hash functions are cryptographic tools used in various applications such as digital signatures, message integrity checking, password storage and random number generation. These cryptographic primitives were first constructed using modular arithmetic operations, which were popular at the time because of public-key cryptography. Later, in 1989, Merkle and Damgård independently proposed an iterative construction method. This method was easy to implement and had a security proof. MD-4 was the first hash function designed using the Merkle-Damgård construction; MD-5 and the SHA algorithms followed. The improvements in construction methods brought corresponding improvements and variations in cryptanalytic methods. The series of attacks by Wang et al. on the MD and SHA families threaten the security of these hash functions. Moreover, as the standard hashing algorithm SHA-2 has a structure similar to the mentioned hash functions, its security became questionable. Therefore, NIST announced a public contest to select the new hash standard, SHA-3. The design and analysis of hash functions became one of the most interesting topics of cryptography. A considerable number of algorithms were designed for the competition, tested against possible attacks and proposed to NIST. After this step, worldwide scrutiny of the security of the algorithms began, and it will continue until the 4th quarter of 2011 to contribute to the selection process. This thesis presents two important aspects of hash functions: design and analysis. The design of hash functions is investigated under two subtopics: compression functions and construction methods. Compression functions are the core of hashing algorithms, and most of the design effort goes into the compression function. Moreover, for Merkle-Damgård hash functions, the security of the algorithm depends on the security of the compression function.
The construction method is also an important design parameter which defines the strength of the algorithm. The construction method and the compression function should be consistent with each other. On the other hand, when designing a hash function, analysis is as important as choosing the design parameters. Using known attacks, possible weaknesses in the algorithm can be revealed and the algorithm can be strengthened. The security of a hash function can also be examined using cryptanalytic methods. The analysis part of the thesis consists of various generic attacks selected to apply to most hash functions, including the attacks that NIST expects the new standard algorithm to resist.
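The Merkle-Damgård iteration this abstract describes can be sketched in a few lines. The chaining and length-strengthening padding below follow the construction; the compression function is a deliberately weak placeholder invented for illustration, not any scheme from the thesis:

```python
import struct

def md_hash(message: bytes, compress, iv: bytes, block_size: int = 64) -> bytes:
    """Merkle-Damgard iteration: pad, split into blocks, then chain
    `compress`, which maps (chaining_value, block) -> new chaining value."""
    # Length-strengthening padding: 0x80, zeros, 64-bit message bit-length.
    padded = message + b"\x80"
    padded += b"\x00" * ((-len(padded) - 8) % block_size)
    padded += struct.pack(">Q", len(message) * 8)
    h = iv
    for i in range(0, len(padded), block_size):
        h = compress(h, padded[i:i + block_size])
    return h

def toy_compress(h: bytes, block: bytes) -> bytes:
    """A cryptographically worthless placeholder compression function."""
    mixed = bytes(a ^ b for a, b in zip(h * (len(block) // len(h) + 1), block))
    return mixed[:len(h)]
```

As the abstract notes, the security of the whole hash then reduces to the security of the compression function; with `toy_compress` the scheme is trivially breakable, which is exactly the point of the illustration.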
APA, Harvard, Vancouver, ISO, and other styles
2

Kasselman, Pieter Retief. "Analysis and design of cryptographic hash functions." Pretoria : [s.n.], 2006. http://upetd.up.ac.za/thesis/available/etd-12202006-125340/.

Full text
3

Sulak, Fatih. "Statistical Analysis Of Block Ciphers And Hash Functions." PhD thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613045/index.pdf.

Full text
Abstract:
One of the most basic properties expected from block ciphers and hash functions is passing statistical randomness testing, as they are supposed to behave like random mappings. Previously, testing of the AES candidate block ciphers was done using the statistical tests defined in the NIST Test Suite. As some of the tests in this suite require long sequences, data sets were formed by concatenating the outputs of the algorithms obtained from various input types. However, the nature of block cipher and hash function algorithms necessitates devising tests and test parameters focused particularly on short sequences; we therefore propose a package of statistical randomness tests which produce reliable results for short sequences and test the outputs of the algorithms directly rather than concatenations. Moreover, we propose an alternative method to evaluate the test results and state the computations of the related probabilities required by the new evaluation method. We also propose another package of statistical tests, designed on the basis of certain cryptographic properties of block ciphers and hash functions to evaluate their randomness, namely cryptographic randomness testing. The packages were applied to the AES finalists and produced more precise results than those obtained in similar applications; they were also applied to the SHA-3 second-round candidate algorithms.
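The short-sequence testing theme can be made concrete with the simplest NIST-style test, the frequency (monobit) test, whose p-value has a closed form. This sketch follows the standard SP 800-22 definition, not anything specific to the thesis:

```python
import math

def monobit_pvalue(bits) -> float:
    """NIST SP 800-22 frequency (monobit) test. Under the hypothesis of
    randomness, the bit-sum S_n / sqrt(n) is approximately standard
    normal, so the p-value is erfc(|S_n| / sqrt(2n))."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # map 0 -> -1, 1 -> +1
    return math.erfc(abs(s) / math.sqrt(2 * n))
```

A perfectly balanced sequence gives p = 1.0, a heavily biased one gives p near 0; a sequence is conventionally rejected when p < 0.01.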
4

Knutsen, Mats, and Kim-André Martinsen. "Java Implementation and Performance Analysis of 14 SHA-3 Hash Functions on a Constrained Device." Thesis, Norwegian University of Science and Technology, Department of Telematics, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-9114.

Full text
Abstract:

Several of the cryptographic hash functions in wide use today are under attack. With the need to maintain a certain level of security, NIST is in the process of selecting new cryptographic hash function(s). Through a public competition the candidates will be evaluated and analyzed by the public, and the winner(s) become the new standard cryptographic hash algorithm(s). Cryptographic hash algorithms have a wide range of applications, and the winner(s) will have to perform well on various platforms and in various application areas. The number of constrained devices surrounding us on a daily basis is rapidly increasing. As these devices are used for a great variety of applications, security issues arise. The winning algorithm(s) will not only have to provide strong security, but also show good performance and the capability to run on constrained devices. In this thesis, we show the results of our implementation of the second-round SHA-3 candidates in Java, and perform a cost and performance analysis of them on a low-cost 32-bit ARM9 CPU by measuring cycles/byte and ROM requirements. The analysis is conducted on the Sun SPOT platform, by Sun Microsystems, with a Squawk Virtual Machine.

5

Abdoun, Nabil. "Design, implementation and analysis of keyed hash functions based on chaotic maps and neural networks." Thesis, Nantes, 2019. http://www.theses.fr/2019NANT4013/document.

Full text
Abstract:
Hash functions are among the most useful primitives in cryptography. They play an important role in data integrity, message authentication, digital signatures and authenticated encryption. Thus, the design of secure hash functions is crucial. In this thesis, we designed, implemented, and analyzed the performance of two architectures, each with two keyed hash function structures based on chaotic maps and neural networks (KCNN). The first architecture is based on the Merkle-Damgård construction, while the second uses the Sponge function. The first structure of the first architecture consists of two KCNN layers with three different output schemes (CNN-Matyas-Meyer-Oseas, Modified CNN-Matyas-Meyer-Oseas and CNN-Miyaguchi-Preneel). The second structure is composed of a KCNN layer followed by a combination layer of nonlinear functions. The first structure of the second architecture is formed of two KCNN layers with two hash lengths, 256 and 512 bits. The second structure is similar to that used in the first architecture. The chaotic system is used to generate the KCNN parameters. The results obtained by the statistical tests, as well as the cryptanalytic analysis, demonstrate the security of the proposed KCNN hash functions. Finally, we are currently working on the KCNN-DUPLEX structure, which integrates the proposed (Sponge-based) KCNN hash functions for use in an authenticated encryption application.
6

Orvidaitė, Halina. "Statistinė SHA-3 konkurso maišos funkcijų analizė." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2014. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2012~D_20140704_171509-46405.

Full text
Abstract:
The main aim of my final master's thesis was to build a pseudorandom number generator from the compression functions of the NIST SHA-3 candidate hash algorithms and to run statistical tests on the sequences it generates. During my studies I gathered the theoretical background needed to understand the hash algorithms represented by the five finalists of the NIST SHA-3 competition. I analyzed in detail the algorithms of the current hash function standards, whose properties are the minimum requirement for the SHA-3 candidates, and presented each SHA-3 finalist's function with a deep analysis. I also covered the Statistical Test Suite created by the US National Institute of Standards and Technology, which tests binary streams generated by random or pseudorandom number generators. I gave a detailed description of the algorithms in this suite: the main idea and aim of each test, its input variables, the steps of the algorithm, the requirements on the input data, and the interpretation of the results. Finally, I introduced an algorithm for a pseudorandom number generator, gave its realization in Java, generated a test data suite and assessed it with the NIST statistical test suite.
7

Hegde, Suprabha Shreepad. "Analysis of Non-Interactive Zero Knowledge Proof." University of Cincinnati / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1535702372270471.

Full text
8

Aumüller, Martin. "On the Analysis of Two Fundamental Randomized Algorithms - Multi-Pivot Quicksort and Efficient Hash Functions." Doctoral thesis (supervisor: Martin Dietzfelbinger; reviewers: Philipp Woelfel, Rasmus Pagh), Ilmenau: Universitätsbibliothek Ilmenau, 2015. http://d-nb.info/107549317X/34.

Full text
9

Graff, Nathaniel. "Differential Power Analysis In-Practice for Hardware Implementations of the Keccak Sponge Function." DigitalCommons@CalPoly, 2018. https://digitalcommons.calpoly.edu/theses/1838.

Full text
Abstract:
The Keccak Sponge Function is the winner of the National Institute of Standards and Technology (NIST) competition to develop the Secure Hash Algorithm-3 Standard (SHA-3). Prior work has developed reference implementations of the algorithm and described the structures necessary to harden it against power analysis attacks, which can weaken the cryptographic properties of the hash algorithm. This work demonstrates the architectural changes to the reference implementation necessary to achieve the theoretical side-channel-resistant structures, compares their efficiency and performance characteristics after synthesis and place-and-route on Field Programmable Gate Arrays (FPGAs), publishes the resulting implementations under the Massachusetts Institute of Technology (MIT) open source license, and shows that the resulting implementations demonstrably harden the sponge function against power analysis attacks.
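The sponge construction that Keccak instantiates can be sketched abstractly: XOR message blocks into the rate portion of the state, permute, then squeeze output. The permutation below is an invented placeholder, not Keccak-f, and the tiny rate/capacity parameters are for illustration only:

```python
def sponge_hash(message: bytes, rate: int = 8, capacity: int = 8,
                out_len: int = 16) -> bytes:
    """Toy sponge: absorb message blocks into the rate bytes of the
    state, permute, then squeeze output blocks."""
    state = bytearray(rate + capacity)

    def permute(st: bytearray) -> bytearray:
        # Invented placeholder permutation, NOT Keccak-f.
        st = st[1:] + st[:1]                       # rotate bytes
        for i in range(len(st)):
            st[i] = (st[i] ^ ((st[i - 1] << 1) & 0xFF) ^ i) & 0xFF
        return st

    # Pad with a 0x01 byte and zeros to a multiple of the rate.
    padded = message + b"\x01" + b"\x00" * ((-len(message) - 1) % rate)
    for i in range(0, len(padded), rate):          # absorbing phase
        for j in range(rate):
            state[j] ^= padded[i + j]
        state = permute(state)
    out = b""
    while len(out) < out_len:                      # squeezing phase
        out += bytes(state[:rate])
        state = permute(state)
    return out[:out_len]
```

The `capacity` bytes are never output directly, which is what gives a real sponge its security margin; power analysis, as studied above, attacks the permutation's hardware leakage rather than this abstract structure.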
10

Cochran, Martin J. "Cryptographic hash functions." Connect to online resource, 2008. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3303860.

Full text
11

Shrimpton, Thomas Eric. "Provably-Secure Cryptographic Hash Functions." Digital Dissertations database (restricted to UC campuses; access is free to UC campus dissertations), 2004. http://uclibs.org/PID/11984.

Full text
12

Al-Kuwari, Saif. "Integrated-key cryptographic hash functions." Thesis, University of Bath, 2011. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.545328.

Full text
Abstract:
Cryptographic hash functions have always played a major role in most cryptographic applications. Traditionally, hash functions were designed in the keyless setting, where a hash function accepts a variable-length message and returns a fixed-length fingerprint. Unfortunately, over the years, significant weaknesses were reported on instances of some popular "keyless" hash functions. This has motivated the research community to start considering the dedicated-key setting, where a hash function is publicly keyed. In this approach, families of hash functions are constructed such that the individual members are indexed by different publicly-known keys. This has, evidently, also allowed for more rigorous security arguments. However, it turns out that converting an existing keyless hash function into a dedicated-key one is usually non-trivial since the underlying keyless compression function of the keyless hash function does not normally accommodate the extra key input. In this thesis we define and formalise a flexible approach to solve this problem. Hash functions adopting our approach are said to be constructed in the integrated-key setting, where keyless hash functions are seamlessly and transparently transformed into keyed variants by introducing an extra component accompanying the (still keyless) compression function to handle the key input separately outside the compression function. We also propose several integrated-key constructions and prove that they are collision resistant, pre-image resistant, 2nd pre-image resistant, indifferentiable from Random Oracle (RO), indistinguishable from Pseudorandom Functions (PRFs) and Unforgeable when instantiated as Message Authentication Codes (MACs) in the private key setting.
We further prove that hash functions constructed in the integrated-key setting are indistinguishable from their variants in the conventional dedicated-key setting, which implies that proofs from the dedicated-key setting can be naturally reduced to the integrated-key setting.
13

Halunen, K. (Kimmo). "Hash function security: cryptanalysis of the Very Smooth Hash and multicollisions in generalised iterated hash functions." Doctoral thesis, Oulun yliopisto, 2012. http://urn.fi/urn:isbn:9789514299667.

Full text
Abstract:
In recent years, the amount of electronic communication has grown enormously. This has posed some new problems in information security. In particular, the methods of cryptography have been under much scrutiny. There are several basic primitives that modern cryptographic protocols utilise. One of these is the hash function, which is used to compute short hash values from messages of any length. In this thesis, we study the security of hash functions from two different viewpoints. First of all, we analyse the security of the Very Smooth Hash against preimage attacks. We develop an improved method for finding preimages of the Very Smooth Hash, compare this method with existing methods and demonstrate its efficiency with practical results. Furthermore, we generalise this method to the discrete logarithm variants of the Very Smooth Hash. Secondly, we describe the methods for finding multicollisions in traditional iterated hash functions and give some extensions and improvements to these. We also outline a method for finding multicollisions in generalised iterated hash functions and discuss the implications of these findings. In addition, we generalise these multicollision finding methods to some graph-based hash functions.
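The multicollision results surveyed in theses like this one build on Joux's observation that k successive single-block collisions in an iterated hash yield 2^k messages with the same digest. A toy demonstration, using a deliberately truncated (16-bit) compression function so that birthday searches are cheap:

```python
import itertools
from hashlib import sha256

def toy_compress(h: bytes, block: bytes) -> bytes:
    # 16-bit chaining value: a birthday collision costs ~2**8 work,
    # which is what makes this demonstration cheap to run.
    return sha256(h + block).digest()[:2]

def iterate(msg: bytes, h: bytes = b"\x00\x00") -> bytes:
    """Iterate the compression function over 4-byte message blocks."""
    for i in range(0, len(msg), 4):
        h = toy_compress(h, msg[i:i + 4])
    return h

def find_block_collision(h: bytes):
    """Birthday search: two distinct 4-byte blocks colliding from h."""
    seen = {}
    for n in itertools.count():
        block = n.to_bytes(4, "big")
        out = toy_compress(h, block)
        if out in seen:
            return seen[out], block, out
        seen[out] = block

# Joux: k successive collisions give 2**k messages with equal digests.
k, h, pairs = 3, b"\x00\x00", []
for _ in range(k):
    b1, b2, h = find_block_collision(h)
    pairs.append((b1, b2))
messages = [b"".join(choice) for choice in itertools.product(*pairs)]
```

With 3 collision stages this produces 8 distinct 12-byte messages that all iterate to the same chaining value, at only about 3 times the cost of a single collision.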
14

Ødegård, Rune Steinsmo. "Hash Functions and Gröbner Bases Cryptanalysis." Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for telematikk, 2012. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-16445.

Full text
Abstract:
Hash functions are being used as building blocks in such diverse primitives as commitment schemes, message authentication codes and digital signatures. These primitives have important applications by themselves, and they are also used in the construction of more complex protocols such as electronic voting systems, online auctions, public-key distribution, mutual authentication handshakes and more. Part of the work presented in this thesis has contributed to the "SHA-3 contest" for developing the new standard for hash functions organized by the National Institute of Standards and Technology. We constructed the candidate Edon-R, a hash function based on quasigroup string transformation. Edon-R was designed to be much more efficient than the SHA-2 cryptographic hash functions, while at the same time offering the same or better security. Most notably, Edon-R was the most efficient hash function submitted to the contest. Another contribution to the contest was our cryptanalysis of the second-round SHA-3 candidate Hamsi. In this work we studied Hamsi's resistance to differential and higher-order differential cryptanalysis, with focus on the 256-bit version of Hamsi. Our main results are efficient distinguishers and near-collisions for its full (3-round) compression function, and distinguishers for its full (6-round) finalization function, indicating that Hamsi's building blocks do not behave ideally. Another important part of this thesis is the application of Gröbner bases. In the last decade, Gröbner bases have proven to be a valuable tool for algebraic cryptanalysis. The idea is to set up a system of multivariate equations such that the solution of the system reveals some secret information of the cryptographic primitive; the system is then solved with a Gröbner basis computation. Staying close to the topic of hash functions, we have applied this tool to the cryptanalysis and construction of multivariate digital signature schemes, which are a major hash function application.
The result of this is our cryptanalysis of the public-key cryptosystem MQQ, where we show exactly why the multivariate quadratic equation system is so easy to solve in practice. The knowledge we gained from finding the underlying weakness of the MQQ scheme was used to construct a digital signature scheme. The resulting scheme, MQQ-SIG, is a provably CMA resistant multivariate quadratic digital signature scheme based on multivariate quadratic quasigroups. The scheme is designed to be very fast both in hardware and in software. Compared to some other multivariate quadratic digital signature schemes, MQQ-SIG is much better in signing and private key size, while worse in key generation, verification and public key size. This means that MQQ-SIG is a good alternative for protocols where the constrained environment is on the side of the signer.
15

Saarinen, Markku-Juhani Olavi. "Cryptanalysis of dedicated cryptographic hash functions." Thesis, University of London, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.537512.

Full text
16

Lathrop, Joel. "Cube attacks on cryptographic hash functions." Online version of thesis, 2009. http://hdl.handle.net/1850/10821.

Full text
17

Van der Merwe, Thyla Joy. "Generic attacks on iterated hash functions." Master's thesis, University of Cape Town, 2009. http://hdl.handle.net/11427/14640.

Full text
Abstract:
Includes bibliographical references (leaves 126-132).
We survey the existing generic attacks on hash functions based on the Merkle-Damgård construction: that is, attacks in which the compression function is treated as a black box.
18

Gauravaram, Praveen Srinivasa. "Cryptographic hash functions : cryptanalysis, design and applications." Queensland University of Technology, 2007. http://eprints.qut.edu.au/16372/.

Full text
Abstract:
Cryptographic hash functions are an important tool in cryptography for achieving security goals such as authenticity, digital signatures, digital time stamping, and entity authentication. They are also strongly related to other important cryptographic tools such as block ciphers and pseudorandom functions. The standard and widely used hash functions such as MD5 and SHA-1 follow the design principle of the Merkle-Damgård iterated hash function construction, which was presented independently by Ivan Damgård and Ralph Merkle at Crypto'89. It has been established that neither these hash functions nor the Merkle-Damgård construction itself meet certain security requirements. This thesis studies the attacks on this popular construction and proposes schemes that offer more resistance against these attacks, as well as investigating alternative approaches to the Merkle-Damgård style of designing hash functions. The thesis also analyses the security of the standard hash function Cellular Authentication and Voice Encryption Algorithm (CAVE), used for authentication and key derivation in the second-generation (2G) North American IS-41 mobile phone system. In addition, it studies the analysis issues of message authentication codes (MACs) designed using hash functions, with the aim of proposing efficient and secure MAC schemes based on hash functions. The thesis works on three aspects of hash functions: design, cryptanalysis and applications, with the following significant contributions: * Proposes a family of variants to the Merkle-Damgård construction called 3CG for better protection against specific and generic attacks. An analysis of the linear variant of 3CG, called 3C, is presented, including its resistance to some of the known attacks on hash functions. * Improves the known cryptanalytic techniques to attack 3C and some other similar designs, including a linear variant of GOST, a Russian standard hash function.
* Proposes a completely novel approach called Iterated Halving, an alternative to the standard block-iterated hash function construction. * Analyses the provably secure HMAC and NMAC message authentication codes (MACs) under weaker assumptions than stated in their proofs of security. Proposes an efficient variant of NMAC, called NMAC-1, to authenticate short messages, and a variant called M-NMAC which offers better protection against complete key-recovery attacks than NMAC. It is also shown that M-NMAC with hash functions resists side-channel attacks to which HMAC and NMAC are vulnerable. Proposes a new MAC scheme called O-NMAC based on hash functions using just one secret key. * Improves the open cryptanalysis of the CAVE algorithm. * Analyses the security and legal implications of the latest collision attacks on the widely used MD5 and SHA-1 hash functions.
19

Daum, Magnus. "Cryptanalysis of Hash functions of the MD4-family." [S.l.] : [s.n.], 2005. http://deposit.ddb.de/cgi-bin/dokserv?idn=97642777X.

Full text
20

Chandrasekhar, Santosh. "CONSTRUCTION OF EFFICIENT AUTHENTICATION SCHEMES USING TRAPDOOR HASH FUNCTIONS." UKnowledge, 2011. http://uknowledge.uky.edu/gradschool_diss/162.

Full text
Abstract:
In large-scale distributed systems, where adversarial attacks can have widespread impact, authentication provides protection from threats involving impersonation of entities and tampering of data. Practical solutions to authentication problems in distributed systems must meet specific constraints of the target system, and provide a reasonable balance between security and cost. The goal of this dissertation is to address the problem of building practical and efficient authentication mechanisms to secure distributed applications. This dissertation presents techniques to construct efficient digital signature schemes using trapdoor hash functions for various distributed applications. Trapdoor hash functions are collision-resistant hash functions associated with a secret trapdoor key that allows the key-holder to find collisions between hashes of different messages. The main contributions of this dissertation are as follows: 1. A common problem with conventional trapdoor hash functions is that revealing a collision producing message pair allows an entity to compute additional collisions without knowledge of the trapdoor key. To overcome this problem, we design an efficient trapdoor hash function that prevents all entities except the trapdoor key-holder from computing collisions regardless of whether collision producing message pairs are revealed by the key-holder. 2. We design a technique to construct efficient proxy signatures using trapdoor hash functions to authenticate and authorize agents acting on behalf of users in agent-based computing systems. Our technique provides agent authentication, assurance of agreement between delegator and agent, security without relying on secure communication channels and control over an agent’s capabilities. 3. We develop a trapdoor hash-based signature amortization technique for authenticating real-time, delay-sensitive streams. 
Our technique provides independent verifiability of blocks comprising a stream, minimizes sender-side and receiver-side delays, minimizes communication overhead, and avoids transmission of redundant information. 4. We demonstrate the practical efficacy of our trapdoor hash-based techniques for signature amortization and proxy signature construction by presenting discrete log-based instantiations of the generic techniques that are efficient to compute, and produce short signatures. Our detailed performance analyses demonstrate that the proposed schemes outperform existing schemes in computation cost and signature size. We also present proofs for security of the proposed discrete-log based instantiations against forgery attacks under the discrete-log assumption.
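The trapdoor hash idea underlying this dissertation can be illustrated with the classical discrete-log chameleon hash, H(m, r) = g^m · y^r mod p: anyone can hash, but only the holder of the trapdoor x = log_g y can find collisions. The tiny parameters below are insecure and chosen purely for illustration; this is a generic sketch, not the dissertation's own scheme:

```python
# Toy parameters: p = 2q + 1 with p, q prime; g = 4 generates the
# order-q subgroup of Z_p* (since 4**q = 2**(p-1) = 1 mod p by Fermat).
p, q, g = 467, 233, 4
x = 57                    # trapdoor (secret) key
y = pow(g, x, p)          # public key

def chameleon_hash(m: int, r: int) -> int:
    """H(m, r) = g^m * y^r mod p = g^((m + x*r) mod q) mod p."""
    return (pow(g, m % q, p) * pow(y, r % q, p)) % p

def find_collision(m1: int, r1: int, m2: int) -> int:
    """With the trapdoor x, solve m1 + x*r1 = m2 + x*r2 (mod q) for r2,
    so that (m2, r2) hashes to the same value as (m1, r1)."""
    return (r1 + (m1 - m2) * pow(x, -1, q)) % q
```

Because m1 + x·r1 ≡ m2 + x·r2 (mod q) forces equal exponents, the collision is exact, yet without x finding one is as hard as computing discrete logarithms; this is the property the dissertation's signature constructions exploit.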
21

Kortelainen, T. (Tuomas). "On iteration-based security flaws in modern hash functions." Doctoral thesis, Oulun yliopisto, 2014. http://urn.fi/urn:isbn:9789526206431.

Full text
Abstract:
The design principles proposed independently by Ralph Merkle and Ivan Damgård in 1989 are applied widely in hash functions that are used in practice. The construction reads the message one block at a time and iteratively applies a compression function that, given a single message block and a hash value, outputs a new hash value. This iterative structure has some security weaknesses. It is vulnerable, for instance, to Joux's multicollision attack, the herding attack that uses diamond structures, and the Trojan message attack. Our principal research topic comprises the deficiencies in hash function security induced by the Merkle-Damgård construction. In this work, we present a variant of Joux's multicollision attack. We also develop a new, time-saving algorithm for creating diamond structures, and introduce two new efficient versions of the Trojan message attack. The main contribution of the thesis is the analysis of generalized iterated hash functions. We study the combinatorial properties of words from a new perspective and develop results that give a new upper bound for the complexity of multicollision attacks against the so-called q-bounded generalized iterated hash functions.
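The iterated construction described in this abstract can be sketched in a few lines of Python. This is a toy illustration with an invented 64-bit compression function, not a secure hash; only the iteration pattern and the length-strengthening padding follow the Merkle-Damgård design.

```python
import struct

MASK64 = 0xFFFFFFFFFFFFFFFF

def toy_compression(h: int, block: bytes) -> int:
    # Invented 64-bit compression function: mixes one 8-byte message block
    # into the chaining value. Not secure; it only stands in for f(h, m).
    (m,) = struct.unpack(">Q", block)
    h = ((h ^ m) * 0x9E3779B97F4A7C15) & MASK64
    return h ^ (h >> 29)

def merkle_damgard(message: bytes, iv: int = 0x0123456789ABCDEF) -> int:
    # Pad with 0x80, zeros, then the message length (Merkle-Damgard
    # strengthening), and iterate the compression function block by block.
    padded = message + b"\x80" + b"\x00" * ((-len(message) - 9) % 8)
    padded += struct.pack(">Q", len(message))
    h = iv
    for i in range(0, len(padded), 8):
        h = toy_compression(h, padded[i:i + 8])
    return h
```

Because the final hash depends only on the chain of compression calls, an attacker who can find compression-function collisions can extend them into the multicollision, herding and Trojan-message attacks the thesis analyses.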
APA, Harvard, Vancouver, ISO, and other styles
22

Abidin, Aysajan. "Weaknesses of Authentication in Quantum Cryptography and Strongly Universal Hash Functions." Licentiate thesis, Linköping University, Department of Mathematics, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-57290.

Full text
Abstract:

Authentication is an indispensable part of Quantum Cryptography, which is an unconditionally secure key distribution technique based on the laws of nature. Without proper authentication, Quantum Cryptography is vulnerable to “man-in-the-middle” attacks. Therefore, to guarantee unconditional security of any Quantum Cryptographic protocol, the authentication used must also be unconditionally secure. The standard in Quantum Cryptography is to use the Wegman-Carter authentication, which is unconditionally secure and is based on the idea of universal hashing.

In this thesis, we first investigate properties of a Strongly Universal hash function family to facilitate understanding the properties of (classical) authentication used in Quantum Cryptography. Then, we study vulnerabilities of a recently proposed authentication protocol intended to rule out a "man-in-the-middle" attack on Quantum Cryptography. Here, we point out that the proposed authentication primitive is not secure when used in a generic Quantum Cryptographic protocol. Lastly, we estimate the lifetime of authentication using encrypted tags when the encryption key is partially known. Under simplifying assumptions, we derive that the lifetime is linearly dependent on the length of the authentication key. Experimental results that support the theoretical results are also presented.
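The strongly universal families studied here can be illustrated with the textbook construction h_{a,b}(x) = ax + b mod p. The sketch below is a toy: messages are reduced into Z_p first, which a real Wegman-Carter implementation avoids by hashing arbitrary-length messages properly.

```python
import secrets

P = (1 << 61) - 1  # a Mersenne prime; hash values and tags live in Z_P

def su_hash(key, x: int) -> int:
    # h_{a,b}(x) = a*x + b mod p is the textbook strongly universal_2
    # family: for x != x', the pair (h(x), h(x')) is uniform over
    # Z_p x Z_p when (a, b) is drawn uniformly at random.
    a, b = key
    return (a * x + b) % P

def wegman_carter_tag(key, message: bytes) -> int:
    # Wegman-Carter style tagging with the toy family above.
    return su_hash(key, int.from_bytes(message, "big") % P)

key = (secrets.randbelow(P), secrets.randbelow(P))
tag = wegman_carter_tag(key, b"quantum key block 42")
```

A fresh key per authentication attempt is what makes the Wegman-Carter scheme unconditionally secure; reusing (a, b) leaks information about the family member in use.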

APA, Harvard, Vancouver, ISO, and other styles
23

Abidin, Aysajan. "Authentication in Quantum Key Distribution : Security Proof and Universal Hash Functions." Doctoral thesis, Linköpings universitet, Informationskodning, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-91265.

Full text
Abstract:
Quantum Key Distribution (QKD) is a secret key agreement technique that consists of two parts: quantum transmission and measurement on a quantum channel, and classical post-processing on a public communication channel. It enjoys provable unconditional security provided that the public communication channel is immutable. Otherwise, QKD is vulnerable to a man-in-the-middle attack. Immutable public communication channels, however, do not exist in practice. So we need to use authentication that implements the properties of an immutable channel as well as possible. One scheme that serves this purpose well is the Wegman-Carter authentication (WCA), which is built upon Almost Strongly Universal2 (ASU2) hashing. This scheme uses a new key in each authentication attempt to select a hash function from an ASU2 family, which is then used to generate the authentication tag for a message. The main focus of this dissertation is on authentication in the context of QKD. We study ASU2 hash functions, the security of QKD that employs a computationally secure authentication, and also the security of authentication with a partially known key. Specifically, we study the following. First, universal hash functions and their constructions are reviewed, and a new construction of ASU2 hash functions is presented. Second, the security of QKD that employs a specific computationally secure authentication is studied. We present detailed attacks on various practical implementations of QKD that employ this authentication. We also provide countermeasures and prove necessary and sufficient conditions for upgrading the security of the authentication to the level of unconditional security. Third, universal hash function based multiple authentication is studied. This uses a fixed ASU2 hash function followed by one-time pad encryption to keep the hash function secret. We show that the one-time pad is necessary in every round for the authentication to be unconditionally secure.
Lastly, we study security of the WCA scheme, in the case of a partially known authentication key. Here we prove tight information-theoretic security bounds and also analyse security using witness indistinguishability as used in the Universal Composability framework.
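The multiple-authentication construction discussed above (a fixed hash function whose output is masked with a fresh one-time pad in each round) can be sketched as follows; the linear hash over Z_P is a toy stand-in for a real ASU2 family.

```python
import secrets

P = (1 << 61) - 1  # prime modulus for the toy hash family

def tag_round(fixed_key, pad, message: bytes):
    # The same secret (a, b) is reused in every round; the hash output is
    # masked with a fresh one-time pad so the function itself stays hidden.
    a, b = fixed_key
    h = (a * (int.from_bytes(message, "big") % P) + b) % P
    return (h + pad) % P  # one-time pad encryption in Z_P

fixed_key = (secrets.randbelow(P), secrets.randbelow(P))
pads = [secrets.randbelow(P) for _ in range(3)]          # one pad per round
tags = [tag_round(fixed_key, pads[i], m)
        for i, m in enumerate([b"round 1", b"round 2", b"round 3"])]
```

The thesis's point is that skipping the pad in even one round breaks unconditional security: an unmasked tag reveals a linear relation on the fixed key.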
APA, Harvard, Vancouver, ISO, and other styles
24

Abidin, Aysajan. "Weaknesses of Authentication in Quantum Cryptography and Strongly Universal Hash Functions." Licentiate thesis, Linköpings universitet, Tillämpad matematik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-57290.

Full text
Abstract:
Authentication is an indispensable part of Quantum Cryptography, which is an unconditionally secure key distribution technique based on the laws of nature. Without proper authentication, Quantum Cryptography is vulnerable to “man-in-the-middle” attacks. Therefore, to guarantee unconditional security of any Quantum Cryptographic protocol, the authentication used must also be unconditionally secure. The standard in Quantum Cryptography is to use the Wegman-Carter authentication, which is unconditionally secure and is based on the idea of universal hashing. In this thesis, we first investigate properties of a Strongly Universal hash function family to facilitate understanding the properties of (classical) authentication used in Quantum Cryptography. Then, we study vulnerabilities of a recently proposed authentication protocol intended to rule out a "man-in-the-middle" attack on Quantum Cryptography. Here, we point out that the proposed authentication primitive is not secure when used in a generic Quantum Cryptographic protocol. Lastly, we estimate the lifetime of authentication using encrypted tags when the encryption key is partially known. Under simplifying assumptions, we derive that the lifetime is linearly dependent on the length of the authentication key. Experimental results that support the theoretical results are also presented.
APA, Harvard, Vancouver, ISO, and other styles
25

Ozen, Onur. "On The Security Of Tiger Hash Function." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12609117/index.pdf.

Full text
Abstract:
Recent years have witnessed several real threats to the most widely used hash functions, which are generally inspired by MD4, such as MD5, RIPEMD, SHA-0 and SHA-1. These extraordinary developments in the cryptanalysis of hash functions brought the attention of cryptology researchers to alternative designs. Tiger is an important alternative hash function and is considered secure so far, as there is no known collision attack on the full (24 rounds) Tiger. It was designed by Biham and Anderson in 1995 to be very fast on modern computers. Within two years, some weaknesses were found in the Tiger hash function. First, at FSE 2006, Kelsey and Lucks found a collision for 16-17 rounds of Tiger and a pseudo-near-collision for 20 rounds. Then, Mendel et al. extended this attack to find a 19-round collision and a 22-round pseudo-near-collision. Finally, in 2007, Mendel and Rijmen found a pseudo-near-collision for the full Tiger. In this work, we modify the attack of Kelsey and Lucks slightly and present the exact values of the differences used in the attack. Moreover, there have been several cryptanalysis papers investigating the randomness properties of designed hash functions under encryption modes. In these papers, related-key boomerang and related-key rectangle attacks are performed on MD4, MD5, HAVAL and SHA. In this thesis, we introduce our 17-, 19- and 21-round related-key boomerang and rectangle distinguishers for the encryption mode of Tiger.
APA, Harvard, Vancouver, ISO, and other styles
26

Valois, Mathieu. "Mesure de la robustesse des mots de passe." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMC251.

Full text
Abstract:
At a time when our digital identity increasingly merges with our personal identity, the security needs of our online accounts are all the more pronounced. Passwords are both the most widely used means of authentication and the weakest link in the security chain. Despite the undeniable weakness of most passwords used online, the password remains the best authentication method combining security, accessibility and privacy protection. The purpose of this thesis is to ease the design of password strength measures that are relevant against the most sophisticated attacks on passwords. Such attacks rely on probabilistic models of the way users choose their passwords, and they are very efficient at finding passwords more complex than those usually found by naive techniques. This work rests on three contributions that identify the key aspects of a modern password strength measure. The first contribution models the attack process on passwords, and formalizes and measures the performance of such a process. The second contribution shows that currently deployed strength measurement methods are not sufficient to protect passwords against sophisticated attacks. The third contribution algorithmically analyses sophisticated attacks by observing their behaviour, in order to design effective methods that increase the execution cost of these attacks. The methods were validated using passwords from publicly disclosed data leaks, totalling more than 500 million passwords.
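The probabilistic attack models the thesis refers to can be illustrated with a minimal character-bigram model trained on leaked passwords. This is a generic sketch of the idea, not the author's measurement method.

```python
from collections import defaultdict

def train_bigram_model(passwords):
    # Estimate P(next char | current char) from a corpus of leaked
    # passwords; "^" marks the start of a password.
    counts = defaultdict(lambda: defaultdict(int))
    for pw in passwords:
        chars = "^" + pw
        for cur, nxt in zip(chars, chars[1:]):
            counts[cur][nxt] += 1
    return {cur: {c: n / sum(nxts.values()) for c, n in nxts.items()}
            for cur, nxts in counts.items()}

def password_probability(model, pw):
    # Product of transition probabilities; 0 if a transition was never seen.
    # An attacker enumerates candidates in decreasing order of this value,
    # which is why high-probability passwords are weak regardless of length.
    prob = 1.0
    chars = "^" + pw
    for cur, nxt in zip(chars, chars[1:]):
        prob *= model.get(cur, {}).get(nxt, 0.0)
    return prob
```

A strength measure that only counts character classes cannot see that a password is highly probable under such a model, which is the gap the second contribution demonstrates.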
APA, Harvard, Vancouver, ISO, and other styles
27

LIMA, José Paulo da Silva. "Validação de dados através de hashes criptográficos: uma avaliação na perícia forense computacional brasileira." Universidade Federal de Pernambuco, 2015. https://repositorio.ufpe.br/handle/123456789/15966.

Full text
Abstract:
Cryptography has three basic principles: ensuring the confidentiality of messages, ensuring that they are not altered by intruders, and ensuring that the message flows between the sender and the recipient without interruption of the communication. Given these goals of a cryptographic scheme, we can see how important encryption is today. Hash functions are commonly used to ensure data integrity, that is, to guarantee that the data have not changed. Hashes are used in various fields, especially in computer forensics, where the examiner proves that he did not alter the data he collected. However, greater care with the use of hashes is needed: many of them are considered unsafe and yet continue to be used. Considering this, this work analyzes the current situation within computer forensics and the legislation of some countries, in order to point out improvements and raise awareness about the trust placed in cryptographic hashes.
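The integrity-validation practice discussed here is easy to make concrete with the standard library. A minimal sketch; pairing the (collision-broken) MD5 with SHA-256 is one common mitigation, not a recommendation drawn from this dissertation.

```python
import hashlib

def file_digests(data: bytes):
    # MD5 is still widespread in forensic tooling, but collisions are
    # practical, so pairing it with SHA-256 protects the evidence trail
    # against a deliberately forged "duplicate" file.
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha256": hashlib.sha256(data).hexdigest(),
    }

evidence = b"contents of an acquired disk image"
digests = file_digests(evidence)
```

An examiner records both digests at acquisition time; any later recomputation that matches both demonstrates the evidence was not altered.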
APA, Harvard, Vancouver, ISO, and other styles
28

Karásek, Jan. "Hashovací funkce - charakteristika, implementace a kolize." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-218059.

Full text
Abstract:
Hash functions are elements of modern cryptography. Their task is to transform the input data into a unique bit sequence. Hash functions are used in many application areas, such as message integrity verification and information authentication, and they are used in cryptographic protocols, for comparing data, and in other applications. The goal of this master's thesis is to characterize hash functions, describing their basic properties and uses. The next task was to focus on one hash function, in particular MD5, and describe it properly: its construction, security, and possible attacks on this function. The last task was to implement this function and collisions against it. The introductory chapters give the basic definition of a hash function and its properties, mention methods for preventing collisions, and list the areas where hash functions are used. Further chapters focus on the characteristics of various types of hash functions: basic hash functions built on simple bit operations, perfect hash functions, and cryptographic hash functions. After covering the characteristics of hash functions, I turn to practical matters. The thesis describes the basic appearance and controls of the program and its individual functions, which are explained theoretically. The following text describes the MD5 function, its construction, security risks and implementation. The last chapter deals with attacks on hash functions and describes the hash function tunneling method, the brute-force attack and the dictionary attack.
APA, Harvard, Vancouver, ISO, and other styles
29

Tomaz, Antonio Emerson Barros. "Resgate de autoria em esquemas de assinatura em anel." reponame:Repositório Institucional da UFC, 2014. http://www.repositorio.ufc.br/handle/riufc/10842.

Full text
Abstract:
TOMAZ. A. E. B. Resgate de autoria em esquemas de assinatura em anel. 2014. 67 f. Dissertação (Mestrado em Engenharia de Teleinformática) - Centro de Tecnologia, Universidade Federal do Ceará, Fortaleza, 2014.
The proposal presented in this thesis represents an expansion of the original concept of ring signature. A ring signature scheme allows a member of a group to publish a message anonymously, so that each member of the group can be considered the author of the message. The main idea of a ring signature is to guarantee the anonymity of the signer while still ensuring the authenticity of the information, showing that the message came from one of the members of that group. This thesis presents a signature scheme based on (RIVEST et al., 2001), in which the signer can later revoke anonymity by presenting secret values that prove that only he would have been able to generate such a signature. This property is referred to here as rescue of authorship. The main difference from the proposal of Rivest et al. (2001) appears even before signature generation begins. The values used as input to the trapdoor function are message authentication codes (MACs) generated by the HMAC algorithm, a message authentication algorithm based on a collision-resistant hash function. This simple modification allows the signer, in the future, to reveal himself as the true author of the message by showing the secret values used to generate those MACs.
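The HMAC ingredient of the scheme can be sketched on its own: the signer commits to HMAC values during signing and can later reveal the key to claim authorship. This sketch shows only that step, not the full ring signature of Rivest et al.

```python
import hashlib
import hmac
import secrets

def make_ring_input(secret_key: bytes, message: bytes) -> bytes:
    # Instead of a random value, the signer feeds an HMAC of the message
    # into the ring's trapdoor inputs; only the key holder can recreate it.
    return hmac.new(secret_key, message, hashlib.sha256).digest()

def claim_authorship(revealed_key: bytes, message: bytes,
                     ring_input: bytes) -> bool:
    # Revealing the key later lets anyone recompute the MAC and attribute
    # the signature: the "rescue of authorship" step described above.
    expected = hmac.new(revealed_key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, ring_input)

key = secrets.token_bytes(32)
msg = b"message published anonymously by the ring"
ring_input = make_ring_input(key, msg)
```

Until the key is revealed, the MAC is indistinguishable from the random values other ring members would have used, so anonymity is preserved.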
APA, Harvard, Vancouver, ISO, and other styles
30

Bourse, Florian. "Functional encryption for inner-product evaluations." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEE067/document.

Full text
Abstract:
Functional encryption is an emerging framework in which a master authority can distribute keys that allow some computation over encrypted data in a controlled manner. The trend in this area is to build schemes that are as expressive as possible, i.e., functional encryption that supports any circuit evaluation. These results come at the cost of efficiency and security: they rely on recent, not very well studied assumptions, and no construction is close to being practical. The goal of this thesis is to attack this challenge from a different angle: we try to build the most expressive functional encryption schemes we can from standard assumptions, while keeping the constructions simple and efficient. To this end, we introduce the notion of functional encryption for inner-product evaluations, where plaintexts are vectors x, and the trusted authority delivers keys for vectors y that allow the evaluation of the inner product ⟨x, y⟩. This functionality already offers some direct applications, and it can also be used in theoretical constructions, as the inner product is a widely used operation. Finally, we present two generic frameworks to construct inner-product functional encryption schemes, as well as concrete instantiations whose security relies on standard assumptions. We also compare their pros and cons.
APA, Harvard, Vancouver, ISO, and other styles
31

Daddala, Bhavana. "Design and Implementation of a Customized Encryption Algorithm for Authentication and Secure Communication between Devices." University of Toledo / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1501629228909517.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Calder, P. "Influence functions in multivariate analysis." Thesis, University of Kent, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.375052.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Woolley, Douglas Albert. "Generic Continuous Functions and other Strange Functions in Classical Real Analysis." Digital Archive @ GSU, 2008. http://digitalarchive.gsu.edu/math_theses/44.

Full text
Abstract:
In this paper we examine continuous functions which on the surface seem to defy well-known mathematical principles. Before describing these functions, we introduce the Baire Category theorem and the Cantor set, which are critical in describing some of the functions and counterexamples. We then describe generic continuous functions, which are nowhere differentiable and monotone on no interval, and we include an example of such a function. We then construct a more conceptually challenging function, one which is everywhere differentiable but monotone on no interval. We also examine the Cantor function, a nonconstant continuous function with a zero derivative almost everywhere. The final section deals with products of derivatives.
APA, Harvard, Vancouver, ISO, and other styles
34

Marletta, G. "Curvilinear maximal functions." Thesis, University of Sussex, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.283003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Roy, Matthew B. "An analysis of the applicability of federal law regarding hash-based searches of digital media." Thesis, Monterey, California: Naval Postgraduate School, 2014. http://hdl.handle.net/10945/42714.

Full text
Abstract:
Approved for public release; distribution is unlimited
The Fourth Amendment of the United States (U.S.) Constitution limits the ability of the government to search U.S. persons without cause or justification. The application of the Fourth Amendment to digital forensics search techniques is still evolving. This thesis summarizes current federal law and recent judicial rulings that can apply Fourth Amendment doctrine to current digital forensics techniques. It uses three hypothetical scenarios to show how current law could be applied to new techniques now under development: the use of sector hashes to find traces of digital contraband; the use of random sampling to rapidly triage large digital media; and the use of similarity functions to find documents that are similar but not identical to target documents.
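The sector-hash technique in the first scenario can be sketched directly: hash fixed-size sectors of a drive and compare them against a database of known-contraband sector hashes. A minimal illustration, not the tooling evaluated in the thesis.

```python
import hashlib

SECTOR = 512  # bytes

def sector_hashes(data: bytes):
    # Hash every 512-byte sector; matching any sector against a database of
    # known-contraband sector hashes flags the drive without parsing files.
    return {hashlib.sha256(data[i:i + SECTOR]).hexdigest()
            for i in range(0, len(data), SECTOR)}

def triage(drive_image: bytes, known_bad: set) -> bool:
    return not sector_hashes(drive_image).isdisjoint(known_bad)
```

Because the search never interprets file contents, its legal status under the Fourth Amendment differs from that of a conventional file-by-file examination, which is the kind of question the thesis analyzes.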
APA, Harvard, Vancouver, ISO, and other styles
36

Zuo, Yanling. "Monotone regression functions." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29457.

Full text
Abstract:
In some applications, we require a monotone estimate of a regression function. In others, we want to test whether the regression function is monotone. For solving the first problem, Ramsay's, Kelly and Rice's, as well as point-wise monotone regression functions in a spline space are discussed and their properties developed. Three monotone estimates are defined: least-square regression splines, smoothing splines and binomial regression splines. The three estimates depend upon a "smoothing parameter": the number and location of knots in regression splines and the usual [formula omitted] in smoothing splines. Two standard techniques for choosing the smoothing parameter, GCV and AIC, are modified for monotone estimation, for the normal errors case. For answering the second question, a test statistic is proposed and its null distribution conjectured. Simulations are carried out to check the conjecture. These techniques are applied to two data sets.
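Monotone estimation can be illustrated with the pool-adjacent-violators algorithm (PAVA), the classic least-squares monotone fit. This is a generic illustration of the monotone-regression idea, not the spline-based estimates developed in the thesis.

```python
def pava(y):
    # Pool-adjacent-violators: repeatedly merge adjacent blocks whose means
    # violate monotonicity, replacing them with their pooled mean.
    blocks = [[v, 1] for v in y]          # [block mean, block size]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:        # monotonicity violated
            m1, n1 = blocks[i]
            m2, n2 = blocks.pop(i + 1)
            blocks[i] = [(m1 * n1 + m2 * n2) / (n1 + n2), n1 + n2]
            if i > 0:
                i -= 1                             # re-check the previous pair
        else:
            i += 1
    fit = []
    for mean, size in blocks:
        fit.extend([mean] * size)
    return fit
```

The fitted values are non-decreasing and minimize the squared error among all monotone sequences, the same constraint the spline estimates impose.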
APA, Harvard, Vancouver, ISO, and other styles
37

Namanya, Anitta P., Irfan U. Awan, J. P. Disso, and M. Younas. "Similarity hash based scoring of portable executable files for efficient malware detection in IoT." Elsevier, 2019. http://hdl.handle.net/10454/17168.

Full text
Abstract:
The current rise in malicious attacks shows that existing security systems are bypassed by malicious files. Similarity hashing has been adopted for sample triaging in malware analysis and detection. File similarity is used to cluster malware into families such that their common signature can be designed. This paper explores four hash types currently used in malware analysis for portable executable (PE) files. Although each hashing technique produces interesting results when applied independently, they have high false detection rates. This paper investigates the central issue of how different hashing techniques can be combined to provide a quantitative malware score and to achieve better detection rates. We design and develop a novel approach for malware scoring based on the hash results. The proposed approach is evaluated through a number of experiments. Evaluation clearly demonstrates a significant improvement (> 90%) in the true detection rate of malware.
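The score-fusion idea can be sketched as a weighted combination of per-technique similarity verdicts. The hash-type identifiers and weights below are placeholders (the abstract does not specify them); only the combination pattern is illustrated.

```python
# Hypothetical weights for four hashing techniques; the abstract does not
# name the techniques or the fusion rule, so these values are illustrative.
WEIGHTS = {"hash_a": 0.3, "hash_b": 0.3, "hash_c": 0.2, "hash_d": 0.2}

def malware_score(verdicts):
    # verdicts: similarity of the PE file to a known-malware cluster,
    # in [0, 1], one entry per hashing technique.
    return sum(WEIGHTS[name] * sim for name, sim in verdicts.items())

def is_malicious(verdicts, threshold=0.5):
    return malware_score(verdicts) >= threshold
```

Combining techniques this way lets a strong match under one hash compensate for a weak match under another, which is how fusion can reduce the false detection rate of any single technique.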
APA, Harvard, Vancouver, ISO, and other styles
38

Stabingiene, Lijana. "Image analysis using Bayes discriminant functions." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2012. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2012~D_20120917_092640-83758.

Full text
Abstract:
Image analysis is very important because of its use in many different areas of science and industry. Pattern recognition (classification) is one of the tools used in image analysis. Statistical pattern recognition based on Bayes discriminant functions is the object of this work. The main problem is to classify a stationary Gaussian random field observation into one of two classes, assuming that it depends on the training sample and taking into account its relationship with the training sample. A new supervised classification method based on Bayes discriminant functions is proposed; it gives better results compared with other commonly used Bayes discriminant functions. The method is implemented in R and investigated experimentally by reconstructing images corrupted by spatially correlated noise. Such a situation occurs naturally when, for example, smoke from a forest fire covers a remotely sensed image gathered from a satellite; it is also common on cloudy days. In such situations, incorporating spatial dependences into the classification problem is useful. Analytical error rates of the Bayes discriminant functions, which serve as the performance criterion for these functions, are derived, and the dependence of these error rates on the statistical parameters is investigated.
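The classical two-class Gaussian Bayes discriminant underlying this work can be written down directly. This univariate, spatially independent version is only a baseline sketch; the thesis's contribution is precisely to condition on a spatially correlated training sample, which this sketch ignores.

```python
import math

def bayes_discriminant(x, mu0, mu1, sigma2, prior0=0.5):
    # W(x) = log f1(x) - log f0(x) + log(pi1 / pi0) for two univariate
    # Gaussian classes sharing variance sigma2; classify as class 1 if W > 0.
    w = ((mu1 - mu0) / sigma2) * (x - (mu0 + mu1) / 2)
    return w + math.log((1 - prior0) / prior0)

def classify(x, mu0, mu1, sigma2, prior0=0.5):
    return 1 if bayes_discriminant(x, mu0, mu1, sigma2, prior0) > 0 else 0
```

For a pixel corrupted by spatially correlated noise, the thesis replaces the marginal densities above with densities conditional on the neighbouring training observations.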
APA, Harvard, Vancouver, ISO, and other styles
39

Lartey, Ebenezer. "Change-point analysis using score functions." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0009/NQ40269.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Abobakr, Mona R. H. "Quantum circuit analysis using analytic functions." Thesis, University of Bradford, 2019. http://hdl.handle.net/10454/18330.

Full text
Abstract:
In this thesis, classical computation is first introduced. Finite quantum systems are considered with a D-dimensional Hilbert space, and position x and momentum p taking values in Z(D) (the integers modulo D). An analytic representation of finite quantum systems that uses Theta functions is presented and considered. The first novel part of this thesis is a contribution to the study of reversible classical CNOT gates and their binary inputs and outputs with reversible circuits. Furthermore, reversible classical Toffoli gates are considered, as well as the implementation of a Boolean expression with classical CNOT and Toffoli gates. Reversible circuits with classical CNOT and Toffoli gates are also considered. The second novel part of this thesis is the study of quantum computation in terms of CNOT and Toffoli gates. Analytic representations and their zeros are considered, while zeros of the inputs and outputs for quantum CNOT and Toffoli gates are studied. Also, approximate computation of their zeros on the output is calculated. Finally, some quantum circuits are discussed.
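The reversible classical gates the abstract refers to are standard; their truth-table action can be sketched in a few lines (an illustrative sketch, not code from the thesis):

```python
def cnot(a, b):
    # Controlled-NOT: flips the target bit b iff the control a is 1.
    return a, a ^ b

def toffoli(a, b, c):
    # Toffoli (CCNOT): flips the target c iff both controls are 1;
    # with c = 0 it computes a AND b reversibly.
    return a, b, c ^ (a & b)

# Each gate is its own inverse, so applying it twice restores the input.
assert cnot(*cnot(1, 0)) == (1, 0)
assert toffoli(*toffoli(1, 1, 1)) == (1, 1, 1)
```

Because the Toffoli gate is universal for reversible classical computation, any Boolean expression can be built from such gates, as the thesis discusses.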
APA, Harvard, Vancouver, ISO, and other styles
41

Schmied, Jan. "GPU akcelerované prolamování šifer." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2014. http://www.nusl.cz/ntk/nusl-236071.

Full text
Abstract:
This work describes one-way hash functions and cryptographic algorithms, together with their implementation for the encryption of DOC, PDF and ZIP file contents. Subsequently, an analysis of these implementations is provided. Following that, a brute-force attack procedure leveraging the GPU is proposed and evaluated.
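The brute-force procedure mentioned above can be illustrated on the CPU with a toy alphabet. Plain SHA-256 stands in here for the formats' actual key-derivation schemes, which the thesis targets on the GPU; this is a sketch of the search loop only.

```python
import hashlib
from itertools import product

def brute_force(target_hex, alphabet="abc", max_len=4):
    # Exhaustively try candidate passwords of increasing length and
    # compare each candidate's SHA-256 digest with the target digest.
    for n in range(1, max_len + 1):
        for combo in product(alphabet, repeat=n):
            cand = "".join(combo)
            if hashlib.sha256(cand.encode()).hexdigest() == target_hex:
                return cand
    return None  # keyspace exhausted without a match

target = hashlib.sha256(b"cab").hexdigest()
print(brute_force(target))  # prints "cab"
```

A GPU implementation parallelises the inner loop by assigning disjoint slices of the candidate space to threads; the search logic is unchanged.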
APA, Harvard, Vancouver, ISO, and other styles
42

Zagar, Susanna Maria. "Convex functions." CSUSB ScholarWorks, 1996. https://scholarworks.lib.csusb.edu/etd-project/986.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Fountain, David Wilkes. "Implicit systems : orthogonal functions analysis and geometry." Diss., Georgia Institute of Technology, 1991. http://hdl.handle.net/1853/15750.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Chen, Jein-Shan. "Merit functions and nonsmooth functions for the second-order cone complementarity problem /." Thesis, Connect to this title online; UW restricted, 2004. http://hdl.handle.net/1773/5782.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Graneland, Elsa. "Orthogonal polynomials and special functions." Thesis, Uppsala universitet, Analys och sannolikhetsteori, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-418820.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Lind, Martin. "Functions of bounded variation." Thesis, Karlstad University, Division for Engineering Sciences, Physics and Mathematics, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-209.

Full text
Abstract:

The paper begins with a short survey of monotone functions. The functions of bounded variation are introduced and some basic properties of these functions are given. Finally the jump function of a function of bounded variation is defined.
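For reference, the standard definition the entry builds on: a function f is of bounded variation on [a, b] when its total variation is finite,

```latex
V_a^b(f) \;=\; \sup_{P} \sum_{i=1}^{n} \bigl| f(x_i) - f(x_{i-1}) \bigr| \;<\; \infty,
\qquad P : a = x_0 < x_1 < \dots < x_n = b,
```

where the supremum is taken over all partitions P of [a, b]. Every monotone function on [a, b] satisfies this with V_a^b(f) = |f(b) - f(a)|, which is why the survey of monotone functions comes first.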

APA, Harvard, Vancouver, ISO, and other styles
47

Sadykov, Timour. "Hypergeometric functions in several complex variables." Doctoral thesis, Stockholm : Univ, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-198.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Nguyen, Thien Duy. "Modeling of Safety Functions in Quantitative Risk Analysis." Thesis, Norges Teknisk-Naturvitenskaplige Universitet, 2012. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-21097.

Full text
Abstract:
Quantitative risk analysis in the offshore industry is mandated by Norwegian legislation. A literature survey is carried out, covering the current regulations from the Norwegian Petroleum Safety Authority (PSA) and the supporting NORSOK standards. Process accidents on offshore installations operating on the Norwegian continental shelf are emphasized. A risk picture is the synthesis of a risk assessment, describing the risk level. Requirements for the risk picture are discussed, and associated risk measures are presented. The risk measures represent the quantitative parts of a risk picture, and the measures are evaluated against risk acceptance criteria. The evaluation can be performed with a mechanistic approach, or more flexibly by using the as-low-as-reasonably-practicable principle.

Uncertainty is an important aspect that many quantitative risk analyses treat too briefly. Assumptions are always made in risk analyses, and uncertainty therefore becomes an important issue. To put it on the agenda, an introduction to the topic is given. The main purpose of a risk analysis is to support decision-making, and the analysts should keep that in mind when performing the analysis. The field of quantitative risk analysis has received some criticism, but some of it is unjust. To understand why, the scope of the quantitative risk analysis must be understood. Risk can be considered both from a strategic (long-term) and an operational (day-to-day) perspective. For quantitative risk analyses, a probabilistic view is used, dealing with probabilities and expected values. Strategic decision-making fits with this approach, but it renders the analyses unsuitable for day-to-day decision-making. In addition, quantitative risk analysis copes with several types of hazards over a long time span, and the resources needed to handle all the hazards at an operational level of detail would be tremendous.

Several methods can be used when performing a quantitative risk analysis. The approach used by Scandpower is explored in detail. The main method currently used is event tree analysis. This method has some challenges. One problem addressed is the treatment of dependencies, both within and between event trees. The answer is related to how RiskSpectrum, a fault and event tree software, calculates the end-event frequencies. A second problem is the treatment of human reliability, and how it can be implemented in the event tree analyses.

Large investments have been made in fire protection systems to mitigate the consequences of process accidents. The thesis endeavors to study the importance of these safety systems. The emphasis is on how the systems' reliability is modeled and treated in a quantitative risk analysis. To investigate the effects of the safety systems on the risk measures, three quantitative risk analyses are explored in detail. This was executed by using sensitivity analyses, performed by altering the failure probabilities to their extremes. Astonishing results arose. An attempt has been made to understand the mechanisms leading to the results. Possible explanations are discussed, and the three most important are outlined.

An input to the quantitative risk analyses is reliability data for the safety systems, but there can be nonconformity between the data. Vendor data seem to be too optimistic compared with field performance. Possible explanations are discussed in the thesis.

A best practice is presented, formed as an extended conclusion. Topics considered are:
- Challenges when modeling the event trees
- How to include vulnerability of the safety systems
- Uncertainties with the effect of deluge
- Human factors
- Dependencies
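Event tree analysis, the main method named above, propagates an initiating-event frequency through the branch probabilities along each path to an end event. A minimal sketch (the numbers are purely illustrative, not taken from the thesis or from RiskSpectrum):

```python
def end_event_frequency(init_freq, branch_probs):
    # Frequency of one end event = initiating-event frequency
    # multiplied by the probability of every branch on its path.
    f = init_freq
    for p in branch_probs:
        f *= p
    return f

# Illustrative path: gas leak at 0.1 per year, ignition probability
# 0.05, deluge system failing on demand with probability 0.02.
freq = end_event_frequency(0.1, [0.05, 0.02])  # events per year
```

The dependency problem discussed in the thesis arises precisely because this multiplication assumes the branch events are independent, which shared safety systems can violate.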
APA, Harvard, Vancouver, ISO, and other styles
49

Cavina, Michelangelo. "Bellman functions and their method in harmonic analysis." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/19214/.

Full text
Abstract:
This work uses the method of the Bellman function to show a new proof of Hardy's inequality for Carleson measures. Bellman functions come from the theory of stochastic optimal control and there is a method to prove theorems about inequalities over dyadic trees (which have applications in harmonic analysis) that takes inspiration from concepts from the theory of the Bellman functions. The work will display the important concepts of the theory of Bellman functions in stochastic analysis, will show how to use the method of the Bellman function to prove the estimate over dyadic trees for Carleson measures for Hardy spaces (while also showing the connections between the stochastic theory and the harmonic analysis) and will give a new proof of Hardy's inequality for dyadic trees (which is related to the characterization of Carleson measures in Besov spaces) using the Bellman function method.
APA, Harvard, Vancouver, ISO, and other styles
50

Fuller, Joanne Elizabeth. "Analysis of Affine Equivalent Boolean Functions for Cryptography." Queensland University of Technology, 2003. http://eprints.qut.edu.au/15828/.

Full text
Abstract:
Boolean functions are an important area of study for cryptography. These functions, consisting merely of ones and zeros, are the heart of numerous cryptographic systems and their ability to provide secure communication. Boolean functions have applications in a variety of such systems, including block ciphers, stream ciphers and hash functions. The continued study of Boolean functions for cryptography is therefore fundamental to the provision of secure communication in the future. This thesis presents an investigation into the analysis of Boolean functions and, in particular, the analysis of affine transformations with respect to both the design and application of Boolean functions for cryptography. Past research has often been limited by the difficulties arising from the magnitude of the search space. The research presented in this thesis will be shown to provide an important step towards overcoming such restrictions and hence forms the basis for a new analysis methodology. The new perspective allows a reduced view of the Boolean space in which all Boolean functions are grouped into connected equivalence classes, so that only one function from each class need be established. This approach is a significant development in Boolean function research with many applications, including class distinguishing, class structures, self-mapping analysis and finite-field-based s-box analysis. The thesis will begin with a brief overview of Boolean function theory, including an introduction to the main theme of the research, namely the affine transformation. This will be followed by the presentation of a fundamental new theorem describing the connectivity that exists between equivalence classes. The theorem of connectivity will form the foundation for the remainder of the research presented in this thesis. A discussion of efficient algorithms for the manipulation of Boolean functions will then be presented.
The ability of Boolean function research to achieve new levels of analysis and understanding is centered on the availability of computer-based programs that can perform various manipulations. The development and optimisation of efficient algorithms specifically for execution on a computer will be shown to have a considerable advantage compared with those constructed using a more traditional approach to algorithm optimisation. The theorem of connectivity will be shown to be fundamental in providing many avenues of new analysis and application. These applications include the first non-exhaustive test for determining equivalent Boolean functions, a visual representation of the connected equivalence class structure to aid in the understanding of the Boolean space, and a self-mapping constant that enables enumeration of the functions in each equivalence class. A detailed survey of the classes with six inputs is also presented, providing valuable insight into their range and structure. This theme is then continued in the application of Boolean function construction. Two important new methodologies are presented: the first yields bent functions, and the second yields the best currently known balanced functions of eight inputs with respect to nonlinearity. The implementation of these constructions is extremely efficient. The first construction yields bent functions of a variety of algebraic orders and input sizes. The second construction provides better results than previously proposed heuristic techniques. Each construction is then analysed with respect to its ability to produce functions from a variety of equivalence classes. Finally, in a further application of affine equivalence analysis, the impact on both s-box design and construction will be considered. The effect of linear redundancy in finite-field-based s-boxes will be examined, and in particular it will be shown that the AES s-box possesses complete linear redundancy.
The effect of such analysis will be discussed, and an alternative approach to s-box design that ensures removal of all linear redundancy will be presented, in addition to the best known example of such an s-box.
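The nonlinearity measure central to the constructions described above is conventionally computed from the Walsh-Hadamard spectrum of a function's truth table; a small sketch of that standard computation (illustrative only, not the thesis's algorithms):

```python
def walsh_spectrum(truth_table):
    # Fast Walsh-Hadamard transform of a Boolean function given as a
    # list of 2^n output bits; 0/1 outputs are mapped to +1/-1 first.
    w = [1 - 2 * bit for bit in truth_table]
    h, n = 1, len(w)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                x, y = w[j], w[j + h]
                w[j], w[j + h] = x + y, x - y
        h *= 2
    return w

def nonlinearity(truth_table):
    # NL(f) = 2^(n-1) - max_a |W_f(a)| / 2: the Hamming distance
    # from f to the nearest affine function.
    spectrum = walsh_spectrum(truth_table)
    return len(truth_table) // 2 - max(abs(v) for v in spectrum) // 2
```

For two variables, f(x1, x2) = x1 AND x2 (truth table [0, 0, 0, 1]) attains the bent-function bound NL = 1, while every affine function has NL = 0.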
APA, Harvard, Vancouver, ISO, and other styles