
Dissertations / Theses on the topic 'Vulnerability detection system'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 20 dissertations / theses for your research on the topic 'Vulnerability detection system.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Hou, Haiyu Dozier Gerry V. "GENERTIA: a system for vulnerability analysis, design and redesign of immunity-based anomaly detection system." Auburn, Ala., 2006. http://repo.lib.auburn.edu/2006%20Fall/Dissertations/HOU_HAIYU_22.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Munir, Rashid. "A Quantitative Security Assessment of Modern Cyber Attacks. A Framework for Quantifying Enterprise Security Risk Level Through System's Vulnerability Analysis by Detecting Known and Unknown Threats." Thesis, University of Bradford, 2014. http://hdl.handle.net/10454/14251.

Full text
Abstract:
The Cisco 2014 Annual Security Report clearly outlines the evolution of the threat landscape and the increase in the number of attacks. In 2012 the UK government recognised the cyber threat as a Tier-1 threat, since about 50 government departments had been subjected to either an attack or a direct threat of one. Cyberspace has become the platform of choice for businesses, schools, universities, colleges, hospitals and other sectors. One of the major problems identified by the Department of Homeland Security is the lack of clear security metrics. The recent cyber security breach of the US retail giant TARGET is a typical example that demonstrates the weaknesses of qualitative security, also described by some security experts as fuzzy security: high, medium or low as measures of security do not give a quantitative representation of a company's network security level. In this thesis, a method is developed to quantify the security risk level of known and unknown attacks in an enterprise network in an effort to solve this problem. The vulnerabilities identified in a case study of a UK-based company are classified by severity using the Common Vulnerability Scoring System (CVSS) and the Open Web Application Security Project (OWASP). Probability theory is applied to known attacks to create the security metrics, and a detection and prevention method is suggested to protect the company network against unknown attacks. The resulting security metrics are clear, repeatable and scientifically verifiable.
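As a rough illustration of the quantitative approach described in the abstract above, the Python sketch below maps CVSS base scores to severity bands and folds them into a single probability-weighted risk level. The severity bands follow common CVSS v3 conventions, but the aggregation formula, host names and probabilities are assumptions made for this sketch, not the method of the thesis.

```python
# Hypothetical sketch: aggregate CVSS base scores into one enterprise risk level.
# Severity bands follow CVSS v3 conventions; the weighting scheme is an assumption.

def severity(cvss: float) -> str:
    """Map a CVSS base score (0.0-10.0) to a severity band."""
    if cvss >= 9.0:
        return "critical"
    if cvss >= 7.0:
        return "high"
    if cvss >= 4.0:
        return "medium"
    return "low"

def risk_level(findings: list[tuple[str, float, float]]) -> float:
    """findings = [(host, cvss_score, exploit_probability), ...].

    Returns a 0-10 risk level: the probability-weighted mean CVSS score.
    """
    if not findings:
        return 0.0
    weighted = sum(score * prob for _, score, prob in findings)
    total_prob = sum(prob for _, _, prob in findings)
    return weighted / total_prob if total_prob else 0.0

if __name__ == "__main__":
    scan = [("web01", 9.8, 0.6), ("db01", 6.5, 0.2), ("mail01", 4.3, 0.1)]
    for host, score, _ in scan:
        print(host, severity(score))
    print("enterprise risk level:", round(risk_level(scan), 2))
```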
APA, Harvard, Vancouver, ISO, and other styles
3

Bughio, Kulsoom Saima. "IoMT security: A semantic framework for vulnerability detection in remote patient monitoring." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2024. https://ro.ecu.edu.au/theses/2841.

Full text
Abstract:
The increasing need to safeguard patient data in Internet of Medical Things (IoMT) devices highlights the critical importance of reducing vulnerabilities within these systems. The widespread adoption of IoMT has transformed healthcare by enabling continuous remote patient monitoring (RPM), which enhances patient outcomes and optimizes healthcare delivery. However, the integration of IoMT devices into healthcare systems presents significant security challenges, particularly in protecting sensitive patient data and ensuring the reliability of medical devices. The diversity of data formats used by various vendors in RPM complicates data aggregation and fusion, thereby hindering overall cybersecurity efforts. This thesis proposes a novel semantic framework for vulnerability detection in RPM settings within the IoMT system. The framework addresses interoperability, heterogeneity, and integration challenges through meaningful data aggregation. The core of this framework is a domain ontology that captures the semantics of concepts and properties related to the primary security aspects of IoT medical devices. This ontology is supported by a comprehensive ruleset and complex queries over aggregated knowledge. Additionally, the implementation integrates medical device data with the National Vulnerability Database (NVD) via an API, enabling real-time detection of vulnerabilities and improving the security of RPM systems. By capturing the semantics of medical devices and network components, the proposed semantic model facilitates partial automation in detecting network anomalies and vulnerabilities. A logic-based ruleset enhances the system’s robustness and efficiency, while its reasoning capabilities enable the identification of potential vulnerabilities and anomalies in IoMT systems, thereby improving security measures in remote monitoring settings. The semantic framework also supports knowledge graph visualization and efficient querying through SPARQL. The knowledge graph provides a structured representation of interconnected data and stores Cyber Threat Intelligence (CTI) to enhance data integration, visualization, and semantic enrichment. The query mechanism enables healthcare providers to extract valuable insights from IoMT data, notifying them about new system vulnerabilities or vulnerable medical devices. This demonstrates the impact of vulnerabilities on cybersecurity requirements (Confidentiality, Integrity, and Availability) and facilitates countermeasures based on severity. Consequently, the framework promotes timely decision-making, enhancing the overall efficiency and effectiveness of IoMT systems. The semantic framework is validated through various use cases and existing frameworks, demonstrating its effectiveness and robustness in vulnerability detection within the domain of IoMT security.
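To make the idea of querying an IoMT security knowledge graph concrete, the sketch below builds a tiny RDF graph with rdflib and runs a SPARQL query that returns devices exposed to confidentiality-impacting vulnerabilities. All class and property names (ex:MedicalDevice, ex:hasVulnerability, ex:impactsCIA, ex:severity) and the CVE identifier are invented for illustration and are not the ontology defined in the thesis.

```python
# Illustrative only: a tiny RDF graph and SPARQL query in the spirit of the
# semantic framework described above. All class/property names are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/iomt#")

g = Graph()
g.bind("ex", EX)

# Two monitored devices, one with a known vulnerability affecting confidentiality.
g.add((EX.InfusionPump1, RDF.type, EX.MedicalDevice))
g.add((EX.InfusionPump1, EX.hasVulnerability, EX.CVE_2023_0001))
g.add((EX.CVE_2023_0001, EX.impactsCIA, Literal("Confidentiality")))
g.add((EX.CVE_2023_0001, EX.severity, Literal(8.1)))
g.add((EX.GlucoseMonitor1, RDF.type, EX.MedicalDevice))

query = """
PREFIX ex: <http://example.org/iomt#>
SELECT ?device ?vuln ?sev WHERE {
    ?device a ex:MedicalDevice ;
            ex:hasVulnerability ?vuln .
    ?vuln ex:impactsCIA "Confidentiality" ;
          ex:severity ?sev .
}
"""

for device, vuln, sev in g.query(query):
    print(f"{device} is exposed via {vuln} (CVSS {sev})")
```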
APA, Harvard, Vancouver, ISO, and other styles
4

Яцентюк, Святослав Сергійович (Sviatoslav Yatsentiuk). "Розробка та дослідження автоматизованої системи виявлення вразливостей розподілених комп'ютерних систем" [Development and study of an automated system for detecting vulnerabilities in distributed computer systems]. Master's thesis, Ternopil: TNTU, 2021. http://elartu.tntu.edu.ua/handle/lib/36532.

Full text
Abstract:
The work was carried out at the Department of Computer-Integrated Technologies of Ternopil Ivan Puluj National Technical University, Ministry of Education and Science of Ukraine. The defence will take place on 21 December 2021 at 09:00 at a meeting of examination committee No. 24 at Ternopil Ivan Puluj National Technical University, 56 Ruska St., academic building No. 1, room 403, 46001 Ternopil. The qualification work consists of an explanatory note and a graphic part (illustrative material: slides). The graphic part comprises 10 slides. The explanatory note runs to 64 printed A4 (210 × 297 mm) pages, and the appendices to 10 printed A4 pages. The qualification work consists of six sections containing 15 figures and 3 tables of data; 20 literature sources were used. The work addresses the problem of developing and implementing a distributed service designed to collect and analyse the resource usage of computing nodes, their active processes and the execution of distributed applications, in order to identify deviations in their operation and inform the user about abnormal situations. The main requirements for this service are cluster monitoring and analysis of resource usage.
APA, Harvard, Vancouver, ISO, and other styles
5

Gabelli, Filippo. "Security analysis of physical attacks to offshore O&G facilities." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021.

Find full text
Abstract:
Chemical and petrochemical plants are susceptible to malicious acts because of the attractiveness of the substances they handle, the possibility of stealing secret information, and their strategic importance. A number of qualitative and semi-quantitative methodologies have therefore been developed for assessing the vulnerabilities of such sites. Similarly, offshore facilities, which are central to hydrocarbon production, have been subjected to physical and cyber security events, as past accident analyses show. However, only a few methodologies are tailored to identifying vulnerabilities in that specific environment. The current study aims at closing this gap by building a suitable technique for modelling attacks against offshore sites. The method has been developed for physical attacks only and was drafted with reference to the ASD model, for rendering the adversary pattern, and to the single-path computer model (EASI), for estimating PPS effectiveness. The methodology is intended as a tool to support standard SVA/SRA procedures, with particular reference to API RP 780, so its purpose is to provide a systematic approach to the analysis. It has been applied to a case study, an offshore production platform, to present the results obtained and to outline the link with the API SRA methodology.
APA, Harvard, Vancouver, ISO, and other styles
6

Liu, Long. "Modelling and Vulnerability Assessment of Intelligent Electricity Networks as Cyber-Physical Systems." Thesis, The University of Sydney, 2017. http://hdl.handle.net/2123/17846.

Full text
Abstract:
The future grid differs from the current system and requires more interaction between the electrical network and the data communication network, from the end users to the generators. The electrical network acts as a power supply for the communication nodes and, in turn, the communication network transmits the control messages for the electrical components. Because of this interdependency, any problem in either network may threaten the stability of the whole system. We focus on analysing the vulnerability of the interdependent power and communication networks from the perspectives of topology and system operation. An interdiction model is proposed, considering the security and operation of both power and communication networks, to recognise the crucial set of power components. We solve the interdiction problem using a decomposition method that takes the interdependency between power and communication components into account. We propose a practical smart grid model as two mutually dependent complex networks with an improved interlink allocation strategy. Moreover, we study the problem of intentional attacks targeting interdependent networks generated with a known degree distribution or distribution of interlinks. In both models, each node's degree is correlated with the number of its links that connect to the other network. Detecting the community structure of a power network can effectively improve the availability of control actions, which enhances the power system's resilience. This work proposes a method to group the power components into overlapping communities based on the linear sensitivity of the electrical network and complex network theory. It provides guidance for the design of distributed power control systems and for power system operation planning.
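Betweenness centrality, which the abstract above uses to weight the importance of buses and lines, is directly available in networkx. The sketch below ranks the components of a small made-up network; the 6-bus topology and the top-3 cut-off are assumptions for illustration, not the thesis's test systems.

```python
# Illustrative sketch: rank buses and lines of a toy network by betweenness
# centrality, as a stand-in for the weighted vulnerability index described above.
import networkx as nx

# Hypothetical 6-bus test network (edges = transmission lines).
G = nx.Graph()
G.add_edges_from([(1, 2), (1, 3), (2, 4), (3, 4), (4, 5), (5, 6), (3, 6)])

bus_importance = nx.betweenness_centrality(G, normalized=True)
line_importance = nx.edge_betweenness_centrality(G, normalized=True)

print("most critical buses:",
      sorted(bus_importance, key=bus_importance.get, reverse=True)[:3])
print("most critical lines:",
      sorted(line_importance, key=line_importance.get, reverse=True)[:3])
```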
APA, Harvard, Vancouver, ISO, and other styles
7

Kasse, Mamadou. "Système de Prévention contre les vulnérabilités et de Détection des Anomalies dans les Réseaux Informatiques". Electronic Thesis or Diss., Normandie, 2024. https://theses.hal.science/tel-04885354.

Full text
Abstract:
Tools for vulnerability prevention and anomaly detection are essential for the security of computer networks. This thesis focuses on using MITRE ATT&CK data, CVSS scores, and the ISO 27002:2022 standard to automate and consolidate vulnerability analysis and anomaly detection. The main objectives are: (1) Vulnerability diagnosis: identify the most vulnerable sub-networks by combining MITRE ATT&CK data, CVSS scores, and the ISO 27002:2022 standard. To achieve this, a database called Data ISO-MA was created, and an algorithm evaluates the vulnerability of network paths, identifying those most at risk. (2) Anomaly detection: analyse traffic flows to detect unusual behaviours in vulnerable paths. An approach inspired by the Path-scan model introduced by Joshua Neil et al. (2013) was used: each network connection is modelled with a 3-state Markov model and the Generalized Likelihood Ratio Test (GLRT), allowing abnormal behaviours to be captured and identified. Together, these two tools aim to enhance the security of computer networks by providing an integrated solution for vulnerability prevention and anomaly detection.
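A compact numpy sketch of the GLRT idea mentioned above: the likelihood of an observed 3-state sequence under a baseline transition matrix is compared with its likelihood under the maximum-likelihood matrix fitted to the same window, and a large ratio flags the window as anomalous. The baseline matrix, the example windows and the alarm threshold are invented for this sketch.

```python
# Sketch of a GLRT anomaly score for a 3-state Markov model of a connection.
# The baseline transition matrix and alarm threshold are illustrative assumptions.
import numpy as np

def transition_counts(seq, n_states=3):
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    return counts

def glrt_score(seq, baseline, eps=1e-12):
    """Return 2 * (log-likelihood under the fitted matrix - log-likelihood under the baseline)."""
    counts = transition_counts(seq, baseline.shape[0])
    row_sums = counts.sum(axis=1, keepdims=True)
    fitted = np.divide(counts, np.where(row_sums == 0, 1, row_sums))
    log_ratio = counts * (np.log(fitted + eps) - np.log(baseline + eps))
    return 2.0 * log_ratio.sum()

baseline = np.array([[0.8, 0.15, 0.05],
                     [0.2, 0.7, 0.1],
                     [0.1, 0.2, 0.7]])

normal_window = [0, 0, 1, 0, 0, 1, 1, 0, 0, 0]
suspect_window = [0, 2, 2, 2, 1, 2, 2, 2, 2, 2]

for name, window in [("normal", normal_window), ("suspect", suspect_window)]:
    score = glrt_score(window, baseline)
    print(name, round(score, 2), "ALARM" if score > 10.0 else "ok")
```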
APA, Harvard, Vancouver, ISO, and other styles
8

Akrout, Rim. "Analyse de vulnérabilités et évaluation de systèmes de détection d'intrusions pour les applications Web." Phd thesis, INSA de Toulouse, 2012. http://tel.archives-ouvertes.fr/tel-00782565.

Full text
Abstract:
With the growing development of the Internet, web applications have become increasingly vulnerable and exposed to malicious attacks that can compromise essential properties of information systems such as confidentiality, integrity and availability. To cope with these threats, it is necessary to develop effective protection and testing mechanisms (firewalls, intrusion detection systems, web scanners, etc.). The question that arises is how to evaluate the effectiveness of such mechanisms and what means can be used to analyse their ability to correctly detect attacks against web applications. In this thesis we propose a new method, based on web page clustering techniques, that identifies vulnerabilities from a black-box analysis of the target application. Each identified vulnerability is actually exploited, which ensures that it does not correspond to a false positive. The proposed approach also highlights different potential attack scenarios, including the exploitation of several successive vulnerabilities, by explicitly taking the dependencies between vulnerabilities into account. We focused in particular on code injection vulnerabilities, such as SQL injections. The method led to the implementation of a new vulnerability scanner and was validated experimentally on several examples of vulnerable applications. We also developed an experimental platform integrating the new vulnerability scanner, intended to evaluate the effectiveness of intrusion detection systems for web applications in a context representative of the threats these applications will face in operation. This platform integrates several tools designed to automate the evaluation campaigns as much as possible. In particular, it was used to evaluate two intrusion detection techniques developed by our partners within a cooperative project funded by the ANR, the DALI project.
APA, Harvard, Vancouver, ISO, and other styles
9

Callaghan, Kerry Lee. "The use of remote sensing and GIS in the identification and vulnerability detection of coastal erosion as a hazard in False Bay, South Africa." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/86611.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2014. Coastal erosion is a worldwide hazard whose consequences can only be mitigated via thorough and efficient monitoring of erosion and of vulnerability to erosion. This study aimed to establish the accuracy, efficacy and efficiency of various remote sensing techniques for detecting and monitoring coastal erosion and vulnerability in False Bay, South Africa. There is a need to monitor the erosion in this area as well as to determine the most effective techniques for monitoring erosion in False Bay and other similar environments in the future. This study provides an assessment of the usefulness of different data sources and techniques for change detection in the coastal environment. The data sources used were Landsat TM/ETM+ imagery and aerial photographs. Image differencing, tasselled cap transformations, vegetation index differencing, Boolean change detection, and post-classification change detection were all performed on the Landsat imagery. The aerial photographs were assessed using the Digital Shoreline Analysis System (DSAS) add-on for ArcGIS, which determines statistical differences in the shoreline position as digitised in vector format. The results showed that while the resolution of the Landsat imagery was not sufficient to analyse erosion along the beach itself, the larger area covered by the satellite images enabled vulnerability indicators to be seen. Notably, the post-classification change detection indicated consistent increases in built-up areas, while sand dune, beach, and sand (not beach) all decreased. NDVI differencing showed consistent decreases in NDVI, indicating decreasing plant health and density. The results of image differencing with both band 4 and the brightness band led to the conclusion that vegetation health was decreasing while reflective surfaces such as bare sand and roads were increasing. All of these indicate an increased vulnerability to coastal erosion. The Boolean change detection method was found not to be useful in this case. Aerial photographs were studied for four focus areas: Bayview Heights, Macassar Beach, Strand, and Pringle Bay. The results showed erosion at all four areas, with Strand experiencing only erosion (no accretion) at an average of 53 cm per year. Erosion at Macassar Beach and Pringle Bay was also severe, while Bayview Heights was the least severe, showing a combination of erosion and accretion. The higher resolution of the aerial photographs was vital for viewing changes on the beach itself. Future studies requiring assessment of changes in the position or condition of the beach itself should use aerial photographs or high-resolution satellite data; studies of vulnerability extending over the entire coastal zone may make use of Landsat TM images. Post-classification change detection provides powerful change-direction information and can indicate the percentage of area change from one class to another; however, image differencing and vegetation index differencing are much faster to perform and can provide information about general trends in the changes occurring. Therefore post-classification change detection might be used in areas of high and rapid change, while image differencing and vegetation index differencing can be useful for covering vast areas where little change is expected.
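NDVI differencing, one of the change detection techniques evaluated above, reduces to simple band arithmetic once the two dates are co-registered. The sketch below uses random arrays in place of real Landsat bands, and the change threshold is an arbitrary placeholder rather than a value from the study.

```python
# Minimal NDVI-differencing sketch for two co-registered acquisition dates.
# Band arrays are assumed to be float arrays of identical shape; the change
# threshold of -0.1 is an arbitrary placeholder, not the study's value.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Hypothetical reflectance bands for date 1 and date 2 (e.g. Landsat TM bands 4 and 3).
red_t1, nir_t1 = np.random.rand(100, 100), np.random.rand(100, 100)
red_t2, nir_t2 = np.random.rand(100, 100), np.random.rand(100, 100)

delta = ndvi(nir_t2, red_t2) - ndvi(nir_t1, red_t1)

# Pixels whose NDVI dropped markedly: candidate vegetation loss, i.e. higher erosion vulnerability.
loss_mask = delta < -0.1
print("pixels flagged as vegetation loss:", int(loss_mask.sum()))
```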
APA, Harvard, Vancouver, ISO, and other styles
10

Potnuru, Srinath. "Fuzzing Radio Resource Control messages in 5G and LTE systems : To test telecommunication systems with ASN.1 grammar rules based adaptive fuzzer." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-294140.

Full text
Abstract:
5G telecommunication systems must be ultra-reliable to meet the needs of the next evolution in communication. The systems deployed must be thoroughly tested and must conform to their standards. Software and network protocols are commonly tested with techniques like fuzzing, penetration testing, code review, and conformance testing. With fuzzing, testers can send crafted inputs to monitor the System Under Test (SUT) for a response. 3GPP, the standardization body for the telecom system, produces new versions of specifications as part of continuously evolving features and enhancements. This leads to many versions of specifications for a network protocol like Radio Resource Control (RRC), and testers need to constantly update the testing tools and the testing environment. In this work, it is shown that by using the generic nature of RRC specifications, which are given in the Abstract Syntax Notation One (ASN.1) description language, one can design a testing tool that adapts to all versions of 3GPP specifications. This thesis introduces an ASN.1-based adaptive fuzzer that can be used for testing RRC and other network protocols based on the ASN.1 description language. The fuzzer extracts knowledge about ongoing RRC messages using protocol description files of RRC, i.e., the RRC ASN.1 schema from 3GPP, and uses that knowledge to fuzz RRC messages. The adaptive fuzzer identifies individual fields, sub-messages, and custom data types according to the specifications when mutating the content of existing messages. Furthermore, the adaptive fuzzer has identified a previously unidentified vulnerability in the Evolved Packet Core (EPC) of srsLTE and openLTE, two open-source LTE implementations, confirming its applicability to robustness testing of RRC and other network protocols.
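The key idea above, letting the message grammar drive mutation so the fuzzer keeps up with new specification versions, can be illustrated without real ASN.1 tooling. In the sketch below a plain dictionary stands in for a decoded RRC message and a small hand-written "schema" records field types and constraints; all field names, ranges and mutation choices are invented for illustration.

```python
# Grammar-aware mutation sketch. A real implementation would decode/encode the
# message with an ASN.1 library driven by the 3GPP RRC schema; here a dict
# stands in for the decoded message and the "schema" only records field types.
import random

schema = {                      # hypothetical field -> (type, valid range/choices)
    "rrc_transaction_id": ("int", (0, 3)),
    "establishment_cause": ("enum", ["emergency", "mt-Access", "mo-Signalling", "mo-Data"]),
    "ue_identity": ("bitstring", 40),      # length in bits
}

def mutate_field(name, value, rng=random):
    ftype, constraint = schema[name]
    if ftype == "int":
        lo, hi = constraint
        # Boundary and out-of-range values are the interesting fuzz cases.
        return rng.choice([lo - 1, hi + 1, 2**31 - 1, rng.randint(lo, hi)])
    if ftype == "enum":
        return rng.choice(constraint + ["undefined-value"])
    if ftype == "bitstring":
        return rng.getrandbits(constraint + rng.choice([0, 8, 1024]))
    return value

def fuzz(message, n_fields=1, rng=random):
    fuzzed = dict(message)
    for name in rng.sample(list(message), k=n_fields):
        fuzzed[name] = mutate_field(name, message[name], rng)
    return fuzzed

seed_msg = {"rrc_transaction_id": 1, "establishment_cause": "mo-Data", "ue_identity": 0x1F}
for _ in range(3):
    print(fuzz(seed_msg))
```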
APA, Harvard, Vancouver, ISO, and other styles
11

Izagirre, Mikel. "Deception strategies for web application security: application-layer approaches and a testing platform." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-64419.

Full text
Abstract:
The popularity of the internet has made the use of web applications ubiquitous and essential to the daily lives of people, businesses and governments. Web servers and web applications are commonly used to handle tasks and data that can be critical and highly valuable, making them a very attractive target for attackers and a vector for successful attacks that are aimed at the application layer. Existing misuse and anomaly-based detection and prevention techniques fail to cope with the volume and sophistication of new attacks that are continuously appearing, which suggests that there is a need to provide new additional layers of protection. This work aims to design a new layer of defense based on deception that is employed in the context of web application-layer traffic with the purpose of detecting and preventing attacks. The proposed design is composed of five deception strategies: Deceptive Comments, Deceptive Request Parameters, Deceptive Session Cookies, Deceptive Status Codes and Deceptive JavaScript. The strategies were implemented as a software artifact and their performance evaluated in a testing environment using a custom test script, the OWASP ZAP penetration testing tool and two vulnerable web applications. Deceptive Parameter strategy obtained the best security performance results, followed by Deceptive Comments and Deceptive Status Codes. Deceptive Cookies and Deceptive JavaScript got the poorest security performance results since OWASP ZAP was unable to detect and use deceptive elements generated by these strategies. Operational performance results showed that the deception artifact could successfully be implemented and integrated with existing web applications without changing their source code and adding a low operational overhead.
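As a rough illustration of the deceptive session cookie strategy evaluated above, the Flask sketch below plants a decoy cookie on every response and flags any request that returns it modified, which a legitimate browser never does. The cookie name, bait value and response handling are assumptions for this sketch; a production deployment would at least randomise the bait per client.

```python
# Minimal Flask sketch of a "deceptive session cookie" layer: a decoy cookie is
# planted on every response, and any request that returns it with a modified
# value is flagged as probable tampering. Names and logging are illustrative.
from flask import Flask, request

app = Flask(__name__)
DECOY_NAME = "adm_session"       # hypothetical decoy cookie name
DECOY_VALUE = "0f3a9c"           # static bait value; randomise per client in practice

@app.before_request
def detect_tampering():
    sent = request.cookies.get(DECOY_NAME)
    if sent is not None and sent != DECOY_VALUE:
        # A legitimate browser only echoes the value back unchanged.
        app.logger.warning("decoy cookie tampered with from %s", request.remote_addr)
        return ("Forbidden", 403)

@app.after_request
def plant_decoy(response):
    response.set_cookie(DECOY_NAME, DECOY_VALUE, httponly=False)
    return response

@app.route("/")
def index():
    return "hello"

if __name__ == "__main__":
    app.run(port=8080)
```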
APA, Harvard, Vancouver, ISO, and other styles
12

Chang, Shun-Lee (張舜理). "A Security Testing System for Vulnerability Detection." Thesis, 1999. http://ndltd.ncl.edu.tw/handle/01883050089856034879.

Full text
Abstract:
Master's thesis, National Chiao Tung University, Department of Computer Science and Information Engineering (ROC academic year 87). In this thesis, a security-testing system is proposed that takes advantage of static verification and run-time analysis to discover potential vulnerabilities. Through an in-depth investigation of advisories announced by CERT/CC, we find that coding defects in software account for the major part of Internet security incidents. Software should therefore be examined by the security-testing system to find vulnerabilities before it is used. The proposed system is able to retrieve and test the resources related to the software under test; after testing, those resources are ensured not to enable penetration. The system also provides extensibility so that testers can add new testing requirements to complete the testing, and with this extensibility undiscovered vulnerabilities can be detected by adding correlative testing rules. A prototype is implemented to prove the feasibility of our system.
APA, Harvard, Vancouver, ISO, and other styles
13

Yang, Zhenrong. "On building a dynamic security vulnerability detection system using program monitoring technique." Thesis, 2008. http://spectrum.library.concordia.ca/976019/1/MR40905.pdf.

Full text
Abstract:
This thesis presents a dynamic security vulnerability detection framework that sets up an infrastructure for automatic security testing of Free and Open Source Software (FOSS) projects. It makes three contributions to the design and implementation of a dynamic vulnerability detection system. Firstly, a mathematical model called Team Edit Automata is defined and implemented for security property specification. Secondly, an automatic code instrumentation tool is designed and implemented by extending the GNU Compiler Collection (GCC). The extension facilitates seamless integration of code instrumentation into FOSS projects' existing build system. Thirdly, a dynamic vulnerability detection system is prototyped to integrate the aforementioned two techniques. Experiments with the system are elaborated to automatically build, execute, and detect vulnerabilities of FOSS projects. Overall, this research demonstrates that monitoring program with Team Edit Automata can effectively detect security property violation.
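The Team Edit Automata formalism defined in the thesis is considerably richer than what fits here, but the flavour of runtime security-property monitoring can be shown with a small state-machine monitor that flags a use-after-close violation on a file handle. The states, events and monitored property below are invented for this sketch and are not the thesis's model.

```python
# Simplified runtime property monitor (not the Team Edit Automata formalism
# itself): a finite-state machine that flags "use after close" on a file handle.
# Events, states and the monitored property are illustrative assumptions.

TRANSITIONS = {
    ("closed", "open"): "opened",
    ("opened", "read"): "opened",
    ("opened", "write"): "opened",
    ("opened", "close"): "closed",
}

def monitor(events):
    state = "closed"
    for i, event in enumerate(events):
        nxt = TRANSITIONS.get((state, event))
        if nxt is None:
            print(f"violation at event {i}: {event!r} not allowed in state {state!r}")
            return False
        state = nxt
    return True

monitor(["open", "read", "close", "read"])   # reports a use-after-close violation
monitor(["open", "write", "close"])          # clean run
```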
APA, Harvard, Vancouver, ISO, and other styles
14

Chang, Tien-Chan (張添展). "The Vulnerability Assessment System of Security Detection – A Case Study of M Corporation." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/sy47de.

Full text
Abstract:
Master's thesis, National Chiao Tung University, Information Management Program, College of Management (ROC academic year 106). The popularity of the Internet has accelerated the development of various applications of information technology, and recent research shows that the number of information security incidents has increased significantly. While we benefit from the convenience of the Internet, many unknown system weaknesses have been discovered and then exploited to spread attacks. For example, the WannaCry ransomware uses the Microsoft SMB protocol to remotely execute code, and the Bad Rabbit ransomware exploits a Flash vulnerability to remotely execute malicious code. In recent years, information security incidents have often taken the form of APT (Advanced Persistent Threat) attacks. In general, the internal host computers of an enterprise are not updated for various reasons (the system is stable, the programs running on the host are out of date, etc.), so if the enterprise is attacked through an APT, the impact on the business is disastrous. According to research, these unexplained system risks and weaknesses are mainly caused by faulty system design, or by vulnerabilities introduced through operational processes or human negligence. Hidden system vulnerabilities clearly create great benefits for attackers: they generally cause great losses by the time an information security incident is discovered, and it is often difficult to trace the source and restore the original state afterwards. Actively discovering system weaknesses and taking preventive measures is therefore an important task. This research studies company M in the foundry industry. Before implementing the vulnerability assessment system, the administrators could rely only on the traditional WSUS and anti-virus software for warnings and were unaware of the weaknesses of the hosts. Through validation and testing, the vulnerability assessment system helps the administrators quickly find vulnerabilities and provides suggestions for patches, which clearly helps improve information security in the enterprise.
APA, Harvard, Vancouver, ISO, and other styles
15

Soares, João Pedro dos Santos. "Implementation of a distributed intrusion detection and reaction system." Master's thesis, 2016. http://hdl.handle.net/10316/99196.

Full text
Abstract:
Final internship report for the Master's in Informatics Engineering, presented to the Faculty of Sciences and Technology of the University of Coimbra. Security was not always an important aspect of networking and hosts; nowadays, it is absolutely mandatory. Security measures must make an effort to evolve at the same rate, or even at a higher rate, than threats, which is proving to be the most difficult of tasks. In this report we detail the implementation of a real distributed intrusion detection and reaction system that will be responsible for securing a core set of networks already in production, comprising thousands of servers, users and their respective confidential information.
APA, Harvard, Vancouver, ISO, and other styles
16

Pang, Chengzong. "Fast Detection and Mitigation of Cascading Outages in the Power System." Thesis, 2011. http://hdl.handle.net/1969.1/ETD-TAMU-2011-12-10514.

Full text
Abstract:
This dissertation studies the causes and mechanisms of power system cascading outages and proposes an improved interactive scheme between the system-wide and local levels of monitoring and control to quickly detect, classify and mitigate cascading outages in the power system. A novel method, named weighted vulnerability analysis, is developed for evaluating the vulnerability of individual components as well as of the whole power system. Betweenness centrality is used to measure the importance of each bus and transmission line in the modelled power system network, which in turn determines the weights for the weighted vulnerability index. The method features fast reaction time and achieves higher accuracy than traditional methods in cascading outage detection, classification and mitigation. The overload problem caused by power flow redistribution after one line is tripped is a critical factor contributing to cascading outages. A parallel corridor search method is proposed to quickly identify the most vulnerable components after a transmission line is tripped. The power system topology model can be simplified into a state graph after searching the domains of each generator, the commons of each bus, and the links between the commons; the parallel corridor is then determined by searching the links and commons in the system topology graph for the given power system operating state. During a stressed operating state, either stable or unstable power swings may affect distance relay judgement and lead to relay misoperation, resulting in lines being tripped and, as a consequence, the operating state becoming even more stressed. At the local level, an enhanced fault detection tool for use during power swings is developed to reduce the chance of relay misoperation. Comprehensive simulation studies have been carried out using the IEEE 39-bus and 118-bus test systems. The results are promising: the weighted vulnerability analysis provides better situational awareness and accurate information about the disturbance; the parallel corridor search method identifies the most vulnerable lines after power redistribution, giving the operator time to take remedial actions; and the new travelling-wave and wavelet-transform-based fault detection reduces the impact of relay misoperation.
APA, Harvard, Vancouver, ISO, and other styles
17

VESCIO, GIOVANNI. "Vulnerability and Reliability Assessment of Electrical Power System using Petri Nets." Doctoral thesis, 2012. http://hdl.handle.net/11573/916929.

Full text
Abstract:
The research aims to build behavioural models of electrical power systems through the use of Petri nets, covering both traditional reliability analysis (quantitative measurement) and behaviour analysis (qualitative measurement) for the detection of topology defects. The developed models have allowed the operating sequence of the electrical system to be studied, where the evolution of the monitored system is compared with its ideal evolution. The modelling criterion is then validated by implementation in two case studies. In the first case, the whole approach is applied to a complex electrical plant supplying a critical structure such as a hospital. Petri nets are capable of modelling and simulating all the functional dependencies between the system elements and also the correct fault-repair behaviour of the tested power system. This allows the system to be analysed for properties such as tolerance of failures and isolation of safety hazards, to determine which functions are most critical and need to be enhanced in order to mitigate accidental risks. In the second case study, a new approach to LOTO (lockout/tagout) procedures using Petri nets is introduced. The suggested methodology supports the mechanical development of operational procedures and can assist the operator, especially for non-automatic operations. It proposes some simple rules, a graphical representation of electrical status and an algebraic model that allow an automatic check of the interlocking procedure.
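The basic Petri net execution rule used throughout such models (a transition fires only when every input place is marked, consuming and producing tokens) fits in a few lines of Python. The breaker/lockout net below is a made-up example in the spirit of the LOTO modelling described above, not a model from the thesis.

```python
# Tiny Petri net simulator: places hold tokens, a transition fires when all of
# its input places are marked. The breaker/lockout net is a hypothetical example.

class PetriNet:
    def __init__(self, marking, transitions):
        self.marking = dict(marking)                 # place -> token count
        self.transitions = transitions               # name -> (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

net = PetriNet(
    marking={"breaker_closed": 1},
    transitions={
        "open_breaker": (["breaker_closed"], ["breaker_open"]),
        "apply_lockout": (["breaker_open"], ["locked_out"]),
    },
)

print("apply_lockout enabled before opening?", net.enabled("apply_lockout"))  # False
net.fire("open_breaker")
net.fire("apply_lockout")
print("final marking:", net.marking)  # breaker opened and locked out
```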
APA, Harvard, Vancouver, ISO, and other styles
18

Danese, Alessandro. "System-level functional and extra-functional characterization of SoCs through assertion mining." Doctoral thesis, 2018. http://hdl.handle.net/11562/979447.

Full text
Abstract:
Virtual prototyping is today an essential technology for the modelling, verification, and re-design of full HW/SW platforms. It allows fast prototyping of platforms of ever-increasing complexity, which precludes traditional verification approaches based on static analysis of the source code. Consequently, several technologies based on the analysis of simulation traces have been proposed to efficiently validate the entire system from both the functional and the extra-functional point of view. From the functional point of view, different approaches based on invariant and assertion mining have been proposed in the literature to validate the functionality of a system under verification (SUV). Dynamic mining of invariants is a class of approaches that extract logic formulas expressing stable conditions in the behaviour of the SUV. The mined formulas represent likely invariants for the SUV, which certainly hold on the considered traces. A large set of representative execution traces must be analysed to increase the probability that mined invariants are generally true; however, this is extremely time-consuming for current sequential approaches when long execution traces and large sets of SUV variables are considered. Dynamic mining of assertions is instead a class of approaches that extract temporal logic formulas expressing temporal relations among the variables of a SUV. In most cases, however, existing tools can only mine assertions compliant with a limited set of pre-defined templates; they tend to generate a huge number of assertions, and they still lack an effective way to measure their coverage in terms of design behaviours. Moreover, the security vulnerability of firmware running on a HW/SW platform is becoming ever more critical in the functional verification of a SUV. Current approaches in the literature focus only on raising an error as soon as an assertion monitoring the SUV fails; no approach has been proposed to investigate the possibility that this set of assertions is incomplete and that different, unusual behaviours remain uninvestigated. From the extra-functional point of view, several approaches based on power state machines (PSMs) have been proposed for modelling and simulating the power consumption of an IP at system level. However, while they focus on the use of PSMs as the underlying formalism for implementing dynamic power management techniques of a SoC, they generally do not address the basic problem of how to generate a PSM. In this context, the thesis exploits dynamic assertion mining to improve current approaches for characterising functional and extra-functional properties of a SoC, with the final goal of providing an efficient and effective system-level virtual prototyping environment. In detail, the presented methodologies focus on: efficient extraction of invariants from execution traces by exploiting GP-GPU architectures; extraction of human-readable temporal assertions by combining user-defined assertion templates, data mining and coverage analysis; generation of assertions pinpointing the unlikely execution paths of a firmware to guide the analysis of the security vulnerabilities of a SoC; and, last but not least, automatic generation of PSMs for the extra-functional characterisation of the SoC.
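A toy version of the dynamic invariant mining described above: candidate relations are proposed over the variables observed in execution traces, and every candidate falsified by at least one sample is discarded, so the survivors are likely invariants. The two templates used here (variable equals a constant, x ≤ y) and the example trace are a deliberately small illustrative subset, far from the GPU-accelerated mining of the thesis.

```python
# Minimal dynamic invariant miner: propose simple candidate invariants and drop
# every candidate falsified by at least one trace sample. Templates are a toy
# subset (variable == constant, x <= y) of what real miners consider.
from itertools import permutations

def mine_invariants(trace):
    """trace: list of {variable: value} samples taken at the same program point."""
    variables = sorted(trace[0])
    candidates = set()

    # "v == c" candidates seeded from the first sample.
    candidates |= {("eq_const", v, trace[0][v]) for v in variables}
    # "x <= y" candidates over all ordered variable pairs.
    candidates |= {("le", x, y) for x, y in permutations(variables, 2)}

    for sample in trace:
        falsified = set()
        for inv in candidates:
            kind = inv[0]
            if kind == "eq_const" and sample[inv[1]] != inv[2]:
                falsified.add(inv)
            elif kind == "le" and not sample[inv[1]] <= sample[inv[2]]:
                falsified.add(inv)
        candidates -= falsified
    return candidates

trace = [
    {"req_len": 12, "buf_size": 64, "state": 1},
    {"req_len": 40, "buf_size": 64, "state": 1},
    {"req_len": 64, "buf_size": 64, "state": 1},
]
for inv in sorted(mine_invariants(trace), key=str):
    print(inv)
```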
APA, Harvard, Vancouver, ISO, and other styles
19

Farroukh, Amer. "Enhancing Performance of Vulnerability-based Intrusion Detection Systems." Thesis, 2010. http://hdl.handle.net/1807/25574.

Full text
Abstract:
The accuracy of current intrusion detection systems (IDSes) is hindered by the limited capability of regular expressions (REs) to express the exact vulnerability. Recent advances have proposed vulnerability-based IDSes that parse traffic and retrieve protocol semantics to describe the vulnerability. Such a description of attacks is analogous to subscriptions that specify events of interest in event processing systems. However, the matching engine of state-of-the-art IDSes lacks efficient matching algorithms that can process many signatures simultaneously. In this work, we place event processing at the core of the IDS and propose novel algorithms to efficiently parse and match vulnerability signatures. We are also among the first to detect complex attacks such as the Conficker worm, which requires correlating multiple protocol data units (MPDUs) while maintaining a small memory footprint. Our approach incurs negligible overhead when processing clean traffic, is resilient to attacks, and is faster than existing systems.
APA, Harvard, Vancouver, ISO, and other styles
20

Chen, Po-Ying (陳柏穎). "A Vulnerability Detecting System for Android Webview and Web Applications." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/9s48c6.

Full text
APA, Harvard, Vancouver, ISO, and other styles