
Dissertations / Theses on the topic 'Computer software – Security measures'


1

Allam, Sean. "A model to measure the maturity of smartphone security at software consultancies." Thesis, University of Fort Hare, 2009. http://hdl.handle.net/10353/281.

Full text
Abstract:
Smartphones are proliferating into the workplace at an ever-increasing rate, and the threats they pose are increasing with them. In an era of constant connectivity and availability, information is freed from the constraints of time and place. This research project delves into the risks introduced by smartphones and, through multiple case studies, formulates a maturity measurement model. The model is based on recommendations from two leading information security frameworks: the COBIT 4.1 framework and the ISO 27002 code of practice. Ultimately, smartphone-specific risks are integrated with key control recommendations to provide a set of key measurable security maturity components. The subjective opinions of case study respondents are considered a key component in achieving a solution. The solution addresses the concerns not only of policy makers, but also of the employees subject to the security policies. Nurturing security awareness into organisational culture through reinforcement and employee acceptance is highlighted in this research project. Software consultancies can use this model to mitigate risks while harnessing the potential strategic advantages of mobile computing through smartphone devices. In addition, this research project identifies the critical components of a smartphone security solution. As a result, a model is provided for software consultancies, given the intense reliance on information within these types of organisations. The model can be effectively applied to any information-intensive organisation.
APA, Harvard, Vancouver, ISO, and other styles
2

Monteiro, Valter. "How intrusion detection can improve software decoy applications." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Mar%5FMonteiro.pdf.

Full text
3

Colesky, Michael Robert. "SecMVC : a model for secure software design based on the model-view-controller pattern." Thesis, Nelson Mandela Metropolitan University, 2014. http://hdl.handle.net/10948/d1020614.

Full text
Abstract:
Advances in the software development industry are making software more ubiquitous by the day. This has made security, not only in the broader sense but specifically within the design and overall development of software itself, all the more important. A prevalent problem in the domain of software development is that software security is not consistently addressed during design, which undermines core security concerns and leads to the development of insecure software. This research seeks to address this issue via a model for secure software design based on a software design pattern, namely the Model-View-Controller (MVC) pattern. The use of a pattern to convey knowledge is not a new notion. However, the ability of software design patterns to convey secure software design is an idea worth investigating. Following identification of secure software design principles and concepts, as well as software design patterns, specifically those relating to the MVC pattern, a model was designed and developed. With the MVC pattern argued to be a suitable foundation for the model, the security-conscious MVC (SecMVC) combines secure software design principles and concepts into the MVC pattern. In addition, the component patterns of the MVC Compound pattern, namely the Observer pattern, the Strategy pattern, and the Composite pattern, provided further sub-models for less abstraction and greater detail. These sub-models were developed as a result of the SecMVC model's evaluation in the validation for this study, an expert review. Argued in the light of similar research methods, the expert review was chosen, along with a process that included the use of two expert participants to validate the SecMVC model. It was determined through the expert review that the SecMVC model is of sufficient utility, quality, and efficacy to constitute research value.
The research methodology followed was design science, in which the SecMVC model, including its related sub-models, serves as the artefact and research output of this study. This research study contributes evidence of the feasibility of integrating knowledge into software design patterns. This includes the SecMVC model itself. In addition, it argues for the use of an expert review as an evaluative research method for such an artefact.
4

Ma, Chunyan. "Mathematical security models for multi-agent distributed systems." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2568.

Full text
Abstract:
This thesis presents a taxonomy of the security threats in agent-based distributed systems. Based on this taxonomy, a set of theories is developed to facilitate analyzing the security threats of mobile-agent systems. We propose the idea of using the developed security risk graph to model the system's vulnerabilities.
5

Ur-Rehman, Wasi. "Maintaining Web Applications Integrity Running on RADIUM." Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc804975/.

Full text
Abstract:
Computer security attacks take place due to the presence of vulnerabilities and bugs in software applications. Bugs and vulnerabilities are the result of weak software architecture and a lack of standard software development practices. Despite the fact that software companies are investing millions of dollars in the research and development of software designs, security risks are still at large. In some cases software applications are found to carry vulnerabilities for many years before being identified; a recent example is the Heartbleed bug in OpenSSL/TLS. In today's world, where new software applications are continuously being developed for a varied community of users, it is highly unlikely to have software applications running without flaws. Attackers exploit these vulnerabilities and bugs and threaten privacy without leaving any trace. The most critical vulnerabilities are those related to the integrity of software applications, because integrity is directly linked to the credibility of an application and the data it contains. This thesis presents a solution for maintaining the integrity of web applications running on RADIUM by using Daikon. Daikon generates invariants; these invariants are used to maintain the integrity of the web application and to check its correct behavior at run time on the RADIUM architecture in case of any attack or malware. The solution uses data invariants and program-flow invariants to maintain the integrity of a web application against such attacks or malware. The behavior of the proposed invariants is checked at run time using the LibVMI/Volatility memory introspection tools. This is a novel approach and a proof of concept toward maintaining web application integrity on RADIUM.
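The invariant-checking idea described in this abstract can be illustrated with a minimal sketch. The state fields and invariants below are invented for illustration; they are not taken from the RADIUM prototype, and real Daikon output uses its own invariant grammar rather than Python predicates.

```python
# Sketch: runtime checking of data invariants over a web application's state.
# Invariant names, state fields, and values are hypothetical examples.

def check_invariants(state, invariants):
    """Return the names of invariants violated by the given program state."""
    return [name for name, pred in invariants.items() if not pred(state)]

# Hypothetical invariants a dynamic tool might infer for a web application:
INVARIANTS = {
    "session_id_nonempty": lambda s: len(s["session_id"]) > 0,
    "cart_total_nonnegative": lambda s: s["cart_total"] >= 0,
    "role_is_known": lambda s: s["role"] in {"guest", "user", "admin"},
}

normal = {"session_id": "abc123", "cart_total": 10, "role": "user"}
tampered = {"session_id": "abc123", "cart_total": -5, "role": "root"}

print(check_invariants(normal, INVARIANTS))    # no violations expected
print(check_invariants(tampered, INVARIANTS))  # two violated invariants
```

In the thesis's setting, the check would run outside the monitored guest (via memory introspection) rather than inside the application itself.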
6

Kaiser, Edward Leo. "Addressing Automated Adversaries of Network Applications." PDXScholar, 2010. https://pdxscholar.library.pdx.edu/open_access_etds/4.

Full text
Abstract:
The Internet supports a perpetually evolving patchwork of network services and applications. Popular applications include the World Wide Web, online commerce, online banking, email, instant messaging, multimedia streaming, and online video games. Practically all networked applications have a common objective: to directly or indirectly process requests generated by humans. Some users employ automation to establish an unfair advantage over non-automated users. The perceived and substantive damages that automated, adversarial users inflict on an application degrade its enjoyment and usability by legitimate users, and result in reputation and revenue loss for the application's service provider. This dissertation examines three challenges critical to addressing the undesirable automation of networked applications. The first challenge explores individual methods that detect various automated behaviors. Detection methods range from observing unusual network-level request traffic to sensing anomalous client operation at the application level. Since many detection methods are not individually conclusive, the second challenge investigates how to combine detection methods to accurately identify automated adversaries. The third challenge considers how to leverage the available knowledge to disincentivize adversary automation by nullifying their advantage over legitimate users. The thesis of this dissertation is that there exist methods to detect automated behaviors with which an application's service provider can identify and then systematically disincentivize automated adversaries. This dissertation evaluates this thesis using research performed on two network applications that have different access to the client software: Web-based services and multiplayer online games.
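The second challenge above, combining individually inconclusive detectors, can be sketched as a simple weighted score fusion. The detector names, weights, and example signal values here are hypothetical, not the dissertation's actual detectors or combination method.

```python
# Sketch: fusing several inconclusive per-detector scores (each in [0, 1])
# into a single automation score via a weighted average.

def automation_score(signals, weights):
    """Weighted average of per-detector scores; higher means more bot-like."""
    total = sum(weights.values())
    return sum(weights[k] * signals[k] for k in weights) / total

# Hypothetical detectors and weights:
WEIGHTS = {"request_rate": 0.5, "timing_regularity": 0.3, "input_entropy": 0.2}

human = {"request_rate": 0.1, "timing_regularity": 0.2, "input_entropy": 0.3}
bot = {"request_rate": 0.9, "timing_regularity": 0.95, "input_entropy": 0.9}

print(automation_score(human, WEIGHTS))  # low score
print(automation_score(bot, WEIGHTS))    # high score
```

A service provider could then apply graduated disincentives (rate limits, challenges) once the fused score crosses a chosen threshold, rather than acting on any single noisy detector.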
7

Vibhute, Tejaswini Ajay. "EPA-RIMM-V: Efficient Rootkit Detection for Virtualized Environments." PDXScholar, 2018. https://pdxscholar.library.pdx.edu/open_access_etds/4485.

Full text
Abstract:
The use of virtualized environments continues to grow for efficient utilization of the available compute resources. Hypervisors virtualize the underlying hardware resources and allow multiple Operating Systems to run simultaneously on the same infrastructure. Since the hypervisor is installed at a higher privilege level than the Operating Systems in the software stack, it is vulnerable to rootkits that can modify the environment to gain control, crash the system, and even steal sensitive information. Thus, runtime integrity measurement of the hypervisor is essential. The currently proposed solutions achieve this goal by relying either partially or entirely on the features of the hypervisor itself, causing them to lack stealth and leaving them vulnerable to attack. We have developed a performance-sensitive methodology for identifying rootkits in hypervisors from System Management Mode (SMM), using the features of the SMI Transfer Monitor (STM). The STM is a recent technology from Intel: a virtual machine monitor at the firmware level. Our solution extends a research prototype called EPA-RIMM, developed by Delgado and Karavanic at Portland State University. Our solution extends the state of the art in that it stealthily performs measurements of hypervisor memory and critical data structures using firmware features, keeps performance perturbation to acceptable levels, and leverages the security features provided by the STM. We describe our approach and include experimental results using a prototype we have developed for the Xen hypervisor on the MinnowBoard Turbot, an open hardware platform.
8

Hughes, Grant Douglas. "A framework for software patch management in a multi-vendor environment." Thesis, Cape Peninsula University of Technology, 2016. http://hdl.handle.net/20.500.11838/2478.

Full text
Abstract:
Thesis (MTech (Information Technology))--Cape Peninsula University of Technology, 2016.
Software often requires patches to be installed post-implementation for a variety of reasons. Organisations and individuals, however, do not always promptly install these patches as and when they are released. This study investigated the reasons for the delay or hesitation, identified the challenges, and proposed a model that could assist organisations in overcoming the identified challenges. The research investigated the extent to which the integration of software patch management and enterprise data security is an important management responsibility, by reviewing relevant documents and interviewing key role players currently involved in the patch management process. The current challenges and complexities involved in patch management at an enterprise level could place organisations at risk by compromising their enterprise-data security. This research primarily sought to identify the challenges causing the management of software patches to be complex, and further attempted to establish how organisations currently implement patch management. The aim of the study was to explore the complexities of software patch management in order to enhance enterprise data security within organisations. A single case study was used, and data were obtained from primary sources and literature. The study considered both technological and human factors, and found that both factors play an equally important role with regard to the successful implementation of a patch management program within an organisation.
9

Xuan, Chaoting. "Countering kernel malware in virtual execution environments." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31718.

Full text
Abstract:
Thesis (Ph.D.)--Electrical and Computer Engineering, Georgia Institute of Technology, 2010.
Committee Chair: John A. Copeland; Committee Members: Alessandro Orso, Douglas M. Blough, George F. Riley, Raheem A. Beyah. Part of the SMARTech Electronic Thesis and Dissertation Collection.
10

Ransbotham, Samuel B. III. "Acquisition and diffusion of technology innovation." Diss., Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/28094.

Full text
Abstract:
In the first essay, I examine value created through external acquisition of nascent technology innovation. External acquisition of new technology is a growing trend in the innovation process, particularly in high technology industries, as firms complement internal efforts with aggressive acquisition programs. Yet, despite its importance, there is little empirical research on the timing of acquisition decisions in high technology environments. I examine the impact of target age on value created for the buyer. Applying an event study methodology to technology acquisitions in the telecommunications industry from 1995 to 2001, empirical evidence supports acquiring early in the face of uncertainty. The equity markets reward the acquisition of younger companies. In sharp contrast to the first essay, the second essay examines the diffusion of negative innovations. While destruction can be creative, certainly not all destruction is creative. Some is just destruction. I examine two fundamentally different paths to information security compromise: an opportunistic path and a deliberate path. Through a grounded approach using interviews, observations, and secondary data, I advance a model of the information security compromise process. Using one year of alert data from intrusion detection devices, empirical analysis provides evidence that these paths follow two distinct, but interrelated diffusion patterns. Although distinct, I find empirical evidence that these paths both converge and escalate. Beyond the specific findings in the Internet security context, the study leads to a richer understanding of the diffusion of negative technological innovation. In the third essay, I build on the second essay by examining the effectiveness of reward-based mechanisms in restricting the diffusion of negative innovations. Concerns have been raised that reward-based private infomediaries introduce information leakage which decreases social welfare.
Using two years of alert data, I find evidence of their effectiveness despite any leakage which may be occurring. While reward-based disclosures are just as likely to be exploited as non-reward-based disclosures, exploits from reward-based disclosures are less likely to occur in the first week after disclosure. Further, the overall volume of alerts is reduced. This research helps determine the effectiveness of reward mechanisms and provides guidance for security policy makers.
11

Gouvêa, Conrado Porto Lopes 1984. "Implementação em software de criptografia baseada em emparelhamentos para redes de sensores usando o microcontrolador MSP430." [s.n.], 2010. http://repositorio.unicamp.br/jspui/handle/REPOSIP/275806.

Full text
Abstract:
Advisor: Julio César López Hernández. Dissertation (master's)--Universidade Estadual de Campinas, Instituto de Computação, 2010.
Wireless sensor networks have become popular recently and provide many applications. However, the deployment of cryptography in sensor networks is a challenging task, given their limited computational power and resource-constrained nature. This work presents an efficient software implementation, for wireless sensor networks, of two public-key systems: Pairing-Based Cryptography (PBC) and Elliptic Curve Cryptography (ECC). Our implementation targets the MSP430 family of 16-bit microcontrollers, used in sensors such as the Tmote Sky and TelosB. For PBC, we have implemented algorithms for pairing computation on MNT and BN curves over prime fields; for ECC, the signature scheme ECDSA over prime fields at the 80-bit and 128-bit security levels. The main contributions of this work are an in-depth study of bilinear pairing algorithms and new optimizations for prime field arithmetic on the MSP430, which improve the running times of the PBC and ECC cryptosystems on that platform.
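The elliptic-curve arithmetic underlying ECDSA can be sketched on a toy curve. The parameters below are a small textbook example over F_17, not the 160-bit or 256-bit prime fields that the 80-bit and 128-bit security levels require, and this is plain Python rather than MSP430 code.

```python
# Toy sketch of elliptic-curve point arithmetic on y^2 = x^3 + 2x + 2 over
# F_17, a standard textbook curve. Real ECDSA uses the same operations over
# much larger prime fields.

P_MOD, A = 17, 2  # field prime and curve coefficient a

def point_add(p, q):
    """Add two curve points; None represents the point at infinity."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None  # p + (-p) = point at infinity
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    y3 = (lam * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def scalar_mult(k, p):
    """Double-and-add scalar multiplication, the core operation in ECDSA."""
    result = None
    while k > 0:
        if k & 1:
            result = point_add(result, p)
        p = point_add(p, p)
        k >>= 1
    return result

G = (5, 1)  # generator of a subgroup of order 19 on this curve
print(scalar_mult(2, G))   # (6, 3)
print(scalar_mult(19, G))  # order of G, so the point at infinity (None)
```

The prime-field optimizations described in the abstract target exactly the modular multiplications and inversions that dominate `point_add` on a constrained 16-bit platform.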
12

Irwin, Barry Vivian William. "A framework for the application of network telescope sensors in a global IP network." Thesis, Rhodes University, 2011. http://hdl.handle.net/10962/d1004835.

Full text
Abstract:
The use of network telescope systems has become increasingly popular amongst security researchers in recent years. This study provides a framework for the utilisation of this data. The research is based on a primary dataset of 40 million events spanning 50 months, collected using a small (/24) passive network telescope located in African IP space. This research presents a number of differing ways in which the data can be analysed, ranging from low-level protocol-based analysis to higher-level analysis at the geopolitical and network topology level. Anomalous traffic and illustrative anecdotes are explored in detail and highlighted. A discussion relating to observed bogon traffic is also presented. Two novel visualisation tools are presented, which were developed to aid in the analysis of large network telescope datasets. The first is a three-dimensional visualisation tool which allows for live, near-real-time analysis, and the second is a two-dimensional fractal-based plotting scheme which allows for plots of the entire IPv4 address space to be produced and manipulated. Using the techniques and tools developed for the analysis of this dataset, a detailed analysis of traffic recorded as destined for port 445/tcp is presented. This includes the evaluation of traffic surrounding the outbreak of the Conficker worm in November 2008. A number of metrics relating to the description and quantification of network telescope configuration and the resultant traffic captures are described, the use of which it is hoped will facilitate greater and easier collaboration among researchers utilising this network security technology. The research concludes with suggestions relating to other applications of the data and intelligence that can be extracted from network telescopes, and their use as part of an organisation's integrated network security systems.
13

Tang, Jin. "Mobile IPv4 Secure Access to Home Networks." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11536.

Full text
Abstract:
With the fast development of wireless networks and devices, Mobile IP is expected to be used widely so that mobile users can access the Internet anywhere, anytime, without interruption. However, some problems, such as firewall traversal and the use of private IP addresses, restrict the use of Mobile IP. The objective of this thesis is to design original schemes that can enable a mobile node abroad to access its home network as well as the Internet securely, and that can help Mobile IP to be used widely and commercially. Our solutions are secure, efficient, and scalable. They can be implemented and maintained easily. In this thesis, we mainly consider Mobile IPv4, instead of Mobile IPv6. Three research topics are discussed. In each topic, the challenges are investigated and the new solutions are presented. The first research topic solves the firewall traversal problems in Mobile IP. A mobile node cannot access its firewall-protected home network if it fails the authentication by the firewall. We propose that an IPsec tunnel be established between the firewall and the foreign agent for firewall traversal, and that an IPsec transport security association be shared by the mobile node and a correspondent node for end-to-end security. The second topic researches further on firewall traversal problems and investigates the way of establishing security associations among network entities. A new security model and a new key distribution method are developed. With the help of the security model and keys, the firewall and the relevant network entities set up IPsec security associations to achieve firewall traversal. A mobile node from a private home network cannot communicate with other hosts using its private home address when it is visiting a public foreign network. A novel and useful solution is presented in the third research topic. We suggest that the mobile node use its Network Access Identifier (NAI) as its identification and obtain a public home address from its home agent.
In addition, a new tunnel between the mobile node and its home agent is proposed.
14

Ndakunda, Shange-Ishiwa Tangeni. "A mobile toolkit and customised location server for the creation of cross-referencing location-based services." Thesis, Rhodes University, 2013. http://hdl.handle.net/10962/d1013604.

Full text
Abstract:
Although there are several software development kits and application programming interfaces for client-side location-based services development, they mostly involve the creation of self-referencing location-based services. Self-referencing location-based services include services such as geocoding, reverse geocoding, route management, and navigation, which focus on satisfying the location-based requirements of a single mobile device. There is a lack of open-source software development kits for the development of client-side location-based services that are cross-referencing. Cross-referencing location-based services are designed for the sharing of location information amongst different entities on a given network. This project was undertaken to assemble, through incremental prototyping, a client-side Java Micro Edition location-based services software development kit and a Mobicents location server to aid mobile network operators and developers alike in the quick creation of cross-referencing location-based applications, with transport and privacy protection, on Session Initiation Protocol bearer networks. The privacy of the location information is protected using geolocation policies. Developers do not need an understanding of Session Initiation Protocol event signaling specifications or of the XML Configuration Access Protocol to use the tools that we put together. The developed tools are later consolidated using two sample applications, the friend-finder and child-tracker services. Developer guidelines are also provided, to aid in using the provided tools.
15

Chen, Tang-Li. "Designing secure, JAVA based online registration systems to meet peak load performance targets." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2767.

Full text
Abstract:
This project "Designing Secure, Java Based Online Registration Systems to Meet Peak Load Performance Targets" is a simulation of a Web-based exposition management system plus a performance testing procedure to examine this web application.
16

Cowie, Bradley. "An exploratory study of techniques in passive network telescope data analysis." Thesis, Rhodes University, 2013. http://hdl.handle.net/10962/d1002038.

Full text
Abstract:
Careful examination of the composition and concentration of malicious traffic in transit on the channels of the Internet provides network administrators with a means of understanding and predicting damaging attacks directed towards their networks. This allows for action to be taken to mitigate the effect that these attacks have on the performance of their networks and the Internet as a whole, by readying network defences and providing early warning to Internet users. One approach to malicious traffic monitoring that has garnered some success in recent times, as exhibited by the study of fast-spreading Internet worms, involves analysing data obtained from network telescopes. While some research has considered using measures derived from network telescope datasets to study large-scale network incidents such as Code-Red, SQL Slammer and Conficker, there is very little documented discussion on the merits and weaknesses of approaches to analysing network telescope data. This thesis is an introductory study in network telescope analysis and aims to consider the variables associated with the data received by network telescopes and how these variables may be analysed. The core research of this thesis considers both novel and previously explored analysis techniques from the fields of security metrics, baseline analysis, statistical analysis and technical analysis as applied to analysing network telescope datasets. These techniques were evaluated as approaches to recognising unusual behaviour, by observing their ability to identify notable incidents in network telescope datasets.
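A baseline-analysis technique of the kind surveyed above can be sketched as a trailing-window threshold over daily packet counts. The window size, threshold multiplier, and counts below are invented for illustration, not drawn from the thesis's datasets.

```python
# Sketch: flag a day as anomalous when its packet count exceeds the mean of
# a trailing window by k standard deviations (a simple baseline analysis).
import statistics

def anomalies(counts, window=7, k=3.0):
    """Return indices whose count exceeds trailing mean + k * stdev."""
    flagged = []
    for i in range(window, len(counts)):
        base = counts[i - window:i]
        mean, stdev = statistics.mean(base), statistics.pstdev(base)
        if counts[i] > mean + k * max(stdev, 1e-9):  # guard a zero stdev
            flagged.append(i)
    return flagged

# Quiet background radiation, then a worm-outbreak style spike on day 10:
daily = [100, 104, 98, 101, 97, 103, 99, 102, 100, 98, 5000, 4800]
print(anomalies(daily))  # [10]
```

Note that day 11 is not flagged: the spike on day 10 inflates the trailing baseline, which is one of the weaknesses of naive baseline methods that the thesis's comparison of techniques speaks to.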
17

Herbig, Christopher Fred. "Use of OpenSSH support for remote login to a multilevel secure system." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Dec%5FHerbig.pdf.

Full text
Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, Dec. 2004.
Thesis Advisor(s): Cynthia E. Irvine, Thuy D. Nguyen. Includes bibliographical references (p. 201-202). Also available online.
18

Morais, Anderson Nunes Paiva. "Injeção de ataques baseado em modelo para teste de protocolos de segurança." [s.n.], 2009. http://repositorio.unicamp.br/jspui/handle/REPOSIP/275915.

Full text
Abstract:
Advisors: Eliane Martins, Ricardo de Oliveira Anido. Dissertation (master's)--Universidade Estadual de Campinas, Instituto de Computação, 2009.
We present an attack injection approach for security protocol testing. The goal is to uncover protocol vulnerabilities that an attacker can exploit to cause security failures. Our approach uses a fault injector to emulate an attacker that has control over the communication system. Since the success of the tests depends greatly on the attacks injected, we propose a model-based approach for attack generation. The model represents reported known attacks on the protocol under test. From this model, attack scenarios are generated. The scenarios are in a format that is independent of the fault injector used. Using refinements and transformations, the abstract scenario specification can be converted into fault-injector-specific scripts. The approach can be completely supported by tools. We illustrate its use in a case study, a security protocol for mobile devices.
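The model-based generation step can be sketched as path enumeration over a small attack model, with a final rendering pass into injector-specific commands. The protocol states, attack actions, and script syntax below are all hypothetical, not taken from the dissertation's case study.

```python
# Sketch: known attacks encoded as edges of a state machine; abstract attack
# scenarios are enumerated as paths, then rendered for a particular injector.

# Hypothetical attack model: state -> [(attack_action, next_state), ...]
ATTACK_MODEL = {
    "start": [("replay_auth", "authenticated")],
    "authenticated": [("drop_ack", "desync"), ("tamper_payload", "corrupted")],
}

def generate_scenarios(model, state="start", path=()):
    """Enumerate attack scenarios as action sequences through the model."""
    edges = model.get(state, [])
    if not edges:  # terminal state: emit the accumulated scenario
        yield list(path)
        return
    for action, nxt in edges:
        yield from generate_scenarios(model, nxt, path + (action,))

def to_injector_script(scenario):
    """Render one abstract scenario as lines for a (hypothetical) injector."""
    return [f"inject {step}" for step in scenario]

for s in generate_scenarios(ATTACK_MODEL):
    print(to_injector_script(s))
```

The separation between `generate_scenarios` and `to_injector_script` mirrors the abstract-scenario / refinement split in the abstract: the same scenarios could be rendered for a different fault injector by swapping only the rendering function.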
APA, Harvard, Vancouver, ISO, and other styles
19

Grégio, André Ricardo Abed. "Malware Behavior = Comportamento de programas maliciosos." [s.n.], 2012. http://repositorio.unicamp.br/jspui/handle/REPOSIP/261000.

Full text
Abstract:
Orientadores: Mario Jino, Paulo Licio de Geus<br>Tese (doutorado) - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação<br>Made available in DSpace on 2018-08-21T16:40:48Z (GMT). No. of bitstreams: 1 Gregio_AndreRicardoAbed_D.pdf: 5158672 bytes, checksum: 12a24da95543bac78fd3f047f7415314 (MD5) Previous issue date: 2012<br>Resumo: Ataques envolvendo programas maliciosos (malware) são a grande ameaça atual à segurança de sistemas. Assim, a motivação desta tese é estudar o comportamento de malware e como este pode ser utilizado para fins de defesa. O principal mecanismo utilizado para defesa contra malware é o antivírus (AV). Embora seu propósito seja detectar (e remover) programas maliciosos de máquinas infectadas, os resultados desta detecção provêem, para usuários e analistas, informações insuficientes sobre o processo de infecção realizado pelo malware. Além disso, não há um padrão de esquema de nomenclatura para atribuir, de maneira consistente, nomes de identificação para exemplares de malware detectados, tornando difícil a sua classificação. De modo a prover um esquema de nomenclatura para malware e melhorar a qualidade dos resultados produzidos por sistemas de análise dinâmica de malware, propõe-se, nesta tese, uma taxonomia de malware com base nos comportamentos potencialmente perigosos observados durante vários anos de análise de exemplares encontrados em campo. A meta principal desta taxonomia é ser clara, de simples manutenção e extensão, e englobar tipos gerais de malware (worms, bots, spyware). A taxonomia proposta introduz quatro classes e seus respectivos comportamentos de alto nível, os quais representam atividades potencialmente perigosas. Para avaliá-la, foram utilizados mais de 12 mil exemplares únicos de malware pertencentes a diferentes classes (atribuídas por antivírus).
Outras contribuições provenientes desta tese incluem um breve histórico dos programas maliciosos e um levantamento das taxonomias que tratam de tipos específicos de malware; o desenvolvimento de um sistema de análise dinâmica para extrair perfis comportamentais de malware; a especialização da taxonomia para lidar com exemplares de malware que roubam informações (stealers), conhecidos como bankers; a implementação de ferramentas de visualização para interagir com traços de execução de malware e, finalmente, a introdução de uma técnica de agrupamento baseada nos valores escritos por malware na memória e nos registradores<br>Abstract: Attacks involving malicious software (malware) are the major current threats to systems security. The motivation behind this thesis is to study malware behavior and how it can be used for defense purposes. The main mechanism used for defending against malware is the antivirus (AV) tool. Although the purpose of an AV is to detect (and remove) malicious programs from infected machines, this detection usually provides insufficient information for users and analysts regarding the malware infection process. Furthermore, there is no standard naming scheme for consistently labeling detected malware, making the malware classification process harder. To provide a meaningful naming scheme, as well as to improve the quality of results produced by dynamic analysis systems, we propose a malware taxonomy based on potentially dangerous behaviors observed during several years of analysis of malware found in the wild. The main goal of the taxonomy is, in addition to being simple to understand, extend and maintain, to embrace general types of malware (e.g., worms, bots, spyware). Our behavior-centric malware taxonomy introduces four classes and their respective high-level behaviors that represent potentially dangerous activities.
We applied our taxonomy to more than 12 thousand unique malware samples from different classes (assigned by AV scanners) to show that it is useful to better understand malware infections and to aid in malware-related incident response procedures. Other contributions of our work are: a brief history of malware and a survey of taxonomies that address specific malware types; a dynamic analysis system to extract behavioral profiles from malware; specialization of our taxonomy to handle information stealers known as bankers; proposal of visualization tools to interact with malware execution traces and, finally, a clustering technique based on values that malware writes into memory or registers<br>Doutorado<br>Engenharia de Computação<br>Doutor em Engenharia Elétrica
APA, Harvard, Vancouver, ISO, and other styles
20

Stalmans, Etienne Raymond. "DNS traffic based classifiers for the automatic classification of botnet domains." Thesis, Rhodes University, 2014. http://hdl.handle.net/10962/d1007739.

Full text
Abstract:
Networks of maliciously compromised computers, known as botnets, consisting of thousands of hosts, have emerged as a serious threat to Internet security in recent years. These compromised systems, under the control of an operator, are used to steal data, distribute malware and spam, launch phishing attacks and mount Distributed Denial-of-Service (DDoS) attacks. The operators of these botnets use Command and Control (C2) servers to communicate with the members of the botnet and send commands. The communication channels between the C2 nodes and endpoints have employed numerous detection avoidance mechanisms to prevent the shutdown of the C2 servers. Two prevalent detection avoidance techniques used by current botnets are algorithmically generated domain names and DNS Fast-Flux. The use of these mechanisms can, however, be observed and used to create distinct signatures that in turn can be used to detect DNS domains being used for C2 operation. This report details research conducted into the implementation of three classes of classification techniques that exploit these signatures in order to accurately detect botnet traffic. The techniques described make use of the traffic from DNS query responses created when members of a botnet try to contact the C2 servers. Traffic observation and categorisation is passive from the perspective of the communicating nodes. The first set of classifiers explored employ frequency analysis to detect the algorithmically generated domain names used by botnets. These were found to have a high degree of accuracy with a low false positive rate. The characteristics of Fast-Flux domains are used in the second set of classifiers. It is shown that using these characteristics, Fast-Flux domains can be accurately identified and differentiated from legitimate domains (such as Content Distribution Networks, which exhibit similar behaviour).
The final set of classifiers use spatial autocorrelation to detect Fast-Flux domains based on the geographic distribution of the botnet C2 servers to which the detected domains resolve. It is shown that botnet C2 servers can be detected solely based on their geographic location. This technique is shown to clearly distinguish between malicious and legitimate domains. The implemented classifiers are lightweight and use existing network traffic to detect botnets and thus do not require major architectural changes to the network. The performance impact of implementing classification of DNS traffic is examined and it is shown that the performance impact is at an acceptable level.
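The frequency-analysis idea summarised above can be illustrated in a few lines. This is only a sketch under my own assumptions (a single character-entropy feature and a hand-picked threshold); the thesis's classifiers are built and evaluated on real, labelled DNS traffic:

```python
import math
from collections import Counter

def entropy(label: str) -> float:
    """Shannon entropy (bits/char) of a domain label's character distribution."""
    counts = Counter(label)
    n = len(label)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_generated(domain: str, threshold: float = 3.5) -> bool:
    """Flag a domain whose second-level label has unusually high entropy.

    The threshold is illustrative; a real classifier would be trained on
    labelled data rather than using a fixed cut-off.
    """
    label = domain.lower().split(".")[0]
    return entropy(label) >= threshold

print(looks_generated("google.com"))              # → False (low entropy)
print(looks_generated("xq7zk2vp9mw4hj8trn3y.biz"))  # → True (near-random label)
```

Dictionary-word labels score well below the threshold, while random-looking DGA labels approach the maximum entropy of log2(alphabet size), which is what makes a simple frequency signature workable.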
APA, Harvard, Vancouver, ISO, and other styles
21

Hsiao, Chih-Wen, David Turner, and Keith Ross. "A secure lightweight currency service provider." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2594.

Full text
Abstract:
The main purpose of this project is to build a bank system that offers a friendly and simple interface to let users easily manage their lightweight currencies. The Lightweight Currency Protocol (LCP) was originally proposed to solve the problem of fairness in resource cooperatives. However, there are other possible applications of the protocol, including the control of spam and as a general-purpose medium of exchange for low-value transactions. This project investigates the implementation issues of the LCP, and also investigates LCP bank services that provide a human interface to currency operations.
APA, Harvard, Vancouver, ISO, and other styles
22

Dua, Akshay. "Trust-but-Verify: Guaranteeing the Integrity of User-generated Content in Online Applications." PDXScholar, 2013. https://pdxscholar.library.pdx.edu/open_access_etds/1425.

Full text
Abstract:
Online applications that are open to participation lack reliable methods to establish the integrity of user-generated information. Users may unknowingly own compromised devices, or intentionally publish forged information. In these scenarios, applications need some way to determine the "correctness" of autonomously generated information. Towards that end, this thesis presents a "trust-but-verify" approach that enables open online applications to independently verify the information generated by each participant. In addition to enabling independent verification, our framework allows an application to verify less information from more trustworthy users and verify more information from less trustworthy ones. Thus, an application can trade-off performance for more integrity, or vice versa. We apply the trust-but-verify approach to three different classes of online applications and show how it can enable 1) high-integrity, privacy-preserving, crowd-sourced sensing 2) non-intrusive cheat detection in online games, and 3) effective spam prevention in online messaging applications.
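The trade-off described above — verify less information from more trustworthy users, more from less trustworthy ones — can be sketched as a probabilistic sampling policy. The linear mapping and all names below are my own illustration, not the thesis's actual mechanism:

```python
import random

def verification_probability(trust: float, p_min: float = 0.05) -> float:
    """Map a trust score in [0, 1] to a verification sampling rate.

    Fully untrusted users (trust = 0) are always verified; highly trusted
    users are still spot-checked at rate p_min so that trust stays earned.
    """
    return max(p_min, 1.0 - trust)

def should_verify(trust: float, rng: random.Random) -> bool:
    return rng.random() < verification_probability(trust)

rng = random.Random(42)
for trust in (0.0, 0.5, 0.95):
    rate = sum(should_verify(trust, rng) for _ in range(10_000)) / 10_000
    print(trust, round(rate, 2))
```

Lowering `p_min` trades integrity for performance, which mirrors the tunable trade-off the abstract describes.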
APA, Harvard, Vancouver, ISO, and other styles
23

Eksteen, Lambertus Lochner. "An investigation into source code escrow as a controlling measure for operational risk contained in business critical software." Thesis, Stellenbosch : Stellenbosch University, 2012. http://hdl.handle.net/10019.1/95629.

Full text
Abstract:
Thesis (MBA)--Stellenbosch University, 2012.<br>This research report outlines corporate governance and information technology risk management frameworks and the use of software escrow within a holistic enterprise risk management strategy to maintain business continuity. Available risk mitigation tools and frameworks were analysed, including the use of software escrow as an information technology risk mitigation tool and continuity instrument. The primary research problem relates to how organisations can ensure business continuity through managing the risks surrounding business-critical software applications. Software escrow was identified in the literature review as a risk management tool used to mitigate operational risks residing in the licencing of mission-critical software applications. The primary research question is: “How can source code escrow contribute towards business continuity by limiting risks contained in licensed business critical software applications?” This study found that an escrow agreement ensures an end-user access to licenced mission-critical intellectual property in the event of the owner’s insolvency, acquisition or breach of maintenance agreements and thereby ensures continuity. The following secondary research questions were also answered: “What types of operational risks will be minimised using software escrow?” and “What constitutes an effective source code agreement in South Africa?” The research identified that the main driver for escrow was operational risk of a mission-critical system failure due to maintenance and upgrades not taking place. The reasons identified included insolvency of the software supplier, acquisition of the supplier, loss of key resources (developers) and breach of maintenance or development agreements. The research also identified some limitations to the application of escrow and the reasons for some agreements not being executed.
Key escrow contract quality criteria were identified which ensure an effective agreement under South African law. The following essential quality criteria were found to improve the efficiency of execution of the escrow contract: - Frequency and quality of deposits; - Deposit verification to ensure usability of material post release; and - Well-defined release trigger events to avoid legal disputes regarding what constitutes a release. Case studies highlighted the main risks that drive the creation of escrow agreements and identified limitations to the execution of some escrow agreements. The software end-user operational risks mitigated by the use of escrow included: - Continued use of the software despite vendor bankruptcy; - Reducing the dependency on the supplier for maintenance and support of the software; - Safeguarding critical business processes; and - Return on investment (software implementation, hardware and training of staff). It was concluded that, despite the legal and practical complexities concerned with escrow, it remains the best instrument to ensure continuity when relying on licensed intellectual property used for business-critical functions and processes. Software escrow is therefore a vital component of a well-formulated license agreement to ensure access to mission-critical technology (including all related intellectual property) under pre-defined conditions of release to the end-user (licensee). In the event of a release, the escrow agent gives the end-user access to the deposited source code and related materials for the purposes of business continuity only and in no way affects the ownership rights of the supplier/owner.
APA, Harvard, Vancouver, ISO, and other styles
24

Palma, Salas Marcelo Invert 1982. "Security testing methodology for robustness analysis of Web services by fault injection = Metodologia de testes de segurança para análise de robustez de Web services por injeção de falhas." [s.n.], 2012. http://repositorio.unicamp.br/jspui/handle/REPOSIP/275650.

Full text
Abstract:
Orientador: Eliane Martins<br>Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Computação<br>Made available in DSpace on 2018-08-22T12:17:55Z (GMT). No. of bitstreams: 1 PalmaSalas_MarceloInvert_M.pdf: 1721846 bytes, checksum: a5c6a8727658455a92eade9c6c09e948 (MD5) Previous issue date: 2012<br>Resumo: Devido a sua natureza distribuída e aberta, os Web Services geram novos desafios de segurança da informação. Esta tecnologia Web, desenvolvida pela W3C e OASIS, é susceptível a ataques de injeção e negação de serviços. Desta forma, o atacante pode coletar e manipular informação para procurar vulnerabilidades nos serviços. Nesse estudo analisamos o uso do injetor de falhas (IF) WSInject, para emular ataques com testes de segurança nos Web Services. A motivação para o uso de um injetor de falhas, ao invés do uso de vulnerabilities scanners, que são comumente usados na prática para testar a segurança, foi permitir melhor cobertura dos ataques. Em um estudo preliminar, usando um vulnerability scanner não comercial, foi possível determinar: (i) os serviços, bem como seus parâmetros e suas operações que seriam mais interessantes de utilizar durante a injeção de falhas, por terem sido os que apresentaram maior número de vulnerabilidades; (ii) um conjunto de regras para analisar os resultados dos testes de segurança. Esses resultados preliminares serviram de guia para os testes usando o injetor de falhas. As falhas foram injetadas em Web Services reais, sendo que alguns implementaram mecanismos de segurança de acordo com o padrão Web Services Security (WS-Security), como credenciais de segurança (Security Tokens)<br>Abstract: Due to its distributed and open nature, the Web Services give rise to new information security challenges. This technology, standardized by W3C and OASIS, is susceptible to both injection and denial of services (DoS) attacks. 
In this way, the attacker can collect and manipulate information in search of Web Services vulnerabilities. In this study we analyse the use of the WSInject fault injector to emulate attacks through security testing of Web Services. The motivation for using a fault injector, instead of the vulnerability scanners commonly used in practice for security testing, was to enable better coverage of attacks. In a preliminary study, using a non-commercial vulnerability scanner, it was possible to determine: (i) the Web Services to be tested, as well as the parameters and operations most interesting to target during fault injection, these having presented the highest number of vulnerabilities; and (ii) a set of rules to analyze the results of security testing. These preliminary results served as a guide for the tests using the fault injector. The faults were injected into real Web Services, some of which implement security mechanisms in compliance with the Web Services Security (WS-Security) standard, such as Security Tokens<br>Mestrado<br>Ciência da Computação<br>Mestre em Ciência da Computação
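The kind of attack emulation performed by a message-level fault injector such as WSInject can be pictured as intercepting a SOAP request and corrupting one of its parameters. The payloads and helper below are hypothetical illustrations of the idea only; WSInject defines its attacks in its own script language, which is not reproduced here:

```python
import re

# Illustrative attack payloads, not WSInject's own campaign definitions.
INJECTION_PAYLOADS = [
    "' OR '1'='1",                 # SQL injection probe
    "<script>alert(1)</script>",   # XSS probe
    "A" * 10_000,                  # oversized field, a crude robustness probe
]

def inject_into_parameter(soap_message: str, param: str, payload: str) -> str:
    """Replace the text content of <param>...</param> with an attack payload."""
    pattern = re.compile(rf"(<{param}>).*?(</{param}>)", re.DOTALL)
    return pattern.sub(lambda m: m.group(1) + payload + m.group(2), soap_message)

request = ("<soap:Envelope><soap:Body><getUser>"
           "<name>alice</name></getUser></soap:Body></soap:Envelope>")
for payload in INJECTION_PAYLOADS:
    mutated = inject_into_parameter(request, "name", payload)
    # each mutated request would be sent to the service under test here
print(inject_into_parameter(request, "name", INJECTION_PAYLOADS[0]))
```

The service's response (or crash) to each mutated request is then checked against rules like those the preliminary study produced, which is what turns injection into a security test.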
APA, Harvard, Vancouver, ISO, and other styles
25

Johns, Kenneth W. Jr. "Toward managing & automating CyberCIEGE scenario definition file creation." Thesis, Monterey, California. Naval Postgraduate School, 2004. http://hdl.handle.net/10945/1669.

Full text
Abstract:
Approved for public release, distribution is unlimited<br>The CyberCIEGE project seeks to create an alternative to traditional Information Assurance (IA) training and education approaches by developing an interactive, entertaining commercial-grade PC-based computer game/virtual laboratory. CyberCIEGE will provide a robust, flexible and extensible gaming environment where each instance of the game is based on a fully customizable scenario. These scenarios are written in the CyberCIEGE Scenario Definition Language. Unfortunately, the trade-off for flexibility, extensibility and fully customizable scenarios is syntax complexity in the scenario definition language. This thesis addresses this real-world problem by showing that the complexity of scenario definition language syntax can be managed through a software tool. It develops such a tool and further demonstrates that progress can be made toward automating scenario generation.<br>Civilian, Federal Cyber Service Corps, Naval Postgraduate School
APA, Harvard, Vancouver, ISO, and other styles
26

Zhang, Tao. "RADAR: compiler and architecture supported intrusion prevention, detection, analysis and recovery." Diss., Available online, Georgia Institute of Technology, 2006, 2006. http://etd.gatech.edu/theses/available/etd-08042006-122745/.

Full text
Abstract:
Thesis (Ph. D.)--Computing, Georgia Institute of Technology, 2007.<br>Ahamad, Mustaque, Committee Member ; Pande, Santosh, Committee Chair ; Lee, Wenke, Committee Member ; Schwan, Karsten, Committee Member ; Yang, Jun, Committee Member.
APA, Harvard, Vancouver, ISO, and other styles
27

Jia, Hao. "A web application for Medasolution Healthcare Company customer service system." CSUSB ScholarWorks, 2005. https://scholarworks.lib.csusb.edu/etd-project/2612.

Full text
Abstract:
Medasolution is a virtual company designed by the author to handle Medicare insurance business. The web application (which uses ASP.net and SQL Server 2000) facilitates communication between Medasolution and all its clients: members, employers, brokers, and Medicare providers through separate web pages based on their category levels. The program incorporates security so that it follows government privacy rules regarding client information.
APA, Harvard, Vancouver, ISO, and other styles
28

Love, Randall James. "Predictive software design measures." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-06112009-063248/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Dhir, Amit. "Online project management system." CSUSB ScholarWorks, 2005. https://scholarworks.lib.csusb.edu/etd-project/2919.

Full text
Abstract:
The purpose of this project is to design and create a system that can be used by a wide variety of groups who do projects. The system created has been specifically tailored for a medium-level company that has employees in different locations and levels, and also has customers for whom they do projects.
APA, Harvard, Vancouver, ISO, and other styles
30

Lindmark, Fanny, and Hanna Kvist. "Security in software : How software companies work with security during a software development process." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-130964.

Full text
Abstract:
Owing to recent interest in privacy and software security, this study examines how a number of software development companies work during a development process to develop software that is as secure as possible. The study is based on four interviews with four software companies located in Linköping, Sweden. The interviews followed a semi-structured format to make it possible to compare the companies' answers with each other. This structure was chosen to give each company the liberty to express what they valued and thought was important during a software development process. The aim was to analyze how, and whether, these companies work with security when developing software, and to see what differences and similarities could be established. We found differences in the companies' perspectives on security and in their methods of working. Some valued secure software more than others and took more measures to ensure it. We also found similarities in their views on the importance of secure software and on ways to work with it. The differences and similarities were related to the size of the companies, their resources, the types of products they develop, and the types of clients they have.
APA, Harvard, Vancouver, ISO, and other styles
31

Corcalciuc, Horia V. "Taxonomies for software security." Thesis, University of Birmingham, 2014. http://etheses.bham.ac.uk//id/eprint/4844/.

Full text
Abstract:
A recurring problem with software security is that programmers are encouraged to reason about correctness either at the code level or at the design level, while attacks often take place at intermediary layers of abstraction. The code itself may seem correct and secure as long as its functionality has been demonstrated - for example, by showing that some invariant has been maintained. However, from a high-level perspective, one can observe that parallel executing processes can be seen as one single large program consisting of smaller components that work together in order to accomplish a task and that, for the duration of that interaction, several smaller invariants have to be maintained. It is frequently the case that an attacker manages to subvert the behavior of a program when the invariants of intermediate steps can be invalidated. Such invariants become difficult to track, especially when the programmer does not explicitly have security in mind. This thesis explores the mechanisms of interaction between concurrent processes and tries to bring some order to synchronization by studying attack patterns, not only at code level, but also from the perspective of abstract programming concepts.
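The abstract's central point — components that look correct in isolation while an invariant spanning their interaction silently breaks — is classically illustrated by a check-then-act race. The sketch below is my own minimal example, not taken from the thesis:

```python
# Each step of withdraw_racy looks correct on its own, but the invariant
# "balance never goes negative" only holds if the check and the withdrawal
# are atomic. An interleaving (or an attacker widening the race window)
# invalidates the intermediate invariant - the abstraction gap described above.
import threading

balance = 100
lock = threading.Lock()

def withdraw_racy(amount: int) -> None:
    global balance
    if balance >= amount:        # check
        # (a delay injected here widens the race window)
        balance -= amount        # act: not atomic with the check

def withdraw_safe(amount: int) -> None:
    global balance
    with lock:                   # check and act under one lock: invariant kept
        if balance >= amount:
            balance -= amount

threads = [threading.Thread(target=withdraw_safe, args=(60,)) for _ in range(2)]
for t in threads: t.start()
for t in threads: t.join()
assert balance >= 0              # holds with the locked version
print(balance)  # 40: only one of the two 60-unit withdrawals succeeded
```

With `withdraw_racy` and a delay between check and act, both threads can pass the check and drive the balance to -20, even though no single line is "wrong".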
APA, Harvard, Vancouver, ISO, and other styles
32

Mkpong-Ruffin, Idongesit Okon Umphress David A. Hamilton John A. "Quantitative risk assessment model for software security in the design phase of software development." Auburn, Ala., 2009. http://hdl.handle.net/10415/1584.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Tyukala, Mkhululi. "Governing information security using organisational information security profiles." Thesis, Nelson Mandela Metropolitan University, 2007. http://hdl.handle.net/10948/626.

Full text
Abstract:
The corporate scandals of the last few years have changed the face of information security and its governance. Information security has been elevated to the board of director level due to legislation and corporate governance regulations resulting from the scandals. Now boards of directors have corporate responsibility to ensure that the information assets of an organisation are secure. They are forced to embrace information security and make it part of business strategies. The new support from the board of directors gives information security weight and the voice from the top, as well as the financial muscle that other business activities experience. However, as an area that is made up of specialist activities, information security may not easily be comprehended at board level like other business related activities. Yet the board of directors needs to provide oversight of information security. That is, put an information security programme in place to ensure that information is adequately protected. This raises a number of challenges. One of the challenges is: how can information security be understood, and well-informed decisions about it be made, at board level? This dissertation provides a mechanism to present information at board level on how information security is implemented according to the vision of the board of directors. This mechanism is built upon well accepted and documented concepts of information security. The mechanism (termed an Organisational Information Security Profile, or OISP) will assist organisations with the initialisation, monitoring, measuring, reporting and reviewing of information security programmes. Ultimately, the OISP will make it possible to know whether the information security endeavours of the organisation are effective or not. If the information security programme is found to be ineffective, the OISP will facilitate the identification of the areas that are ineffective and of what caused the ineffectiveness.
This dissertation also presents how the effectiveness or ineffectiveness of information security can be presented at board level using well-known visualisation methods. Finally, the contribution, limits and areas that need more investigation are provided.
APA, Harvard, Vancouver, ISO, and other styles
34

Kavuluru, Ramakanth. "ANALYSIS OF SECURITY MEASURES FOR SEQUENCES." UKnowledge, 2009. http://uknowledge.uky.edu/gradschool_diss/735.

Full text
Abstract:
Stream ciphers are private key cryptosystems used for security in communication and data transmission systems. Because they are used to encrypt streams of data, it is necessary for stream ciphers to use primitives that are easy to implement and fast to operate. LFSRs and the recently invented FCSRs are two such primitives, which give rise to certain security measures for the cryptographic strength of sequences, which we refer to as complexity measures henceforth following the convention. The linear (resp. N-adic) complexity of a sequence is the length of the shortest LFSR (resp. FCSR) that can generate the sequence. Due to the availability of shift register synthesis algorithms, sequences used for cryptographic purposes should have high values for these complexity measures. It is also essential that the complexity of these sequences does not decrease when a few symbols are changed. The k-error complexity of a sequence is the smallest value of the complexity of a sequence obtained by altering k or fewer symbols in the given sequence. For a sequence to be considered cryptographically ‘strong’ it should have both high complexity and high error complexity values. An important problem regarding sequence complexity measures is to determine good bounds on a specific complexity measure for a given sequence. In this thesis we derive new nontrivial lower bounds on the k-operation complexity of periodic sequences in both the linear and N-adic cases. Here the operations considered are combinations of insertions, deletions, and substitutions. We show that our bounds are tight and also derive several auxiliary results based on them. A second problem on sequence complexity measures useful in the design and analysis of stream ciphers is to determine the number of sequences with a given fixed (error) complexity value. In this thesis we address this problem for the k-error linear complexity of 2^n-periodic binary sequences. More specifically: 1. We characterize 2^n-periodic binary sequences with fixed 2- or 3-error linear complexity and obtain the counting function for the number of such sequences with fixed k-error linear complexity for k = 2 or 3. 2. We obtain partial results on the number of 2^n-periodic binary sequences with fixed k-error linear complexity when k is the minimum number of changes required to lower the linear complexity.
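As background to the measures defined above: the linear complexity of a finite binary sequence can be computed with the Berlekamp-Massey algorithm over GF(2), and for short periods the k-error linear complexity can be brute-forced by flipping up to k bits. This is an illustrative sketch only; the dissertation's results are analytic characterizations and counting functions, not computations:

```python
from itertools import combinations

def linear_complexity(s):
    """Berlekamp-Massey over GF(2): length of the shortest LFSR generating s."""
    n = len(s)
    c = [0] * n; b = [0] * n
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        # discrepancy between s[i] and the current LFSR's prediction
        d = s[i]
        for j in range(1, L + 1):
            d ^= c[j] & s[i - j]
        if d:
            t = c[:]
            for j in range(n - i + m):      # c(x) += x^(i-m) * b(x)
                c[i - m + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L

def k_error_linear_complexity(s, k):
    """Brute force: smallest linear complexity after changing at most k bits.

    Exponential in k - fine for this short illustration, hopeless at scale,
    which is why analytic results like the dissertation's matter."""
    best = linear_complexity(s)
    for e in range(1, k + 1):
        for idx in combinations(range(len(s)), e):
            t = list(s)
            for i in idx:
                t[i] ^= 1
            best = min(best, linear_complexity(t))
    return best

s = [1, 0, 0, 1, 1, 0, 0, 1]   # one period of an 8-periodic sequence
print(linear_complexity(s), k_error_linear_complexity(s, 2))  # → 3 2
```

Flipping just two well-chosen bits drops the complexity from 3 to 2, which is exactly why a cryptographically strong sequence needs a high k-error complexity, not just a high complexity.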
APA, Harvard, Vancouver, ISO, and other styles
35

Andersson, Martin. "Software Security Testing : A Flexible Architecture for Security Testing." Thesis, Växjö University, School of Mathematics and Systems Engineering, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:vxu:diva-2388.

Full text
Abstract:
This thesis begins by briefly describing a few vulnerability classes that exist in today's software. It then describes how these vulnerabilities can be discovered through dynamic testing; both general testing techniques and existing tools are mentioned. The second half of the thesis presents and evaluates a new flexible architecture. This architecture has the ability to combine different approaches and create a more flexible environment from which testing can be conducted, aiming to reduce maintenance and/or adaptation time for existing tools or frameworks. The architecture consists of a given set of plug-ins that can easily be replaced to adapt tests as needed. We evaluate this architecture by implementing test plug-ins, and use it with a set of test plug-ins to generate a fuzzer targeted at a known vulnerable server.
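The plug-in idea evaluated in this thesis can be pictured as a fixed fuzzing pipeline whose parts are swappable. Everything below (the interface, the names, the toy target) is an invented illustration under my own assumptions, not the thesis's actual design:

```python
import random

class Fuzzer:
    """Fixed pipeline; generator, mutator and target are swappable plug-ins."""
    def __init__(self, generator, mutator, target):
        self.generator, self.mutator, self.target = generator, mutator, target

    def run(self, iterations):
        failures = []
        for _ in range(iterations):
            case = self.mutator(self.generator())
            try:
                self.target(case)
            except Exception as exc:     # a crash/exception is a finding
                failures.append((case, exc))
        return failures

rng = random.Random(1)
gen = lambda: b"GET / HTTP/1.1"          # seed-input plug-in

def flip_one_byte(data: bytes) -> bytes:  # mutator plug-in
    i = rng.randrange(len(data))
    return data[:i] + bytes([data[i] ^ 0xFF]) + data[i + 1:]

def fragile_parser(msg: bytes):           # stand-in for the server under test
    if not msg.startswith(b"GET"):
        raise ValueError("malformed request")

findings = Fuzzer(gen, flip_one_byte, fragile_parser).run(100)
print(len(findings) > 0)   # some mutations corrupt the method token
```

Swapping `flip_one_byte` for a protocol-aware mutator, or `fragile_parser` for a network transport, changes the test without touching the pipeline, which is the maintenance saving the architecture aims for.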
APA, Harvard, Vancouver, ISO, and other styles
36

Baratz, Joshua W. (Joshua William) 1981. "Regions Security Policy (RSP) : applying regions to network security." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/17933.

Full text
Abstract:
Thesis (M. Eng. and S.B.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2004.<br>Includes bibliographical references (p. 51-54).<br>The Regions network architecture is a new look at network organization that groups nodes into regions based on common purposes. This shift from strict network topology groupings of nodes requires a change in security systems. This thesis designs and implements the Regions Security Policy (RSP). RSP allows a unified security policy to be set across a region, fully controlling data as it enters into, exits from, and transits within a region. In doing so, it brings together several existing security solutions so as to provide security comparable to existing systems that is more likely to function correctly.<br>by Joshua W. Baratz.<br>M.Eng. and S.B.
APA, Harvard, Vancouver, ISO, and other styles
37

Fana, Akhona. "An evaluation of security issues in cloud-based file sharing technologies." Thesis, University of Fort Hare, 2015. http://hdl.handle.net/10353/1841.

Full text
Abstract:
Cloud computing is one of the most promising technologies for backup and data storage that provides flexible access to data. Cloud computing plays a vital role in remote backup. Unfortunately, this computing technique has flaws that leave end users uneasy about implementing it effectively. These flaws include factors like lack of integrity, confidentiality and privacy of information. A secure cloud is impossible unless the computer-generated environment is appropriately secured. In any form of technology it is always advisable that security challenges be identified and fixed before the implementation of that particular technology. Primarily, this study focuses on finding security issues in cloud computing, with the objective of uncovering concerns like credential theft and session management in the "Cloud". Issues like HTTP banner disclosure, Bash "ShellShock" injection and password weaknesses were discovered during the stages of study implementation. These challenges may provide information that will permit hackers to manipulate and exploit the cloud environment. To identify credential theft and session management weaknesses in cloud-based file sharing technologies, a mixed-method approach was implemented throughout the course of the study, owing to the nature of the study and the unit of analysis. Penetration tests were performed as the security testing technique. Prevention of, and guidelines for handling, security threats lead to a friendlier and more trustworthy world of technology.
APA, Harvard, Vancouver, ISO, and other styles
38

Mayisela, Simphiwe Hector. "Data-centric security : towards a utopian model for protecting corporate data on mobile devices." Thesis, Rhodes University, 2014. http://hdl.handle.net/10962/d1011094.

Full text
Abstract:
Data-centric security is significant in understanding, assessing and mitigating the various risks and impacts of sharing information outside corporate boundaries. Information generally leaves corporate boundaries through mobile devices. Mobile devices continue to evolve as multi-functional tools for everyday life, surpassing their initial intended use. This added capability and increasingly extensive use of mobile devices does not come without a degree of risk - hence the need to guard and protect information as it exists beyond the corporate boundaries and throughout its lifecycle. Literature on existing models crafted to protect data, rather than the infrastructure in which the data resides, is reviewed. Technologies that organisations have implemented to adopt the data-centric model are studied. A utopian model that takes into account the shortcomings of existing technologies and deficiencies of common theories is proposed. Two sets of qualitative studies are reported; the first is a preliminary online survey to assess the ubiquity of mobile devices and the extent of technology adoption towards implementation of the data-centric model; and the second comprises a focus survey and expert interviews pertaining to technologies that organisations have implemented to adopt the data-centric model. The latter study revealed insufficient data at the time of writing for the results to be statistically significant; however, indicative trends supported the assertions documented in the literature review. The question that this research answers is whether current technology implementations designed to mitigate risks from mobile devices actually address business requirements. This research question, answered through these two sets of qualitative studies, discovered inconsistencies between the technology implementations and business requirements.
The thesis concludes by proposing a realistic model, based on the outcome of the qualitative study, which bridges the gap between the technology implementations and business requirements. Future work that could be conducted in light of the findings and comments from this research is also considered.
APA, Harvard, Vancouver, ISO, and other styles
39

Chu, Man-ying, and 朱文英. "Information security deviant behavior: its typology, measures, and causes." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B48079613.

Full text
Abstract:
Although information security is important to all organizations, little behavioral research has been carried out in this area. Particularly lacking is research on negative forms of behavior involved in information security. The aim of this thesis is to fill this research gap by conducting three related studies on information security deviant behavior (ISDB), which refers to the voluntary behavior of employees within organizations that differs markedly from the information security norms of the organizations and that is normally considered by other employees to be wrong. Prior research work on this topic is insufficient, and the information security deviance concept remains unclear. This thesis explores the topic by considering three fundamental research questions: 1) What is ISDB? 2) How can ISDB be measured? 3) Why do employees commit ISDB? Study I addresses the first question—“What is ISDB?”—by identifying and organizing ISDB using a typology. A four-step method, comprising content analysis, multidimensional scaling, expert judgmental analysis, and empirical testing, is proposed for the development of typologies, which can fulfill the criteria for being a theory. The findings of this study suggest that ISDB can be organized into four ideal types that are interrelated along two dimensions—severity and frequency. Four constructs are identified from this typology. They are resource misuse (“high frequency, high severity” deviance), security carelessness (“high frequency, low severity” deviance), access control deviance (“low frequency, low severity” deviance), and system protection deviance (“low frequency, high severity” deviance). 
Study I not only develops an organized and theoretical framework for systematic research on ISDB and constitutes a critical starting point for the development of measures of the behavior, but also makes an important theoretical contribution by demonstrating the development of a typology, which is a unique form of theory building for an underdeveloped topic. Study II focuses on the second research question—“How can ISDB be measured?”—by developing valid and reliable scales to measure ISDB. My target is to develop scales to measure commonly found types of ISDB using an empirical method. Accordingly, the two “low frequency” types of deviance, access control and system protection deviance, are omitted from consideration. A rigorous measurement development process which includes three surveys and a number of tests is adopted. A four-item scale of resource misuse and a three-item scale of security carelessness are developed. The development of these two scales makes an important contribution to future ISDB research by providing a means to measure two types of information security deviance, thus facilitating the empirical study of ISDB. Study III is aimed at answering the third research question—“Why do employees commit ISDB?”—through construction of a causal model. Rather than consider “intention” as existing behavioral research on information security commonly does, Study III investigates actual behavior and employs resource misuse (“high frequency, high severity” deviance) as the dependent variable. Data from a Web-based survey are analyzed using the partial least squares approach. Considering the dual-process approach in the theory of planned behavior, the findings suggest that resource misuse may be both an intentional type of behavior and an unreasoned action. Perceived behavioral control influences employees’ resource misuse actions via their desires or intentions, whereas attitude toward resource misuse affects these actions via employees’ desires alone. 
Subjective norm is found not to affect employees’ resource misuse via either desires or intentions. In terms of the theoretical contributions, Study III takes steps to consider information security deviance by incorporating the dual-process approach and the theory of planned behavior. In terms of managerial significance, the results of Study III can help managers to better understand why employees commit resource misuse. In conclusion, this thesis provides a number of significant insights into ISDB and useful guidelines for further research on the topic. In addition, the findings of the three studies can help managers to develop better company strategies and policies to reduce internal security threats.
published_or_final_version
Business
Doctoral
Doctor of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
40

LU, WEN-PAI. "SECURITY OF COMMUNICATION IN COMPUTER NETWORKS (KEY MANAGEMENT, VERIFICATION)." Diss., The University of Arizona, 1986. http://hdl.handle.net/10150/183922.

Full text
Abstract:
This dissertation concerns investigations on two of the most important problems in establishing communication security in computer networks: (1) developing a model which precisely describes the mechanism that enforces the security policy and requirements for a secure network, and (2) designing a key management scheme for establishing a secure session for end-to-end encryption between a pair of communicants. The security mechanism attempts to ensure secure flow of information between entities assigned to different security classes in different computer systems attached to a computer communication network. The mechanism also controls the accesses to the network devices by the subjects (users and processes executed on behalf of the users). The communication security problem is formulated by using a mathematical model which precisely describes the security requirements for the network. The model integrates the notions of access control and information flow control to provide a Trusted Network Base (TNB) for the network. The demonstration of security of the network when the security mechanism is designed following the present model is given by using mathematical induction techniques. The problem of designing key management schemes for establishing end-to-end encrypted sessions between source-destination pairs when the source and the destination are on different networks interconnected via Gateways and intermediate networks is examined. In such an internet environment, the key management problem attains a high degree of complexity due to the differences in the key distribution mechanisms used in the constituent networks and the infeasibility of effecting extensive hardware and software changes to the existing networks. A hierarchical approach for key management is presented which utilizes the existing network specific protocols at the lower levels and protocols between Authentication Servers and/or Control Centers of different networks at the higher levels. 
Details of this approach are discussed for specific illustrative scenarios to demonstrate the implementational simplicity. A formal verification of the security of the resulting system is also conducted by an axiomatic procedure utilizing certain combinatory logic principles. This approach is general and can be used for verifying the security of any existing key management scheme.
APA, Harvard, Vancouver, ISO, and other styles
41

Bailey, Carmen F. "Analysis of security solutions in large enterprises." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Jun%5FBailey.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Veeder, Nadine M. "An exploratory study of software development measures across COBOL programs." Thesis, Kansas State University, 1986. http://hdl.handle.net/2097/9977.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Naude, Kevin Alexander. "Assessing program code through static structural similarity." Thesis, Nelson Mandela Metropolitan University, 2007. http://hdl.handle.net/10948/578.

Full text
Abstract:
Learning to write software requires much practice and frequent assessment. Consequently, the use of computers to assist in the assessment of computer programs has been important in supporting large classes at universities. The main approaches to the problem are dynamic analysis (testing student programs for expected output) and static analysis (direct analysis of the program code). The former is very sensitive to all kinds of errors in student programs, while the latter has traditionally only been used to assess quality, and not correctness. This research focusses on the application of static analysis, particularly structural similarity, to marking student programs. Existing traditional measures of similarity are limiting in that they are usually only effective on tree structures. In this regard they do not easily support dependencies in program code. Contemporary measures of structural similarity, such as similarity flooding, usually rely on an internal normalisation of scores. The effect is that the scores only have relative meaning, and cannot be interpreted in isolation, i.e. they are not meaningful for assessment. The SimRank measure is shown to have the same problem, but not because of normalisation. The problem with the SimRank measure arises from the fact that its scores depend on all possible mappings between the children of vertices being compared. The main contribution of this research is a novel graph similarity measure, the Weighted Assignment Similarity measure. It is related to SimRank, but derives propagation scores from only the locally optimal mapping between child vertices. The resulting similarity scores may be regarded as the percentage of mutual coverage between graphs. The measure is proven to converge for all directed acyclic graphs, and an efficient implementation is outlined for this case. Attributes on graph vertices and edges are often used to capture domain-specific information which is not structural in nature.
It has been suggested that these should influence the similarity propagation, but no clear method for doing this has been reported. The second important contribution of this research is a general method for incorporating these local attribute similarities into the larger similarity propagation method. Identifier names are one example of attributes in program graphs. The choice of identifiers in programs is arbitrary as they are purely symbolic. A problem facing any comparison between programs is that they are unlikely to use the same set of identifiers. This problem indicates that a mapping between the identifier sets is required. The third contribution of this research is a method for applying the structural similarity measure in a two-step process to find an optimal identifier mapping. This approach is both novel and valuable as it cleverly reuses the similarity measure as an existing resource. In general, programming assignments allow a large variety of solutions. Assessing student programs through structural similarity is only feasible if the diversity in the solution space can be addressed. This study narrows program diversity through a set of semantics-preserving program transformations that convert programs into a normal form. The application of the Weighted Assignment Similarity measure to marking student programs is investigated, and strong correlations are found with the human marker. It is shown that the most accurate assessment requires that programs not only be compared with a set of good solutions, but rather with a mixed set of programs of varying levels of correctness. This research represents the first documented successful application of structural similarity to the marking of student programs.
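For readers unfamiliar with the propagation style this abstract builds on, a minimal sketch of the classic SimRank recurrence may help as background. It is offered purely for illustration, under the assumption of a small directed graph given as in-neighbour lists; it is not the thesis's Weighted Assignment Similarity measure, and the example graph and names are invented.

```python
# Illustrative sketch only: the classic SimRank recurrence (Jeh & Widom, 2002),
# included to show the "all possible mappings between children" propagation the
# abstract criticises. This is NOT the thesis's Weighted Assignment Similarity
# measure; the graph shape and names are invented for illustration.

def simrank(graph, c=0.8, iterations=10):
    """graph maps each node to the list of its in-neighbours."""
    nodes = list(graph)
    # Initially a node is fully similar only to itself.
    sim = {(a, b): 1.0 if a == b else 0.0 for a in nodes for b in nodes}
    for _ in range(iterations):
        new = {}
        for a in nodes:
            for b in nodes:
                if a == b:
                    new[(a, b)] = 1.0
                    continue
                in_a, in_b = graph[a], graph[b]
                if not in_a or not in_b:
                    new[(a, b)] = 0.0  # no in-neighbours: no evidence of similarity
                    continue
                # Average similarity over *every* pair of in-neighbours --
                # the exhaustive pairing that the Weighted Assignment measure
                # replaces with a locally optimal one-to-one mapping.
                total = sum(sim[(x, y)] for x in in_a for y in in_b)
                new[(a, b)] = c * total / (len(in_a) * len(in_b))
        sim = new
    return sim

# Tiny example: two leaves that share a single parent.
g = {"root": [], "left": ["root"], "right": ["root"]}
scores = simrank(g)
```

Here the two leaves converge to a score of c = 0.8, while `root` scores zero against both leaves; as the abstract notes, such scores are only meaningful relative to one another, which is the limitation the Weighted Assignment Similarity measure aims to overcome.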
APA, Harvard, Vancouver, ISO, and other styles
44

Frid, Jonas. "Security Critical Systems in Software." Thesis, Linköpings universitet, Informationskodning, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-61588.

Full text
Abstract:
Sectra Communications is today developing cryptographic products for high assurance environments with rigorous requirements on separation between encrypted and un-encrypted data. This separation has traditionally been achieved through the use of physically distinct hardware components, leading to larger products which require more power and cost more to produce compared to systems where lower assurance is required. An alternative to hardware separation has emerged thanks to a new class of operating systems based on the "separation kernel" concept, which offers verifiable separation between software components running on the same processor comparable to that of physical separation. The purpose of this thesis was to investigate the feasibility in developing a product based on a separation kernel and which possibilities and problems with security evaluation would arise. In the thesis, a literature study was performed covering publications on the separation kernel from a historical and technical perspective, and the development and current status on the subject of software evaluation. Additionally, a software crypto demonstrator was partly implemented in the separation kernel based Green Hills Integrity operating system. The thesis shows that the separation kernel concept has matured significantly and it is indeed feasible to begin using this class of operating systems within a near future. Aside from the obvious advantages with smaller amounts of hardware, it would give greater flexibility in development and potential for more fine-grained division of functions. On the other hand, it puts new demands on developers and there is also a need for additional research about some evaluation aspects, failure resistance and performance.
APA, Harvard, Vancouver, ISO, and other styles
45

Backman, Lars. "Why is security still an issue? : A study comparing developers’ software security awareness to existing vulnerabilities in software applications." Thesis, Linköpings universitet, Programvara och system, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-153438.

Full text
Abstract:
The need for secure web applications grows ever stronger the more sensitive, personal data makes its way onto the Internet. During the last decade, hackers have stolen enormous amounts of data from high-profile companies and social institutions. In this paper, we answer the question of why security breaches still occur: why do programmers write vulnerable code? To answer this question, we conducted a case study on a smaller software development company. By performing penetration tests, surveys and interviews, we successfully identified several weaknesses in their product and their way of working that could lead to security breaches in their application. We also conducted a security awareness assessment and found multiple contributing factors to why these weaknesses occur. Insufficient knowledge, misplaced trust, and inadequate testing policies are some of the reasons why these vulnerabilities appeared in the studied application.
APA, Harvard, Vancouver, ISO, and other styles
46

Reid, Rayne. "Guidelines for cybersecurity education campaigns." Thesis, Nelson Mandela University, 2017. http://hdl.handle.net/10948/14091.

Full text
Abstract:
In our technology- and information-infused world, cyberspace is an integral part of modern-day society. As the number of active cyberspace users increases, so too do the chances of a cyber threat finding a vulnerable target. All cyber users who are exposed to cyber risks need to be educated about cyber security. Human beings play a key role in the implementation and governing of an entire cybersecurity and cybersafety solution. The effectiveness of any cybersecurity and cybersafety solution in a societal or individual context is dependent on the human beings involved in the process. If these human beings are either unaware or not knowledgeable about their roles in the security solution, they become the weak link in these cybersecurity solutions. It is essential that all users be educated to combat any threats. Children are a particularly vulnerable subgroup within society. They are digital natives and make use of ICT and online services with increasing frequency, but this does not mean they are knowledgeable about or behaving securely in their cyber activities. Children will be exposed to cyberspace throughout their lifetimes. Therefore, cybersecurity and cybersafety should be taught to children as a life-skill. There is a lack of well-known, comprehensive cybersecurity and cybersafety educational campaigns which target school children. Most existing information security and cybersecurity education campaigns limit their scope. Literature reports mainly on education campaigns focused on primary businesses, government agencies and tertiary education institutions. Additionally, most guidance for the design and implementation of security and safety campaigns is for an organisational context, only targets organisational users, and mostly provides high-level design recommendations. This thesis addressed the lack of guidance for designing and implementing cybersecurity and cybersafety educational campaigns suited to school learners as a target audience.
The thesis aimed to offer guidance for designing and implementing education campaigns that educate school learners about cybersecurity and cybersafety. This was done through the implementation of an action research process over a five-year period. The action research process involved cybersecurity and cybersafety educational interventions at multiple schools. A total of 18 actionable guidelines were derived from this research to guide the design and implementation of cybersecurity and cybersafety education campaigns which aim to educate school children.
APA, Harvard, Vancouver, ISO, and other styles
47

Tekbacak, Fatih Tuğlular Tuğkan. "Developing a security mechanism for software agents/." [s.l.]: [s.n.], 2006. http://library.iyte.edu.tr/tezler/master/bilgisayaryazilimi/T000526.pdf.

Full text
Abstract:
Thesis (Master)--İzmir Institute of Technology, İzmir, 2006.
Keywords: Agents, security protocols, software, software development, software security. Includes bibliographical references (leaves 73-76).
APA, Harvard, Vancouver, ISO, and other styles
48

Lububu, Steven. "Perception of employees concerning information security policy compliance : case studies of a European and South African university." Thesis, Cape Peninsula University of Technology, 2018. http://hdl.handle.net/20.500.11838/2802.

Full text
Abstract:
Thesis (MTech (Information Technology))--Cape Peninsula University of Technology, 2018.
This study recognises that, regardless of information security policies, information about institutions continues to be leaked due to the lack of employee compliance. The problem is that information leakages have serious consequences for institutions, especially those that rely on information for their sustainability, functionality and competitiveness. As such, institutions ensure that information about their processes, activities and services is secured, which they do through enforcement of and compliance with policies. The aim of this study is to explore the extent of non-compliance with information security policy in an institution. The study followed an interpretive, qualitative case study approach to understand the meaningful characteristics of actual situations of security breaches in institutions. Qualitative data was collected from two universities, using semi-structured interviews with 17 participants. Two departments were selected: Human Resources and the Administrative office. These two departments were selected based on the following criteria: they both play key roles within an institution, they maintain and improve the university’s policies, and both departments manage and keep confidential university information (Human Resources transacts and keeps employees’ information, whilst the Administrative office manages students’ records). This study used structuration theory as a lens to view and interpret the data. Qualitative content analysis was used to analyse documentation, such as brochures and information obtained from the websites of the case study universities. The documentation was then further used to support the data from the interviews.
The findings revealed some factors that influence non-compliance with information security policy, such as a lack of leadership skills, favouritism, fraud, corruption, insufficient infrastructure, a lack of security education and miscommunication. In the context of this study, these factors have severe consequences for an institution, such as the loss of the institution’s credibility or the institution’s closure. Recommendations for further study are also made.
APA, Harvard, Vancouver, ISO, and other styles
49

Lee, Chun-to Michael, and 李俊圖. "Novel techniques for implementing tamper-resistant software." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B42577494.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Lee, Chun-to Michael. "Novel techniques for implementing tamper-resistant software." Click to view the E-thesis via HKUTO, 2004. http://sunzi.lib.hku.hk/hkuto/record/B42577494.

Full text
APA, Harvard, Vancouver, ISO, and other styles