
Dissertations / Theses on the topic 'Data transmission systems Security measures'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Data transmission systems Security measures.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Tandon, Prateek. "High-performance advanced encryption standard (AES) security co-processor design." Thesis, Available online, Georgia Institute of Technology, 2004:, 2003. http://etd.gatech.edu/theses/available/etd-04082004-180433/unrestricted/tandon%5fprateek%5f200312%5fms.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Beckman, Joseph M. "Legal requirements of secure systems." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9822.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Fernandez, Irma Becerra. "Encryption-based security for public networks : technique and application." FIU Digital Commons, 1994. http://digitalcommons.fiu.edu/etd/3296.

Full text
Abstract:
This dissertation describes the development of a new system whereby the Public Switched Telephone Network (PSTN), which is not secure, can perform like a private network. Integrated Services Digital Network (ISDN) forms a technical platform for other communication technologies, such as frame relay and Switched Multimegabit Data Service (SMDS). This is an original and innovative hardware and software design which can be embedded into the ISDN Customer Premises Equipment (CPE) to privatize the public ISDN without the need to upgrade the existing switching equipment. This research incorporates the original design and development of the following hardware and software modules to provide real-time encryption and decryption of images and data in the ISDN medium: 1. an ISDN Communications Module with customized Caller-ID access; 2. a Token Access Control Module for secure log-in; and 3. a Hybrid Cryptographic Module, using a public key for key management and authentication and a private key for privacy. This cryptographic module, the Security Extension Module to the Terminal Adapter (SEMTA), was implemented in software and then optimized in hardware. This work proves that medical images and legal documents can be transmitted through the PSTN without any security breach, guaranteeing the privacy, confidentiality, and authenticity of the data.
APA, Harvard, Vancouver, ISO, and other styles
4

Mayisela, Simphiwe Hector. "Data-centric security : towards a utopian model for protecting corporate data on mobile devices." Thesis, Rhodes University, 2014. http://hdl.handle.net/10962/d1011094.

Full text
Abstract:
Data-centric security is significant in understanding, assessing and mitigating the various risks and impacts of sharing information outside corporate boundaries. Information generally leaves corporate boundaries through mobile devices. Mobile devices continue to evolve as multi-functional tools for everyday life, surpassing their initial intended use. This added capability and increasingly extensive use of mobile devices does not come without a degree of risk, hence the need to guard and protect information as it exists beyond the corporate boundaries and throughout its lifecycle. Literature on existing models crafted to protect data, rather than the infrastructure in which the data resides, is reviewed. Technologies that organisations have implemented to adopt the data-centric model are studied. A utopian model that takes into account the shortcomings of existing technologies and the deficiencies of common theories is proposed. Two sets of qualitative studies are reported: the first is a preliminary online survey to assess the ubiquity of mobile devices and the extent of technology adoption towards implementation of the data-centric model; the second comprises a focus survey and expert interviews pertaining to technologies that organisations have implemented to adopt the data-centric model. The latter study revealed insufficient data at the time of writing for the results to be statistically significant; however, indicative trends supported the assertions documented in the literature review. The question that this research answers is whether current technology implementations designed to mitigate risks from mobile devices actually address business requirements. This research question, answered through these two sets of qualitative studies, uncovered inconsistencies between the technology implementations and business requirements.
The thesis concludes by proposing a realistic model, based on the outcome of the qualitative study, which bridges the gap between the technology implementations and business requirements. Future work which could perhaps be conducted in light of the findings and the comments from this research is also considered.
APA, Harvard, Vancouver, ISO, and other styles
5

Alkhaldi, Rawan. "Spatial data transmission security authentication of spatial data using a new temporal taxonomy /." abstract and full text PDF (free order & download UNR users only), 2005. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1433280.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Lee, Yung-Chuan. "A multi-stage framework on data transmission security for asymmetric systems /." Available to subscribers only, 2005. http://proquest.umi.com/pqdweb?did=1221743771&sid=21&Fmt=2&clientId=1509&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Coertze, Jacques Jacobus. "A framework for information security governance in SMMEs." Thesis, Nelson Mandela Metropolitan University, 2012. http://hdl.handle.net/10948/d1014083.

Full text
Abstract:
It has been found that many small, medium and micro-sized enterprises (SMMEs) do not comply with sound information security governance principles, specifically the principles involved in drafting information security policies and monitoring compliance, mainly as a result of restricted resources and expertise. Research suggests that this problem occurs worldwide and that the impact it has on SMMEs is great. The problem is further compounded by the fact that, in our modern-day information technology environment, many larger organisations are providing SMMEs with access to their networks. This results not only in SMMEs being exposed to security risks, but the larger organisations as well. In previous research an information security management framework and toolbox was developed to assist SMMEs in drafting information security policies. Although this research was of some help to SMMEs, further research has shown that an even greater problem exists with the governance of information security as a result of the advancements that have been identified in information security literature. The aim of this dissertation is therefore to establish an information security governance framework that requires minimal effort and little expertise to alleviate governance problems. It is believed that such a framework would be useful for SMMEs and would result in the improved implementation of information security governance.
APA, Harvard, Vancouver, ISO, and other styles
8

Zhang, Cong, and 張聰. "Design of Anonymity scheme for communication systems." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2002. http://hub.hku.hk/bib/B31228100.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Ma, Chunyan. "Mathematical security models for multi-agent distributed systems." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2568.

Full text
Abstract:
This thesis presents the developed taxonomy of the security threats in agent-based distributed systems. Based on this taxonomy, a set of theories is developed to facilitate analyzing the security threats of mobile-agent systems. We propose the idea of using the developed security risk graph to model the system's vulnerabilities.
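As a rough, hypothetical illustration of the security-risk-graph idea (component names and probabilities below are invented for the example, not taken from the thesis), vulnerabilities can be modelled as a directed graph whose edges carry estimated compromise probabilities; the risk of an attack path is then the product of its edge probabilities, assuming independence:

```python
# Hypothetical sketch of a security risk graph: nodes are components of
# an agent-based distributed system, directed edges carry an estimated
# probability that compromising the source lets an attacker reach the
# target. All names and numbers here are illustrative assumptions.

risk_graph = {
    ("host", "agent_platform"): 0.3,
    ("agent_platform", "mobile_agent"): 0.5,
    ("network", "mobile_agent"): 0.2,
}

def path_risk(path):
    """Probability that every hop on the path is compromised,
    assuming independent edge probabilities; 0.0 if a hop is absent."""
    p = 1.0
    for src, dst in zip(path, path[1:]):
        p *= risk_graph.get((src, dst), 0.0)
    return p

print(path_risk(["host", "agent_platform", "mobile_agent"]))  # 0.15
```

A real risk graph would be derived from a threat taxonomy and vulnerability data rather than hand-assigned probabilities; the sketch only shows the graph traversal.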
APA, Harvard, Vancouver, ISO, and other styles
10

Fragkos, Grigorios. "Near real-time threat assessment using intrusion detection system's data." Thesis, University of South Wales, 2011. https://pure.southwales.ac.uk/en/studentthesis/near-realtime-threat-assessment-using-intrusion-detection-systems-data(96a9528f-f319-4125-aaf0-71593bb61b56).html.

Full text
Abstract:
The concept of Intrusion Detection (ID) and the development of such systems have been a major concern for scientists since the late sixties. In recent computer networks, the use of different types of Intrusion Detection Systems (IDS) is considered essential and in most cases mandatory. Major improvements have been achieved over the years and a large number of different approaches have been developed and applied in the way these systems perform Intrusion Detection. The purpose of the research is to introduce a novel approach that will enable us to take advantage of the vast amounts of information generated by the large number of different IDSs, in order to identify suspicious traffic, malicious intentions and network attacks in an automated manner. In order to achieve this, the research focuses upon a system capable of identifying malicious activity in near real-time, that is, capable of identifying attacks while they are progressing. The thesis addresses near real-time threat assessment by researching current state-of-the-art solutions. Based on the literature review, current Intrusion Detection technologies lean towards event correlation systems using different types of detection techniques. Instead of using linear event signatures or rule sets, the thesis suggests a structured description of network attacks based on an abstracted form of the attacker's activity. For that reason, the design focuses upon the description of network attacks using the development of footprints. Regardless of the level of knowledge, capabilities and resources of the attacker, the system compares occurring network events against predefined footprints in order to identify potential malicious activity. Furthermore, based on the implementation of the footprints, the research also focuses upon the design of the Threat Assessment Engine (TAE), which is capable of performing detection in near real-time by use of the footprints described above.
The outcome of the research proves that it is possible to have an automated process performing threat assessment despite the number of different ongoing attacks taking place simultaneously. The threat assessment process, taking into consideration the system's architecture, is capable of acting as a human analyst would when investigating such network activity. This automation speeds up the time-consuming process of manually analysing and comparing data logs deriving from heterogeneous sources, as it performs the task in near real-time. Effectively, by performing this task in near real-time, the proposed system is capable of detecting complicated malicious activity that would otherwise be difficult, perhaps impossible, to detect, or would be detected too late.
APA, Harvard, Vancouver, ISO, and other styles
11

Simpson, Leonie Ruth. "Divide and conquer attacks on shift register based stream ciphers." Thesis, Queensland University of Technology, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
12

Thomson, Kerry-Lynn. "MISSTEV : model for information security shared tacit espoused values." Thesis, Nelson Mandela Metropolitan University, 2007. http://hdl.handle.net/10948/717.

Full text
Abstract:
One of the most critical assets in most organisations is information. It is often described as the lifeblood of an organisation. For this reason, it is vital that this asset is protected through sound information security practices. However, the incorrect and indifferent behaviour of employees often leads to information assets becoming vulnerable. Incorrect employee behaviour could have an extremely negative impact on the protection of information. An information security solution should be a fundamental component in most organisations. It is, however, possible for an organisation to have the most comprehensive physical and technical information security controls in place, but the operational controls, and associated employee behaviour, have not received much consideration. Therefore, the issue of employee behaviour must be addressed in an organisation to assist in ensuring the protection of information assets. The corporate culture of an organisation is largely responsible for the actions and behaviour of employees. Therefore, to address operational information security controls, the corporate culture of an organisation should be considered. To ensure the integration of information security into the corporate culture of an organisation, the protection of information should become part of the way the employees conduct their everyday tasks – from senior management, right throughout the entire organisation. Therefore, information security should become an integral component of the corporate culture of the organisation. To address the integration of information security into the corporate culture of an organisation, a model was developed which depicted the learning stages and modes of knowledge creation necessary to transform the corporate culture into one that is information security aware.
APA, Harvard, Vancouver, ISO, and other styles
13

Wang, Ke, and 黃岢. "Designing authentication scheme for wireless sensor networks." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2009. http://hub.hku.hk/bib/B42841732.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Thomson, Steven Michael. "A standards-based security model for health information systems." Thesis, Nelson Mandela Metropolitan University, 2008. http://hdl.handle.net/10948/718.

Full text
Abstract:
In the healthcare environment, various types of patient information are stored in electronic format. This prevents the re-entering of information that was captured previously. In the past this information was stored on paper and kept in large filing cabinets. However, with the technology advancements that have occurred over the years, the idea of storing patient information in electronic systems arose. This led to a number of electronic health information systems being created, which in turn led to an increase in possible security risks. Any organization that stores information of a sensitive nature must apply information security principles in order to ensure that the stored information is kept secure. At a basic level, this entails ensuring the confidentiality, integrity and availability of the information, which is not an easy feat in today’s distributed and networked environments. This paved the way for organized standardization activities in the areas of information security and information security management. Throughout history, there have been practices that were created to help “standardize” industries of all areas, to the extent that there are professional organizations whose main objective it is to create such standards to help connect industries all over the world. This applies equally to the healthcare environment, where standardization took off in the late eighties. Healthcare organizations must follow standardized security measures to ensure that patient information stored in health information systems is kept secure. However, the proliferation in standards makes it difficult to understand, adopt and deploy these standards in a coherent manner. This research, therefore, proposes a standards-based security model for health information systems to ensure that such standards are applied in a manner that contributes to securing the healthcare environment as a whole, rather than in a piecemeal fashion.
APA, Harvard, Vancouver, ISO, and other styles
15

Kwok, Lam For. "A methodology of developing a data security model." Thesis, Queensland University of Technology, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
16

Perelson, Stephen. "SoDA : a model for the administration of separation of duty requirements in workflow systems." Thesis, Port Elizabeth Technikon, 2001. http://hdl.handle.net/10948/68.

Full text
Abstract:
The increasing reliance on information technology to support business processes has emphasised the need for information security mechanisms. This, however, has resulted in an ever-increasing workload in terms of security administration. Security administration encompasses the activity of ensuring the correct enforcement of access control within an organisation. Access rights and their allocation are dictated by the security policies within an organisation. As such, security administration can be seen as a policy-based approach. Policy-based approaches promise to lighten the workload of security administrators. Separation of duties is one of the principles cited as a criterion when setting up these policy-based mechanisms. Different types of separation of duty policies exist. They can be categorised into policies that can be enforced at administration time, viz. static separation of duty requirements, and policies that can be enforced only at execution time, viz. dynamic separation of duty requirements. This dissertation deals with the specification of both static and dynamic separation of duty requirements in role-based workflow environments. It proposes a model for the specification of separation of duty requirements, the expressions of which are based on set theory. The model focuses, furthermore, on the enforcement of static separation of duty. The enforcement of static separation of duty requirements is modelled in terms of invariant conditions. The invariant conditions specify restrictions upon the elements allowed in the sets representing access control requirements. The sets are themselves expressed as database tables within a relational database management system. Algorithms that stipulate how to verify additions or deletions of elements within these sets can then be executed within the database management system. A prototype was developed in order to demonstrate the concepts of this model.
This prototype demonstrates how the proposed model could function and shows its effectiveness.
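The invariant-based enforcement of static separation of duty described above can be sketched with plain set operations. The following is a minimal illustration using in-memory sets in place of database tables; the role names and the conflict relation are hypothetical and not taken from the dissertation:

```python
# Hypothetical sketch of a static separation-of-duty (SSoD) invariant:
# no single user may hold two conflicting roles. The invariant is checked
# before every assignment, mirroring an administration-time check.
# Role names and the conflict set are illustrative assumptions.

user_roles = {
    "alice": {"clerk"},
    "bob": {"auditor"},
}

# Unordered pairs of roles that no single user may hold together.
conflicting_roles = {frozenset({"clerk", "auditor"})}

def may_assign(user, role):
    """Return True only if adding `role` keeps the SSoD invariant."""
    proposed = user_roles.get(user, set()) | {role}
    return all(not pair <= proposed for pair in conflicting_roles)

def assign(user, role):
    if not may_assign(user, role):
        raise ValueError(f"SSoD violation: {user} cannot hold {role}")
    user_roles.setdefault(user, set()).add(role)

assign("alice", "clerk")               # consistent: role already held
print(may_assign("alice", "auditor"))  # False: clerk + auditor conflict
```

In the dissertation's setting the sets would live in relational tables and the verification algorithms would run inside the database management system; the sketch only shows the invariant check itself.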
APA, Harvard, Vancouver, ISO, and other styles
17

Miklau, Gerome. "Confidentiality and integrity in distributed data exchange /." Thesis, Connect to this title online; UW restricted, 2005. http://hdl.handle.net/1773/7011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Chan, Yuk-wah Eliza, and 陳玉華. "A review of catastrophe planning for management information systems inHong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1989. http://hub.hku.hk/bib/B3126427X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Ward, Michael P. "An architectural framework for describing Supervisory Control and Data Acquisition (SCADA) systems." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Sep%5FWard.pdf.

Full text
Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, Sept. 2004.
Thesis Advisor(s): Cynthia E. Irvine, Deborah S. Shifflett. Includes bibliographical references (p. 73-75). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
20

Gonzalez-Nieto, Juan Manuel. "Key recovery systems." Thesis, Queensland University of Technology, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
21

Li, Xiao-Yu. "Evolving a secure grid-enabled, distributed data warehouse : a standards-based perspective." Thesis, Nelson Mandela Metropolitan University, 2007. http://hdl.handle.net/10948/544.

Full text
Abstract:
As digital data-collection has increased in scale and number, it has become an important type of resource serving a wide community of researchers. Cross-institutional data-sharing and collaboration introduce a suitable approach to facilitate those research institutions that suffer from a lack of data and related IT infrastructure. Grid computing has become a widely adopted approach to enable cross-institutional resource-sharing and collaboration. It integrates a distributed and heterogeneous collection of locally managed users and resources. This project proposes a distributed data warehouse system which uses Grid technology to enable data-access and integration, and collaborative operations across multiple distributed institutions, in the context of HIV/AIDS research. This study is based on wider research into an OGSA-based Grid services architecture, comprising a data-analysis system which utilizes a data warehouse, data marts, and a near-line operational database hosted by distributed institutions. Within this framework, specific patterns for collaboration, interoperability, resource virtualization and security are included. The heterogeneous and dynamic nature of the Grid environment introduces a number of security challenges. This study also concerns a set of particular security aspects, including PKI-based authentication, single sign-on, dynamic delegation, and attribute-based authorization. These mechanisms, as supported by the Globus Toolkit's Grid Security Infrastructure, are used to enable interoperability and establish trust relationships between the various security mechanisms and policies within different institutions; manage credentials; and ensure secure interactions.
APA, Harvard, Vancouver, ISO, and other styles
22

Li, Ling Feng. "An image encryption system based on two-dimensional quantum random walks." Thesis, University of Macau, 2018. http://umaclib3.umac.mo/record=b3950660.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Gupta, Gaurav. "Robust digital watermarking of multimedia objects." Phd thesis, Australia : Macquarie University, 2008. http://hdl.handle.net/1959.14/28597.

Full text
Abstract:
Thesis (PhD)--Macquarie University, Division of Information and Communication Sciences, Department of Computing, 2008.
Bibliography: p. 144-153.
Introduction -- Background -- Overview of watermarking -- Natural language watermarking -- Software watermarking -- Semi-blind and reversible database watermarking -- Blind and reversible database watermarking -- Conclusion and future research -- Bibliography.
Digital watermarking has generated significant research and commercial interest in the past decade. The primary factors contributing to this surge are the widespread use of the Internet with improved bandwidth and speed, regional copyright loopholes in terms of legislation, and the seamless distribution of multimedia content due to peer-to-peer file-sharing applications.
Digital watermarking addresses the issue of establishing ownership over multimedia content through embedding a watermark inside the object. Ideally, this watermark should be detectable and/or extractable, survive attacks such as digital reproduction and content-specific manipulations such as re-sizing in the case of images, and be invisible to the end-user so that the quality of the content is not degraded significantly. During detection or extraction, the only requirements should be the secret key and the watermarked multimedia object, and not the original unmarked object or the watermark inserted. Watermarking schemes that facilitate this requirement are categorized as blind. In recent times, reversibility of the watermark has also become an important criterion. This is due to the fact that reversible watermarking schemes can provide security against secondary watermarking attacks by using backtracking algorithms to identify the rightful owner. A watermarking scheme is said to be reversible if the original unmarked object can be regenerated from the watermarked copy and the secret key.
This research covers three multimedia content types: natural language documents, software, and databases; and discusses the current watermarking scenario, challenges, and our contribution to the field. We have designed and implemented a natural language watermarking scheme that uses the redundancies in natural languages. As a result, it is robust against general attacks on text watermarks. It offers additional strength to the scheme by localizing the attack to the modified section and using error correction codes to detect the watermark. Our first contribution in software watermarking is the identification and exploitation of weaknesses in the branch-based software watermarking scheme proposed in [71], and the software watermarking algorithm we present is an improved version of the existing watermarking schemes from [71]. Our scheme survives automated debugging attacks against which the current schemes are vulnerable, and is also secure against other software-specific attacks. We have proposed two database watermarking schemes that are both reversible and therefore resilient against secondary watermarking attacks. The first of these database watermarking schemes is semi-blind and requires the bits modified during the insertion algorithm to detect the watermark. The second scheme is an upgraded version that is blind and therefore does not require anything except a secret key and the watermarked relation. The watermark has an 89% probability of survival even when almost half of the data is manipulated. The watermarked data in this case is extremely useful from the users' perspective, since query results are preserved (i.e., the watermarked data gives the same results for a query as the unmarked data).
The watermarking models we have proposed provide greater security against sophisticated attacks in different domains while providing sufficient watermark-carrying capacity at the same time.
The false-positives are extremely low in all the models, thereby making accidental detection of watermark in a random object almost negligible. Reversibility has been facilitated in the later watermarking algorithms and is a solution to the secondary watermarking attacks. We shall address reversibility as a key issue in our future research, along with robustness, low false-positives and high capacity.
Mode of access: World Wide Web.
xxiv, 156 p. ill. (some col.)
APA, Harvard, Vancouver, ISO, and other styles
24

Gastaud, Gallagher Nicolas Hugh René. "Multi-Gigahertz Encrypted Communication Using Electro-Optical Chaos Cryptography." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/19701.

Full text
Abstract:
Chaotic dynamics are at the center of multiple studies to perfect encrypted communication systems. Indeed, the particular time-evolution nature of chaotic signals constitutes the foundation of their application to secure telecommunications. The pseudo-random signal constitutes the carrier wave for the communication. The information coded on the carrier wave can be extracted with knowledge of the system's dynamic evolution law. This evolution law consists of a second-order delay differential equation in which the various parameters of the physical system setup intervene. The set of precise parameter values forms the key, in a cryptographic sense, of the encrypted transmission. This thesis work presents the implementation of an experimental encryption system using chaos. The optical intensity of the emitter fluctuates chaotically and serves as the carrier wave. A message of small amplitude, hidden inside the fluctuations of the carrier wave, is extracted from the transmitted signal by a properly tuned receiver. The influence of the message modulation format on the communication quality, both in the back-to-back case and after propagation, is investigated numerically.
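For orientation, electro-optic chaos generators of this kind are commonly modelled by a delay equation of roughly the following form; this is a generic textbook form under stated assumptions, not necessarily the exact evolution law used in this dissertation:

```latex
% Generic band-pass electro-optic delay oscillator model (illustrative):
% x(t)        : normalised output intensity
% T           : feedback delay,  \beta : feedback gain,  \phi : offset phase
% \tau,\theta : fast and slow response times of the loop filter
\tau \frac{dx}{dt} + x + \frac{1}{\theta}\int_{t_0}^{t} x(s)\,ds
  = \beta \cos^{2}\!\bigl(x(t-T) + \phi\bigr)
```

Differentiating once yields a second-order delay differential equation, consistent with the description above; the parameter set (here β, φ, T, τ, θ) then plays the role of the cryptographic key.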
APA, Harvard, Vancouver, ISO, and other styles
25

Owen, Morné. "An enterprise information security model for a micro finance company: a case study." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/1151.

Full text
Abstract:
The world has entered the information age. How information is used within an organisation will determine the success or failure of that organisation. This study aims to provide a model that, once implemented, will provide the required protection for information assets. The model is based on ISO 27002, an international security standard. The primary objective is to build a model that will provide a holistic security system specifically for a South African Micro Finance Company (MFC). The secondary objectives focus on the successful implementation of such a model, the uniqueness of the MFC that should be taken into account, and the maintenance of the model once implemented to ensure its ongoing relevance. A questionnaire conducted at the MFC provided insight into the perceived understanding of information security. The questionnaire results were used to ensure the model solution addressed current information security shortcomings within the MFC. This study found that the information security controls in ISO 27002 should be applicable to any industry. The uniqueness for the MFC lies not in the security controls, but rather in the regulations and laws applicable to it.
APA, Harvard, Vancouver, ISO, and other styles
26

Wang, Chengwei. "Monitoring and analysis system for performance troubleshooting in data centers." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/50411.

Full text
Abstract:
It was not long ago. On Christmas Eve 2012, a war of troubleshooting began in Amazon data centers. It started at 12:24 PM with a mistaken deletion of the state data of the Amazon Elastic Load Balancing service (ELB for short), which was not realized at the time. The mistake first led to a local issue in which a small number of ELB service APIs were affected. In about six minutes, it evolved into a critical one in which EC2 customers were significantly affected. One example was that Netflix, which was using hundreds of Amazon ELB services, experienced an extensive streaming service outage: many customers could not watch TV shows or movies on Christmas Eve. It took Amazon engineers 5 hours and 42 minutes to find the root cause, the mistaken deletion, and another 15 hours and 32 minutes to fully recover the ELB service. The war ended at 8:15 AM the next day and brought performance troubleshooting in data centers to the world's attention. As shown in this Amazon ELB case, troubleshooting runtime performance issues is crucial in time-sensitive multi-tier cloud services because of their stringent end-to-end timing requirements, but it is also notoriously difficult and time-consuming. To address the troubleshooting challenge, this dissertation proposes VScope, a flexible monitoring and analysis system for online troubleshooting in data centers. VScope provides primitive operations which data center operators can use to troubleshoot various performance issues. Each operation is essentially a series of monitoring and analysis functions executed on an overlay network. We design a novel software architecture for VScope so that the overlay networks can be generated, executed and terminated automatically, on demand. On the troubleshooting side, we design novel anomaly detection algorithms and implement them in VScope. By running these anomaly detection algorithms in VScope, data center operators are notified when performance anomalies happen.
We also design a graph-based guidance approach, called VFocus, which tracks the interactions among hardware and software components in data centers. VFocus provides primitive operations with which operators can analyze those interactions to find out which components are relevant to a performance issue. VScope's capabilities and performance are evaluated on a testbed with over 1000 virtual machines (VMs). Experimental results show that the VScope runtime negligibly perturbs system and application performance, and requires mere seconds to deploy monitoring and analytics functions on over 1000 nodes. This demonstrates VScope's ability to support fast operation and online queries against a comprehensive set of application- to system/platform-level metrics, and a variety of representative analytics functions. When supporting algorithms with high computational complexity, VScope serves as a 'thin layer' that accounts for no more than 5% of their total latency. Further, by using VFocus, VScope can locate problematic VMs that cannot be found via application-level monitoring alone, and in one of the use cases explored in the dissertation, it operates with over 400% less perturbation than brute-force and most sampling-based approaches. We also validate VFocus with real-world data center traces. The experimental results show that VFocus has a troubleshooting accuracy of 83% on average.
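The kind of windowed metric-stream anomaly detection that such a system hosts can be sketched generically. The following is an illustrative sliding-window z-score detector, not the dissertation's actual algorithm; the class name, window size and threshold are invented for illustration:

```python
from collections import deque

class WindowedAnomalyDetector:
    """Flags a metric sample as anomalous when it deviates from the
    mean of a sliding window by more than `k` standard deviations."""

    def __init__(self, window=30, k=3.0):
        self.samples = deque(maxlen=window)
        self.k = k

    def observe(self, value):
        flagged = False
        if len(self.samples) >= 2:
            mean = sum(self.samples) / len(self.samples)
            var = sum((s - mean) ** 2 for s in self.samples) / (len(self.samples) - 1)
            std = var ** 0.5
            flagged = std > 0 and abs(value - mean) > self.k * std
        self.samples.append(value)   # the window always learns the new sample
        return flagged

detector = WindowedAnomalyDetector(window=10, k=3.0)
# Steady latency readings around 10 ms, then a large spike.
normal = [detector.observe(10 + (i % 3) * 0.1) for i in range(10)]
spike = detector.observe(100.0)    # well outside the window statistics
```

A production detector would of course handle non-stationary baselines and multiple metrics; this sketch only shows the notification decision for a single stream.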
APA, Harvard, Vancouver, ISO, and other styles
27

Gallo, Filho Roberto Alves 1978. "Um framework para desenvolvimento e implementação de sistemas seguros baseados em hardware." [s.n.], 2004. http://repositorio.unicamp.br/jspui/handle/REPOSIP/275683.

Full text
Abstract:
Advisor: Ricardo Dahab.
Doctoral thesis – Universidade Estadual de Campinas, Instituto de Computação
Made available in DSpace on 2018-08-21T17:02:27Z (GMT). No. of bitstreams: 1 GalloFilho_RobertoAlves_D.pdf: 5999506 bytes, checksum: 6ef66e76246dddb7de30593abff60bc5 (MD5) Previous issue date: 2012
Resumo: The conception of secure systems demands holistic, global treatment. The reason is that the mere composition of individually secure components does not guarantee the security of the resulting whole2. Meanwhile, the complexity of information systems grows vigorously with respect, among other things, to: i) the number of constituent components; ii) the number of interactions with other systems; and iii) the diversity in the nature of the components. This constant growth in complexity demands a body of knowledge that is at once multidisciplinary and deep, increasingly difficult to coordinate into a single global vision, whether by an individual or by a development team. In this thesis we propose a framework for the conception, development and deployment of hardware-based systems that is grounded in a single, global vision of security. This vision covers a broad spectrum of requirements, from the physical integrity of the devices to the verification, by the end user, that the system is logically intact. To reach this objective, we present in this thesis the following set of components for our framework: i) a set of considerations for building attack models that capture the particular nature of the adversaries of real secure systems, especially hardware-based ones; ii) a theoretical foundation with concepts and definitions that are important and useful in the construction of secure hardware-based systems; iii) a set of patterns of components and architectures for secure hardware-based systems; iv) a theoretical, logical-probabilistic model for evaluating the security level of architectures and implementations; and v) the application of the framework's elements in the implementation of production systems, with highly significant case studies3. The results related to these components are presented in this thesis as a collection of papers.
2 "Greedy" techniques do not necessarily provide optimal results. Moreover, the presence of secure components is not even fundamental. 3 In terms of social, economic or strategic impact
Abstract: The conception of secure systems requires a global, holistic approach. The reason is that the mere composition of individually secure components does not necessarily imply the security of the resulting system4. Meanwhile, the complexity of information systems has grown vigorously in several dimensions, such as: i) the number of components; ii) the number of interactions with other components; and iii) the diversity in the nature of the components. This continuous growth of complexity requires from designers a deep and broad multidisciplinary knowledge, which is becoming increasingly difficult to coordinate and attain, either by individuals or even by teams. In this thesis we propose a framework for the conception, development, and deployment of secure hardware-based systems that is rooted in a unified and global security vision. Such a vision encompasses a broad spectrum of requirements, from device physical integrity to the verification of the device's logical integrity by humans. In order to attain this objective we present in this thesis the following set of components of our framework: i) a set of considerations for the development of threat models that captures the particular nature of the adversaries of real hardware-based secure systems; ii) a set of theoretical concepts and definitions useful in the design of secure hardware-based systems; iii) a set of design patterns of components and architectures for secure systems; iv) a logical-probabilistic theoretical model for the security evaluation of system architectures and implementations; and v) the application of the elements of our framework in production systems, with highly relevant case studies. Our results related to these components are presented in this thesis as a series of papers which have been published or submitted for publication. 4 Greedy techniques do not inevitably yield optimal results. More than that, the usage of secure components is not even required
Doctorate
Computer Science
Doctor of Computer Science
APA, Harvard, Vancouver, ISO, and other styles
28

Makokoe, Isaac. "Selected antecedents towards the acceptance of m-payment services and the relationship with attitude and future intentions." Thesis, Vaal University of Technology, 2017. http://hdl.handle.net/10352/454.

Full text
Abstract:
M. Tech. (Marketing, Faculty of Management Sciences), Vaal University of Technology
Keywords: Mobile payments, usefulness, ease of use, security, attitude, future intentions. An increased reliance on mobile phones by consumers for making retail purchases has been witnessed over the years. Given the pervasive use of m-payments and the incessant diffusion of innovations in South Africa, it is important for marketers to know the right set of factors that enhance consumers' intent to favour m-payments in future encounters. This study draws on Davis's (1989) Technology Acceptance Model (TAM). Whereas that theory alludes to the influences of both usefulness and ease of use on consumer attitudes and behaviour, this study further amplifies consumer perceptions of security as a salient driver of m-payment acceptance. This is because m-payments involve money-based transactions, and it is therefore important for consumers to have assurance that they operate on a secure platform. The TAM was nominated as the underlying theory in this research owing to its effectiveness when applied during the initial phases of an innovation, to avoid the costly mistake of implementing innovation attributes that do not offer the required set of elements for persuading consumers. The purpose of this study was to test an integrative research model of the antecedents of m-payment acceptance using a South African sample of consumers. A quantitative study comprising a non-probability snowball sample of 474 consumers aged between 18 and 50 years was conducted in 2016, in and around the five major towns of the Southern Gauteng province of South Africa. The structured questionnaire asked respondents to indicate their perceptions regarding the usefulness, ease of use and security of the m-payment platforms they had used. In addition, the questionnaire addressed consumers' attitude evaluations of m-payments in general, as well as their intentions both to use and to recommend m-payments to others in the future.
Initially, descriptive statistics were performed on the data set, including correlation analysis and multicollinearity testing. Subsequently, structural equation modelling was applied by first assessing the measurement model using fit indices, confirmatory factor analysis and statistical accuracy tests of reliability and validity. Specification of the measurement model led to the conclusion that the future-intentions model was a five-factor structure comprising usefulness, ease of use, security, attitude and future intentions. Thereafter, the results of the structural model (Structural Model A) supported the existence of a direct influence of usefulness and security on attitude, while the latter was found to have a direct influence on future intentions. Nevertheless, the relationship between ease of use and attitude was not significant and therefore alternative hypothesis Ha3 could not be supported in this study, leading to the need to specify a subsequent competing model. Under Structural Model B, perceived usefulness is used as both a dependent and an independent variable, since it is predicted by perceived ease of use and in turn predicts attitude towards using and behavioural intention to use simultaneously. The results of Structural Model B led to the decision to accept the competing model as the ultimate model for this research, since the model presents complete evidence of path weights greater than 0.20, interpreted as evidence of significant path outcomes. Insights gained from this study could assist both marketing academics and practitioners to understand the perceptions of consumers towards m-payments. In this regard, if it is determined that conducting m-payment transactions in secure and effort-free environments could enhance the effectiveness of consumers in their jobs and lives in general, then marketers could be in a better position to deliver a worthwhile innovation solution for South African consumers.
APA, Harvard, Vancouver, ISO, and other styles
29

Kativu, Tatenda Kevin. "A framework for the secure consumerisation of mobile, handheld devices in the healthcare institutional context." Thesis, Nelson Mandela Metropolitan University, 2017. http://hdl.handle.net/10948/18630.

Full text
Abstract:
The advances in communication technologies have resulted in a significant shift in workplace culture, of which mobile computing devices are increasingly becoming an integral part. Mobility has several advantages for the organisation; one example is the "always online" workforce, resulting in increased productivity hours. As a result, organisations increasingly provide mobile computing devices to the workforce to enable remote productivity at the organisation's cost. A challenge associated with mobility is that these devices are likely to connect to a variety of networks, some of which may be insecure, and, because of their smaller form factor and perceived value, they are vulnerable to loss and theft, among other information security challenges. Increased mobility has far-reaching benefits for remote and rural communities, particularly in the healthcare domain, where health workers are able to provide services to previously inaccessible populations. The adverse economic and infrastructure environment means that institution-provided devices make up the bulk of the mobile computing devices and, ownership aside, the usage patterns and the susceptibility of information to adversity are similar. It is for this reason that this study focuses on information security on institution-provided devices in a rural healthcare setting. This study falls into the design science paradigm and is guided by the principles of design science proposed by Hevner et al. The research process incorporates literature reviews focusing on health information systems security and on identifying theoretical constructs that support the low-resource-based secure deployment of health information technologies. Thereafter, the artifact is developed and evaluated through an implementation case study and expert reviews. The outcomes from the feedback are integrated into the framework.
APA, Harvard, Vancouver, ISO, and other styles
30

Guerreiro, André Saito 1986. "Capacidade de sigilo e indisponibilidade de sigilo em sistemas MIMOME." [s.n.], 2014. http://repositorio.unicamp.br/jspui/handle/REPOSIP/259132.

Full text
Abstract:
Advisor: Gustavo Fraidenraich
Master's dissertation – Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação
Made available in DSpace on 2018-08-25T15:23:35Z (GMT). No. of bitstreams: 1 Guerreiro_AndreSaito_M.pdf: 2368603 bytes, checksum: 297e17dce61316c0a4184fc3db28066c (MD5) Previous issue date: 2014
Resumo: In this work, we consider the transmission of a confidential message over a wireless channel in which the transmitter, receiver and eavesdropper have multiple antennas. The work is divided in two parts. In the first part we analyse the ergodic secrecy capacity and the secrecy outage probability for the scenarios in which the channel is ergodic and non-ergodic, respectively, both in the presence of stationary Rayleigh-distributed fading and assuming channel state information (CSI) at the receiver and at the eavesdropper. In the ergodic scenario, we derive a new closed-form expression for the ergodic secrecy capacity of systems with channel state information at the transmitter (CSIT) for both the main and the eavesdropper channels, in which the covariance matrix is allowed to vary in time. We also derive a lower bound on the secrecy capacity with CSIT, in which the covariance matrix is fixed during the transmission period. The first expression is restricted to the high signal-to-noise ratio (SNR) limit, n_t transmit antennas, n_r receive antennas (n_r > n_t) and n_e = n_t eavesdropper antennas (an n_t x n_r x n_t setup). The second expression is restricted to the n_t x n_t x n_t antenna setup and equal noise power in the main and eavesdropper channels. In the non-ergodic scenario, we derive a new closed-form expression for the secrecy outage probability in the high-SNR limit, in a 2 x n_r x 2 antenna setup with n_r > 2. We also compute an upper bound on the secrecy outage probability for other antenna setups. In the second part, we consider an active eavesdropper capable of intelligently attacking the channel estimation process. Focusing on transmission systems based on the generalized singular value decomposition (GSVD), different attack techniques are proposed and computer simulations are used to evaluate the efficiency of each of them
Abstract: In this thesis, we consider the transmission of confidential information over a multiple-input multiple-output multiple-eavesdropper (MIMOME) wireless channel. The content is largely divided in two parts. In the first part we analyse the ergodic secrecy capacity and the secrecy outage probability in the ergodic and non-ergodic scenarios respectively, both with stationary Rayleigh-distributed fading channels and channel state information (CSI) at the receiver and eavesdropper. For the ergodic scenario we derive a new closed-form expression for the ergodic secrecy capacity with channel state information at the transmitter (CSIT) of the main and the eavesdropper channels, allowing the covariance matrix to be time-varying. A lower bound on the ergodic secrecy capacity with CSIT, in which the covariance matrix is fixed for the entire transmission period, is also derived. The first expression is restricted to the high-SNR limit, with n_t transmit antennas, n_r receive antennas (n_r >= n_t) and n_e = n_t eavesdropper antennas (an n_t x n_r x n_t setup). The second expression is restricted to the n_t x n_t x n_t antenna setup and equal noise power at both channels. For the non-ergodic scenario, we derive a new closed-form expression for the secrecy outage probability in the high-SNR limit, in a 2 x n_r x 2 setup with n_r >= 2. We also calculate an upper bound on the secrecy outage probability for other antenna setups. In the second part we consider an eavesdropper which is able to attack the channel sounding process through intelligent jamming. We focus on transmission systems based on the generalized singular value decomposition (GSVD). We propose and analyze, through computer simulations, the efficiency of several attack techniques that intend to disrupt the secret communication between the legitimate users
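The quantities this abstract works with can be stated in their standard information-theoretic form. These are the textbook definitions of secrecy capacity and secrecy outage, not the thesis's derived closed-form expressions:

```latex
% Instantaneous secrecy capacity of a MIMO wiretap channel:
% the gap between the main-channel and eavesdropper-channel
% rates, floored at zero ([x]^+ = max(x, 0)).
C_s = \left[\, \log_2\det\!\left(\mathbf{I} + \mathbf{H}_B \mathbf{Q}\, \mathbf{H}_B^{\dagger}\right)
      - \log_2\det\!\left(\mathbf{I} + \mathbf{H}_E \mathbf{Q}\, \mathbf{H}_E^{\dagger}\right) \right]^{+}

% Secrecy outage probability at a target secrecy rate R_s:
P_{\mathrm{out}}(R_s) = \Pr\left\{\, C_s < R_s \,\right\}
```

Here \(\mathbf{H}_B\) and \(\mathbf{H}_E\) are the main and eavesdropper channel matrices and \(\mathbf{Q}\) is the transmit covariance matrix; whether \(\mathbf{Q}\) may vary in time or is fixed is precisely what distinguishes the two CSIT expressions the abstract describes.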
Master's degree
Telecommunications and Telematics
Master in Electrical Engineering
APA, Harvard, Vancouver, ISO, and other styles
31

Sokolova, Karina. "Bridging the gap between Privacy by Design and mobile systems by patterns." Thesis, Troyes, 2016. http://www.theses.fr/2016TROY0008/document.

Full text
Abstract:
Nowadays, smartphones and tablets generate, receive, store and transfer to servers large quantities of data while offering services to users through mobile applications that are easy to download and install. The large number of sensors integrated into a smartphone allows it to continuously collect very precise information about the user and his environment. This large volume of private and professional data becomes difficult to supervise. The "Privacy by Design" approach, which comprises seven principles, proposes to integrate respect for private data from the design phase of an information-processing system onward. In Europe, the European directive on the protection of private data (Directive 95/46/EC) incorporates notions of "Privacy by Design". The new unified European law (the General Data Protection Regulation) strengthens the protection and respect of private data by taking new technologies into account, and gives the concept of "Privacy by Design" the status of a legal obligation in the world of mobile services and applications. The objective of this thesis is to propose solutions to improve the transparency of the use of personal mobile data, visibility over information systems, consent and security, in order finally to make mobile applications and systems more compliant with "Privacy by (re)Design"
Nowadays, smartphones and smart tablets generate, receive, store and transfer substantial quantities of data, providing services for all possible user needs through easily installable programs, also known as mobile applications. A number of sensors integrated into smartphones allow the devices to collect very precise information about the owner and his environment at any time. This important flow of personal and business data becomes hard to manage. The "Privacy by Design" approach, with its seven privacy principles, states that privacy can be integrated into any system from the software design stage. In Europe, the Data Protection Directive (Directive 95/46/EC) includes "Privacy by Design" principles. The new General Data Protection Regulation enforces privacy protection in the European Union, taking into account modern technologies such as mobile systems and making "Privacy by Design" not only a benefit for users but also a legal obligation for system designers and developers. The goal of this thesis is to propose pattern-oriented solutions to cope with mobile privacy problems, such as lack of transparency, lack of consent, poor security and disregard for purpose limitation, thus giving mobile systems more Privacy by (re)Design
APA, Harvard, Vancouver, ISO, and other styles
32

Chinpanich, Vorapong. "Helpdesk Support Alert System." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2674.

Full text
Abstract:
The goal of this project was to implement the Helpdesk Support Alert System in the Data Center Services (DCS) of California State University, San Bernardino's (CSUSB's) Information Resource and Technology Division (IRT). DCS is responsible for ensuring uninterrupted operation of all CSUSB administrative computing systems. These responsibilities include user support, system maintenance, and system security. The DCS helpdesk cannot be staffed 24 hours a day; this application is designed to alert DCS technicians of emergencies when they are away from the helpdesk. The Helpdesk Support Alert System sends out an automated emergency alert in the form of a short text message to technicians' mobile phones. Technicians respond back to their main office by using the Wireless Application Protocol (WAP) capability of their mobile phones.
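As a rough illustration of the alert-dispatch step described above (not CSUSB's actual code; the function and message fields are hypothetical), an emergency notification has to fit the classic 160-character SMS payload before being sent to a technician's phone:

```python
def format_alert(system, severity, detail, limit=160):
    """Build a short text message (SMS) alert. Classic SMS payloads
    are capped at 160 characters, so the detail is truncated to fit."""
    prefix = f"[{severity}] {system}: "
    room = limit - len(prefix)
    body = detail if len(detail) <= room else detail[:room - 3] + "..."
    return prefix + body

# Hypothetical emergency: an administrative system's nightly job fails.
msg = format_alert(
    "PeopleSoft HR", "CRITICAL",
    "Nightly batch job failed; database listener not responding " * 3)
```

The resulting string would then be handed to whatever SMS gateway the help desk uses; the technician's WAP-based response channel described in the abstract is a separate round trip.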
APA, Harvard, Vancouver, ISO, and other styles
33

Kaido, Rodrigo Tsuneyoshi. "Codificação de rede como alternativa para aumentar a segurança na camada física em smart grids." Universidade Tecnológica Federal do Paraná, 2014. http://repositorio.utfpr.edu.br/jspui/handle/1/817.

Full text
Abstract:
Smart grids represent the future of electrical power networks. These networks must be robust to load fluctuations and must offer intelligent, real-time monitoring and management. To make these demands possible, high-speed, flexible and low-cost data communication is required. Given these characteristics, many authors propose the use of wireless communication systems, which have a lower deployment cost than optical or wired networks, offer flexibility for rapid topology changes, and present no barriers regarding standards and equipment, in contrast, for example, to PLC (Power Line Communications) systems. Due to the broadcast nature of the wireless channel, security in this type of network is one of the most critical points, since an attack of any nature can cause disturbances and blackouts in the electrical grid, or create privacy problems in the situation where passive attackers (eavesdroppers) intercept messages from the network in order to obtain some kind of benefit. This second situation, of passive attacks, is addressed in this work. Besides the traditional cryptographic techniques generally used to increase the security of communication networks, another area that has recently attracted the interest of the scientific community is physical-layer security, which is based on concepts from Shannon's information theory. In this work, network coding techniques are used to increase the physical-layer security of the multiple-access part of a wireless communication network, in which two transmitters have independent information for a common destination, in the presence of an eavesdropper.
Using the outage probability under secrecy constraints as the metric, we show through analytical and numerical results that secrecy can be increased through network coding, when compared with direct transmission and with traditional cooperation techniques.
Smart grids represent the future of electrical power systems. These kinds of networks must be robust to load fluctuations and must have smart monitoring and intelligent management in real time. Based on the aforementioned needs, many authors propose the use of wireless communication systems to meet these demands, due to their efficient tradeoff between low cost and high speed when compared to wired connections such as optical fibres or metallic cables; in addition, they are flexible to topology changes and do not have constraints in terms of standards and devices, the opposite, for example, of PLC (Power Line Communications). Due to the broadcast nature of the wireless medium, security is one of the critical issues in smart grids, since the occurrence of attacks can lead to load fluctuations and blackouts in the electrical system, or generate secrecy problems in the situation where passive eavesdroppers intercept messages in the network aiming to obtain some kind of benefit. This second case, of passive attacks, is addressed in this work. In addition to the classical cryptography strategies commonly used to increase the security of communication systems, another area which has been studied by the scientific community is physical-layer security, which is based on Shannon's information theory. In this work, we use the network coding technique as a tool to increase physical-layer security in a multiple-access wireless network, where two users have independent information to transmit to a common destination, in the presence of an eavesdropper. By using the secrecy outage probability as the metric, we show through theoretical and numerical results that network security can be increased through the use of network coding when compared to direct transmission and to traditional cooperative techniques.
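As a toy illustration of the secrecy benefit attributed to network coding (a minimal sketch, not the thesis's scheme, and the message contents are invented): when the two transmitters' messages are combined by a bitwise XOR, a destination that also received one message on a direct link can recover the other, while an eavesdropper holding only the coded packet learns neither message individually:

```python
def xor_bytes(a, b):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two sources with independent messages for a common destination.
m1 = b"meter reading A"
m2 = b"meter reading B"   # equal length here; a real scheme would pad

# The network-coded transmission combines both messages.
coded = xor_bytes(m1, m2)

# The destination, which also overheard m1 on the direct link,
# recovers m2 from the coded packet. An eavesdropper who captured
# only `coded` cannot separate the two messages.
recovered = xor_bytes(coded, m1)
```

This captures only the algebra; the information-theoretic secrecy analysis in the dissertation additionally accounts for the fading channels and the eavesdropper's received signal quality.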
APA, Harvard, Vancouver, ISO, and other styles
34

Battikh, Dalia. "Sécurité de l’information par stéganographie basée sur les séquences chaotiques." Thesis, Rennes, INSA, 2015. http://www.theses.fr/2015ISAR0013/document.

Full text
Abstract:
Steganography is the art of hiding secret information in a given medium (the cover) so that the resulting medium (the stego) is almost identical to the cover medium. Nowadays, with the globalisation of exchanges (Internet, messaging and e-commerce) relying on diverse media (sound, image, video), modern steganography has grown considerably. In this manuscript, we studied adaptive LSB steganography methods, in the spatial and frequency domains (DCT and DWT), which allow the maximum amount of useful information to be hidden in a cover image, such that the existence of the secret message in the stego image is imperceptible and practically undetectable. The security of the message content, in the case of its detection by an adversary, is not really ensured by the methods proposed in the literature. To resolve this question, we adapted and implemented two known adaptive LSB steganography methods, adding a robust chaotic system allowing a quasi-chaotic insertion of the bits of the secret message. The proposed chaotic system consists of a generator of robust chaotic sequences supplying the dynamic keys of a modified 2-D chaotic Cat map. The universal steganalysis (classification) of the developed steganography methods is studied. On this subject, we used Fisher's linear discriminant analysis as a classifier for the characteristic vectors of Farid, Shi and Wang. This choice is based on the wide variety of tested characteristic vectors, which provide information on the properties of the image before and after message insertion. An analysis of the performance of the three developed steganalysis methods, applied to stego images produced by the two proposed adaptive LSB steganography methods, is carried out.
The evaluation of the classification results is carried out using the parameters: sensitivity, specificity, precision and the Kappa coefficient
Steganography is the art of concealing a secret message in a cover medium such that the resulting medium (the stego) is almost identical to the cover medium. Nowadays, with the globalization of exchanges (Internet, messaging and e-commerce) using diverse media (sound, image, video), modern steganography is widely expanded. In this manuscript, we studied adaptive LSB steganography methods in the spatial domain and the frequency domain (DCT and DWT), allowing the maximum amount of useful information to be hidden in a cover image, such that the existence of the secret message in the stego image is imperceptible and practically undetectable. The security of the message contents, in the case of its detection by an opponent, is not really ensured by the methods proposed in the literature. To solve this question, we adapted and implemented two known adaptive LSB steganography methods, adding a strong chaotic system allowing a quasi-chaotic insertion of the bits of the secret message. The proposed chaotic system consists of a generator of strong chaotic sequences, supplying the dynamic keys of a modified 2-D chaotic Cat map. The universal steganalysis (classification) of the developed steganography methods is studied. On this question, we used Fisher's linear discriminant analysis as a classifier of the characteristic vectors of Farid, Shi and Wang. This choice is based on the wide variety of tested characteristic vectors, which give information about the properties of the image before and after message insertion. An analysis of the performance of the three developed steganalysis methods, applied to the stego images produced by the proposed adaptive steganography methods, is carried out. Performance evaluation of the classification is carried out using the parameters: sensitivity, specificity, precision and the Kappa coefficient
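The idea of driving LSB embedding positions from a chaotic generator can be sketched as follows. This is a minimal illustration using a plain logistic map on a flat list of pixel values; the thesis's actual system uses a robust chaotic-sequence generator keying a modified 2-D Cat map, and all names and parameters below are invented:

```python
def logistic_positions(n_pixels, n_bits, x0=0.7, r=3.99):
    """Derive a pseudo-chaotic, key-dependent sequence of distinct
    pixel indices from a logistic map; (x0, r) play the role of the
    secret key."""
    x, seen, order = x0, set(), []
    while len(order) < n_bits:
        x = r * x * (1.0 - x)           # logistic map iteration
        idx = int(x * n_pixels) % n_pixels
        if idx not in seen:             # keep indices distinct
            seen.add(idx)
            order.append(idx)
    return order

def embed(pixels, message_bits, positions):
    stego = list(pixels)
    for bit, idx in zip(message_bits, positions):
        stego[idx] = (stego[idx] & ~1) | bit   # overwrite the LSB only
    return stego

def extract(stego, positions, n_bits):
    return [stego[idx] & 1 for idx in positions[:n_bits]]

cover = [120, 37, 200, 15, 88, 64, 99, 250, 31, 7, 143, 56]
bits = [1, 0, 1, 1, 0, 1]
pos = logistic_positions(len(cover), len(bits))
stego = embed(cover, bits, pos)
```

Because each pixel changes by at most one grey level, the stego data is visually indistinguishable from the cover, and without the key (x0, r) an observer cannot tell which pixels carry message bits.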
APA, Harvard, Vancouver, ISO, and other styles
35

BARBACENA, Marcell Manfrin. "Impacto da redução de taxa de transmissão de fluxos de vídeos na eficácia de algoritmo para detecção de pessoas." Universidade Federal de Campina Grande, 2014. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/413.

Full text
Abstract:
Submitted by Johnny Rodrigues (johnnyrodrigues@ufcg.edu.br) on 2018-04-18T15:01:39Z No. of bitstreams: 1 MARCELL MANFRIN BARBACENA - DISSERTAÇÃO PPGCC 2014..pdf: 1468565 bytes, checksum: b94d20ffdace21ece654986ffd8fbb63 (MD5)
Made available in DSpace on 2018-04-18T15:01:39Z (GMT). No. of bitstreams: 1 MARCELL MANFRIN BARBACENA - DISSERTAÇÃO PPGCC 2014..pdf: 1468565 bytes, checksum: b94d20ffdace21ece654986ffd8fbb63 (MD5) Previous issue date: 2014
Driven by today's growing demand for security systems to protect individuals and property, much research has focused on the deployment of wide-coverage video surveillance systems. One of the open research problems in the areas of computer vision and computer networks involves the scalability of these systems, mainly due to the growing number of cameras transmitting real-time video for monitoring and processing. In this context, the general objective of this work is to evaluate the impact that reducing the transmission rate of video streams has on the effectiveness of the people-detection algorithms used in intelligent video surveillance systems. Experiments were carried out using high-resolution videos in an outdoor surveillance context and a people-detection algorithm based on histograms of oriented gradients, in which the area under the precision-recall curve was collected as the measure of the algorithm's effectiveness; subsequently, Friedman's statistical test and multiple comparisons with a control were applied to assess the hypotheses raised. The results obtained indicate that the transmission rate can be reduced by more than 70% without reducing the effectiveness of the people-detection algorithm.
Motivated by the growing demand for security systems to protect people and property, several research efforts have focused on the deployment of wide-area video surveillance systems. One open research problem in the areas of computer vision and computer networks involves the scalability of these systems, mainly due to the increasing number of cameras transmitting real-time video for monitoring and processing. In this context, the aim of this study was to evaluate the impact that reducing the transmission data rate of video streams imposes on the effectiveness of the people-detection algorithms used in intelligent video surveillance systems. Following a proposed experimental design, experiments were performed using high-resolution, wide-area, outdoor surveillance video and an algorithm for people detection based on histograms of oriented gradients. As the measure of the effectiveness of the people-detection algorithm, the area under the precision-recall curve was collected, and Friedman's statistical test and multiple comparisons with a control were applied to evaluate the hypotheses. The results indicate that it is possible to reduce the transmission rate by more than 70% without a decrease in the effectiveness of the people-detection algorithm.
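The area-under-the-precision-recall-curve metric used in these experiments can be computed directly from a detector's scored outputs; the following is a minimal sketch (the scores and ground-truth labels below are made up):

```python
def precision_recall_auc(scores, labels):
    """Area under the precision-recall curve for scored detections,
    computed by sweeping the score threshold from high to low and
    integrating precision over recall with the trapezoidal rule."""
    total_pos = sum(labels)
    ranked = sorted(zip(scores, labels), key=lambda p: -p[0])
    tp = fp = 0
    points = [(0.0, 1.0)]          # (recall, precision); precision at
                                   # zero recall taken as 1 by convention
    for _, is_person in ranked:
        tp += is_person
        fp += 1 - is_person
        points.append((tp / total_pos, tp / (tp + fp)))
    auc = 0.0
    for (r0, p0), (r1, p1) in zip(points, points[1:]):
        auc += (r1 - r0) * (p0 + p1) / 2.0
    return auc

# Detector confidences and ground truth (1 = true person, 0 = false alarm).
scores = [0.95, 0.9, 0.8, 0.6, 0.4, 0.2]
labels = [1,    1,   0,   1,   0,   0]
auc = precision_recall_auc(scores, labels)
```

In the study's setting, such an AUC would be computed once per transmission-rate condition, and the resulting per-condition scores fed to the Friedman test.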
APA, Harvard, Vancouver, ISO, and other styles
36

Wong, Walter. "Proposta de implementação de uma arquitetura para a Internet de nova geração." [s.n.], 2007. http://repositorio.unicamp.br/jspui/handle/REPOSIP/259643.

Full text
Abstract:
Orientadores: Mauricio Ferreira Magalhães, Fabio Luciano Verdi
Dissertação (mestrado) - Universidade Estadual de Campinas, Faculdade de Engenharia Eletrica e Computação
Abstract: The original concept of the Internet architecture was based on static and reliable networks. Nowadays, the Internet has become more dynamic and vulnerable to security attacks. The integration of heterogeneous technologies and wireless environments was not foreseen. The current architecture presents some technical barriers to providing these services. One of these problems is the semantic overload of the Internet Protocol (IP): the IP address acts as a locator in the network layer and as an identifier in the transport layer, preventing new features such as mobility and opening security flaws. This work presents an implementation proposal for a next-generation Internet architecture to provide new services naturally integrated with the Internet. The proposal supports mobility, multihoming, security, heterogeneous network integration and legacy applications through the introduction of a new identification layer in the current architecture. This new layer separates identity from location and becomes an option for communication between heterogeneous networks. Additional mechanisms were proposed to support the new functionalities of the architecture, e.g., resolution of names to identifiers, identifier-based routing, location management and a control plane to exchange end-to-end signalling control messages between the components of the architecture. In order to evaluate the proposed architecture, a prototype was implemented and tests were performed considering implementation overhead, the security model, robustness, and support for mobility and legacy applications.
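The identifier/locator split described above can be illustrated with a toy mapping layer (a hypothetical sketch, not Wong's implementation): the identifier stays stable while the locator beneath it changes on mobility events.

```python
import hashlib

class IdentityLayer:
    """Toy identifier-to-locator mapping in the spirit of the
    architecture described above. Identifiers are flat hashes of a
    name; locators (IP addresses) can change underneath without
    breaking identifier-level sessions."""
    def __init__(self):
        self._locators = {}

    @staticmethod
    def identifier(name):
        # Stable, location-independent identifier.
        return hashlib.sha256(name.encode()).hexdigest()[:16]

    def register(self, ident, locator):
        self._locators[ident] = locator

    def resolve(self, ident):
        return self._locators[ident]

layer = IdentityLayer()
host = IdentityLayer.identifier("host.example")
layer.register(host, "192.0.2.10")
layer.register(host, "198.51.100.7")  # mobility: locator changes
print(layer.resolve(host))            # → 198.51.100.7
```

Transport-layer state bound to `host` survives the locator change, which is the property the identification layer is meant to provide.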
Mestrado
Engenharia de Computação
Mestre em Engenharia Elétrica
APA, Harvard, Vancouver, ISO, and other styles
37

Chego, Lloyd. "Whether using encryption in SCADA systems, the services performance requirements are still met in OT IT environment over an MPLS core network?" Thesis, 2016. http://hdl.handle.net/10539/21050.

Full text
Abstract:
A Research Project Abstract submitted in fulfillment of the requirements for Master of Science in Engineering [Electrical]: Telecommunications at the University Of The Witwatersrand, Johannesburg 07 June 2016
Utilities use Supervisory Control and Data Acquisition (SCADA) systems as their industrial control systems. In the past, the architecture of these systems was based on isolation from other networks. With the ever-changing capability requirements now placed on these systems, there is a need to converge with information technology systems, and with industrial networks communicating over packet-switched networks, cyber security concerns arise. This research project examines whether using encryption in an IP/MPLS core network for SCADA in an OT/IT environment affects the services' performance requirements. This was done through an experimental simulation, with the results recorded; the key considerations from the literature are also reviewed. The key research question of this MSc 50/50 mini-thesis is "whether using encryption in SCADA systems, the services performance requirements are still met in OT/IT environment over an MPLS core network?" The project seeks to determine whether SCADA performance requirements are met over an encrypted MPLS/IP core network in an OT/IT environment. The focus is restricted to encryption within the wider cyber security value chain versus SCADA service performance; other aspects of the value chain are out of scope, which suffices for an MSc 50/50 mini-thesis, as covering the whole value chain would require a full MSc thesis. The primary objective is therefore to research and demonstrate that encryption is essential for secure SCADA communication over an MPLS/IP core network. As aforementioned, encryption forms an essential part of the cyber security value chain, which has to achieve the following objectives.
Confidentiality: ensuring that information is disclosed only to authorised parties. Integrity: ensuring that the information has not been altered in any way. Availability: ensuring that the system is not compromised and remains available. These objectives should be met without violating SCADA service performance requirements, which is the aim of the research project.
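One way to frame the research question is as a latency-budget check: measure the per-message delay the cryptographic step adds and compare it against the SCADA service requirement. The sketch below uses HMAC-SHA256 from the Python standard library as a stand-in for the full encryption pipeline, and the 4 ms budget is purely illustrative, not a figure from the thesis:

```python
import hmac
import hashlib
import time

def added_delay_ms(messages, key):
    """Average extra per-message processing time (ms) introduced by
    an HMAC-SHA256 integrity step over a batch of SCADA messages."""
    start = time.perf_counter()
    for m in messages:
        hmac.new(key, m, hashlib.sha256).digest()
    return (time.perf_counter() - start) * 1000 / len(messages)

msgs = [b"ANALOG pt=%d val=230.1" % i for i in range(10000)]
delay = added_delay_ms(msgs, b"shared-secret-key")

# Compare against a per-hop latency budget (value is illustrative).
BUDGET_MS = 4.0
print(delay < BUDGET_MS)
```

The same harness extends naturally to a real cipher suite: swap the HMAC call for the encrypt-then-MAC path under test and keep the budget comparison.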
M T 2016
APA, Harvard, Vancouver, ISO, and other styles
38

"Data security and reliability in cloud backup systems with deduplication." 2012. http://library.cuhk.edu.hk/record=b5549075.

Full text
Abstract:
Cloud storage is an emerging service model that enables individuals and enterprises to outsource the storage of data backups to remote cloud providers at a low cost. This thesis presents methods to ensure the data security and reliability of cloud backup systems.
In the first part of this thesis, we present FadeVersion, a secure cloud backup system that serves as a security layer on top of today's cloud storage services. FadeVersion follows the standard version-controlled backup design, which eliminates the storage of redundant data across different versions of backups. On top of this, FadeVersion applies cryptographic protection to data backups. Specifically, it enables fine-grained assured deletion: cloud clients can assuredly delete particular backup versions or files on the cloud and make them permanently inaccessible to anyone, while other versions that share common data with the deleted versions or files remain unaffected. We implement a proof-of-concept prototype of FadeVersion and conduct an empirical evaluation atop Amazon S3. We show that FadeVersion adds only minimal performance overhead over a traditional cloud backup service that does not support assured deletion.
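The fine-grained assured-deletion idea can be sketched as key management: each file is encrypted under its own key, which is stored only wrapped under a policy key, and destroying the policy key renders the file permanently unreadable. The toy XOR keystream below is an insecure stand-in for real symmetric encryption and is not FadeVersion's actual scheme:

```python
import hashlib
import os

def keystream_xor(key, data):
    """Toy XOR keystream derived from SHA-256 counters
    (illustration only; a real system uses a proper cipher)."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

policy_keys = {"backup-2012": os.urandom(32)}
file_key = os.urandom(32)

# The file key is stored only wrapped under the policy key.
wrapped_key = keystream_xor(policy_keys["backup-2012"], file_key)
ciphertext = keystream_xor(file_key, b"payroll.csv contents")

# Assured deletion: discard the policy key. The wrapped file key,
# and hence the ciphertext, is now permanently unrecoverable, while
# files under other policies are untouched.
del policy_keys["backup-2012"]
```

Because only keys are destroyed, the (now useless) ciphertext can remain on the cloud provider indefinitely, which is what makes deletion "assured" without the provider's cooperation.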
In the second part of this thesis, we present CFTDedup, a distributed proxy system designed to provide storage efficiency via deduplication in cloud storage while ensuring crash fault tolerance among proxies. It synchronizes deduplication metadata among proxies to provide strong consistency, and batches metadata updates to mitigate synchronization overhead. We implement a preliminary prototype of CFTDedup and evaluate its runtime performance in deduplication storage for virtual machine images via testbed experiments. We also discuss several open issues on how to provide reliable, high-performance deduplication storage; our CFTDedup prototype provides a platform to explore such issues.
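The deduplication layer at the heart of CFTDedup can be illustrated with a content-addressed chunk store (a sketch of the general idea, not CFTDedup's distributed protocol):

```python
import hashlib

class ChunkStore:
    """Content-addressed store: identical chunks are written once
    and referenced by their SHA-256 fingerprint."""
    def __init__(self):
        self.chunks = {}  # fingerprint -> chunk bytes

    def put(self, chunk):
        fp = hashlib.sha256(chunk).hexdigest()
        self.chunks.setdefault(fp, chunk)  # dedup: keep first copy
        return fp

    def get(self, fp):
        return self.chunks[fp]

store = ChunkStore()
# A backup is just a recipe: the ordered list of fingerprints.
recipe = [store.put(c) for c in (b"AAAA", b"BBBB", b"AAAA")]
print(len(store.chunks))                         # → 2
print(b"".join(store.get(fp) for fp in recipe))  # → b'AAAABBBBAAAA'
```

The crash-fault-tolerance problem the thesis addresses is keeping the `chunks` index (the deduplication metadata) consistent when several proxies update it concurrently and may fail mid-update.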
Rahumed, Arthur.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2012.
Includes bibliographical references (leaves 47-51).
Abstracts also in Chinese.
Chapter 1 --- Introduction --- p.1
Chapter 1.1 --- Cloud Based Backup and Assured Deletion --- p.1
Chapter 1.2 --- Crash Fault Tolerance for Backup Systems with Deduplication --- p.4
Chapter 1.3 --- Outline of Thesis --- p.6
Chapter 2 --- Background and Related Work --- p.7
Chapter 2.1 --- Deduplication --- p.7
Chapter 2.2 --- Assured Deletion --- p.7
Chapter 2.3 --- Policy Based Assured Deletion --- p.8
Chapter 2.4 --- Convergent Encryption --- p.9
Chapter 2.5 --- Cloud Based Backup Systems --- p.10
Chapter 2.6 --- Fault Tolerant Deduplication Systems --- p.10
Chapter 3 --- Design of FadeVersion --- p.12
Chapter 3.1 --- Threat Model and Assumptions for Fade Version --- p.12
Chapter 3.2 --- Motivation --- p.13
Chapter 3.3 --- Main Idea --- p.14
Chapter 3.4 --- Version Control --- p.14
Chapter 3.5 --- Assured Deletion --- p.16
Chapter 3.6 --- Assured Deletion for Multiple Policies --- p.18
Chapter 3.7 --- Key Management --- p.19
Chapter 4 --- Implementation of FadeVersion --- p.20
Chapter 4.1 --- System Entities --- p.20
Chapter 4.2 --- Metadata Format in FadeVersion --- p.22
Chapter 5 --- Evaluation of FadeVersion --- p.24
Chapter 5.1 --- Setup --- p.24
Chapter 5.2 --- Backup/Restore Time --- p.26
Chapter 5.3 --- Storage Space --- p.28
Chapter 5.4 --- Monetary Cost --- p.29
Chapter 5.5 --- Conclusions --- p.30
Chapter 6 --- CFTDedup Design --- p.31
Chapter 6.1 --- Failure Model --- p.31
Chapter 6.2 --- System Overview --- p.32
Chapter 6.3 --- Distributed Deduplication --- p.33
Chapter 6.4 --- Crash Fault Tolerance --- p.35
Chapter 6.5 --- Implementation --- p.36
Chapter 7 --- Evaluation of CFTDedup --- p.37
Chapter 7.1 --- Setup --- p.37
Chapter 7.2 --- Experiment 1 (Archival) --- p.38
Chapter 7.3 --- Experiment 2 (Restore) --- p.39
Chapter 7.4 --- Experiment 3 (Recovery) --- p.40
Chapter 7.5 --- Summary --- p.41
Chapter 8 --- Future work and Conclusions of CFTDedup --- p.43
Chapter 8.1 --- Future Work --- p.43
Chapter 8.2 --- Conclusions --- p.44
Chapter 9 --- Conclusion --- p.45
Bibliography --- p.47
APA, Harvard, Vancouver, ISO, and other styles
39

Zhao, Weiliang, University of Western Sydney, College of Health and Science, and School of Computing and Mathematics. "Trust in distributed information systems." 2008. http://handle.uws.edu.au:8081/1959.7/35454.

Full text
Abstract:
Trust management is an important issue in the analysis and design of secure information systems. This is especially the case where centrally managed security is not possible. Trust issues arise not only in business functions, but also in technologies used to support these functions. There are a vast number of services and applications that must accommodate appropriate notions of trust. Trust and trust management have become a hot research area. The motivation of this dissertation is to build up a comprehensive trust management approach that covers the analysis/modelling of trust relationships and the development of trust management systems in a consistent manner. A formal model of trust relationship is proposed with a strict mathematical structure that can not only reflect many of the commonly used notions of trust, but also provide a solid basis for a unified taxonomy framework of trust where a range of useful properties of trust relationships can be expressed and compared. A classification of trust relationships is presented. A set of definitions, propositions, and operations are proposed for the properties about scope and diversity of trust relationships, direction and symmetry of trust relationships, and relations of trust relationships. A general methodology for analysis and modelling of trust relationships in distributed information system is presented. The general methodology includes a range of major concerns in the whole lifecycle of trust relationships, and provides practical guidelines for analysis and modelling of trust relationships in the real world. A unified framework for trust management is proposed. Trust request, trust evaluation, and trust consuming are handled in a comprehensive and consistent manner. A variety of trust mechanisms including reputation, credentials, local data, and environment parameters are covered under the same framework. A trust management architecture is devised for facilitating the development of trust management systems.
A trust management system for federated medical services is developed as an implementation example of the proposed trust management architecture. An online booking system is developed to show how a trust management system is employed by applications. A trust management architecture for web services is devised. It can be viewed as an extension of WS-Trust with the ability to integrate the message building blocks supported by web services protocol stack and other trust mechanisms. It provides high level architecture and guidelines for the development and deployment of a trust management layer in web services. Trust management extension of CardSpace identity system is introduced. Major concerns are listed for the analysis and modelling of trust relationships, and development of trust management systems for digital identities.
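The unified handling of several trust mechanisms might be sketched as a weighted combination of evidence (the mechanism names, weights and threshold below are illustrative assumptions, not the dissertation's model):

```python
def evaluate_trust(evidence, weights, threshold=0.6):
    """Combine several trust mechanisms (reputation, credentials,
    local history, environment parameters) into one decision via a
    weighted sum normalised to [0, 1]."""
    total = sum(weights.values())
    score = sum(weights[m] * evidence.get(m, 0.0) for m in weights) / total
    return score, score >= threshold

# Evidence in [0, 1] gathered per mechanism for one trust request.
evidence = {"reputation": 0.8, "credentials": 1.0,
            "local_history": 0.5, "environment": 0.9}
# Relative importance of each mechanism for this relationship.
weights = {"reputation": 2, "credentials": 3,
           "local_history": 2, "environment": 1}

score, trusted = evaluate_trust(evidence, weights)
print(score, trusted)  # → 0.8125 True
```

A framework of this shape lets a trust request, its evaluation, and the consuming application share one vocabulary regardless of which mechanisms supplied the evidence.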
Doctor of Philosophy (PhD)
APA, Harvard, Vancouver, ISO, and other styles
40

"Asymmetric reversible parametric sequences approach to design a multi-key secure multimedia proxy: theory, design and implementation." 2003. http://library.cuhk.edu.hk/record=b5891457.

Full text
Abstract:
Yeung Siu Fung.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2003.
Includes bibliographical references (leaves 52-53).
Abstracts in English and Chinese.
Abstract --- p.ii
Acknowledgement --- p.v
Chapter 1 --- Introduction --- p.1
Chapter 2 --- Multi-Key Encryption Theory --- p.7
Chapter 2.1 --- Reversible Parametric Sequence --- p.7
Chapter 2.2 --- Implementation of ARPSf --- p.11
Chapter 3 --- Multimedia Proxy: Architectures and Protocols --- p.16
Chapter 3.1 --- Operations to Request and Cache Data from the Server --- p.16
Chapter 3.2 --- Operations to Request Cached Data from the Multimedia Proxy --- p.18
Chapter 3.3 --- Encryption Configuration Parameters (ECP) --- p.19
Chapter 4 --- Extension to multi-level proxy --- p.24
Chapter 5 --- Secure Multimedia Library (SML) --- p.27
Chapter 5.1 --- Proxy Pre-fetches and Caches Data --- p.27
Chapter 5.2 --- Client Requests Cached Data From the Proxy --- p.29
Chapter 6 --- Implementation Results --- p.31
Chapter 7 --- Related Work --- p.40
Chapter 8 --- Conclusion --- p.42
Chapter A --- Function Prototypes of Secure Multimedia Library (SML) --- p.44
Chapter A.1 --- CONNECTION AND AUTHENTICATION --- p.44
Chapter A.1.1 --- Create SML Session --- p.44
Chapter A.1.2 --- Public Key Manipulation --- p.44
Chapter A.1.3 --- Authentication --- p.45
Chapter A.1.4 --- Connect and Accept --- p.46
Chapter A.1.5 --- Close Connection --- p.47
Chapter A.2 --- SECURE DATA TRANSMISSION --- p.47
Chapter A.2.1 --- Asymmetric Reversible Parametric Sequence and Encryption Configuration Parameters --- p.47
Chapter A.2.2 --- Bulk Data Encryption and Decryption --- p.48
Chapter A.2.3 --- Entire Data Encryption and Decryption --- p.49
Chapter A.3 --- Secure Proxy Architecture --- p.49
Chapter A.3.1 --- Proxy-Server Connection --- p.49
Chapter A.3.2 --- ARPS and ECP --- p.49
Chapter A.3.3 --- Initial Sever Encryption --- p.50
Chapter A.3.4 --- Proxy Re-Encryption --- p.51
Chapter A.3.5 --- Client Decryption --- p.51
Bibliography --- p.52
APA, Harvard, Vancouver, ISO, and other styles
41

Escobar, Santoro Mauro. "Security and Statistics on Power Grids." Thesis, 2019. https://doi.org/10.7916/d8-987s-6q56.

Full text
Abstract:
Improving the functioning and safety of electrical grids is a topic of great concern, given their magnitude and importance in today's world. In this thesis, we focus on these two subjects. In the first part, we study undetectable cyber-physical attacks on power grids: attacks that involve physical disruptions, including tripping lines and load modifications, together with sensor output alterations. We propose a sophisticated attack model described under the full Alternating Current (AC) power flow equations and show its feasibility on large grids from a test-case library. As counter-measures, we propose different defensive strategies that the network's controller can apply under a suspected cyber attack. These are random, simple and fast procedures that change the voltages across the network and aim to unmask the current status of the system, assuming that the attacker cannot react to their randomness. Secondly, with access to data collected through Phasor Measurement Units (PMUs) by a power utility in the United States, we perform statistical analyses on frequency and voltage time series recorded at a rate of 30 Hz. We focus on intervals of time where the sampled data appears to be in steady-state conditions and, using appropriate signal processing filters, we are able to extract hidden anomalies such as spatio-temporal correlations between sensors and harmonic distortions.
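The steady-state screening step can be approximated with a simple rolling-deviation test over the 30 Hz frequency samples (an illustrative sketch; the thesis applies proper signal processing filters):

```python
def steady_state_windows(series, window, max_std):
    """Return index ranges [i, i + window) whose sample standard
    deviation stays below max_std -- a crude stand-in for
    steady-state screening of PMU frequency data."""
    out = []
    for i in range(len(series) - window + 1):
        w = series[i:i + window]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / window
        if var ** 0.5 < max_std:
            out.append((i, i + window))
    return out

# Hypothetical frequency samples (Hz); the 60.40 spike is a transient.
freq = [60.00, 60.01, 59.99, 60.00, 60.40, 60.00, 60.01]
print(steady_state_windows(freq, 3, 0.05))  # → [(0, 3), (1, 4)]
```

Only the flagged windows would then be passed to the statistical analyses, so transients do not contaminate the correlation and harmonic-distortion estimates.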
APA, Harvard, Vancouver, ISO, and other styles
42

Pather, Maree. "Towards a model for ensuring optimal interoperability between the security systems of trading partners in a business-to-business e-commerce context." Diss., 2002. http://hdl.handle.net/10500/885.

Full text
Abstract:
A vast range of controls/countermeasures exists for implementing security on information systems connected to the Internet. For the practitioner attempting to implement an integrated solution between trading partners operating across the Internet, this has serious implications in respect of interoperability between the security systems of the trading partners. The problem is exacerbated by the range of specification options within each control. This research is an attempt to find a set of relevant controls and specifications towards a framework for ensuring optimal interoperability between trading partners in this context. Since a policy-based, layered approach is advocated, which allows each trading partner to address localized risks independently, no exhaustive risk analysis is attempted. The focus is on infrastructure that is simultaneously optimally secure and provides optimal interoperability. It should also be scalable, allowing for additional security controls to be added whenever deemed necessary.
Computing
M. Sc. (Information Systems)
APA, Harvard, Vancouver, ISO, and other styles
43

Shah, Kavit. "Secure data aggregation protocol for sensor networks." Thesis, 2015. http://hdl.handle.net/1805/6697.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
We propose a secure in-network data aggregation protocol with internal verification, to increase the lifespan of the network by conserving bandwidth. For secure internal distributed operations, we show an algorithm for securely computing the sum of sensor readings in the network. Our algorithm can be generalized to any random tree topology and applied to any combination of mathematical functions. In addition, we present an efficient way of performing statistical analysis for the protocol. Furthermore, we propose a novel, distributed and interactive algorithm to trace the adversary and remove it from the network. Finally, we analyze the bandwidth usage of the protocol and prove its efficiency.
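In-network aggregation of a sum over a tree topology can be sketched as follows, with each node attaching a MAC over its partial result for later verification (a simplified illustration, not the protocol proposed in the thesis):

```python
import hmac
import hashlib

def aggregate(node, keys):
    """Recursively sum a subtree's readings and MAC the partial
    result with the node's key (shared with the base station).
    `node` is (node_id, reading, children)."""
    node_id, reading, children = node
    total = reading + sum(aggregate(c, keys)[0] for c in children)
    tag = hmac.new(keys[node_id], str(total).encode(),
                   hashlib.sha256).hexdigest()
    return total, tag

# Hypothetical four-node sensor tree rooted at node 0.
keys = {i: b"k%d" % i for i in range(4)}
tree = (0, 5, [(1, 3, []), (2, 2, [(3, 7, [])])])
total, tag = aggregate(tree, keys)
print(total)  # → 17
```

Because every forwarder transmits one (sum, tag) pair instead of all raw readings, bandwidth per hop stays constant in the subtree size, which is where the lifespan gain comes from.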
APA, Harvard, Vancouver, ISO, and other styles
44

Kortekaas, Birgit Friederike. "Internet-based electronic payment systems." Diss., 2002. http://hdl.handle.net/10500/858.

Full text
Abstract:
Today, as the traditional payment systems of cash, cheques and credit cards are being supplemented by electronic cheques, electronic credit card-based systems and token-based systems, online security is of utmost importance and one of the main criteria used for evaluating electronic payment systems. Electronic payment systems must guarantee the essential security requirements: confidentiality, privacy, integrity, availability, authentication and non-repudiation, as well as anonymity and trust. This paper compares the various payment systems (both traditional and electronic) available today, mainly according to their security aspects. Secure processing can be accomplished through access controls and detection techniques such as encrypted communication channels, user and/or message authentication, symmetric and asymmetric encryption, digital certificates and firewalls. These security measures, which are outlined in detail in this paper, protect information and payment systems against the security risks that currently threaten the Internet.
Computing
M.Sc. (Information Systems)
APA, Harvard, Vancouver, ISO, and other styles
45

Blauw, Frans Frederik. "Beatrix: a model for multi-modal and fine-grained authentication for online banking." Thesis, 2015. http://hdl.handle.net/10210/13809.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Kgopa, Alfred Thaga. "Information security issues facing internet café users." 2013. http://encore.tut.ac.za/iii/cpro/DigitalItemViewPage.external?sp=1001578.

Full text
Abstract:
M. Tech. Business Information Systems
Although owners of Internet cafés offer the community the freedom of Internet access, they fail to tighten their computer security to safeguard the private information of their customers. This dissertation provides a conceptual framework for improving information security in the Internet café, to help ensure data privacy, data integrity, risk management and information security (IS) behaviour. The study investigated the information security issues that are faced by users of Internet cafés and explored the effects of these issues. The framework shows how users can improve their physical security to reach higher standards of information privacy over the Internet.
APA, Harvard, Vancouver, ISO, and other styles
47

Dreef, Dennis Sebastian. "Secure routing and data aggregation for infrastructureless wireless networks without persistent cryptographic operations." Thesis, 2006. http://hdl.handle.net/1828/2050.

Full text
Abstract:
Nodes in infrastructureless wireless networks usually have only limited energy and pose new security challenges, since traditional cryptographic operations have a high energy cost. In this thesis, new security solutions are presented that avoid costly cryptographic operations. Two security problems are investigated: the first is secure routing; the other is secure data aggregation. For the first problem, a randomized algorithm is proposed to defend against malicious attackers wishing to disrupt routing in wireless ad-hoc networks. For the second problem, a solution is proposed that leverages the broadcast nature of the wireless medium and uses a special aggregation topology, namely a clique tree, for data integrity in wireless sensor networks. With analysis and performance evaluation, both solutions are shown to be lightweight with acceptable security features.
APA, Harvard, Vancouver, ISO, and other styles
48

Lall, Manoj. "Selection of mobile agent systems based on mobility, communication and security aspects." Diss., 2005. http://hdl.handle.net/10500/2397.

Full text
Abstract:
The availability of numerous mobile agent systems with its own strengths and weaknesses poses a problem when deciding on a particular mobile agent system. In this dissertation, factors based on mobility, communication and security of the mobile agent systems are presented and used as a means to address this problem. To facilitate in the process of selection, a grouping scheme of the agent system was proposed. Based on this grouping scheme, mobile agent systems with common properties are grouped together and analyzed against the above-mentioned factors. In addition, an application was developed using the Aglet Software Development Toolkit to demonstrate certain features of agent mobility, communication and security.
Theoretical Computing
M. Sc. (Computer Science)
APA, Harvard, Vancouver, ISO, and other styles
49

Moyo, Moses. "Information security risk management in small-scale organisations: a case study of secondary schools’ computerised information systems." Diss., 2014. http://hdl.handle.net/10500/14611.

Full text
Abstract:
Threats to computerised information systems are always on the rise and compel organisations to invest a lot of money and time, amongst other technical controls, in an attempt to protect their critical information from inherent security risks. The computerisation of information systems in secondary schools has effectively exposed these organisations to a host of complex information security challenges that they have to deal with in addition to their core business of teaching and learning. Secondary schools handle large volumes of sensitive information pertaining to educators, learners, creditors and financial records that they are obliged to secure. Computerised information systems are vulnerable to both internal and external threats, and their ease of access sometimes manifests in security breaches, thereby undermining information security. Unfortunately, school managers and users of computerised information systems are often ignorant of the risks to their information systems assets and the consequences of any compromises that might occur. One way of educating school managers and users about the risks to their computerised information systems is through a risk management programme in which they actively participate. However, secondary schools do not have the full capacity to perform information security risk management exercises, due to the unavailability of risk management experts and scarce financial resources to fund such programmes. This qualitative case study was conducted in two secondary schools that use computerised information systems to support everyday administrative operations. The main objective of this research study was to assist secondary schools that used computerised information systems to develop a set of guidelines they would use to effectively manage information security risks in their computerised information systems. This study educated school managers and computerised information systems users on how to conduct simple risk management exercises.
The Operationally Critical Threat, Asset and Vulnerability Evaluation risk management method for small-scale organisations was used to evaluate the computerised information systems in the two schools and attain the goals of the research study. Data for this study were generated through participatory observation, physical inspections and interview techniques, and were presented, analysed and interpreted qualitatively. This study found that learners' continuous assessment marks, financial information, educators' personal information, custom application software, server computers and the telecommunication equipment used for networking were the critical assets. The main threats to these critical assets were authorised and unauthorised systems users, malware, system crashes, access paths and incompatibilities in software. The risks posed by these threats typically led to the unavailability of critical information systems assets and the compromise of data integrity and confidentiality, which in turn led to lost productivity, financial loss and damage to the schools' reputations. The only protection mechanism enforced by the secondary schools was physical security. To mitigate the identified risks, the study educated school managers and users in selecting, devising and implementing simple protection and mitigation strategies commensurate with their information systems, financial capabilities and levels of skill. This study also recommended that secondary schools remove all critical computers from open-flow school networks, encrypt all critical information, password-protect all computers holding critical information and train all users of information systems in personal security. The study will be instrumental in educating school managers and computerised information systems users in information security awareness and risk management in general.
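The kind of qualitative risk rating produced in such an evaluation can be sketched as a likelihood-impact matrix (the scales, band boundaries and register entries below are illustrative, not the study's data):

```python
def risk_rating(likelihood, impact):
    """Classic qualitative risk matrix: likelihood x impact on a
    1-3 scale, banded into low / medium / high."""
    score = likelihood * impact
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Hypothetical asset-threat pairs with (likelihood, impact) scores.
register = [
    ("malware on admin PC",       3, 3),
    ("server room power failure", 2, 3),
    ("software incompatibility",  2, 1),
]
for asset_threat, l, i in register:
    print(asset_threat, risk_rating(l, i))
```

Even this simple exercise gives school managers a defensible ordering of which protection strategies to fund first.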
Science Engineering and Technology
M.Sc. (Information Systems)
APA, Harvard, Vancouver, ISO, and other styles
50

Sui, Yan. "Design and evaluation of a secure, privacy-preserving and cancelable biometric authentication : Bio-Capsule." Thesis, 2014. http://hdl.handle.net/1805/4985.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
A large portion of system breaches is caused by authentication failure, either during the system login process or even in the post-authentication session; this is further related to the limitations of existing authentication approaches. Current authentication methods, whether proxy based or biometrics based, are hardly user-centric; they either put burdens on users or endanger users' (biometric) security and privacy. In this research, we propose a biometrics based user-centric authentication approach. The main idea is to introduce a reference subject (RS) for each system, securely fuse the user's biometrics with the RS, generate a BioCapsule (BC) from the fused biometrics, and employ BCs for authentication. Such an approach is user-friendly, identity-bearing yet privacy-preserving, resilient, and revocable once a BC is compromised. It also supports "one-click sign-on" across multiple systems by fusing the user's biometrics with a distinct RS on each system. Moreover, active and non-intrusive authentication can be performed automatically during the user's post-authentication online session. In this research, we also formally prove that the proposed secure-fusion-based BC approach is secure against various attacks and compare the new approach with existing biometrics based approaches. Extensive experiments show that the performance (i.e., authentication accuracy) of the new BC approach is comparable to existing typical biometric authentication approaches, while also possessing other desirable features such as diversity and revocability.
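The fusion-and-revocation idea can be illustrated with a toy feature-vector blend (the fusion rule, vectors and weight below are hypothetical; the actual secure fusion algorithm is defined in the thesis):

```python
def bio_capsule(user_features, rs_features, alpha=0.7):
    """Toy fusion sketch: blend the user's biometric feature vector
    with a reference-subject (RS) vector. Revoking a compromised
    BioCapsule means re-issuing it with a different RS."""
    return [alpha * u + (1 - alpha) * r
            for u, r in zip(user_features, rs_features)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

user = [0.2, 0.9, 0.4]
rs_system_a = [0.5, 0.1, 0.8]   # RS for system A
rs_system_b = [0.9, 0.6, 0.3]   # distinct RS for system B

bc_a = bio_capsule(user, rs_system_a)
bc_b = bio_capsule(user, rs_system_b)

# Different RS per system -> unlinkable capsules from one biometric.
print(distance(bc_a, bc_b) > 0)  # → True
```

Matching is then performed between freshly fused capsules rather than raw biometrics, so the template stored by each system never exposes the user's underlying biometric.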
APA, Harvard, Vancouver, ISO, and other styles