To see the other types of publications on this topic, follow the link: Data processing security.

Dissertations / Theses on the topic 'Data processing security'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Data processing security.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

鄧偉明 and Wai-ming Tang. "Semantics of authentication in workflow security." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B30110828.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Kalibjian, Jeff. "Data Security Architecture Considerations for Telemetry Post Processing Environments." International Foundation for Telemetering, 2017. http://hdl.handle.net/10150/626950.

Full text
Abstract:
Telemetry data has great value, as setting up a framework to collect and gather it involves significant costs. Further, the data itself has product diagnostic significance and may also have strategic national security importance if the product is defense or intelligence related. This potentially makes telemetry data a target for acquisition by hostile third parties. To mitigate this threat, data security principles should be employed by the organization to protect telemetry data. Data security is an important element of a layered security strategy for the enterprise. The value proposition centers on the argument that if organization perimeter/internal defenses (e.g. firewall, IDS, etc.) fail, enabling hostile entities to access data found on internal company networks, they will still be unable to read the data because it will be encrypted. After reviewing important encryption background, including accepted practices, standards, and architectural considerations regarding disk, file, database and application data protection encryption strategies, specific data security options applicable to telemetry post-processing environments are discussed, providing tangible approaches to better protect organization telemetry data.
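As a minimal illustration of the data-at-rest encryption argued for in this abstract, the sketch below encrypts a post-processed telemetry product with a symmetric key so that it stays unreadable even if perimeter defenses fail. It assumes the third-party Python `cryptography` package; the file name and sample data are hypothetical.

```python
# Minimal sketch: protecting a post-processed telemetry product at rest.
# Assumes the third-party 'cryptography' package; file name is illustrative.
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # in practice, held by a key-management service
cipher = Fernet(key)

plaintext = b"frame,timestamp,value\n1,0.00,42.7\n2,0.01,42.9\n"   # stand-in telemetry
ciphertext = cipher.encrypt(plaintext)                 # AES-128-CBC + HMAC-SHA256 token

with open("telemetry_run42.enc", "wb") as f:           # only ciphertext touches disk
    f.write(ciphertext)

# Even if perimeter or internal defenses fail, the file is unreadable without the key.
assert cipher.decrypt(ciphertext) == plaintext
```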
APA, Harvard, Vancouver, ISO, and other styles
3

Benson, Glenn Stuart. "A formal protection model of security in distributed systems." Diss., Georgia Institute of Technology, 1989. http://hdl.handle.net/1853/12238.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Kalibjian, J. R. "Telemetry Post-Processing in the Clouds: A Data Security Challenge." International Foundation for Telemetering, 2011. http://hdl.handle.net/10150/595799.

Full text
Abstract:
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada
As organizations move toward cloud [1] computing environments, data security challenges will begin to take precedence over network security issues. This will potentially impact telemetry post processing in a myriad of ways. After reviewing how data security tools like Enterprise Rights Management (ERM), Enterprise Key Management (EKM), Data Loss Prevention (DLP), Database Activity Monitoring (DAM), and tokenization are impacting cloud security, their effect on telemetry post-processing will also be examined. An architecture will be described detailing how these data security tools can be utilized to make telemetry post-processing environments in the cloud more robust.
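To make one of the listed data security tools concrete, here is a toy tokenization sketch (purely illustrative, not any product's API): sensitive field values are swapped for opaque tokens before data leaves the trusted zone, and the token-to-value mapping stays in a local vault.

```python
# Toy tokenization sketch: replace sensitive values with opaque tokens.
# The in-memory vault is illustrative; real systems use a hardened token vault.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]

vault = TokenVault()
record = {"vehicle_id": "TAIL-007", "telemetry": [1.2, 3.4]}
record["vehicle_id"] = vault.tokenize(record["vehicle_id"])   # safe to ship to the cloud
print(record)                                                 # contains only the token
print(vault.detokenize(record["vehicle_id"]))                 # recovered in the trusted zone
```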
APA, Harvard, Vancouver, ISO, and other styles
5

He, Yijun, and 何毅俊. "Protecting security in cloud and distributed environments." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B49617631.

Full text
Abstract:
Encryption helps to ensure that information within a session is not compromised. Authentication and access control measures ensure legitimate and appropriate access to information, and prevent inappropriate access to such resources. While encryption, authentication and access control each has its own responsibility in securing a communication session, a combination of these three mechanisms can provide much better protection for information. This thesis addresses encryption, authentication and access control related problems in cloud and distributed environments, since these problems are very common in modern organizational environments. The first is a User-friendly Location-free Encryption System for Mobile Users (UFLE). It is an encryption and authentication system which provides maximum security to sensitive data in distributed environments (corporate, home and outdoor scenarios) while requiring minimum user effort (i.e. no biometric entry or possession of cryptographic tokens) to access the data. It lets users access data securely and easily at any time and in any place, and avoids data breaches due to stolen or lost laptops and USB flash drives. The multi-factor authentication protocol provided in this scheme is also applicable to cloud storage. The second is a Simple Privacy-Preserving Identity-Management for Cloud Environment (SPICE). It is the first digital identity management system that can satisfy “unlinkability” and “delegatable authentication” in addition to other desirable properties in the cloud environment. Unlinkability ensures that none of the cloud service providers (CSPs), even if they collude, can link the transactions of the same user. Delegatable authentication, on the other hand, is unique to the cloud platform, in which several CSPs may join together to provide a packaged service, with one of them being the source provider which interacts with the clients and performs authentication, while the others are receiving CSPs which are transparent to the clients. The authentication should be delegatable such that a receiving CSP can authenticate a user without direct communication with either the user or the registrar, and without fully trusting the source CSP. The third addresses the re-encryption-based access control issue in cloud and distributed storage. We propose the first non-transferable proxy re-encryption scheme [16] which successfully achieves the non-transferable property. Proxy re-encryption allows a third party (the proxy) to re-encrypt a ciphertext which has been encrypted for one party, without seeing the underlying plaintext, so that it can be decrypted by another. A proxy re-encryption scheme is said to be non-transferable if the proxy and a set of colluding delegatees cannot re-delegate decryption rights to other parties. The scheme can be utilized by a content owner to delegate content decryption rights to users in untrusted cloud storage. The advantages of using such a scheme are that decryption keys are managed by the content owner and that the plaintext is always hidden from the cloud provider.
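The proxy re-encryption primitive described above can be illustrated with a classic ElGamal-style construction in the spirit of the Blaze-Bleumer-Strauss scheme. This is a toy sketch with demo-sized parameters, not the non-transferable scheme proposed in the thesis, and it is far too small to be secure.

```python
# Toy ElGamal-style proxy re-encryption (BBS98 flavour), for illustration only.
# Demo-sized parameters; NOT secure and NOT the thesis's non-transferable scheme.
import secrets

q = 1019                      # prime subgroup order (demo size)
p = 2 * q + 1                 # safe prime 2039
g = 4                         # generator of the order-q subgroup

def keygen():
    sk = secrets.randbelow(q - 1) + 1              # private exponent in [1, q-1]
    return sk, pow(g, sk, p)                       # (sk, pk = g^sk mod p)

def encrypt(pk, m):
    r = secrets.randbelow(q - 1) + 1
    return (m * pow(g, r, p)) % p, pow(pk, r, p)   # (m * g^r, pk^r)

def rekey(sk_from, sk_to):
    return (sk_to * pow(sk_from, -1, q)) % q       # rk = b / a mod q

def reencrypt(rk, ct):
    c1, c2 = ct
    return c1, pow(c2, rk, p)                      # transforms g^{ar} into g^{br}

def decrypt(sk, ct):
    c1, c2 = ct
    gr = pow(c2, pow(sk, -1, q), p)                # recover g^r
    return (c1 * pow(gr, -1, p)) % p               # m = c1 / g^r

a, pk_a = keygen()                    # content owner (delegator)
b, pk_b = keygen()                    # delegatee
m = pow(g, 123, p)                    # demo message encoded in the subgroup

ct_a = encrypt(pk_a, m)               # encrypted for the owner
ct_b = reencrypt(rekey(a, b), ct_a)   # proxy transforms it without ever seeing m
assert decrypt(a, ct_a) == m
assert decrypt(b, ct_b) == m
```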
APA, Harvard, Vancouver, ISO, and other styles
6

Hu, Ji, Dirk Cordel, and Christoph Meinel. "A virtual machine architecture for creating IT-security laboratories." Universität Potsdam, 2006. http://opus.kobv.de/ubp/volltexte/2009/3307/.

Full text
Abstract:
E-learning is a flexible and personalized alternative to traditional education. Nonetheless, existing e-learning systems for IT security education have difficulties in delivering hands-on experience because of the lack of proximity. Laboratory environments and practical exercises are indispensable instruction tools for IT security education, but security education in conventional computer laboratories poses the problem of immobility as well as high creation and maintenance costs. Hence, there is a need to effectively transform security laboratories and practical exercises into e-learning forms. This report introduces the Tele-Lab IT-Security architecture that allows students not only to learn IT security principles, but also to gain hands-on security experience through exercises in an online laboratory environment. In this architecture, virtual machines are used to provide safe user work environments instead of real computers. Thus, traditional laboratory environments can be cloned onto the Internet by software, which increases accessibility to laboratory resources and greatly reduces investment and maintenance costs. Under the Tele-Lab IT-Security framework, a set of technical solutions is also proposed to provide effective functionality, reliability, security, and performance. Virtual machines with appropriate resource allocation, software installation, and system configurations are used to build lightweight security laboratories on a hosting computer. Reliability and availability of laboratory platforms are covered by the virtual machine management framework, which provides the necessary monitoring and administration services to detect and recover from critical failures of virtual machines at run time. Considering the risk that virtual machines can be misused for compromising production networks, we present security management solutions to prevent misuse of laboratory resources through security isolation at the system and network levels. This work is an attempt to bridge the gap between e-learning/tele-teaching and practical IT security education. It is not intended to substitute conventional teaching in laboratories but to add practical features to e-learning. This report demonstrates the possibility of implementing hands-on security laboratories on the Internet reliably, securely, and economically.
APA, Harvard, Vancouver, ISO, and other styles
7

Kwok, Tai-on Tyrone. "High performance embedded reconfigurable computing data security and media processing applications /." Click to view the E-thesis via HKUTO, 2005. http://sunzi.lib.hku.hk/hkuto/record/B3204043X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kwok, Tai-on Tyrone, and 郭泰安. "High performance embedded reconfigurable computing: data security and media processing applications." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B3204043X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

梁松柏 and Chung-pak Leung. "Concurrent auditing on computerized accounting systems." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31269011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Xu, Cheng. "Authenticated query processing in the cloud." HKBU Institutional Repository, 2019. https://repository.hkbu.edu.hk/etd_oa/620.

Full text
Abstract:
With recent advances in data-as-a-service (DaaS) and cloud computing, outsourcing data to the cloud has become a common practice. In a typical scenario, the data owner (DO) outsources the data and delegates the query processing service to a service provider (SP). However, as the SP is often an untrusted third party, the integrity of the query results cannot be guaranteed and thus must be authenticated. To tackle this issue, a typical approach is to let the SP provide a cryptographic proof, which clients can use to verify the soundness and completeness of the query results. Despite extensive research on authenticated query processing for outsourced databases, existing techniques have only considered limited query types. They fail to address a variety of needs demanded by enterprise customers, such as supporting aggregate queries over set-valued data, enforcing fine-grained access control, and using distributed computing paradigms. In this dissertation, we take the first step to comprehensively investigate authenticated query processing in the cloud that fulfills the aforementioned requirements. Security analysis and performance evaluation show that the proposed solutions and techniques are robust and efficient under a wide range of system settings.
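A common building block behind such cryptographic proofs is the Merkle hash tree: the DO signs only the root digest, and the SP returns the sibling hashes that let a client recompute the root for a returned record. The sketch below is illustrative only and does not reproduce the dissertation's exact constructions.

```python
# Minimal Merkle-tree sketch: the data owner publishes only the (signed) root;
# the service provider returns a sibling-hash path that authenticates one record.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_levels(leaves):
    levels = [[h(x) for x in leaves]]
    while len(levels[-1]) > 1:
        cur = levels[-1]
        if len(cur) % 2:                              # duplicate last node on odd levels
            cur = cur + [cur[-1]]
        levels.append([h(cur[i] + cur[i + 1]) for i in range(0, len(cur), 2)])
    return levels

def prove(levels, index):
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        proof.append((level[index ^ 1], index % 2))   # (sibling hash, am-I-the-right-child)
        index //= 2
    return proof

def verify(root, leaf, proof):
    digest = h(leaf)
    for sibling, is_right in proof:
        digest = h(sibling + digest) if is_right else h(digest + sibling)
    return digest == root

records = [b"row-%d" % i for i in range(7)]
levels = build_levels(records)
root = levels[-1][0]                                  # this digest would be signed by the DO
proof = prove(levels, 3)                              # SP returns record 3 plus this proof
assert verify(root, records[3], proof)                # client checks soundness of the answer
```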
APA, Harvard, Vancouver, ISO, and other styles
11

Goss, Ryan Gavin. "Enabling e-learning 2.0 in information security education: a semantic web approach." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/909.

Full text
Abstract:
The motivation for this study argued that current information security education systems are inadequate for educating all users of computer systems worldwide in acting securely during their operations with information systems. There is, therefore, a pervasive need for information security knowledge in all aspects of modern life. E-Learning 2.0 could possibly contribute to solving this problem; however, little or no knowledge currently exists regarding the suitability and practicality of using such systems to impart information security knowledge to learners.
APA, Harvard, Vancouver, ISO, and other styles
12

Edman, Johan, and Wilhelm Ågren. "Legal and Security Issues of Data Processing when Implementing IoT Solutions in Apartments." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-277917.

Full text
Abstract:
The concept of the Internet of Things (IoT) and connected devices is a growing trend, and new ways to integrate them with Smart Home Technology emerge each day. The use of sensors in IoT solutions enables large-scale data collection that can be used in various ways. The European Union recently put into force the General Data Protection Regulation (GDPR), which sets guidelines for the collection and processing of personal information. The communication protocol M-Bus is a European standard (EN 13757-x) mainly used for remote reading of electricity, gas and water meters. M-Bus is often integrated with sensors because the protocol offers long battery life. There are, however, some known flaws in the protocol that might make it unsuitable for a large-scale data collection system. A conceptualized data collection scenario with a system utilizing M-Bus is presented. The authors aim to investigate some of the security flaws of the M-Bus protocol, while also investigating the GDPR demands on the system. The thesis compiles a System Requirement Specification (SyRS) which can be used as a template for organizations implementing a similar system. An analysis of the system based on the SyRS is conducted to identify any shortcomings, and modifications to the system are proposed in order to comply with the defined SyRS. The authors concluded that M-Bus is a sufficiently reliable protocol to be used in the system and has no inherent conflicts with the GDPR. The system has a few flaws in terms of GDPR compliance, which require both administrative and technical work to address. The suggested modifications of the system focus mainly on how the data is stored in its various parts.
APA, Harvard, Vancouver, ISO, and other styles
13

Hellsing, Mattias, and Odervall Albin. "Efficient Multi-Core Implementation of the IPsec Encapsulating Security Payload Protocol for a Single Security Association." Thesis, Linköpings universitet, Programvara och system, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-151984.

Full text
Abstract:
As mobile Internet traffic increases, the workload of the base stations processing this traffic increases with it. To cope with this, the telecommunication providers responsible for the systems deployed in these base stations have looked to parallelism. This, together with the fact that these providers have a vested interest in protecting their users' data from potential attackers, means that there is a need for efficient parallel packet processing software which handles encryption as well as authentication. A well-known protocol for encryption and authentication of IP packets is the Encapsulating Security Payload (ESP) protocol of the IPsec protocol suite. IPsec establishes simplex connections, called Security Associations (SAs), between entities that wish to communicate. This thesis investigates a special case of this problem in which the work of encrypting and authenticating the packets within a single SA is parallelized. The problem was investigated by developing and comparing two multi-threaded implementations based on Eventdev, an event-driven programming library, and the ring buffer library of the Data Plane Development Kit (DPDK). An additional Eventdev-based implementation was also investigated which schedules linked lists of packets, instead of single packets, in an attempt to reduce the overhead of scheduling packets to the worker cores. These implementations were then evaluated in terms of throughput, latency, speedup, and last-level cache miss rates. The results showed that the ring-buffer-based implementation performed best on all metrics, while the single-packet-scheduling Eventdev-based implementation was outperformed by the one using linked lists of packets. It was shown that packet generation, which was done by the receiving core, was the main limiting factor for all implementations. In addition, memory resources such as the memory bus, memory controller and prefetching hardware were shown to likely be an area of contention and a possible bottleneck as the packet generation rate increases. The conclusion drawn from this was that a parallelized packet retrieval solution such as Receive Side Scaling (RSS), together with minimizing memory resource contention, is necessary to further improve performance.
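The single-SA work split investigated here can be sketched, in highly simplified form, as an ingress stage that assigns sequence numbers and a pool of workers that encrypt packets in parallel with AES-GCM (a cipher commonly used with ESP). The sketch assumes the Python `cryptography` package and abstracts away DPDK and Eventdev entirely; it only illustrates the division of work, not the thesis's implementations.

```python
# Simplified single-SA parallel ESP-style protection: one ingress stage assigns
# sequence numbers, a pool of workers encrypts packets in parallel with AES-GCM.
import os
from concurrent.futures import ThreadPoolExecutor
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

sa_key = AESGCM.generate_key(bit_length=256)   # one Security Association, one key
salt = os.urandom(4)                           # per-SA salt, as in RFC 4106

def protect(numbered_packet):
    seq, payload = numbered_packet
    nonce = salt + seq.to_bytes(8, "big")      # unique nonce per packet within the SA
    return seq, AESGCM(sa_key).encrypt(nonce, payload, None)

packets = [os.urandom(1200) for _ in range(1000)]     # stand-in plaintext packets
numbered = list(enumerate(packets, start=1))          # "receiving core" numbers them

with ThreadPoolExecutor(max_workers=4) as pool:       # "worker cores"
    protected = list(pool.map(protect, numbered))

print(len(protected), "packets encrypted and authenticated for the SA")
```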
APA, Harvard, Vancouver, ISO, and other styles
14

Gerber, Mariana. "The development of a technique to establish the security requirements of an organization." Thesis, Port Elizabeth Technikon, 2001. http://hdl.handle.net/10948/89.

Full text
Abstract:
To perform their business activities effectively, organizations rely heavily on the use of information (ISO/IEC TR 13335-2, 1996, p 1). Owens (1998) reiterates this by claiming that all organizations depend on information for their everyday operation and without it business will fail to operate (Owens, 1998, p 1-2). For an organization it means that if the right information is not available at the right time, it can make the difference between profit and loss or success and failure (Royds, 2000, p 2). Information is an asset and, just like other important business assets within the organization, it has extreme value to an organization (BS 7799-1, 1999, p 1; Humphreys, Moses & Plate, 1998, p 8). For this reason it has become very important that business information is sufficiently protected. There are many different ways in which information can exist. Information can be printed or written on paper, stored electronically, transmitted electronically or by post, even spoken in conversation or any other way in which knowledge and ideas can be conveyed (URN 99/703, 1999, p. 2; Humphreys, Moses & Plate, 1998, p 8; URN 96/702, 1996, p 3). It is, therefore, critical to protect information, and to ensure that the security of IT (Information Technology) systems within organizations is properly managed. This requirement to protect information is even more important today, since many organizations are internally and externally connected by networks of IT systems (ISO/IEC TR 13335-2, 1996, p 1). Information security is therefore required to assist in the process of controlling and securing information from accidental or malicious changes, deletions or unauthorized disclosure (Royds, 2000, p 2; URN 96/702, 1996, p 3). By preventing and minimizing the impact of security incidents, information security can ensure business continuity and reduce business damage (Owens, 1998, p 7). Information security in an organization can be regarded as a management opportunity and should become an integral part of the whole management activity of the organization. Obtaining commitment from management is therefore extremely important for effective information security. One way in which management can show their commitment to ensuring information security is to adopt and enforce a security policy. A security policy ensures that people understand exactly what important role they play in securing information assets.
APA, Harvard, Vancouver, ISO, and other styles
15

Jones, James H. "Detecting hidden computer processes by deliberate resource exhaustion." Fairfax, VA : George Mason University, 2008. http://hdl.handle.net/1920/3385.

Full text
Abstract:
Thesis (Ph.D.)--George Mason University, 2008.
Vita: p. 259. Thesis director: Kathryn B. Laskey. Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Computational Sciences and Informatics. Title from PDF t.p. (viewed Mar. 17, 2009). Includes bibliographical references (p. 255-258). Also issued in print.
APA, Harvard, Vancouver, ISO, and other styles
16

Okere, Irene Onyekachi. "A control framework for the assessment of information security culture." Thesis, Nelson Mandela Metropolitan University, 2013. http://hdl.handle.net/10948/d1019861.

Full text
Abstract:
The modern organisation relies heavily on information to function effectively. With such reliance on information, it is vital that information be protected from both internal (employees) and external threats. The protection of information or information security to a large extent depends on the behaviour of humans (employees) in the organisation. The behaviour of employees is one of the top information security issues facing organisations as the human factor is regarded as the weakest link in the security chain. To address this human factor many researchers have suggested the fostering of a culture of information security so that information security becomes second nature to employees. Information security culture as defined for this research study exists in four levels namely artefacts, espoused values, shared tacit assumptions and information security knowledge. An important step in the fostering of an information security culture is the assessment of the current state of such a culture. Gaps in current approaches for assessing information security culture were identified and this research study proposes the use of a control framework to address the identified gaps. This research study focuses on the assessment of information security culture and addresses 5 research objectives namely 1) to describe information security culture in the field of information security, 2) to determine ways to foster information security culture in an organisation, 3) to demonstrate the gap in current approaches used to assess information security culture, 4) to determine the components that could be used for the assessment of information security culture for each of the culture’s underlying levels and 5) to describe a process for the assessment of information security culture for all four levels. This research study follows a qualitative approach utilising a design science strategy and multi-method qualitative data collection techniques including literature review, qualitative content analysis, argumentation, and modelling techniques. The research methods provide a means for the interpretation of the data and the development of the proposed control framework.
APA, Harvard, Vancouver, ISO, and other styles
17

Viljoen, Melanie. "A framework towards effective control in information security governance." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/887.

Full text
Abstract:
The importance of information in business today has made the need to properly secure this asset evident. Information security has become a responsibility for all managers of an organization. To better support more efficient management of information security, timely information security management information should be made available to all managers. Smaller organizations face special challenges with regard to information security management and reporting due to limited resources (Ross, 2008). This dissertation discusses a Framework for Information Security Management Information (FISMI) that aims to improve the visibility and contribute to better management of information security throughout an organization by enabling the provision of summarized, comprehensive information security management information to all managers in an affordable manner.
APA, Harvard, Vancouver, ISO, and other styles
18

Judge, Paul Q. "Security and protection architectures for large-scale content distribution." Diss., Georgia Institute of Technology, 2002. http://hdl.handle.net/1853/9217.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Kalibjian, Jeffrey R. "The Impact Of Wireless Security Protocols on Post Processed Telemetry Data Transfer." International Foundation for Telemetering, 2002. http://hdl.handle.net/10150/606320.

Full text
Abstract:
International Telemetering Conference Proceedings / October 21, 2002 / Town & Country Hotel and Conference Center, San Diego, California
Commercial wireless protocol use (e.g. Wireless Access Protocol, Bluetooth, etc.) is becoming widespread as the demand to access computing devices in remote locations grows. Although not widely prevalent today, wireless access of post processed telemetry data will become a common activity. Essential to the use of such a capability is the security of the wireless links involved in the data transfer. Each wireless protocol has an associated security paradigm. Some protocols have stronger security schemes than others and this should influence protocol selection for particular telemetry data transfer applications.
APA, Harvard, Vancouver, ISO, and other styles
20

Kalibjian, Jeffrey R. "The Impact of the Common Data Security Architecture (CDSA) on Telemetry Post Processing Architectures." International Foundation for Telemetering, 1999. http://hdl.handle.net/10150/608706.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada
It is an increasing requirement that commercial satellite telemetry data product be protected from unauthorized access during transmission to ground stations. While the technology (cryptography) to secure telemetry data product is well known, the software infrastructure to support such security is costly, and very customized. Further, many software packages have difficulty interoperating. The Common Data Security Architecture [1] [2] [3] (originally proposed by the Intel Corporation, and now adopted by the Open Group), is a set of common cryptographic [4] and public key infrastructure (PKI) application programming interfaces (APIs) which will facilitate better cryptographic interoperability as well as making cryptographic resources more readily available in telemetry post processing environments.
APA, Harvard, Vancouver, ISO, and other styles
21

Tansley, Natalie Vanessa. "A methodology for measuring and monitoring IT risk." Thesis, Nelson Mandela Metropolitan University, 2007. http://hdl.handle.net/10948/772.

Full text
Abstract:
The primary objective of the research is to develop a methodology for monitoring and measuring IT risks, strictly focusing on internal controls. The research delivers a methodology whereby an organization can measure its system of internal controls, providing assurance that the risks are at an acceptable level. To achieve the primary objective a number of secondary objectives were addressed: What are the drivers forcing organizations toward better corporate governance in managing risk? What is IT risk management, specifically focusing on operational risk? What is internal control, specifically focusing on COSO's internal control process? Investigation of measurement methods, such as Balanced Scorecards, Critical Success Factors, Maturity Models, Key Performance Indicators and Key Goal Indicators. Investigation of various frameworks such as CobiT, COSO, ISO 17799, ITIL and BS 7799 as to how they manage IT risk relating to internal control.
APA, Harvard, Vancouver, ISO, and other styles
22

Metzger, Christiane, and Johann Haag. "„Ich könnte nie wieder zu einem ‚normalen‘ Stundenplan zurück!“ – Zur Reorganisation der Lehre in einem Bachelor-Studiengang IT Security." Universität Potsdam, 2013. http://opus.kobv.de/ubp/volltexte/2013/6488/.

Full text
Abstract:
In the Bachelor's programme (B.Sc.) IT Security at the St. Pölten University of Applied Sciences, the organization of teaching in the first semester was changed on a trial basis in the winter semester 2011/12: the modules and sub-modules were no longer all taught in parallel; instead, each module was taught exclusively over a period of several weeks. This contribution describes the effects of, and the experience so far with, this reorganization of teaching: grades improved on average by about one grade level, and the number of students failing examinations dropped drastically. Satisfaction among students and teaching staff is so high that this form of teaching organization is being adopted for the entire Bachelor's programme as well as the Master's programme.
APA, Harvard, Vancouver, ISO, and other styles
23

Yan, Chenyu. "Architectural support for improving security and performance of memory sub-systems." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/26663.

Full text
Abstract:
Thesis (Ph.D)--Computing, Georgia Institute of Technology, 2009.
Committee Chair: Milos Prvulovic; Committee Member: Gabriel Loh; Committee Member: Hyesoon Kim; Committee Member: Umakishore Ramachandran; Committee Member: Yan Solihin. Part of the SMARTech Electronic Thesis and Dissertation Collection.
APA, Harvard, Vancouver, ISO, and other styles
24

Chan, Yik-Kwan Eric, and 陳奕鈞. "Investigation of a router-based approach to defense against Distributed Denial-of-Service (DDoS) attack." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B30173309.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Widener, Patrick M. (Patrick McCall). "Dynamic Differential Data Protection for High-Performance and Pervasive Applications." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7239.

Full text
Abstract:
Modern distributed applications are long-lived, are expected to provide flexible and adaptive data services, and must meet the functionality and scalability challenges posed by dynamically changing user communities in heterogeneous execution environments. The practical implications of these requirements are that reconfiguration and upgrades are increasingly necessary, but opportunities to perform such tasks offline are greatly reduced. Developers are responding to this situation by dynamically extending or adjusting application functionality and by tuning application performance, a typical method being the incorporation of client- or context-specific code into applications' execution loops. Our work addresses a basic roadblock in deploying such solutions: the protection of key application components and sensitive data in distributed applications. Our approach, termed Dynamic Differential Data Protection (D3P), provides fine-grain methods for providing component-based protection in distributed applications. Context-sensitive, application-specific security methods are deployed at runtime to enforce restrictions in data access and manipulation. D3P is suitable for low- or zero-downtime environments, since deployments are performed while applications run. D3P is appropriate for high performance environments and for highly scalable applications like publish/subscribe, because it creates native codes via dynamic binary code generation. Finally, due to its integration into middleware, D3P can run across a wide variety of operating system and machine platforms. This dissertation introduces D3P, using sample applications from the high performance and pervasive computing domains to illustrate the problems addressed by our D3P solution. It also describes how D3P can be integrated into modern middleware. We present experimental evaluations which demonstrate the fine-grain nature of D3P, that is, its ability to capture individual end users' or components' needs for data protection, and also describe the performance implications of using D3P in data-intensive applications.
APA, Harvard, Vancouver, ISO, and other styles
26

Cui, Yingjie, and 崔英杰. "A study on privacy-preserving clustering." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2009. http://hub.hku.hk/bib/B4357225X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Mxoli, Ncedisa Avuya Mercia. "Guidelines for secure cloud-based personal health records." Thesis, Nelson Mandela Metropolitan University, 2017. http://hdl.handle.net/10948/14134.

Full text
Abstract:
Traditionally, health records have been stored in paper folders at the physician’s consulting rooms – or at the patient’s home. Some people stored the health records of their family members, so as to keep a running history of all the medical procedures they went through, and what medications they were given by different physicians at different stages of their lives. Technology has introduced better and safer ways of storing these records, namely, through the use of Personal Health Records (PHRs). With time, different types of PHRs have emerged, i.e. local, remote server-based, and hybrid PHRs. Web-based PHRs fall under the remote server-based PHRs; and recently, a new market in storing PHRs has emerged. Cloud computing has become a trend in storing PHRs in a more accessible and efficient manner. Despite its many benefits, cloud computing has many privacy and security concerns. As a result, the adoption rate of cloud services is not yet very high. A qualitative and exploratory research design approach was followed in this study, in order to reach the objective of proposing guidelines that could assist PHR providers in selecting a secure Cloud Service Provider (CSP) to store their customers’ health data. The research methods that were used include a literature review, systematic literature review, qualitative content analysis, reasoning, argumentation and elite interviews. A systematic literature review and qualitative content analysis were conducted to examine those risks in the cloud environment that could have a negative impact on the secure storing of PHRs. PHRs must satisfy certain dimensions, in order for them to be meaningful for use. While these were highlighted in the research, it also emerged that certain risks affect the PHR dimensions directly, thus threatening the meaningfulness and usability of cloud-based PHRs. The literature review revealed that specific control measures can be adopted to mitigate the identified risks. These control measures form part of the material used in this study to identify the guidelines for secure cloud-based PHRs. The guidelines were formulated through the use of reasoning and argumentation. After the guidelines were formulated, elite interviews were conducted, in order to validate and finalize the main research output: i.e. guidelines. The results of this study may alert PHR providers to the risks that exist in the cloud environment; so that they can make informed decisions when choosing a CSP for storing their customers’ health data.
APA, Harvard, Vancouver, ISO, and other styles
28

Maseti, Ophola S. "A model for role-based security education, training and awareness in the South African healthcare environment." Thesis, Nelson Mandela Metropolitan University, 2008. http://hdl.handle.net/10948/724.

Full text
Abstract:
It is generally accepted that a business operates more efficiently when it is able to consolidate information from a variety of sources. This principle applies as much in the healthcare environment. Although limited in the South African context, the use of electronic systems to access information is advancing rapidly. Many aspects have to be considered in regards to such a high availability of information, for example, training people how to access and protect information, motivating them to use the systems and information extensively and effectively, ensuring adequate levels of security, confronting ethical issues and maintaining the availability of information at crucial times. This is especially true in the healthcare sector, where access to critical data is often vital. This data must be accessed by different kinds of people with different levels of access. However, accessibility often leads to vulnerabilities. The healthcare sector deals with very sensitive data. People’s medical records need to be kept confidential; hence, security is very important. Information of a very sensitive nature is exposed to human intervention on various levels (e.g. nurses, administrative staff, general practitioners and specialists). In this scenario, it is important for each person to be aware of the requirements in terms of security and privacy, especially from a legal perspective. Because of the large dependence on the human factor in maintaining information security, organisations must employ mechanisms that address this at the staff level. One such mechanism is information security education, training and awareness programmes. As the learner is the recipient of information in such a programme, it is increasingly important that it targets the audience that it is intended for. This will maximize the benefits achieved from such a programme. This can be achieved through following a role-based approach in the design and development of the SETA programme. This research therefore proposes a model for a role-based SETA programme, with the area of application being in the South African healthcare environment.
APA, Harvard, Vancouver, ISO, and other styles
29

Leung, Wing Pan. "Visual cryptography for color images : formal security analysis and new construction /." access full-text access abstract and table of contents, 2009. http://libweb.cityu.edu.hk/cgi-bin/ezdb/thesis.pl?mphil-cs-b23759100f.pdf.

Full text
Abstract:
Thesis (M.Phil.)--City University of Hong Kong, 2009.
"Submitted to Department of Computer Science in partial fulfillment of the requirements for the degree of Master of Philosophy." Includes bibliographical references (leaves 103-108)
APA, Harvard, Vancouver, ISO, and other styles
30

Miles, Shaun Graeme. "An investigation of issues of privacy, anonymity and multi-factor authentication in an open environment." Thesis, Rhodes University, 2012. http://hdl.handle.net/10962/d1006653.

Full text
Abstract:
This thesis performs an investigation into issues concerning the broad area of Identity and Access Management, with a focus on open environments. Through literature research the issues of privacy, anonymity and access control are identified. The issue of privacy is an inherent problem due to the nature of the digital network environment. Information can be duplicated and modified regardless of the wishes and intentions of the owner of that information unless proper measures are taken to secure the environment. Once information is published or divulged on the network, there is very little way of controlling the subsequent usage of that information. To address this issue a model for privacy is presented that follows the user-centric paradigm of meta-identity. The lack of anonymity, where security measures can be thwarted through observation of the environment, is a concern for users and systems. By observing the communication channel and monitoring the interactions between users and systems over a long enough period of time, an attacker can infer knowledge about the users and systems. This knowledge is used to build an identity profile of potential victims to be used in subsequent attacks. To address the problem, mechanisms for providing an acceptable level of anonymity while maintaining adequate accountability (from a legal standpoint) are explored. In terms of access control, the inherent weakness of single-factor authentication mechanisms is discussed. The typical mechanism is the user-name and password pair, which provides a single point of failure. By increasing the factors used in authentication, the amount of work required to compromise the system increases non-linearly. Within an open network, several aspects hinder wide-scale adoption and use of multi-factor authentication schemes, such as token management and the impact on usability. The framework is developed from a Utopian point of view, with the aim of being applicable to many situations as opposed to a single specific domain. The framework incorporates multi-factor authentication over multiple paths using mobile phones and GSM networks, and explores the usefulness of such an approach. The models are in turn analysed, providing a discussion of the assumptions made and the problems faced by each model.
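As one concrete example of the kind of second authentication factor that multi-factor schemes add, the sketch below computes an RFC 6238 time-based one-time password. It is a generic illustration, not the GSM multi-path mechanism developed in the thesis; the shared secret shown is hypothetical.

```python
# Generic second factor: RFC 6238 time-based one-time password (TOTP).
# Illustrative only; the thesis's framework uses mobile phones and GSM paths.
import hmac, hashlib, struct, time

def totp(secret, timestep=30, digits=6, at=None):
    counter = int((at if at is not None else time.time()) // timestep)
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

secret = b"shared-secret-provisioned-out-of-band"
print("second factor:", totp(secret))                    # submitted alongside the password
```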
APA, Harvard, Vancouver, ISO, and other styles
31

Kalibjian, Jeff, and Steven Wierenga. "Assuring Post Processed Telemetry Data Integrity With a Secure Data Auditing Appliance." International Foundation for Telemetering, 2005. http://hdl.handle.net/10150/604910.

Full text
Abstract:
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada
Recent federal legislation (e.g. Sarbanes Oxley, Graham Leach Bliley) has introduced requirements for compliance including records retention and records integrity. Many industry sectors (e.g. Energy, under the North American Energy Reliability Council) are also introducing their own voluntary compliance mandates to avert possible additional federal regulation. A trusted computer appliance device dedicated to data auditing may soon be required in all corporate IT infrastructures to accommodate various compliance directives. Such an auditing device also may have application in telemetry post processing environments, as it maybe used to guarantee the integrity of post-processed telemetry data.
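The tamper evidence that such a data auditing appliance provides can be illustrated with a simple hash chain, in which every log entry commits to its predecessor. This is a toy sketch; a real appliance would additionally sign entries and anchor the chain head in trusted hardware.

```python
# Toy hash-chained audit log: each entry commits to the previous one,
# so any retroactive modification breaks the chain and is detectable.
import hashlib, json, time

def append(log, event):
    prev = log[-1]["digest"] if log else "0" * 64
    entry = {"ts": time.time(), "event": event, "prev": prev}
    entry["digest"] = hashlib.sha256(
        (prev + json.dumps({"ts": entry["ts"], "event": event})).encode()
    ).hexdigest()
    log.append(entry)

def verify(log):
    prev = "0" * 64
    for e in log:
        digest = hashlib.sha256(
            (prev + json.dumps({"ts": e["ts"], "event": e["event"]})).encode()
        ).hexdigest()
        if digest != e["digest"] or e["prev"] != prev:
            return False
        prev = digest
    return True

log = []
append(log, "post-processed run 42 archived")
append(log, "run 42 exported to analyst workstation")
assert verify(log)
log[0]["event"] = "tampered"          # any retroactive edit...
assert not verify(log)                # ...is caught on re-verification
```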
APA, Harvard, Vancouver, ISO, and other styles
32

Li, Xiao-Yu. "Evolving a secure grid-enabled, distributed data warehouse : a standards-based perspective." Thesis, Nelson Mandela Metropolitan University, 2007. http://hdl.handle.net/10948/544.

Full text
Abstract:
As digital data-collection has increased in scale and number, it has become an important type of resource serving a wide community of researchers. Cross-institutional data-sharing and collaboration introduce a suitable approach to facilitate those research institutions that are suffering from a lack of data and related IT infrastructures. Grid computing has become a widely adopted approach to enable cross-institutional resource-sharing and collaboration. It integrates a distributed and heterogeneous collection of locally managed users and resources. This project proposes a distributed data warehouse system which uses Grid technology to enable data access and integration, and collaborative operations across multiple distributed institutions, in the context of HIV/AIDS research. This study is based on wider research into OGSA-based Grid services architecture, comprising a data-analysis system which utilizes a data warehouse, data marts, and a near-line operational database that are hosted by distributed institutions. Within this framework, specific patterns for collaboration, interoperability, resource virtualization and security are included. The heterogeneous and dynamic nature of the Grid environment introduces a number of security challenges. This study also concerns a set of particular security aspects, including PKI-based authentication, single sign-on, dynamic delegation, and attribute-based authorization. These mechanisms, as supported by the Globus Toolkit’s Grid Security Infrastructure, are used to enable interoperability and establish trust relationships between various security mechanisms and policies within different institutions; manage credentials; and ensure secure interactions.
APA, Harvard, Vancouver, ISO, and other styles
33

Bellam, Kiranmai Qin Xiao. "Improving reliability, energy-efficiency and security of storage systems and real-time systems." Auburn, Ala, 2009. http://hdl.handle.net/10415/1722.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Ma, Chunyan. "Mathematical security models for multi-agent distributed systems." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2568.

Full text
Abstract:
This thesis presents the developed taxonomy of security threats in agent-based distributed systems. Based on this taxonomy, a set of theories is developed to facilitate analyzing the security threats of mobile-agent systems. We propose the idea of using the developed security risk graph to model the system's vulnerabilities.
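A security risk graph of the kind proposed can be sketched as a small directed graph in which edges represent exploitable steps; enumerating the paths from an entry point to an asset exposes candidate attack routes. The node names below are hypothetical and purely illustrative.

```python
# Toy security risk graph: nodes are hosts/system states, directed edges are
# exploitable steps; attack paths are enumerated with a depth-first search.
risk_graph = {                      # hypothetical mobile-agent platform, for illustration
    "internet": ["agent_gateway"],
    "agent_gateway": ["agent_host_A", "agent_host_B"],
    "agent_host_A": ["data_store"],
    "agent_host_B": ["agent_host_A"],
    "data_store": [],
}

def attack_paths(graph, src, dst, path=None):
    path = (path or []) + [src]
    if src == dst:
        yield path
        return
    for nxt in graph.get(src, []):
        if nxt not in path:                     # avoid cycles
            yield from attack_paths(graph, nxt, dst, path)

for p in attack_paths(risk_graph, "internet", "data_store"):
    print(" -> ".join(p))
```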
APA, Harvard, Vancouver, ISO, and other styles
35

Hu, Ji. "A virtual machine architecture for IT-security laboratories." Phd thesis, [S.l.] : [s.n.], 2006. http://deposit.ddb.de/cgi-bin/dokserv?idn=980935652.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Coertze, Jacques Jacobus. "A framework for information security governance in SMMEs." Thesis, Nelson Mandela Metropolitan University, 2012. http://hdl.handle.net/10948/d1014083.

Full text
Abstract:
It has been found that many small, medium and micro-sized enterprises (SMMEs) do not comply with sound information security governance principles, specifically the principles involved in drafting information security policies and monitoring compliance, mainly as a result of restricted resources and expertise. Research suggests that this problem occurs worldwide and that the impact it has on SMMEs is great. The problem is further compounded by the fact that, in our modern-day information technology environment, many larger organisations are providing SMMEs with access to their networks. This results not only in SMMEs being exposed to security risks, but the larger organisations as well. In previous research an information security management framework and toolbox was developed to assist SMMEs in drafting information security policies. Although this research was of some help to SMMEs, further research has shown that an even greater problem exists with the governance of information security as a result of the advancements that have been identified in information security literature. The aim of this dissertation is therefore to establish an information security governance framework that requires minimal effort and little expertise to alleviate governance problems. It is believed that such a framework would be useful for SMMEs and would result in the improved implementation of information security governance.
APA, Harvard, Vancouver, ISO, and other styles
37

Kalibjian, Jeff. "AN UPDATE ON NETWORK-BASED SECURITY TECHNOLOGIES APPLICABLE TO TELEMETRY POST-PROCESSING AND ANALYSIS ACTIVITIES." International Foundation for Telemetering, 2007. http://hdl.handle.net/10150/604578.

Full text
Abstract:
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada
Network-based technologies (i.e. TCP/IP) have come to play an important role in the evolution of telemetry post-processing services. A paramount issue when using networking to access or move telemetry data is security. In past years, papers have focused on individual security technologies and how they could be used to secure telemetry data. This paper will review currently available network-based security technologies, update readers on enhancements, and discuss their appropriate uses in the various phases of telemetry post-processing and analysis activities.
APA, Harvard, Vancouver, ISO, and other styles
38

Burdis, Keith Robert. "Distributed authentication for resource control." Thesis, Rhodes University, 2000. http://hdl.handle.net/10962/d1006512.

Full text
Abstract:
This thesis examines distributed authentication in the process of controlling computing resources. We investigate user sign-on and two of the main authentication technologies that can be used to control a resource through authentication and to provide additional security services. The problems with the existing sign-on scenario are that users have too much credential information to manage and are prompted for this information too often. Single Sign-On (SSO) is a viable solution to this problem if physical procedures are introduced to minimise the risks associated with its use. The Generic Security Services API (GSS-API) provides security services in a manner independent of the environment in which these security services are used, encapsulating security functionality and insulating users from changes in security technology. The underlying security functionality is provided by GSS-API mechanisms. We developed the Secure Remote Password GSS-API Mechanism (SRPGM) to provide a mechanism that has low infrastructure requirements, is password-based and does not require the use of long-term asymmetric keys. We provide implementations of the Java GSS-API bindings and the LIPKEY and SRPGM GSS-API mechanisms. The Simple Authentication and Security Layer (SASL) provides security to connection-based Internet protocols. After finding deficiencies in existing SASL mechanisms we developed the Secure Remote Password SASL mechanism (SRP-SASL), which provides strong password-based authentication and countermeasures against known attacks, while still being simple and easy to implement. We provide implementations of the Java SASL binding and several SASL mechanisms, including SRP-SASL.
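To give a flavour of the SRP family underlying SRPGM and SRP-SASL, the fragment below computes the RFC 2945 password verifier that a server stores instead of the password itself. It uses demo-sized group parameters and shows only this single step of the protocol; real deployments use large standardized groups.

```python
# SRP (RFC 2945) verifier computation: the server stores v = g^x mod N,
# never the password itself. Demo-sized N; real SRP uses large standard groups.
import hashlib, os

N = 2039          # toy safe prime, for illustration only
g = 4

def srp_verifier(username, password, salt):
    inner = hashlib.sha1(f"{username}:{password}".encode()).digest()
    x = int.from_bytes(hashlib.sha1(salt + inner).digest(), "big")   # x = H(s | H(U ":" P))
    return pow(g, x, N)

salt = os.urandom(16)
v = srp_verifier("alice", "correct horse battery staple", salt)
print("store (salt, v) on the server:", salt.hex(), v)
```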
APA, Harvard, Vancouver, ISO, and other styles
39

Iwaya, Leonardo H. "Secure and Privacy-aware Data Collection and Processing in Mobile Health Systems." Licentiate thesis, Karlstads universitet, Institutionen för matematik och datavetenskap (from 2013), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-46982.

Full text
Abstract:
Healthcare systems have assimilated information and communication technologies in order to improve the quality of healthcare and the patient's experience at reduced costs. The increasing digitalization of people's health information, however, raises new threats regarding information security and privacy. Accidental or deliberate breaches of health data may lead to societal pressures, embarrassment and discrimination. Information security and privacy are paramount to achieving high-quality healthcare services and, further, to not harming individuals when providing care. With that in mind, we give special attention to the category of Mobile Health (mHealth) systems, that is, the use of mobile devices (e.g., mobile phones, sensors, PDAs) to support medical and public health. Such systems have been particularly successful in developing countries, taking advantage of the flourishing mobile market and the need to expand the coverage of primary healthcare programs. Many mHealth initiatives, however, fail to address security and privacy issues. This, coupled with the lack of specific legislation for privacy and data protection in these countries, increases the risk of harm to individuals. The overall objective of this thesis is to enhance knowledge regarding the design of security and privacy technologies for mHealth systems. In particular, we deal with mHealth Data Collection Systems (MDCSs), which consist of mobile devices for collecting and reporting health-related data, replacing paper-based approaches for health surveys and surveillance. This thesis consists of publications contributing to mHealth security and privacy in various ways: with a comprehensive literature review about mHealth in Brazil; with the design of a security framework for MDCSs (SecourHealth); with the design of an MDCS (GeoHealth); with the design of a Privacy Impact Assessment template for MDCSs; and with the study of ontology-based obfuscation and anonymisation functions for health data.
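As a tiny illustration of the kind of anonymisation function studied for health data, the sketch below applies a generic generalization step to patient records. It is not the ontology-based obfuscation of the thesis; the fields and records are hypothetical.

```python
# Generic generalization step used in anonymisation: exact values are coarsened
# so that individual records become less identifying. Illustrative only.
def generalize_age(age, bucket=10):
    lo = (age // bucket) * bucket
    return f"{lo}-{lo + bucket - 1}"

def generalize_records(records):
    return [{**r, "age": generalize_age(r["age"]), "name": "*"} for r in records]

patients = [{"name": "A. Silva", "age": 34, "diagnosis": "asthma"},
            {"name": "B. Costa", "age": 37, "diagnosis": "asthma"}]
print(generalize_records(patients))   # ages become "30-39", names are suppressed
```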
APA, Harvard, Vancouver, ISO, and other styles
40

Balupari, Ravindra. "Real-time network-based anomaly intrusion detection." Ohio : Ohio University, 2002. http://www.ohiolink.edu/etd/view.cgi?ohiou1174579398.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Brasee, Kaleb. "Secure distributed single sign-on with two-factor authentication /." Connect to Online Resource-OhioLINK, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=toledo1195656733.

Full text
Abstract:
Thesis (M.S.)--University of Toledo, 2007.
Typescript. "Submitted as partial fulfillments of the requirements for the Master of Engineering with a concentration in Computer Science." "A thesis entitled"--at head of title. Bibliography: leaves 69-72.
APA, Harvard, Vancouver, ISO, and other styles
42

Lee, Kum-Yu Enid. "Privacy and security of an intelligent office form." Thesis, Kansas State University, 1986. http://hdl.handle.net/2097/9930.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Addimando, Alessio. "Progettazione e prototipazione di un sistema di Data Stream Processing basato su Apache Storm." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amslaurea.unibo.it/10977/.

Full text
Abstract:
With the advent of the Internet, the number of users with actual access to the network and the ability to share information with the whole world has grown steadily over the years. With the introduction of social media, moreover, users tend to transfer a large amount of personal information onto the web, making it available to companies. In addition, the world of the Internet of Things, in which sensors and machines act as agents on the network, means that each user owns a growing number of devices, directly connected to one another and to the global network. In proportion to these factors, the amount of data generated and stored is also increasing dramatically, giving rise to a new concept: Big Data. Consequently, there is a need for new tools that can exploit the computing power offered by today's more complex architectures, which bring together, under a single system, a set of hosts useful for analysis. In this respect, such a vast quantity of data, routine when speaking of Big Data, combined with equally high transmission and transfer speeds, makes storing the data difficult, especially if the storage techniques are traditional DBMSs. A classical relational solution would in fact only allow data to be processed on request, producing delays, significant latencies and the inevitable loss of fractions of the dataset. It is therefore necessary to turn to new technologies and tools suited to needs different from classical batch analysis. In particular, this thesis considers Data Stream Processing, designing and prototyping a system based on Apache Storm, with cyber security chosen as the field of application.
APA, Harvard, Vancouver, ISO, and other styles
44

Domingues, Steve. "Navigating between information security management documents : a modeling methodology." Thesis, Nelson Mandela Metropolitan University, 2010. http://hdl.handle.net/10948/1212.

Full text
Abstract:
Organizations no longer draft their own standards. Instead, organizations take advantage of the available international standards. One standard may not cover all of an organization's needs, requiring organizations to implement more than one standard. The same aspect in an organization may be covered by two or more standards, creating an overlap. An awareness of such overlaps led various institutions to create mapping documents illustrating how a control from one standard relates to a control from a different standard. The mapping documents are consulted by the end user to identify how a control in one standard may relate to other standards. This allows the end user to navigate between the standards documents. These mapping documents are valuable to a person who wishes to grasp how different standards deal with a specific control. However, the navigation between standards is a cumbersome task. In order to navigate between the standards, the end user is required to consult three or more documents, depending on the number of standards that are mapped to the control being investigated. The need for a tool that will provide fast and efficient navigation between standards was identified. The data tier of the tool is the focus of this dissertation. As a result, this research proposes a modeling methodology that allows for the modeling of the standards and of the information about the mapping between standards, thereby contributing to the creation of tools to aid in the navigation between standards. A comparison between the major data modeling paradigms identifies multi-dimensional modeling as the most appropriate technique to model standards. Adapting an existing modeling methodology to cater for the modeling of standards yields a five-step standards modeling methodology. Once modeled, the standards can be physically implemented as a database. The database schema that results from the standards modeling methodology adheres to a specific pattern and can thus be expressed according to a well-defined meta-model. This allows a tool with limited knowledge of the standards to generate SQL statements in a way that enables quick navigation between standards. To determine the usefulness of the standards modeling methodology, the research presents a prototype that utilizes the well-defined meta-model to navigate between standards. It is shown that, as far as navigation is concerned, no code changes are necessary when adding a new standard or new mappings between standards. This research contributes to the creation of a tool that can easily navigate between standards by providing the ability to model the data tier in such a way that it is extensible, yet remains independent of the application and presentation tiers.
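For illustration only, the sketch below shows how a navigation query over such a meta-model-shaped schema might be generated and executed from Java; the control_mapping table, its columns, the sample ISO/COBIT identifiers and the embedded H2 database are assumptions made for the example and are not the schema defined in the dissertation.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

/**
 * Hypothetical navigation helper: given a control in one standard, look up the
 * related controls in another standard via a generic mapping table. Because every
 * standard is stored according to the same meta-model, the same parameterised
 * query works for any pair of standards, so adding a standard needs no code change.
 */
public class StandardNavigator {

    private final Connection conn;

    public StandardNavigator(Connection conn) {
        this.conn = conn;
    }

    /** Prints the identifiers of controls in targetStandard mapped to the given control. */
    public void printRelatedControls(String sourceStandard, String controlId, String targetStandard) throws Exception {
        String sql = "SELECT m.target_control_id FROM control_mapping m "
                   + "WHERE m.source_standard = ? AND m.source_control_id = ? AND m.target_standard = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, sourceStandard);
            ps.setString(2, controlId);
            ps.setString(3, targetStandard);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(targetStandard + " -> " + rs.getString("target_control_id"));
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Embedded in-memory H2 database (assumed on the classpath) with one illustrative mapping.
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:standards")) {
            try (Statement st = conn.createStatement()) {
                st.execute("CREATE TABLE control_mapping (source_standard VARCHAR(64), source_control_id VARCHAR(64), "
                         + "target_standard VARCHAR(64), target_control_id VARCHAR(64))");
                st.execute("INSERT INTO control_mapping VALUES ('ISO27002', 'A.9.2.1', 'COBIT', 'DSS05.04')");
            }
            new StandardNavigator(conn).printRelatedControls("ISO27002", "A.9.2.1", "COBIT");
        }
    }
}
```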
APA, Harvard, Vancouver, ISO, and other styles
45

Lunemann, Carolin. "Quantum cryptography : security analysis of multiuser quantum communication with embedded authentication." Master's thesis, Universität Potsdam, 2006. http://opus.kobv.de/ubp/volltexte/2007/1275/.

Full text
Abstract:
Three quantum cryptographic protocols for multiuser quantum networks with embedded authentication, allowing quantum key distribution or quantum direct communication, are discussed in this work. The security of the protocols against different types of attacks is analysed, with a focus on various impersonation attacks and the man-in-the-middle attack. On the basis of the security analyses, several improvements are suggested and implemented in order to address the identified vulnerabilities. Furthermore, the impact of the eavesdropping test procedure on impersonation attacks is outlined. The framework of a general eavesdropping test is proposed to provide additional protection against security risks in impersonation attacks.
In this diploma thesis, three different quantum cryptographic protocols are analysed, with a focus on authenticated quantum networks. The security of the protocols against various attacks is examined, with particular attention to complete impersonation attacks. Based on the security analysis and the network requirements, corresponding improvements are proposed. In order to assess the risk of impersonation realistically, the influence of the test procedure is also analysed. To provide additional protection against impersonation attacks, the framework for a general test specification is defined.
APA, Harvard, Vancouver, ISO, and other styles
46

Perelson, Stephen. "SoDA : a model for the administration of separation of duty requirements in workflow systems." Thesis, Port Elizabeth Technikon, 2001. http://hdl.handle.net/10948/68.

Full text
Abstract:
The increasing reliance on information technology to support business processes has emphasised the need for information security mechanisms. This, however, has resulted in an ever-increasing workload in terms of security administration. Security administration encompasses the activity of ensuring the correct enforcement of access control within an organisation. Access rights and their allocation are dictated by the security policies within an organisation. As such, security administration can be seen as a policy-based approach. Policy-based approaches promise to lighten the workload of security administrators. Separation of duties is one of the principles cited as a criterion when setting up these policy-based mechanisms. Different types of separation of duty policies exist. They can be categorised into policies that can be enforced at administration time, viz. static separation of duty requirements, and policies that can be enforced only at execution time, viz. dynamic separation of duty requirements. This dissertation deals with the specification of both static and dynamic separation of duty requirements in role-based workflow environments. It proposes a model for the specification of separation of duty requirements, the expressions of which are based on set theory. The model focuses, furthermore, on the enforcement of static separation of duty. The enforcement of static separation of duty requirements is modelled in terms of invariant conditions. The invariant conditions specify restrictions upon the elements allowed in the sets representing access control requirements. The sets are themselves expressed as database tables within a relational database management system. Algorithms that stipulate how to verify the additions or deletions of elements within these sets can then be performed within the database management system. A prototype was developed in order to demonstrate the concepts of this model. This prototype helps demonstrate how the proposed model could function and illustrates its effectiveness.
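A minimal sketch of the underlying idea, static separation of duty as an invariant over sets of role assignments, is given below. In the dissertation these sets are stored as relational tables and the checks are performed within the DBMS; plain in-memory sets and all role names here are hypothetical simplifications, not the SoDA model itself.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/**
 * Illustrative static separation-of-duty invariant: no user may hold two roles
 * that are declared to conflict. An assignment is accepted only if the invariant
 * still holds after the addition.
 */
public class StaticSodInvariant {

    /** Unordered pair of conflicting roles. */
    record Conflict(String roleA, String roleB) {
        boolean violatedBy(Set<String> roles) {
            return roles.contains(roleA) && roles.contains(roleB);
        }
    }

    private final Map<String, Set<String>> userRoles = new HashMap<>();
    private final Set<Conflict> conflicts = new HashSet<>();

    void declareConflict(String roleA, String roleB) {
        conflicts.add(new Conflict(roleA, roleB));
    }

    /** Adds the assignment only if no declared conflict would be violated. */
    boolean assignRole(String user, String role) {
        Set<String> proposed = new HashSet<>(userRoles.getOrDefault(user, Set.of()));
        proposed.add(role);
        boolean ok = conflicts.stream().noneMatch(c -> c.violatedBy(proposed));
        if (ok) {
            userRoles.put(user, proposed);
        }
        return ok;
    }

    public static void main(String[] args) {
        StaticSodInvariant admin = new StaticSodInvariant();
        admin.declareConflict("submit_payment", "approve_payment");
        System.out.println(admin.assignRole("alice", "submit_payment"));   // true
        System.out.println(admin.assignRole("alice", "approve_payment"));  // false: would violate SoD
    }
}
```

In a database-backed implementation, the same check would be expressed as a query over the assignment and conflict tables and evaluated before committing the insert, which is the administration-time enforcement the abstract describes.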
APA, Harvard, Vancouver, ISO, and other styles
47

Wells, William Ward. "Information security program development." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2585.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Menzel, Michael. "Model-driven security in service-oriented architectures : leveraging security patterns to transform high-level security requirements to technical policies." Phd thesis, Universität Potsdam, 2011. http://opus.kobv.de/ubp/volltexte/2012/5905/.

Full text
Abstract:
Service-oriented Architectures (SOA) facilitate the provision and orchestration of business services to enable faster adaptation to changing business demands. Web Services provide a technical foundation to implement this paradigm on the basis of XML messaging. However, the enhanced flexibility of message-based systems comes along with new threats and risks. To face these issues, a variety of security mechanisms and approaches is supported by the Web Service specifications. The usage of these security mechanisms and protocols is configured by stating security requirements in security policies. However, security policy languages for SOA are complex, and policies are difficult to create due to the expressiveness of these languages. To facilitate and simplify the creation of security policies, this thesis presents a model-driven approach that enables the generation of complex security policies on the basis of simple security intentions. SOA architects can specify these intentions in system design models and are not required to deal with complex technical security concepts. The approach introduced in this thesis enables the enhancement of any system design modelling language (for example, FMC or BPMN) with security modelling elements. The syntax, semantics, and notation of these elements are defined by our security modelling language SecureSOA. The metamodel of this language provides extension points to enable the integration into system design modelling languages. In particular, this thesis demonstrates the enhancement of FMC block diagrams with SecureSOA. To enable the model-driven generation of security policies, a domain-independent policy model is introduced in this thesis. This model provides an abstraction layer for security policies. Mappings are used to perform the transformation from our model to security policy languages. However, expert knowledge is required to generate instances of this model on the basis of simple security intentions. Appropriate security mechanisms, protocols and options must be chosen and combined to fulfil these security intentions. In this thesis, a formalised system of security patterns is used to represent this knowledge and to enable an automated transformation process. Moreover, a domain-specific language is introduced to state security patterns in an accessible way. On the basis of this language, a system of security configuration patterns is provided to transform security intentions related to data protection and identity management. The formal semantics of the security pattern language enable the verification of the transformation process introduced in this thesis and prove the correctness of the pattern application. Finally, our SOA Security Lab is presented, which demonstrates the application of our model-driven approach to facilitate the dynamic creation, configuration, and execution of secure Web Service-based composed applications.
In the field of enterprise architectures, the paradigm of service-oriented architecture (SOA) has gained great importance in recent years. This approach enables the structuring and implementation of distributed, IT-based business functions in order to allow an efficient and flexible use of IT resources. Whereas in the past functional requirements were implemented in monolithic applications, this architectural approach relies on reusable services that implement specific business functions. These services can then be dynamically composed to realise business processes, enabling a fast reaction to changing business conditions by adapting the processes. The individual services exist independently of one another and are loosely coupled via message exchange. This independence distinguishes the SOA approach from the earlier development of classical distributed applications. The use of independent services, however, also entails a greater risk potential, since a large number of interfaces are exposed that can be accessed via complex protocols. The correct implementation of security mechanisms in all services and SOA infrastructure components is therefore essential. Communication partners must be authenticated and authorised at every communication endpoint, and exchanged messages must always be protected. Such security requirements are encoded in technical security configurations (policy documents) using a policy language and are distributed to the services that enforce them. Since policy languages for SOA are highly complex owing to the number and variety of security mechanisms, protocols and standards, security configurations are highly error-prone and require considerable expert knowledge to create. To simplify the generation of security configurations in complex systems, this thesis presents a model-driven approach that enables the visual modelling of security requirements in architecture models and supports the automated generation of security configurations based on these requirements. The modelling layer provides a simple and abstract representation of security requirements that is also accessible to system architects who are not security experts. For example, modelled data can simply be annotated with a lock to demand the protection of this data. The syntax, semantics and representation of these requirements are specified by the security modelling language SecureSOA presented in this thesis. The model-driven approach transforms the modelled requirements onto a domain-independent policy model, which forms an abstraction layer above concrete policy languages and simplifies the generation of security policies in different policy languages. However, this transformation can only take place if expert knowledge is available in the system that determines the selection of concrete security mechanisms and options. In this thesis, design patterns for SOA security are used for the transformation to represent this knowledge. To this end, a catalogue of design patterns is introduced that enables the mapping of abstract security requirements onto concrete configurations. These patterns are defined by means of a design pattern language introduced in this thesis. The formal semantics of this language enable the formal verification of the transformation process in order to prove the correctness of the pattern application. The definition of this design pattern catalogue and the transformation process based on it enable the mapping of abstract security requirements onto concrete technical security configurations and constitute the contribution of this thesis. Finally, the SOA Security Lab is presented, which demonstrates the implementation of this approach.
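As a toy illustration of the pattern-based transformation idea (not the thesis' actual pattern catalogue, pattern language or policy model), the sketch below resolves abstract security intentions into concrete, WS-SecurityPolicy-like assertion strings via a small registry; all names and assertion texts are assumptions made for the example.

```java
import java.util.ArrayList;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;

/**
 * Hypothetical sketch: abstract security intentions attached to a modelled
 * service are resolved, via a pattern registry representing expert knowledge,
 * into concrete policy assertions.
 */
public class IntentionToPolicy {

    enum Intention { CONFIDENTIALITY, INTEGRITY, AUTHENTICATION }

    // Pattern registry: which concrete assertion realises which abstract intention.
    private static final Map<Intention, String> PATTERNS = new EnumMap<>(Intention.class);
    static {
        PATTERNS.put(Intention.CONFIDENTIALITY, "sp:EncryptedParts { soap:Body }");
        PATTERNS.put(Intention.INTEGRITY, "sp:SignedParts { soap:Body }");
        PATTERNS.put(Intention.AUTHENTICATION, "sp:SupportingTokens { sp:UsernameToken }");
    }

    /** Transforms the intentions annotated on one modelled service into a flat list of assertions. */
    static List<String> transform(List<Intention> annotated) {
        List<String> assertions = new ArrayList<>();
        for (Intention intention : annotated) {
            assertions.add(PATTERNS.get(intention));
        }
        return assertions;
    }

    public static void main(String[] args) {
        // A service annotated with a "lock" (confidentiality) and an authentication intention.
        System.out.println(transform(List.of(Intention.CONFIDENTIALITY, Intention.AUTHENTICATION)));
    }
}
```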
APA, Harvard, Vancouver, ISO, and other styles
49

May, Brian 1975. "Scalable access control." Monash University, School of Computer Science and Software, 2001. http://arrow.monash.edu.au/hdl/1959.1/8043.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Neuhaus, Christian, Andreas Polze, and Mohammad M. R. Chowdhuryy. "Survey on healthcare IT systems : standards, regulations and security." Universität Potsdam, 2011. http://opus.kobv.de/ubp/volltexte/2011/5146/.

Full text
Abstract:
IT systems for healthcare are a complex and exciting field. On the one hand, there is a vast number of improvements and workload reductions that computers can bring to everyday healthcare. Some forms of treatment, diagnosis and organisational tasks were only made possible by computer use in the first place. On the other hand, there are many factors that encumber computer usage and make the development of IT systems for healthcare a challenging, sometimes even frustrating task. These factors are not solely technology-related, but also include social and economic conditions. This report describes some of the idiosyncrasies of IT systems in the healthcare domain, with a special focus on legal regulations, standards and security.
IT systems for medicine and healthcare are a complex and exciting field. On the one hand, there are numerous improvements and workload reductions that computers can contribute to everyday medical practice. Some treatments, diagnostic procedures and organisational tasks were only made possible by computers in the first place. On the other hand, there are many factors that complicate the use of computers in healthcare and can make the development of such systems a challenging, even frustrating task. These factors are not exclusively technical in nature, but also stem from social and economic conditions. This report describes some particularities of IT systems in healthcare, with a special focus on legal framework conditions, standards and security.
APA, Harvard, Vancouver, ISO, and other styles