
Dissertations / Theses on the topic 'Electronic data control'


Consult the top 50 dissertations / theses for your research on the topic 'Electronic data control.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

May, Brian, 1975. "Scalable access control." Monash University, School of Computer Science and Software, 2001. http://arrow.monash.edu.au/hdl/1959.1/8043.

2

Kulatunga, Chamil. "Enforcing receiver-driven multicast congestion control using ECN-Nonce." Thesis, Available from the University of Aberdeen Library and Historic Collections Digital Resources, 2009. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?application=DIGITOOL-3&owner=resourcediscovery&custom_att_2=simple_viewer&pid=33532.

3

Cline, George E. "A control framework for distributed (parallel) processing environments." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-12042009-020227/.

4

Snowdon, Jane Louise. "Workflow control for surges from a batch work station." Diss., Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/25100.

5

Zeng, Ningzhou. "REFORM: REFACTORIZED ELECTRONIC WEB FORMS - LARGE SCALE SURVEY DATA CAPTURE AND WORKFLOW CONTROL FRAMEWORK." Case Western Reserve University School of Graduate Studies / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=case1496839127238529.

6

Lööf, Sam. "Evaluation of Protocols for Transfer of Automotive Data from an Electronic Control Unit." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-176080.

Abstract:
Nowadays almost all motorized vehicles use electronic control units (ECUs) to control parts of a vehicle's function. A good way to understand a vehicle's behaviour is to analyse logging data containing ECU-internal variables. Data must then be transferred from the ECU to a computer in order to study such data. Today, Keyword Protocol (KWP) requests are used to read data from the ECUs at Scania. The method is not suitable if many signals should be logged at a higher transfer rate than the one used today. In this thesis, communication protocols that allow an ECU to communicate with a computer are studied. The purpose of this master's thesis is to examine how the transfer of variables from Scania's ECUs to a computer can be made faster than with the method used today, in order to log the variables more frequently. The method that was chosen was implemented, evaluated and compared to the method used today. The busload, total CPU load and CPU load at the frequency used during the experiments, 100 Hz, were also examined and evaluated. The experiments performed show that the chosen method, data acquisition (DAQ) with the CAN Calibration Protocol (CCP), increased the transfer rate of the internal ECU variables significantly compared to the method using KWP requests. The results also show that the number of signals has a major impact on the busload for DAQ. The busload is the parameter that limits the number of signals that can be logged. The total CPU load and the CPU load at 100 Hz are not affected significantly compared to when no transmissions are performed. Even though the busload can become high if many variables are used in DAQ, DAQ with CCP is preferable to KWP requests, owing to the large increase in the transfer rate of the ECU-internal variables and thus in the logging frequency.
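Lööf's central finding, that periodic DAQ transmissions use the CAN bus far more efficiently than per-signal KWP request/response polling, can be illustrated with a back-of-envelope bus-load estimate. The figures below (bus bitrate, frame overhead, logging rate, bytes per signal) are illustrative assumptions, not values from the thesis:

```python
# Rough illustration: bus load of DAQ-style packed frames versus
# request/response polling, for n signals logged at a fixed rate.
# Assumed numbers: 500 kbit/s CAN, 100 Hz logging, 11-bit identifiers,
# ~47 bits of per-frame overhead with 8 data bytes (bit stuffing ignored).

BUS_BITRATE = 500_000       # bits/s (assumption)
RATE_HZ = 100               # logging frequency (from the thesis experiments)
FRAME_BITS = 47 + 8 * 8     # overhead + 8 data bytes per CAN frame

def daq_busload(n_signals, bytes_per_signal=2):
    """DAQ: signals are packed into full frames sent once per cycle."""
    frames = -(-n_signals * bytes_per_signal // 8)   # ceiling division
    return frames * FRAME_BITS * RATE_HZ / BUS_BITRATE

def kwp_busload(n_signals):
    """Polling: one request frame plus one response frame per signal."""
    return 2 * n_signals * FRAME_BITS * RATE_HZ / BUS_BITRATE

for n in (10, 50, 100):
    print(n, round(daq_busload(n), 3), round(kwp_busload(n), 3))
```

Under these assumptions, ten two-byte signals at 100 Hz occupy only a small fraction of the bus with DAQ packing, while request/response polling quickly saturates it, which matches the abstract's observation that busload is what limits the number of loggable signals.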
7

Sheth, Amit Pravin. "Adaptive concurrency control for distributed database systems /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487262513408523.

8

Nielsen, Niels Bech. "Using electronic voting systems data outside lectures to support learning." Connect to e-thesis. Move to record for print version, 2007. http://theses.gla.ac.uk/46/.

Abstract:
Thesis (MSc. (R)) - University of Glasgow, 2007.
MSc. (R) thesis submitted to the Department of Computing Science, Faculty of Information and Mathematical Sciences, University of Glasgow, 2007. Includes bibliographical references.
9

Cheung, Shun Yan. "Optimizing the performance of quorum consensus replica control protocols." Diss., Georgia Institute of Technology, 1990. http://hdl.handle.net/1853/8150.

10

Burdis, Keith Robert. "Distributed authentication for resource control." Thesis, Rhodes University, 2000. http://hdl.handle.net/10962/d1006512.

Abstract:
This thesis examines distributed authentication in the process of controlling computing resources. We investigate user sign-on and two of the main authentication technologies that can be used to control a resource through authentication and providing additional security services. The problems with the existing sign-on scenario are that users have too much credential information to manage and are prompted for this information too often. Single Sign-On (SSO) is a viable solution to this problem if physical procedures are introduced to minimise the risks associated with its use. The Generic Security Services API (GSS-API) provides security services in a manner independent of the environment in which these security services are used, encapsulating security functionality and insulating users from changes in security technology. The underlying security functionality is provided by GSS-API mechanisms. We developed the Secure Remote Password GSS-API Mechanism (SRPGM) to provide a mechanism that has low infrastructure requirements, is password-based and does not require the use of long-term asymmetric keys. We provide implementations of the Java GSS-API bindings and the LIPKEY and SRPGM GSS-API mechanisms. The Simple Authentication and Security Layer (SASL) provides security to connection-based Internet protocols. After finding deficiencies in existing SASL mechanisms we developed the Secure Remote Password SASL mechanism (SRP-SASL), which provides strong password-based authentication and countermeasures against known attacks, while still being simple and easy to implement. We provide implementations of the Java SASL binding and several SASL mechanisms, including SRP-SASL.
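The password-verifier idea underlying both SRPGM and SRP-SASL can be sketched briefly: the server stores a one-way verifier derived from the password, never the password itself. This is a toy illustration of the Secure Remote Password (SRP) registration step, not code from the thesis; the group parameters are deliberately tiny (real SRP uses a large safe prime, e.g. the RFC 5054 groups):

```python
# Toy sketch of SRP's verifier computation: v = g^x mod N, where x is a
# hash of the salt and the user's credentials. A stolen verifier does not
# directly reveal the password.
import hashlib

N = 2267   # toy prime (assumption; far too small for any real use)
g = 2      # generator

def private_key(salt: bytes, username: str, password: str) -> int:
    # x = H(salt || H(username ":" password)), reduced into the toy group
    inner = hashlib.sha256(f"{username}:{password}".encode()).digest()
    x = hashlib.sha256(salt + inner).digest()
    return int.from_bytes(x, "big") % N

def verifier(salt: bytes, username: str, password: str) -> int:
    return pow(g, private_key(salt, username, password), N)

salt = b"\x01\x02\x03\x04"
v = verifier(salt, "alice", "correct horse")
print(v)   # the server stores only (salt, v)
```

The "low infrastructure requirements" claimed for SRPGM follow from this design: no certificates or long-term asymmetric keys are needed, only the stored verifier.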
11

Haasbroek, Adriaan Lodewicus. "Advanced control with semi-empirical and data based modelling for falling film evaporators." Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/80196.

Abstract:
Thesis (MSc)--Stellenbosch University, 2013.
ENGLISH ABSTRACT: This work focussed on a local multiple chamber falling film evaporator (FFE). The FFE is currently under operator control and experiencing large amounts of lost production time due to excessive fouling. Furthermore, the product milk dry mass fraction (WP) is constantly off specification, negatively influencing product quality, while the first effect temperature (TE1) runs higher than the recommended 70°C (a main cause of fouling). A two-month period of historical data was received, with the aim of developing a controller that could outperform the operators by keeping both control variables, WP and TE1, at desired set points while also increasing throughput and maintaining product quality. Access to the local plant was not possible, and as such the available process data were cleaned and used to identify two data-based models, transfer function and autoregressive with exogenous inputs (ARX) models, as well as a semi-empirical model. The ARX model proved inadequate to predict TE1 trends, with an average TE1 correlation to historical data of 0.36, compared to 0.59 and 0.74 for the transfer function and semi-empirical models respectively. Product dry mass correlations were similar between the models, with average correlations of 0.47, 0.53 and 0.51 for the semi-empirical, transfer function and ARX models respectively. Although the semi-empirical model showed the lowest WP correlation, this was offset by its TE1 prediction advantage. The semi-empirical model was therefore selected for controller development and comparisons. The success of the semi-empirical model was in accordance with previous research [1] [2] [3], although other studies have concluded that ARX modelling was better suited to FFE modelling [4].
Three controllers were developed, namely: a proportional and integral (PI) controller as base case, a linear quadratic regulator (LQR) as an optimal state space alternative and finally, to make full use of process knowledge, a predictive fuzzy logic controller (PFC). The PI controller was able to offer zero-offset set point tracking, but could not adequately reject a feed dry mass (WF) disturbance (as proposed and reported by Winchester [5]). The LQR was combined with a Kalman estimator and used pre-delay states. In order to offer increased disturbance rejection, the feedback gains of the disturbance states were tuned individually. The altered LQR and PFC solutions proved to adequately reject all modelled disturbances and outperform a cascade controller designed by Bakker [6]. The maximum deviation in WP was a fractional increase of 0.007 for the LQR and 0.005 for the PFC, compared to 0.012 for the PI and 0.0075 for the cascade controller [6] (for a WF disturbance fractional increase of 0.01). All the designed controllers managed to reduce the standard deviation of operator-controlled WP and TE1 by at least 700% and 450%, respectively. The same level of reduction was seen for maximum control variable deviations (370%), the integral of the absolute error (300%) and the mean squared error (900%). All these performance metrics point to the controllers performing better than the operator-based control. In order to prevent manipulated variable saturation and optimise the feed flow rate (F1), a fuzzy feed optimiser (FFO) was developed. The FFO focussed on maximising the available evaporative capacity of the FFE by optimising the motive steam pressure (PS), which supplied heat to the effects. By using the FFO with each controller, the average feed flow rate was increased by 4.8% (±500 kg/h) compared to operator control. In addition to the flow rate gain, the controllers kept TE1 below 70°C and WP on specification.
As such, the overall product quality increased, and down time decreased due to reduced fouling.
12

Chiu, Lin. "A methodology for designing concurrency control schemes in distributed databases /." The Ohio State University, 1987. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487584612163117.

13

Viljoen, Melanie. "A framework towards effective control in information security governance." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/887.

Abstract:
The importance of information in business today has made the need to properly secure this asset evident. Information security has become a responsibility for all managers of an organization. To better support more efficient management of information security, timely information security management information should be made available to all managers. Smaller organizations face special challenges with regard to information security management and reporting due to limited resources (Ross, 2008). This dissertation discusses a Framework for Information Security Management Information (FISMI) that aims to improve the visibility and contribute to better management of information security throughout an organization by enabling the provision of summarized, comprehensive information security management information to all managers in an affordable manner.
14

Huang, Shiping. "Exploratory visualization of data with variable quality." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-01115-225546/.

15

Wang, Mianyu Kam Moshe Kandasamy Nagarajan. "A decentralized control and optimization framework for autonomic performance management of web-server systems /." Philadelphia, Pa. : Drexel University, 2007. http://hdl.handle.net/1860/2643.

16

Goss, Ryan Gavin. "Enabling e-learning 2.0 in information security education: a semantic web approach." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/909.

Abstract:
The motivation for this study argued that current information security education systems are inadequate for educating all users of computer systems worldwide in acting securely during their operations with information systems. There is, therefore, a pervasive need for information security knowledge in all aspects of modern life. E-Learning 2.0 could possibly contribute to solving this problem; however, little or no knowledge currently exists regarding the suitability and practicality of using such systems to impart information security knowledge to learners.
17

Singhal, Anoop. "The design and analysis of concurrency control algorithms in different network environments /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487262513408844.

18

Drummond, John. "Specifying quality of service for distributed systems based upon behavior models." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2002. http://library.nps.navy.mil/uhtbin/hyperion-image/02Jun%5FDrummond%5FPhD.pdf.

Abstract:
Thesis (Ph. D.)--Naval Postgraduate School, 2002.
Thesis advisor(s): Valdes Berzins, Luqi, William Kemple, Mikhail Auguston, Nabendu Chaki. Includes bibliographical references (p. 231-240). Also available online.
19

Benson, Glenn Stuart. "A formal protection model of security in distributed systems." Diss., Georgia Institute of Technology, 1989. http://hdl.handle.net/1853/12238.

20

Estlund, Mark J. "A survey and analysis of access control architectures for XML data." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Mar%5FEstlund.pdf.

Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, March 2006.
Thesis Advisor(s): Cynthia E. Irvine, Timothy E. Levin. "March 2006." Includes bibliographical references (p. 43-45). Also available online.
21

Lakshmanan, Nithya M. "Estimation and control of nonlinear batch processes using multiple linear models." Thesis, Georgia Institute of Technology, 1997. http://hdl.handle.net/1853/11835.

22

Doubleday, Kevin. "Generation of Individualized Treatment Decision Tree Algorithm with Application to Randomized Control Trials and Electronic Medical Record Data." Thesis, The University of Arizona, 2016. http://hdl.handle.net/10150/613559.

Abstract:
With new treatments and novel technology available, personalized medicine has become a key topic in the new era of healthcare. Traditional statistical methods for personalized medicine and subgroup identification primarily focus on single-treatment or two-arm randomized control trials (RCTs). With restricted inclusion and exclusion criteria, data from RCTs may not reflect real-world treatment effectiveness. Electronic medical records (EMR), however, offer an alternative venue. In this paper, we propose a general framework for identifying individualized treatment rules (ITRs), which connects subgroup identification methods and ITRs. It is applicable to both RCT and EMR data. Given the large scale of EMR datasets, we develop a recursive partitioning algorithm to solve the problem (ITR-Tree). A variable importance measure for personalized medicine is also developed using random forest. We demonstrate our method through simulations, and apply ITR-Tree to datasets from diabetes studies using both RCT and EMR data. A software package is available at https://github.com/jinjinzhou/ITR.Tree.
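The value-maximising split search at the core of a recursive-partitioning ITR method can be sketched as follows. This is a minimal illustration under assumed data (one covariate, a binary treatment), not the authors' ITR-Tree implementation:

```python
# Sketch of one level of an ITR-style recursive partition: choose the cut
# point that maximises the "value" obtained by giving each resulting
# subgroup whichever treatment has the better observed mean outcome.
import numpy as np

def node_value(y, a):
    # value of a node: mean outcome if everyone in it receives the
    # treatment that looks best within the node
    return max(y[a == t].mean() for t in np.unique(a))

def best_split(x, y, a):
    best_cut, best_val = None, node_value(y, a)   # no-split baseline
    for cut in sorted(set(x))[:-1]:
        left = x <= cut
        # value of the split = size-weighted value of the two children
        val = (left.mean() * node_value(y[left], a[left])
               + (~left).mean() * node_value(y[~left], a[~left]))
        if val > best_val:
            best_cut, best_val = cut, val
    return best_cut, best_val

# hypothetical data: treatment 1 only helps patients with x > 0.5
x = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9])
a = np.array([0, 1, 0, 1, 0, 1, 0, 1])     # assigned treatment
y = np.array([2, 1, 2, 1, 1, 2, 1, 2])     # observed outcome
print(best_split(x, y, a))   # recovers the cut at 0.4
```

Applying this search recursively to each child node, with a stopping rule, yields a treatment decision tree of the kind the abstract describes.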
23

Ogidan, Olugbenga Kayode. "Design of nonlinear networked control for wastewater distributed systems." Thesis, Cape Peninsula University of Technology, 2014. http://hdl.handle.net/20.500.11838/1201.

Abstract:
Thesis submitted in fulfilment of the requirements for the degree Doctor of Technology: Electrical Engineering in the Faculty of Engineering at the Cape Peninsula University of Technology 2014
This thesis focuses on the design, development and real-time simulation of a robust nonlinear networked control for the dissolved oxygen (DO) concentration as part of wastewater distributed systems. This concept differs from previous methods of wastewater control in that the controller and the wastewater treatment plant are separated by a wide geographical distance and exchange data through a communication medium. The communication network introduced between the controller and the DO process creates imperfections during its operation, such as time delays, which are an object of investigation in this thesis. Because of these network imperfections, new control strategies that take cognisance of them during controller design are needed to provide adequate robustness for the DO process control system. This thesis first investigates the effects of constant and random network-induced time delays, and of controller parameters, on the DO process behaviour, with a view to using the information obtained to design an appropriate controller for the networked closed-loop system. On the basis of this information, a Smith predictor delay compensation controller is developed to eliminate the deadtime, provide robustness and improve the performance of the DO process. Two approaches are adopted in the design of the Smith predictor compensation scheme. The first is the transfer function approach, which allows a linearized model of the DO process to be described in the frequency domain. The second is the nonlinear linearising approach in the time domain. Simulation results reveal that the developed Smith predictor controllers out-performed the nonlinear linearising controller designed for the DO process without time delays, compensating for the network imperfections and maintaining the DO concentration within a desired acceptable level.
The transfer function approach to designing the Smith predictor is found to perform better under small time delays, but its performance deteriorates under large time delays and disturbances. It is also found to respond faster than the nonlinear approach. The nonlinear feedback linearising approach is slower in response time, but out-performs the transfer function approach in providing robustness and performance for the DO process under large time delays and disturbances. The developed Smith predictor compensation schemes were later simulated on a real-time platform using LabVIEW. The Smith predictor controllers developed in this thesis can be applied to process control plants other than wastewater plants, wherever distributed control is required. They can also be applied in nuclear reactor plants where remote control is required under hazardous conditions. The developed LabVIEW real-time simulation environment would be a valuable tool for researchers and students in the field of control system engineering. Lastly, this thesis forms a basis for further research in the field of distributed wastewater control.
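The Smith predictor structure described here, in which the controller is fed a delay-free model prediction plus a correction from the delayed plant/model mismatch, can be sketched for an assumed first-order process. The model, delay and PI gains below are illustrative toy values, not the thesis's DO dynamics:

```python
# Sketch of a discrete-time Smith predictor around an assumed first-order
# plant y[k+1] = a*y[k] + b*u[k], whose measurement reaches the controller
# only after a pure transport delay of DELAY samples.
from collections import deque

a_p, b_p = 0.9, 0.1
DELAY = 20                  # network delay in samples (assumption)
kp, ki = 2.0, 0.1           # PI gains, hand-tuned for this toy model

y = ym = 0.0                # true plant state and delay-free model state
y_hist = deque([0.0] * DELAY, maxlen=DELAY)    # delayed plant measurements
ym_hist = deque([0.0] * DELAY, maxlen=DELAY)   # matching delayed model output
integ, setpoint = 0.0, 1.0

for _ in range(300):
    # Smith predictor feedback: delay-free model output, corrected by the
    # mismatch between delayed measurement and delayed model output
    fb = ym + (y_hist[0] - ym_hist[0])
    e = setpoint - fb
    integ += ki * e
    u = kp * e + integ
    y = a_p * y + b_p * u           # plant update
    ym = a_p * ym + b_p * u         # model update
    y_hist.append(y)                # oldest entry is what the controller sees
    ym_hist.append(ym)

print(round(y, 3))   # settles at the setpoint despite the 20-sample delay
```

With a perfect model the correction term vanishes and the loop behaves as if there were no delay, which is exactly the deadtime-elimination property the abstract attributes to the Smith predictor; model mismatch is what degrades it under large delays and disturbances.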
24

Comerford, Michael. "Statistical disclosure control : an interdisciplinary approach to the problem of balancing privacy risk and data utility." Thesis, University of Glasgow, 2014. http://theses.gla.ac.uk/7044/.

Abstract:
The recent increase in the availability of data sources for research has put significant strain on existing data-management workflows, especially in the field of statistical disclosure control. New statistical methods for disclosure control are frequently set out in the literature; however, few of these methods become functional implementations for data owners to utilise. Current workflows often produce inconsistent results dependent on ad hoc approaches, and bottlenecks can form around statistical disclosure control checks, preventing research from progressing. These problems contribute to a lack of trust between researchers and data owners, and to the under-utilisation of data sources. This research is an interdisciplinary exploration of the existing methods. It hypothesises that algorithms which invoke a range of statistical disclosure control methods (recoding, suppression, noise addition and synthetic data generation) in a semi-automatic way will enable data owners to release data with a higher level of data utility, without any increase in disclosure risk, when compared to existing methods. These semi-automatic techniques are applied in the context of secure data linkage in the e-Health sphere through projects such as DAMES and SHIP. This thesis sets out a theoretical framework for statistical disclosure control and draws on qualitative data from data owners, researchers, and analysts. With these contextual frames in place, the existing literature and methods were reviewed, and a tool-set for implementing k-anonymity and a range of disclosure control methods was created. This tool-set is demonstrated in a standard workflow, and it is shown how it could be integrated into existing e-Science projects and governmental settings.
Compared with existing workflows within the Scottish Government and NHS Scotland, this approach allows data owners to process queries from data users in a semi-automatic way and thus provides an enhanced user experience. This utility derives from the consistency and replicability of the approach, combined with the increase in the speed of query processing.
25

O'Brien, Marita A. "Effects of Shape, Letter Arrangements, and Practice on Text Entry on a Virtual Keyboard." Thesis, Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11499.

Abstract:
This research study examined the design of a virtual keyboard that can be used for text entry with a rotary controller, particularly when users may differ in age and experience with a particular system. I specifically examined the shape and letter arrangement on the virtual keyboard to help determine the best features to use in a design. Two keyboard shapes, an Oval and a Plus, were selected to represent different aspects of the shape. Two keyboard arrangements, Alphabetic and a Standard QWERTY-based ordering, were selected to represent a well-known and less familiar arrangement. In the experiment, older and younger adults entered words over two consecutive days. Most of the time, they used either the Oval or the Plus, but they also used the alternate shape at specific points during their practice session to allow assessment of their ability to transfer what they had learned. At the end of the second day, they also used a variation of the practiced arrangement to examine how well they had learned the letter arrangement. Text entry performance on both shapes improved as a function of practice, demonstrating that participants could learn even unfamiliar devices and virtual keyboards to complete a word entry task. No overall shape effects were found for any level of performance, but shape did affect how participants learned and performed the word entry task. In particular, unique visual features on a shape may facilitate memorization of letter/visual cue mappings. These shape features are particularly important for older adults, as younger adults seem to develop a mental model that helps them memorize letter locations on either shape. With practice, older adults could achieve optimal performance levels with an Alphabetic keyboard on the Plus shape that has the more visually unique corners. In general, alphabetic ordering is best not only because it helped visual search, but also because it facilitated better movement planning. 
Overall, designers should consider creating unique visual features on a virtual keyboard that fit the movements allowed by the selected input device, in order to produce an effective virtual keyboard.
26

Christelis, Christian. "Volumetric data throughput optimisation by dynamic FEC bearing frame length adaptation." Thesis, Stellenbosch : University of Stellenbosch, 2010. http://hdl.handle.net/10019.1/4337.

Abstract:
Thesis (MScEng (Electrical and Electronic Engineering))--University of Stellenbosch, 2010.
ENGLISH ABSTRACT: The telecommunications link between a LEO satellite and a rural ground station with a non-tracking antenna has a strongly varying link quality and a short communications window. The satellite acts as a store-and-forward node between ground stations. The TC-SDLP and an FTP protocol form a shallow protocol stack, which excludes unneeded protocol functionality and the resulting overhead. Coding gain, introduced by BCH FEC in the TC-SDLP, allows for link quality improvement. The core of this thesis is an improvement of the TC-SDLP to maximise effective payload data throughput, or goodput. This improvement was achieved by creating an optimal segment length selection metric based on the BER. Since the BER is not determinable from within the TC-SDLP, the metric was determined twice: once based on the FER and finally based on time delays. The work includes an extensive background study covering space standardisation, orbital physics, error detection and correction, space data-link protocols and data throughput, culminating in the protocol stack design. The project-specific link budget calculation is presented. The optimal segment length policy was mathematically determined. A simulation model of the TC-SDLP was used as a proof of concept for the effective throughput and gives a performance benchmark. Finally, a TC-SDLP implementation offers a real-world performance demonstration.
AFRIKAANSE OPSOMMING: Die telekommunikasie skakel tussen ’n lae aardomwenteling (LEO) sateliet en ’n plattelandse grondstasie met ’n nie-volg antenna, het ’n skakelkwaliteit wat in ’n groot mate varieer en ’n kort kommunikasievenster. Die sateliet tree op as ’n stoor-en-aanstuur node tussen grondstasies. Die TC-SDLP en ’n lêer-oordrag protokol (FTP) vorm ’n vlak protokol stapel, wat onnodige protokol funksionaliteit en die gevolglike opkoste uitsluit. Kode aanwins, wat deur die BCH FEC in die TC-SDLP aangebring word, verbeter die skakelkwaliteit. Die kern van hierdie tesis is ’n verbetering van die TC-SDLP om sodoende die ware deurvoer van nuttige vragdata te maksimeer. Hierdie verbetering is bereik deur die skep van ’n optimale segmentlengte-seleksie metode gebaseer op die bit fout tempo (BER). Aangesien die BER nie bepaal kan word vanuit die TC-SDLP nie, is die maatstaf twee keer bepaal; die eerste keer is die bepaling gebaseer op die raamwerk fout tempo (FER) en die finale bepaling op tyd vertragings. Die tesis sluit ’n omvattende agtergrondstudie in, wat bestaan uit ruimte standardisering, wentelbaan fisika, die opspoor en regstel van foute, ruimte inligtingskakel protokol en deurstuur van data wat uitloop op die protokol ontwerp. Daar word aangedui hoe die berekening van die begroting vir die skakel van toepassing op die spesifieke projek, gedoen is. ’n Wiskundige analise van die optimale segmentlengte is ook gedoen. ’n Simulasie model van die TC-SDLP is gebruik as ’n bewys van die konsep vir die ware deurset en gee ’n prestasie maatstaf. Laastens bied die TC-SDLP implementering ’n ware wêreld prestasie demonstrasie.
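The segment-length trade-off described in the abstract above (longer frames amortise header and FEC overhead, but are more likely to be corrupted at a given BER) can be sketched with a simple goodput model. This is an illustrative reconstruction, not the thesis's actual metric; the overhead and BER figures are assumed values:

```python
def goodput(frame_len, overhead, ber):
    """Expected useful-throughput fraction for a frame of frame_len bits,
    of which `overhead` bits are header/FEC: the payload fraction times
    the probability that every bit of the frame arrives intact."""
    payload_fraction = (frame_len - overhead) / frame_len
    p_frame_ok = (1.0 - ber) ** frame_len
    return payload_fraction * p_frame_ok

def best_frame_length(overhead, ber, candidates):
    """Pick the candidate frame length maximising expected goodput."""
    return max(candidates, key=lambda n: goodput(n, overhead, ber))

lengths = [256, 512, 1024, 2048, 4096]
# A noisy link favours short frames; a clean link favours long ones.
print(best_frame_length(overhead=64, ber=1e-3, candidates=lengths))  # 256
print(best_frame_length(overhead=64, ber=1e-5, candidates=lengths))  # 2048
```

The thesis estimates the BER indirectly (via the FER and time delays) because it is not observable inside the TC-SDLP; the same selection rule applies once any such estimate is available.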
APA, Harvard, Vancouver, ISO, and other styles
27

Miles, Shaun Graeme. "An investigation of issues of privacy, anonymity and multi-factor authentication in an open environment." Thesis, Rhodes University, 2012. http://hdl.handle.net/10962/d1006653.

Full text
Abstract:
This thesis performs an investigation into issues concerning the broad area of Identity and Access Management, with a focus on open environments. Through literature research the issues of privacy, anonymity and access control are identified. The issue of privacy is an inherent problem due to the nature of the digital network environment. Information can be duplicated and modified regardless of the wishes and intentions of the owner of that information unless proper measures are taken to secure the environment. Once information is published or divulged on the network, there is very little way of controlling the subsequent usage of that information. To address this issue a model for privacy is presented that follows the user-centric paradigm of meta-identity. The lack of anonymity, where security measures can be thwarted through the observation of the environment, is a concern for users and systems. By observing the communication channel and monitoring the interactions between users and systems over a long enough period of time, an attacker can infer knowledge about the users and systems. This knowledge is used to build an identity profile of potential victims to be used in subsequent attacks. To address the problem, mechanisms for providing an acceptable level of anonymity while maintaining adequate accountability (from a legal standpoint) are explored. In terms of access control, the inherent weakness of single-factor authentication mechanisms is discussed. The typical mechanism is the username and password pair, which provides a single point of failure. By increasing the factors used in authentication, the amount of work required to compromise the system increases non-linearly. Within an open network, several aspects hinder wide-scale adoption and use of multi-factor authentication schemes, such as token management and the impact on usability.
The framework is developed from a Utopian point of view, with the aim of being applicable to many situations as opposed to a single specific domain. The framework incorporates multi-factor authentication over multiple paths using mobile phones and GSM networks, and explores the usefulness of such an approach. The models are in turn analysed, providing a discussion of the assumptions made and the problems faced by each model.
APA, Harvard, Vancouver, ISO, and other styles
28

Perelson, Stephen. "SoDA : a model for the administration of separation of duty requirements in workflow systems." Thesis, Port Elizabeth Technikon, 2001. http://hdl.handle.net/10948/68.

Full text
Abstract:
The increasing reliance on information technology to support business processes has emphasised the need for information security mechanisms. This, however, has resulted in an ever-increasing workload in terms of security administration. Security administration encompasses the activity of ensuring the correct enforcement of access control within an organisation. Access rights and their allocation are dictated by the security policies within an organisation. As such, security administration can be seen as a policy-based approach. Policy-based approaches promise to lighten the workload of security administrators. Separation of duties is one of the principles cited as a criterion when setting up these policy-based mechanisms. Different types of separation of duty policies exist. They can be categorised into policies that can be enforced at administration time, viz. static separation of duty requirements, and policies that can be enforced only at execution time, viz. dynamic separation of duty requirements. This dissertation deals with the specification of both static separation of duty requirements and dynamic separation of duty requirements in role-based workflow environments. It proposes a model for the specification of separation of duty requirements, the expressions of which are based on set theory. The model focuses, furthermore, on the enforcement of static separation of duty. The enforcement of static separation of duty requirements is modelled in terms of invariant conditions. The invariant conditions specify restrictions upon the elements allowed in the sets representing access control requirements. The sets are themselves expressed as database tables within a relational database management system. Algorithms that stipulate how to verify the additions or deletions of elements within these sets can then be performed within the database management system. A prototype was developed in order to demonstrate the concepts of this model.
This prototype shows how the proposed model could function and demonstrates its effectiveness.
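A minimal sketch of the invariant-style enforcement described above (hypothetical role names, with a plain in-memory table standing in for the relational tables; the dissertation's actual set-theoretic expressions are richer):

```python
# Static separation of duty: refuse a user-role assignment when it would
# leave one user holding two roles declared mutually exclusive.
conflicting_pairs = {frozenset({"order_clerk", "payment_approver"})}

user_roles = {"alice": {"order_clerk"}, "bob": {"payment_approver"}}

def may_assign(user, role):
    """Invariant: no user may hold both roles of any conflicting pair."""
    held = user_roles.get(user, set())
    return all(frozenset({role, other}) not in conflicting_pairs
               for other in held)

def assign(user, role):
    if not may_assign(user, role):
        raise ValueError(f"SoD violation: {user} cannot take {role}")
    user_roles.setdefault(user, set()).add(role)

assign("alice", "auditor")                   # allowed: no conflict
print(may_assign("bob", "order_clerk"))      # False: bob approves payments
```

In the dissertation this kind of check runs inside the database management system, so the invariant is verified on every insertion into or deletion from the access-control tables rather than in application code.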
APA, Harvard, Vancouver, ISO, and other styles
29

Weston, Mindy. "The Right to Be Forgotten: Analyzing Conflicts Between Free Expression and Privacy Rights." BYU ScholarsArchive, 2017. https://scholarsarchive.byu.edu/etd/6453.

Full text
Abstract:
As modern technology continues to affect civilization, the issue of electronic rights grows in a global conversation. The right to be forgotten is a data protection regulation specific to the European Union, but its consequences are creating an international stir in the fields of mass communication and law. Freedom of expression and privacy rights are both founding values of the United States which are protected by constitutional amendments written before the internet also changed those fields. In a study that analyzes the legal process of when these two fundamental values collide, this research offers insight into both personal and judicial views of informational priority. This thesis conducts a legal analysis of cases that cite the infamous precedents of Melvin v. Reid and Sidis v. F-R Pub. Corp., to examine the factors on which U.S. courts of law determine whether freedom or privacy rules.
APA, Harvard, Vancouver, ISO, and other styles
30

Miller, Alan Henry David. "Best effort measurement based congestion control." Thesis, Connect to e-thesis, 2001. http://theses.gla.ac.uk/1015/.

Full text
Abstract:
Thesis (Ph.D.) -- University of Glasgow, 2001.
Includes bibliographical references (p. i-xv). Print version also available. Mode of access: World Wide Web. System requirements: Adobe Acrobat Reader required to view PDF document.
APA, Harvard, Vancouver, ISO, and other styles
31

He, Yijun, and 何毅俊. "Protecting security in cloud and distributed environments." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B49617631.

Full text
Abstract:
Encryption helps to ensure that information within a session is not compromised. Authentication and access control measures ensure legitimate and appropriate access to information, and prevent inappropriate access to such resources. While encryption, authentication and access control each has its own responsibility in securing a communication session, a combination of these three mechanisms can provide much better protection for information. This thesis addresses encryption, authentication and access control related problems in cloud and distributed environments, since these problems are very common in modern organizational environments. The first one is a User-friendly Location-free Encryption System for Mobile Users (UFLE). It is an encryption and authentication system which provides maximum security to sensitive data in distributed environments: corporate, home and outdoors scenarios, but requires minimum user effort (i.e. no biometric entry, or possession of cryptographic tokens) to access the data. It lets users securely and easily access data at any time and in any place, and avoids data breaches due to stolen or lost laptops and USB flash drives. The multi-factor authentication protocol provided in this scheme is also applicable to cloud storage. The second one is a Simple Privacy-Preserving Identity-Management for Cloud Environment (SPICE). It is the first digital identity management system that can satisfy “unlinkability” and “delegatable authentication” in addition to other desirable properties in the cloud environment. Unlinkability ensures that none of the cloud service providers (CSPs), even if they collude, can link the transactions of the same user.
On the other hand, delegatable authentication is unique to the cloud platform, in which several CSPs may join together to provide a packaged service, with one of them being the source provider which interacts with the clients and performs authentication, while the others are receiving CSPs which will be transparent to the clients. The authentication should be delegatable such that the receiving CSP can authenticate a user without a direct communication with either the user or the registrar, and without fully trusting the source CSP. The third one addresses re-encryption based access control issue in cloud and distributed storage. We propose the first non-transferable proxy re-encryption scheme [16] which successfully achieves the non-transferable property. Proxy re-encryption allows a third-party (the proxy) to re-encrypt a ciphertext which has been encrypted for one party without seeing the underlying plaintext so that it can be decrypted by another. A proxy re-encryption scheme is said to be non-transferable if the proxy and a set of colluding delegatees cannot re-delegate decryption rights to other parties. The scheme can be utilized for a content owner to delegate content decryption rights to users in the untrusted cloud storage. The advantages of using such scheme are: decryption keys are managed by the content owner, and plaintext is always hidden from cloud provider.
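To make the proxy re-encryption idea concrete, here is a toy sketch of the classic ElGamal-based construction (in the style of Blaze, Bleumer and Strauss) over a deliberately tiny group. It illustrates only the basic, transferable primitive, not the thesis's non-transferable scheme, and the parameters are far too small for real use:

```python
import random

# Toy Schnorr group: p = 2q + 1, g generates the order-q subgroup.
p, q, g = 467, 233, 4

def keygen():
    a = random.randrange(1, q)
    return a, pow(g, a, p)            # (secret key, public key)

def encrypt(m, pk):
    r = random.randrange(1, q)
    return (m * pow(g, r, p) % p,     # m * g^r
            pow(pk, r, p))            # g^(a*r)

def rekey(a, b):
    """Re-encryption key a -> b (derived here from both secrets)."""
    return b * pow(a, -1, q) % q

def reencrypt(ct, rk):
    c1, c2 = ct
    return (c1, pow(c2, rk, p))       # turns g^(a*r) into g^(b*r)

def decrypt(ct, sk):
    c1, c2 = ct
    gr = pow(c2, pow(sk, -1, q), p)   # recover g^r
    return c1 * pow(gr, -1, p) % p

a, pk_a = keygen()
b, pk_b = keygen()
ct = encrypt(42, pk_a)
ct_b = reencrypt(ct, rekey(a, b))     # the proxy never sees 42
assert decrypt(ct_b, b) == 42
```

The non-transferable property the thesis claims is precisely what this toy scheme lacks: here the proxy and a colluding delegatee can combine their material to re-delegate decryption rights, which the thesis's construction is designed to prevent.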
published_or_final_version
Computer Science
Doctoral
Doctor of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
32

Van, Schalkwyk Dirko. "Distributed real-time processing using GNU/Linux/libré software and COTS hardware." Thesis, Stellenbosch : Stellenbosch University, 2004. http://hdl.handle.net/10019.1/49933.

Full text
Abstract:
Thesis (MScIng)--Stellenbosch University, 2004.
ENGLISH ABSTRACT: This dissertation's research aims at studying the viability of using both low-cost consumer Commodity Off The Shelf (COTS) PCs and libré software in implementing a distributed real-time system modeled on a real-world engineering problem. Debugging and developing a modular satellite system is both time consuming and complex; to this end the SUNSAT team has envisioned the Interactive Test System, a dual-mode simulator/monitoring system. It is this system that requires a real-time back-end and is used as a real-world problem model to implement. The implementation was accomplished by researching the available parallel processing software and real-time extensions to GNU/Linux and choosing the appropriate solutions based on the needs of the model. A monitoring system was also implemented, for system verification, using freely available system monitoring utilities. The model was successfully implemented and verified with a global synchronization of < 10ms. It was shown that GNU/Linux and libré software are both mature enough and appropriate for solving a real-world distributed real-time problem.
AFRIKAANSE OPSOMMING: Die tesis se navorsing is daarop gemik om die toepaslikheid van beide lae koste verbruikers Komoduteits Van Die Rak (KVDR) persoonlike rekenaars en verniet sagteware in die implementasie van verspreide intydse stelsels te ondersoek aan die hand van die oplossing van 'n gemodelleerde ingenieurs probleem. Die ontfouting en ontwikkeling van 'n modulere satelliet is beide tyd rowend en kompleks; om hierdie te vergemaklik het die SUNSAT span die Interaktiewe Toets Stelsel gekonseptualiseer, wat in wese 'n dubbel modus simulator/moniteerings stelsel sou wees. Dit is hierdie stelsel wat 'n verspreide intydse onderstel benodig en dien as die regte wêreld probleem model om op te los. Die implementasie is bereik deur die beskikbare verspreide verwerkings sagteware en intydse uitbreidings vir GNU/Linux te ondersoek en die toepaslike opsies te kies gegrond op die behoeftes van die model. 'n Moniteerings stelsel is ook geïmplementeer, met behulp van libré sagteware, vir stelsel verifikasie. Die model was suksesvol geïmplementeer en geverifieer met 'n globale sinkronisasie van < 10ms. Daar is getoon dat GNU/Linux en libré sagteware beide volwaardig en geskik is vir die oplossing van regte wêreld verspreide intydse probleme.
APA, Harvard, Vancouver, ISO, and other styles
33

Maass, E. (Eanette). "Integrated attitude determination system using a combination of magnetometer and horizon sensor data." Thesis, Stellenbosch : Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53469.

Full text
Abstract:
Thesis (MScEng)--Stellenbosch University, 2003.
ENGLISH ABSTRACT: A different approach of employing attitude sensors with incomplete measurements in an attitude determination system is investigated. The number of available attitude sensors on small satellites is limited, and the failure of sensors can be fatal when accurate attitude determination is necessary. The problem with sensors with incomplete measurements is that they must be used in combination with other sensors to obtain three-dimensional attitude information. The aim is to enhance the possible number of sensor combinations that can be employed, in an attempt to improve the ability of the attitude determination system to tolerate sensor failures. An alternative sensor structure consisting of a magnetometer and two horizon sensors is presented. A method to obtain vector observations of the attitude from a combination of magnetometer and horizon sensor measurements is derived and tested. A full-state Extended Kalman Filter is used to determine the satellite's attitude, attitude rate and disturbance torque from these vector observations. A second Extended Kalman Filter structure, using only magnetometer measurements, is implemented. The magnetometer Extended Kalman Filter and the horizon/magnetometer Extended Kalman Filter are integrated to obtain a single Extended Kalman Filter structure to determine the satellite's full attitude state. Integration is done by switching between the different pairs of vector information. A systematic analysis of the integrated filter's dynamic behaviour during the switching stages is done by means of a series of case studies.
AFRIKAANSE OPSOMMING: Die gebruik van oriëntasiesensore met onvolledige metingsdata in oriëntasiebepalingsstelsels is ondersoek. Slegs 'n beperkte aantal oriëntasiesensore is beskikbaar op mikro satelliete. 'n Foutiewe sensor kan dus noodlottig wees wanneer akkurate oriëntasiebepaling nodig is. Die probleem met sensore met onvolledige metingsdata is dat dit in sensor kombinasies gebruik moet word om drie dimensionele oriëntasie-inligting te verkry. Die doel is dus om die moontlike aantal sensor kombinasies sodanig te vermeerder dat die oriëntasiebepalingsstelsel beter bestand sal wees teen moontlike sensor falings. 'n Alternatiewe sensor struktuur, bestaande uit 'n magnetometer en twee horison sensore, is ondersoek. 'n Metode vir die verkryging van 3-as oriëntasie inligting vanaf 'n kombinasie van magnetometer en horison sensor metingsdata is afgelei en getoets. 'n Vol toestand uitgebreide Kalmanfilter is gebruik om die satelliet se oriëntasie, oriëntasie snelheid en versteurings-draaimoment vanaf die vektor observasies af te lei. 'n Tweede uitgebreide Kalmanfilter struktuur, wat slegs magnetometer metingsdata gebruik, is geïmplementeer. Die magnetometer filter en die horison/magnetometer filter is geïntegreer sodat een uitgebreide Kalmanfilter struktuur volle oriëntasie inligting kan aflei vanaf verskillende pare vektors met oriëntasie inligting. Integrasie is gedoen deur te skakel tussen die verskillende vektorpare. 'n Sistematiese analise van die geïntegreerde filter se dinamiese gedrag gedurende die oorskakelingstye is gedoen deur middel van 'n reeks gevallestudies.
APA, Harvard, Vancouver, ISO, and other styles
34

Mbangeni, Litha. "Development of methods for parallel computation of the solution of the problem for optimal control." Thesis, Cape Peninsula University of Technology, 2010. http://hdl.handle.net/20.500.11838/1110.

Full text
Abstract:
Thesis (MTech(Electrical Engineering))--Cape Peninsula University of Technology, 2010
Optimal control of fermentation processes is necessary for better behaviour of the process in order to achieve maximum production of product and biomass. The problem for optimal control is a very complex nonlinear, dynamic problem requiring a long time for calculation. Application of decomposition-coordinating methods for the solution of this type of problem simplifies the solution if it is implemented in a parallel way in a cluster of computers. Parallel computing can reduce tremendously the time of calculation through the processes of distribution and parallelization of the computation algorithm. These processes can be achieved in different ways using the characteristics of the problem for optimal control. Problems for optimal control of fed-batch, batch and continuous fermentation processes for production of biomass and product are formulated. The problems are based on a criterion for maximum production of biomass at the end of the fermentation process for the fed-batch process, maximum production of metabolite at the end of the fermentation for the batch fermentation process and minimum time for achieving steady-state fermentor behaviour for the continuous process, and on unstructured mass balance biological models incorporating in the kinetic coefficients the physiochemical variables considered as control inputs. An augmented functional of Lagrange is applied and its decomposition in the time domain is used with a new coordinating vector. Parallel computing in a Matlab cluster is used to solve the above optimal control problems. The calculations and task allocation to the cluster workers are based on a shared memory architecture. Real-time control implementation of calculation algorithms using a cluster of computers allows quick and simpler solutions to the optimal control problems.
APA, Harvard, Vancouver, ISO, and other styles
35

Wu, Qinyi. "Partial persistent sequences and their applications to collaborative text document editing and processing." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/44916.

Full text
Abstract:
In a variety of text document editing and processing applications, it is necessary to keep track of the revision history of text documents by recording changes and the metadata of those changes (e.g., user names and modification timestamps). The recent Web 2.0 document editing and processing applications, such as real-time collaborative note taking and wikis, require fine-grained shared access to collaborative text documents as well as efficient retrieval of metadata associated with different parts of collaborative text documents. Current revision control techniques only support coarse-grained shared access and are inefficient to retrieve metadata of changes at the sub-document granularity. In this dissertation, we design and implement partial persistent sequences (PPSs) to support real-time collaborations and manage metadata of changes at fine granularities for collaborative text document editing and processing applications. As a persistent data structure, PPSs have two important features. First, items in the data structure are never removed. We maintain necessary timestamp information to keep track of both inserted and deleted items and use the timestamp information to reconstruct the state of a document at any point in time. Second, PPSs create unique, persistent, and ordered identifiers for items of a document at fine granularities (e.g., a word or a sentence). As a result, we are able to support consistent and fine-grained shared access to collaborative text documents by detecting and resolving editing conflicts based on the revision history as well as to efficiently index and retrieve metadata associated with different parts of collaborative text documents. We demonstrate the capabilities of PPSs through two important problems in collaborative text document editing and processing applications: data consistency control and fine-grained document provenance management. 
The first problem studies how to detect and resolve editing conflicts in collaborative text document editing systems. We approach this problem in two steps. In the first step, we use PPSs to capture data dependencies between different editing operations and define a consistency model more suitable for real-time collaborative editing systems. In the second step, we extend our work to the entire spectrum of collaborations and adapt transactional techniques to build a flexible framework for the development of various collaborative editing systems. The generality of this framework is demonstrated by its capabilities to specify three different types of collaborations as exemplified in the systems of RCS, MediaWiki, and Google Docs respectively. We precisely specify the programming interfaces of this framework and describe a prototype implementation over Oracle Berkeley DB High Availability, a replicated database management engine. The second problem of fine-grained document provenance management studies how to efficiently index and retrieve fine-grained metadata for different parts of collaborative text documents. We use PPSs to design both disk-economic and computation-efficient techniques to index provenance data for millions of Wikipedia articles. Our approach is disk economic because we only save a few full versions of a document and only keep delta changes between those full versions. Our approach is also computation-efficient because we avoid the necessity of parsing the revision history of collaborative documents to retrieve fine-grained metadata. Compared to MediaWiki, the revision control system for Wikipedia, our system uses less than 10% of disk space and achieves at least an order of magnitude speed-up to retrieve fine-grained metadata for documents with thousands of revisions.
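The two defining features of a PPS (items are tombstoned rather than removed, and carry timestamps that let any historical state be rebuilt) can be sketched in a few lines. This is a deliberately simplified illustration; the dissertation's persistent, ordered identifier scheme is more elaborate:

```python
class PartialPersistentSequence:
    """Append-only sequence: every item keeps its insert timestamp and,
    once deleted, a delete timestamp; nothing is physically removed."""

    def __init__(self):
        self._items = []  # records in document order: [value, t_ins, t_del]

    def _underlying(self, visible_index):
        """Map an index in the currently visible sequence onto the
        underlying store, skipping tombstoned records."""
        seen = 0
        for i, (_, _, t_del) in enumerate(self._items):
            if t_del is None:
                if seen == visible_index:
                    return i
                seen += 1
        return len(self._items)

    def insert(self, index, value, t):
        self._items.insert(self._underlying(index), [value, t, None])

    def delete(self, index, t):
        self._items[self._underlying(index)][2] = t  # tombstone only

    def state_at(self, t):
        """Reconstruct the visible sequence as of time t."""
        return [v for v, t_ins, t_del in self._items
                if t_ins <= t and (t_del is None or t_del > t)]

s = PartialPersistentSequence()
s.insert(0, "Hello", t=1)
s.insert(1, "world", t=2)
s.delete(0, t=3)
s.insert(0, "Goodbye", t=4)
print(s.state_at(2))   # ['Hello', 'world']
print(s.state_at(4))   # ['Goodbye', 'world']
```

Because deleted records stay in place, per-item metadata (author, modification time) can be attached to stable positions, which is what makes the fine-grained provenance queries described above cheap.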
APA, Harvard, Vancouver, ISO, and other styles
36

Wilson, Kevin G. 1952. "The social significance of home networking : public surveillance and social management." Thesis, McGill University, 1985. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=72035.

Full text
Abstract:
This thesis analyzes the social significance of the integration of the home into computer networks. The social significance of home networking is grasped when these systems are understood in their relationship to emerging forms of electronic social control. The thesis establishes this connection through an analysis of structural trends in the videotex industry which demonstrates the value to the corporate sector of cybernetic information generated by interactive systems. The North American tradition of privacy policy is reviewed and demonstrated as inadequate for the protection of personal privacy in home networking. It is further shown that privacy policy does not represent an adequate theorization of social control in computer networking, since it does not account for practices of aggregate social control, which have been termed in the thesis "social management," so vital to the cybernetic economy of late capitalism. Finally, the thesis argues that current conceptual frameworks and policy mechanisms cannot assure the socially beneficial development of home networking, given the tendency towards the integration of such systems into structures of social control.
APA, Harvard, Vancouver, ISO, and other styles
37

Erdogdu, Utku. "Efficient Partially Observable Markov Decision Process Based Formulation Of Gene Regulatory Network Control Problem." Phd thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614317/index.pdf.

Full text
Abstract:
The need to analyze and closely study the gene related mechanisms motivated the research on the modeling and control of gene regulatory networks (GRN). Different approaches exist to model GRNs; they are mostly simulated as mathematical models that represent relationships between genes. Though it turns into a more challenging problem, we argue that partial observability would be a more natural and realistic method for handling the control of GRNs. Partial observability is a fundamental aspect of the problem; it is mostly ignored and substituted by the assumption that states of GRN are known precisely, prescribed as full observability. On the other hand, current works addressing partial observability focus on formulating algorithms for the finite horizon GRN control problem. So, in this work we explore the feasibility of realizing the problem in a partially observable setting, mainly with Partially Observable Markov Decision Processes (POMDP). We proposed a POMDP formulation for the infinite horizon version of the problem. Knowing the fact that POMDP problems suffer from the curse of dimensionality, we also proposed a POMDP solution method that automatically decomposes the problem by isolating different unrelated parts of the problem, and then solves the reduced subproblems. We also proposed a method to enrich gene expression data sets given as input to the POMDP control task, because in available data sets there are thousands of genes but only tens or rarely hundreds of samples. The method is based on the idea of generating more than one model using the available data sets, then sampling data from each of the models and finally filtering the generated samples with the help of metrics that measure compatibility, diversity and coverage of the newly generated samples.
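Partial observability means the controller cannot condition on the true GRN state; it maintains a belief, a probability distribution over states, updated by Bayes' rule after each action and observation. A generic sketch of that update follows (the two-state numbers are invented for illustration and have nothing to do with the thesis's gene models):

```python
def belief_update(belief, action, obs, T, O):
    """One POMDP belief (Bayes filter) step:
    b'(s') is proportional to O[s'][obs] * sum_s T[s][action][s'] * b(s)."""
    n = len(belief)
    new_b = [O[s2][obs] * sum(T[s][action][s2] * belief[s] for s in range(n))
             for s2 in range(n)]
    total = sum(new_b)
    return [x / total for x in new_b]

# Two abstract states (say, a gene OFF/ON), one action, a noisy sensor.
T = [[[0.9, 0.1]],        # T[s][a][s']: transition probabilities
     [[0.2, 0.8]]]
O = [[0.8, 0.2],          # O[s'][obs]: observation probabilities
     [0.3, 0.7]]
b = belief_update([0.5, 0.5], action=0, obs=0, T=T, O=O)
print(b)  # roughly [0.765, 0.235]: observation 0 favours state 0
```

The curse of dimensionality the abstract mentions is visible here: for a GRN with n binary genes the belief ranges over 2^n states, which is what motivates the decomposition into unrelated subproblems.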
APA, Harvard, Vancouver, ISO, and other styles
38

Gerber, Mariana. "The development of a technique to establish the security requirements of an organization." Thesis, Port Elizabeth Technikon, 2001. http://hdl.handle.net/10948/89.

Full text
Abstract:
To perform their business activities effectively, organizations rely heavily on the use of information (ISO/IEC TR 13335-2, 1996, p 1). Owens (1998) reiterates this by claiming that all organizations depend on information for their everyday operation and without it business will fail to operate (Owens, 1998, p 1-2). For an organization it means that if the right information is not available at the right time, it can make the difference between profit and loss or success and failure (Royds, 2000, p 2). Information is an asset and just like other important business assets within the organization, it has extreme value to an organization (BS 7799-1, 1999, p 1; Humphreys, Moses & Plate, 1998, p 8). For this reason it has become very important that business information is sufficiently protected. There are many different ways in which information can exist. Information can be printed or written on paper, stored electronically, transmitted electronically or by post, even spoken in conversation or any other way in which knowledge and ideas can be conveyed (URN 99/703, 1999, p. 2; Humphreys, Moses & Plate, 1998, p 8; URN 96/702, 1996, p 3).It is, therefore, critical to protect information, and to ensure that the security of IT (Information Technology) systems within organizations is properly managed. This requirement to protect information is even more important today, since many organizations are internally and externally connected by networks of IT systems (ISO/IEC TR 13335-2, 1996, p 1). Information security is therefore required to assist in the process of controlling and securing of information from accidental or malicious changes, deletions or unauthorized disclosure (Royds, 2000, p 2; URN 96/702, 1996, p 3). By preventing and minimizing the impact of security incidents, information security can ensure business continuity and reduce business damage (Owens, 1998, p 7). 
Information security in an organization can be regarded as a management opportunity and should become an integral part of the whole management activity of the organization. Obtaining commitment from management is therefore extremely important for effective information security. One way in which management can show their commitment to ensuring information security, is to adopt and enforce a security policy. A security policy ensures that people understand exactly what important role they play in securing information assets.
APA, Harvard, Vancouver, ISO, and other styles
39

Retief, Francois. "Investigation of a high-speed serial bus between satellite subsystems." Thesis, Stellenbosch : Stellenbosch University, 2003. http://hdl.handle.net/10019.1/53474.

Full text
Abstract:
Thesis (MScEng)--University of Stellenbosch, 2003.
ENGLISH ABSTRACT: The aim of this thesis is to investigate the implementation of a high-speed serial bus based on the IEEE Std 1394-1995 specification for use in a microsatellite. Earth observation microsatellites carry imagers (or cameras) that take photographs of the earth. Each photograph generates a large volume of digital data that has to be transferred to either a storage device, an RF transmission unit or a video processing device. Traditionally, the connections between such systems were dedicated serial bus systems that were custom designed for just that purpose. This thesis will investigate the implementation of a generic alternative to such a custom serial bus. The IEEE 1394 serial bus will allow many devices and subsystems to be connected to the serial bus and will allow these different subsystems to exchange data with each other. As an example implementation, a real-time video link between two points using the IEEE 1394 serial bus will be developed.
AFRIKAANSE OPSOMMING: Die doel van hierdie tesis is om ondersoek in te stel na die bou van 'n hoëspoed seriebus, gebaseer op die IEEE Std 1394-1995 spesifikasie, vir gebruik in 'n mikrosatelliet. Aardobservasie-mikrosatelliete bevat kameras wat foto's van die aarde neem. Elke foto genereer groot volumes digitale data wat na óf 'n massastoor, óf 'n RF-sender, óf 'n video-verwerkingseenheid gestuur word. Tradisioneel is elkeen van hierdie verbindings met 'n toegewyde seriebus gedoen wat spesiaal vir daardie doel gemaak is. Hierdie tesis het dit ten doel om ondersoek in te stel na 'n generiese alternatief vir hierdie toegewyde seriële busse. Die IEEE 1394 seriebus sal toelaat dat verskeie eenhede en substelsels aan mekaar gekoppel kan word en dat hulle data tussen mekaar kan uitruil. Ter demonstrasie sal 'n intydse videoskakel ontwerp word wat die IEEE 1394 seriebus gebruik om data tussen twee punte oor te dra.
APA, Harvard, Vancouver, ISO, and other styles
40

Niranjan, Mysore Radhika. "Towards IQ-Appliances: Quality-awareness in Information Virtualization." Thesis, Available online, Georgia Institute of Technology, 2007, 2007. http://etd.gatech.edu/theses/available/etd-04262007-121537/.

Full text
Abstract:
Thesis (M. S.)--Electrical and Computer Engineering, Georgia Institute of Technology, 2008.
Ferri, Bonnie Heck, Committee Member ; Gavrilovska, Ada, Committee Member ; Yalamanchili, Sudhakar, Committee Member ; Schwan, Karsten, Committee Chair.
APA, Harvard, Vancouver, ISO, and other styles
41

Chance, Christopher P. "Designing and implementing a network authentication service for providing a secure communication channel." Thesis, Kansas State University, 1986. http://hdl.handle.net/2097/9903.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Procházka, Miroslav. "Bezdrátový sběr dat z řídících jednotek automobilů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2013. http://www.nusl.cz/ntk/nusl-219987.

Full text
Abstract:
This thesis deals with the implementation of a wireless interface for a module providing communication between a computer and the electronic control unit of an automobile. The 802.11 standard, wireless network structures and the MAC sublayer are described. The thesis covers the selection of the most suitable WLAN module, the testing of the module's features, and the module's implementation. Modifications for the next device version were designed and programs for device operation were implemented.
APA, Harvard, Vancouver, ISO, and other styles
43

Junior, Ulysses Borelli Thomaz. "Desenvolvimento de um sistema de aquisição, análise e controle para experimentos em ressonância paramagnética eletrônica." Universidade de São Paulo, 1988. http://www.teses.usp.br/teses/disponiveis/54/54132/tde-19052014-112927/.

Full text
Abstract:
This work describes the design and development of a microprocessor-based system for the control, acquisition, processing and analysis of electron paramagnetic resonance experiments. It is a dedicated instrument, optimized for these tasks. In addition to conventional features, such as point-by-point averaging of a signal over multiple acquired spectra, the instrument offers several facilities for the manipulation and visualization of spectra, such as baseline correction, addition and subtraction of spectra, computation of the first-integral function and of the value of the second integral, and automatic acquisition for kinetics experiments. To make it as versatile as possible, hardware and software were developed to allow both remote control by a microcomputer and data transfer. In addition to the complete project documentation, comparisons with other systems are presented, highlighting aspects of performance and flexibility. Finally, experimental results are provided that confirm the operation of the main features implemented.
This work describes the design and implementation of a dedicated microprocessor-based system for control, data acquisition and signal processing in electron paramagnetic resonance experiments. Many specific functions dealing with the processing and evaluation of spectra, such as baseline correction, addition, subtraction and integration of spectra, are provided. For kinetics experiments, automatic control and acquisition are available. Remote operation and data transfer from and to a microcomputer are also included to permit higher flexibility. A complete description of the project is presented, and operation procedures and evaluations are described.
APA, Harvard, Vancouver, ISO, and other styles
44

Meuter, Cédric. "Development and validation of distributed reactive control systems." Doctoral thesis, Universite Libre de Bruxelles, 2008. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210552.

Full text
Abstract:
A reactive control system is a computer system reacting to certain stimuli emitted by its environment in order to maintain it in a desired state. Distributed reactive control systems are generally composed of several processes, running in parallel on one or more computers, communicating with one another to perform the required control task. By their very nature, distributed reactive control systems are hard to design. Their distributed nature and/or the communication scheme used can introduce subtle unforeseen behaviours. When dealing with critical applications, such as plane control systems or traffic light control systems, those unintended behaviours can have disastrous consequences. It is therefore essential for the designer to ensure that this does not happen. For that purpose, rigorous and systematic techniques can (and should) be applied as early as possible in the development process. In that spirit, this work aims at providing the designer with the necessary tools to facilitate the development and validation of such distributed reactive control systems. In particular, we show how a dedicated language called dSL (Distributed Supervision Language) can be used to ease the development process. We also study how validation techniques such as model checking and testing can be applied in this context.
Doctorat en Sciences
APA, Harvard, Vancouver, ISO, and other styles
45

Neelamegam, Jothi P. "Zero-sided communication : challenges in implementing time-based channels using the MPI/RT specification." Master's thesis, Mississippi State : Mississippi State University, 2002. http://library.msstate.edu/etd/show.asp?etd=etd-03252002-153109.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Riexinger, Luke E. "Residual Crashes and Injured Occupants with Lane Departure Prevention Systems." Diss., Virginia Tech, 2021. http://hdl.handle.net/10919/103060.

Full text
Abstract:
Every year, approximately 34,000 individuals are fatally injured in crashes on US roads [1]. These fatalities occur across many types of crash scenarios, each with its own causation factors. One way to prioritize research on a preventive technology is to compare the number of occupant fatalities relative to the total number of occupants involved in a crash scenario. Four crash modes are overrepresented among fatalities: single vehicle road departure crashes, control loss crashes, cross-centerline head-on crashes, and pedestrian/cyclist crashes [2]. Interestingly, three of these crash scenarios require the subject vehicle to depart from the initial lane of travel. Lane departure warning (LDW) systems track the vehicle lane position and can alert the driver through audible and haptic feedback before the vehicle crosses the lane line. Lane departure prevention (LDP) systems can perform an automatic steering maneuver to prevent the departure. Another method of prioritizing research is to determine factors common among the fatal crashes. In 2017, 30.4% of passenger vehicle crash fatalities involved a vehicle rollover [1]. Half of all fatal single vehicle road departure crashes resulted in a rollover yet only 12% of fatal multi-vehicle crashes involved a rollover [1]. These often occur after the driver has lost control of the vehicle and departed the road. Electronic stability control (ESC) can provide different braking to each wheel and allow the vehicle to maintain heading. While ESC is a promising technology, some rollover crashes still occur. Passive safety systems such as seat belts, side curtain airbags, and stronger roofs work to protect occupants during rollover crashes. Seat belts prevent occupants from moving inside the occupant compartment during the rollover and both seat belts and side curtain airbags can prevent occupants from being ejected from the vehicle. 
Stronger roofs ensure that the roof is not displaced during the rollover and the integrity of the occupant compartment is maintained to prevent occupant ejection. The focus of this dissertation is to evaluate the effectiveness of vehicle-based countermeasures, such as lane departure warning and electronic stability control, for preventing or mitigating single vehicle road departure crashes, cross-centerline head-on crashes, and single vehicle rollover crashes. This was accomplished by understanding how drivers respond to both road departure and cross-centerline events in real-world crashes. These driver models were used to simulate real crash scenarios with LDW/LDP systems to quantify their potential crash reduction. The residual crashes, which are not avoided with LDW/LDP systems or ESC, were analyzed to estimate the occupant injury outcome. For rollover crashes, a novel injury model was constructed that includes modern passive safety countermeasures such as seat belts, side curtain airbags, and stronger roofs. The results for road departure, head-on, and control loss rollover crashes were used to predict the number of crashes and injured occupants in the future. This work is important for identifying the residual crashes that require further research to reduce the number of injured crash occupants.
Doctor of Philosophy
Every year in the US, approximately 34,000 individuals are fatally injured in many different types of crashes. However, some crash types are more dangerous than others. Drift-out-of-lane (DrOOL) road departure crashes, control loss road departure crashes, head-on crashes, and pedestrian crashes are more likely to result in an occupant fatality than other crash modes. In three of these more dangerous crash types, the vehicle departs from the lane before the crash occurs. Lane departure warning (LDW) systems can detect when the vehicle is about to cross the lane line and alert the driver by beeping or vibrating the steering wheel. A different system, called lane departure prevention (LDP), can provide automatic steering to prevent the vehicle from leaving the lane or to return it to the lane. In control loss crashes, the vehicle's motion is in a different direction than the vehicle's heading. During control loss, it is easier for the vehicle to roll over, which is very dangerous. Electronic stability control (ESC) can prevent control loss by applying selective braking to each tire to keep the vehicle's motion in the same direction as the vehicle's heading. If a rollover still occurs, vehicles are equipped with passive safety systems and designs such as seat belts, side curtain airbags, and stronger roofs to protect the people inside. Seat belts can prevent occupants from striking the vehicle interior during the rollover, and both seat belts and side curtain airbags can prevent occupants from being ejected from the vehicle. Stronger roofs ensure that the roof is not displaced during the rollover to prevent occupants from being ejected from the vehicle. The focus of this dissertation is to estimate how many crashes LDW, LDP, and ESC systems could prevent. This was accomplished by understanding how drivers respond after leaving their lane in real crashes.
Then, these real crash scenarios were simulated with an LDW or LDP system to estimate how many crashes were prevented. The occupants in the residual crashes, which were not prevented by the simulated systems, were analyzed to estimate the number of occupants with at least one moderate injury. Understanding which crashes and injuries were not prevented by this technology can be used to decide where future research should occur to prevent more fatalities in road departure, head-on and control loss crashes.
APA, Harvard, Vancouver, ISO, and other styles
47

Holeš, Cyril. "Diagnostika pohonných jednotek." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2016. http://www.nusl.cz/ntk/nusl-254334.

Full text
Abstract:
The thesis is focused on establishing a methodology for engine diagnostics of vehicles with common rail direct fuel injection. The first section describes the communication between diagnostic tools and control units, and the common rail system for diesel engines. The next section specifies the procedure for the acquisition, processing and analysis of data using selected diagnostic tools together with software in a numerical computing environment. Subsequently, from the results, an equivalent electronic control of the fuel injection quantity is established for the purposes of research and vehicle servicing. The results may also allow additional observations to be made.
APA, Harvard, Vancouver, ISO, and other styles
48

Kalyon, Gabriel. "Supervisory control of infinite state systems under partial observation." Doctoral thesis, Universite Libre de Bruxelles, 2010. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210032.

Full text
Abstract:
A discrete event system is a system whose state space is given by a discrete set and whose state transition mechanism is event-driven, i.e. its state evolution depends only on the occurrence of discrete events over time. These systems are used in many fields of application (telecommunication networks, aeronautics, aerospace, etc.). The validity of these systems is therefore an important issue, and to ensure it we can use supervisory control methods. These methods consist in imposing a given specification on a system by means of a controller which runs in parallel with the original system and restricts its behavior. In this thesis, we develop supervisory control methods where the system can have an infinite state space and the controller has a partial observation of the system (this implies that the controller must define its control policy from imperfect knowledge of the system). Unfortunately, this problem is generally undecidable. To overcome this negative result, we use abstract interpretation techniques, which ensure the termination of our algorithms at the cost, however, of over-approximating some computations. The aim of this thesis is to provide the most complete contribution possible to this topic. Hence, we consider increasingly realistic problems. More precisely, we start by considering a centralized framework (i.e. the system is controlled by a single controller) and by synthesizing memoryless controllers (i.e. controllers that define their control policy from the current observation received from the system). Next, to obtain better solutions, we consider the synthesis of controllers that record part or the whole of the execution of the system and use this information to define the control policy. Unfortunately, these methods cannot be used to control an interesting class of systems: distributed systems.
We then defined methods that allow distributed systems to be controlled with synchronous communications (decentralized and modular methods) and with asynchronous communications (distributed method). Moreover, we implemented some of our algorithms to experimentally evaluate the quality of the synthesized controllers.

A discrete event system is a system whose state space is a discrete set and whose current state evolves with the occurrence of discrete events over time. These systems are present in many critical domains such as communication networks, aeronautics and aerospace. The validity of these systems is therefore an important question, and one way to ensure it is to use supervisory control methods. These methods attach to the system a device, called a controller, which runs in parallel and restricts the behavior of the system so as to prevent erroneous behavior from occurring. In this thesis, we are interested in the development of supervisory control methods where the system can have an infinite state space and where the controllers are not always able to observe the system perfectly, which implies that they must define their control policy from imperfect knowledge of the system. Unfortunately, this problem is generally undecidable. To overcome this difficulty, we use abstract interpretation techniques, which ensure the termination of our algorithms at the cost of certain over-approximations in the computations. The goal of our thesis is to provide the most complete contribution possible in this domain, and to that end we consider increasingly realistic problems. More precisely, we began our work by defining a centralized method in which the system is controlled by a single controller that defines its control policy from the last information received from the system. Then, to obtain better solutions, we defined controllers that retain part or all of the execution of the system and define their control policy from this information.
Unfortunately, these methods cannot be used to control an interesting class of systems: distributed systems. We then defined methods for controlling distributed systems whose communications are synchronous (decentralized and modular methods) and asynchronous (distributed methods). In addition, we implemented some of our algorithms to experimentally evaluate the quality of the controllers they synthesize.
Doctorat en Sciences

APA, Harvard, Vancouver, ISO, and other styles
49

Jafari, Farhang. "The concerns of the shipping industry regarding the application of electronic bills of lading in practice amid technological change." Thesis, University of Stirling, 2015. http://hdl.handle.net/1893/24071.

Full text
Abstract:
In the sea trade, the traditional paper-based bill of lading has played an important role across the globe for centuries, but with the advent of advanced commercial modes of transportation and communication, the central position of this document is under threat. The importance of the bill of lading still prevails, as does the need for the functions that this document served in the past, although in a changed format. In the recent past, the world has witnessed a lot of debate about replacing this traditional paper-based document with an electronic equivalent that exhibits all of its functions and characteristics, both commercial and legal. More specifically, unlike many rival transport documents, such as the Sea Waybill, a bill of lading has two prominent features, namely its negotiability and its acceptability as a document of title in certain legal jurisdictions, that must be retained in an electronic bill of lading if the document is to retain its prominence in the future. This thesis is, however, chiefly concerned with the legal aspects of adopting the electronic bill of lading in place of the traditional paper-based document, and with its effectiveness as a legal document in the present age. The scope of this debate remains primarily focused on the USA and UK jurisdictions. In the course of this thesis, it is observed that, in the past, the bill of lading has been subject to a variety of international regimes, such as The Hague Rules and The Hague-Visby Rules, and presently efforts are being made to arrive at a universal agreement under the umbrella of The Rotterdam Rules, but such an agreement has yet to be reached among the comity of nations. On the other hand, efforts made by the business community to introduce an electronic bill of lading are much louder and more evident.
Private efforts, such as the SeaDocs System, the CMI Rules, and the BOLERO Project, were, however, received by the business community with both applause and suspicion. At the same time, a number of concerns have been voiced by the international business community about legislative adoptability in national and international jurisdictions and about the courts' approach to adjudicating cases involving electronic transactions, and these make the adoption of the electronic bill of lading in sea-based transactions difficult. Therefore, in the absence of any formal legal backing from national and international legislation, these attempts could not achieve the desired results. In this thesis, the present acceptability of electronic transactions in general, and of the electronic bill of lading specifically, is also discussed with reference to certain national jurisdictions, such as Australia, India, South Korea and China, in order to present comparative perspectives on the preparedness of these nations. At the regional level, the efforts made by the European Union to promote electronic transactions within its jurisdiction are also discussed. All this discussion, however, leads to the conclusion that the acceptability of the electronic bill of lading in the near future depends on official efforts by national governments and on reaching an agreement on the Rotterdam Rules as early as possible. The other area of importance revealed in this thesis is the need for a change in the juristic approach taken by courts when interpreting and adjudicating cases involving electronic transactions.
On the whole, this thesis has provided a cohesive and systematic review, synthesis and analysis of the history of the bill of lading, its importance as a document of title, and attempts to incorporate its important functions within the fast-paced electronic shipping commerce of today. In this way it provides a valuable contribution to the literature: a comprehensive resource for jurists, policy-makers and the business community alike as they work towards adapting the bill of lading so that it might be successfully applied in electronic form.
APA, Harvard, Vancouver, ISO, and other styles
50

Nairouz, Bassem R. "Conceptual design methodology of distributed intelligence large scale systems." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49077.

Full text
Abstract:
Distributed intelligence systems are starting to gain dominance in the field of large-scale complex systems. These systems are characterized by nonlinear behavior patterns that can only be predicted through simulation-based engineering. In addition, the autonomy, intelligence, and reconfiguration capabilities required by certain systems introduce obstacles that add another layer of complexity. However, no standard process exists for the design of such systems. This research presents a design methodology focusing on distributed control architectures while concurrently considering the systems design process. The methodology has two major components. First, it introduces a hybrid design process, based on the infusion of the control-architecture and conceptual system design processes. The second component is the development of a control-architecture metamodel that draws a distinction between control configurations and control methods. This enables a standard representation of a wide spectrum of control-architecture frameworks.
APA, Harvard, Vancouver, ISO, and other styles