
Dissertations / Theses on the topic 'Electronic data processing – Auditing'

Consult the top 50 dissertations / theses for your research on the topic 'Electronic data processing – Auditing.'


1

Wheeler, Sonya. "A structured technique for applying risk based internal auditing in information technology environments : (with specific reference to IIA RBIA, King Report and CobiT) /." Link to the online version, 2005. http://hdl.handle.net/10019/1310.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Anyanwu, Ogechi Uloma. "The role of enterprise resource planning systems in continuous auditing of a selected organization in the Western Cape, South Africa." Thesis, Cape Peninsula University of Technology, 2018. http://hdl.handle.net/20.500.11838/2669.

Full text
Abstract:
Thesis (MTech (Business Information Systems))--Cape Peninsula University of Technology, 2018.
The thesis aimed to explore the role Enterprise Resource Planning (ERP) systems play in an organization’s continuous auditing practices. Continuous auditing encourages innovation and improves the practice of traditional auditing through the use of automation and computerisation. Auditing specialists and researchers have begun to adopt a technology-driven process as an approach to back up real-time assurance. The rationale of the study is drawn from previous research, whose findings argue that organizations employ ERP systems because they enable seamless access to information and automation, which makes the monitoring of controls easier. The study used Structuration Theory (ST) as the underpinning theory and drew on the concept of duality of technology (i.e., Enactment of Technology-in-Practice) as a lens to comprehend and deduce the social phenomenon of continuous auditing using an ERP system. This research study investigated this social phenomenon and how it had influenced performance auditing of an organization. The study applied interpretivism as a research paradigm and as such adopted a qualitative approach in which semi-structured interviews were used to tease out the research objectives and questions. The outcome of the research validated a conceptual framework, which has led to a proposed general framework for practicing continuous auditing using an ERP system. All interview data were collected and accurately captured with informed consent, subject to the approval of the selected organization, so as not to violate the organization’s privacy and confidentiality policies. The study did not reveal any information that could potentially adversely affect the reputation of the organization or reveal private information to its competitors.
APA, Harvard, Vancouver, ISO, and other styles
3

Mashima, Daisuke. "Safeguarding health data with enhanced accountability and patient awareness." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45775.

Full text
Abstract:
Several factors are driving the transition from paper-based health records to electronic health record systems. In the United States, the adoption rate of electronic health record systems increased significantly after the "Meaningful Use" incentive program started in 2009. While increased use of electronic health record systems could improve the efficiency and quality of healthcare services, it can also lead to a number of security and privacy issues, such as identity theft and healthcare fraud. Such incidents could have a negative impact on the trustworthiness of electronic health record technology itself and thereby limit its benefits. In this dissertation, we tackle three challenges that we believe are important to improving the security and privacy of electronic health record systems. Our approach is based on an analysis of real-world incidents, namely theft and misuse of patient identity, unauthorized usage and updating of electronic health records, and threats from insiders in healthcare organizations. Our contributions include the design and development of a user-centric monitoring agent system that works on behalf of a patient (i.e., an end user) and securely monitors usage of the patient's identity credentials as well as access to her electronic health records. Such a monitoring agent can enhance a patient's awareness and control and improve accountability for health records even in a distributed, multi-domain environment, which is typical in an e-healthcare setting. This will reduce the risk and loss caused by misuse of stolen data. In addition to the solution from a patient's perspective, we also propose a secure system architecture that can be used in healthcare organizations to enable robust auditing and management of client devices. This helps us further enhance patients' confidence in the secure use of their health data.
APA, Harvard, Vancouver, ISO, and other styles
4

梁松柏 and Chung-pak Leung. "Concurrent auditing on computerized accounting systems." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31269011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Blundell, Adrian Wesley. "Continuous auditing technologies and models." Thesis, Nelson Mandela Metropolitan University, 2007. http://hdl.handle.net/10948/476.

Full text
Abstract:
Continuous auditing is not a totally new concept, but it has not been widely implemented and has existed mostly as a point of debate amongst the auditing fraternity. This may soon change, as continuous auditing has become a topic of great interest, especially in the last decade. This may be due to a combination of reasons. In the last decade, much of the confidence in auditors’ reports was lost due to corporate governance scandals. This also brought about a greater desire for faster, more reliable reporting on which to base decisions. This desire has been transposed into regulations such as the Sarbanes-Oxley Act in the United States, which encourages real-time auditing activities that would benefit from continuous auditing. A second possible contributing factor to the heightened interest in continuous auditing is that much of the requisite technology has matured to a point where it can be successfully used to implement continuous auditing. It is these technologies which form the focus of this research. It is therefore the primary objective of this research to investigate and identify the essential technologies, and to identify and define their roles within a continuous auditing solution. To explore this area, three models of continuous auditing are compared according to the roles of the technologies within them. The roots of some auditing technologies which can be adapted to the paradigm of continuous auditing are explored, as well as new technologies such as XML-based reporting languages. In order to fully explore these technologies, the concepts of data integrity and data quality are first defined and discussed, and some security measures which contribute to integrity are identified. An obstacle to implementing a continuous model is that, even with the newly available technologies, the multitude of systems used in organisations produces data in a plethora of formats. In performing an audit, the continuous auditing system must first gather this data and then be able to compare “apples with apples”. Therefore, the technologies which can be used to acquire and standardise the data are identified.
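The standardisation step described in this abstract can be illustrated with a small sketch: records from two hypothetical source formats are normalised into one schema before a trivial audit rule runs over them. All field names, formats and the over-limit rule below are invented for illustration and are not drawn from the thesis.

```python
# Hypothetical sketch: normalising transaction records from heterogeneous
# source formats into one schema so a continuous-audit rule can compare
# "apples with apples". Field names and formats are illustrative only.
import csv
import io
import json
from datetime import datetime

def from_csv(text):
    """Parse a CSV export with 'date,amount,account' columns."""
    return [
        {"date": datetime.strptime(r["date"], "%d/%m/%Y").date().isoformat(),
         "amount": float(r["amount"]),
         "account": r["account"].strip().upper()}
        for r in csv.DictReader(io.StringIO(text))
    ]

def from_json(text):
    """Parse a JSON export that uses 'ts'/'value'/'acct' keys instead."""
    return [
        {"date": rec["ts"][:10],
         "amount": float(rec["value"]),
         "account": rec["acct"].upper()}
        for rec in json.loads(text)
    ]

def audit_over_limit(records, limit=10_000.0):
    """A trivial continuous-audit rule: flag transactions above a limit."""
    return [r for r in records if r["amount"] > limit]

csv_data = "date,amount,account\n01/02/2024,12500.00,ops-1\n"
json_data = '[{"ts": "2024-02-01T09:00:00", "value": 900.0, "acct": "ops-2"}]'

records = from_csv(csv_data) + from_json(json_data)
flagged = audit_over_limit(records)
print(flagged)  # only the 12500.00 transaction on OPS-1 is flagged
```

Once both sources share one schema, any number of audit rules can run over the combined stream without caring where each record originated.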
APA, Harvard, Vancouver, ISO, and other styles
6

Ching, Siu-ming Vincent, and 程少明. "Computer auditing in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1986. http://hub.hku.hk/bib/B31263550.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Aldeco, Perez Rocio. "Secure provenance-based auditing of personal data use." Thesis, University of Southampton, 2012. https://eprints.soton.ac.uk/340065/.

Full text
Abstract:
In recent years, an increasing number of personalised services that require users to disclose personal information have appeared on the Web (e.g. social networks, governmental sites, on-line selling sites). By disclosing their personal information, users are given access to a wide range of new functionality and benefits. However, there exists a risk that their personal information is misused. To strike a balance between the advantages of personal information disclosure and the protection of information, governments have created legal frameworks, such as the Data Protection Act, the Health Insurance Portability & Accountability Act (HIPAA) or Safe Harbor, which place restrictions on how organisations can process personal information. By auditing the way in which organisations use personal data, it is possible to determine whether they process personal information in accordance with the appropriate frameworks. The traditional way of auditing collects evidence manually. This evidence is later analysed to assess the degree of compliance with a predefined legal framework. These manual assessments are long, since large amounts of data need to be analysed, and they are unreliable, since there is no guarantee that all data is correctly analysed. As several cases of data leaks and exposures of private data have proven, traditional audits are also prone to intentional and unintentional errors derived from human intervention. Therefore, this thesis proposes a provenance-based approach to auditing the use of personal information by securely gathering and analysing electronic evidence related to the processing of personal information. This approach makes three contributions to the state of the art. The first contribution is the Provenance-based Auditing Architecture, which defines a set of communication protocols to make existing systems provenance-aware. These protocols specify which provenance information should be gathered to verify compliance with the Data Protection Act. Moreover, we derive a set of Auditing Requirements by analysing a Data Protection Act case study and demonstrate that provenance can be used as electronic evidence of past processing. The second contribution is the Compliance Framework, a provenance-based auditing framework for automatically auditing compliance with the Data Protection Act's principles. This framework consists of a provenance graph representation (Processing View), a novel graph-based rule representation expressing processing rules (Usage Rules Definition) and a novel set of algorithms that automatically verify whether information was processed according to the Auditing Requirements by comparing the Processing View against the Usage Rules Definition. The third contribution is the Secure Provenance-based Auditing Architecture, which ensures that any malicious alteration of provenance during the entire provenance life cycle of recording, storage, querying and analysis can be detected. This architecture, which relies on cryptographic techniques, guarantees the correctness of the audit results.
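The core idea of comparing a recorded processing view against usage rules can be sketched at toy scale. The data model and rule format below are invented for illustration and are far simpler than the graph-based representations the abstract describes.

```python
# Hypothetical sketch of the view-vs-rules comparison: recorded provenance
# ("processing view") is checked against rules stating which actions each
# actor may perform on each personal data item. Names are invented.

# Provenance records: (actor, action, data_item)
processing_view = [
    ("crm-service", "read", "email"),
    ("marketing", "share", "email"),
    ("billing", "read", "address"),
]

# Usage rules: data_item -> set of (actor, action) pairs that are allowed
usage_rules = {
    "email": {("crm-service", "read"), ("billing", "read")},
    "address": {("billing", "read")},
}

def audit(view, rules):
    """Return every recorded processing step not licensed by the rules."""
    return [(actor, action, item)
            for actor, action, item in view
            if (actor, action) not in rules.get(item, set())]

violations = audit(processing_view, usage_rules)
print(violations)  # [('marketing', 'share', 'email')]
```

The real framework additionally has to trust its inputs, which is why the thesis pairs this kind of analysis with cryptographic protection of the provenance records themselves.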
APA, Harvard, Vancouver, ISO, and other styles
8

Marshall, Thomas E. (Thomas Edward) 1954. "Task Domain Knowledge as a Moderator of Information System Usage." Thesis, University of North Texas, 1993. https://digital.library.unt.edu/ark:/67531/metadc278541/.

Full text
Abstract:
Information system (IS) support of human problem solving during the complex task of auditing within a computer environment was investigated. Seventy-four computer audit specialists from nine firms participated in the field experiment. Task accomplishment behavior was recorded via a computerized activity-logging technique. Theoretical constructs of interest included: 1) IS problem-solving support, 2) task domain knowledge, and 3) decision-making behavior. It was theorized that task domain knowledge influences the type of IS most functionally appropriate for usage by that individual. IS task presentation served as the treatment variable. Task domain knowledge was investigated as a moderating factor of task accomplishment. Task accomplishment, the dependent variable, was defined as search control strategy and quality of task performance. A subject's task domain knowledge was assessed over seven theoretical domains. Subjects were assigned to higher or lower task domain knowledge groups based on performance on professional competency examination questions. Research hypothesis one investigated the effects of task domain knowledge on task accomplishment behavior. Several task domain knowledge bases were found to influence both search control strategy and task performance. Task presentation ordering effects, hypothesis two, were not found to significantly influence search control strategy or task performance. The third hypothesis investigated interaction effects of a subject's task domain knowledge and task presentation ordering treatments on task accomplishment behavior. An interaction effect was found to influence the subject's search control strategy. The computer-specific knowledge base and task presentation ordering treatments were found to interact as joint moderators of search control strategy. Task performance was not found to be significantly influenced by interaction effects. Users' task accomplishment was modeled based upon problem-solving behavior. A subject's level of task domain knowledge was found to serve as a moderating factor of IS usage. Human information-processing strategies, IS usage, and task domain knowledge were integrated into a comprehensive IS user task model. This integrated model provides a robust characterization scheme for IS problem-solving support in a complex task environment.
APA, Harvard, Vancouver, ISO, and other styles
9

Ostroumov, Ivan Victorovich. "Magnetic field data processing with personal electronic device." Thesis, Polit. Challenges of Science Today: International Scientific and Practical Conference of Young Researchers and Students, April 6–8, 2016: theses. Kyiv, 2016. 83 p. http://er.nau.edu.ua/handle/NAU/26649.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Romig, Phillip R. "Parallel task processing of very large datasets." [Lincoln, Neb. : University of Nebraska-Lincoln], 1999. http://international.unl.edu/Private/1999/romigab.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Rehfuss, Paul Stephen. "Parallelism in contextual processing /." Full text open access at:, 1999. http://content.ohsu.edu/u?/etd,272.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Kalibjian, Jeff, and Steven Wierenga. "Assuring Post Processed Telemetry Data Integrity With a Secure Data Auditing Appliance." International Foundation for Telemetering, 2005. http://hdl.handle.net/10150/604910.

Full text
Abstract:
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada
Recent federal legislation (e.g. Sarbanes-Oxley, Gramm-Leach-Bliley) has introduced requirements for compliance, including records retention and records integrity. Many industry sectors (e.g. Energy, under the North American Energy Reliability Council) are also introducing their own voluntary compliance mandates to avert possible additional federal regulation. A trusted computer appliance device dedicated to data auditing may soon be required in all corporate IT infrastructures to accommodate various compliance directives. Such an auditing device may also have application in telemetry post-processing environments, as it may be used to guarantee the integrity of post-processed telemetry data.
APA, Harvard, Vancouver, ISO, and other styles
13

Robinson, Patrick Glen. "Distributed Linda : design, development, and characterization of the data subsystem /." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-07102009-040417/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Parker, Greg. "Robust processing of diffusion weighted image data." Thesis, Cardiff University, 2014. http://orca.cf.ac.uk/61622/.

Full text
Abstract:
The work presented in this thesis comprises a proposed robust diffusion weighted magnetic resonance imaging (DW-MRI) pipeline, each chapter detailing a step designed to ultimately transform raw DW-MRI data into segmented bundles of coherent fibre ready for more complex analysis or manipulation. In addition to this pipeline we will also demonstrate, where appropriate, ways in which each step could be optimized for the maxillofacial region, setting the groundwork for a wider maxillofacial modelling project intended to aid surgical planning. Our contribution begins with RESDORE, an algorithm designed to automatically identify corrupt DW-MRI signal elements. While slower than the closest alternative, RESDORE is also far more robust to localised changes in SNR and pervasive image corruptions. The second step in the pipeline concerns the retrieval of accurate fibre orientation distribution functions (fODFs) from the DW-MRI signal. Chapter 4 comprises a simulation study exploring the application of spherical deconvolution methods to 'generic' fibre, finding that the commonly used constrained spherical harmonic deconvolution (CSHD) is extremely sensitive to calibration but, if handled correctly, might be able to resolve muscle fODFs in vivo. Building upon this information, Chapter 5 conducts further simulations and in vivo image experimentation demonstrating that this is indeed the case, allowing us to demonstrate, for the first time, anatomically plausible reconstructions of several maxillofacial muscles. To complete the proposed pipeline, Chapter 6 then introduces a method for segmenting whole-volume streamline tractographies into anatomically valid bundles. In addition to providing an accurate segmentation, this shape-based method does not require the computationally expensive inter-streamline comparisons employed by other approaches, allowing the algorithm to scale linearly with respect to the number of streamlines within the dataset. This is not often true of comparison-based methods, which at best scale superlinearly and more often exhibit O(N²) complexity.
APA, Harvard, Vancouver, ISO, and other styles
15

Ma, Chi-kui, and 馬智駒. "The profession of EDP audit in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1989. http://hub.hku.hk/bib/B31264426.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Tucker, Peter A. "Punctuated data streams /." Full text open access at:, 2005. http://content.ohsu.edu/u?/etd,255.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Gottemukkala, Vibby. "Scalability issues in distributed and parallel databases." Diss., Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/8176.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Lee, J. J. "The object-oriented database and processing of electronic warfare data." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1996. http://handle.dtic.mil/100.2/ADA303112.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Bodorik, Peter. "Query processing strategies in a distributed data base." Dissertation (Electrical Engineering), Carleton University, Ottawa, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
20

Chen, George C. M. "Strategic analysis of a data processing company /." Burnaby B.C. : Simon Fraser University, 2005. http://ir.lib.sfu.ca/handle/1892/3624.

Full text
Abstract:
Research Project (M.B.A.) - Simon Fraser University, 2005.
Research Project (Faculty of Business Administration) / Simon Fraser University. Senior supervisor: Dr. Ed Bukszar. EMBA Program. Also issued in digital format and available on the World Wide Web.
APA, Harvard, Vancouver, ISO, and other styles
21

Lewis, Tony. "Electronic data interchange in the construction industry." Thesis, Loughborough University, 1998. https://dspace.lboro.ac.uk/2134/11183.

Full text
Abstract:
The aim of this research is to improve the efficiency of the construction process through the application of electronic data interchange (EDI). This thesis describes the development and application of EDI messages. The messages described are targeted to provide a means for transferring construction-specific information during the construction process. The definition of electronic data interchange and its technical issues are first described. The nature of EDI, replacing paper-based communication with electronic messages, impacts on the way in which business is conducted, and also has far-reaching legal implications due to the reliance of many legal systems on paper documents and signatures. The business and legal implications are therefore discussed in detail. The application of EDI in the construction industry is investigated by means of a literature review. This work is furthered by a longitudinal study of the construction industry's application of EDI, which consisted of two surveys at a five-year interval. A model of the information flows within the traditional construction process is developed to assist in the identification of information flows suitable for EDI. A methodology for message development was produced. The methodology was then applied to develop a description data model that could be utilised in the existing bill of quantities and trading cycle messages. The bill of quantities message set was then at a stage ready for trial. To determine the issues related to implementation specifically in the construction industry, a trial implementation of this message set was undertaken. The official implementation undertaken by EDICON is described. Software was also developed to undertake the trial. This software was tested and proved that the message set developed was suitable for the transfer of bill of quantities related information during a construction project. The factors causing the failure of the implementation of the bill of quantities message set are discussed. A number of these factors are considered valid for all construction project information flows. Finally, the use of shared project models to re-engineer construction information tasks is recommended as a means of achieving significant benefit from electronic data exchange in the construction process.
APA, Harvard, Vancouver, ISO, and other styles
22

Bostanudin, Nurul Jihan Farhah. "Computational methods for processing ground penetrating radar data." Thesis, University of Portsmouth, 2013. https://researchportal.port.ac.uk/portal/en/theses/computational-methods-for-processing-ground-penetrating-radar-data(d519f94f-04eb-42af-a504-a4c4275d51ae).html.

Full text
Abstract:
The aim of this work was to investigate signal processing and analysis techniques for Ground Penetrating Radar (GPR) and its use in the civil engineering and construction industry. GPR is the general term applied to techniques which employ radio waves, typically in the megahertz and gigahertz range, to map structures and features buried in the ground or in manmade structures. GPR measurements can suffer from large amounts of noise. This is primarily caused by interference from other radio-wave-emitting devices (e.g., cell phones, radios, etc.) that are present in the surrounding area of the GPR system during data collection. In addition to noise, the presence of clutter – reflections from other non-target objects buried underground in the vicinity of the target – can make GPR measurements difficult to understand and interpret, even for skilled human GPR analysts. This thesis is concerned with the improvements and processes that can be applied to GPR data in order to enhance the target detection and characterisation process, particularly with multivariate signal processing techniques. These primarily include Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Both techniques have been investigated, implemented and compared regarding their abilities to separate the target-originating signals from the noise and clutter-type signals present in the data. A combination of PCA and ICA (SVDPICA) and two-dimensional PCA (2DPCA) are the specific approaches adopted and further developed in this work. The ability of these methods to reduce the amount of clutter and unwanted signals present in GPR data has been investigated and reported in this thesis, suggesting that their use in automated analysis of GPR images is a possibility. Further analysis carried out in this work concentrated on analysing the performance of the developed multivariate signal processing techniques and at the same time investigating the possibility of identifying and characterising the features of interest in pre-processed GPR images. The driving idea behind this part of the work was to extract the resonant modes present in the individual traces of each GPR image and to use the properties of those poles to characterise the target. Three related but different methods have been implemented and applied in this work – Extended Prony, Linear Prediction Singular Value Decomposition and Matrix Pencil methods. In addition to these approaches, the PCA technique has been used to reduce the dimensionality of extracted traces and to compare signals measured in various experimental setups. Performance analysis shows that Matrix Pencil offers the best results.
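As a rough illustration of the kind of subspace-based clutter reduction described in this abstract (a generic sketch, not the thesis's actual method or code): the dominant singular component of a B-scan trace matrix often captures the flat background common to all traces, so zeroing it suppresses clutter while leaving localised target reflections. Synthetic data throughout.

```python
# Generic sketch of SVD/PCA-style clutter removal for a GPR B-scan.
# The dominant singular component models the background common to all
# traces; zeroing it leaves the localised target reflection. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_samples = 64, 256

# Synthetic B-scan: identical background in every trace plus one target
background = np.sin(np.linspace(0, 8 * np.pi, n_samples))
bscan = np.tile(background, (n_traces, 1))
bscan[30:34, 100:120] += 2.0                      # localised target reflection
bscan += 0.05 * rng.standard_normal(bscan.shape)  # measurement noise

# SVD of the trace matrix; drop the first (clutter-dominated) component
u, s, vt = np.linalg.svd(bscan, full_matrices=False)
s_clutter = s.copy()
s_clutter[0] = 0.0
cleaned = u @ np.diag(s_clutter) @ vt             # background largely removed

# The target region should now dominate the residual energy
target_energy = np.abs(cleaned[30:34, 100:120]).mean()
rest_energy = np.abs(cleaned[:30, :]).mean()
print(target_energy > rest_energy)  # True
```

Real GPR data rarely has a perfectly rank-1 background, which is why the thesis compares PCA against ICA and combined approaches rather than relying on a single-component subtraction.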
APA, Harvard, Vancouver, ISO, and other styles
23

Thomas, Muffy. "The imperative implementation of algebraic data types." Thesis, University of St Andrews, 1988. http://hdl.handle.net/10023/13471.

Full text
Abstract:
The synthesis of imperative programs for hierarchical, algebraically specified abstract data types is investigated. Two aspects of the synthesis are considered: the choice of data structures for efficient implementation, and the synthesis of linked implementations for the class of ADTs which insert and access data without explicit key. The methodology is based on an analysis of the algebraic semantics of the ADT. Operators are partitioned according to the behaviour of their corresponding operations in the initial algebra. A family of relations, the storage relations of an ADT, is defined. They depend only on the operator partition and reflect an observational view of the ADT. The storage relations are extended to storage graphs: directed graphs with a subset of nodes designated for efficient access. The data structures in our imperative language are chosen according to properties of the storage relations and storage graphs. Linked implementations are synthesised in a stepwise manner by implementing the given ADT first by its storage graphs, and then by linked data structures in the imperative language. Some circumstances under which the resulting programs have constant time complexity are discussed.
APA, Harvard, Vancouver, ISO, and other styles
24

Best, Peter J. "Machine-independent audit trail analysis." Thesis, Queensland University of Technology, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
25

Cline, George E. "A control framework for distributed (parallel) processing environments." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-12042009-020227/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Clayton, Peter Graham. "Interrupt-generating active data objects." Thesis, Rhodes University, 1990. http://hdl.handle.net/10962/d1006700.

Full text
Abstract:
An investigation is presented into an interrupt-generating object model which is designed to reduce the effort of programming distributed memory multicomputer networks. The object model is aimed at the natural modelling of problem domains in which a number of concurrent entities interrupt one another as they lay claim to shared resources. The proposed computational model provides for the safe encapsulation of shared data, and incorporates inherent arbitration for simultaneous access to the data. It supplies a predicate triggering mechanism for use in conditional synchronization and as an alternative mechanism to polling. Linguistic support for the proposal requires a novel form of control structure which is able to interface sensibly with interrupt-generating active data objects. The thesis presents the proposal as an elemental language structure, with axiomatic guarantees which enforce safety properties and aid in program proving. The established theory of CSP is used to reason about the object model and its interface. An overview is presented of a programming language called HUL, whose semantics reflect the proposed computational model. Using the syntax of HUL, the application of the interrupt-generating active data object is illustrated. A range of standard concurrent problems is presented to demonstrate the properties of the interrupt-generating computational model. Furthermore, the thesis discusses implementation considerations which enable the model to be mapped precisely onto multicomputer networks, and which sustain the abstract programming level provided by the interrupt-generating active data object in the wider programming structures of HUL.
APA, Harvard, Vancouver, ISO, and other styles
27

Jin, Xiaoming. "A practical realization of parallel disks for a distributed parallel computing system." [Gainesville, Fla.] : University of Florida, 2000. http://etd.fcla.edu/etd/uf/2000/ane5954/master.PDF.

Full text
Abstract:
Thesis (M.S.)--University of Florida, 2000.
Title from first page of PDF file. Document formatted into pages; contains ix, 41 p.; also contains graphics. Vita. Includes bibliographical references (p. 39-40).
APA, Harvard, Vancouver, ISO, and other styles
28

Nader, Babak. "Parallel solution of sparse linear systems." Full text open access at:, 1987. http://content.ohsu.edu/u?/etd,138.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Khare, Arjun. "ACT++ 3.0 : implementation of the actor model using POSIX threads /." Master's thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-10242009-020041/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Wan, Charn Wing. "The political economy of digital copyright in Hong Kong /." Access full text, abstract and table of contents, 2009. http://libweb.cityu.edu.hk/cgi-bin/ezdb/thesis.pl?jsd-slw-b23750893f.pdf.

Full text
Abstract:
Thesis (JSD)--City University of Hong Kong, 2009.
"Submitted to School of Law in partial fulfillment of the requirements for the degree of Doctor of Juridical Science." Includes bibliographical references (leaves 435-452)
APA, Harvard, Vancouver, ISO, and other styles
31

Gardener, Michael Edwin. "A multichannel, general-purpose data logger." Thesis, Cape Technikon, 1986. http://hdl.handle.net/20.500.11838/2179.

Full text
Abstract:
Thesis (Diploma (Electrical Engineering))--Cape Technikon, 1986.
This thesis describes the implementation of a general-purpose, microprocessor-based data logger. The hardware allows analog data acquisition from one to thirty-two channels with 12-bit resolution and at a data throughput of up to 2 kHz. The data is logged directly to a buffer memory and from there, at the end of each log, it is dumped to an integral cassette data recorder. The recorded data can be transferred from the logger to a desk-top computer, via the IEEE 488 port, for further processing and display. All log parameters are user-selectable by means of menu-prompted keyboard entry, and a real-time clock (RTC) provides date and time information automatically.
APA, Harvard, Vancouver, ISO, and other styles
32

Cuce, Simon. "GLOMAR : a component based framework for maintaining consistency of data objects within a heterogeneous distributed file system." Monash University, School of Computer Science and Software Engineering, 2003. http://arrow.monash.edu.au/hdl/1959.1/5743.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Fowler, Robert Joseph. "Decentralized object finding using forwarding addresses /." Thesis, Connect to this title online; UW restricted, 1985. http://hdl.handle.net/1773/6947.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Pu, Calton. "Replication and nested transactions in the Eden Distributed System /." Thesis, Connect to this title online; UW restricted, 1986. http://hdl.handle.net/1773/6881.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Chuang, Chien-Kuo Eric. "DIB on the Xerox workstation /." Online version of thesis, 1986. http://hdl.handle.net/1850/8856.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Rixner, Lois R. "A survey and analysis of algorithms for the detection of termination in distributed systems /." Online version of thesis, 1991. http://hdl.handle.net/1850/11008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Rodriguez, Wilfredo. "Identifying mechanisms (naming) in distributed systems : goals, implications and overall influence on performance /." Online version of thesis, 1985. http://hdl.handle.net/1850/8820.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Emerson, Glen D. "Projected performance requirements for personnel entering information processing jobs for the federal government /." Full-text version available from OU Domain via ProQuest Digital Dissertations, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
39

Liu, Guangtian. "An event service architecture in distributed real-time systems /." Digital version accessible at:, 1999. http://wwwlib.umi.com/cr/utexas/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Kilpatrick, Carol Elizabeth. "Capture and display of performance information for parallel and distributed applications." Diss., Georgia Institute of Technology, 1991. http://hdl.handle.net/1853/8193.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Gu, Weiming. "On-line monitoring and interactive steering of large-scale parallel and distributed applications." Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/9220.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Chen, Liang. "Performance analysis and improvement of parallel simulation." Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/25477.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Ravindran, K. "Reliable client-server communication in distributed programs." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/27514.

Full text
Abstract:
Remote procedure call (RPC) and shared variables are communication abstractions which allow the various processes of a distributed program, often modelled as clients and servers, to communicate with one another across machine boundaries. A key requirement of these abstractions is to mask the machine and communication failures that may occur during client-server communications. In practice, many distributed applications can inherently tolerate failures under certain situations. If such application-layer information is available to the client-server communication layer (RPC and shared variable), the failure-masking algorithms in that layer can relax the constraints under which they would otherwise have to operate. The relaxation significantly simplifies the algorithms and the underlying message transport layer and allows the formulation of efficient algorithms. This application-driven approach forms the backbone of the failure-masking techniques described in the thesis, as outlined below.
Orphan handling in RPCs: Using the application-driven approach, the thesis introduces a new technique of adopting the orphans caused by failures during RPCs. The adoption technique is preferable to orphan killing because orphan killing wastes any work already completed and requires rollback, which may be expensive and sometimes not meaningful. The thesis incorporates orphan adoption into two schemes of replicating a server: i) a primary-secondary scheme, in which one replica of the server acts as the primary and executes RPCs from clients while the other replicas stand by as secondaries; when the primary fails, one of the secondaries becomes the primary, restarts the server execution from the most recent checkpoint and adopts the orphan; ii) a replicated execution scheme, in which an RPC on the server is executed by more than one replica of the server; when any of the replicas fails, the orphan generated by the failure is adopted by the surviving replicas. Both schemes employ call re-executions by servers based on the application-level idempotency properties of the calls.
Access to shared variables: Contemporary distributed programs deal with a new class of shared variables such as information on name bindings, distributed load and leadership within a service group. Since the consistency constraints on such system variables need not be as strong as those for user data, the access operations on the variables may be made simpler using this application-layer information. Along this direction, the thesis introduces an abstraction, which we call the application-driven shared variable, to govern access operations on the variables. The algorithms for the access operations on a variable use intra-server group communication and enforce consistency of the variable only to the extent required by the application. The thesis describes complete communication models incorporating the application-driven approach to mask failures.
Science, Faculty of
Computer Science, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
44

Pitts, David Vernon. "A storage management system for a reliable distributed operating system." Diss., Georgia Institute of Technology, 1986. http://hdl.handle.net/1853/16895.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Torres-Rojas, Francisco Jose. "Scalable approximations to causality and consistency of distributed objects." Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/9155.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Setiowijoso, Liono. "Data Allocation for Distributed Programs." PDXScholar, 1995. https://pdxscholar.library.pdx.edu/open_access_etds/5102.

Full text
Abstract:
This thesis shows that both data and code must be efficiently distributed to achieve good performance in a distributed system. Most previous research has tried either to distribute code structures to improve parallelism or to distribute data to reduce communication costs. Code distribution (exploiting functional parallelism) is an effort to distribute or duplicate function code to optimize parallel performance. Data distribution, on the other hand, tries to place data structures as close as possible to the function code that uses them, so that communication cost can be reduced. In particular, dataflow researchers have primarily focused on code partitioning and assignment. We have adapted existing data allocation algorithms for use with an existing dataflow-based system, ParPlum, which allows the execution of dataflow graphs on networks of workstations. To evaluate the impact of data allocation, we extended ParPlum to handle data structures more effectively. We then implemented tools to extract from dataflow graphs the information that is relevant to the mapping algorithms and fed this information to our version of a data distribution algorithm. To see the relation between code and data parallelism, we added optimizations for the distribution of the loop function components and the data structure access components. All of this is done automatically, without programmer or user involvement. We ran a number of experiments using matrix multiplication as our workload, with different numbers of processors and different existing partitioning and allocation algorithms. Our results show that automatic data distribution greatly improves the performance of distributed dataflow applications. For example, with 15 x 15 matrices, applying data distribution speeds up execution by about 80% on 7 machines. Using data distribution together with our code optimizations on 7 machines speeds up execution over the base case by 800%. Our work shows that it is possible to make efficient use of distributed networks with compiler support, and that both code mapping and data mapping must be considered to achieve optimal performance.
APA, Harvard, Vancouver, ISO, and other styles
47

潘淑欣 and Shuk-yan Poon. "A decentralized multi-agent system for restructured power system operation." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1997. http://hub.hku.hk/bib/B31219810.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Nguyen, Thuyen Dinh 1959. "A CASE STUDY OF FLEXIBLE DISTRIBUTED PROCESSING SYSTEM IN COPIER DEVELOPMENT (PROTOTYPE, DRIVER, PROTOCOL)." Thesis, The University of Arizona, 1986. http://hdl.handle.net/10150/275518.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Kulatunga, Chamil. "Enforcing receiver-driven multicast congestion control using ECN-Nonce." Thesis, Available from the University of Aberdeen Library and Historic Collections Digital Resources, 2009. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?application=DIGITOOL-3&owner=resourcediscovery&custom_att_2=simple_viewer&pid=33532.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Burke, Janice B. "The effects of informal computer keyboarding on straight copy speed and accuracy." Thesis, This resource online, 1988. http://scholar.lib.vt.edu/theses/available/etd-04272010-020348/.

Full text
APA, Harvard, Vancouver, ISO, and other styles