To see the other types of publications on this topic, follow the link: Computers Computer science.

Dissertations / Theses on the topic 'Computers Computer science'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Computers Computer science.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Webster, Linda D. "Measuring change in computer self-efficacy and computer literacy of undergraduates in an introduction to computers course /." free to MU campus, to others for purchase, 2004. http://wwwlib.umi.com/cr/mo/fullcit?p3164548.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gleeson, William Joseph. "Computer-based decision-making: The impact of personal computers and distributed databases on managers' performance." Diss., The University of Arizona, 1990. http://hdl.handle.net/10150/185309.

Full text
Abstract:
This study was a field experiment to test the influence of two different computer conditions on decision-making in an organizational setting. The experiment was carried out in ten mid-sized corporations. The 117 subjects comprised about equal numbers of managers and non-managers. The computer conditions tested were (1) a personal computer used with a distributed database and (2) the traditional mainframe with a central database supplying information by printouts. The experimental problem was identical in both conditions, as was the data in the database. An additional part of the experiment was to vary the amount of information provided. Half the subjects in both test conditions had only half the information available to them that the other half of the subjects had. The results indicated that personal computers were more efficient by enabling subjects to reach decisions faster. PCs did not produce better outcomes overall. Managers performed best using the traditional printouts. Their performance declined in effectiveness (but not in efficiency) when using a PC. With non-managers the reverse applied. Non-managers performed best when using a PC if they were computer literate. In fact, computer-literate non-managers with PCs performed better than managers in either test condition, whether the managers were computer literate or not. The level of information available is important. More information leads to better decisions. The implications of the results for management are that (1) more training in the use of computers will produce better outcomes in decision-making; (2) PCs can improve productivity by achieving better effectiveness through better decision outcomes and do so more efficiently by taking less time. Non-managers using PCs can make managerial decisions as well as managers can if the non-managers have computer literacy training. This tends to support the view that managers can be de-skilled by the arrival of PCs in the workplace.
APA, Harvard, Vancouver, ISO, and other styles
3

Nadarajah, Kumaravel. "Computers in science teaching: a reality or dream; The role of computers in effective science education: a case of using a computer to teach colour mixing; Career oriented science education for the next millennium." Thesis, Rhodes University, 2000. http://hdl.handle.net/10962/d1003341.

Full text
Abstract:
Science education in South Africa is not improving much. Many science educators do not have appropriate science qualifications. The majority of learners have limited facilities to learn science. In this dilemma, the move to OBE may result in further substantial deterioration of science education. A possible way out is to use computers in science education to facilitate the learning process. This study was designed to investigate how computers contribute to learners’ skills development in a physics course. A series of interactive computer simulations of colour mixing and a number of closely related traditional practical activities were aimed at promoting learners’ understanding of colour. It was concluded that while computer environments have greater potential as learning tools, they also limit interactions in significant ways.
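The colour-mixing simulations referred to in this abstract are not reproduced here, but the additive rule they let learners explore can be sketched in a few lines. The following Python fragment is a minimal illustration, assuming 8-bit RGB channels; it is not taken from the thesis.

```python
def mix_additive(*rgb_colours):
    """Additive (light) colour mixing: channel-wise sum, clipped to the 0-255 range."""
    mixed = [0, 0, 0]
    for r, g, b in rgb_colours:
        mixed = [min(255, m + c) for m, c in zip(mixed, (r, g, b))]
    return tuple(mixed)

# Red light plus green light appears yellow -- the kind of prediction the simulations let learners test.
assert mix_additive((255, 0, 0), (0, 255, 0)) == (255, 255, 0)
```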
APA, Harvard, Vancouver, ISO, and other styles
4

Zilli, Davide. "Smartphone-powered citizen science for bioacoustic monitoring." Thesis, University of Southampton, 2015. https://eprints.soton.ac.uk/382943/.

Full text
Abstract:
Citizen science is the involvement of amateur scientists in research for the purpose of data collection and analysis. This practice, well known to different research domains, has recently received renewed attention through the introduction of new and easy means of communication, namely the internet and the advent of powerful “smart” mobile phones, which facilitate the interaction between scientists and citizens. This is appealing to the field of biodiversity monitoring, where traditional manual surveying methods are slow and time consuming and rely on the expertise of the surveyor. This thesis investigates a participatory bioacoustic approach that engages citizens and their smartphones to map the presence of animal species. In particular, the focus is placed on the detection of the New Forest cicada, a critically endangered insect that emits a high-pitched call, difficult for humans to hear but easily detected by their mobile phones. To this end, a novel real-time acoustic cicada detector algorithm is proposed, which efficiently extracts three frequency bands through a Goertzel filter, and uses them as features for a hidden Markov model-based classifier. This algorithm has permitted the development of a cross-platform mobile app that enables citizen scientists to submit reports of the presence of the cicada. The effectiveness of this approach was confirmed for both the detection algorithm, which achieves an F1 score of 0.82 for the recognition of three acoustically similar insects in the New Forest; and for the mobile system, which was used to submit over 11,000 reports in the first two seasons of deployment, making it one of the largest citizen science projects of its kind. However, the algorithm, though very efficient and easily tuned to different microphones, does not scale effectively to many-species classification. Therefore, an alternative method is also proposed for broader insect recognition, which exploits the strong frequency features and the repeating phrases that often occur in insects' songs. To express these, it extracts a set of modulation coefficients from the power spectrum of the call, and represents them compactly by sampling them in the log-frequency space, avoiding any bias towards the scale of the phrase. The algorithm reaches an F1 score of 0.72 for 28 species of UK Orthoptera over a small training set, and an F1 score of 0.92 for the three insects recorded in the New Forest, though with higher computational cost compared to the algorithm tailored to cicada detection. The mobile app, downloaded by over 3,000 users, together with the two algorithms, demonstrates the feasibility of real-time insect recognition on mobile devices and the potential of engaging a large crowd for the monitoring of the natural environment.
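The abstract names a Goertzel filter as the front end of the cicada detector. A minimal Python sketch of that step is given below; the band centres and the framing are illustrative assumptions rather than the thesis' exact settings, and the hidden Markov model classifier is omitted.

```python
import math

def goertzel_power(frame, sample_rate, target_freq):
    """Signal power near target_freq for one frame of samples, via the Goertzel recurrence."""
    n = len(frame)
    k = int(0.5 + n * target_freq / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in frame:
        s_prev2, s_prev = s_prev, x + coeff * s_prev - s_prev2
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def frame_features(frame, sample_rate, bands=(8_000.0, 14_000.0, 20_000.0)):
    """Three band-power features per frame (band centres here are purely illustrative)."""
    return [goertzel_power(frame, sample_rate, f) for f in bands]
```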
APA, Harvard, Vancouver, ISO, and other styles
5

Watson, Jason. "Monitoring computer-based training over computer networks." Thesis, University of Huddersfield, 1999. http://eprints.hud.ac.uk/id/eprint/6910/.

Full text
Abstract:
As time is becoming an ever more precious commodity in today's workplace, effective training is also taking on an increasingly important role, but finding the time to train today's workforce is becoming increasingly difficult. With employees in diverse locations across the country and across the world, and some working from home, on the road or "hot-desking", we have to take a new approach to training. Fortunately, computer-based training can solve many of the traditional problems, such as the need to bring all trainees together in the same location at the same time. With today's sophisticated computer-based training applications, motivated employees can train where they want, at home or at work, and when they want, at lunchtime or after work. However, there is also a basic legal and pedagogical requirement to record who has been trained and in what. This is very easy in a traditional training scenario, but much more difficult in today's training environments. This problem is currently the major obstacle to the widespread adoption of computer-based training, and looking for a solution to these problems was the aim of this research. This research began by investigating the processes used by multimedia developers when creating Computer Based Training (CBT) applications, identifying the current methodologies, techniques and tools that they use. Very quickly it was easy to see that developers use a whole range of development tools and that their expertise is primarily in the design of training applications, not in programming. Similarly, the students want credit for the training that they undergo but do not want to be distracted by an intrusive monitoring system. The role of the Training Manager is equally important. He or she needs to be able to quickly assess the situation of an individual or a group of students and take remedial action where necessary. Balancing all of these needs in a single generic solution to the monitored training problem was the single biggest challenge. This research has addressed these important problems and has developed a solution that permits the monitoring of student training progress in any location and at any time in a way that is totally transparent to the user. The author integrates this additional functionality into a new or existing training application through a drag-and-drop interface which is very easy to use, creating a monitoring experience that is totally transparent to the trainee, while the Training Manager receives a summary database of student progress. Specifically, the system uses a library of C++ functions that interface to Authorware, Director, Toolbook or a C++ application. The functions permit an author to open a monitoring database at the start of a training session and close it at the end. Whilst the database is open we can record any data that we require regarding student progress and performance. On closing the session, the resulting database is sent to a central collation point using FTP. Students are identified automatically through their IP address or network login, or are asked to log on to the training session manually. The system can write any database format that is required, and if the network is unavailable when the session ends, the database will be saved locally until the next training session. At the central collation point, a specially written application takes the many databases created by individual training sessions and collates them into one large database that can be queried by the training manager.
Small trials were initially performed with a prototype system at the collaborating company, CBL Technology Ltd, which in turn led to larger trials at both Cable and Wireless Communication PLC and the University of Huddersfield. In these trials, authors of CBT applications found the system extremely easy to integrate into their applications, and the training managers and course leaders responsible for training outcomes found the feedback on student performance that the system provided invaluable. This research has demonstrated that it is possible to create a generic monitored training solution that balances the needs of the trainee, the author and the Training Manager. Trainees can train at any time, anywhere in the world, over the Internet or from CD-ROM, and a training manager can monitor their progress provided that at some time they connect to a computer network.
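The monitoring library described above was written in C++ and driven from Authorware, Director, Toolbook or C++ applications. The Python sketch below only illustrates the session lifecycle the abstract describes (open a local log, record progress, upload via FTP on close, keep a local copy if the network is down); the class and server names are hypothetical and are not the thesis' API.

```python
import ftplib
import json
import socket
import uuid
from pathlib import Path

class MonitoringSession:
    """Illustrative training-monitoring session: record events locally, upload by FTP on close."""

    def __init__(self, course_id, ftp_host, ftp_user, ftp_password):
        self.course_id = course_id
        self.ftp_details = (ftp_host, ftp_user, ftp_password)
        self.student_id = socket.gethostbyname(socket.gethostname())   # identify by IP address
        self.records = []
        self.local_file = Path(f"session_{uuid.uuid4().hex}.json")

    def record(self, event, **data):
        """Record one piece of progress or performance data for the current student."""
        self.records.append({"course": self.course_id, "student": self.student_id,
                             "event": event, **data})

    def close(self):
        """Write the session log and send it to the central collation point."""
        self.local_file.write_text(json.dumps(self.records))
        host, user, password = self.ftp_details
        try:
            with ftplib.FTP(host, user, password) as ftp, self.local_file.open("rb") as fh:
                ftp.storbinary(f"STOR {self.local_file.name}", fh)
            self.local_file.unlink()       # uploaded successfully
        except (OSError, ftplib.Error):
            pass                           # network unavailable: keep the local copy for next time
```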
APA, Harvard, Vancouver, ISO, and other styles
6

An, Jianhua. "Cultural factors in constructivist design : computer literacy for the workplace /." Access Digital Full Text version, 1994. http://pocketknowledge.tc.columbia.edu/home.php/bybib/11714025.

Full text
Abstract:
Thesis (Ed.D.)--Teachers College, Columbia University, 1994.
Typescript; issued also on microfilm. Sponsor: Florence McCarthy. Dissertation Committee: John Black. Includes tables. Includes bibliographical references (leaves 170-180).
APA, Harvard, Vancouver, ISO, and other styles
7

Goudie, Robert J. B. "Bayesian structural inference with applications in social science." Thesis, University of Warwick, 2011. http://wrap.warwick.ac.uk/78778/.

Full text
Abstract:
Structural inference for Bayesian networks is useful in situations where the underlying relationship between the variables under study is not well understood. This is often the case in social science settings in which, whilst there are numerous theories about interdependence between factors, there is rarely a consensus view that would form a solid base upon which inference could be performed. However, there are now many social science datasets available with sample sizes large enough to allow a more exploratory structural approach, and this is the approach we investigate in this thesis. In the first part of the thesis, we apply Bayesian model selection to address a key question in empirical economics: why do some people take unnecessary risks with their lives? We investigate this question in the setting of road safety, and demonstrate that less satisfied individuals wear seatbelts less frequently. Bayesian model selection over restricted structures is a useful tool for exploratory analysis, but fuller structural inference is more appealing, especially when there is a considerable quantity of data available, but scant prior information. However, robust structural inference remains an open problem. Surprisingly, it is especially challenging for large n problems, which are sometimes encountered in social science. In the second part of this thesis we develop a new approach that addresses this problem: a Gibbs sampler for structural inference, which we show gives robust results in many settings in which existing methods do not. In the final part of the thesis we use the sampler to investigate depression in adolescents in the US, using data from the Add Health survey. The result stresses the importance of adolescents not getting medical help even when they feel they should, an aspect that has been discussed previously, but not emphasised.
APA, Harvard, Vancouver, ISO, and other styles
8

Tan, Nai Kwan. "A firewall training program based on CyberCIEGE." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2005. http://library.nps.navy.mil/uhtbin/hyperion/05Dec%5FTan%5FNai.pdf.

Full text
Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, December 2005.
Thesis Advisor(s): Cynthia E. Irvine, Paul C. Clark. Includes bibliographical references (p.103-104). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
9

Chew, Heng Hui. "A secure alert system." Thesis, Monterey, Calif. : Naval Postgraduate School, 2006. http://bosun.nps.edu/uhtbin/hyperion.exe/06Dec%5FChew.pdf.

Full text
Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, December 2006.
Thesis Advisor(s): Gurminder Singh, Karen Burke. "December 2006." Includes bibliographical references (p. 63-66). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
10

Chiang, Ken H. "A prototype implementation of a time interval file protection system in Linux." Thesis, Monterey, California. Naval Postgraduate School, 2006. http://hdl.handle.net/10945/2359.

Full text
Abstract:
Control of access to information based on temporal attributes has many potential applications. Examples include student user accounts set to expire upon graduation; files marked as time-sensitive so that their contents can be protected appropriately and the period of access to them controlled; and cryptographic keys configured to automatically expire and be unusable beyond a specific time. This thesis implements a prototype of the Time Interval Access Control (TIAC) model in the context of a protected file system for the popular open-source Linux operating system. The Linux Security Module framework is used for the implementation, which includes temporal attributes associated both with the files and the users. The implementation includes modifications to the file system as well as low-level information access constructs. As part of the design process, testing and performance analysis were conducted. Since the temporal access control mechanism is built into the kernel rather than the application, bypassing the mechanism becomes more difficult. Kernel level implementation also affords the same policy enforcement functionality to different applications, thus reducing human errors in their development. This thesis is relevant to the research on dynamic security services for information protection envisioned by the DoD Global Information Grid (GIG).
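The Time Interval Access Control (TIAC) decision itself is simple to state, even though the thesis enforces it inside the kernel through the Linux Security Module framework. The following Python fragment is a user-space illustration of the policy check only, with hypothetical interval values; it is not the kernel implementation.

```python
from datetime import datetime

def tiac_allows(file_interval, user_interval, now=None):
    """Grant access only if 'now' lies inside both the file's and the user's validity intervals.

    Each interval is a (start, end) pair of datetimes; None means unbounded on that side.
    This is an illustration of the policy decision only -- the thesis enforces it in-kernel
    via a Linux Security Module hook.
    """
    now = now or datetime.now()

    def inside(interval):
        start, end = interval
        return (start is None or start <= now) and (end is None or now <= end)

    return inside(file_interval) and inside(user_interval)

# Example: a file readable only during 2024, accessed by a user whose account expires mid-year.
allowed = tiac_allows(
    (datetime(2024, 1, 1), datetime(2024, 12, 31)),
    (None, datetime(2024, 6, 30)),
)
```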
APA, Harvard, Vancouver, ISO, and other styles
11

Bıçakçı, Kemal. "On the efficiency of authentication protocols, digital signatures and their applications in E-Health a top-down approach /." Ankara : METU, 2003. http://etd.lib.metu.edu.tr/upload/1101500/index.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Erdoğdu, Utku. "Resource based plan revision in dynamic multi-agent environments." Ankara : METU, 2004. http://etd.lib.metu.edu.tr/upload/12604721/index.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Schultz, John S. "Offline forensic analysis of Microsoft Windows XP physical memory." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Sep%5FSchultz.pdf.

Full text
Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, September 2006.
Thesis Advisor(s): Chris Eagle. "September 2006." Includes bibliographical references (p. 73-74). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
14

Humenn, Polar. "The authorization calculus." Related electronic resource: Current Research at SU : database of SU dissertations, recent titles available, full text:, 2008. http://wwwlib.umi.com/cr/syr/main.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Sawsaa, Ahlam. "A generic model of ontology to visualize information science domain (OIS)." Thesis, University of Huddersfield, 2013. http://eprints.hud.ac.uk/id/eprint/17545/.

Full text
Abstract:
Ontology has been a subject of many studies carried out in artificial intelligence (AI) and information system communities. Ontology has become an important component of the semantic web, covering a variety of knowledge domains. Although building domain ontologies still remains a big challenge with regard to their design and implementation, there are still many areas that need to create ontologies. Information Science (IS) is one of these areas, needing a unified ontology model to facilitate information access among heterogeneous data resources and share a common understanding of the domain knowledge. The objective of this study is to develop a generic model of ontology that serves as a foundation of knowledge modelling for applications and aggregation with other ontologies, to facilitate information exchange between different systems. This model will serve as metadata for a knowledge base system to be used for different purposes of interest, such as education applications to support the educational needs of teachers, students and information system developers, and enhancing the index tool in libraries to facilitate access to information collections. This thesis describes the process of modelling the domain knowledge of Information Science (IS). The building process of the ontology of Information Science (OIS) is preceded by developing taxonomies and thesauruses of IS. This research adopts Methontology to develop the ontology of Information Science (OIS). This choice of method relies on the research motivations and aims, with an analysis of some ontology development methodologies and the IEEE 1074-2006 standard for developing software project life cycle processes as criteria. The methodology mainly consists of specification, conceptualization, formalization, implementation, maintenance and evaluation. The knowledge model was formalized using Protégé to generate the ontology code. During the development process the model has been designed and evaluated. This research presents the following contributions to the present state of the art on ontology construction:
- The main achievement of the study is in constructing a new model of Information Science ontology (OIS). The OIS ontology is a generic model that contains only the key objects and associated attributes with relationships. The model defines 706 concepts which will be widely used in Information Science applications. It provides standard definitions for domain terms used in annotation databases, and avoids the consistency problems caused by various ontologies which will have the potential of development by different groups and institutions in the IS domain area.
- It provides a framework for analysing IS knowledge to obtain a classification based on facet classification. The ontology modelling approach is based on top-down and bottom-up methods. The top-down method begins with an abstract view of the domain, while the bottom-up method starts with a description of the domain to gain a hierarchical taxonomy.
- Designing the Ontocop system, a novel method presented to support the development process as a specific virtual community of IS. Ontocop consists of a number of experts in the subject area around the world. Their feedback and assessment improve the ontology development during the creation process.
The findings of the research revealed that overall feedback from the IS community has been positive and that the model met the ontology quality criteria. It was appropriate to provide consistency and a clear understanding of the subject area. The OIS ontology unifies information science, which is composed of library science, computer science and archival science, by creating a theoretical base useful for further practical systems. Developing the ontology of information science (OIS) is not an easy task, due to the complex nature of the field. It needs to be integrated with other ontologies, such as social science, cognitive science, philosophy, law, management and mathematics, to provide basic knowledge for the semantic web and also to leverage information retrieval.
APA, Harvard, Vancouver, ISO, and other styles
16

Huang, Wei-Han 1979. "Instrumentation for quantum computers." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/30104.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, February 2004.
Includes bibliographical references (p. 209-215).
Quantum computation poses challenging engineering and basic physics issues for the control of nanoscale systems. In particular, experimental realizations of up to seven-qubit NMR quantum computers have acutely illustrated how quantum circuits require extremely precise control instrumentation for pulsed excitation. In this thesis, we develop two general-purpose, low-cost pulse programmers and two Class E power amplifiers, designed for precise control of qubits and complex pulse excitation. The first-generation pulse programmer has timing resolutions of 235 ns, while the second-generation one has resolutions of 10 ns. The Class E power amplifier has µs transient response times, a high quality-factor, and a small form factor. The verification of the pulse programmer and the Class E power amplifier is demonstrated using a customized nuclear quadrupole resonance (NQR) spectrometer, which incorporates both devices. The two devices control the generation of RF pulses used in NQR experiments on paradichlorobenzene (C₆H₄Cl₂) and sodium nitrite (NaNO₂). The NQR signals originating from ¹⁴N in sodium nitrite and from ³⁵Cl in paradichlorobenzene are measured using the NQR spectrometer. The pulse programmer and the Class E power amplifier represent first steps towards development of practical NMR quantum computers.
by Wei-Han Huang.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
17

Witt, Hendrik. "User Interfaces for Wearable Computers Development and Evaluation /." Wiesbaden : Vieweg+Teubner Verlag / GWV Fachverlage GmbH, Wiesbaden, 2008. http://sfx.ethz.ch/sfx_locater?sid=ALEPH:EBI01&genre=book&isbn=9783835192324&id=doi:10.1007/978-3-8351-9232-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Sundar, N. S. "Data access optimizations for parallel computers /." The Ohio State University, 1998. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487950658548697.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Brna, P. "Confronting science misconceptions with the help of a computer." Thesis, University of Edinburgh, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.377488.

Full text
Abstract:
A long-standing aim of science educators is to help secondary school science students to learn efficiently through various exploratory regimes. A further aim, currently held by several leading science educators, is to promote learning by confronting students with the inconsistencies entailed by their own beliefs. The claim at the heart of the thesis is that well designed computer-based modelling facilities can provide advantages over many approaches exploiting other media, and that such facilities can be used to promote the kinds of conflict that are believed to be beneficial. This claim is explored through an analysis of the role of modelling in science, the nature of students' beliefs about physical phenomena that conflict with more established beliefs, and of how computer-based modelling environments can promote learning through modelling. This requires consideration of a wide number of issues relating to educational theory and practice, student learning, the design of modelling environments, and methodologies and techniques taken from the field of Artificial Intelligence. The methodology adopted required that a number of computer environments be constructed and observations made of their usage by students. The environments are used to focus attention on the various issues. The results contained within this thesis include a short analysis of the educational implications if the use of modelling environments were to be more widely adopted, an analysis of the strengths and weaknesses of these systems in terms of how they promote student learning, particularly in relation to the nature of the beliefs that students hold, and design criteria for how future systems might be built.
APA, Harvard, Vancouver, ISO, and other styles
20

Zhou, Xiaosong. "Understanding serendipity and its application in the context of information science and technology." Thesis, University of Nottingham, 2018. http://eprints.nottingham.ac.uk/52100/.

Full text
Abstract:
Serendipity is widely experienced in current society, especially in the digital world. According to the Oxford Concise English Dictionary, the term “serendipity” is defined as “the occurrence and development of events by chance in a happy or beneficial way”. This PhD research project aims to understand serendipity in the context of information research, and then attempts to design information technologies which can support the encountering of serendipity in cyberspace. The PhD project is organised in two parts. The first part investigates the nature of serendipity by conducting three user studies. After a systematic literature review of existing empirical studies of serendipity, the author finds there are research methodological problems in current studies; for example, the most widely used methods are conventional ones such as interviews or surveys, and it is mainly subjective data that can be collected from participants. The author then conducted the first user study, an expert interview, in which nine experts in the research area of serendipity were interviewed with a focus on research methodological issues. This study successfully helped the author to gain a broader understanding of the advantages and disadvantages of employing different research methods in studying serendipity. The second user study, a diary-based study, was then performed among a group of Chinese scholars with the aim of further investigating the role of “context” in the process of serendipity. The study lasted two weeks and successfully collected 62 serendipitous cases from 16 participants. The outcome of this study helped us to better understand how these Chinese scholars experience serendipity, and a context-based research model was constructed, in which the roles of external context, social context and internal context during the process of serendipity were identified in detail. One interesting finding from the second user study is that emotions played a role in these participants' experiencing of serendipity, an aspect largely ignored by current serendipity researchers; therefore, the author conducted the third user study with the main objective of finding out the impact of emotions during serendipitous encountering. This study employed an electrodermal activity (EDA) device to record participants' physiological signals during the process of serendipity, implemented through a self-developed algorithm embedded, via a “Wizard of Oz” approach, in a sketch game. The results of the study show that participants are more likely to experience serendipity under the influence of positive emotions and/or with skin conductance responses (SCRs). The second part of the PhD project is the application of serendipity through recommendation technology. A recommender system is an important area that practises serendipity in the digital world, as users in today's society are no longer satisfied with “accurate” recommendations; they want to be recommended information that is more serendipitous and interesting to them. However, in reviewing existing studies on serendipitous recommendation, I found that the inspiring achievements in understanding the nature of serendipity from information science have failed to gain attention from researchers in the area of recommender systems.
I then developed a new serendipitous recommendation algorithm by adopting the theory of serendipity from information research and implemented the algorithm on a real data set. The algorithm was implemented on MovieLens, which involves 138,493 users with about 20,000,263 ratings across 27,278 movies. The evaluation of the algorithm was conducted on a sub-dataset, which consists of 855,598 ratings from 2,113 users on 10,197 movies. The developed algorithm was compared with two widely used collaborative filtering algorithms (user-based collaborative filtering and item-based collaborative filtering), and the results demonstrated that the developed algorithm is more effective in recommending “unexpected” and “serendipitous” movies to users. A follow-up user study of twelve movie scholars showed that these participants could experience serendipity when they were recommended movies under the developed algorithm; and, compared to user-based collaborative filtering, these participants were more willing to follow the recommendations made by the serendipitous algorithm.
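The abstract does not give the scoring function of the developed algorithm, so it is not reproduced here. As a rough illustration of the kind of re-ranking such work involves, the sketch below discounts items that an obvious popularity baseline would already recommend; the function and the alpha trade-off are assumptions for illustration, not the thesis' method.

```python
def rerank_for_unexpectedness(candidate_scores, popular_items, alpha=0.7):
    """Re-rank a baseline recommender's output, discounting items a 'primitive' recommender
    (here: global popularity) would already suggest. Purely illustrative, not the thesis' algorithm.

    candidate_scores: {item_id: predicted_rating} from e.g. user-based collaborative filtering
    popular_items:    set of item_ids an obvious baseline would recommend
    alpha:            trade-off between predicted relevance and unexpectedness
    """
    reranked = []
    for item, score in candidate_scores.items():
        unexpectedness = 0.0 if item in popular_items else 1.0
        reranked.append((alpha * score + (1 - alpha) * unexpectedness, item))
    return [item for _, item in sorted(reranked, reverse=True)]
```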
APA, Harvard, Vancouver, ISO, and other styles
21

Robinson, Sionade Ann. "The utility of analogy in systems sciences." Thesis, City, University of London, 1990. http://openaccess.city.ac.uk/17419/.

Full text
Abstract:
The structure of the thesis reflects the three main areas of investigation. The legitimacy of analogy as a systems concept, the derivation of a model of analogy for systems thinkers and the description of a framework for practice. In the first section we are concerned with establishing an appreciation and understanding of the potential utility in the concept of analogy for systems thinkers. Having briefly surveyed the history of analogy in systems thinking and acknowledging the CUITent methodological interest in metaphor we note that our interest in analogy has been a target for our critics and led to a loss of credibility. The thesis calls for a re-evaluation of this situation and we hence describe a system thinker's view of science as the grounds on which the utility of analogy is normally dismissed. The first three chapters show that the basis on which science attacks analogy as invalid and inappropriate is itself contentious and that identified 'weaknesses' in the scientific framework can become strengths in the re-conceptualisation of a model of analogy. We consider and distinguish the dynamic relationships between analogy, model and metaphor. In the second section having established the potential value of analogy as a concept, the thesis develops an explanation of how a model of analogy for systems thinkers can be conceptualised. In the development of the model we will consider particular implications of three types of analogy, 'positive', 'negative' and 'neutral' analogy and discuss the suggestion that they reveal possibilities for exploring different and contrasting rationalities; these issues will be discussed looking at the relationship between analogy and rationality and in this context the validity of the argument from analogy. In the final section the thesis asserts that that systems thinking should not shy away from explicit use of analogy and shows how can use the framework of analogy to reconceptualise systems concepts.
APA, Harvard, Vancouver, ISO, and other styles
22

Ni, Lijun. "Building professional identity as computer science teachers: supporting high school computer science teachers through reflection and community building." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42870.

Full text
Abstract:
Computing education requires qualified computing teachers. The reality is that too few high schools in the U.S. have computing/computer science teachers with formal computer science (CS) training, and many schools do not have a CS teacher at all. Moreover, the teacher retention rate is often low. Beginning teacher attrition rates are particularly high in secondary education. Therefore, in addition to the need for preparing new CS teachers, we also need to support those teachers we have recruited and trained to become better teachers and continue teaching CS. Teacher education literature, especially teacher identity theory, suggests that a strong sense of teacher identity is a major indicator or feature of committed, qualified teachers. However, under the current educational system in the U.S., it can be challenging to establish teacher identity for high school (HS) CS teachers, e.g., due to a lack of teacher certification for CS. This thesis work centers upon understanding the sense of identity HS CS teachers hold and exploring ways of supporting their identity development through a professional development program: the Disciplinary Commons for Computing Educators (DCCE). DCCE has a major focus on promoting reflection on teaching practice and community building. With scaffolded activities such as course portfolio creation, peer review and peer observation among a group of HS CS teachers, it offers opportunities for CS teachers to explicitly reflect on and narrate their teaching, which is a central process of identity building through their participation within the community. In this thesis research, I explore the development of CS teacher identity through professional development programs. I first conducted an interview study with local HS CS teachers to understand their sense of identity and the factors influencing their identity formation. I then designed and enacted the professional development program (DCCE) and conducted case studies with DCCE participants to understand how their participation in DCCE supported their identity development as CS teachers. Overall, I found that these CS teachers held different teacher identities with varied features related to their motivation and commitment in teaching CS. I identified four concrete factors that contributed to these teachers' sense of professional identity as CS teachers. I addressed some of these issues for CS teachers' identity development (especially the lack of community) through offering professional development opportunities with a major focus on teacher reflection and community building. Results from this work indicate a potential model for supporting CS teacher identity development, mapping the characteristics of the professional development program to particular facets of CS teacher identity. This work offers further understanding of the unique challenges that current CS teachers are facing in their CS teaching, as well as the challenges of preparing and supporting CS teachers. My findings also suggest guidelines for teacher education and professional development program design and implementation for building committed, qualified CS teachers in ways that promote the development of CS teacher identity.
APA, Harvard, Vancouver, ISO, and other styles
23

Stalker, R. "Engineer-computer interaction for structural monitoring." Thesis, Lancaster University, 2000. http://eprints.lancs.ac.uk/11792/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Kim, Jang Don. "Applications performance on reconfigurable computers." Thesis, Massachusetts Institute of Technology, 1997. http://hdl.handle.net/1721.1/42711.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Keir, Paul. "Design and implementation of an array language for computational science on a heterogeneous multicore architecture." Thesis, University of Glasgow, 2012. http://theses.gla.ac.uk/3645/.

Full text
Abstract:
The packing of multiple processor cores onto a single chip has become a mainstream solution to fundamental physical issues relating to the microscopic scales employed in the manufacture of semiconductor components. Multicore architectures provide lower clock speeds per core, while aggregate floating-point capability continues to increase. Heterogeneous multicore chips, such as the Cell Broadband Engine (CBE) and modern graphics chips, also address the related issue of an increasing mismatch between high processor speeds and huge latency to main memory. Such chips tackle this memory wall by the provision of addressable caches, increased bandwidth to main memory, and fast thread context switching. An associated cost is often reduced functionality of the individual accelerator cores, and the increased complexity involved in their programming. This dissertation investigates the application of a programming language supporting the first-class use of arrays, and capable of automatically parallelising array expressions, to the heterogeneous multicore domain of the CBE, as found in the Sony PlayStation 3 (PS3). The language is a pre-existing and well-documented proper subset of Fortran, known as the ‘F’ programming language. A bespoke compiler, referred to as E, is developed to support this aim, and written in the Haskell programming language. The output of the compiler is in an extended C++ dialect known as Offload C++, which targets the PS3. A significant feature of this language is its use of multiple, statically typed, address spaces. By focusing on generic, polymorphic interfaces for both the generated and hand-constructed code, a number of interesting design patterns relating to memory locality are introduced. A suite of medium-sized (100-700 lines), real-world benchmark programs is used to evaluate the performance, correctness, and scalability of the compiler technology. Absolute speedup values, well in excess of one, are observed for all of the programs. The work ultimately demonstrates that an array language can significantly reduce the effort expended to utilise a parallel heterogeneous multicore architecture, while retaining high performance. A substantial, related advantage in using standard ‘F’ is that any Fortran compiler can create debuggable, and competitively performing, serial programs.
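The thesis compiles whole-array expressions written in the 'F' subset of Fortran; NumPy is used below only as a stand-in to illustrate why such expressions are attractive to an automatically parallelising compiler. This is an analogy, not the thesis' language or compiler.

```python
import numpy as np

# A whole-array expression carries no loop-order constraints, so a compiler (or library)
# is free to split the work across cores -- the property an array-language compiler exploits
# when targeting accelerator cores such as those of the Cell Broadband Engine.
u = np.linspace(0.0, 1.0, 1_000_000)
v = np.sin(u) ** 2 + np.cos(u) ** 2      # one array expression, no explicit loop
assert np.allclose(v, 1.0)
```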
APA, Harvard, Vancouver, ISO, and other styles
26

Moore, Jeffery Logan. "A questionnaire survey of the teaching of computer studies, pupils attitudes toward computers and perceptions of the learning environment." Thesis, University of Hull, 1988. http://hydra.hull.ac.uk/resources/hull:3111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Barton, Roy. "Computers and practical work in science education : a comparative study." Thesis, University of East Anglia, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.318020.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Ziani, Ridha. "On the implementation of P-RAM algorithms on feasible SIMD computers." Thesis, University of Warwick, 1992. http://wrap.warwick.ac.uk/110580/.

Full text
Abstract:
The P-RAM model of computation has proved to be a very useful theoretical model for exploiting and extracting inherent parallelism in problems and thus for designing parallel algorithms. Therefore, it becomes very important to examine whether results obtained for such a model can be translated onto machines considered to be more realistic in the face of current technological constraints. In this thesis, we show how the implementation of many techniques and algorithms designed for the P-RAM can be achieved on the feasible SIMD class of computers. The first investigation concerns classes of problems solvable on the P-RAM model using the recursive techniques of compression, tree contraction and 'divide and conquer'. For such problems, specific methods are emphasised to achieve efficient implementations on some SIMD architectures. Problems such as list ranking, polynomial and expression evaluation are shown to have efficient solutions on the 2-dimensional mesh-connected computer. The balanced binary tree technique is widely employed to solve many problems in the P-RAM model. By proposing an implicit embedding of the binary tree of size n on a (√n x √n) mesh-connected computer (contrary to the usual H-tree approach, which requires a mesh of size ≈ 2√n x 2√n), we show that many of the problems solvable using this technique can be efficiently implemented on this architecture. Two efficient O(√n) algorithms for solving the bracket matching problem are presented. Consequently, the problems of expression evaluation (where the expression is given in array form), evaluating algebraic expressions with a carrier of constant bounded size, and parsing expressions of both bracket and input-driven languages are all shown to have efficient solutions on the 2-dimensional mesh-connected computer. Dealing with non-tree-structured computations, we show that the Eulerian tour problem for a given graph with m edges and maximum vertex degree d can be solved in O(d√n) parallel time on the 2-dimensional mesh-connected computer. A way to increase processor utilisation on the 2-dimensional mesh-connected computer is also presented. The method suggested consists of pipelining sets of iteratively solvable problems, each of which at each step of its execution uses only a fraction of the available PEs.
APA, Harvard, Vancouver, ISO, and other styles
29

Law, Timothy R. "An algorithm for computing short-range forces in molecular dynamics simulations with non-uniform particle densities." Thesis, University of Warwick, 2017. http://wrap.warwick.ac.uk/111980/.

Full text
Abstract:
We develop the projection sorting algorithm, used to compute pairwise short-range interaction forces between particles in molecular dynamics simulations. We contrast this algorithm with the state of the art and discuss situations where it may be particularly effective. We then explore the efficient implementation of the projection sorting algorithm in both on-node (shared memory parallel) and off-node (distributed memory parallel) environments. We provide AVX, AVX2, KNC and AVX-512 intrinsic implementations of the force calculation kernel. We use the modern multi- and many-core architectures Intel Haswell, Broadwell, Knights Corner (KNC) and Knights Landing (KNL) as a representative slice of modern High Performance Computing (HPC) installations. In the course of implementation we use our algorithm as a means of optimising a contemporary biophysical molecular dynamics simulation of chromosome condensation. We compare state-of-the-art Molecular Dynamics (MD) algorithms and projection sorting, and experimentally demonstrate the performance gains possible with our algorithm. These experiments are carried out in single- and multi-node configurations. We observe speedups of up to 5x when comparing our algorithm to the state of the art, and up to 10x when compared to the original unoptimised simulation. These optimisations have directly affected the ability of domain scientists to carry out their work.
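Projection sorting as described above is implemented with SIMD intrinsics and distributed parallelism in the thesis; the serial Python sketch below shows only the underlying idea (project onto an axis, sort, and sweep within the cutoff), with the axis choice and data layout as assumptions.

```python
import numpy as np

def short_range_pairs(positions, cutoff, axis=None):
    """Candidate interacting pairs via projection sorting (serial, illustrative only).

    Project each particle onto an axis, sort by the projected coordinate, then sweep:
    a pair can only be within 'cutoff' if its projections differ by less than 'cutoff'.
    """
    positions = np.asarray(positions, dtype=float)
    axis = np.ones(positions.shape[1]) if axis is None else np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    proj = positions @ axis
    order = np.argsort(proj)
    pairs = []
    for a, i in enumerate(order):
        for j in order[a + 1:]:
            if proj[j] - proj[i] >= cutoff:
                break      # projections are sorted: no later particle can be within the cutoff
            if np.linalg.norm(positions[i] - positions[j]) < cutoff:
                pairs.append((int(i), int(j)))
    return pairs
```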
APA, Harvard, Vancouver, ISO, and other styles
30

Khan, Javed Arif. "A visual adaptive authoring framework for adaptive hypermedia." Thesis, University of Warwick, 2018. http://wrap.warwick.ac.uk/111668/.

Full text
Abstract:
In a linear hypermedia system, all users are offered a standard series of hyperlinks. Adaptive Hypermedia (AH) tailors what the user sees to the user's goals, abilities, interests, knowledge and preferences. Adaptive Hypermedia is said to be the answer to the 'lost in hyperspace' phenomenon, where the user has too many hyperlinks to choose from, and has little knowledge to select the most appropriate hyperlink. AH offers a selection of links and content that is most appropriate to the current user. In an Adaptive Educational Hypermedia (AEH) course, a student's learning experiences can be personalised using a User Model (UM), which could include information such as the student's knowledge level, preferences and culture. Besides these basic components, a Goal Model (GM) can represent the goal the users should meet and a Domain Model (DM) would represent the knowledge domain. Adaptive strategies are sets of adaptive rules that can be applied to these models, to allow the personalisation of the course for students, according to their needs. From the many interacting elements, it is clear that the authoring process is a bottleneck in adaptive course creation, which needs to be improved in terms of interoperability, usability and reuse of the adaptive behaviour (strategies). Authoring of Adaptive Hypermedia is considered to be difficult and time consuming. There is great scope for improving authoring tools in Adaptive Educational Hypermedia systems, to aid already burdened authors in creating adaptive courses easily. Adaptation specifications are very useful in creating adaptive behaviours, to support the needs of a group of learners. Authors often lack the time or the skills needed to create new adaptation specifications from scratch. Creating an adaptation specification requires the author to know and remember the programming language syntax, which places a knowledge barrier for the author. LAG is a complete and useful programming language, which, however, is considered too complex for authors to deal with directly. This thesis thus proposes a visual framework (LAGBlocks) for the LAG adaptation language and an authoring tool (VASE) to utilise the proposed visual framework, to create adaptive specifications by manipulating visual elements. It is shown that the VASE authoring tool, along with the visual framework, enables authors to create adaptive specifications with ease and assists authors in creating adaptive specifications which promote the "separation of concerns". The VASE authoring tool offers code completeness, correctness at design time, and also allows for adaptive strategies to be used within other tools for adaptive hypermedia. The goal is thus to make adaptive specifications easier to create and to share for authors with little or no programming knowledge and experience. This thesis looks at three aspects of authoring in adaptive educational hypermedia systems. The first aspect of the thesis is concerned with problems faced by the author of an adaptive hypermedia system; the second aspect is concerned with describing the findings gathered from investigating previously developed authoring tools; and the final aspect of the thesis is concerned with the proposal, implementation and evaluation of a new authoring tool that improves the authoring process for authors with different knowledge, backgrounds and experience.
The purpose of the new tool, VASE, is to enable authors to create adaptive strategies in a puzzle-building manner; moreover, the created adaptive strategies could be used within (are compatible with) other systems in adaptive hypermedia, which use the LAG programming language.
APA, Harvard, Vancouver, ISO, and other styles
31

Qahmash, Ayman. "Towards a model of giftedness in programming : an investigation of programming characteristics of gifted students at University of Warwick." Thesis, University of Warwick, 2018. http://wrap.warwick.ac.uk/114146/.

Full text
Abstract:
This study investigates characteristics related to learning programming for gifted first-year computer science students. These characteristics include mental representations, knowledge representations, coding strategies, and attitudes and personality traits. This study was motivated by the need to develop a theoretical framework to define giftedness in programming. In doing so, it aims to close the gap between gifted education and computer science education, allowing gifted programmers to be supported. Previous studies indicated a lack of theoretical foundation for gifted education in computer science, especially for identifying gifted programmers, which may have resulted in concerns about the identification process and/or inappropriate support. The study starts by investigating the relationship between mathematics and programming. We collected 3060 records of raw data of students' grades from 1996 to 2015. Descriptive statistics and the Pearson product-moment correlation test were used for the analysis. The results indicate a statistically significant positive correlation between mathematics and programming in general and between specific mathematics and programming modules. The study then evolves to investigate other programming-related characteristics using a case study methodology, collecting quantitative and qualitative data. A sample of n=9 cases of gifted students was selected and interviewed. In addition, we collected the students' grades, code-writing problems and project (Witter) source codes and analysed these data using specific analysis procedures according to each method. The results indicate that gifted student programmers might possess a single characteristic or multiple characteristics that have large overlaps. We introduce a model to define giftedness in programming that consists of three profiles: mathematical ability, creativity and personal traits, with each profile consisting of sub-characteristics.
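The first analysis step named in the abstract is a Pearson product-moment correlation between mathematics and programming marks. A minimal sketch follows; the marks shown are hypothetical and merely stand in for the 3060-record grade data set.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired mark vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical paired module marks (mathematics, programming) for a handful of students.
maths_marks       = [72, 65, 80, 58, 91, 67]
programming_marks = [70, 60, 85, 55, 88, 72]
r = pearson_r(maths_marks, programming_marks)   # a value near +1 indicates a strong positive association
```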
APA, Harvard, Vancouver, ISO, and other styles
32

Triantafyllidis, Vasileios. "High-dimensional-output surrogate models for uncertainty and sensitivity analyses." Thesis, University of Warwick, 2018. http://wrap.warwick.ac.uk/114588/.

Full text
Abstract:
Computational models that describe complex physical phenomena tend to be computationally expensive and time consuming. Partial differential equation (PDE) based models in particular produce spatio-temporal data sets in high-dimensional output spaces. Repeated calls of computer models to perform tasks such as sensitivity analysis, uncertainty quantification and design optimization can become computationally infeasible as a result. While constructing an emulator is one solution to approximate the outcome of expensive computer models, it is not always capable of dealing with high-dimensional data sets. To deal with high-dimensional data, in this thesis emulation strategies (Gaussian processes (GPs), artificial neural networks (ANNs) and support vector machines (SVMs)) are combined with linear and non-linear dimensionality reduction techniques (kPCA, Isomap and diffusion maps) to develop efficient emulators. For sensitivity analysis (variance based), a probabilistic framework is developed to account for the emulator uncertainty, and the method is extended to multivariate outputs, with a derivation of new semi-analytical results for performing rapid sensitivity analysis of univariate or multivariate outputs. The developed emulators are also used to extend reduced order models (ROMs) based on proper orthogonal decomposition to parameter-dependent PDEs, including an extension of the discrete empirical interpolation method for non-linear PDE systems.
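One concrete combination the abstract describes is a Gaussian process emulator paired with kernel PCA for high-dimensional outputs. The scikit-learn sketch below is an illustrative pipeline under that assumption (kPCA compression, one GP per latent coordinate); kernel choices and component counts are placeholders, and the thesis' other emulators (ANNs, SVMs) and reduction methods (Isomap, diffusion maps) are omitted.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.gaussian_process import GaussianProcessRegressor

def fit_field_emulator(X_params, Y_fields, n_components=5):
    """Emulate a high-dimensional model output: compress the field with kPCA,
    then fit one Gaussian process per reduced coordinate (illustrative pipeline)."""
    kpca = KernelPCA(n_components=n_components, kernel="rbf", fit_inverse_transform=True)
    Z = kpca.fit_transform(Y_fields)                         # (n_samples, n_components)
    gps = [GaussianProcessRegressor().fit(X_params, Z[:, j]) for j in range(Z.shape[1])]
    return kpca, gps

def predict_field(kpca, gps, X_new):
    """Predict the latent coordinates with the GPs, then map back to the full field."""
    Z_new = np.column_stack([gp.predict(X_new) for gp in gps])
    return kpca.inverse_transform(Z_new)
```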
APA, Harvard, Vancouver, ISO, and other styles
33

Biddiscombe, John A. "Dataflow methods in HPC, visualisation and analysis." Thesis, University of Warwick, 2017. http://wrap.warwick.ac.uk/103415/.

Full text
Abstract:
The processing power available to scientists and engineers using supercomputers over the last few decades has grown exponentially, permitting significantly more sophisticated simulations and, as a consequence, generating proportionally larger output datasets. This change has taken place in tandem with a gradual shift in the design and implementation of simulation and post-processing software, from simulation as a first step and visualisation/analysis as a second, towards in-situ, on-the-fly methods that provide immediate visual feedback, place less strain on file-systems and reduce overall data movement and copying. Concurrently, processor speed increases have dramatically slowed, and multi- and many-core architectures have instead become the norm for virtually all High Performance Computing (HPC) machines. This in turn has led to a shift away from the traditional distributed one-rank-per-node model, to one rank per process using multiple processes per multicore node, and then back towards one rank per node again, using distributed and multi-threaded frameworks combined. This thesis consists of a series of publications that demonstrate how software design for analysis and visualisation has tracked these architectural changes and pushed the boundaries of HPC visualisation using dataflow techniques in distributed environments. The first publication shows how support for the time dimension in parallel pipelines can be implemented, demonstrating how information flow within an application can be leveraged to optimise performance and add features such as analysis of time-dependent flows and comparison of datasets at different timesteps. A method of integrating dataflow pipelines with in-situ visualisation is subsequently presented, using asynchronous coupling of user-driven GUI controls and a live simulation running on a supercomputer. The loose coupling of analysis and simulation allows for reduced IO, immediate feedback and the ability to change simulation parameters on the fly. A significant drawback of parallel pipelines is the inefficiency caused by improper load-balancing, particularly during interactive analysis where the user may select between different features of interest; this problem is addressed in the fourth publication by integrating a high performance partitioning library into the visualization pipeline and extending the information flow up and down the pipeline to support it. This extension is demonstrated in the third publication (published earlier) on massive meshes with extremely high complexity, and shows that general purpose visualization tools such as ParaView can be made to compete with bespoke software written for a dedicated task. The future of software running on many-core architectures will involve task-based runtimes, with dynamic load-balancing, asynchronous execution based on dataflow graphs, work stealing and concurrent data sharing between simulation and analysis. The final paper of this thesis presents an optimisation for one such runtime, in support of these future HPC applications.
APA, Harvard, Vancouver, ISO, and other styles
34

Kamarudin, Muhammad Hilmi. "An intrusion detection scheme for identifying known and unknown web attacks (I-WEB)." Thesis, University of Warwick, 2018. http://wrap.warwick.ac.uk/103911/.

Full text
Abstract:
A large number of utilised features can increase the system's computational effort when processing large volumes of network traffic. In reality, it is pointless to use all features, considering that redundant or irrelevant features would deteriorate the detection performance. Meanwhile, statistical approaches are extensively practised in the Anomaly Based Detection System (ABDS) environment. These statistical techniques do not require any prior knowledge of attack traffic; this advantage has therefore attracted many researchers to employ this method. Nevertheless, the performance is still unsatisfactory since it produces high false detection rates. In recent years, the demand for data mining (DM) techniques in the field of anomaly detection has significantly increased. Even though this approach can distinguish normal and attack behaviour effectively, the performance (true positive, true negative, false positive and false negative) is still not achieving the expected improvement rate. Moreover, the need to re-initiate the whole learning procedure, despite the attack traffic having previously been detected, seems to contribute to the poor system performance. This study aims to improve the detection of normal and abnormal traffic by determining the prominent features and recognising the outlier data points more precisely. To achieve this objective, the study proposes a novel Intrusion Detection Scheme for Identifying Known and Unknown Web Attacks (I-WEB), which combines various strategies and methods. The proposed I-WEB is divided into three phases, namely pre-processing, anomaly detection and post-processing. In the pre-processing phase, the strengths of both filter and wrapper procedures are combined to select the optimal set of features. In the filter procedure, Correlation-based Feature Selection (CFS) is proposed, whereas the Random Forest (RF) classifier is chosen to evaluate feature subsets in the wrapper procedure. In the anomaly detection phase, statistical analysis is used to formulate a normal profile and to calculate a traffic normality score for every traffic instance. The threshold is defined using the Euclidean Distance (ED) alongside the Chebyshev Inequality Theorem (CIT), with the aim of improving the attack recognition rate by accurately eliminating the set of outlier data points. To improve attack identification and reduce the misclassification rate of traffic first flagged by the statistical analysis, ensemble learning, in particular a boosting classifier, is proposed. This method uses LogitBoost as the meta-classifier and RF as the base classifier. Furthermore, verified attack traffic detected by ensemble learning is extracted and computed into signatures before being stored in the signature library for future identification. This helps to reduce the detection time, since similar traffic behaviour will not have to be re-processed in future.
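As an illustration of the statistical thresholding step described above, the following sketch computes a normality score as the Euclidean distance from a normal-traffic profile and sets a cut-off via Chebyshev's inequality; the synthetic features, the target false-positive bound and the profile are invented for the example and do not reproduce the I-WEB implementation.

```python
# Sketch: outlier cut-off from Euclidean distance + Chebyshev's inequality.
# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k**2, with no distributional assumption.
import numpy as np

rng = np.random.default_rng(1)

# Toy "normal profile" learned from attack-free training traffic (n records, d features).
train = rng.normal(loc=0.0, scale=1.0, size=(5000, 8))
profile = train.mean(axis=0)

def normality_score(records):
    """Euclidean distance of each record from the normal-traffic profile."""
    return np.linalg.norm(records - profile, axis=1)

train_scores = normality_score(train)
mu, sigma = train_scores.mean(), train_scores.std()

# Choose k so that at most 1/k^2 of genuinely normal traffic is flagged,
# e.g. k = sqrt(1 / 0.02) bounds the false-positive rate by roughly 2%.
target_fp = 0.02
k = np.sqrt(1.0 / target_fp)
threshold = mu + k * sigma

# Score unseen traffic; records above the threshold are treated as outliers
# and handed on to the classification (ensemble) stage.
test = np.vstack([rng.normal(0, 1, (100, 8)),     # normal-like records
                  rng.normal(4, 1, (10, 8))])     # anomalous-like records
flags = normality_score(test) > threshold
print("flagged:", int(flags.sum()), "of", len(test))
```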
APA, Harvard, Vancouver, ISO, and other styles
35

Li, Chin-Hsiang. "Extensions to the attribute grammar form model to model meta software engineering environments /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487259580261289.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Kiper, James Dennis. "The ergonomic, efficient, and economic integration of existing tools into a software environment /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487260531956924.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Crumpacker, John R. "Distributed password cracking." Thesis, Monterey, California : Naval Postgraduate School, 2009. http://edocs.nps.edu/npspubs/scholarly/theses/2009/Dec/09Dec%5FCrumpacker.pdf.

Full text
Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, December 2009.
Thesis Advisor(s): Dinolt, George. Second Reader: Eagle, Chris. "December 2009." Description based on title screen as viewed on January 27, 2010. Author(s) subject terms: Distributed password cracking, Berkeley Open Infrastructure for Network Computing (BOINC), and John the Ripper. Includes bibliographical references (p. 63-64). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
38

McNamee, Joshua. "Efficient streaming for high fidelity imaging." Thesis, University of Warwick, 2017. http://wrap.warwick.ac.uk/109950/.

Full text
Abstract:
Researchers and practitioners of graphics, visualisation and imaging have an ever-expanding list of technologies to account for, including (but not limited to) HDR, VR, 4K, 360°, light field and wide colour gamut. As these technologies move from theory to practice, the methods of encoding and transmitting this information need to become more advanced and capable year on year, placing greater demands on latency, bandwidth, and encoding performance. High dynamic range (HDR) video is still in its infancy; the tools for capture, transmission and display of true HDR content are still restricted to professional technicians. Meanwhile, computer graphics are nowadays near-ubiquitous, but to achieve the highest fidelity in real or even reasonable time a user must be located at or near a supercomputer or other specialist workstation. These physical requirements mean that it is not always possible to demonstrate these graphics in any given place at any time, and when the graphics in question are intended to provide a virtual reality experience, the constraints on performance and latency are even tighter. This thesis presents an overall framework for adapting upcoming imaging technologies for efficient streaming, constituting novel work across three areas of imaging technology. Over the course of the thesis, high dynamic range capture, transmission and display are considered, before specifically focusing on the transmission and display of high fidelity rendered graphics, including HDR graphics. Finally, this thesis considers the technical challenges posed by incoming head-mounted displays (HMDs). In addition, a full literature review is presented across all three of these areas, detailing state-of-the-art methods for approaching all three problem sets. In the area of high dynamic range capture, transmission and display, a framework is presented and evaluated for efficient processing, streaming and encoding of high dynamic range video using general-purpose graphics processing unit (GPGPU) technologies. For remote rendering, state-of-the-art methods of augmenting a streamed graphical render are adapted to incorporate HDR video and high fidelity graphics rendering, specifically with regard to path tracing. Finally, a novel method is proposed for streaming graphics to an HMD for virtual reality (VR). This method utilises 360° projections to transmit and reproject stereo imagery to an HMD with minimal latency, with an adaptation for the rapid local production of depth maps.
APA, Harvard, Vancouver, ISO, and other styles
39

Mower, Jacob. "Photonic quantum computers and communication systems." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/103851.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 123-137).
Quantum information processors have been proposed to solve classically intractable or unsolvable problems in computing, sensing, and secure communication. There has been growing interest in photonic implementations of quantum processors as they offer relatively long coherence lengths, precise state manipulation, and efficient measurement. In this thesis, we first present experimental techniques to generate on-chip, photonic quantum processors and then discuss protocols for fast and secure quantum communication. In particular, we describe how to combine the outputs of multiple stochastic single-photon sources using a photonic integrated circuit to generate an efficient source of single photons. We then show designs for silicon-based quantum photonic processors that can be programmed to implement a large class of existing quantum algorithms and can lead to quicker testing of new algorithms than was previously possible. We will then present the integration of large numbers of high-efficiency, low-timing jitter single-photon detectors onto a silicon photonic integrated circuit. To conclude, we will present a quantum key distribution protocol that uses the robust temporal degree of freedom of entangled photons to enable fast, secure key exchange, as well as experimental results for implementing key distribution protocols using silicon photonic integrated circuits.
by Jacob Mower.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
40

Zhang, Zhao. "Enabling efficient parallel scripting on large-scale computers." Thesis, The University of Chicago, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=3627911.

Full text
Abstract:

Many-task computing (MTC) applications assemble existing sequential (or parallel) programs, using POSIX files for intermediate data. The parallelism of such applications often comes from data parallelism. MTC applications can be grouped into stages, and dependencies between tasks in different stages can be in the form of file production and consumption. The computation stage of MTC applications can have a large number of tasks, and can therefore generate a large amount of highly concurrent I/O traffic (both metadata and data traffic). Some MTC applications are iterative, where the computation iterates over a dataset and exits when some condition(s) are reached. Some MTC applications are interactive, where the application requires human action between computation stages.

In this dissertation we develop a complete parallel scripting framework called AMFORA, which has a shared in-RAM file system and task execution engine. It implements the multi-read single-write consistency model, preserves the POSIX interface for original applications, and provides an interface for collective data movement and functional data transformation. It is interoperable with many existing serial scripting languages (e.g., Bash, Python). AMFORA runs on thousands of compute nodes on an IBM BG/P supercomputer. It also runs on cloud environments such as Amazon EC2 and Google Compute Engine. To understand the baseline MTC application performance on large-scale computers, we define MTC Envelope, which is a file system benchmark to measure the capacity of a given software/hardware stack in the context of MTC applications.

The main contributions of this dissertation are: A system independent approach to profile and understand the concurrency of MTC applications' I/O behavior; A benchmark definition that measures the file system's capacity for MTC applications; A theoretical model to estimate the I/O overhead of MTC applications on large-scale computers; A scalable distributed file system design, with no centralized component, that achieves good scalability; A collective file system management toolkit to enable fast data movement; A functional file system management toolkit to enable fast file content transformation; A new parallel scripting programming model that extends a scripting language (e.g., Bash); A novel file system access interface design that combines both POSIX and non-POSIX interfaces to ease programming without loss of efficiency; An automated method for identifying data flow patterns that are amenable to collective optimizations at runtime; The open source implementation of the entire framework to enable MTC applications on large-scale computers. (Abstract shortened by UMI.)
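A minimal, self-contained sketch of the many-task pattern the abstract describes, in which stage dependencies are expressed purely as files produced by one stage and consumed by the next; the task graph and file names are invented for illustration and are unrelated to AMFORA's actual interface.

```python
# Sketch: a toy many-task-computing (MTC) stage runner where dependencies
# between stages are expressed as files produced and consumed.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

WORK = Path("mtc_scratch")
WORK.mkdir(exist_ok=True)

def map_task(i):
    """Stage 1: data-parallel tasks, each producing one intermediate file."""
    out = WORK / f"part_{i}.txt"
    out.write_text(str(i * i))
    return out

def reduce_task(parts):
    """Stage 2: consumes every file produced by stage 1."""
    total = sum(int(p.read_text()) for p in parts)
    (WORK / "result.txt").write_text(str(total))
    return total

if __name__ == "__main__":
    # Stage 1 tasks are independent, so they can run concurrently.
    with ProcessPoolExecutor() as pool:
        produced = list(pool.map(map_task, range(16)))
    # Stage 2 starts only once all of its input files exist.
    print("sum of squares 0..15 =", reduce_task(produced))
```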

APA, Harvard, Vancouver, ISO, and other styles
41

Wakefield, Jonathan P. "A framework for generic computer vision." Thesis, University of Huddersfield, 1994. http://eprints.hud.ac.uk/id/eprint/4003/.

Full text
Abstract:
This thesis presents a highly flexible framework for generic computer vision. The framework is implemented as an essentially object-oriented blackboard system and can easily be modified for new application domains. This has been achieved by allowing application-specific knowledge representation and data representation to be defined in terms of generic system prototypes. Using the object-oriented programming/frames paradigm allows application-specific elements of the system to inherit interpretation strategies for finding objects, and methods for calculating measurements of their features. Furthermore, the compositional structure of objects and their inter-relationships can be represented. The system automatically generates control strategies for the current domain. Interpretation of an object consists of executing a number of interpretation strategies for that object, which may be interspersed amongst other interpretation tasks and are thus termed dynamic interpretation strategies. Confidence ratings for object hypotheses, created by the interpretation strategies, are evaluated and combined consistently. The 'best' hypotheses are stored on the blackboard and used to guide subsequent processing. The division of an object's interpretation into stages facilitates the early posting of tentative hypotheses on the blackboard, and the system concurrently considers alternative competing hypotheses. The developed system currently performs region-based image analysis, although the framework can be extended to incorporate edge-based and motion-based analysis. A uniform and consistent approach has been adopted to all objects, including object-parts, and all application-specific knowledge is made explicit. New interpretation strategies can easily be incorporated. A review of related research and background theory is included. Results of example interpretation experiments, covering various applications, are provided for an implementation of the framework on both real and simulated images.
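The blackboard idea can be illustrated with a small sketch in which competing interpretation strategies post scored hypotheses and their confidences are combined; the strategies, cues and combination rule below are illustrative assumptions rather than the thesis's design.

```python
# Sketch: a toy blackboard where interpretation strategies post scored
# hypotheses and the best-rated hypothesis for each object is retained.
from collections import defaultdict
from math import prod

blackboard = defaultdict(list)   # object name -> list of (hypothesis, confidence)

def post(obj, hypothesis, confidence):
    blackboard[obj].append((hypothesis, confidence))

# Two competing strategies for the same object, e.g. colour- and shape-based.
def colour_strategy(region):
    conf = 0.8 if region["mean_hue"] < 30 else 0.2
    post("apple", {"region": region["id"], "cue": "colour"}, conf)

def shape_strategy(region):
    conf = 0.6 if abs(region["aspect_ratio"] - 1.0) < 0.2 else 0.1
    post("apple", {"region": region["id"], "cue": "shape"}, conf)

region = {"id": 7, "mean_hue": 25, "aspect_ratio": 1.1}
colour_strategy(region)
shape_strategy(region)

# Combine evidence for each object and keep the best-supported hypothesis.
for obj, hyps in blackboard.items():
    combined = 1.0 - prod(1.0 - c for _, c in hyps)   # noisy-OR style combination
    best = max(hyps, key=lambda h: h[1])
    print(obj, "best cue:", best[0]["cue"], "combined confidence:", round(combined, 3))
```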
APA, Harvard, Vancouver, ISO, and other styles
42

Song, Youngjae. "Sound-production related cognitive tasks for onset detection in self-paced brain-computer interfaces." Thesis, University of Essex, 2017. http://repository.essex.ac.uk/20755/.

Full text
Abstract:
Objective. The main goal of this research is to propose a novel method of onset detection for Self-Paced (SP) Brain-Computer Interfaces (BCIs), to increase the usability and practicality of BCIs and move them from laboratory research settings towards real-world use. Approach. To achieve this goal, various Sound-Production Related Cognitive Tasks (SPRCTs) were tested against the idle state in offline and simulated-online experiments. An online experiment was then conducted that turned a messenger dialogue on when a new message arrived by executing the Sound Imagery (SI) onset detection task in real-life scenarios (e.g. watching video, reading text). The SI task was chosen as an onset task because of its advantages over other tasks: 1) Intuitiveness. 2) Beneficial for people with motor disabilities. 3) No significant overlap with other common, spontaneous cognitive states, making it easier to use in daily-life situations. 4) No dependence on the user's native language. Main results. The final online experimental results showed the new SI onset task had significantly better performance than the Motor Imagery (MI) approach: an 84.04% (SI) vs 66.79% (MI) TFP score for the sliding-image scenario, and 80.84% vs 61.07% for the video-watching task. Furthermore, the SI task produced significantly faster onset responses than MI. In terms of usability, 75% of subjects answered that SI was easier to use. Significance. The new SPRCT outperforms typical MI for SP onset detection BCIs (significantly better performance, faster onset response and easier usability), and would therefore be more easily used in daily-life situations. Another contribution of this thesis is a novel EMG artefact-contaminated EEG channel selection and handling method that showed significant class separation improvement against typical blind source separation techniques. A new performance evaluation metric for SP BCIs, called the true-false positive score, was also proposed as a standardised performance assessment method that considers idle period length, which was not considered in other typical metrics.
APA, Harvard, Vancouver, ISO, and other styles
43

Ali, Khattab M. "An intelligent intrusion detection system for external communications in autonomous vehicles." Thesis, University of Essex, 2017. http://repository.essex.ac.uk/20747/.

Full text
Abstract:
Advancements in computing, electronics and mechanical systems have resulted in the creation of a new class of vehicles called autonomous vehicles. These vehicles function using sensory input with an on-board computation system. Self-driving vehicles use an ad hoc vehicular network called VANET. The network has ad hoc infrastructure with mobile vehicles that communicate through open wireless channels. This thesis studies the design and implementation of a novel intelligent intrusion detection system which secures the external communication of self-driving vehicles. This thesis makes the following four contributions. First, it proposes a hybrid intrusion detection system to protect the external communication in self-driving vehicles from potential attacks; this has been achieved using fuzzification and artificial intelligence. The second contribution is the incorporation of Integrated Circuit Metrics (ICMetrics) for improved security and privacy. By using ICMetrics, specific device features have been used to create a unique identity for vehicles. Our work is based on using the bias in on-board sensory systems to create ICMetrics for self-driving vehicles. The incorporation of fuzzy Petri nets in autonomous vehicles is the third contribution of the thesis. Simulation results show that the scheme can successfully detect denial-of-service attacks. The design of a clustering-based hierarchical detection system has also been presented to detect wormhole and Sybil attacks. The final contribution of this research is an integrated intrusion detection system which detects various attacks by using a central database in BusNet. The proposed schemes have been simulated using data extracted from trace files. Simulation results have been compared and studied for high levels of detection capability and performance. Analysis shows that the proposed schemes provide a high detection rate with a low false-alarm rate. The system can detect various attacks in an optimised way owing to fuzzification and a reduction in the number of features.
APA, Harvard, Vancouver, ISO, and other styles
44

Ahmadzadeh-Ghahnaviehei, Sahar. "Real-time pricing algorithms with uncertainty consideration for smart grid." Thesis, University of Essex, 2017. http://repository.essex.ac.uk/20790/.

Full text
Abstract:
In modern life, smart electrical devices are used to make people's lives more comfortable. The combination of electronics and communications provides the opportunity for real-time communication, in which the electricity consumption measured by smart meters is sent to the energy provider. Smart meters in residential areas therefore play an important role in the two-way interaction between users and the energy provider. Solving an optimization problem that considers the satisfaction of both users and energy providers yields the optimum price, which is sent to the users so that they can optimize their consumption in peak demand periods; this is the main goal of demand response management programs. As renewable energy nowadays plays an important role in meeting users' demand, especially in residential areas, uncertainty is an important issue and is considered in this thesis. Solving the optimization problem in the presence of load uncertainty is therefore an important topic of investigation. Another interesting issue is the variation in the number of users in the presence of load uncertainty in dynamic pricing demand response programs, which gives the advantage of a good estimation of the users' optimum consumption level according to the optimum announced price. In this thesis these issues are considered in solving Income Based and Utility Based optimization problems, which are further explained in the upcoming chapters. In Chapter III, which provides the first contribution of the thesis, a novel algorithm called Income Based Optimization (IBO) is defined and compared with the previously proposed Utility Based Optimization (UBO) problem. The price and the users' consumption versus the energy provider's generating capacity over a 24-hour period are simulated and analysed. The effect of varying other parameters related to the cost imposed on the energy provider and to the users' level of satisfaction is also evaluated. In Chapter IV, load uncertainty is considered in the proposed UBO algorithm, assuming that the number of users in each time slot varies according to different distributions such as the Uniform or Poisson distribution. The results for the average gap between the energy provider's generating capacity and the users' consumption are compared with the case in which the number of users is kept constant in the presence of load uncertainty over a 24-hour period. Moreover, the effect of different distributions on the gap between generating capacity and users' consumption is evaluated, assuming the number of users is increasing and follows the distributions. The announced price over a 24-hour period is also evaluated, and the analysis is extended to the average announced price as the number of users increases, assuming that user entry and departure vary according to different distributions and that load uncertainty is present. In Chapter V, the IBO algorithm proposed in Chapter III is extended to the Uncertain IBO (UIBO), in which a bounded uncertainty is added to the users' consumption. This algorithm is further extended so that variation in the number of users according to different distributions is also considered.
The results are evaluated for the average gap between generating capacity and users' consumption over a 24-hour period, and are further extended to an increasing number of users in the presence of load uncertainty and different distributions for the variation in the number of users. For the UIBO algorithm, the price over a 24-hour period is evaluated, and the results are further extended to the average price for an increasing number of users varying according to different distributions when bounded uncertainty is added to the users' consumption. Moreover, the gain of the proposed algorithm, based on the ratio of the variation in the announced price to the variation in the number of users, is evaluated. Finally, Chapter VI provides the conclusion and suggestions for future work.
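As a purely illustrative companion to the abstract, the sketch below shows a generic utility-based real-time-pricing loop in which each user chooses the consumption that maximises a quadratic utility minus payment, and the price is adjusted until aggregate demand fits the generating capacity; the utility form, parameters and bisection scheme are assumptions for the example and are not the thesis's IBO/UBO formulations.

```python
# Sketch: a generic utility-based real-time-pricing loop. Each user picks the
# consumption that maximises a quadratic utility minus payment, and the price
# is adjusted until total demand fits the provider's generating capacity.
import numpy as np

rng = np.random.default_rng(2)
n_users = 50
w = rng.uniform(2.0, 6.0, n_users)      # willingness-to-pay parameters
a = rng.uniform(0.2, 0.8, n_users)      # curvature of each user's utility
capacity = 60.0                          # provider capacity for this time slot

def demand(price):
    """Each user maximises w*x - a*x^2 - price*x  =>  x = max(0, (w - price)/(2a))."""
    return np.clip((w - price) / (2.0 * a), 0.0, None)

# Bisection on the price until aggregate demand matches capacity.
lo, hi = 0.0, w.max()
for _ in range(60):
    price = 0.5 * (lo + hi)
    if demand(price).sum() > capacity:
        lo = price           # demand too high -> raise the price
    else:
        hi = price           # demand within capacity -> try a lower price

print("announced price:", round(price, 3),
      "total consumption:", round(demand(price).sum(), 2),
      "capacity:", capacity)
```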
APA, Harvard, Vancouver, ISO, and other styles
45

Alharbi, Yasser. "A network-aware virtual machine placement approach for data-intensive applications in a cloud environment." Thesis, University of Essex, 2018. http://repository.essex.ac.uk/21404/.

Full text
Abstract:
Cloud computing provides beneficial services to users, enabling them to share large amounts of information, employ Storage Nodes (SN), utilise Computing Nodes (CN) and gather knowledge for research. Virtual Machines (VMs) usually host data-intensive applications, which submit thousands of jobs that access subsets of the petabytes of data distributed over cloud Datacentres (DCs). VM scheduling and allocation decisions in cloud environments are based on different parameters, such as cost, resource utilisation, performance, time and resource availability. In the case of application performance, the decisions are often made on the basis of jobs being either data-intensive or computation-intensive. In data-intensive situations, jobs may be pushed to the data; in computation-intensive situations, data may be pulled to the jobs. This kind of scheduling, in which there is no consideration of network characteristics, can lead to performance degradation in a cloud environment and may result in large processing queues and job execution delays due to site overloads. This thesis proposes a novel service framework, the network-aware VM placement approach for data-intensive applications (NADI), to address the need for improved application performance. NADI takes into account a job's time cost, based on a mechanism that maps VMs against the available resources when making scheduling decisions across multiple DCs. It not only allocates the best available resources to a VM to minimise the time needed to complete its jobs, but also checks the global state of jobs and resources so that the output of the whole cloud is maximised. The thesis begins with a statement of the problem addressed and the objectives of the research. The methodology adopted for the research is described subsequently, and the outline of the thesis is presented. This is followed by a brief introduction highlighting the current approaches to VM placement and migration in cloud computing. Next, the thesis presents a framework for the proposed NADI with a description of its various components and the enabling functionalities required to realise this framework. Multi-objective strategies suitable for the problems in NADI are presented. Novel algorithms for managing applications and their data are proposed; they aim to improve each job's performance and minimise the traffic between the application and its related data. The results indicate that considerable performance improvements, with completion times reduced by 25% to 51%, can be gained by adopting the NADI scheduling approach.
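To illustrate the kind of network-aware decision the abstract describes, the following sketch estimates, for each candidate host, the time to move a job's input data plus the compute and queueing time, and places the VM on the host with the smallest total; the cost model and the numbers are illustrative assumptions, not NADI's algorithm.

```python
# Sketch: a toy network-aware placement decision. For each candidate host,
# estimate data-transfer time + compute time + queueing delay and pick the
# host with the smallest total (figures are illustrative only).
def estimated_completion_time(job, host):
    # Data that is not already local to the host's site must cross the network.
    remote_gb = job["data_gb"] if job["data_site"] != host["site"] else 0.0
    transfer_s = remote_gb * 8 / host["net_gbps"]      # GB -> Gb, divided by bandwidth
    compute_s = job["work_gflop"] / host["gflops"]
    queue_s = host["queued_s"]
    return transfer_s + compute_s + queue_s

def place(job, hosts):
    return min(hosts, key=lambda h: estimated_completion_time(job, h))

hosts = [
    {"name": "dc1-cn3", "site": "dc1", "net_gbps": 10.0, "gflops": 400.0, "queued_s": 120.0},
    {"name": "dc2-cn1", "site": "dc2", "net_gbps": 1.0,  "gflops": 800.0, "queued_s": 0.0},
]
job = {"data_gb": 200.0, "data_site": "dc1", "work_gflop": 50000.0}

best = place(job, hosts)
print("placed on", best["name"],
      "eta_s =", round(estimated_completion_time(job, best), 1))
```

With the example figures, the data-local host wins even though the remote host offers more compute, which is precisely the network-aware behaviour such a framework is designed to capture.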
APA, Harvard, Vancouver, ISO, and other styles
46

Shercliff, Gareth. "Quality assessment of service providers in a conformance-centric Service Oriented Architecture." Thesis, Cardiff University, 2009. http://orca.cf.ac.uk/29514/.

Full text
Abstract:
In a Service Oriented Architecture (SOA), the goal of consumers is to discover and use services which lead to them experiencing the highest quality, such that their expectations and needs are satisfied. In supporting this discovery, quality assessment tools are required to establish the degree to which these expectations will be met by specific services. Traditional approaches to quality assessment in SOA assume that providers and consumers of services will adopt a performance-centric view of quality, which assumes that consumers will be most satisfied when they receive the highest absolute performance. However, adopting this approach does not consider the subjective nature of quality and will not necessarily lead to consumers receiving services that meet their individual needs. By using existing approaches to quality assessment that assume a consumer's primary goal is the optimisation of performance, consumers in SOA are currently unable to effectively identify and engage with providers who deliver services that will best meet their needs. Developing approaches to assessment that adopt a more conformance-centric view of quality (where it is assumed that consumers are most satisfied when a service meets, but not necessarily exceeds, their individual expectations) is a challenge that must be addressed if consumers are to effectively adopt SOA as a means of accessing services. In addressing this challenge, the thesis develops a conformance-centric model of an SOA in which conformance is taken to be the primary goal of consumers. This model is holistic, in that it considers consumers, providers and assessment services and their relationships, and novel, in that it proposes a set of rational provider behaviours that would be adopted under a conformance-centric view of quality. Adopting such conformance-centric behaviour leads to observable and predictable patterns in the performance of the services offered by providers, due to the relationship between the level of service delivered and the expectation of the consumer. In order to support consumers in the discovery of high-quality services, quality assessment tools must be able to effectively assess past performance information about services and use it as a prediction of future performance. In supporting consumers within a conformance-centric SOA, this thesis proposes and evaluates a new set of approaches to quality assessment which make use of the patterns in provider behaviour described above. The approaches developed are non-trivial: they use a selection of adapted pattern classification and other statistical techniques to infer the behaviour of individual services at run-time, and calculate a numerical measure of confidence for each result that consumers can use to combine assessment information with other evidence. The quality assessment approaches are evaluated within a software implementation of a conformance-centric SOA, whereby they are shown to lead to consumers experiencing higher quality than with existing performance-centric approaches. By introducing conformance-centric principles into existing real-world SOA, consumers will be able to evaluate and engage with providers that offer services differentiated on consumer expectation. The benefits of such capability over the current state of the art in SOA are twofold. Firstly, individual consumers will receive higher-quality services, and will therefore be more likely to have their needs effectively satisfied. Secondly, the availability of assessment tools which acknowledge the conformance-centric nature of consumers will encourage providers to offer a range of services for consumers with varying expectations, rather than simply offering a single service that aims to deliver maximum performance. This recognition will allow providers to use their resources more efficiently, leading to reduced costs and increased profitability. Such benefits can only be realised by adopting a conformance-centric view of quality across the SOA and by providing assessment services that operate effectively in such environments. This thesis proposes, develops and evaluates models and approaches that enable the achievement of this goal.
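The contrast between the two views of quality can be illustrated with a small sketch: a performance-centric score rewards ever-higher delivery, whereas a conformance-centric score is maximal once the consumer's expectation is met and gives no extra credit for exceeding it; the scoring functions below are illustrative assumptions only.

```python
# Sketch: performance-centric vs conformance-centric quality scoring.
def performance_score(delivered):
    """Performance-centric: more is always better (e.g. normalised throughput)."""
    return delivered

def conformance_score(delivered, expected, tolerance=0.1):
    """Conformance-centric: full marks once the expectation is met, no extra
    credit for exceeding it, and a decaying score for falling short."""
    if delivered >= expected:
        return 1.0
    shortfall = (expected - delivered) / expected
    return max(0.0, 1.0 - shortfall / tolerance)

expected = 100.0    # e.g. the consumer asked for roughly 100 requests/s
for delivered in (60.0, 100.0, 180.0):
    print(f"delivered={delivered:6.1f}  "
          f"performance={performance_score(delivered):6.1f}  "
          f"conformance={conformance_score(delivered, expected):4.2f}")
```

The output shows that over-delivery raises the performance-centric score but leaves the conformance-centric score unchanged, which is the distinction the thesis builds its assessment approaches around.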
APA, Harvard, Vancouver, ISO, and other styles
47

Crawford, Heather Anne. "A framework for continuous, transparent authentication on mobile devices." Thesis, University of Glasgow, 2012. http://theses.gla.ac.uk/4046/.

Full text
Abstract:
Mobile devices have consistently advanced in terms of processing power, amount of memory and functionality. With these advances, the ability to store potentially private or sensitive information on them has increased. Traditional methods for securing mobile devices, passwords and PINs, are inadequate given their weaknesses and the bursty use patterns that characterize mobile devices. Passwords and PINs are often shared or weak secrets to ameliorate the memory load on device owners. Furthermore, they represent point-of-entry security, which provides access control but not authentication. Alternatives to these traditional methods have been suggested. Examples include graphical passwords, biometrics and sketched passwords, among others. These alternatives all have their place in an authentication toolbox, as do passwords and PINs, but do not respect the unique needs of the mobile device environment. This dissertation presents a continuous, transparent authentication method for mobile devices called the Transparent Authentication Framework. The Framework uses behavioral biometrics, which are patterns in how people perform actions, to verify the identity of the mobile device owner. It is transparent in that the biometrics are gathered in the background while the device is used normally, and is continuous in that verification takes place regularly. The Framework requires little effort from the device owner, goes beyond access control to provide authentication, and is acceptable and trustworthy to device owners, all while respecting the memory and processor limitations of the mobile device environment.
APA, Harvard, Vancouver, ISO, and other styles
48

Patefield, Steven. "The diagnostic efficacy of JPEG still image compression in three radiological imaging modalities." Thesis, Lancaster University, 2002. http://eprints.lancs.ac.uk/12092/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Rooksby, J. "A story model of report and work in neuroradiology." Thesis, Lancaster University, 2002. http://eprints.lancs.ac.uk/12193/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Wheatman, Martin J. "An object layer for conventional file-systems." Thesis, Lancaster University, 1999. http://eprints.lancs.ac.uk/11680/.

Full text
APA, Harvard, Vancouver, ISO, and other styles