Dissertations / Theses on the topic 'Electronic digital computers'

Consult the top 50 dissertations / theses for your research on the topic 'Electronic digital computers.'

1

Khayeat, Ali. "Copy-move forgery detection in digital images." Thesis, Cardiff University, 2017. http://orca.cf.ac.uk/107043/.

Full text
Abstract:
The ready availability of image-editing software makes it important to ensure the authenticity of images. This thesis concerns the detection and localization of cloning, or Copy-Move Forgery (CMF), which is the most common type of image tampering, in which part(s) of the image are copied and pasted back somewhere else in the same image. Post-processing can be used to produce more realistic doctored images and thus can increase the difficulty of detecting forgery. This thesis presents three novel methods for CMF detection, using feature extraction, surface fitting and segmentation. The Dense Scale Invariant Feature Transform (DSIFT) has been improved by using a different method to estimate the canonical orientation of each circular block. The Fitting Function Rotation Invariant Descriptor (FFRID) has been developed by using the least squares method to fit the parameters of a quadratic function to each block's curvatures. In the segmentation approach, three different methods were tested: the SLIC superpixels, the Bag of Words Image and the Rolling Guidance filter with the multi-thresholding method. We also developed the Segment Gradient Orientation Histogram (SGOH) to describe the gradient of irregularly shaped blocks (segments). The experimental results illustrate that our proposed algorithms can detect forgery in images containing copy-move objects with different types of transformation (translation, rotation, scaling, distortion and combined transformation). Moreover, the proposed methods are robust to post-processing (i.e. blurring, brightness change, colour reduction, JPEG compression, variations in contrast and added noise) and can detect multiple duplicated objects. In addition, we developed a new method to estimate the similarity threshold for each image by optimizing a cost function based on the probability distribution. This method can detect CMF better than using a fixed threshold for all the test images, because our proposed method reduces false positives and the time required to estimate one threshold for different images in the dataset. Finally, we used hysteresis to decrease the number of false matches and produce the best possible result.
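For readers unfamiliar with block-matching forgery detection, the sketch below illustrates the basic idea in Python: overlapping blocks are summarised by small feature vectors, lexicographically sorted, and near-identical pairs lying at a sufficient spatial offset are flagged as candidate clones. It is a minimal stand-in, not the thesis's DSIFT/FFRID/SGOH methods; the descriptor, function name and thresholds are illustrative assumptions.

```python
import numpy as np

def detect_copy_move(gray, block=16, step=4, min_shift=24, tol=1e-3):
    """Naive block-matching copy-move detector (illustrative only).

    gray: 2-D float array scaled to [0, 1]. Returns pairs of block
    coordinates whose crude descriptors are (near) identical and which
    lie far enough apart to suggest a copy-move operation.
    """
    h, w = gray.shape
    feats, coords = [], []
    for y in range(0, h - block + 1, step):
        for x in range(0, w - block + 1, step):
            b = gray[y:y + block, x:x + block]
            q = block // 2
            # Crude descriptor: mean intensity of the four quadrants.
            feats.append([b[:q, :q].mean(), b[:q, q:].mean(),
                          b[q:, :q].mean(), b[q:, q:].mean()])
            coords.append((y, x))
    feats = np.array(feats)
    order = np.lexsort(feats.T[::-1])          # sort blocks by descriptor
    matches = []
    for i, j in zip(order[:-1], order[1:]):    # compare sorted neighbours
        if np.allclose(feats[i], feats[j], atol=tol):
            (y1, x1), (y2, x2) = coords[i], coords[j]
            if abs(y1 - y2) + abs(x1 - x2) >= min_shift:
                matches.append((coords[i], coords[j]))
    return matches
```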
2

Grove, Duncan A. "Performance modelling of message-passing parallel programs." Title page, contents and abstract only, 2003. http://web4.library.adelaide.edu.au/theses/09PH/09phg8832.pdf.

Full text
Abstract:
This dissertation describes a new performance modelling system, called the Performance Evaluating Virtual Parallel Machine (PEVPM). It uses a novel bottom-up approach, where submodels of individual computation and communication events are dynamically constructed from data-dependencies, current contention levels and the performance distributions of low-level operations, which define performance variability in the face of contention.
3

Fernandez, Sepulveda Antonio. "Evaluation of digital identity using Windows CardSpace." Thesis, Edinburgh Napier University, 2008. http://researchrepository.napier.ac.uk/output/4032/.

Full text
Abstract:
The Internet was initially created for academic purposes, and due to its success, it has been extended to commercial environments such as e-commerce, banking, and email. As a result, Internet crime has also increased. This can take many forms, such as personal data theft, impersonation of identity, and network intrusions. Systems of authentication such as username and password are often insecure and difficult to handle when the user has access to a multitude of services, as they have to remember many different authentications. Also, other more secure systems, such as security certificates and biometrics, can be difficult to use for many users. This is further compounded by the fact that the user does not often have control over their personal information, as it is stored on external systems (such as on a service provider's site). The aim of this thesis is to present a review and a prototype of a Federated Identity Management system, which puts control of identity information in the hands of the user. In this system the user has control over their identity information and can decide if they want to provide specific information to external systems. The user can also manage their identity information easily with Information Cards. These Information Cards contain a number of claims that represent the user's personal information, and the user can use these for a number of different services. The Federated Identity Management system also introduces the concept of the Identity Provider, which handles the user's identity information, issues a token to the service provider and verifies that the user's credentials are valid. The prototype has been developed using a number of different technologies such as .NET Framework 3.0, CardSpace, C#, ASP.NET, and so on. In order to obtain a clear result from this model of authentication, the work has created a website prototype that provides user authentication by means of Information Cards, and another, for evaluation purposes, using a username and password. This evaluation includes a timing test (which checks the time for the authentication process), a functionality test, and also quantitative and qualitative evaluation. Thirteen different users took part, and the results obtained show that the use of Information Cards seems to improve the user experience in the authentication process and increase the security level compared with username and password authentication. This thesis concludes that the Federated Identity Management model provides a strong solution to the problem of user authentication, and could protect the privacy rights of the user and return control of identity information to the user.
4

Ghahroodi, Massoud. "Variation and reliability in digital CMOS circuit design." Thesis, University of Southampton, 2014. https://eprints.soton.ac.uk/365136/.

Full text
Abstract:
The silicon chip industry continues to provide devices with feature sizes at Ultra-Deep-Sub-Micron (UDSM) dimensions. This results in higher device density and lower power and cost per function. While this trend is positive, there are a number of negative side effects, including increased device parameter variation, increased sensitivity to soft errors, and lower device yields. The lifetime of next-generation devices is also decreasing due to lower reliability margins and shorter product lifetimes. This thesis presents an investigation into the challenges of UDSM CMOS circuit design, with a review of the research conducted in this field. This investigation has led to the development of a methodology to determine the timing vulnerability factors of UDSM CMOS that leads to a more realistic definition of the Window of Vulnerability (WOV) for Soft-Error-Rate (SER) computation. We present an implementation of a Radiation-Hardened 32-bit Pipelined Processor as well as two novel radiation hardening techniques at gate level. We present a Single-Event-Upset (SEU) tolerant Flip-Flop design with 38% less power overhead and 25% less area overhead at 65nm technology, compared to the conventional Triple Modular Redundancy (TMR) technique for Flip-Flop design. We also propose an approach for in-field repair (IFR) by trading area for reliability. In the case of permanent faults, spare logic blocks will replace the faulty blocks on the fly. The simulation results show that by tolerating approximately 70% area overhead and less than 18% power overhead, the reliability is increased by a factor of 10 to 100 for various component failure rates.
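For readers unfamiliar with the TMR baseline the abstract compares against, the following sketch estimates by simulation how a majority voter masks independent single-bit upsets. It is a generic illustration of the redundancy trade-off, not the thesis's 65nm flip-flop designs; the upset model and all numbers are assumptions.

```python
import random

def majority(a, b, c):
    """Bit-wise majority vote, the combining element of a TMR arrangement."""
    return (a & b) | (a & c) | (b & c)

def tmr_failure_rate(p_upset, trials=100_000, seed=0):
    """Monte Carlo estimate of the probability that a TMR-protected bit is
    wrong when each replica is upset independently with probability p_upset.
    Analytically this is 3*p^2*(1-p) + p^3, far below the raw rate p.
    """
    rng = random.Random(seed)
    wrong = 0
    for _ in range(trials):
        replicas = [1 ^ (rng.random() < p_upset) for _ in range(3)]  # true value is 1
        wrong += majority(*replicas) != 1
    return wrong / trials

print(tmr_failure_rate(0.01))   # roughly 3e-4 versus a raw upset rate of 1e-2
```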
5

Lima, Antonio. "Digital traces of human mobility and interaction : models and applications." Thesis, University of Birmingham, 2016. http://etheses.bham.ac.uk//id/eprint/6833/.

Full text
Abstract:
In the last decade digital devices and services have permeated many aspects of everyday life. They generate massive amounts of data that provide insightful information about how people move across geographic areas and how they interact with others. By analysing this detailed information, it is possible to investigate aspects of human mobility and interaction. Therefore, the thesis of this dissertation is that the analysis of mobility and interaction traces generated by digital devices and services, at different timescales and spatial granularity, can be used to gain a better understanding of human behaviour, build new applications and improve existing services. In order to substantiate this statement I develop analytical models and applications supported by three sources of mobility and interaction data: online social networks, mobile phone networks and GPS traces. First, I present three applications related to data gathered from online social networks, namely the analysis of a global rumour spreading in Twitter, the definition of spatial dissemination measures in a social graph and the analysis of collaboration between developers in GitHub. Then I describe two applications of the analysis of country-wide data of cellular phone networks: the modelling of epidemic containment strategies, with the goal of assessing their efficacy in curbing infectious diseases; the definition of a mobility-based measure of individual risk, which can be used to identify who needs targeted treatment. Finally, I present two applications based on GPS traces: the estimation of trajectories from spatially-coarse temporally-sparse location traces and the analysis of routing behaviour in urban settings.
6

Betts, Thomas. "An investigation of the digital sublime in video game production." Thesis, University of Huddersfield, 2014. http://eprints.hud.ac.uk/id/eprint/25020/.

Full text
Abstract:
This research project examines how video games can be programmed to generate the sense of the digital sublime. The digital sublime is a term proposed by this research to describe experiences where the combination of code and art produces games that appear boundless and autonomous. The definition of this term is arrived at by building on various texts and literature such as the work of Kant, Deleuze and Wark and on video games such as Proteus, Minecraft and Love. The research is based on the investigative practice of my work as an artist-programmer and demonstrates how games can be produced to encourage digitally sublime scenarios. In the three games developed for this thesis I employ computer code as an artistic medium, to generate games that explore permutational complexity and present experiences that walk the margins between confusion and control. The structure of this thesis begins with a reading of the Kantian sublime, which I introduce as the foundation for my definition of the digital sublime. I then combine this reading with elements of contemporary philosophy and computational theory to establish a definition applicable to the medium of digital games. This definition is used to guide my art practice in the development of three games that examine different aspects of the digital sublime such as autonomy, abstraction, complexity and permutation. The production of these games is at the core of my research methodology and their development and analysis is used to produce contributions in the following areas. 1. New models for artist-led game design. This includes methods that re-contextualise existing aesthetic forms such as futurism, synaesthesia and romantic landscape through game design and coding. It also presents techniques that merge visuals and mechanics into a format developed for artistic and philosophical enquiry. 2. The development of new procedural and generative techniques in the programming of video games. This includes the implementation of a realtime marching cubes algorithm that generates fractal noise filtered terrain. It also includes a versatile three-dimensional space packing architectural construction algorithm. 3. A new reading of the digital sublime. This reading draws from the Kantian sublime and the writings of Deleuze, Wark and De Landa in order to present an understanding of the digital sublime specific to the domain of art practice within video games. These contributions are evidenced in the writing of this thesis and in the construction of the associated portfolio of games.
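As a side note on the generative techniques named in the abstract, the sketch below shows an offline flavour of the idea: a fractal (multi-octave) noise volume is built and an isosurface mesh is extracted with marching cubes. It uses NumPy, SciPy and scikit-image rather than the real-time in-game implementation described in the thesis, and all parameters are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.measure import marching_cubes

def fractal_noise_volume(size=64, octaves=4, seed=0):
    """Sum of smoothed random fields at decreasing amplitudes: a simple
    stand-in for the fractal-noise terrain source mentioned in the abstract."""
    rng = np.random.default_rng(seed)
    vol = np.zeros((size, size, size))
    for o in range(octaves):
        noise = rng.standard_normal((size, size, size))
        vol += gaussian_filter(noise, sigma=size / 2 ** (o + 2)) * 0.5 ** o
    return vol

# Extract an isosurface mesh, as a marching-cubes terrain generator would;
# verts and faces could then be handed to a renderer or game engine.
volume = fractal_noise_volume()
verts, faces, normals, values = marching_cubes(volume, level=0.0)
print(len(verts), "vertices,", len(faces), "triangles")
```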
7

Thakar, Aniruddha. "Visualization feedback from informal specifications." Thesis, This resource online, 1993. http://scholar.lib.vt.edu/theses/available/etd-03242009-040810/.

Full text
8

Lee, Sae Hun. "A unified approach to optimal multiprocessor implementations from non-parallel algorithm specifications." Diss., Georgia Institute of Technology, 1986. http://hdl.handle.net/1853/16745.

Full text
9

Song, Chunlin. "A robust region-adaptive digital image watermarking system." Thesis, Liverpool John Moores University, 2012. http://researchonline.ljmu.ac.uk/6122/.

Full text
Abstract:
Digital image watermarking techniques have drawn the attention of researchers and practitioners as a means of protecting copyright in digital images. The technique involves a subset of information-hiding technologies, which work by embedding information into a host image without perceptually altering the appearance of the host image. Despite progress in digital image watermarking technology, the main objectives of the majority of research in this area remain improvements in the imperceptibility and robustness of the watermark to attacks. Watermark attacks are often deliberately applied to a watermarked image in order to remove or destroy any watermark signals in the host data. The purpose of the attack is to disable the copyright protection system offered by watermarking technology. Our research in the area of watermark attacks found a number of different types, which can be classified into a number of categories including removal attacks, geometry attacks, cryptographic attacks and protocol attacks. Our research also found that both pixel domain and transform domain watermarking techniques share similar levels of sensitivity to these attacks. The experiment conducted to analyse the effects of different attacks on watermarked data provided us with the conclusion that each attack affects the high and low frequency parts of the watermarked image spectrum differently. Furthermore, the findings also showed that the effects of an attack can be alleviated by using a watermark image with a similar frequency spectrum to that of the host image. The results of this experiment led us to a hypothesis that would be proven by applying a watermark embedding technique which takes into account all of the above phenomena. We call this technique 'region-adaptive watermarking'. Region-adaptive watermarking is a novel embedding technique where the watermark data is embedded in different regions of the host image. The embedding algorithms use discrete wavelet transforms and a combination of discrete wavelet transforms and singular value decomposition, respectively. This technique is derived from the earlier hypothesis that the robustness of a watermarking process can be improved by using watermark data whose frequency spectrum is not too dissimilar to that of the host data. To facilitate this, the technique utilises dual watermarking technologies and embeds parts of the watermark images into selected regions of the host image. Our experiment shows that our technique improves the robustness of the watermark data to image processing and geometric attacks, thus validating the earlier hypothesis. In addition to improving the robustness of the watermark to attacks, we can also show a novel use for the region-adaptive watermarking technique as a means of detecting whether certain types of attack have occurred. This is a unique feature of our watermarking algorithm, which separates it from other state-of-the-art techniques. The watermark detection process uses coefficients derived from the region-adaptive watermarking algorithm in a linear classifier. The experiment conducted to validate this feature shows that, on average, 94.5% of all watermark attacks can be correctly detected and identified.
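To ground the discussion of transform-domain embedding, here is a minimal, non-blind DWT watermarking sketch using NumPy and PyWavelets: the watermark is added to one detail sub-band of a one-level Haar decomposition and recovered by subtracting the host's sub-band. It illustrates the generic DWT ingredient only, not the thesis's dual, region-adaptive DWT/DWT-SVD algorithm; the sub-band choice, the strength alpha and the requirement that the watermark fit inside the sub-band are assumptions.

```python
import numpy as np
import pywt

def embed_watermark_dwt(host, mark, alpha=0.05):
    """Additively embed `mark` into the horizontal-detail sub-band of a
    one-level Haar DWT of `host`. Both are 2-D float arrays; `mark` must be
    no larger than half the host in each dimension."""
    cA, (cH, cV, cD) = pywt.dwt2(host, 'haar')
    cH = cH.copy()
    mh, mw = mark.shape
    cH[:mh, :mw] += alpha * mark              # spread-spectrum style addition
    return pywt.idwt2((cA, (cH, cV, cD)), 'haar')

def extract_watermark_dwt(stego, host, shape, alpha=0.05):
    """Non-blind extraction: subtract the original host sub-band and rescale."""
    _, (cH_s, _, _) = pywt.dwt2(stego, 'haar')
    _, (cH_h, _, _) = pywt.dwt2(host, 'haar')
    mh, mw = shape
    return (cH_s[:mh, :mw] - cH_h[:mh, :mw]) / alpha
```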
10

Greenwood, Rob. "Semantic analysis for system level design automation." Thesis, This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-10062009-020216/.

Full text
11

Rudman, Hannah. "A framework for the transformation of the creative industries in a digital age." Thesis, Edinburgh Napier University, 2015. http://researchrepository.napier.ac.uk/Output/8866.

Full text
Abstract:
The creative industries sector faces a constantly changing context characterised by the speed of the development and deployment of digital information systems and Information Communications Technologies (ICT) on a global scale. This continuous digital disruption has had significant impact on the whole value chain of the sector: creation and production; discovery and distribution; and consumption of cultural goods and services. As a result, creative enterprises must evolve business and operational models and practices to be sustainable. Enterprises of all scales, type, and operational model are affected, and all sectors face ongoing digital disruption. Management consultancy practitioners and business strategy academics have called for new strategy development frameworks and toolkits, fit for a continuously changing world. This thesis investigates a novel approach to organisational change appropriate to the digital age, in the context of the creative sector in Scotland. A set of concepts, methods, tools, and processes to generate theoretical learning and practical knowing was created to support enterprises to digitally adapt through undertaking journeys of change and organisational development. The framework is called The AmbITion Approach. It was developed by blending participatory action research (PAR) methods and modern management consultancy, design, and creative practices. Empirical work also introduced to the framework Coghlan and Rashford's change categories. These enabled the definition and description of the extent to which organisations developed: whether they experienced first order (change), second order (adaptation) or third order (transformation) change. Digital research tools for inquiry were tested by a pilot study, and then embedded in a longitudinal study over two years of twenty-one participant organisations from Scotland's creative sector. The author applied and investigated the novel approach in a national digital development programme for Scotland's creative industries. The programme was designed and delivered by the author and ran nationally between 2012 and 2014. Detailed grounded thematic analysis of the data corpus was undertaken, along with analysis of rich media case studies produced by the organisations about their change journeys. The results of studies on participants, and validation criteria applied to the results, demonstrated that the framework triggers second (adaptation) and third order change (transformation) in creative industry enterprises. The AmbITion Approach framework is suitable for the continuing landscape of digital disruption within the creative sector. The thesis contributes to practice the concepts, methods, tools, and processes of The AmbITion Approach, which have been empirically tested in the field, and validated as a new framework for business transformation in a digital age. The thesis contributes to knowledge a theoretical and conceptual framework with a specific set of constructs and criteria that define first, second, and third order change in creative enterprises, and a robust research and action framework for the analysis of the quality, validity and change achieved by action research based development programmes. The thesis additionally contributes to the practice of research, adding to our understanding of the value of PAR and design thinking approaches and creative practices as methods for change.
12

Kwon, Hyosun. "From ephemerality to delicacy : applying delicacy in the design space of digital gifting." Thesis, University of Nottingham, 2017. http://eprints.nottingham.ac.uk/46705/.

Full text
Abstract:
We encounter uncountable ephemeral phenomena in everyday life. Some of them are particularly appreciated for their ungraspable beauty and limited availability. From the outset, one strand of computing technology has evolved to encapsulate and preserve this transient experience. A myriad of digital devices has been developed to capture fleeting moments and store them as digital files for later use, editing, sharing, and distribution. On the other hand, a portion of Human-Computer Interaction (HCI) research has engaged in adopting the transience of temporal phenomena in the design of interactive computing systems. Some computer and mobile applications metaphorically adopt ephemerality in graphical elements or functions that resemble our real-world experiences, such as forgetting and real-time conversation that naturally fades away immediately. Interactive artefacts or installations often incorporate ephemeral materials for abstract and artistic expression. Therefore, ephemeral artefacts or phenomena are often employed as a passive design element in ambient and peripheral interactions rather than in applications for practical purposes. However, ephemeral materials also engender experiences of a non-ambient nature. Some materials are physically fragile, only lasting for a brief moment, and therefore require constant care to retain their status, which might lead to highly focused attention, delicate interaction, and even a tense experience. This thesis aims to investigate how to harness the fleeting and irreversible feature of ephemeral artefacts in the design of practical products and services. This PhD builds on the methods of design-oriented HCI research. Thus, this thesis will present a research process that involves a series of challenges to initially frame a design problem in a fertile area for exploration; speculate a preferred situation; develop proof-of-concept prototypes to demonstrate the potential solution; and evaluate the prototypes through a user study. Contributions of this PhD have been visualised by the outputs from multiple design studies. First, this thesis illustrates how the concept of ephemerality is currently understood in HCI. It then proposes a different approach to the use of ephemeral materials by shifting the focus to delicacy. The first design study introduces FugaciousFilm, a soap-film-based interactive touch display that shifted ephemerality from a user's periphery to the focal point of interaction. The prototype is a platform for manifesting ephemeral interactions by inducing subtly delicate experiences. By demonstrating that ephemeral interactions reinforce the user's attention, delicacy was noticed as an attribute of user experience. Through understanding the use of delicacy, the research focus has moved from exploring how an individual ephemeral material can be utilised in interaction design, to harnessing the delicacy of such materials in experience design that benefits Human-Computer Interaction. Thus, this thesis recaptures digital gift wrapping as a context by reviewing the current state of affairs in digital gifting in the field of HCI and design. A 5-stage gifting framework has been synthesised from the literature review and guided this PhD throughout the studies. The framework ought to be seen as a significant contribution in its own right. Based on this framework, a series of interviews was conducted to identify any weaknesses that reside in current media platforms, digital devices, and different modes of interaction.
Hence, 'unwrapping a digital gift' has been identified as a gap in the design space that could be reinforced by a delicate, ephemeral interaction. Therefore, this PhD proposes Hybrid Gift, a series of proof-of-concept prototypes that demonstrate digital gift wrappings. Hybrid Gift has been probed in a semi-structured design workshop to examine the use of delicacy and ephemerality in the design of digital gifting practices. The prototypes were designed to retrieve not only the unwrapping experience but also rituals around gift exchange. Therefore, this thesis discusses design implications of the findings that emerged throughout the study. Digital gifting is still an under-explored research area that is worthwhile to investigate through fieldwork. Thus, the design implications and the framework are proposed to researchers and designers who wish to engage in the arena of digital gifting, and more broadly in social user experience, and communication service and system design. From a macroscopic perspective, we are experiencing fleeting moments every second, minute, and day. However, they are rarely noticed unless we recognise that time passes irreversibly. This thesis extracted delicacy as a feature of ephemeral interactions and argued that it holds the potential to augment and enhance mundane experiences mediated by digital technology. In so doing, the series of design studies has shifted the design perspective from material-oriented design to experience-focused design research. The design space of digital gifting would not have been recognised without the hands-on design practices in the process of this PhD. Finally, the proof-of-concept prototypes, framework, and design implications are thought to be of significance and value to design students, researchers, and designers who want to employ similar methods and approaches in design research.
13

Groves, Michael Peter. "A soliton circuit design system /." Title page, contents and summary only, 1987. http://web4.library.adelaide.edu.au/theses/09PH/09phg884.pdf.

Full text
14

Cooper, Simon. "DISE : a game technology-based digital interactive storytelling framework." Thesis, Liverpool John Moores University, 2011. http://researchonline.ljmu.ac.uk/6101/.

Full text
Abstract:
This thesis details the design and implementation of an Interactive Storytelling Framework. Using software engineering methodology and framework development methods, we aim to design a full Interactive Storytelling system involving a story manager, a character engine, an action engine, a planner, a 3D game engine and a set of editors for story data, world environment modelling and real-time character animation. The framework is described in detail and specified to meet the requirement of bringing a more dynamic real-time interactive story experience to the medium of computer games. Its core concepts borrow from work done in the fields of narrative theory, software engineering, computer games technology, HCI, 3D character animation and artificial intelligence. The contributions of our research and the novelties lie in the data design of the story, which allows a modular approach to building reusable resources such as actions, objects, animated characters and whole story 'levels'; a switchable story planner and re-planning system implementation, allowing many planners, heuristics and schedulers that are compatible with PDDL (the "Planning Domain Definition Language") to be easily integrated with minor changes to the main classes; a 3D game engine and framework for web-launched or in-browser deployment of the finished product; and a user-friendly story and world/environment editor, so story authors do not need advanced knowledge of PDDL syntax, games programming or 3D modelling to design and author a basic story. As far as we know, our Interactive Storytelling Framework is the only one to include a full 3D cross-platform game engine, procedural and manual modelling tools, a story editor and a customisable planner in one complete integrated solution. The finished interactive storytelling applications are presented as computer games designed to be a real-time 3D first-person experience, with the player as a main story character in a world where every context-filtered action displayed is executable and the player's choices make a difference to the outcome of the story, whilst still allowing the authors' high-level constraints to progress the narrative along their desired path(s).
15

Chen-Wilson, Lisha. "eCert : a secure and user centric edocument transmission protocol : solving the digital signing practical issues." Thesis, University of Southampton, 2013. https://eprints.soton.ac.uk/369983/.

Full text
Abstract:
Whilst our paper-based records and documents are gradually being digitized, security concerns about how such electronic data is stored, transmitted, and accessed have increased rapidly. Although the traditional digital signing method can be used to provide integrity, authentication, and non-repudiation for signed eDocuments, this method does not address all requirements, such as fine-grained access control and content status validation. What is more, information owners have increasing demands regarding their rights of ownership. Therefore, a secure user-centric eDocument management system is essential. Through a case study of a secure and user-centric electronic qualification certificate (eCertificate) system, this dissertation explores the issues and the technology gaps; it identifies existing services that can be re-used and the services that require further development; it proposes a new signing method and the corresponding system framework which solves the problems identified. In addition to tests that have been carried out for the newly designed eCertificate system to be employed under the selected ePortfolio environments, the abstract protocol (named eCert protocol) has also been applied and evaluated in two other eDocument transmitting situations, Mobile eID and eHealthcare patient data. Preliminary results indicate that the recommendation from this research meets the design requirements, and could form the foundation of future eDocument transmitting research and development.
16

Li, Chin-Hsiang. "Extensions to the attribute grammar form model to model meta software engineering environments /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487259580261289.

Full text
17

Kiper, James Dennis. "The ergonomic, efficient, and economic integration of existing tools into a software environment /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487260531956924.

Full text
18

Abdulla, Alan Anwer. "Exploiting similarities between secret and cover images for improved embedding efficiency and security in digital steganography." Thesis, University of Buckingham, 2015. http://bear.buckingham.ac.uk/149/.

Full text
Abstract:
The rapid advancements in digital communication technology and huge increase in computer power have generated an exponential growth in the use of the Internet for various commercial, governmental and social interactions that involve transmission of a variety of complex data and multimedia objects. Securing the content of sensitive as well as personal transactions over open networks while ensuring the privacy of information has become essential but increasingly challenging. Therefore, the information and multimedia security research area attracts more and more interest, and its scope of applications expands significantly. Communication security mechanisms have been investigated and developed to protect information privacy, with Encryption and Steganography providing the two most obvious solutions. Encrypting a secret message transforms it into noise-like data which is observable but meaningless, while Steganography conceals the very existence of secret information by hiding it in mundane communication that does not attract unwelcome snooping. Digital steganography is concerned with using images, videos and audio signals as cover objects for hiding secret bit-streams. Suitability of media files for such purposes is due to the high degree of redundancy as well as being the most widely exchanged digital data. Over the last two decades, there has been a plethora of research that aims to develop new hiding schemes to overcome the variety of challenges relating to imperceptibility of the hidden secrets, payload capacity, efficiency of embedding and robustness against steganalysis attacks. Most existing techniques treat secrets as random bit-streams even when dealing with non-random signals such as images, which may add to the toughness of the challenges. This thesis is devoted to investigating and developing steganography schemes for embedding secret images in image files. While many existing schemes have been developed to perform well with respect to one or more of the above objectives, we aim to achieve optimal performance in terms of all these objectives. We shall only be concerned with embedding secret images in the spatial domain of cover images. The main difficulty in addressing the different challenges stems from the fact that the act of embedding results in changes to cover image pixel values that cannot be avoided, although these changes may not be easy to detect by the human eye. These pixel changes are a consequence of dissimilarity between the cover LSB plane and the secret-image bit-stream, and result in changes to the statistical parameters of stego-image bit-planes as well as to local image features. Steganalysis tools exploit these effects to model targeted as well as blind attacks. These challenges are usually dealt with by randomising the changes to the LSB, using different/multiple bit-planes to embed one or more secret bits using elaborate schemes, or embedding in certain regions that are noise-tolerant. Our innovative approach to deal with these challenges is first to develop some image procedures and models that result in increasing similarity between the cover image LSB plane and the secret image bit-stream. This will be achieved in two novel steps involving manipulation of both the secret image and the cover image, prior to embedding, that result in a higher 0:1 ratio in both the secret bit-stream and the cover pixels' LSB plane.
For the secret images, we exploit the fact that image pixel values are in general neither uniformly distributed, as is the case of random secrets, nor spatially stationary. We shall develop three secret image pre-processing algorithms to transform the secret image bit-stream for increased 0:1 ratio. Two of these are similar, but one operates in the spatial domain and the other in the wavelet domain. In both cases, the most frequent pixels are mapped onto bytes with more 0s. The third method processes blocks by subtracting their means from their pixel values, hence reducing the required number of bits to represent these blocks. In other words, this third algorithm also reduces the length of the secret image bit-stream without loss of information. We shall demonstrate that these algorithms yield a significant increase in the secret image bit-stream 0:1 ratio, with the one based on the wavelet domain performing best, with an 80% ratio. For the cover images, we exploit the fact that pixel value decomposition schemes, based on Fibonacci or other defining sequences that differ from the usual binary scheme, expand the number of bit-planes and thereby may help increase the 0:1 ratio in the cover image LSB plane. We investigate some such existing techniques and demonstrate that these schemes indeed lead to an increased 0:1 ratio in the corresponding cover image LSB plane. We also develop a new extension of the binary decomposition scheme that is the best-performing one, with a 77% ratio. We exploit the above two-step strategy to propose a bit-plane(s) mapping embedding technique, instead of bit-plane(s) replacement, to make each cover pixel usable for secret embedding. This is motivated by the observation that non-binary pixel decomposition schemes also result in decreasing the number of possible patterns for the first three bit-planes to 4 or 5 instead of 8. We shall demonstrate that the combination of the mapping-based embedding scheme and the two-step strategy produces stego-images that have minimal distortion, i.e. reducing the number of cover pixel changes after message embedding and increasing embedding efficiency. We shall also demonstrate that these schemes result in reasonable stego-image quality and are robust against all the targeted steganalysis tools but not against the blind SRM tool. We shall finally identify possible future work to achieve robustness against SRM at some payload rates and further improve stego-image quality.
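A small sketch may help make the 0:1-ratio idea concrete. The helpers below measure the fraction of zero bits in a secret image's plain binary stream and in a cover's LSB plane, and show a simplified block-mean-subtraction pre-processing in the spirit of the third algorithm described above. They are illustrative stand-ins, not the thesis's algorithms; the block size and the encoding of residuals are assumptions.

```python
import numpy as np

def zero_bit_fraction(image_u8):
    """Fraction of 0 bits in the plain 8-bit stream of an image; the secret
    pre-processing described above aims to raise this value."""
    bits = np.unpackbits(np.asarray(image_u8, dtype=np.uint8).ravel())
    return 1.0 - bits.mean()

def lsb_zero_fraction(cover_u8):
    """Fraction of 0s in the cover image's least-significant bit-plane."""
    return 1.0 - (np.asarray(cover_u8, dtype=np.uint8) & 1).mean()

def block_mean_residual(secret_u8, block=8):
    """Simplified block-mean subtraction: residuals are small, so they need
    fewer (and more frequently zero) bits; the per-block means are kept so
    the transform can be inverted."""
    img = np.asarray(secret_u8, dtype=np.int16)
    h, w = img.shape
    res = img.copy()
    means = np.zeros((h // block, w // block), dtype=np.int16)
    for by in range(0, h - h % block, block):
        for bx in range(0, w - w % block, block):
            m = int(img[by:by + block, bx:bx + block].mean())
            means[by // block, bx // block] = m
            res[by:by + block, bx:bx + block] -= m
    return res, means
```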
19

Majeed, Taban Fouad. "Segmentation, super-resolution and fusion for digital mammogram classification." Thesis, University of Buckingham, 2016. http://bear.buckingham.ac.uk/162/.

Full text
Abstract:
Mammography is one of the most common and effective techniques used by radiologists for the early detection of breast cancer. Recently, computer-aided detection/diagnosis (CAD) has become a major research topic in medical imaging and has been widely applied in clinical situations. According to statistics, early detection of cancer can reduce the mortality rates by 30% to 70%, therefore detection and diagnosis in the early stage are very important. CAD systems are designed primarily to assist radiologists in detecting and classifying abnormalities in medical scan images, but the main challenge hindering their wider deployment is the difficulty in achieving accuracy rates that help improve radiologists' performance. The detection and diagnosis of breast cancer face two main issues: the accuracy of the CAD system, and the radiologists' performance in reading and diagnosing mammograms. This thesis focused on the accuracy of CAD systems. In particular, we investigated two main steps of CAD systems: pre-processing (enhancement and segmentation), and feature extraction and classification. Through this investigation, we make five main contributions to the field of automatic mammogram analysis. In automated mammogram analysis, image segmentation techniques are employed in breast boundary or region-of-interest (ROI) extraction. In most Medio-Lateral Oblique (MLO) views of mammograms, the pectoral muscle represents a predominant density region, and it is important to detect and segment out this muscle region during pre-processing because it could bias the detection of breast cancer. An important reason for the breast border extraction is that it will limit the search-zone for abnormalities in the region of the breast without undue influence from the background of the mammogram. Therefore, we propose a new scheme for breast border extraction, artifact removal and removal of annotations, which are found in the background of mammograms. This was achieved using a local adaptive threshold that creates a binary mask for the images, followed by the use of morphological operations. Furthermore, an adaptive algorithm is proposed to detect and remove the pectoral muscle automatically. Feature extraction is another important step of any image-based pattern classification system. The performance of the corresponding classification depends very much on how well the extracted features represent the object of interest. We investigated a range of different texture feature sets such as Local Binary Pattern Histogram (LBPH), Histogram of Oriented Gradients (HOG) descriptor, and Gray Level Co-occurrence Matrix (GLCM). We propose the use of multi-scale features based on wavelet and local binary patterns for mammogram classification. We extract histograms of LBP codes from the original image as well as the wavelet sub-bands. Extracted features are combined into a single feature set. Experimental results show that our proposed method of combining LBPH features obtained from the original image with LBPH features obtained from the wavelet domain increases the classification accuracy (sensitivity and specificity) when compared with LBPH extracted from the original image. The feature vector size could be large for some types of feature extraction schemes, and it may contain redundant features that have a negative effect on classification accuracy. Therefore, feature vector size reduction is needed to achieve higher accuracy as well as efficiency (processing and storage).
We reduced the size of the features by applying principal component analysis (PCA) on the feature set and only chose a small number of eigencomponents to represent the features. Experimental results showed enhancement in the mammogram classification accuracy with a small set of features when compared with using the original feature vector. We then investigated and propose the use of feature and decision fusion in mammogram classification. In feature-level fusion, two or more extracted feature sets of the same mammogram are concatenated into a single larger fused feature vector to represent the mammogram. In decision-level fusion, by contrast, the results of individual classifiers based on distinct features extracted from the same mammogram are combined into a single decision. In this case the final decision is made by majority voting among the results of individual classifiers. Finally, we investigated the use of super-resolution as a pre-processing step to enhance the mammograms prior to extracting features. From the preliminary experimental results we conclude that using enhanced mammograms has a positive effect on the performance of the system. Overall, our combination of proposals outperforms several existing schemes published in the literature.
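The following sketch, using PyWavelets, scikit-image and scikit-learn, illustrates the kind of feature-level fusion (LBP histograms from the original ROI and its wavelet sub-bands), PCA reduction and majority-vote decision fusion outlined above. It is a schematic reading of the abstract rather than the thesis's exact pipeline; parameters such as P, R and the number of principal components are assumptions.

```python
import numpy as np
import pywt
from skimage.feature import local_binary_pattern
from sklearn.decomposition import PCA

def lbp_hist(img, P=8, R=1.0):
    """Normalised histogram of uniform LBP codes for one image region."""
    codes = local_binary_pattern(img, P, R, method='uniform')
    n_bins = P + 2                      # 'uniform' yields codes 0 .. P + 1
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

def fused_lbp_features(roi):
    """Feature-level fusion: LBP histograms of the original ROI and of its
    one-level Haar wavelet sub-bands, concatenated into one vector."""
    cA, (cH, cV, cD) = pywt.dwt2(roi, 'haar')
    return np.concatenate([lbp_hist(b) for b in (roi, cA, cH, cV, cD)])

def reduce_features(X_train, X_test, n_components=20):
    """PCA reduction of the fused feature matrices (rows are mammogram ROIs)."""
    pca = PCA(n_components=n_components).fit(X_train)
    return pca.transform(X_train), pca.transform(X_test)

def majority_vote(decisions):
    """Decision-level fusion: majority vote over per-classifier labels."""
    values, counts = np.unique(np.asarray(decisions), return_counts=True)
    return values[np.argmax(counts)]
```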
20

Koutsouras, Panagiotis. "Crafting content : the discovery of Minecraft's invisible digital economy." Thesis, University of Nottingham, 2018. http://eprints.nottingham.ac.uk/51744/.

Full text
Abstract:
This thesis presents an ethnographic study that aims at explicating the work of creating content in Minecraft. The existing literature paves the way in understanding Minecraft’s community by providing fragments of what players do. However, the game is studied mainly from a ludic perspective or is treated as a resource to explore distinct research agendas, instead of a field of study in itself. As such, particular phenomena that are situated inside Minecraft’s community are lost. The conducted fieldwork discovered the invisible digital economy that is part of this community. More specifically, the chapters to follow elaborate on the actors involved in this economy, covering their roles, responsibilities and goals. Furthermore, the lived work of content production is unpacked by presenting the various work practices members attend to in commissioning, creating, and delivering Minecraft content. It also becomes evident that there is a complex division of labour at play, which is based on a fragmented infrastructure as Minecraft itself does not support the wide range of activities that are necessary for carrying out the work. Essentially, actors bootstrap the market’s infrastructure by appropriating or even creating bespoke systems for conducting the various work practices that are entailed in this business. On top of that, these systems are utilised for articulation work, which is necessary for tracking progress between the geographically dispersed actors, accounting for conducted work and addressing contingent scenarios. The main contribution of this PhD project is the discovery of this digital economy, which evidently plays a significant role in Minecraft’s current form and development. Additionally, prevailing understandings of Minecraft’s ecosystem are re-visited, re-examined, and re-specified, based on the empirical evidence presented in this thesis. Finally, a number of design implications are raised with regard to addressing the game’s lack of CSCW support.
21

Ku, Azir Ku Nurul Fazira. "Human factors of ubiquitous computing : ambient cueing in the digital kitchen?" Thesis, University of Birmingham, 2014. http://etheses.bham.ac.uk//id/eprint/5518/.

Full text
Abstract:
This thesis is concerned with the uses of Ubiquitous Computing (UbiComp) in everyday domestic environments. The concept of UbiComp promises to shift computing away from the desktop into everyday objects and settings. It has the twin goals of providing 'transparent' technologies where the information has been thoroughly embedded into everyday activities and objects (thus making the computer invisible to the user) and also (and more importantly) of seamless integration of these technologies into the activities of their users. However, this raises the challenge of how best to support interaction with a 'transparent' or 'invisible' technology; if the technology is made visible, it will attract the user's attention to it and away from the task at hand, but if it is hidden, then how can the user cope with malfunctions or other problems in the technology? We approach the design of Human-Computer Interaction in the ubiquitous environment through the use of ambient displays, i.e. the use of subtle cueing embedded in the environment, which is intended to guide human activity. This thesis draws on the concept of stimulus-response compatibility and applies it to the design of ambient displays. This thesis emphasizes the need to understand the users' perspectives and responses in any particular approach that has been proposed. Therefore, the main contributions of this thesis focus on approaches to improve human performance in the ubiquitous environment through ambient displays.
22

Shaheed, Amjad. "A framework for the visualisation and control of ubiquitous devices, services and digital content." Thesis, Liverpool John Moores University, 2011. http://researchonline.ljmu.ac.uk/6018/.

Full text
Abstract:
The General Organization for Technical Education and Vocational Training, Riyadh, Saudi Arabia, has developed a special feeding program for the students at its institutions. The effects of this program on the nutritional and health status of these students have not been evaluated yet, and since no published dietary research has been performed on Technical and Vocational young adult male students, the present work was undertaken to investigate the nutritional status of this community in Riyadh, Kingdom of Saudi Arabia. After a pilot survey, it was decided to use a self-completed questionnaire combined with personal interviews to investigate the nutritional status of 690 students randomly selected from the study population. Dietary data was collected by two methods: usual weekly intakes ("diet history") and actual daily intakes ("diet diary"). The nutrient intakes were calculated using the Unilever Dietary Analysis Program (UNIDAP). The Statistical Package for the Social Sciences (SPSS/PC+) was employed to analyse the data; statistical significance of relationships between certain sets of data was determined by chi-square analysis. Some general factors affecting the nutritional status of these students were identified, their nutritional habits and attitudes were investigated, and the average daily intakes of energy, the macronutrients, and selected micronutrients were calculated. The main results of this study show that the majority of the study population are adolescent, moderately active individuals with a Body Mass Index below the standard range; anaemia is the most frequently stated health problem; meal-skipping and eating between meals are common habits amongst the students. Regarding nutrient intake, there was an energy, polyunsaturated fat, and vitamin C deficiency; adequate intake of saturated fat, dietary fibre, retinol, and zinc; and more than adequate intake of protein, total fat, cholesterol, thiamin, riboflavin, calcium, and iron. Recommendations are given which aim to improve the nutrition of technical and vocational students.
23

Goudoulakis, E. "DIEGESIS : a multi-agent Digital Interactive Storytelling framework using planning and re-planning techniques." Thesis, Liverpool John Moores University, 2014. http://researchonline.ljmu.ac.uk/4512/.

Full text
Abstract:
In recent years, the field of Digital Interactive Storytelling (DIS) has become very popular both in academic circles and in the gaming industry, in which stories are becoming a unique selling point. Academic research on DIS focuses on the search for techniques that allow the creation of systems that can dynamically generate interesting stories which are not linear and can change at runtime as a consequence of a player's actions, therefore leading to different story endings. To reach this goal, DIS systems usually employ Artificial Intelligence planning and re-planning algorithms as part of their solution. There is a lack of algorithms created specifically for DIS purposes since most DIS systems use generic algorithms, and they do not usually assess if and why a given algorithm is the best solution for their purposes. Additionally, there is no unified way (e.g. in the form of a selection of metrics) to evaluate such systems and algorithms. To address these issues and to provide new solutions to the DIS field, we performed a review of related DIS systems and algorithms, and based on the critical analysis of that work we designed and implemented a novel multi-agent DIS framework called DIEGESIS, which includes, among other novel aspects, two new DIS-focused planning and re-planning algorithms. To ensure that our framework and its algorithms have met the specifications we set, we created a large-scale evaluation scenario which models the story of Troy, derived from Homer's epic poem, "Iliad", which we used to perform a number of evaluations based on metrics that we chose and consider valuable for the DIS field. This collection of requirements and evaluations could be used in the future by other DIS systems as a unified test-bed for analysis and evaluation of such systems.
24

Shahidipour, Hamed. "A study on the effects of variability on performance of CNFET based digital circuits." Thesis, University of Southampton, 2012. https://eprints.soton.ac.uk/364216/.

Full text
Abstract:
With the continuous trend of reducing feature sizes, and employing continuously smaller components on integrated circuits, new challenges arise for silicon CMOS circuits and devices. Emerging "nanodevices" promise the possibility of increased integration density and reduced power consumption. These emerging devices, partially due to their extremely small dimensions, show large variations in their behaviour. The variation shown by these devices affects their reliability and the performance of circuits made from them. The Carbon Nano-Tube (CNT) is one such device, and it is the device of choice in this work. This work is concerned with building reliable systems out of these unreliable components. The work was done in HSPICE with the help of the Stanford CNFET model. Logic gates are implemented using CNT Field Effect Transistors (CNFETs), which are in turn made from CNTs with different physical attributes. Given a CNT manufacturing process, there exists a mean and standard deviation (STD) for the diameter distribution of the manufactured CNTs, which depend on the accuracy of the manufacturing process. In the first part of this work, CNTs with different mean diameters and standard deviations (STD) in their diameter distribution are considered. Simulation results show that logic gates made from CNTs with larger mean and smaller STDs in their diameter distribution show less variation in their timing behaviour (propagation delay, rise and fall times) and a promise of more reliable operation. Alternative structures were then explored in the form of multiplexers and XOR gates. It is shown that these structures have the advantage over the gates studied previously in that they exhibit similar rise and fall transition times and hence are better suited to CNFET-based circuit design. The next stage of this work involves implementation and simulation of a memory structure (SRAM). Parameters such as Static Noise Margin (SNM), leakage power and read/write delays were studied and the effects of CNT diameter variation on them examined. The next contributions of this work are empirical models developed for a library of CNFET-based logic gates/circuit structures. The models can predict both the mean and standard deviation (STD) in various circuit performance parameters of a given CNFET-based logic gate/SRAM, given the mean and STD of the diameter of CNTs used in their manufacture. The aim is, given a target reliability specification (timing requirements, power, speed, etc.) for various logic gates and larger circuit components, to come up with a design strategy that suggests what physical properties the nano-device of choice should have to meet the target specification, or vice versa. Best-case CNT diameter mean and STD selection scenarios are proposed to minimise circuit parameter variations. In the last part of this work, the effects of doping fluctuations in the source/drain regions of the CNFETs on the performance of logic gates made from them are studied. The work concludes that if doping concentration is kept above 1%, variation in doping concentration has a minimal effect on performance parameters.
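To illustrate how diameter statistics propagate to gate-delay statistics, the Monte Carlo toy model below samples CNT diameters and maps them through a deliberately crude delay relation (drive current assumed to grow with the square of the diameter, delay taken as its reciprocal). The analysis in the thesis uses the Stanford CNFET model in HSPICE instead; everything in this sketch, especially the delay relation, is an assumption made for illustration.

```python
import numpy as np

def relative_delay_spread(mean_d_nm, std_d_nm, n=100_000, seed=0):
    """Sample CNT diameters from a normal distribution and return the
    relative spread (std/mean) of a toy gate delay derived from them."""
    rng = np.random.default_rng(seed)
    d = rng.normal(mean_d_nm, std_d_nm, n)
    d = d[d > 0.5]                       # drop non-physical diameters
    i_on = d ** 2                        # assumed: drive current ~ diameter^2
    delay = 1.0 / i_on                   # assumed: delay ~ 1 / drive current
    return delay.std() / delay.mean()

# Larger mean diameters and tighter diameter distributions give a smaller
# relative delay spread, in line with the trend reported in the abstract.
for mean_d, std_d in [(1.5, 0.2), (1.5, 0.1), (2.0, 0.1)]:
    print(f"mean={mean_d} nm, std={std_d} nm -> spread={relative_delay_spread(mean_d, std_d):.3f}")
```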
25

Williams, Dewi L. (Dewi Lloyd). "A Functional-test specification language." Dissertation (Electrical Engineering), Carleton University, Ottawa, 1988.

Find full text
26

Cartwright, Daniel R. "Digital decision-making : using computational argumentation to support democratic processes." Thesis, University of Liverpool, 2011. http://livrepository.liverpool.ac.uk/2993/.

Full text
Abstract:
One of the key questions facing governments around the world is that of how to increase and maintain the engagement of citizens in democratic processes. Recent thought, both within academia and government itself, has turned to the use of modern computational technology to provide citizens with access to democratic processes. Access to computer and Internet technology by the general public has vastly increased over the past decade, and this wide access is one of a number of motivations behind research into the provision of democratic tasks and processes online. The particular democratic process that forms the focus of this thesis is that of online opinion gathering in order to aid government decision making. The provision of mechanisms to gather and analyse public opinion is important to any government which claims to promote a fair and equal democracy, as decisions should be made in consideration of the views and opinions of the citizens of such a democracy. The work that comprises this thesis is motivated by existing research into harvesting opinion through a variety of online methods. The software tools available largely fall into one of two categories: Those which are not based on formal structure, and those which are based on an underlying formal model of argument. The work presented in this thesis aims to overcome the shortfalls inherent to both of these categories of tool in order to realise a software suite to support both the process of opinion gathering, and analysis of the resulting data. This is achieved through the implementation of computational models of argument from the research area of argumentation, with special consideration as to how these models can be used in implemented systems in a manner that allows laypersons to interact with them effectively. A particular model of argument which supports the process of practical reasoning is implemented in a web-based computer system, thus allowing for the collection of structured arguments which are later analysed according to formal models of argument visualisation and evaluation. The theories underlying the system are extended in order to allow for added expressivity, thus providing a mechanism for more life-like argument within a system which supports comprehensive computational analysis. Ultimately, the contributions of this thesis are a functional system to support an important part of the democratic process, and an investigation into how the underlying theories can be built upon and extended in order to promote expressive argumentation.
APA, Harvard, Vancouver, ISO, and other styles
27

German, Laura. "Academic research data re-usage in a digital age : modelling best practice." Thesis, University of Southampton, 2015. https://eprints.soton.ac.uk/383481/.

Full text
Abstract:
Recent high-profile retractions – such as the case of Woo Suk Hwang and others – demonstrate that there are still significant issues regarding the reliability of published academic research data. While technological advances offer the potential for greater data re-usability on the Web, models of best practice are yet to be fully re-purposed for a digital age. Employing interdisciplinary web science practices, this thesis asks what makes for excellent quality academic research across the sciences, social sciences and humanities. This thesis uses a case study approach to explore five existing digital data platforms within chemistry, marine environmental sciences and modern languages research. It evaluates their provenance metadata, legal, technological and socio-cultural frameworks. This thesis further draws on data collected from semi-structured interviews conducted with eighteen individuals connected to these five data platforms. The participants have a wide range of expertise in the following areas: data management, data policy, academia, law and technology. Through the interdisciplinary literature review and cross-comparison of the three case studies, this thesis identifies the five main principles for improved modelling of best practice for academic research data re-usage both now and in the future. These principles are: (1) sustainability, (2) working towards a common understanding, (3) accreditation, (4) discoverability, and (5) a good user experience. It also reveals nine key grey areas that require further investigation.
APA, Harvard, Vancouver, ISO, and other styles
28

McHugh, Andrew. "An ontology for risk management of digital collections." Thesis, University of Glasgow, 2016. http://theses.gla.ac.uk/7757/.

Full text
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. We summarise the extent of the challenge facing the digital preservation community (and by extension users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, the increasing complexity and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. These collectively imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome, and an expression of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity, a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risk can be characterised according to associated goals, activities, responsibilities and policies in terms of both their manifestation and mitigation. They have the capacity to be deconstructed into their atomic units and responsibility for their resolution delegated appropriately. We continue to describe how the manifestation of risks typically spans an entire organisational environment, and as the focus of our analysis risk safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk-factors, through the risks themselves or associated system elements. 
To do so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community. We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are exposed by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
APA, Harvard, Vancouver, ISO, and other styles
29

Murphy, Michaela. ""Lost in the noise" : DIY amateur music practice in a digital age." Thesis, University of Nottingham, 2017. http://eprints.nottingham.ac.uk/46754/.

Full text
Abstract:
A fast-expanding network of DIY music communities in the UK sees digital technologies transforming the ways in which part-time amateur musicians are able to collaborate creatively and form alliances, touring and distributing their music to an international audience and expanding the possibilities of a DIY approach to music making beyond its subcultural, micro-cultural past. The artists seek to retain and celebrate creative autonomy and control in shared non-commercial spaces that they run themselves. With an interview-based approach, this thesis explores the continued importance of gaining a local audience in a digital age, examining amateur music activities in two very distinct cities. These reveal how local traditions of amateur practice continue to influence musicians and their shared venues, both in their revival and reinvention. How DIY is defined in a digital age is also explored, with both observation and interview data revealing the continued legacy of Punk and how this plays a part in DIY’s expanding definition. The approaches and motivations behind amateur musicians seeking out and establishing shared places for their DIY practice reveal a collective striving for creative control and the creative reimagining of disused urban spaces. Whilst there is a commitment to the upkeep of these spaces, there are also essential online activities shared by the amateur musicians that assist their own personal music promotion alongside the networking and expansion of the local DIY communities. This discussion also reveals how the musicians tackle periods of isolation from their peers, as increased opportunities to collaborate remotely with others change the dynamics of bands and music scenes. Combining interview and observational data, the thesis also explores in depth the handcrafting and DIY activities practised and celebrated in the shared DIY spaces. There is then further discussion of how the musicians manage their peer networks and how they stay connected to other musicians in their local areas, revealing the more relaxed, open networking tactics widely adopted by amateur musicians in a digital age. There is a continued discussion of how the musicians are able to sustain their DIY practices on a part-time basis, with a focus on the co-operative strategies for creating a sense of community, shared values and ambitions amongst the musicians. In conclusion, I draw upon the themes of material, digital, local and global practices, revealing how amateurs seek to protect both a micro-scale, exclusive aspect to their music and opportunities for face-to-face live performance and real engagement with their peers and audiences.
APA, Harvard, Vancouver, ISO, and other styles
30

Ratzinger, Daniel. "The impact of university education upon digital start-ups." Thesis, University of Nottingham, 2017. http://eprints.nottingham.ac.uk/38838/.

Full text
Abstract:
With a worldwide shift towards a knowledge economy, universities are seen as a fundamental driver of economic growth. While previous studies have focused on universities’ more direct commercialisation activities, this research investigates the relatively unexplored influence of university education upon graduate entrepreneurship. By considering the digital economy, this exploratory study examines a fast growing sector where knowledge is considered to be a core asset. A global open dataset of digital start-ups is used to evaluate universities’ contribution to the performance of these ventures through the provision of formal technical, business or more general education. The impact of this human capital contribution on the probability of being a habitual entrepreneur in this industry sector, as well as the impact on the probability and rate of reaching the equity investment milestones of “funding” and “exit” were investigated. Prior to the data analysis, using computer science methods, unsupervised algorithms were developed to pre-process and transform the crowd-sourced dataset by linking multiple existing data sources, and it was demonstrated that this approach allows sophisticated natural language processing challenges to be overcome with relatively low technical capabilities. The consequent analysis of the transformed dataset reveals that: (1) having a founder with a university qualification significantly increases the probability of securing funding and successful exit; (2) having a founder with a university qualification in business significantly decreases the duration at which the first funding is secured and exit is achieved; (3) having a technical university qualification has no impact on the duration to securing funding, and increases the duration to exit. Following the empirical analysis, models for digital start-up teams are proposed. The thesis concludes that a consideration of the heterogeneous influence of different types of university qualifications reveals novel insights into the relationship between human capital and new venture performance.
APA, Harvard, Vancouver, ISO, and other styles
31

Ho, Chun-yin. "Group-based checkpoint/rollback recovery for large scale message-passing systems." Click to view the E-thesis via HKUTO, 2008. http://sunzi.lib.hku.hk/hkuto/record/B39794052.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

McKnight, Walter Lee. "A meta system for generating software engineering environments /." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487260531958418.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Sobel, Ann E. Kelley. "Modular verification of concurrent systems /." The Ohio State University, 1986. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487267546983528.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Ho, Chun-yin, and 何俊賢. "Group-based checkpoint/rollback recovery for large scale message-passing systems." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B39794052.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Perumalla, Kalyan S. "Techniques for efficient parallel simulation and their application to large-scale telecommunication network models." Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/13086.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Liu, Nien-chen. "A methodology for specifying and analyzing communication protocols and services /." The Ohio State University, 1986. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487267024997781.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Naderi, Ramin. "Quadtree-based processing of digital images." PDXScholar, 1986. https://pdxscholar.library.pdx.edu/open_access_etds/3590.

Full text
Abstract:
Image representation plays an important role in image processing applications, which usually contain a huge amount of data. An image is a two-dimensional array of points, and each point contains information (e.g. color). A 1024 by 1024 pixel image occupies 1 megabyte of space in main memory. In actual circumstances, 2 to 3 megabytes of space are needed to facilitate the various image processing tasks. Large amounts of secondary memory are also required to hold various data sets. In this thesis, two different operations on the quadtree are presented. There are, in general, two types of data compression techniques in image processing. One approach is based on the elimination of redundant data from the original picture. Other techniques rely on higher levels of processing such as interpretation, generation, induction and deduction procedures (1, 2). One of the popular techniques of data representation that has received a considerable amount of attention in recent years is the quadtree data structure. This has led to the development of various techniques for performing conversions and operations on the quadtree. Klinger and Dyer (3) provide a good bibliography of the history of quadtrees. Their paper reports experiments on the degree of compaction of picture representation which may be achieved with tree encoding. Their experiments show that tree encoding can produce memory savings. Pavlidis [15] reports on the approximation of pictures by quadtrees. Horowitz and Pavlidis [16] show how to segment a picture using traversal of a quadtree. They segment the picture by polygonal boundaries. Tanimoto [17] discusses distortions which may occur in quadtrees for pictures. Tanimoto [18, p. 27] observes that quadtree representation is particularly convenient for scaling a picture by powers of two. Quadtrees are also useful in graphics and animation applications [19, 20], which are oriented toward the construction of images from polygons and superpositions of images. Encoded pictures are useful for display, especially if the encoding lends itself to processing.
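The following minimal sketch (not taken from the thesis) shows the idea behind quadtree encoding: a square binary image is recursively split into four quadrants until each block is uniform, so large uniform regions collapse into single leaves and yield the memory savings discussed above. The image data and function name are illustrative assumptions.

```python
def build_quadtree(image, x=0, y=0, size=None):
    """Recursively encode a square binary image (list of lists of 0/1)
    as a nested quadtree: a leaf is 0 or 1, an internal node is a list of
    four children in NW, NE, SW, SE order. Assumes size is a power of two."""
    if size is None:
        size = len(image)
    first = image[y][x]
    if all(image[y + dy][x + dx] == first
           for dy in range(size) for dx in range(size)):
        return first  # uniform block -> single leaf
    half = size // 2
    return [build_quadtree(image, x, y, half),
            build_quadtree(image, x + half, y, half),
            build_quadtree(image, x, y + half, half),
            build_quadtree(image, x + half, y + half, half)]

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 1, 1, 1],
       [1, 1, 1, 1]]
print(build_quadtree(img))  # [0, 1, [0, 1, 1, 1], 1]
```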
APA, Harvard, Vancouver, ISO, and other styles
38

Newman, Kimberly Eileen. "A parallel digital interconnect test methodology for multi-chip module substrate networks." Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/13847.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Iqbal, Javed. "Digital literacy and access for educational inclusion : a comparative study of British Muslim girls schools." Thesis, University of Huddersfield, 2012. http://eprints.hud.ac.uk/id/eprint/18095/.

Full text
Abstract:
The educational achievements of British Muslims, particularly South Asians, have been studied in past decades, but, unfortunately, the impact of digital technologies on young Muslim children has not received sufficient attention. In addition, past studies mostly relied on quantitative methods to gain knowledge of the educational achievements of British Muslims. The thesis is grounded in a qualitative approach within a social constructionist paradigm, to elicit the views of young British Muslim girls on their use of digital technologies for educational achievement. The data presented were obtained by carrying out semi-structured interviews with a sample of young (14 to 19 years old) British Muslim girls at three single-sex Islamic faith schools, and were analysed using mainly template analysis, along with matrix analysis and cross-case analysis within and across the case studies. It was found that most of the female Muslim students interviewed for this research study were satisfied and performed competently at the case-study Islamic faith schools. Furthermore, the educational success at school A was attributable to educational norms and values relative to the provision of digital resources and skilled teaching staff. The educational experiences of schools B and C were problematic, largely because of issues with access to digital technologies and the provision of digital content and skilled teaching staff. Another factor in students’ underachievement was found to be that parents had limited levels of education and an inadequate understanding of the education system. Most of the students had a positive attitude towards the technologies. The thesis concludes that the educational achievement of British Muslim girls in schools is closely related to access to digital technologies, digital academic content, skilled academic staff and the technological infrastructure in schools. The net effect of digital technologies on Muslim girls is positive within the increasingly competitive education system. The thesis is original and is the first study of this kind to offer an insight into young British Muslim girls’ access to digital technologies and educational attainment, as reflected in key concepts through the usage and incorporation of technologies in education. Other aspects of this research include the issues of the provision of technologies at home and parents’ educational level, the contribution to knowledge, and the need for further, broader and longitudinal study.
APA, Harvard, Vancouver, ISO, and other styles
40

Baxter-Webb, Joe. "How geek kids get geek jobs : a cross-generational inquiry into digital play and young adults' careers in IT." Thesis, Canterbury Christ Church University, 2015. http://create.canterbury.ac.uk/14694/.

Full text
Abstract:
From programming 'home-brew' games to modifying the content of existing commercial titles, digital gaming can be regarded as a potential gateway into more serious uses of computers, welcoming some while repelling others. The socio-demographic makeup of computer science, games development and related areas of work is of interest to feminist scholars of culture. In light of skills shortages, industry is also interested in increasing the participation of women and ethnic minorities in STEM fields. Representational inequalities within tech are regarded as a social issue not just because this area of employment can be highly lucrative, but also because control over tech can provide other forms of empowerment - including being able to influence and shape everyday communication technologies. However, the route into these industries has historically been shaped by a number of factors including formal computing education, the rise of hobbyist computing and a surrounding masculine 'geek' culture - and a sort of reciprocal relationship between hobbyist computing and digital games. This thesis interrogates the idea of games as a form of 'technological enculturation'; the notion of a causal link between gaming and careers in computing. I take the biographies of those working in the IT sector in southeast England and explore the role of gaming in the personal histories of what appears to be a predominantly white and male group. The thesis pays great attention to salient differences between technological platforms - something relatively underdeveloped in the existing literature on player cultures and in game studies more generally. Finally, I take a cross-generational perspective by comparing the experiences of adult IT workers with a cohort of teenage ICT students. Using a theoretical framework adapted from leisure studies and the sociology of Pierre Bourdieu, I explore how certain types of game-related activity - but not all gaming - are particularly conducive to producing young people who are a good 'cultural fit' for this particular set of professions. This has implications for how we think and talk about increasing participation in STEM, as well as the somewhat under-developed role of games and game-making in UK schools.
APA, Harvard, Vancouver, ISO, and other styles
41

Alrumaithi, A. M. "Prioritisation in digital forensics : a case study of Abu Dhabi Police." Thesis, Liverpool John Moores University, 2018. http://researchonline.ljmu.ac.uk/8936/.

Full text
Abstract:
The main goal of this research is to investigate the prioritisation process in digital forensics departments in law enforcement organisations. This research is motivated by the fact that case prioritisation plays a crucial role in achieving efficient operations in digital forensics departments. Recent years have witnessed the widespread use of digital devices in every aspect of human life, around the globe. One of these aspects is crime. These devices have become an essential part of almost every investigation handled by police. The reason behind their importance lies in their ability to store huge amounts of data that can be utilised by investigators to solve the cases under consideration. Thus, involving Digital Forensics departments, though they are often over-burdened and under-resourced, is becoming compulsory for achieving successful investigations. Increasing the effectiveness of these departments requires improving their processes, including case prioritisation. The existing literature focuses on the prioritisation process within the context of crime-scene triage: the main research problem there is prioritising the digital devices found at a crime scene in a way that leads to successful digital forensics. The research problem in this thesis, by contrast, focuses on the prioritisation of cases rather than of the digital devices belonging to a specific case. Normally, Digital Forensics cases are prioritised based on several factors, where the influence of the officers handling the case plays one of the most important roles. Therefore, this research investigates how the perceptions of different individuals in a law enforcement organisation may affect case prioritisation for the Digital Forensics department. To address this prioritisation problem, the research proposes the use of maturity models and machine learning. A questionnaire was developed and distributed among officers in Abu Dhabi Police; its main goal is to measure perception regarding digital forensics among employees in Abu Dhabi Police. The responses of the subjects were divided into two sets: the first set represents responses of subjects who are experts in DF, while the other set includes the remaining subjects. Responses in the first set were averaged to produce a benchmark of the optimal questionnaire answers. A reliability measure is then proposed to summarise each subject’s perception, and the data obtained from the reliability measurement were used in machine learning models so that the process is automated. The results of the data analysis confirm the severity of the problem and show that the proposed prioritisation process can be a very effective solution, as seen in the results provided in this thesis.
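The abstract does not spell out the reliability measure or the machine-learning models, so the sketch below is only one plausible shape such a score could take: each subject's questionnaire answers are compared with the averaged expert benchmark and rescaled to a 0-1 reliability value that could then feed a classifier. All names and Likert values are invented for illustration.

```python
from statistics import mean

def reliability(responses, benchmark):
    """Toy perception-reliability score: mean absolute deviation of a
    subject's Likert answers from the expert benchmark, rescaled so that
    1.0 means full agreement and 0.0 means maximal disagreement (5-point scale)."""
    deviation = mean(abs(r - b) for r, b in zip(responses, benchmark))
    return 1.0 - deviation / 4.0

# Hypothetical data: the benchmark is the averaged answers of the expert group.
benchmark = [4.6, 4.2, 3.8, 4.9]
subjects = {"officer_A": [5, 4, 4, 5], "officer_B": [2, 3, 1, 2]}
scores = {name: reliability(ans, benchmark) for name, ans in subjects.items()}
print(scores)  # officer_A scores close to 1.0, officer_B much lower
```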
APA, Harvard, Vancouver, ISO, and other styles
42

Sander, Samuel Thomas. "Retargetable compilation for variable-grain data-parallel execution in image processing." Diss., Georgia Institute of Technology, 2002. http://hdl.handle.net/1853/13850.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Chen, Yin Fu. "SIMTM turing machine simulator." CSUSB ScholarWorks, 1995. https://scholarworks.lib.csusb.edu/etd-project/1229.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Zahidin, Ahmad Zamri. "Using Ada tasks (concurrent processing) to simulate a business system." Virtual Press, 1988. http://liblink.bsu.edu/uhtbin/catkey/539634.

Full text
Abstract:
Concurrent processing has always been a traditional problem in developing operating systems. Today, concurrent algorithms occur in many application areas such as science and engineering, artificial intelligence, business system databases, and many more. The presence of concurrent processing facilities allows the natural expression of these algorithms as concurrent programs. This is a very distinct advantage if the underlying computer offers parallelism. On the other hand, the lack of concurrent processing facilities forces these algorithms to be written as sequential programs, thus destroying the structure of the algorithms and making them hard to understand and analyze. The first major programming language that offers high-level concurrent processing facilities is Ada. Ada is a complex, general-purpose programming language that provides an excellent concurrent programming facility, the task, which is based on the rendezvous concept. In this study, concurrent processing is exercised by simulating a business system using the Ada language and its facilities. A warehouse (the business system) consists of a number of employees who purchase microwave ovens from various vendors and distribute them to several retailers. Simulation of the activities in the system is carried out by assigning each employee to a specific task, with all tasks running simultaneously. The programs written for this business system produce the transactions and financial statements of a typical business day. They (the programs) also examine the behavior of activities that occur simultaneously. The end results show that concurrency and Ada work efficiently and effectively.
Department of Computer Science
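The thesis implements this scenario with Ada tasks and rendezvous; as a loose analogue only, the Python sketch below mimics the warehouse with threads and a synchronised queue, which does not reproduce Ada's rendezvous semantics. All names and quantities are invented.

```python
import queue
import threading

stock = queue.Queue()  # synchronised channel standing in (loosely) for an Ada rendezvous

def purchasing_clerk(vendor, units):
    # One "employee task": buys microwave ovens from a vendor.
    for i in range(units):
        stock.put(f"{vendor}-oven-{i}")

def dispatch_clerk(retailer, units):
    # Another "employee task": distributes ovens to a retailer.
    for _ in range(units):
        item = stock.get()
        print(f"shipped {item} to {retailer}")
        stock.task_done()

workers = [threading.Thread(target=purchasing_clerk, args=("VendorA", 3)),
           threading.Thread(target=purchasing_clerk, args=("VendorB", 3)),
           threading.Thread(target=dispatch_clerk, args=("Retailer1", 4)),
           threading.Thread(target=dispatch_clerk, args=("Retailer2", 2))]
for w in workers:
    w.start()
for w in workers:
    w.join()
```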
APA, Harvard, Vancouver, ISO, and other styles
45

Hale-Ross, S. A. "The UK's legal response to terrorist communication in the 21st century : striking the right balance between individual privacy and collective security in the digital age." Thesis, Liverpool John Moores University, 2017. http://researchonline.ljmu.ac.uk/6726/.

Full text
Abstract:
The dynamics of private life have changed along with the vast advancements in 21st-century communications technology. Private conversations no longer take place simply in the citizen’s home or over a landline telephone, but rather online through the Internet, social media and the ever-growing list of smartphone chat applications that allow encryption. However, what often follows the legitimate use of technological advancements is criminal, or in this case terrorist, exploitation. In the digital age it has become increasingly easy for terrorist groups to communicate their propaganda and for individual terrorists to communicate freely. This has served to create an investigatory capabilities gap, thereby increasing the pressure on UK policing and security agencies in fulfilling their task of protecting national security and protecting citizens’ right to life. In response, the UK and the European Union (EU) have attempted to close the capabilities gap, and thereby ensure collective security, by enacting new laws allowing law enforcement agencies to monitor electronic communications. The UK Government has recently enacted the Investigatory Powers Act 2016 (IPA), which introduces and preserves the ability to bulk-collect and retain electronic communications data, and to obtain operators’ assistance in decryption. Although the IPA attempts to take a human rights approach, the main contentious elements in the Act are those relating to the authorities’ capabilities to intercept electronic communications data en masse, and to retain such data. Specifically, concerns currently surround the introduction of ‘backdoors’ into encrypted online services, bulk interception and equipment interference warrants, and bulk personal data sets, all of which serve to weaken the security, individual data protection and privacy rights of, potentially, the entire population. The Court of Justice of the European Union (CJEU) has been the most influential judicial body in terms of individual data protection, and thereby on the UK’s law-making process, through its key judgements in Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and others, and the conjoined case of Kärntner Landesregierung, Michael Seitlinger, Christof Tschohl and others (Digital Rights Ireland). The CJEU has done this by asserting the EU’s constitutional and legal prowess in data protection, through Article 8 of the Charter of Fundamental Rights and by way of two directives, namely the Data Protection Directive of 1995 and the e-Privacy Directive of 2002. In order to close the capabilities gap and ensure national security, the UK Government must ensure the law endures by safeguarding its cohesiveness with the jurisprudence of the CJEU and the European Court of Human Rights (ECtHR). The courts focus on different elements, built around the Convention rights, with the CJEU focused on data protection and the ECtHR on the Article 8 right to privacy. To strike the right balance between individual privacy and collective security, a human rights focus is required, with emphasis placed on the practical reality that one cannot assert privacy rights if one’s right to life is not fully protected in the first place. This focus must re-forge the UK’s counterterrorism legal structure.
Taken in conjunction with the UK’s already broadly worded counterterrorism legal framework, particularly the lack of a freedom-fighter exclusion within the legal definition of terrorism, the consequence is almost to criminalise any expression of the view that armed resistance to a brutal or repressive anti-democratic regime could in certain circumstances be justifiable, even where such resistance is directed away from non-combatant casualties. Although the current counterterrorism structure is broad, the UK and the EU must police the Internet and remove the safe places used by criminals and terrorists. The IPA fashions a way within which to achieve this, but because it can be aimed at the whole population, subject to authorisation safeguards, and following historical case law dealing with blanket policies that affect the innocent, it is likely to receive continual CJEU and ECtHR judicial scrutiny. After the UK’s exit from the EU, however, the CJEU may become less important, leaving the ECtHR to conduct the analysis. At present, the UK must follow CJEU rulings when the matter concerns EU law, whereas ECtHR decisions are merely recommendatory. The thesis found that, overall, the balance between collective security and individual data privacy rights in the UK is fairly stable because of the role and importance of judicial review, judicial independence, and the over-arching scrutiny provided by commissioners and parliamentary committees. It is further argued that a blanket approach to retaining electronic communications data is necessary for finding the terrorist in the ever-growing haystacks, because sometimes privacy rights and data protection must be curtailed to ensure the state can protect citizens’ rights to life.
APA, Harvard, Vancouver, ISO, and other styles
46

Katz, Heather Alicia. "The relationship between learners' goal orientation and their cognitive tool use and achievement in an interactive hypermedia environment." Access restricted to users with UT Austin EID Full text (PDF) from UMI/Dissertation Abstracts International, 2001. http://wwwlib.umi.com/cr/utexas/fullcit?p3033584.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Hong, Chun Pyo. "Implementation of recursive shift-invariant flow graphs in parallel pipelined processing environment." Diss., Georgia Institute of Technology, 1991. http://hdl.handle.net/1853/15678.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Shultes, Bruce Chase. "Regenerative techniques for estimating performance measures of highly dependable systems with repairs." Diss., Georgia Institute of Technology, 1997. http://hdl.handle.net/1853/25035.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Somani, Arun K. (Arun Kumar). "A unified theory of system-level diagnosis and its application to regular interconnected structures /." Thesis, McGill University, 1985. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=72037.

Full text
Abstract:
System-level diagnosis is considered to be a viable alternative to circuit-level testing in complex multiprocessor systems. The characterization problem, the diagnosability problem, and the diagnosis problem in this framework have been widely studied in the literature with respect to a special fault class, called the t-fault class, in which all fault sets of size up to t are considered. Various models for the interpretation of test outcomes have been proposed and analyzed. Among these, the four best-known models are: the symmetric invalidation model, the asymmetric invalidation model, the symmetric invalidation model with intermittent faults, and the asymmetric invalidation model with intermittent faults.
In this thesis, a completely new generalization of the characterization problem in the system-level diagnosis area is developed. This generalized characterization theorem provides necessary and sufficient conditions for any fault-pattern of any size to be uniquely diagnosable under all four models. Moreover, the following three results are obtained for the t-fault class: (1) the characterization theorem for t-diagnosable systems under the asymmetric invalidation model with intermittent faults is developed for the first time; (2) a unified t-characterization theorem covering all four models is presented; and finally (3) it is proven that the classical t-characterization theorems under the first three models and the new result for the fourth model, as mentioned in (1) above, are special cases of the generalized characterization theorem.
The general diagnosability problem is also studied. It is shown that the single-fault diagnosability problem under the asymmetric invalidation model is co-NP-complete.
As regards the diagnosis problem, most of the diagnosis algorithms developed thus far are global algorithms in which a complete syndrome is analyzed by a single supervisory processor. In this thesis, distributed diagnosis algorithms for regular interconnected structures are developed which take advantage of the interconnection architecture of a multiprocessor system.
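To make the diagnosis problem concrete, the sketch below gives a brute-force one-step diagnoser under the symmetric invalidation (PMC-style) model: a fault set is consistent with a syndrome if every fault-free tester reports the true status of the unit it tests, while faulty testers may report anything. The five-unit ring example and all names are illustrative assumptions, not taken from the thesis.

```python
from itertools import combinations

def consistent(fault_set, tests, syndrome):
    """Symmetric-invalidation check: a fault-free tester reports the true
    status of the tested unit; a faulty tester may report anything."""
    return all(syndrome[(u, v)] == (v in fault_set)
               for (u, v) in tests if u not in fault_set)

def diagnose(units, tests, syndrome, t):
    """Brute-force one-step diagnosis: return all fault sets of size <= t
    consistent with the observed syndrome."""
    return [set(fs) for k in range(t + 1)
            for fs in combinations(units, k)
            if consistent(set(fs), tests, syndrome)]

# Tiny 5-unit ring where each unit tests its successor; unit 2 is faulty.
units = range(5)
tests = [(i, (i + 1) % 5) for i in units]
syndrome = {(0, 1): False, (1, 2): True, (2, 3): False, (3, 4): False, (4, 0): False}
print(diagnose(units, tests, syndrome, t=1))  # [{2}] -- uniquely identifies unit 2
```

A global diagnoser of this kind analyses the complete syndrome centrally; the distributed algorithms the abstract mentions instead exploit the regular interconnection structure so that the units diagnose each other without a single supervisory processor.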
APA, Harvard, Vancouver, ISO, and other styles
50

Al-Kofahi, Khalid A. "Reliability analysis of triple modular redundancy system with spare /." Online version of thesis, 1993. http://hdl.handle.net/1850/11565.

Full text
APA, Harvard, Vancouver, ISO, and other styles