
Theses on the topic "Novell software"


Consult the top 50 theses for your research on the topic "Novell software".

Next to each source in the reference list there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organise your bibliography correctly.

1

Du Preez, Jacobus Andries. "Novell's open source evolution: a case study in adapting open source business strategies". Pretoria: [s.n.], 2006. http://upetd.up.ac.za/thesis/available/etd-03102006-171345.

2

Lee, Chun-to Michael and 李俊圖. "Novel techniques for implementing tamper-resistant software". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2004. http://hub.hku.hk/bib/B42577494.

3

Lee, Chun-to Michael. "Novel techniques for implementing tamper-resistant software". Click to view the E-thesis via HKUTO, 2004. http://sunzi.lib.hku.hk/hkuto/record/B42577494.

4

Harkness, Rob. "Novel software solutions for automating biochemical assays". Thesis, University of Surrey, 2009. http://epubs.surrey.ac.uk/843979/.

Abstract
Laboratory automation is used throughout the pharmaceutical and biotechnology industries to assist research within the drug discovery process. Many software packages are commercially available for automating biochemical assays, such as the ELISA, as part of this process. However, it is often difficult for a scientist to translate their assay into what is essentially a piece of programming logic. Advanced users with an understanding of basic programming are often required. By shifting the development approach, a software product has been created that focuses on how the user can set up an assay as opposed to how the software will automate instrumentation. A review of existing software in the field of laboratory automation and the scheduling methods that are used has provided a basic platform from which a new product, Overlord2, has been written using the Microsoft .NET framework. A flow chart interface has been selected as the method of describing an assay. This has the distinct advantage of allowing the user to control how their assay will be processed, unlike the commercial products that currently exist. A new method of event-driven scheduling has been created that fully utilizes this new flow chart interface. A simple underlying architecture has also been created that separates the core functionality into discrete components. This design has significantly improved the development-testing lifecycle. Additionally, this has allowed custom applications, tailored to the user's requirements, to be implemented using a set of common components, a novel concept in the field of laboratory automation. A software package, Overlord2, has been produced as part of this work using the latest programming technologies. At its core, it uses an instantly recognisable flow chart interface for assay creation. With this software, a scientist with limited programming knowledge can automate the most common types of assay carried out in the drug discovery process.
5

Andersson, Björn and Marie Persson. "Software Reliability Prediction – An Evaluation of a Novel Technique". Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3589.

Abstract
Along with continuously increasing computerization, our expectations of software and hardware reliability increase considerably. Therefore, software reliability has become one of the most important software quality attributes. Software reliability modeling based on test data is done to estimate whether the current reliability level meets the requirements for the product. Software reliability modeling also provides possibilities to predict reliability. The costs of software development and testing, together with profit issues related to software reliability, are among the main motivations for software reliability prediction. Software reliability prediction currently uses different models for this purpose. Parameters have to be set in order to tune a model to fit the test data. A slightly different prediction model, Time Invariance Estimation (TIE), has been developed to challenge the models used today. An experiment is set up to investigate whether TIE could be found useful in a software reliability prediction context. The experiment is based on a comparison between the ordinary reliability prediction models and TIE.
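The abstract does not give TIE's internals, so as orientation only, here is a sketch of the kind of conventional parametric model TIE is compared against: a Goel-Okumoto NHPP curve m(t) = a(1 − e^(−bt)) tuned to fit test data. The failure counts, initial guesses and week numbers below are invented for illustration.

```python
# Minimal sketch of fitting a conventional reliability-growth model
# (Goel-Okumoto NHPP) to test data; hypothetical data, not the thesis's.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    # Expected cumulative failures by time t: a = total faults, b = detection rate.
    return a * (1.0 - np.exp(-b * t))

# Hypothetical cumulative failure counts observed at weekly intervals.
t = np.arange(1, 11, dtype=float)
failures = np.array([12, 21, 28, 34, 38, 41, 44, 45, 47, 48], dtype=float)

(a, b), _ = curve_fit(goel_okumoto, t, failures, p0=(50.0, 0.1))
print(f"estimated total faults a = {a:.1f}, detection rate b = {b:.3f}")
print(f"predicted failures by week 15: {goel_okumoto(15.0, a, b):.1f}")
```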
6

Voightmann, Michael P. (Michael Paul) 1979. "Generating quality software specifications for decision support : a novel approach". Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/17836.

Abstract
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2004.
Includes bibliographical references (leaves 70-74).
An approach utilizing cognitive support tools is presented with the purpose of improving upon the generation of quality software requirement specifications. Based on successful tools in product development and cognitive engineering, the framework suggests how to improve upon user comprehension by the creation of decision support tools. These tools can facilitate the construction of a well-structured system map, incorporating key cognitive aspects, as well as identifying hidden requirements. Minimizing the requirements errors early in the process can have a tremendous impact on the success of the project. Current practices have been unable to improve upon the large failure rate of software systems. They generally lack the structure that cognitive support tools can provide. In order to assess the proposed approach, an overview of the current state of software requirements practice is provided. Key product development and cognitive support approaches were then analyzed. The proposed approach is described, followed by suggestions for future tools that could have a great impact on requirements engineering. Finally, the proposed approach is evaluated by applying it to an automotive application: adaptive cruise control.
by Michael P. Voightmann.
S.M.
7

Jameson, Brian Douglas. "A NOVEL MULTI-FUNCTIONAL SOFTWARE-DEFINED RADAR: THEORY & EXPERIMENTS". Miami University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=miami1375821039.

8

Senft, Björn [Verfasser]. "A value-centered software engineering approach for unique and novel software-based solutions : aligning design thinking with a coopetition-based evolutionary software development / Björn Senft". Paderborn : Universitätsbibliothek, 2021. http://d-nb.info/1236630041/34.

9

Pacione, Michael John. "A novel software visualisation model to support object-oriented program comprehension". Thesis, University of Strathclyde, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.428841.

10

Zhan, Ryan A. "Development of Novel Hardware and Software for Wind Turbine Condition Monitoring". DigitalCommons@CalPoly, 2021. https://digitalcommons.calpoly.edu/theses/2268.

Abstract
With the increased use of wind turbines as sources of energy, maintenance of these devices becomes more and more important. Utility-scale wind turbines can be time-consuming and expensive to repair, so an intelligent method of monitoring these devices is important. Commercial solutions for condition monitoring exist but are expensive and can be difficult to implement. In this project a novel condition monitoring system is developed. The priority of this system, dubbed the LifeLine, is to provide reliable condition monitoring through an easy-to-install and low-cost system. This system utilizes a microcontroller to collect acceleration data to detect imbalances on turbine blades. Two graphical user interfaces are created: one improves control of a small wind turbine while the other interfaces with the LifeLine. A custom PCB is designed for the LifeLine, and additional rotor speed, current, and voltage sensors are incorporated into the LifeLine system. Future improvements to this system are also discussed.
11

Ozada, Neriman. "A novel musculoskeletal joint modelling for orthopaedic applications". Thesis, Brunel University, 2008. http://bura.brunel.ac.uk/handle/2438/6556.

Abstract
The objective of the work carried out in this thesis was to develop analytical and computational tools to model and investigate musculoskeletal human joints. It was recognised that FEA was used by many researchers in modelling human musculoskeletal motion, loading and stresses. However, continuum mechanics played only a minor role in determining articular joint motion, and its value was questionable, firstly because of the computational cost and secondly because of its impracticality for this application. On the other hand, there is no suitable software for precise articular joint motion analysis that deals with local joint stresses or non-standard joints. The main requirement in the orthopaedics field is to develop modeller software (and its associated theories) to model an anatomic joint as it is, without any simplification with respect to joint surface morphology and the material properties of surrounding tissues, so that the proposed modeller can be used for evaluating and diagnosing different joint abnormalities and, furthermore, form the basis for performing implant insertion and analysis of artificial joints. The work presented in this thesis is a new framework developed for human anatomic joint analysis which describes the joint in terms of its surface geometry and surrounding musculoskeletal tissues. In achieving such a framework, several contributions were made to 6DOF linear and nonlinear joint modelling, the mathematical definition of joint stiffness, tissue path finding and wrapping, and contact with collision analysis. In 6DOF linear joint modelling, the contribution is the development of joint stiffness and damping matrices. This modelling approach is suitable for the linear range of tissue stiffness and damping properties. It is the first of its kind and gives a firm analytical basis for investigating joints with surrounding tissue and cartilage. The 6DOF nonlinear joint modelling is a new scheme for modelling the motion of multiple bodies joined by nonlinear stiffness and contact elements. The proposed method requires no matrix assembly for the stiffness, damping or mass elements. The novelty in the nonlinear modelling relates to the overall algorithmic approach and the handling of local nonlinearity by procedural means. The mathematical definition of joint stiffness is also a new proposal, based on the mathematical definition of stiffness between two bodies. Based on the properties of the joint stiffness matrix, a number of joint stiffness invariants were obtained analytically, such as the centre of stiffness, the principal translational stiffnesses, and the principal rotational stiffnesses. Corresponding to these principal stiffnesses, their principal axes have also been obtained. Altogether, a joint is assessed by six principal axes, six principal stiffnesses and its centre of stiffness. These formulations are new and show that a joint can be described in terms of inherent stiffness properties. It is expected that these will characterise a joint better than laxity-based characterisation. The tissue path finding and wrapping algorithms are also new approaches. Musculoskeletal tissue wrapping involves calculating the shortest distance between two points on a meshed surface, for which a new heuristic algorithm was proposed.
The heuristic is based on minimising the accumulated divergence between the straight line joining two points on the surface and the direction of travel on the surface (i.e. the bone). In the contact and collision work, a novel algorithm has been proposed that detects possible colliding points on the motion trajectory by redefining distance as a two-dimensional measure: one component along the approach-velocity vector and one perpendicular to it. The perpendicular distance determines whether there are potentially colliding points, and the distance along the velocity determines how close they are. The closest pair among the potentially colliding points gives the "time to collision". The algorithm can eliminate the "fly pass" situation, where very close points may not collide because of the direction of their relative velocity. All of these algorithms and modelling theories have been encompassed in the prototype software developed to simulate anatomic joint articulations through the modelling formulations developed. The software platform provides a capability for analysing joints as 6DOF joints based on anatomic joint surfaces. The software is highly interactive and driven by a well-structured database, designed to be highly flexible for future developments. Two case studies are carried out in this thesis in order to generate results relating to all the proposed elements of the study. The results obtained from the case studies show good agreement with previously published results or model-based results obtained from the LifeMOD software, whenever comparison was possible. In some cases comparison was not possible because there were no equivalent results; those results were supported by other indicators. The modelling-based results were also supported by experiments performed in the Brunel Orthopaedic Research and Learning Centre.
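As a rough illustration of the two-dimensional distance measure described above, the sketch below splits the separation of a hypothetical point pair into components along and perpendicular to the relative velocity; the coordinates, tolerance and function name are assumptions, not the thesis's implementation.

```python
# Sketch: split separation into along-velocity and perpendicular components;
# a small perpendicular component flags a potentially colliding pair, and the
# along-velocity component gives the time to collision, screening out
# "fly pass" pairs that are close but not converging.
import numpy as np

def time_to_collision(p_a, p_b, v_rel, tol=1e-3):
    """Return time until p_b reaches p_a along v_rel, or None if no collision."""
    d = p_a - p_b                                # separation vector
    speed = np.linalg.norm(v_rel)
    if speed == 0.0:
        return None
    u = v_rel / speed                            # unit approach direction
    along = float(np.dot(d, u))                  # distance along the velocity
    perp = float(np.linalg.norm(d - along * u))  # distance off the track
    if perp > tol or along < 0.0:                # misses, or moving apart
        return None
    return along / speed

print(time_to_collision(np.array([1.0, 0.0, 0.0]),
                        np.array([0.0, 0.0, 0.0]),
                        np.array([0.5, 0.0, 0.0])))   # -> 2.0
```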
12

Burton, Daniel John. "Development of a novel hybrid field and zone fire model". Thesis, University of Greenwich, 2011. http://gala.gre.ac.uk/9086/.

Abstract
This thesis describes the design and implementation of a novel hybrid field/zone fire model, linking a fire field model to a zone model. This novel concept was implemented using SMARTFIRE (a fire field model produced at the University of Greenwich) and two different zone models (CFAST, which is produced by NIST, and FSEG-ZONE, which has been produced by the author during the course of this work). The intention of the hybrid model is to reduce the amount of computation incurred in using field models to simulate multi-compartment geometries, and it will be implemented to allow users to employ the zone component without having to make further technical considerations, in line with the existing paradigm of the SMARTFIRE suite. In using the hybrid model only the most important or complex parts of the geometry are fully modelled using the field model. Other suitable and less important parts of the geometry are modelled using the zone model. From the field model's perspective the zone model is represented as an accurate pressure boundary condition. From the zone model's perspective the energy and mass fluxes crossing the interface between the models are seen as point sources. The models are fully coupled and iterate towards a solution ensuring both global conservation and conservation between regions using different computational methods. By using this approach a significant proportion of the computational cells can be replaced by a relatively simple zone model, saving computational time. The hybrid model can be used in a wide range of situations but will be especially applicable to large geometries, such as hotels, prisons, factories or ships, where the domain size typically proves to be extremely computationally expensive for treatment using a field model. The capability to model such geometries without the associated mesh overheads could eventually permit simulations to be run in 'faster-real-time', allowing the spread of fire and effluents to be modelled, along with a close coupling with evacuation software, to provide a tool not just for research objectives, but to allow real time incident management in emergency situations. Initial 'proof of concept' work began with the development of one-way coupling regimes to demonstrate that a valid link between models could allow communication and conservation of the respective variables. This was extended to a two-way coupling regime using the CFAST zone model, and results of this implementation are presented. Fundamental differences between the SMARTFIRE and CFAST models resulted in the development of the FSEG-ZONE model to address several issues; this implementation and numerous results are discussed at length. Finally, several additions were made to the FSEG-ZONE model that are necessary for an accurate consideration of fire simulations. The test cases presented in this thesis show that a good agreement with full-field results can be obtained through use of the hybrid model, while the reduction in computational time realised is approximately equivalent to the percentage of domain cells that are replaced by the zone calculations of the hybrid model.
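The coupling scheme lends itself to a small illustration. The toy loop below is not SMARTFIRE/FSEG-ZONE code and all constants are invented: the field region computes a doorway mass flux for the current zone pressure, the zone returns an updated pressure for that source term, and the two relax towards a mutually consistent state.

```python
# Toy two-way coupling iteration: field region sees the zone as a pressure
# boundary; zone sees the field as a point source of mass. Invented physics.
def field_doorway_flux(p_zone, p_field=101_500.0, k=2.0e-3):
    """Mass flux (kg/s) into the zone; p_field is elevated by the fire."""
    return k * (p_field - p_zone)

def zone_pressure(m_dot, p0=101_325.0, gain=500.0):
    """Quasi-steady zone pressure rise for a given inflow (toy model)."""
    return p0 + gain * m_dot

p = 101_325.0                      # initial zone pressure guess (Pa)
for it in range(100):
    m_dot = field_doorway_flux(p)  # field step: flux for current boundary
    p_new = zone_pressure(m_dot)   # zone step: pressure for that source
    if abs(p_new - p) < 1e-6:      # converged: both regions consistent
        break
    p = 0.5 * (p + p_new)          # under-relaxation keeps iteration stable
print(it, m_dot, p)
```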
13

Tangworakitthaworn, Preecha. "ILO diagram : a novel conceptual model of intended learning outcomes". Thesis, University of Southampton, 2014. https://eprints.soton.ac.uk/370600/.

Abstract
Achieving intended learning outcomes (ILOs) in education is an ongoing topic within the distance learning and educational communities. The term "ILOs" has been introduced to indicate what learners will be able to do by the end of a course of study. Developing the ILO structure, in which the subject matter and their relationships are integrated with the capabilities to be learned, is a challenge for instructional designers. In this research, the ILO diagram, a novel conceptual model of intended learning outcomes, is proposed to support not only instructional designers in designing and developing courses of study, but also learners and instructors in performing the courses' learning and teaching activities. The research covers three objectives. First, in order to pioneer courses of study which consider all stakeholders in education, the research aims primarily to reconcile constructivist and instructivist theories in order to propose an equivalent architecture, using ILOs to support learning and teaching. Second, more significantly, the research aims to contribute a novel conceptual model of ILOs (called an ILO diagram) using a diagrammatic technique. In the ILO diagram, ILO nodes are represented by the two-dimensional classification of a performance/content matrix based on the component display theory proposed by Merrill. The ILO relationships form a hierarchical structure using a cognitive hierarchy comprising six levels adopted from Bloom's taxonomy of the cognitive domain. Moreover, three types of principal relationship, two types of composite relationship, and three relationship constraints are proposed. Finally, the third objective of the research is to experimentally ascertain how the structured ILO format conceptualised through the proposed ILO diagram can contribute to both teaching and learning. Accordingly, three experimental studies were conducted to explore whether providing a well-defined structure of ILOs, conceptualised through the ILO diagram, can facilitate teaching and learning. In the first experiment, the main aim was to investigate the instructors' satisfaction with using the ILO diagram in teaching. The results revealed that the proposed ILO diagram satisfied the instructors, with higher ratings for perceived usefulness, perceived ease of use, and attitude towards representing ILOs than a plain-text document. The second experiment investigated whether using the ILO diagram to facilitate learning can support learners in identifying learning paths. The results revealed that the mean completeness of all learning paths was statistically significantly higher with the structured ILOs (ILO diagram), showing that the learners benefited from the ILO diagram in performing their self-regulated learning. Finally, the last experiment investigated how well learners understand the conceptual representation of the ILO diagram. The results revealed that the mean understandability of the conceptual representation of the ILO diagram was higher than for both the sentential and tabular representations. These findings indicate that the ILO diagram provides better understandability than the sentential and tabular representational styles of ILOs.
14

Sim, Kevin. "Novel hyper-heuristics applied to the domain of bin packing". Thesis, Edinburgh Napier University, 2014. http://researchrepository.napier.ac.uk/Output/7563.

Abstract
Principal to the ideology behind hyper-heuristic research is the desire to increase the level of generality of heuristic procedures so that they can be easily applied to a wide variety of problems to produce solutions of adequate quality within practical timescales. This thesis examines hyper-heuristics within a single problem domain, that of bin packing, where the benefits to be gained from selecting or generating heuristics for large problem sets with widely differing characteristics are considered. Novel implementations of both selective and generative hyper-heuristics are proposed. The former approach attempts to map the characteristics of a problem to the heuristic that best solves it, while the latter uses genetic programming techniques to automate the heuristic design process. Results obtained using the selective approach show that solution quality was improved significantly when contrasted with the performance of the best single heuristic applied to large sets of diverse problem instances. While reinforcing the benefits to be gained by selecting from a range of heuristics, the study also highlighted the lack of diversity in human-designed algorithms. Using genetic programming techniques to automate the heuristic design process allowed both single heuristics and collectives of heuristics to be generated that were shown to perform significantly better than their human-designed counterparts. The thesis concludes by combining both selective and generative hyper-heuristic approaches into a novel immune-inspired system in which heuristics that cover distinct areas of the problem space are generated. The system is shown to have a number of advantages over similar cooperative approaches in terms of its plasticity, efficiency and long-term memory. Extensive testing of all of the hyper-heuristics developed, on large sets of both benchmark and newly generated problem instances, reinforces the utility of hyper-heuristics in their goal of producing fast, understandable procedures that give good-quality solutions for a range of problems with widely varying characteristics.
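To make "generating heuristics" concrete, the sketch below applies one hypothetical GP-evolved scoring expression in a one-dimensional bin-packing loop; the expression shown (a tight-fit preference) and the capacity are illustrative stand-ins, not heuristics evolved in the thesis.

```python
# Sketch: a generated bin-packing heuristic is a score(bin, item) expression;
# packing places each item in the feasible bin with the best score,
# falling back to opening a new bin.
CAPACITY = 100

def evolved_score(free, size):
    # One candidate expression a GP run might produce: prefer tight fits.
    return -(free - size)          # highest score = least leftover space

def pack(items, score=evolved_score):
    bins = []                      # each bin is its remaining free capacity
    for size in items:
        feasible = [i for i, free in enumerate(bins) if free >= size]
        if feasible:
            best = max(feasible, key=lambda i: score(bins[i], size))
            bins[best] -= size
        else:
            bins.append(CAPACITY - size)
    return bins

print(len(pack([70, 40, 30, 60, 50, 20, 30])))  # number of bins used -> 3
```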
15

Gelinas, Robert. "A novel approach to modeling tunnel junction diodes using Silvaco Atlas software /". Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2005. http://library.nps.navy.mil/uhtbin/hyperion/05Dec%5FGelinas.pdf.

16

Rossouw, Jacobus Laurence. "Novel software to reduce the risk of energy related illnesses / J.L. Rossouw". Thesis, North-West University, 2006. http://hdl.handle.net/10394/1313.

17

Pavlou, Georgios. "Telecommunications management network : a novel approach for its architecture through software platforms". Thesis, University College London (University of London), 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.300448.

18

Gelinas, Robert J. "A novel approach to modeling tunnel junction diodes using Silvaco Atlas software". Thesis, Monterey, California. Naval Postgraduate School, 2005. http://hdl.handle.net/10945/1784.

Abstract
This thesis investigates the ability to model a tunnel junction device using the ATLAS device simulator by Silvaco International. The tunnel junction is a critical component of a multijunction solar cell, and this thesis concentrates on simulating the tunnel junction for application as part of a multijunction solar cell. It tries several methods in the ATLAS device simulator to produce a model of the tunnel junction that can later be used while designing multijunction devices. These methods consist of reviewing past work, attempting to modify past work for application in the current design, producing a new tunnel junction simulation from the ground up, and reviewing simulations of similar devices to learn whether they can be modified and applied to making a working tunnel junction model.
19

Cockshott, Malcolm Tunde. "Wet and sticky : a novel model for computer-based painting". Thesis, University of Glasgow, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299837.

20

Filho, Edson Costa de Barros Carvalho. "Investigation of Boolean neural networks on a novel goal-seeking neuron". Thesis, University of Kent, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.277285.

21

Graves, Jamie Robert. "Forensic verification of operating system activity via novel data, acquisition and analysis techniques". Thesis, Edinburgh Napier University, 2009. http://researchrepository.napier.ac.uk/Output/6699.

Abstract
Digital Forensics is a nascent field that faces a number of technical, procedural and cultural difficulties that must be overcome if it is to be recognised as a scientific discipline, and not just an art. Technical problems involve the need to develop standardised tools and techniques for the collection and analysis of digital evidence. This thesis is mainly concerned with the technical difficulties faced by the domain, in particular the exploration of techniques that could form the basis of trusted standards to scientifically verify data. This study presents a set of techniques and methodologies that can be used to describe the fitness of system calls originating from the Windows NT platform as a form of evidence. It does so in a manner that allows for open investigation into the manner in which the activities described by this form of evidence can be verified. The performance impact on the Device Under Test (DUT) is explored via the division of the Windows NT system calls into service subsets. Of particular interest to this work is the file subset, as these system calls can be directly linked to user interaction. The quality of data produced by the collection tool is then examined via the use of the Basic Local Alignment Search Tool (BLAST) sequence alignment algorithm. In doing so, this study asserts that system calls provide a recording, or time line, of evidence extracted from the operating system, which represents actions undertaken. In addition, it asserts that these interactions can be compared against known profiles (fingerprints) of activity using BLAST, which can provide a set of statistics relating to the quality of a match, and a measure of the similarities of the sequences under scrutiny. These are based on Karlin-Altschul statistics, which provide, amongst other values, a P-value describing how often a sequence will occur within a search space. The manner in which these statistics are calculated is augmented by the novel generation of the NM1.5_D7326 scoring matrix, based on empirical data gathered from the operating system, which is compared against the de facto, biologically generated, BLOSUM62 scoring matrix. The impact on the Windows 2000 and Windows XP DUTs of monitoring most of the service subsets, including the file subset, is statistically insignificant when simple user interactions are performed on the operating system. For the file subset, p = 0.58 on Windows 2000 Service Pack 4, and p = 0.84 on Windows XP Service Pack 1. This study shows that if the event occurred in a sequence that originated on an operating system that was not subjected to high process load or system stress, a great deal of confidence can be placed in a gapped match, using either the NM1.5_D7326 or BLOSUM62 scoring matrices, indicating an event occurred, as all fingerprints of interest (FOI) were identified. The worst-case BLOSUM62 P-value = 1.10E-125, and worst-case NM1.5_D7326 P-value = 1.60E-72, showing that these matrices are comparable in their sensitivity during normal system conditions. This cannot be said for sequences gathered during high process load or system stress conditions. The NM1.5_D7326 scoring matrix failed to identify any FOI. The BLOSUM62 scoring matrix returned a number of matches that may have been the FOI, as discerned via the supporting statistics, but were not positively identified within the evaluation criteria. The techniques presented in this thesis are useful, structured and quantifiable.
They provide the basis for a set of methodologies that can be used for providing objective data for additional studies into this form of evidence, which can further explore the details of the calibration and analysis methods, thus supplying the basis for a trusted form of evidence, which may be described as fit-for-purpose.
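The Karlin-Altschul quantities mentioned above follow two standard formulas, sketched here with illustrative lambda and K values rather than the thesis's fitted parameters.

```python
# For a local alignment score S in a search space of size m*n, the expected
# number of chance hits is E = K*m*n*exp(-lambda*S), and the P-value is the
# probability of at least one such hit, P = 1 - exp(-E).
import math

def karlin_altschul(score, m, n, lam=0.27, k=0.041):
    e_value = k * m * n * math.exp(-lam * score)
    p_value = 1.0 - math.exp(-e_value)
    return e_value, p_value

# Hypothetical: a fingerprint of interest scoring 120, query length 500,
# searched against a system-call trace of length 2_000_000.
e, p = karlin_altschul(120, 500, 2_000_000)
print(f"E = {e:.3e}, P = {p:.3e}")
```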
22

Li, Jianzhi. "A novel approach to evolving legacy software systems into a grid computing environment". Thesis, De Montfort University, 2006. http://hdl.handle.net/2086/4103.

23

Pedemonte, Aguilar Iván. "Evaluation of open pit slope deformation using novel numerical modeling software slope model". Tesis, Universidad de Chile, 2018. http://repositorio.uchile.cl/handle/2250/159333.

Abstract
Master's thesis (Magíster en Minería).
As open pit mines grow and deepen, the stability of their slopes takes on an important and critical role, which is why the prior estimation of rock behaviour at each stage of mine construction is crucial to ensure long-term stability. A correct estimate allows steeper designs and an improved waste-to-ore stripping ratio, which should be reflected in a better NPV for the project. The objective of this thesis is to validate the new slope stability analysis software Slope Model by comparing it with the already validated software 3DEC. The aim of the project is to improve knowledge of the behaviour of fractured rock masses. Various numerical modelling programs currently exist for estimating the behaviour of mine slopes, ranging from the limit equilibrium (LE) method to more complex analytical mathematical approaches. The choice of one method over another depends on several factors, such as the level of detail required for the zone under study, the rock properties, and the number of discontinuities present. The most commonly used methods are continuum, discontinuum and hybrid methods. Within the framework of discontinuum models it was observed that current software packages are not able to reproduce the creation and propagation of new fractures through the breakage of intact rock, which does occur in reality. For this reason the new software Slope Model (SM), from the company ITASCA, was chosen, since it does reproduce these phenomena, which are very important for the geotechnical study of the analysed area. As SM is software under development, its results were compared against a model built with the 3DEC software. In this thesis a simplified representation of a slope of a mine located in Chile was carried out. Using the same input parameters, the SM results correctly represent the main displacements, with differences in the magnitude of the values. The factors of safety obtained with SM are slightly lower than with 3DEC, which agrees with theory, since SM is able to represent the breakage of intact rock and the propagation of fractures, resulting in a lower rock strength.
24

Muradas, Fernando Martins. "A novel framework for requirements elicitation in a military setting". Thesis, University of Oxford, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.589599.

Abstract
Due to their unique characteristics, military domains contain various peculiarities that directly or indirectly, and favourably or unfavourably, impact the development of software products. Observations of systems development in the Brazilian Navy over many years have shown that systems are usually developed and delivered with many delays, and that during development there are many changes to the requirements initially elicited. Since many authors in the software engineering literature agree that problems in requirements account for more than 70% of total system development failures, it seemed logical that any effort to solve the problems encountered in the military environment should start by focusing on the requirements area. Several techniques and approaches already exist that support the execution of activities in this phase. With this abundance of techniques, it becomes a difficult task for analysts to select the best technique in a given circumstance. To support the selection of these techniques, several frameworks were also created, each one guided by a respective group of influencing factors. This study examined these techniques and frameworks in detail, and noticed that there are still issues to be addressed to guide the selection of elicitation techniques, especially in an environment as complex as the military one. To elicit these issues an exploratory qualitative study was performed. The results showed that social issues rather than technical ones were the main concern in the domain under study. Issues such as hierarchy and high staff turnover interfere in the requirements process but are not yet addressed. The findings of the qualitative research are the first contribution of the thesis. Based on these results this research proposed a novel framework for requirements elicitation within the context of a military environment taking into account social and technical factors, which is the second contribution of the thesis. Such a framework was developed following Conflict Theory from sociology. This framework organizes a selection of techniques based on possible solutions to conflicts. Finally, the solution was evaluated to assess its efficacy. This evaluation was based on qualitative and quantitative research, and based on the evaluation results the framework was updated.
25

Cancellieri, Michela. "Computer-aided design, synthesis and evaluation of novel antiviral compounds". Thesis, Cardiff University, 2014. http://orca.cf.ac.uk/69187/.

Abstract
RNA viruses are a major cause of disease and in the last fifteen years have accounted for frequent outbreaks, infecting both humans and animals. Examples of emerging or re-emerging viral pathogens are Foot-and-Mouth disease virus (FMDV) for animals, and Chikungunya virus (CHIKV), Coxsackie virus B3 (CVB3) and Respiratory Syncytial virus (RSV) for humans, all responsible for infections associated with mild to severe complications. Although both vaccines and small-molecule compounds are at different stages of development, no selective antiviral drugs have been approved so far; therefore, for all four of these viruses improved treatment strategies are required. Promising targets are the viral non-structural proteins, which are commonly evaluated for the identification of new antivirals. Starting from the study of different viral proteins, several computer-aided techniques were applied, aiming first to identify hit molecules, and secondly to synthesise new series of potential antiviral compounds. The available crystal structures of some of the proteins that play a role in viral replication were used for structure- and ligand-based virtual screenings of commercially available compounds against CVB3, FMDV and RSV. New families of potential anti-CHIKV compounds were rationally designed and synthesized in order to establish a structure-activity relationship study on a lead structure previously found in our group. Finally, a de novo drug design approach was performed to find a suitable scaffold for the synthesis of a series of zinc-ejecting compounds against RSV. Inhibition of virus replication was evaluated for all the new compounds, of which several showed antiviral potential.
26

Judeh, Thair. "SEA: a novel computational and GUI software pipeline for detecting activated biological sub-pathways". ScholarWorks@UNO, 2011. http://scholarworks.uno.edu/td/463.

Abstract
With the ever-increasing amount of high-throughput molecular profile data, biologists need versatile tools that enable them to quickly and succinctly analyze their data. Furthermore, pathway databases have grown increasingly robust, with the KEGG database at the forefront. Previous tools have color-coded the genes on different pathways using differential expression analysis. Unfortunately, they do not adequately capture the relationships of the genes amongst one another. Structure Enrichment Analysis (SEA) thus seeks to take biological analysis to the next level. SEA accomplishes this goal by highlighting for users the sub-pathways of biological pathways that best correspond to their molecular profile data, in an easy-to-use GUI.
27

Wang, Chen. "Novel software tool for microsatellite instability classification and landscape of microsatellite instability in osteosarcoma". Miami University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=miami1554829925088174.

28

Czyrnyj, Catriona. "UROKIN: A Novel Software for Kinematic Analysis of Urogenital Motion Using Transperineal Ultrasound Imaging". Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/36147.

Abstract
Dynamic transperineal ultrasound (TPUS) video allows for kinematic analysis of urogenital morphology and mobility; however, measures are often limited to peak displacements of anatomical landmarks and are vulnerable to error incurred by probe rotation during imaging. This thesis aimed to (1) develop an algorithm to calculate kinematic curves of urogenital landmark motion from TPUS video and (2) investigate the error incurred in these kinematic measures due to in-plane ultrasound probe rotation. UROKIN, a semi-automated software package, was developed and, as a proof of concept, was used to identify differences in urogenital kinematics during pelvic floor muscle maximum voluntary contractions between women with and without stress urinary incontinence. A mathematical model revealed that the error incurred by TPUS probe rotation in the x- (anterior-posterior) and y- (cranial-caudal) directions is a factor of: r, the radius of rotation; θ, the in-plane angular probe rotation; and α, the angular deviation between the anatomical planes and the coordinate system in which error is calculated. As an absolute measure, the error incurred by in-plane probe rotation reduces to a factor of only r and θ. Moving forward, UROKIN must be adapted to include the findings from (1), and must be tested for validity and reliability.
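The stated dependence on r, θ and α can be made concrete with elementary geometry (a sketch consistent with the abstract, not the thesis's own derivation). A landmark at radius r from the centre of rotation, whose position vector makes angle α with the image axes, is displaced by an in-plane rotation θ as follows:

```latex
% Per-axis displacement of a landmark under an in-plane probe rotation \theta:
\Delta x = r\left[\cos(\alpha+\theta)-\cos\alpha\right], \qquad
\Delta y = r\left[\sin(\alpha+\theta)-\sin\alpha\right]
% Each axis component depends on r, \theta and \alpha, but the absolute
% displacement collapses to a chord length independent of \alpha:
\|\Delta\| = \sqrt{\Delta x^{2}+\Delta y^{2}}
           = r\sqrt{2-2\cos\theta} = 2r\sin\!\left(\tfrac{\theta}{2}\right)
```

This matches the abstract's statement that the per-axis error is a function of r, θ and α, while the absolute error depends on r and θ alone.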
29

Shepherd, Adrian John. "Novel second-order techniques and global optimisation methods for supervised training of multi-layer perceptrons". Thesis, University College London (University of London), 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.321662.

30

Löffler-Wirth, Henry, Edith Willscher, Peter Ahnert, Kerstin Wirkner, Christoph Engel, Markus Löffler and Hans Binder. "Novel anthropometry based on 3D-bodyscans applied to a large population based cohort". Universitätsbibliothek Leipzig, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-207844.

Abstract
Three-dimensional (3D) whole body scanners are increasingly used as precise measuring tools for the rapid quantification of anthropometric measures in epidemiological studies. We analyzed 3D whole body scanning data of nearly 10,000 participants of a cohort collected from the adult population of Leipzig, one of the largest cities in Eastern Germany. We present a novel approach for the systematic analysis of this data which aims at identifying distinguishable clusters of body shapes called body types. In the first step, our method aggregates body measures provided by the scanner into meta-measures, each representing one relevant dimension of the body shape. In a next step, we stratified the cohort into body types and assessed their stability and dependence on the size of the underlying cohort. Using self-organizing maps (SOM) we identified thirteen robust meta-measures and fifteen body types comprising between 1 and 18 percent of the total cohort size. Thirteen of them are virtually gender specific (six for women and seven for men) and thus reflect most abundant body shapes of women and men. Two body types include both women and men, and describe androgynous body shapes that lack typical gender specific features. The body types disentangle a large variability of body shapes enabling distinctions which go beyond the traditional indices such as body mass index, the waist-to-height ratio, the waist-to-hip ratio and the mortality-hazard ABSI-index. In a next step, we will link the identified body types with disease predispositions to study how size and shape of the human body impact health and disease.
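A self-organizing map of the kind used above can be sketched in a few lines. The toy below uses random stand-in data, not LIFE cohort data, and the grid size, learning schedule and epoch count are arbitrary choices: 13-dimensional "meta-measure" vectors are mapped onto a small node grid whose clusters play the role of body types.

```python
# Minimal SOM sketch: similar input vectors map to nearby grid nodes.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 13))          # 1000 "scans" x 13 meta-measures

rows, cols, dim = 6, 6, X.shape[1]
W = rng.normal(size=(rows, cols, dim))   # one prototype vector per node
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                            indexing="ij"), axis=-1)

for epoch in range(10):
    sigma = 2.0 * (0.7 ** epoch)         # shrinking neighbourhood
    lr = 0.5 * (0.7 ** epoch)            # shrinking learning rate
    for x in X:
        d = np.linalg.norm(W - x, axis=2)
        bmu = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
        h = np.exp(-np.sum((grid - bmu) ** 2, axis=2) / (2 * sigma ** 2))
        W += lr * h[..., None] * (x - W)  # pull neighbourhood towards x

# Assign each scan to its best-matching node; node clusters ~ body types.
labels = [np.unravel_index(np.argmin(np.linalg.norm(W - x, axis=2)),
                           (rows, cols)) for x in X]
print(labels[:5])
```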
31

Oliveira, João Paulo dos Santos. "Rabbit: A novel approach to find data-races during state-space exploration". Universidade Federal de Pernambuco, 2012. https://repositorio.ufpe.br/handle/123456789/10891.

Abstract
Data-races are an important kind of error in concurrent shared-memory programs. Software model checking is a popular approach to find them. This research proposes a novel approach to find races that complements model checking by efficiently reporting precise warnings during state-space exploration (SSE): Rabbit. It uses information obtained across different paths explored during SSE to predict likely racy memory accesses. We evaluated Rabbit on 33 different scenarios of race, involving a total of 21 distinct application subjects of various sources and sizes. Results indicate that Rabbit reports race warnings very soon compared to the time the model checker takes to detect the race (for 84.8% of the cases it reports a true race warning in <5s) and that the warnings it reports include very few false alarms. We also observed that the model checker finds the actual race quickly when it uses a guided search that builds on Rabbit's output (for 74.2% of the cases it reports the race in <20s).
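The abstract does not spell out Rabbit's exact heuristic, but the flavour of predicting races from access information gathered across explored paths can be illustrated with a classic Eraser-style lockset check, sketched below; the record format, thread names and locations are invented.

```python
# Lockset-style race prediction over accesses collected from explored paths:
# warn when two accesses to a location come from different threads, at least
# one is a write, and the threads hold no common lock.
from collections import defaultdict

accesses = defaultdict(list)   # location -> [(thread, kind, frozenset locks)]

def record(loc, thread, kind, locks):
    accesses[loc].append((thread, kind, frozenset(locks)))

def race_warnings():
    warnings = []
    for loc, accs in accesses.items():
        for i, (t1, k1, l1) in enumerate(accs):
            for t2, k2, l2 in accs[i + 1:]:
                if t1 != t2 and "w" in (k1, k2) and not (l1 & l2):
                    warnings.append((loc, (t1, k1), (t2, k2)))
    return warnings

# Accesses observed on two different explored interleavings:
record("counter", "T1", "w", {"m"})
record("counter", "T2", "w", set())     # no common lock -> likely race
record("flag", "T1", "r", {"m"})
record("flag", "T2", "w", {"m"})        # consistently locked -> no warning
print(race_warnings())
```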
32

Yeung, Kim-sang and 楊儉生. "The design and multiplier-less realization of a novel digital IF for software radio receivers". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B2946660X.

33

Sitthiworachart, Jirarat. "An investigation into novel software tools for enhancing students' higher cognitive skills in computer programming". Thesis, University of Warwick, 2005. http://wrap.warwick.ac.uk/34679/.

Abstract
Active learning is considered by many academics an important and effective learning strategy. Students can improve the quality of their work by developing their higher cognitive skills through reflection on their own ideas, and through practice of analytic and evaluative skills. Assessment is a tool for learning, but traditional assessment methods often encourage surface learning, rather than the deep learning that develops higher cognitive skills. Peer assessment is one successful approach which can be used to encourage deep learning. It is a method of motivating students, involving students discussing, marking and providing feedback on other students' work. Although it is often used in the context of essays, it has seldom been applied to computer programming courses. The skill of writing good software includes understanding different approaches to the task, and stylistic and related considerations; these can be developed by evaluation of other programmers' solutions. As part of a study investigating the extent to which peer assessment can promote the deep learning that develops higher cognitive skills in a programming course, a novel web-based peer assessment tool has been developed. The process used is novel in that students are engaged not only in marking each other's work, but also in evaluating the quality of their peers' marking. The system is designed to provide anonymity for the whole process, in order to ensure that the process is fair, and to encourage students to discuss without embarrassment by using an anonymous communication device (ACD) in a variety of roles (script author, marker, and feedback marker). In this thesis, we describe and compare the learning theory and tools relevant to learning computer programming. Deep learning, which can be described using the six categories of learning in Bloom's taxonomy, is discussed, and other peer assessment software tools are compared. The design and implementation of a novel web-based peer assessment system (with an anonymous communication device) are described and set in the context of the learning theories. The results of evaluating the tools through several experiments involving large programming classes and an essay-writing module are reported. We also propose a new variation of Bloom's taxonomy appropriate to describing the skills required for tasks such as programming. The results indicate that this approach to web-based peer assessment has successfully helped students to develop their higher cognitive skills in learning computer programming, and that peer assessment is an accurate assessment method in a programming course.
34

Timmis, Jonathan Ian. "Artificial immune systems : a novel data analysis technique inspired by the immune network theory". Thesis, University of Kent, 2000. https://kar.kent.ac.uk/21989/.

Abstract
This thesis presents a novel data analysis technique inspired by the natural immune system. Immunological metaphors were extracted, simplified and applied to create an effective data analysis technique. Building on the foundations of previous work, this thesis extracts salient features of the immune system and creates a principled and effective data analysis technique. Throughout, a methodical and principled approach was adopted. Previous work, along with background immunology, was extensively surveyed. Problems with previous research were identified, and principles from immunology were extracted to create the initial AIS for data analysis. The AIS, through the process of cloning and mutation, built up a network of B cells that formed a diverse representation of the data being analysed. This network was visualised via a specially developed tool, which allows the user to interact with the network and use the system for exploratory data analysis. Experiments were performed on two different data sets: a simple simulated data set and the Fisher Iris data set. Good results were obtained by the AIS on both, with the AIS being able to identify clusters known to exist within them. Extensive investigation into the algorithm's behaviour was undertaken, and the way in which algorithm parameters affected performance and results was also examined. Despite the initial success of the original AIS, problems were identified with the algorithm, and a second stage of research was undertaken. This resulted in the resource-limited artificial immune system (RLAIS), which creates a stable network of objects that does not deteriorate or lose patterns once they are discovered. Periods of stable network size, with occasional perturbations, were observed. This thesis presents a successful application of immune system metaphors to create a novel data analysis technique. Furthermore, the RLAIS goes a long way toward making AIS a viable contender for effective data analysis, and further research is identified for study.
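The clone-and-mutate dynamic can be miniaturised as follows (a toy sketch with invented parameters, closer in spirit to the resource-limited RLAIS than to either algorithm in full): B cells matching a data item are cloned towards it with mutation, and weakly stimulated cells are culled under a population cap.

```python
# Toy clonal-selection network: the surviving B cells come to summarise
# the clusters present in the data.
import numpy as np

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (50, 2)),     # two synthetic clusters
                  rng.normal(3, 0.3, (50, 2))])
cells = rng.uniform(-1, 4, (10, 2))                # initial B-cell network

for _ in range(20):
    for antigen in data:
        d = np.linalg.norm(cells - antigen, axis=1)
        best = np.argmin(d)                        # most stimulated B cell
        # Clone the best match, mutating the clones towards the antigen.
        clones = cells[best] + 0.7 * (antigen - cells[best]) \
                 + rng.normal(0, 0.05, (3, 2))
        cells = np.vstack([cells, clones])
    # Cull weakly stimulated cells and enforce a resource limit on size.
    dists = np.min(np.linalg.norm(cells[:, None] - data[None], axis=2), axis=1)
    cells = cells[dists < 0.5][:60]
print(len(cells), cells.mean(axis=0))
```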
35

Haggett, Simon J. "Towards a multipurpose neural network approach to novelty detection". Thesis, University of Kent, 2008. https://kar.kent.ac.uk/24133/.

Abstract
Novelty detection, the identification of data that is unusual or different in some way, is relevant in a wide number of real-world scenarios, ranging from identifying unusual weather conditions to detecting evidence of damage in mechanical systems. However, utilising novelty detection approaches in a particular scenario presents significant challenges to the non-expert user. They must first select an appropriate approach from the novelty detection literature for their scenario. Then, suitable values must be determined for any parameters of the chosen approach. These challenges are at best time consuming and at worst prohibitively difficult for the user. Worse still, if no suitable approach can be found from the literature, then the user is left with the impossible task of designing a novelty detector themselves. In order to make novelty detection more accessible, an approach is required which does not pose the above challenges. This thesis presents such an approach, which aims to automatically construct novelty detectors for specific applications. The approach combines a neural network model, recently proposed to explain a phenomenon observed in the neural pathways of the retina, with an evolutionary algorithm that is capable of simultaneously evolving the structure and weights of a neural network in order to optimise its performance in a particular task. The proposed approach was evaluated over a number of very different novelty detection tasks. It was found that, in each task, the approach successfully evolved novelty detectors which outperformed a number of existing techniques from the literature. A number of drawbacks with the approach were also identified, and suggestions were given on ways in which these may potentially be overcome.
36

Kotoula, Eleni. "Virtualizing conservation : exploring and developing novel digital visualizations for preventive and remedial conservation of artefacts". Thesis, University of Southampton, 2015. https://eprints.soton.ac.uk/381464/.

Abstract
Critical evaluation of the actions involved in conservation practice reveals that the limitations of traditional approaches for examination and treatment influence decision-making and affect the artefacts' interpretation and display. Such problems demonstrate the technological needs and underlie the research aims for the development of scientific conservation and practice. This research evaluates the application of digital technology in conservation of antiquities and works of art by proposing alternative digital methodologies for examination, restoration and conservation documentation. Its value is demonstrated by case studies covering a broad range of artefact types and a variety of materials. The key elements of the proposed methodology are the following:
• Development and application of computational imaging, computer vision and digitization techniques for enhanced examination and visual analysis
• Graphical 3D modelling and physical 3D reproduction for interventive treatment
• Workflows for digital and conventional conservation documentation
This thesis addresses to what extent 3D technologies contribute to conservation objectives, defined as the balance of preservation, investigation and display, considering also the ethical and theoretical aspects.
37

Pan, Zhenghui. "A novel library catalogue system". View thesis, 2009. http://handle.uws.edu.au:8081/1959.7/44313.

Abstract
Thesis (M.Sc. (Hons.))--University of Western Sydney, 2009.
A thesis presented to the University of Western Sydney, College of Health and Science, School of Computing and Mathematics, in fulfilment of the requirements for the degree of Master of Science (Honours). Includes bibliographies.
38

Garcia, Raul Murillo. "Development and evaluation of a novel tool for real-time control and remote monitoring over local area networks". Thesis, Glasgow Caledonian University, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.289677.

39

Imam, Ayad Tareq. "Relative-fuzzy : a novel approach for handling complex ambiguity for software engineering of data mining models". Thesis, De Montfort University, 2010. http://hdl.handle.net/2086/3909.

Abstract
There are two main defined classes of uncertainty, namely fuzziness and ambiguity, where ambiguity is a 'one-to-many' relationship between the syntax and semantics of a proposition. This definition appears to ignore the 'many-to-many' relationship type of ambiguity. In this thesis, we use the term complex ambiguity for the many-to-many relationship type of ambiguity. This research proposes a new approach for handling the complex ambiguity type of uncertainty that may exist in data, for software engineering of predictive Data Mining (DM) classification models. The proposed approach is based on Relative-Fuzzy Logic (RFL), a novel type of fuzzy logic. RFL defines a new formulation of the problem of the ambiguity type of uncertainty in terms of States Of Proposition (SOP). RFL describes its membership (semantic) value by using the new definition of Domain of Proposition (DOP), which is based on the relativity principle as defined by possible-worlds logic. To achieve the goal of proposing RFL, a question needs to be answered: how can these two approaches, i.e. fuzzy logic and possible-worlds logic, be combined to produce a new membership value set (and later logic) that is able to handle fuzziness and multiple viewpoints at the same time? Achieving such a goal comes via giving possible-worlds logic the ability to quantify multiple viewpoints, modelling fuzziness in each of these viewpoints, and expressing that in a new set of membership values. Furthermore, a new architecture of Hierarchical Neural Network (HNN) called ML/RFL-Based Net has been developed in this research, along with a new learning algorithm and a new recalling algorithm. The architecture, learning algorithm and recalling algorithm of ML/RFL-Based Net follow the principles of RFL. This new type of HNN is considered to be an RFL computation machine. The ability of the Relative-Fuzzy-based DM prediction model to tackle the problem of the complex ambiguity type of uncertainty has been tested. Special-purpose Integrated Development Environment (IDE) software, called RFL4ASR, which generates a DM prediction model for speech recognition, has also been developed in this research. This special-purpose IDE is an extension of the definition of the traditional IDE. Using multiple sets of TIMIT speech data, the prediction model of type ML/RFL-Based Net achieves a classification accuracy of 69.2308%. This accuracy is higher than the best achievements of the WEKA data mining machines given the same speech data.
Los estilos APA, Harvard, Vancouver, ISO, etc.
40

Huang, Wei-Chih. "Self-adaptive containers : a novel framework for building scalable QoS-aware software with low programmer overhead". Thesis, Imperial College London, 2015. http://hdl.handle.net/10044/1/25278.

Texto completo
Resumen
As the number of execution environments increases dramatically, ever-changing non-functional requirements often lead to the challenge of frequent code refactoring. Despite the help of traditional software engineering techniques, adapting software to each execution environment and application context remains a non-trivial endeavour. Manually reimplementing software can take months or years of programmer effort and requires high levels of expertise. Furthermore, building software for different execution environments often results in either a small code base which cannot guarantee Quality of Service or a large manually-optimised code base which is difficult to maintain. This thesis presents a novel self-adaptive container framework which can dynamically adjust its resource usage in an effort to meet resource constraints and scalability requirements. Each container instance is associated with programmer-specified Service Level Objectives with respect to performance, reliability, and primary memory use. To prevent ambiguity among multiple Service Level Objectives, each is specified in the standard Web Service Level Agreement format. The framework features tighter functionality specification than standard container frameworks, which enables greater scope for efficiency optimisations, including the exploitation of probabilistic data structures, out-of-core storage, parallelism, and cloud storage. These techniques are utilised in a low-cost way through the integration of third-party libraries, which also enables the framework to provide a wider class of Service Level Objectives. In addition, to reduce the time needed to learn the framework, its interfaces are designed to be close to those of standardised libraries. The framework has been implemented in C++ and utilised in two case studies centred on explicit state-space exploration adopting a breadth-first search algorithm, and route planning adopting Dijkstra's shortest-path algorithm. To illustrate the framework's viability and capability, various Service Level Objectives are assigned. Furthermore, the implementation of the framework is utilised to explore approximately 240 million states in the first case study and to find the shortest path in a graph representing the USA road network, containing approximately 24 million nodes and 58 million arcs. The experimental results show that the framework is capable of dynamically adjusting its resource usage according to the assigned Service Level Objectives and of dealing with large-scale data. At the same time, the programmer overhead is kept low in terms of the degree to which code is modified.
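As a rough illustration of the degradation idea (a sketch in Python rather than the framework's C++, with made-up thresholds and none of its WSLA machinery), the container below answers membership queries exactly until a primary-memory budget is exceeded, then migrates its contents into a Bloom filter, trading a bounded false-positive rate for memory:

    import hashlib

    class AdaptiveSet:
        def __init__(self, max_exact_items=100_000, bloom_bits=1 << 20, hashes=4):
            self.exact, self.bloom = set(), None
            self.max_exact_items, self.m, self.k = max_exact_items, bloom_bits, hashes

        def _positions(self, item):
            # Derive k bit positions from a single SHA-256 digest.
            h = hashlib.sha256(repr(item).encode()).digest()
            return [int.from_bytes(h[4*i:4*i+4], "big") % self.m for i in range(self.k)]

        def add(self, item):
            if self.bloom is None:
                self.exact.add(item)
                if len(self.exact) > self.max_exact_items:   # memory SLO breached
                    self.bloom = bytearray(self.m // 8)       # degrade to Bloom filter
                    for x in self.exact:
                        for p in self._positions(x):
                            self.bloom[p // 8] |= 1 << (p % 8)
                    self.exact.clear()
            else:
                for p in self._positions(item):
                    self.bloom[p // 8] |= 1 << (p % 8)

        def __contains__(self, item):
            # Exact before degrading; afterwards may report false positives.
            if self.bloom is None:
                return item in self.exact
            return all(self.bloom[p // 8] >> (p % 8) & 1 for p in self._positions(item))

Out-of-core or cloud-backed representations could be swapped in at the same decision point, which is the kind of adaptation the framework automates behind a standard-library-like interface.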
Los estilos APA, Harvard, Vancouver, ISO, etc.
41

Keaveny, John Joseph. "Analysis and Implementation of a Novel Single Channel Direction Finding Algorithm on a Software Radio Platform". Thesis, Virginia Tech, 2005. http://hdl.handle.net/10919/31300.

Texto completo
Resumen
A radio direction finding (DF) system is an antenna array and a receiver arranged in combination to determine the azimuth angle of a distant emitter. Essentially, all DF systems derive the emitter location from an initial determination of the angle-of-arrival (AOA). Radio direction finding techniques have classically been based on multiple-antenna systems employing multiple receivers. Classic techniques such as MUSIC [1][2] and ESPRIT use simultaneous phase information from each antenna to estimate the angle-of-arrival of the signal of interest. In many scenarios (e.g., hand-held systems), however, multiple receivers are impractical. Thus, single channel techniques are of interest, particularly in mobile scenarios. Although the amount of existing research on single channel DF is considerably less than for multi-channel direction finding, single channel direction finding techniques have been investigated previously. Since many of the single channel direction finding techniques are older analog techniques that have been analyzed in previous work, we investigate a new single channel direction finding technique that takes specific advantage of digital capabilities. Specifically, we propose a phase-based method that uses a bank of Phase-Locked Loops (PLLs) in combination with an eight-element circular array. Our method is similar to the Pseudo-Doppler method in that it samples antennas in a circular array using a commutative switch. In the proposed approach the sampled data is fed to a bank of PLLs which track the phase on each element. The parallel PLLs are implemented in software and their outputs are fed to a signal processing block that estimates the AOA. This thesis presents the details of the new PLL algorithm and compares its performance to existing single channel DF techniques such as the Watson-Watt and Pseudo-Doppler techniques. We also describe the implementation of the PLL algorithm on a DRS Signal Solutions, Incorporated (DRS-SS) WJ-8629A Software Definable Receiver with Sunrise™ Technology and present measured performance results.
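A hedged numerical sketch of the phase-based idea follows. The per-element phase is extracted here by simple I/Q correlation as a stand-in for the thesis's bank of software PLLs, and the array geometry, sample rate, tone frequency and noise level are all illustrative assumptions:

    import numpy as np

    # Eight-element uniform circular array; a plane wave from azimuth theta
    # imposes element phases phi_n = 2*pi*(r/lambda)*cos(theta - 2*pi*n/N).
    N, r_over_lam, fs, f0 = 8, 0.3, 48_000, 1_000
    t = np.arange(256) / fs
    n = np.arange(N)
    true_theta = np.deg2rad(50.0)
    phases = 2 * np.pi * r_over_lam * np.cos(true_theta - 2 * np.pi * n / N)
    snaps = [np.cos(2 * np.pi * f0 * t + p) + 0.05 * np.random.randn(t.size)
             for p in phases]

    # Per-element phase via I/Q correlation (stand-in for the PLL bank).
    ref = np.exp(-1j * 2 * np.pi * f0 * t)
    est = np.array([np.angle(np.dot(s, ref)) for s in snaps])

    # Grid-search the azimuth whose model phases best match the estimates,
    # comparing on the unit circle so phase wrapping cannot bias the fit.
    grid = np.deg2rad(np.arange(0.0, 360.0, 0.5))
    model = 2 * np.pi * r_over_lam * np.cos(grid[:, None] - 2 * np.pi * n / N)
    resid = np.abs(np.exp(1j * est) - np.exp(1j * model)).sum(axis=1)
    print("AOA estimate (deg):", np.rad2deg(grid[np.argmin(resid)]))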
Master of Science
Los estilos APA, Harvard, Vancouver, ISO, etc.
42

Kulzer, Pedro Manuel Casal. "Novel hardware and software combination for automotive motorsport vehicles based on direct processing of graphical functions". Doctoral thesis, Universidade de Aveiro, 2015. http://hdl.handle.net/10773/14093.

Texto completo
Resumen
Doctorate in Electronic Engineering
The main motivation for the work presented here began with earlier experiments with a programming concept then named "Macro". These experiments led to the conviction that it would be possible to build an engine control system from scratch which could eliminate many of the current problems of engine management systems in a direct and intrinsic way. It was also hoped that it would minimise the full range of software and hardware needed to produce a final, fully functional system. This thesis first makes a comprehensive survey of the state of the art in the specific area of the software, and corresponding hardware, of automotive tools and automotive ECUs. The problems arising from such software are identified, and it becomes clear that practically all of them stem, directly or indirectly, from the continued comprehensive use of extremely long and complex tool chains. Similarly, on the hardware side, it is argued that the problems stem from the extreme complexity and interdependency inside processor architectures. The conclusions are presented as an extensive list of pitfalls, which are thoroughly enumerated, identified and characterised. Solutions are also proposed for the various current issues, together with implementations of these same solutions. All of this final work forms part of a proof-of-concept system called "ECU2010". The central element of this system is the aforementioned "Macro" concept: a graphical block representing one of the many operations required in an automotive system, providing arithmetic, logic, filtering, integration and multiplexing functions, among others. The end result of the proposed work is a single, fully integrated tool enabling the development and management of the entire system through one simple visual interface. Part of the presented result relies on a hardware platform fully adapted to the software, offering high flexibility and scalability while using exactly the same technology for the ECU, data logger and peripherals alike. Current systems follow a mostly evolutionary path, allowing only the online calibration of parameters, never the online alteration of the automotive functionality algorithms themselves. By contrast, the system developed and described in this thesis had the advantage of following a clean-slate approach, whereby everything could be rethought globally. In the end, of all the system's characteristics, "LIVE-Prototyping" is the most relevant feature, allowing automotive algorithms (e.g. injection, ignition, lambda control) to be adjusted 100% online, keeping the engine constantly running, without ever having to stop or reboot to make such changes. This consequently eliminates any "turnaround delay" typically present in current automotive systems, thereby enhancing their efficiency and handling.
The main motivation for the work that led to this thesis lay in the observation that current methods of modelling automotive ECUs give rise to significant development and maintenance problems. As a result of that observation, the objective of this work centred on developing an architectural concept that breaks radically with state-of-the-art models and rests on a set of concepts that came to be designated "Macro" and "Cellular ECU". With this model the aim was simultaneously to minimise the range of software and hardware needed to obtain a final working system. The thesis first makes an exhaustive survey of the state of the art in the specific area of the software, and corresponding hardware, of automotive tools and ECUs. The problems arising from such software are identified, and from that identification it becomes clear that practically all of them originate, directly or indirectly, in the continued exhaustive use of extremely long and complex tool chains. Similarly, in hardware, the problems originate in the extreme complexity and interdependency of processor architectures. The consequences span an extensive list of pitfalls, which are likewise exhaustively enumerated, identified and characterised. Solutions are also proposed for the various current problems, together with corresponding implementations. All of this final work forms part of a proof-of-concept system designated "ECU2010". The central element of this system is the aforementioned "Macro" concept, a graphical block representing one of the many operations needed in an automotive system, such as arithmetic, logic, filtering, integration and multiplexing functions, among others. The final result of the proposed work rests on a single, fully integrated tool that allows the whole system to be developed and managed simply through a single visual interface. Part of the presented result rests on a hardware platform fully adapted to the software, as well as on high flexibility and scalability, besides allowing exactly the same technology to be used for the ECU, the data logger and the peripherals. Current systems follow a mostly evolutionary path, permitting only the online calibration of parameters, but never the online alteration of the automotive functionality algorithms themselves. In contrast, the system developed and described in this thesis has the advantage of following a clean-slate approach, whereby everything could be globally rethought. Finally, beyond all the other characteristics, "LIVE-Prototyping" is the most relevant functionality, allowing automotive algorithms (e.g. injection, ignition, lambda control) to be altered 100% online, keeping the engine constantly running, without ever having to stop or restart it to make such changes. This consequently eliminates any "turnaround delay" typically present in current automotive systems, significantly increasing the overall efficiency of the system and of its use.
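The following conceptual sketch (hypothetical names and values, not the ECU2010 implementation) shows how a "Macro"-style graphical block can be modelled as a node in a dataflow graph evaluated every control cycle; reassigning a node's op at runtime hints at how LIVE-Prototyping can alter an algorithm without stopping the engine loop:

    from dataclasses import dataclass, field

    @dataclass
    class Macro:
        op: callable                                  # the block's operation
        inputs: list = field(default_factory=list)    # upstream Macro blocks

        def eval(self):
            # Pull values from upstream blocks, then apply this block's op.
            return self.op(*(m.eval() for m in self.inputs))

    rpm  = Macro(lambda: 3200.0)   # sensor source blocks (hypothetical values)
    load = Macro(lambda: 0.65)
    base = Macro(lambda r, l: 2.0 + 8.0 * l + r / 4000.0, [rpm, load])
    out  = Macro(lambda x: min(max(x, 1.0), 15.0), [base])  # clamp block

    print("injection pulse (ms):", out.eval())
    base.op = lambda r, l: 2.0 + 9.0 * l + r / 4000.0   # "live" algorithm swap
    print("injection pulse (ms):", out.eval())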
Los estilos APA, Harvard, Vancouver, ISO, etc.
43

Khan, Mohsin Amir Faiz. "A systems biology design and implementation of novel bioinformatics software tools for high throughput gene expression analysis". Thesis, Brunel University, 2009. http://bura.brunel.ac.uk/handle/2438/4456.

Texto completo
Resumen
Microarray technology has revolutionized the field of molecular biology by offering an efficient and cost-effective platform for the simultaneous quantification of thousands of genes, or even entire genomes, in a single experiment. Unlike Southern blotting, which is restricted to measuring one gene at a time, microarrays offer biologists the opportunity to carry out genome-wide experiments and so gain a systems-level understanding of cell regulation and control. The application of bioinformatics to gene expression analysis has attracted a great deal of attention in recent years, owing to algorithms and software solutions that present complex multidimensional microarray data in a biologically coherent fashion that the biologist can understand. This has opened exciting prospects for deciphering microarray data, helping us refine our comprehension of the underlying physiological dynamics of disease. Although much progress is being made in the development of specialized bioinformatics software pipelines for decoding large volumes of gene expression data in a systems-biology context, several gaps remain. Perhaps the most notable is the increasing demand for software solutions that automate the comparison of multiple gene expression profiles derived from microarray experiments sharing a common biological theme. This is an important challenge, since genes with similar expression patterns across different biological conditions are likely to be involved in the same biological process and hence may share the same regulatory signatures; the potential benefits for our understanding of the physiology of disease are undeniable. The research presented in this thesis provides a systematic walkthrough of a series of software pipelines developed to streamline gene expression analysis in a systems-biology context. First, we present BiSAn, a software tool that deciphers expression data from the perspective of transcriptional regulation. We then present Genome Interaction Analyzer (GIA), which analyzes microarray data in the integrative framework of transcription-factor binding sites, protein-protein interactions and molecular pathways. The final contribution is a software pipeline called MicroPath, which analyzes multiple sets of gene expression profiles and attempts to extract common regulatory signatures that may underlie the biological question.
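As a minimal illustration of the common-signature comparison that a MicroPath-style pipeline automates (hypothetical fold-change data and cutoff, not the thesis's method), one can intersect the differentially expressed gene sets of two thematically related experiments:

    import pandas as pd

    def de_genes(expr: pd.DataFrame, fold=2.0):
        # Genes whose mean case/control ratio exceeds a fold-change cutoff
        # in either direction (up- or down-regulated).
        ratio = expr["case"] / expr["control"]
        return set(expr.index[(ratio >= fold) | (ratio <= 1.0 / fold)])

    exp1 = pd.DataFrame({"case": [8.0, 1.0, 6.0], "control": [2.0, 1.1, 1.5]},
                        index=["TP53", "GAPDH", "MYC"])
    exp2 = pd.DataFrame({"case": [7.5, 5.0, 0.4], "control": [2.1, 1.2, 1.0]},
                        index=["TP53", "MYC", "BRCA1"])

    shared = de_genes(exp1) & de_genes(exp2)
    print("candidate common regulatory signature:", shared)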
Los estilos APA, Harvard, Vancouver, ISO, etc.
44

Breitwieser, Oliver Julien [Verfasser] y Johannes [Akademischer Betreuer] Schemmel. "Learning by Tooling: Novel Neuromorphic Learning Strategies in Reproducible Software Environments / Oliver Julien Breitwieser ; Betreuer: Johannes Schemmel". Heidelberg : Universitätsbibliothek Heidelberg, 2021. http://d-nb.info/1238148530/34.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
45

Crespin, Aaron L. "A novel approach to modeling the effects of radiation in Gallium-Arsenide solar cells using Silvaco's atlas software". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Sept%5FCrespin.pdf.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
46

Verbeek, Benjamin. "Maximum Likelihood Estimation of Hyperon Parameters in Python : Facilitating Novel Studies of Fundamental Symmetries with Modern Software Tools". Thesis, Uppsala universitet, Institutionen för materialvetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-446041.

Texto completo
Resumen
In this project, an algorithm has been implemented in Python to estimate the parameters describing the production and decay of a spin-1/2 baryon-antibaryon pair. This decay can give clues about a fundamental asymmetry between matter and antimatter. A model-independent formalism, developed by the Uppsala hadron physics group and previously implemented in C++, has been shown to be a promising tool in the search for physics beyond the Standard Model (SM) of particle physics. The program developed in this work provides a more user-friendly alternative and is intended to motivate further use of the formalism through a more maintainable, customizable and readable implementation. The hope is that this will expedite future research into charge-parity (CP) violation and eventually lead to answers to questions such as why the universe consists of matter. A Monte Carlo integrator is used for normalization, and a Python library for function minimization. The program returns an estimate of the physics parameters, including an error estimate. Tests of the statistical properties of the estimator, such as consistency and bias, have been performed. To speed up the implementation, the Just-In-Time compiler Numba has been employed, which resulted in a speed increase of a factor of 400 compared to plain Python code.
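A toy version of the estimation scheme described above might look as follows: an unbinned negative log-likelihood whose normalization comes from a Monte Carlo phase-space sample, JIT-compiled with Numba and minimized with SciPy. The one-parameter decay distribution and all numbers are illustrative stand-ins for the thesis's hyperon formalism:

    import numpy as np
    from numba import njit
    from scipy.optimize import minimize

    @njit
    def density(costh, alpha):
        # Toy angular decay distribution: 1 + alpha * cos(theta).
        return 1.0 + alpha * costh

    @njit
    def neg_log_like(alpha, data, mc):
        norm = density(mc, alpha).mean()          # Monte Carlo normalization
        return -np.log(density(data, alpha) / norm).sum()

    rng = np.random.default_rng(1)
    mc = rng.uniform(-1.0, 1.0, 200_000)          # uniform phase-space sample
    true_alpha = 0.75                             # hypothetical true value
    cand = rng.uniform(-1.0, 1.0, 400_000)        # accept-reject toy data
    data = cand[rng.uniform(0.0, 2.0, cand.size) < density(cand, true_alpha)][:50_000]

    res = minimize(lambda p: neg_log_like(p[0], data, mc),
                   x0=[0.0], bounds=[(-0.99, 0.99)])
    print("alpha estimate:", res.x[0])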
Los estilos APA, Harvard, Vancouver, ISO, etc.
47

Reif, Michael [Verfasser], Mira [Akademischer Betreuer] Mezini y Eric [Akademischer Betreuer] Bodden. "Novel Approaches to Systematically Evaluating and Constructing Call Graphs for Java Software / Michael Reif ; Mira Mezini, Eric Bodden". Darmstadt : Universitäts- und Landesbibliothek, 2021. http://d-nb.info/1241741522/34.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
48

Nissa, Holmgren Robert. "Automated Measurement and Change Detection of an Application’s Network Activity for Quality Assistance". Thesis, Linköpings universitet, Databas och informationsteknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-107707.

Texto completo
Resumen
Network usage is an important quality metric for mobile apps. Slow networks, low monthly traffic quotas and high roaming fees restrict the amount of Internet traffic available to mobile users. Companies that want their apps to stay competitive must be aware of their network usage and of changes to it. Short feedback loops for the impact of code changes are key in agile software development. To notify stakeholders of changes as they happen without being prohibitively expensive in terms of manpower, the change detection must be fully automated. To further decrease the manpower overhead of implementing network usage change detection, the system needs to have low configuration requirements and to keep the false-positive rate low while still detecting larger changes. This thesis proposes an automated change detection method for network activity that quickly notifies stakeholders with the relevant information to begin a root cause analysis once a change in network activity is introduced. With measurements of Spotify's iOS app we show that the tool achieves a low rate of false positives while detecting relevant changes in network activity, even for apps with network usage patterns as dynamic as Spotify's.
Network activity is an important quality metric for mobile apps. Mobile users are often limited by slow networks, low monthly traffic quotas and high roaming fees. Companies that want competitive apps need to be aware of their network activity and of changes to it. Fast feedback on the effect of code changes is vital for agile software development. To inform stakeholders of changes as they happen without being prohibitively expensive in terms of manpower, the change detection must be fully automated. To further reduce the labour cost of change detection for network activity, the detection system must be quick to configure and keep a low rate of false detections while still managing to identify large changes. This thesis proposes an automated change detection tool for network activity that quickly notifies stakeholders with relevant information to begin root cause analysis when a change affecting network activity is introduced. Using measurements of Spotify's iOS app, we show that the tool achieves a low rate of false detections while identifying changes in network activity even for apps with network usage as dynamic as Spotify's.
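The detector itself is not published in this record; as a minimal sketch of the underlying idea (illustrative window sizes and tolerance, not the thesis's algorithm), one can flag a build when the recent median of per-build network usage drifts from the baseline by more than a generous relative tolerance, which is what keeps false positives rare:

    from statistics import median

    def detect_change(per_build_mb, baseline_n=10, window_n=5, rel_tol=0.25):
        # Compare the median of the most recent builds against a baseline
        # median; medians resist the noise of dynamic usage patterns.
        if len(per_build_mb) < baseline_n + window_n:
            return False, None, None
        base = median(per_build_mb[:baseline_n])
        recent = median(per_build_mb[-window_n:])
        return abs(recent - base) > rel_tol * base, base, recent

    builds = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.2,  # stable
              10.1, 9.9, 14.8, 15.1, 14.9, 15.2, 15.0]                  # regression
    print(detect_change(builds))   # (True, 10.05, 15.0)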
Los estilos APA, Harvard, Vancouver, ISO, etc.
49

Suh, Caitlin D. "The Use of High-Throughput Virtual Screening Software in the Proposal of A Novel Treatment for Congenital Heart Defects". Scholarship @ Claremont, 2019. https://scholarship.claremont.edu/cmc_theses/2260.

Texto completo
Resumen
Conventional screening of potential drug candidates through wet-lab affinity experiments using libraries of thousands of modified molecules consumes time and resources, and it contributes to the widening gap between the discovery of disease-causing mutations and the implementation of the resulting novel treatments. It is therefore necessary to explore whether the preliminary use of high-throughput virtual screening (HTVS) software such as PyRx can curb both the time and the money spent discovering novel treatments for diseases such as congenital heart defects (CHDs). For example, AXIN2, a protein involved in a negative feedback loop inhibiting the Wnt/β-catenin signaling pathway important for cardiogenesis, has recently been associated with CHD. The loss-of-function mutation L10F on the tankyrase-binding domain of AXIN2 has been shown to upregulate the pathway through loss of inhibition, leading to the accumulation of intracellular β-catenin. In a different paper, however, AXIN2 was shown to be stabilized by XAV-939, a small-molecule drug that targets tankyrase. PyRx and VMD will be used to modify the drug to increase its binding affinity to AXIN2, stabilizing the protein and reinstating its inhibitory property as a treatment for CHDs. Used in conjunction with wet-lab experiments, HTVS software may decrease the cost and time required to bring a potentially life-saving treatment into use.
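PyRx is a graphical front end to docking engines such as AutoDock Vina, so no scripting API is assumed here; the Python sketch below only illustrates the triage step such screening enables, ranking hypothetical Vina-style affinity scores to shortlist analogues for wet-lab follow-up:

    # Hypothetical predicted binding affinities (kcal/mol) of the kind a
    # Vina-based screen reports; more negative means stronger binding.
    scores = {
        "XAV-939":          -8.1,
        "XAV-939_analog_1": -9.4,   # invented analogue names and values
        "XAV-939_analog_2": -7.2,
        "XAV-939_analog_3": -9.9,
    }

    # Shortlist the strongest predicted binders for wet-lab validation.
    shortlist = sorted(scores.items(), key=lambda kv: kv[1])[:2]
    for ligand, kcal in shortlist:
        print(f"{ligand}: {kcal} kcal/mol")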
Los estilos APA, Harvard, Vancouver, ISO, etc.
50

Maema, Mathe. "OVR : a novel architecture for voice-based applications". Thesis, Rhodes University, 2011. http://hdl.handle.net/10962/d1006694.

Texto completo
Resumen
Despite the inherent limitation of accessing information serially, voice applications are growing in popularity as computing technologies advance. This is a positive development, because voice communication offers a number of benefits over other forms of communication. For example, voice may be better for delivering services to users whose eyes and hands are engaged in other activities (e.g. driving), or to semi-literate or illiterate users. This thesis proposes a knowledge-based architecture for building voice applications that helps reduce the limitations of serial access to information. The proposed architecture, called OVR (Ontologies, VoiceXML and Reasoners), uses a rich backend that represents knowledge via ontologies and utilises reasoning engines to reason with it, in order to generate intelligent behaviour. Ontologies were chosen over other knowledge representation formalisms because of their expressivity and executable format, and because current trends suggest a general shift towards the use of ontologies in many systems for storing and sharing information. For the frontend, the architecture uses VoiceXML, the emerging de facto standard for voice-automated applications. A functional prototype was built for an initial validation of the architecture: a simple voice application to help locate information about service providers that offer HIV (Human Immunodeficiency Virus) testing, which we called HTLS (HIV Testing Locator System). The functional prototype was implemented using a number of technologies. OWL API, a Java interface designed to facilitate the manipulation of ontologies authored in OWL, was used to build a customised query interface for HTLS. The Pellet reasoner supported queries to the knowledge base, and Drools (the JBoss rule engine) was used for processing dialog rules. VXI was used as the VoiceXML browser, and an experimental softswitch called iLanga, at whose heart is Asterisk, a well-known PBX-in-a-box, served as the bridge to the telephony system. HTLS behaved properly under system testing, providing the sought initial validation of OVR.
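The thesis backend combines the Java OWL API with the Pellet reasoner; as a hedged Python analogue of the same idea, the rdflib sketch below answers a location query against a toy knowledge base whose URIs and facts are invented for illustration, with the result destined for a VoiceXML prompt:

    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/htls#")   # illustrative namespace
    g = Graph()
    g.add((EX.clinicA, RDF.type, EX.TestingSite))
    g.add((EX.clinicA, EX.locatedIn, Literal("Grahamstown")))
    g.add((EX.clinicB, RDF.type, EX.TestingSite))
    g.add((EX.clinicB, EX.locatedIn, Literal("Port Alfred")))

    q = """SELECT ?site WHERE {
             ?site a ex:TestingSite ; ex:locatedIn "Grahamstown" .
           }"""
    for row in g.query(q, initNs={"ex": EX}):
        print("HIV testing site:", row.site)   # would feed the voice prompt

Note that rdflib performs plain SPARQL matching rather than the OWL reasoning Pellet provides; it stands in here only to show the ontology-backed query pattern.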
LaTeX with hyperref package
Los estilos APA, Harvard, Vancouver, ISO, etc.