
Dissertations / Theses on the topic 'Core logics'



Consult the top 50 dissertations / theses for your research on the topic 'Core logics.'




1

Ng, Yat Sze. "At the Intersection of Fintech and Strategy: Case Studies of the Competitive Strategies of Fintech Platforms." Thesis, The University of Sydney, 2022. https://hdl.handle.net/2123/29360.

Abstract:
In recent years, digital technologies have been applied to generate fintech innovations in the form of new services and business models within the global financial sector. Yet despite the global proliferation of fintech start-ups, market estimates put the failure rate of these new fintech platforms at roughly 90%. This high mortality rate suggests two possibilities. First, there may be a lack of knowledge of the appropriate strategies to adopt. Second, the traditional prescriptions for strategy may be less relevant in the context of fintech platforms. The objective of my thesis is to lay the groundwork for future research by first ascertaining either or both of these possibilities based on what is currently known. To this end, I first present a comprehensive review of the fintech literature, which finds a relative paucity of research on fintech strategies and important limitations in the existing work. Based on my analysis, I construct a research agenda consisting of several open questions to provide directions for future research in this area. Next, building on this agenda, I address the research question “How do competitive strategies influence the performance of successful fintech platforms?” and explore whether distinct strategies are required for different types of fintech platforms, through a case study of four market-leading fintech platforms in China. Finally, I investigate whether different strategies are required at different points along a fintech platform’s development trajectory. Based on a case study of Ppdai.com, one of the world’s largest online lending platforms, I develop a process model that sheds light on the underlying mechanisms of this phenomenon. Overall, my thesis demonstrates that the competitive strategies employed have to be contingent on the nature of the fintech services provided by the platform, and that these strategies have to evolve over time with the development of the platform.
2

Chen, Fei. "Enabling system validation for the many-core supercomputer." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file, 208 p, 2009. http://proquest.umi.com/pqdweb?did=1891590561&sid=7&Fmt=2&clientId=8331&RQT=309&VName=PQD.

3

Walker, Douglas H. "Women in ministry: The logical core of the debate." Trinity International University, 2012. http://pqdtopen.proquest.com/#viewpdf?dispub=3487789.

4

Domingues, Miguel Brazão. "Core language for web applications." Master's thesis, Faculdade de Ciências e Tecnologia, 2010. http://hdl.handle.net/10362/5103.

Abstract:
Work presented within the scope of the Master's programme in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering
Web applications face strong demand for rapid development and constant change. Several languages were created for this kind of application; they are very flexible, but often trade the benefits of strongly typed programming languages for those of untyped, interpreted ones. With such languages, the interaction between the different layers of a web application is usually developed using dialects and programming conventions, with no real mechanical verification between the client and server sides, or between the SQL code within the application and the database. We present a typed core language for web applications that integrates the typing of the interface definition, the business logic, and the database manipulation, representing these interactions at a high level of abstraction. Using a single typed language, with its own constructs for defining the interface and the interaction with the database, makes static checking possible, thereby avoiding execution errors caused by the usual heterogeneity of web applications. We also describe the implementation of a prototype with a highly flexible programming environment for our language, which allows application development and publishing to be done through a web interface, interacting directly with the application and without losing the integrity checks. This kind of development relies on an agile development methodology: modifications to the application are activated through a dynamic reconfiguration mechanism, avoiding recompilation of the application and system restarts.
5

Han, Yi. "A high-performance CMOS programmable logic core for system-on-chip applications /." Thesis, Connect to this title online; UW restricted, 2005. http://hdl.handle.net/1773/5948.

6

Haftmann, Florian. "Code generation from specifications in higher-order logic." kostenfrei, 2009. https://mediatum2.ub.tum.de/node?id=886023.

7

Davis, Justin S. "An FPGA-based digital logic core for ATE support and embedded test applications." Diss., Georgia Institute of Technology, 2003. http://hdl.handle.net/1853/15639.

8

Seater, Robert Morrison. "Core extraction and non-example generation : debugging and understanding logical models." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/34127.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, June 2005.
Includes bibliographical references (p. 127-132).
Declarative models, in which conjunction and negation are freely used, are a powerful tool for software specification and verification. Unfortunately, tool support for developing and debugging such models is limited. The challenges to developing such tools are twofold: technical information must be extracted from the model, and that information must then be presented to the user in a way that is both meaningful and manageable. This document introduces two such techniques to help fill the gap. Non-example generation allows the user to ask for the role of a particular subformula in a model. A formula's role is explained in terms of how the set of satisfying solutions to the model would change were that subformula removed or altered. Core extraction helps detect and localize unintentional overconstraint, in which real counterexamples are masked by bugs in the model. It leverages recent advances in SAT solvers to identify irrelevant portions of an unsatisfiable model. Experiences are reported from applying these two techniques to a variety of existing models.
9

Mathur, Garima. "Fuzzy logic control for infant-incubator systems." University of Akron / OhioLINK, 2006. http://rave.ohiolink.edu/etdc/view?acc_num=akron1153768682.

10

Méndez, Real Maria. "Spatial Isolation against Logical Cache-based Side-Channel Attacks in Many-Core Architectures." Thesis, Lorient, 2017. http://www.theses.fr/2017LORIS454/document.

Abstract:
The technological evolution and ever-increasing application performance demands have made many-core architectures the new trend in processor design. These architectures are composed of a large number of processing resources (hundreds or more), providing massive parallelism and high performance. Indeed, many-core architectures allow a large number of applications, coming from different sources and with different levels of sensitivity and trust, to be executed in parallel while sharing physical resources such as the computation, memory, and communication infrastructure. However, this resource sharing introduces important security vulnerabilities. In particular, sensitive applications sharing cache memory with potentially malicious applications are vulnerable to logical cache-based side-channel attacks. These attacks allow an unprivileged application to access sensitive information manipulated by other applications despite partitioning methods such as memory protection and virtualization. While much effort has been devoted to countering these attacks on multi-core architectures, those countermeasures were not designed for the recently emerged many-core architectures and need to be evaluated and/or revisited in order to be practical for these new technologies. In this thesis, we propose to enhance operating-system services with security-aware application deployment and resource allocation mechanisms in order to protect sensitive applications against cache-based attacks. Different application deployment strategies providing spatial isolation are proposed and compared in terms of several performance indicators. Our proposal is evaluated through virtual prototyping based on SystemC and Open Virtual Platforms (OVP) technology.
11

Ronfeldt, Matthew Stephen. "Crafting core selves during professional education /." May be available electronically:, 2008. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

12

Stroiński, Mateusz. "Simple Transitive 2-Representations of Cell 2-Subcategories for Algebras with a Self-Injective Core." Thesis, Uppsala universitet, Algebra och geometri, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-411491.

13

Jefferies, A. S. "An incrementally compiled code approach to concurrent switch level logic simulation." Thesis, University of Kent, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.374685.

14

Quintero, Flores Perfecto Malaquias. "Fuzzy Gradual Pattern Mining Based on Multi-Core Architectures." Thesis, Montpellier 2, 2013. http://www.theses.fr/2013MON20232/document.

Abstract:
Gradual patterns aim at describing co-variations of data, such as "the older, the higher the salary". They have been increasingly studied in data mining in recent years, leading to several ways of defining their meaning and several algorithms to automatically extract them. These approaches consider that data can be ordered with respect to the values taken on the attributes (e.g. the age and the salary). However, in many application domains, it is hardly possible to consider that data values are crisply ordered. For instance, when considering gene expression, it does not make sense, from the biological point of view, to say that Gene 1 is more expressed than Gene 2 if their levels of expression only differ at the tenth decimal place. This thesis therefore considers fuzzy orderings and proposes both formal definitions and algorithms to extract gradual patterns under fuzzy orderings. As these algorithms are both time- and memory-consuming, we propose optimizations based on an efficient storage of the fuzzy ordering information coupled with parallel algorithms. Experimental results on synthetic and real databases show the interest of our proposal.
15

Thompson, Michael. "Roots and Role of the Imagination in Kant: Imagination at the Core." [Tampa, Fla] : University of South Florida, 2009. http://purl.fcla.edu/usf/dc/et/SFE0002945.

16

Fraga, Wilton Bezerra de. "Study of the performance of assymmetrical two-core non linear directional fiber coupler operating logic gates." Universidade Federal do Ceará, 2006. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=0.

Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico
We investigate the performance of three different nonlinear asymmetric dual-core directional fiber couplers that include profiles of increasing and decreasing self-phase modulation. The asymmetry is associated with the self-phase-modulation profile of one of the channels. We first investigate the performance of the proposed coupler using ultrashort soliton-like pulses of 2 ps width, and later operating with a CW signal. Observing the transmission characteristics of the device through the direct and cross channels, we studied the extinction ratio (Xratio) of the devices; the extinction ratio of an on-off switch is the ratio between the output power in the on state and the output power in the off state. It was observed that the performance of the AND, XOR, and OR gates depends on the nonlinearity profile. For the constant nonlinearity profile, the AND and XOR logics perform better with the device operating in CW, while the OR logic performs better with the coupler operating in the pulsed regime. We conclude that, for the coupler to operate as a logic gate, the nonlinearity profile can be controlled to optimize the transmission characteristics through the extinction ratio.
17

Hartford, Margaret Ann. "Core conditions in student-centered learning environments." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2009. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

18

Lu, Weiyun. "Formally Verified Code Obfuscation in the Coq Proof Assistant." Thesis, Université d'Ottawa / University of Ottawa, 2019. http://hdl.handle.net/10393/39994.

Abstract:
Code obfuscation is a software security technique in which transformations are applied to source and/or machine code to make them more difficult to analyze and understand, in order to deter reverse-engineering and tampering. However, with many commercial tools, such as Irdeto's Cloakware product, it is not clear why the end user should believe that the programs that come out the other end are still the "same program"! In this thesis, we apply techniques of formal specification and verification, using the Coq Proof Assistant and IMP (a simple imperative language within it), to formulate what it means for a program's semantics to be preserved by an obfuscating transformation, and give formal machine-checked proofs that these properties hold. We describe our work on opaque predicate and control flow flattening transformations. Along the way, we also employ Hoare logic as an alternative to state equivalence, and augment the IMP language with Switch statements. We also define a lower-level flowchart language that wraps around IMP for modelling certain flattening transformations, treating blocks of code as objects in their own right. We then discuss related work in the literature on formal verification of data obfuscation and layout obfuscation transformations in IMP, and conclude by discussing CompCert, a formally verified C compiler in Coq, along with the work that has been done on obfuscation there, and muse on the possibility of implementing formal methods in the next generation of real-world obfuscation tools.
19

Parham, M. J. "Adaptive logic network correlation techniques for optical code division multiple access systems." Thesis, University of Surrey, 1994. http://epubs.surrey.ac.uk/843838/.

Abstract:
Code Division Multiple Access (CDMA) techniques afford Local Area Networks (LANs) the support of concurrent, asynchronous communication between users without access delay. These properties are obtained by encoding users' data with high rate code sequences, so that data is spread over a much larger bandwidth than would usually be required for transmission. The necessary bandwidth is provided by using optical fibre both as the LAN medium and for incoherent optical signal processing. Conventionally, extraction of a desired user's signal is achieved by correlation using a single delay-line matched filter. Matched filters are optimal for the recovery of a known signal in the presence of additive noise. However, in a CDMA environment, their performance is limited by Multiple Access Interference (MAI), arising from the cross-correlation products of overlaid users, and degrades as the number of users increases. Adaptive Logic Networks (ALNs), a form of Artificial Neural System (ANS), are applied to the extraction of a single user's signal in a multi-user environment. In the approach taken, ALNs learn to incorporate the presence of interfering users' signals, in deciding the actual data bit received. Computer simulation is used to compare the error rates obtained by ALNs and the previously proposed correlation receivers; the performance of the latter providing a benchmark. Simulations are conducted assuming chip synchronism between users and no external sources of noise, i.e. MAI is assumed dominant. Consideration is given to systems employing both sparse optical codes and Gold-like codes as spreading sequences. In all the systems considered, ALNs are shown to enable significant reductions in error rate over the conventional correlation receivers. MAI effects, causing errors with the correlation receivers, are reduced by using additional temporal and intensity based information contained in the receiver input signal. 
This permits an ALN to extract details of the structure of interfering users' signals, to provide a better context for the classification of the desired user's signal. In the systems employing sparse codes, it is demonstrated that while a certain amount of MAI persists, it may be minimised by selection of the ALN input window, to provide the maximum possible information regarding the interfering users' signals. In the systems using Gold-like codes, it is shown that ALNs can be used to completely eliminate the effects of MAI. This is significant since, although this form of code sequence is suited to coherent CDMA systems, the cross-correlation products arising in incoherent optical environments are normally considered to be unacceptably high.
20

Shaw, Ryan Phillip. "Application of Subjective Logic to Vortex Core Line Extraction and Tracking from Unsteady Computational Fluid Dynamics Simulations." BYU ScholarsArchive, 2012. https://scholarsarchive.byu.edu/etd/2989.

Abstract:
Presented here is a novel tool to extract and track believable vortex core lines from unsteady Computational Fluid Dynamics data sets using multiple feature extraction algorithms. Existing work explored the possibility of extracting features concurrent with a running simulation using intelligent software agents, combining multiple algorithms' capabilities using subjective logic. This work modifies the steady-state approach to work with unsteady fluid dynamics and is designed to work within the Concurrent Agent-enabled Feature Extraction concept. Each agent's belief tuple is quantified using a predefined set of information. The information and functions necessary to set each component in each agent's belief tuple are given along with an explanation of the methods for setting the components. This method is applied to the analyses of flow in a lid-driven cavity and flow around a cylinder, which highlight strengths and weaknesses of the chosen algorithms and the potential for subjective logic to aid in understanding the resulting features. Feature tracking is successfully applied and is observed to have a significant impact on the opinion of the vortex core lines. In the lid-driven cavity data set, unsteady feature extraction modifications are shown to affect the extraction of moving vortex core lines. The Sujudi-Haimes algorithm is shown to be more believable when extracting the main vortex core lines of the cavity simulation, while the Roth-Peikert algorithm succeeds in extracting the weaker vortex cores in the same simulation. Mesh type and time step are shown to have a significant effect on the method. In the curved wake of the cylinder data set, the Roth-Peikert algorithm more reliably detects vortex core lines which exist for a significant amount of time. The method was finally applied to a massive wind turbine simulation, where the importance of performing feature extraction in parallel is shown.
The use of multiple extraction algorithms with subjective logic and feature tracking helps determine the expected probability that an extracted vortex core is believable. This approach may be applied to massive data sets which will greatly reduce analysis time and data size and will aid in a greater understanding of complex fluid flows.
21

Speciale, Giovanni Maria. "Il Ragionamento Logico come Forma di Apprendimento: Sviluppo di Un Framework per ILP." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/23786/.

Abstract:
This thesis focuses on inductive logic programming (ILP) and, in particular, on its recontextualization within technological ecosystems for the development of modern artificial intelligence (AI) applications. ILP is a paradigm for machine learning: given background knowledge of the domain and a series of examples (positive and negative) in the area of interest, represented using logic, an ILP system can derive a logic program that generalizes all of the positive examples and none of the negative ones. Grafting ILP into symbolic AI ecosystems can lead to new application scenarios in which inductive logic becomes the bridge between the symbolic and the sub-symbolic worlds. The thesis has a twofold objective. First, a systematization of the state of the art, in order to highlight existing ILP approaches and techniques and any related technologies. Second, the design and implementation of an ILP module within a technological ecosystem for symbolic AI, 2P-Kt, providing a first implementation usable in pervasive contexts such as those required by modern AI applications. This module supports the main classical ILP algorithms (Golem, Progol, and Metagol) through a multi-paradigm technology, enabling the use of inductive techniques both as an application and as a reusable library.
22

Li, He. "Synthesis and characterization of metal-carbon core-shell nanoparticles /." May be available electronically:, 2008. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

23

Boskovitz, Agnes. "Data Editing and Logic: The covering set method from the perspective of logic." The Australian National University, Research School of Information Sciences and Engineering, 2008. http://thesis.anu.edu.au./public/adt-ANU20080314.163155.

Abstract:
Errors in collections of data can cause significant problems when those data are used. Therefore the owners of data find themselves spending much time on data cleaning. This thesis is a theoretical work about one part of the broad subject of data cleaning - to be called the covering set method. More specifically, the covering set method deals with data records that have been assessed by the use of edits, which are rules that the data records are supposed to obey. The problem solved by the covering set method is the error localisation problem, which is the problem of determining the erroneous fields within data records that fail the edits. In this thesis I analyse the covering set method from the perspective of propositional logic. I demonstrate that the covering set method has strong parallels with well-known parts of propositional logic. The first aspect of the covering set method that I analyse is the edit generation function, which is the main function used in the covering set method. I demonstrate that the edit generation function can be formalised as a logical deduction function in propositional logic. I also demonstrate that the best-known edit generation function, written here as FH (standing for Fellegi-Holt), is essentially the same as propositional resolution deduction. Since there are many automated implementations of propositional resolution, the equivalence of FH with propositional resolution gives some hope that the covering set method might be implementable with automated logic tools. However, before any implementation, the other main aspect of the covering set method must also be formalised in terms of logic. This other aspect, to be called covering set correctibility, is the property that must be obeyed by the edit generation function if the covering set method is to successfully solve the error localisation problem. 
In this thesis I demonstrate that covering set correctibility is a strengthening of the well-known logical properties of soundness and refutation completeness. Moreover, the proofs of the covering set correctibility of FH and of the soundness / completeness of resolution deduction have strong parallels: while the proof of soundness / completeness depends on the reduction property for counter-examples, the proof of covering set correctibility depends on the related lifting property. In this thesis I also use the lifting property to prove the covering set correctibility of the function defined by the Field Code Forest Algorithm. In so doing, I prove that the Field Code Forest Algorithm, whose correctness has been questioned, is indeed correct. The results about edit generation functions and covering set correctibility apply both to categorical edits (edits about discrete data) and to arithmetic edits (edits expressible as linear inequalities). Thus this thesis provides the beginnings of a theoretical logical framework for error localisation, which may offer new insights into the problem and help in developing new tools based on automated logic. Finally, the strong parallels between the covering set method and aspects of logic are of aesthetic appeal.
APA, Harvard, Vancouver, ISO, and other styles
24

Waite, Stephen J. "A logic model to review material nominated for inclusion into project code PL3." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/26045.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Cui, Song. "Hardware mapping of critical paths of a GaAs core processor for solid modelling accelerator /." Title page, contents and abstract only, 1996. http://web4.library.adelaide.edu.au/theses/09PH/09phc9661.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Crick, Thomas. "Superoptimisation : provably optimal code generation using answer set programming." Thesis, University of Bath, 2009. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.518295.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Black-Schaffer, David. "Block parallel programming for real-time applications on multi-core processors /." May be available electronically:, 2008. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Biallas, Sebastian [Verfasser]. "Verification of Programmable Logic Controller Code using Model Checking and Static Analysis / Sebastian Biallas." Aachen : Shaker, 2016. http://d-nb.info/1118258428/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Mortensen, Clifton H. "A Computational Fluid Dynamics Feature Extraction Method Using Subjective Logic." BYU ScholarsArchive, 2010. https://scholarsarchive.byu.edu/etd/2208.

Full text
Abstract:
Computational fluid dynamics simulations are advancing to correctly simulate highly complex fluid flow problems that can require weeks of computation on expensive high performance clusters. These simulations can generate terabytes of data and pose a severe challenge to a researcher analyzing the data. Presented in this document is a general method to extract computational fluid dynamics flow features concurrent with a simulation and as a post-processing step to drastically reduce researcher post-processing time. This general method uses software agents governed by subjective logic to make decisions about extracted features in converging and converged data sets. The software agents are designed to work inside the Concurrent Agent-enabled Feature Extraction concept and operate efficiently on massively parallel high performance computing clusters. Also presented is a specific application of the general feature extraction method to vortex core lines. Each agent's belief tuple is quantified using a pre-defined set of information. The information and functions necessary to set each component in each agent's belief tuple are given along with an explanation of the methods for setting the components. A simulation of a blunt fin is run showing convergence of the horseshoe vortex core to its final spatial location at 60% of the converged solution. Agents correctly select between two vortex core extraction algorithms and correctly identify the expected probabilities of vortex cores as the solution converges. A simulation of a delta wing is run showing coherently extracted primary vortex cores as early as 16% of the converged solution. Agents select primary vortex cores extracted by the Sujudi-Haimes algorithm as the most probable primary cores.
These simulations show concurrent feature extraction is possible and that intelligent agents following the general feature extraction method are able to make appropriate decisions about converging and converged features based on pre-defined information.
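The belief tuples mentioned in this abstract can be modelled, in a simplified illustrative form, as subjective-logic opinions: a (belief, disbelief, uncertainty) triple with a base rate, whose projected probability is the standard E = b + a·u. The class below is our own sketch, not the thesis's agent implementation:

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    """A binomial subjective-logic opinion (b, d, u) with base rate a,
    where b + d + u must equal 1."""
    belief: float
    disbelief: float
    uncertainty: float
    base_rate: float = 0.5

    def __post_init__(self):
        total = self.belief + self.disbelief + self.uncertainty
        assert abs(total - 1.0) < 1e-9, "opinion components must sum to 1"

    def expected_probability(self):
        # Projected probability E = b + a * u: uncertainty mass is
        # distributed according to the base rate.
        return self.belief + self.base_rate * self.uncertainty
```

An agent holding `Opinion(0.7, 0.1, 0.2)` about an extracted vortex core would, under this model, assign it an expected probability of 0.8 of being a true feature.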
APA, Harvard, Vancouver, ISO, and other styles
30

Alba, Castro Mauricio Fernando. "Abstract Certification of Java Programs in Rewriting Logic." Doctoral thesis, Universitat Politècnica de València, 2011. http://hdl.handle.net/10251/13617.

Full text
Abstract:
In this thesis we propose an abstraction based certification technique for Java programs which is based on rewriting logic, a very general logical and semantic framework efficiently implemented in the functional programming language Maude. We focus on safety properties, i.e. properties of a system that are defined in terms of certain events not happening, which we characterize as unreachability problems in rewriting logic. The safety policy is expressed in the style of JML, a standard property specification language for Java modules. In order to provide a decision procedure, we enforce finite-state models of programs by using abstract interpretation. Starting from a specification of the Java semantics written in Maude, we develop an abstraction based, finite-state operational semantics also written in Maude which is appropriate for program verification. As a by-product of the verification based on abstraction, a dependable safety certificate is delivered which consists of a set of rewriting proofs that can be easily checked by the code consumer using a standard rewriting logic engine. The abstraction based proof-carrying code technique, called JavaPCC, has been implemented and successfully tested on several examples, which demonstrate the feasibility of our approach. We analyse local properties of Java methods: i.e. properties of methods regarding their parameters and results. We also study global confidentiality properties of complete Java classes, by initially considering non-interference and, then, erasure with and without non-interference. Non-interference is a semantic program property that assigns confidentiality levels to data objects and prevents illicit information flows from occurring from high to low security levels. In this thesis, we present a novel security model for global non-interference which approximates non-interference as a safety property.
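The characterisation of safety as unreachability described in this abstract can be illustrated with a naive forward search over a rewrite system (here a toy string rewriting system; Maude's actual machinery works over full rewrite theories and is far more general). This sketch is ours, not the thesis's:

```python
from collections import deque

def reachable_terms(rules, start, limit=1000):
    """Collect all terms reachable from `start` by applying string
    rewrite rules l -> r at any position.  A safety property holds
    when no 'bad' term appears in the reachable set."""
    seen, frontier = set(), deque([start])
    while frontier and len(seen) < limit:
        t = frontier.popleft()
        if t in seen:
            continue
        seen.add(t)
        for l, r in rules:
            i = t.find(l)
            while i != -1:
                frontier.append(t[:i] + r + t[i + len(l):])
                i = t.find(l, i + 1)
    return seen
```

With the single rule `ab -> ba` and start term `"ab"`, the reachable set is `{"ab", "ba"}`; checking that a forbidden term never occurs in this set is the unreachability view of safety.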
Alba Castro, MF. (2011). Abstract Certification of Java Programs in Rewriting Logic [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/13617
Palancia
APA, Harvard, Vancouver, ISO, and other styles
31

Pippin, William E. Jr. "Optimizing Threads of Computation in Constraint Logic Programs." The Ohio State University, 2003. http://rave.ohiolink.edu/etdc/view?acc_num=osu1041551800.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Chen, Jiehua. "Regression models with spatially correlated residuals : applications to urban core growth in China /." May be available electronically:, 2008. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Johansson, Adam, and Tim Johansson. "Code generation for programmable logic controllers : Evaluating model-based engineering practices in a real-world context." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-18714.

Full text
Abstract:
The industrial manufacturing of today is achieved through the use of programmable logic controllers (PLC). The way PLCs are programmed remains largely unchanged since their conception 40 years ago, but complexity and software size have increased, and requirements have become more elaborate. Model-driven engineering (MDE) practices, formal verification and automated testing could help manage these challenges. This study seeks to improve development practices in the context of a company that delivers automation projects. Through design science methodology the state of the field is investigated and an artefact is developed. The artefact shows potential benefits resulting from the introduction of model-driven code generation, which is evaluated through an experiment with engineers. Our results indicate that engineers may benefit from incorporating generated code in their work.
APA, Harvard, Vancouver, ISO, and other styles
34

Coffman, Michael G. "Using a programmable logic controller to implement a (16,8) 2-bit error detection and correction code /." Available to subscribers only, 2007. http://proquest.umi.com/pqdweb?did=1456292311&sid=1&Fmt=2&clientId=1509&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Ye, Xin. "Model checking self modifying code." Thesis, Université de Paris (2019-....), 2019. http://www.theses.fr/2019UNIP7010.

Full text
Abstract:
Le code auto-modifiant est un code qui modifie ses propres instructions pendant le temps d'exécution. Il est aujourd'hui largement utilisé, notamment dans les logiciels malveillants pour rendre le code difficile à analyser et à détecter par les anti-virus. Ainsi, l'analyse de tels programmes auto-modifiants est un grand défi. Le Pushdown System (PDS) est un modèle naturel qui est largement utilisé pour l'analyse des programmes séquentiels car il permet de modéliser précisément les appels de procédures et de simuler la pile du programme. Dans cette thèse, nous proposons d'étendre le modèle du PDS avec des règles auto-modifiantes. Nous appelons le nouveau modèle Self-Modifying PushDown System (SM-PDS). Un SM-PDS est un PDS qui peut modifier l'ensemble de ses règles de transition pendant l'exécution. Tout d'abord, nous montrons comment les SM-PDS peuvent être utilisés pour représenter des programmes auto-modifiants et nous fournissons des algorithmes efficaces pour calculer les configurations précédentes et suivantes des SM-PDS accessibles. Ensuite, nous résolvons les problèmes de vérification des propriétés LTL et CTL pour le code auto-modifiant. Nous implémentons nos techniques dans un outil appelé SMODIC. Nous avons obtenu des résultats encourageants. En particulier, notre outil est capable de détecter plusieurs logiciels malveillants auto-modifiants ; il peut même détecter plusieurs logiciels malveillants que d'autres logiciels anti-virus bien connus comme McAfee, Norman, BitDefender, Kinsoft, Avira, eScan, Kaspersky, Qihoo-360, Avast et Symantec n'ont pas pu détecter.
Self-modifying code is code that modifies its own instructions during execution time. It is nowadays widely used, especially in malware, to make the code hard for anti-viruses to analyse and detect. Thus, the analysis of such self-modifying programs is a big challenge. Pushdown Systems (PDSs) are a natural model that is extensively used for the analysis of sequential programs because it allows procedure calls to be modelled accurately and mimics the program's stack. In this thesis, we propose to extend the PushDown System model with self-modifying rules. We call the new model Self-Modifying PushDown System (SM-PDS). A SM-PDS is a PDS that can modify its own set of transitions during execution. First, we show how SM-PDSs can be used to naturally represent self-modifying programs and provide efficient algorithms to compute the backward and forward reachable configurations of SM-PDSs. Then, we consider the LTL model-checking problem of self-modifying code. We reduce this problem to the emptiness problem of Self-modifying Büchi Pushdown Systems (SM-BPDSs). We also consider the CTL model-checking problem of self-modifying code. We reduce this problem to the emptiness problem of Self-modifying Alternating Büchi Pushdown Systems (SM-ABPDSs). We implement our techniques in a tool called SMODIC. We obtained encouraging results. In particular, our tool was able to detect several self-modifying malwares; it could even detect several malwares that well-known anti-viruses such as McAfee, Norman, BitDefender, Kinsoft, Avira, eScan, Kaspersky, Qihoo-360, Avast and Symantec failed to detect.
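The plain pushdown part of the model described above can be sketched as explicit-state reachability. Note that the thesis's actual algorithms (pre*/post* saturation) handle the unbounded stack symbolically and also cover the self-modifying rules; the depth bound below exists only to keep this toy illustration finite:

```python
from collections import deque

def reachable_configs(rules, start, max_stack=5):
    """Forward reachability of a pushdown system by explicit BFS.
    A rule (p, a) -> (q, w) replaces top stack symbol `a` in control
    state `p` by the word `w` (a tuple), moving to state `q`.
    Configurations are (state, stack-tuple) pairs."""
    seen = set()
    frontier = deque([start])
    while frontier:
        state, stack = frontier.popleft()
        if (state, stack) in seen or len(stack) > max_stack:
            continue
        seen.add((state, stack))
        if stack:
            top, rest = stack[0], stack[1:]
            for (q, w) in rules.get((state, top), []):
                frontier.append((q, tuple(w) + rest))
    return seen
```

For instance, with a push rule `(p, a) -> (p, b a)` and a pop rule `(p, b) -> (q, ε)`, configuration `(q, a)` is reachable from `(p, a)` in two steps.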
APA, Harvard, Vancouver, ISO, and other styles
36

Tuch, Harvey Computer Science &amp Engineering Faculty of Engineering UNSW. "Formal memory models for verifying C systems code." Publisher:University of New South Wales. Computer Science & Engineering, 2008. http://handle.unsw.edu.au/1959.4/41233.

Full text
Abstract:
Systems code is almost universally written in the C programming language or a variant. C has a very low level of type and memory abstraction and formal reasoning about C systems code requires a memory model that is able to capture the semantics of C pointers and types. At the same time, proof-based verification demands abstraction, in particular from the aliasing and frame problems. In this thesis, we study the mechanisation of a series of models, from semantic to separation logic, for achieving this abstraction when performing interactive theorem-prover based verification of C systems code in higher-order logic. We do not commit common oversimplifications, but correctly deal with C's model of programming language values and the heap, while developing the ability to reason abstractly and efficiently. We validate our work by demonstrating that the models are applicable to real, security- and safety-critical code by formally verifying the memory allocator of the L4 microkernel. All formalisations and proofs have been developed and machine-checked in the Isabelle/HOL theorem prover.
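The core idea behind separation-logic-style abstraction from aliasing can be illustrated by modelling heaps as finite partial functions whose separating conjunction is defined only on disjoint domains. This is a toy sketch of the concept, not the thesis's Isabelle/HOL formalisation:

```python
def sep_conj(h1, h2):
    """Separating conjunction on heaps modeled as dicts from addresses
    to values: defined only when the domains are disjoint, which is
    what rules out aliasing between the two sub-heaps."""
    if h1.keys() & h2.keys():
        return None  # overlapping footprints: h1 * h2 is undefined
    return {**h1, **h2}

def points_to(heap, addr, val):
    """The assertion addr |-> val, holding of a singleton heap."""
    return heap == {addr: val}
```

Because `sep_conj` fails on overlapping domains, a proof about one sub-heap cannot be invalidated by updates to the other, which is the essence of the frame rule.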
APA, Harvard, Vancouver, ISO, and other styles
37

Surdilovic, Tihomir. "Fuzzy Mouse Cursor Control System for Computer Users with Spinal Cord Injuries." Digital Archive @ GSU, 2006. http://digitalarchive.gsu.edu/cs_theses/49.

Full text
Abstract:
People with severe motor impairments due to Spinal Cord Injury (SCI) or Spinal Cord Dysfunction (SCD) often experience difficulty with accurate and efficient control of pointing devices (Keates et al., 02). Usually this leads to their limited integration into society as well as limited unassisted control over the environment. The questions “How can someone with severe motor impairments perform mouse pointer control as accurately and efficiently as an able-bodied person?” and “How can these interactions be advanced through use of Computational Intelligence (CI)?” are the driving forces behind the research described in this paper. Through this research, a novel fuzzy mouse cursor control system (FMCCS) is developed. The goal of this system is to simplify and improve the efficiency of cursor control and its interactions on the computer screen by applying fuzzy logic in its decision-making, so that disabled Internet users can operate a networked computer conveniently and easily. The FMCCS core consists of several fuzzy control functions, which define different user interactions with the system. The development of the novel cursor control system is based on the utilization of motor functions that are still available to most complete paraplegics, namely limited vision and breath control. One of the biggest obstacles in developing human computer interfaces for disabled people focusing primarily on eyesight and breath control is the user's limited strength, stamina, and reaction time. Within the FMCCS developed in this research, these limitations are minimized through the use of a novel pneumatic input device and intelligent control algorithms for soft data analysis, fuzzy logic and user feedback assistance during operation. The new system is developed using a reliable and cheap sensory system and available computing techniques.
Initial experiments with healthy and SCI subjects have clearly demonstrated the benefits and promising performance of the new system: the FMCCS is accessible to people with severe SCI; it is adaptable to user-specific capabilities and wishes; it is easy to learn and operate; point-to-point movement is responsive, precise and fast. The integrated sophisticated interaction features, good movement control without strain and clinical risks, as well as the fact that quadriplegics, whose breathing is assisted by a respirator machine, still possess enough control to use the new system with ease, provide a promising framework for future FMCCS applications. The most motivating leverage for further FMCCS development is, however, the positive feedback from persons who tested the first system prototype.
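Fuzzy decision-making of the kind such a system applies can be illustrated with a minimal Mamdani-style rule base mapping a normalized input signal (for example, breath pressure) to a cursor speed. The membership functions, rules and speed values below are invented for illustration and are not taken from the thesis:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cursor_speed(pressure):
    """Infer a cursor speed (pixels per tick) from a normalized
    input in [0, 1] via three illustrative rules:
    low -> slow, medium -> moderate, high -> fast,
    combined by weighted-average (centroid-style) defuzzification."""
    memberships = {
        'slow': tri(pressure, -0.4, 0.0, 0.5),
        'moderate': tri(pressure, 0.1, 0.5, 0.9),
        'fast': tri(pressure, 0.5, 1.0, 1.4),
    }
    speeds = {'slow': 2.0, 'moderate': 10.0, 'fast': 25.0}
    num = sum(memberships[k] * speeds[k] for k in speeds)
    den = sum(memberships.values())
    return num / den if den else 0.0
```

A gentle signal (0.0) yields the slow speed, a strong one (1.0) the fast speed, and intermediate signals blend smoothly between the rules rather than switching abruptly.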
APA, Harvard, Vancouver, ISO, and other styles
38

Colombier, Brice. "Methods for protecting intellectual property of IP cores designers." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSES038/document.

Full text
Abstract:
La conception de circuits intégrés est aujourd'hui une tâche extrêmement complexe. Cela pousse les concepteurs à adopter une approche modulaire, où chaque bloc fonctionnel est décrit de manière indépendante. Ces blocs fonctionnels, appelés composants virtuels, sont vendus par leurs concepteurs à des intégrateurs système qui les utilisent dans des projets complexes. Cette division a pour conséquence une hausse inquiétante des cas de copie illégale des composants virtuels. Afin de lutter contre cette menace sur la propriété intellectuelle des concepteurs, l'objectif de cette thèse était de mettre au point un système complet d'activation à distance de composants virtuels, permettant au concepteur de savoir exactement combien de composants virtuels sont effectivement utilisés. Pour cela, les deux premières contributions de cette thèse portent sur la modification de la logique combinatoire d'un composant virtuel afin de le rendre activable. La première méthode permet de forcer les sorties à une valeur fixe de manière contrôlée. La seconde est une technique efficace de sélection de nœuds à altérer, encore une fois de manière contrôlée, afin de rendre le composant virtuel temporairement inutilisable. La troisième contribution de cette thèse est une méthode légère de correction d'erreurs à appliquer aux réponses issues des fonctions physiques non-clonables, qui constituent un identifiant intrinsèque des instances du composant virtuel. Réutilisant un protocole de correction d'erreurs issu de l'échange quantique de clés, cette méthode est beaucoup plus légère que les codes correcteurs d'erreurs classiquement utilisés pour cette application.
Designing integrated circuits is now an extremely complex task. This is why designers adopt a modular approach, where each functional block is described independently. These functional blocks, called intellectual property (IP) cores, are sold by their designers to system integrators who use them in complex projects. This division led to the rise of cases of illegal copying of IP cores. In order to fight this threat against the intellectual property of IP core designers, the objective of this PhD thesis was to develop a secure remote activation scheme for IP cores, allowing the designer to know exactly how many IP cores are currently used. To achieve this, the first two contributions of this thesis deal with the modification of the combinational logic of an IP core to make it activable. The first method makes it possible to controllably force the outputs to a fixed logic value. The second is an efficient technique to select the nodes to controllably alter, so that the IP core is temporarily unusable. The third contribution of this thesis is a lightweight method of error correction to use with PUF (Physical Unclonable Function) responses, which are an intrinsic identifier of instances of the IP core. Reusing an error-correction protocol used in quantum key exchange, this method is much more lightweight than the error-correcting codes classically used for this application.
APA, Harvard, Vancouver, ISO, and other styles
39

Escobar, Rozas Freddy. "Leibniz, the Science and the Civil Code." IUS ET VERITAS, 2016. http://repositorio.pucp.edu.pe/index/handle/123456789/122768.

Full text
Abstract:
This article addresses the differences between the old regulatory bodies and the current Civil Codes. The author analyzes the period from the Middle Ages until the present to show the change and evolution that led thinkers to apply the geometric method to Law, and the way rules are framed in the Civil Law tradition. Furthermore, it recognizes and analyzes the contributions of the European writers and thinkers who promoted the Scientific Revolution of the 17th century, especially the work of the lawyer Gottfried Wilhelm Leibniz.
El presente artículo aborda la temática sobre las diferencias entre los cuerpos normativos antiguos y los Códigos Civiles actuales. El autor realiza un análisis desde la época Medieval hasta la actualidad para evidenciar el cambio y la evolución que hicieron los pensadores para aplicar el método geométrico al Derecho y la forma de configurar de las normas en el Civil Law. Asimismo, se reconoce y analiza los aportes de los escritores y pensadores europeos, que impulsaron la Revolución Científica del siglo XVII, en especial la obra del abogado Gottfried Wilhelm Leibniz.
APA, Harvard, Vancouver, ISO, and other styles
40

Biallas, Sebastian [Verfasser], Stefan [Akademischer Betreuer] Kowalewski, and Alexander [Akademischer Betreuer] Fay. "Verification of programmable logic controller code using model checking and static analysis / Sebastian Biallas ; Stefan Kowalewski, Alexander Fay." Aachen : Universitätsbibliothek der RWTH Aachen, 2016. http://d-nb.info/1129875946/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Cominotte, Alexandre. "Uso de imagens biométricas para predição do peso corporal e de carcaça quente de bovinos Nelore /." Jaboticabal, 2018. http://hdl.handle.net/11449/157479.

Full text
Abstract:
Orientador: Otavio Rodrigues Machado Neto
Banca: João Ricardo Rebouças Dorea
Banca: Guilherme Luis Pereira
Resumo: O trabalho foi dividido em dois estudos. No estudo 1, objetivou-se predizer o peso corporal (PC) e o ganho médio diário (GMD) de bovinos Nelore por meio de imagens tridimensionais e comparar quatro modelos de predição: Regressão Linear Múltipla (RLM), Regressão LASSO (LASSO), Partial Least Squares (PLS) e Artificial Neural Networks (ANN). Foram coletadas 234 imagens de bovinos Nelore. As coletas de dados foram realizadas em quatro fases ao longo da vida do animal: Desmame aos 244 dias de idade e 202,3 kg (± 27,1), Recria aos 457 dias de idade e 213,9 kg (± 25,1), Início da Terminação aos 590 dias de idade e 334,5 kg (± 29,2) e Final da Terminação aos 763 dias de idade e 449,5 kg (± 47,5). Nas três primeiras fases foram coletadas imagens de 62 bovinos Nelore, enquanto que na última fase apenas 48 imagens foram coletadas. O GMD foi medido: 1: Desmama - Recria, 2: Desmama - Início da Terminação, 3: Desmama - Final da Terminação, 4: Recria - Início da Terminação, 5: Recria - Final da Terminação e 6: Início da Terminação - Final da Terminação. No estudo 2, quatrocentas e cinquenta imagens de bovinos Nelore foram coletadas em quatro experimentos para predição de PC e peso de carcaça quente (PCQ). Quatro conjuntos experimentais foram considerados: Set 1 inclui os experimentos 1, 2 e 3 para treinamento e experimento 4 para validação; Set 2 inclui os experimentos 1, 2 e 4 para treinamento e experimento 3 para validação; Set 3 inclui os experimentos 1, 3 e 4 para treinamento e exper... (Resumo completo, clicar acesso eletrônico abaixo)
Abstract: The work was divided in two studies. The objective of study 1 was to predict the body weight (BW) and the average daily gain (ADG) of Nellore cattle by 3-D images and to compare four prediction models: Multiple Linear Regression (MLR), LASSO Regression, Partial Least Squares (PLS) and Artificial Neural Networks (ANN). A total of 234 images of Nellore cattle were collected. Data collection was performed in four stages throughout the life of the animal: Weaning at 244 days of age and 202.3 kg (± 27.1), Stocker at 457 days of age and 213.9 kg (± 25.1), Initial of Termination at 590 days of age and 334.5 kg (± 29.2) and Finish of Termination at 763 days of age and 449.5 kg (± 47.5). In the first three phases images of 62 Nellore cattle were collected, while in the last phase only 48 images were collected. The ADG was measured: 1: Weaning - Stocker, 2: Weaning - Initial of Termination, 3: Weaning - Finish of Termination, 4: Stocker - Initial of Termination, 5: Stocker - Finish of Termination and 6: Initial of Termination - Finish of Termination. In study 2, four hundred and fifty images of Nellore cattle were collected in four experiments for prediction of BW and hot carcass weight (HCW). Four experimental sets were considered: Set 1 includes experiments 1, 2 and 3 for training and experiment 4 for validation; Set 2 includes experiments 1, 2 and 4 for training and experiment 3 for validation; Set 3 includes experiments 1, 3 and 4 for training and experiment 2 for validation; Set... (Complete abstract click electronic access below)
Mestre
APA, Harvard, Vancouver, ISO, and other styles
42

Gunputh, Rajendra Parsad. "L'interprétation du code Napoléon par les juridictions mauriciennes." La Réunion, 2005. http://elgebar.univ-reunion.fr/login?url=http://thesesenligne.univ.run/05_24_Gunputh.pdf.

Full text
Abstract:
L'interprétation du Code Napoléon par les juridictions mauriciennes démontre comment, malgré l'autonomie du Code Napoléon et des juridictions nationales, les dispositions du même code sont consacrées à l'étude du droit civil dans un pays doté d'un droit mixte. De nature très technique, cette interprétation est basée tantôt sur le droit civil français et tantôt sur le droit anglais. Les tribunaux locaux interpréteront alors les dispositions du Code Napoléon à trois niveaux : selon une jurisprudence bien établie, en s'inspirant nettement des arrêts rendus par la Cour de cassation, ceux de la doctrine et jurisprudence anglaise et enfin, certaines fois, les tribunaux locaux assurent indépendamment leurs propres jugements. N'oublions pas qu'il existe une loi d'origine législative quant à l'interprétation des textes qui est aussi d'origine anglaise : The Interpretation General Clauses Act 1974. On note aussi que les tribunaux locaux, en l'occurrence la Cour Suprême, s'inspirent et reprennent largement les arrêts de la Cour de cassation et l'on peut se demander si les jugements rendus ont une certaine originalité. Évidemment non ! Pour remplir leur mission, la jurisprudence et la doctrine ont quand même besoin d'une méthode. En droit mauricien, les juges de la Cour Suprême ont une confiance inébranlable dans les sources du droit français en général. Cela fait ressortir l'ambiguïté et le pragmatisme du système de contrôle exercé par les institutions judiciaires, ses difficultés d'interprétations entre l'éloignement et le rapprochement du Code Napoléon du Code civil français. Quoi qu'il en soit, le Code Napoléon est vivant et cela explique sa très longue longévité, malgré la dure concurrence de la common law. En effet, la procédure anglaise joue son rôle d'intrusion à merveille. 
Conçue sur trois piliers : famille, propriété et obligations, la thèse explique comment la Cour Suprême "survit" grâce au droit français et comment les jugements des tribunaux étrangers jouent un rôle important dans le droit interne
This thesis demonstrates how the Napoleon Code is interpreted in a Commonwealth country where there is also a great resistance from the common law. Though there is the Interpretation General Clauses Act 1974, which is English inspired, most interpretation is nevertheless borrowed from the French doctrine and jurisprudence. Judgments from the famous Cour de cassation are constantly referred to. In fact, there is no proper autonomy or originality in judgments given by the Mauritian tribunals, especially the Supreme Court. Indeed, the Supreme Court still relies on the decisions of the Privy Council based in London. Mauritian law, however, innovates in certain ways, because the legislator has passed a certain number of reforms related to the law of successions. The three pillars of French civil law (family, property and obligations) are fully discussed to demonstrate the great similarities and differences between French and Mauritian law. This is achieved by reference to local jurisprudence and to how the Supreme Court normally sticks to local statutes, the Napoleon Code and stare decisis or precedent cases in order to sum up with its ratio decidendi.
APA, Harvard, Vancouver, ISO, and other styles
43

Thapalia, Anita. "Zinc and copper isotopes as tracers of anthropogenic contamination in a sediment core from an urban lake." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2009. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Zhu, Lei. "Circular reconstruction and scatter correction in X-ray cone-beam CT /." May be available electronically:, 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Sandberg, Natalia. "Automatic Generation of PLC Code Based on Net Condition Event Systems." Scholar Commons, 2008. http://scholarcommons.usf.edu/etd/3771.

Full text
Abstract:
An important consideration in discrete event dynamic systems control theory is the selection of a suitable modeling formalism that can capture the complex characteristics of the system, together with the capability to automatically synthesize a controller based on the system model. Net condition event systems are well suited for modeling complex discrete event dynamic systems owing to their input and output structure, which effectively captures the behavior of the physical devices to be monitored and/or controlled. To date, net condition event systems control models have not been extensively applied to highly automated manufacturing systems and there are few guidelines on how to automatically generate Programmable Logic Controller code from net condition event systems models. This research automatically converted net condition event systems control models into a Programmable Logic Controller programming language and evaluated the applicability of the proposed methodology in highly automated manufacturing systems using HAS-200 as a test bed.
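The kind of model-to-PLC translation evaluated in such work can be illustrated by emitting an IEC 61131-3 Structured Text fragment from a list of state transitions. The translation scheme below is a simplified sketch of the idea, not the thesis's algorithm:

```python
def to_structured_text(transitions):
    """Emit an illustrative IEC 61131-3 Structured Text state machine
    from (current_state, condition, next_state) triples, the kind of
    step a model-to-PLC code generator performs."""
    by_state = {}
    for cur, cond, nxt in transitions:
        by_state.setdefault(cur, []).append((cond, nxt))
    lines = ["CASE State OF"]
    for cur, arcs in by_state.items():
        lines.append(f"  {cur}:")
        for cond, nxt in arcs:
            lines.append(f"    IF {cond} THEN State := {nxt}; END_IF;")
    lines.append("END_CASE;")
    return "\n".join(lines)
```

For example, the two transitions `(0, StartBtn, 1)` and `(1, Done, 0)` yield a two-branch `CASE` block that a PLC scan cycle would evaluate on every pass.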
APA, Harvard, Vancouver, ISO, and other styles
46

Come, David. "Analyse de la qualité de code via une approche logique et application à la robotique." Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30008.

Full text
Abstract:
La qualité d'un code informatique passe à la fois par sa correction fonctionnelle mais aussi par des critères de lisibilité, compréhension et maintenabilité. C'est une problématique actuellement importante en robotique où de nombreux frameworks open-source se diffusent mal dans l'industrie en raison d'incertitudes sur la qualité du code. Les outils d'analyse et de recherche de code sont efficaces pour améliorer ces aspects. Il est important qu'ils laissent l'utilisateur spécifier ce qu'il recherche afin de pouvoir prendre en compte les spécificités de chaque projet et du domaine. Il existe deux principales représentations du code : son arbre de syntaxe abstraite (Abstract Syntax Tree en anglais, ou AST) et le graphe de flot de contrôle (Control Flow Graph en anglais, ou CFG) des fonctions qui sont définies dans le code. Les mécanismes de spécification existants utilisent exclusivement l'une ou l'autre de ces représentations, ce qui est dommage car elles offrent des informations complémentaires. L'objectif de ce travail est donc de développer une méthode de vérification de la conformité du code avec des règles utilisateurs qui puisse exploiter conjointement l'AST et le CFG. La méthode repose sur une nouvelle logique développée dans le cadre de ces travaux : FO++, qui est une extension temporelle de la logique du premier ordre. Cette logique a plusieurs avantages. Tout d'abord, elle est indépendante de tout langage de programmation et dotée d'une sémantique formelle. Ensuite, elle peut être utilisée comme moyen de formaliser les règles utilisateurs une fois instanciée pour un langage de programmation donné. Enfin, l'étude de son problème de model-checking offre un mécanisme de vérification automatique et correct de la conformité du code. Ces différents concepts ont été implémentés dans Pangolin, un outil pour le langage C++. 
Étant donné le code à vérifier et une spécification (qui correspond à une formule de FO++ , écrite dans le langage utilisateur de Pangolin), l'outil indique si oui ou non le code respecte la spécification. Il offre de plus un résumé synthétique de l'évaluation pour pouvoir retrouver le potentiel code fautif ainsi qu'un certificat de correction du résultat. Pangolin et FO++ ont trouvé une première application dans le domaine de la robotique via l'analyse de la qualité des paquets ROS et formalisation d'un design-pattern spécifique à ROS. Une seconde application plus générale concerne le développement de programme en C++ avec la formalisation de diverses règles de bonnes pratiques pour ce langage. Enfin, on montre comment il est possible de spécifier et vérifier des règles intimement liées à un projet en vérifiant des propriétés sur Pangolin lui-même
The quality of source code depends not only on its functional correctness but also on its readability, intelligibility and maintainability. This is currently an important problem in robotics, where many open-source frameworks spread poorly in industry because of uncertainty about the quality of the code. Code analysis and search tools are effective in improving these aspects. It is important that they let users specify what they are looking for, so that the specific features of each project and domain can be taken into account. There are two main representations of source code: its Abstract Syntax Tree (AST) and the Control Flow Graph (CFG) of its functions. Existing specification mechanisms use only one of these representations, which is unfortunate because they offer complementary information. The objective of this work is therefore to develop a method for verifying code compliance with user rules that can benefit from both the AST and the CFG. The method is underpinned by a new logic developed in this work: FO++, a temporal extension of first-order logic. Relying on this logic has several advantages. First, it is independent of any programming language and has a formal semantics. Second, once instantiated for a given programming language, it can be used as a means to formalize user-provided rules. Finally, the study of its model-checking problem provides a mechanism for the automatic and correct verification of code compliance. These concepts have been implemented in Pangolin, a tool for the C++ language. Given the code to be checked and a specification (an FO++ formula, written in Pangolin's user language), the tool indicates whether or not the code meets the specification. It also produces a summary of the evaluation, so that the code violating the property can be located, together with a certificate of the correctness of the result.
Pangolin and FO++ found a first application in robotics through the analysis of the quality of ROS packages and the formalization of a ROS-specific design pattern. A second, more general application concerns the development of C++ programs, with the formalization of various good-practice rules for this language. Finally, we show how rules closely tied to a specific project can be specified and verified by checking properties on the source code of Pangolin itself.
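Pangolin's actual rule language and the FO++ logic target C++ and are not reproduced here, but the general idea of checking a user-specified rule against a program's AST can be sketched in a few lines of Python with the standard `ast` module. The rule, the `check_rule` helper and the example snippet below are illustrative assumptions for this sketch, not Pangolin's syntax:

```python
import ast

# Illustrative user rule: every function whose name starts with "get_"
# must contain at least one value-returning "return" statement.
def check_rule(source):
    """Walk the AST and report (name, line) for functions violating the rule."""
    violations = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and node.name.startswith("get_"):
            has_return = any(isinstance(n, ast.Return) and n.value is not None
                             for n in ast.walk(node))
            if not has_return:
                violations.append((node.name, node.lineno))
    return violations

code = """
def get_name(self):
    return self._name

def get_age(self):
    print(self._age)
"""
print(check_rule(code))  # [('get_age', 5)]
```

A tool like Pangolin goes further by letting the rule also quantify over CFG paths (e.g. "on every path, a resource acquired is later released"), which a purely AST-based walk such as this one cannot express.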
APA, Harvard, Vancouver, ISO, and other styles
47

Jackson, Ugueto Eileen Elizabeth. "Progress towards the synthesis of gnidimacrin and its analogues synthesis of a highly advanced oxygen-rich intermediate incorporating the carbotricyclic core of gnidimacrin /." May be available electronically:, 2008. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Menezes, José Wally Mendonça. "Implementação de porta lógicas ópticas com acoplador direcional não linear triplo planar simétrico de fíbras ópticas." Universidade Federal do Ceará, 2006. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=189.

Full text
Abstract:
Fundação de Amparo à Pesquisa do Estado do Ceará
In this work, optical logic gates are proposed based on a symmetric planar three-core nonlinear directional coupler (NLDC) made of optical fiber, with one of the guides operating as a control. To this end, we obtain the transmission characteristics of the coupler and then analyse the extinction ratio and the compression factor. We first investigate the performance of the proposed coupler operating in the CW regime, and subsequently with ultrashort soliton-type pulses of 2 ps width. With the proposed model of the device, we are able to implement AND, NAND, OR, XOR and NOT logic gates for a set of phases applied to the control pulse. The logic gates generated with the device operating with CW signals proved more efficient than the same gates generated with soliton pulses.
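The CW behaviour of such a three-core coupler is commonly described by coupled-mode equations, in which each core exchanges power with its nearest neighbour at a linear coupling rate κ and experiences a Kerr self-phase term γ|A|². The sketch below integrates these standard equations with a fourth-order Runge-Kutta step; the function name and parameter values are illustrative, not the thesis's actual device parameters:

```python
import math

def nldc3_propagate(a0, kappa, gamma, length, steps=2000):
    """Propagate CW field amplitudes through a symmetric planar three-core
    nonlinear directional coupler (standard coupled-mode model):
        dA1/dz = i*kappa*A2        + i*gamma*|A1|^2*A1
        dA2/dz = i*kappa*(A1 + A3) + i*gamma*|A2|^2*A2
        dA3/dz = i*kappa*A2        + i*gamma*|A3|^2*A3
    """
    def deriv(a):
        a1, a2, a3 = a
        return (1j * (kappa * a2 + gamma * abs(a1) ** 2 * a1),
                1j * (kappa * (a1 + a3) + gamma * abs(a2) ** 2 * a2),
                1j * (kappa * a2 + gamma * abs(a3) ** 2 * a3))

    h = length / steps
    a = tuple(complex(x) for x in a0)
    for _ in range(steps):  # classic RK4 step in z
        k1 = deriv(a)
        k2 = deriv(tuple(x + 0.5 * h * k for x, k in zip(a, k1)))
        k3 = deriv(tuple(x + 0.5 * h * k for x, k in zip(a, k2)))
        k4 = deriv(tuple(x + h * k for x, k in zip(a, k3)))
        a = tuple(x + (h / 6) * (p + 2 * q + 2 * r + s)
                  for x, p, q, r, s in zip(a, k1, k2, k3, k4))
    return a

# Linear sanity check: light launched into core 1 transfers fully to core 3
# after one coupling length Lc = pi / (sqrt(2) * kappa).
kappa = 1.0
Lc = math.pi / (math.sqrt(2) * kappa)
out = nldc3_propagate((1.0, 0.0, 0.0), kappa, gamma=0.0, length=Lc)
powers = [abs(x) ** 2 for x in out]
```

With γ > 0, the power-dependent phase detunes the cores and frustrates this transfer, which is the switching mechanism that logic-gate schemes of this kind exploit.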
APA, Harvard, Vancouver, ISO, and other styles
49

Menezes, José Wally Mendonça. "Implementação de porta lógicas ópticas com acoplador direcional não linear triplo planar simétrico de fíbras ópticas." reponame:Repositório Institucional da UFC, 2006. http://www.repositorio.ufc.br/handle/riufc/12468.

Full text
Abstract:
MENEZES, José Wally Mendonça. Implementação de porta lógicas ópticas com acoplador direcional não linear triplo planar simétrico de fíbras ópticas. 2006. 111 f. Dissertação (Mestrado em Física) - Programa de Pós-Graduação em Física, Departamento de Física, Centro de Ciências, Universidade Federal do Ceará, Fortaleza, 2006.
In this work, optical logic gates are proposed based on a symmetric planar three-core nonlinear directional coupler (NLDC) made of optical fiber, with one of the guides operating as a control. To this end, we obtain the transmission characteristics of the coupler and then analyse the extinction ratio and the compression factor. We first investigate the performance of the proposed coupler operating in the CW regime, and subsequently with ultrashort soliton-type pulses of 2 ps width. With the proposed model of the device, we are able to implement AND, NAND, OR, XOR and NOT logic gates for a set of phases applied to the control pulse. The logic gates generated with the device operating with CW signals proved more efficient than the same gates generated with soliton pulses.
APA, Harvard, Vancouver, ISO, and other styles
50

Quintana, Joel. "Hybrid optical network using incoherent optical code division multiple access via optical delay lines." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2009. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Full text
APA, Harvard, Vancouver, ISO, and other styles