Dissertations / Theses on the topic 'Formalisation de la connaissance'
Consult the top 50 dissertations / theses for your research on the topic 'Formalisation de la connaissance.'
Chabot, Robert. "La nécessité d'une analyse sociologique des situations de travail préalablement au recueil des connaissances et à leur formalisation dans la base de connaissance d'un système expert." Aix-Marseille 1, 1994. http://www.theses.fr/1994AIX10008.
When a company wants to develop an expert system, several methodological problems occur concerning the clarification of knowledge prior to its formalization. To these knowledge-acquisition difficulties is added the problem of the expert system's organizational integration. What are the consequences for work organisation and for the contents of the tasks? As a matter of fact, in artificial intelligence, the expertise corresponds to what we call "knows", i.e. the product of an interaction between the actor, the technical plan of action and his environment, whereas knowledge as information is the condition of the development of "knows". Within the context of research in collaboration with a company developing an expert system, we tried to show in what way sociology, through its methods for analysing work situations and technical practices, is fully capable of providing the knowledge engineer with useful tools for the assessment, localization and description of "knows".
Grosz, Georges. "Formalisation des connaissances réutilisables pour la conception des systèmes d'information." Paris 6, 1991. http://www.theses.fr/1991PA066510.
Arioua, Abdallah. "Formalisation et étude des explications dialectiques dans les bases de connaissances incohérentes." Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT261/document.
Knowledge bases are deductive databases where the machinery of logic is used to represent domain-specific and general-purpose knowledge over existing data. In the existential rules framework, a knowledge base is composed of two layers: the data layer, which represents the factual knowledge, and the ontological layer, which incorporates rules of deduction and negative constraints. The main reasoning service in such a framework is answering queries over the data layer by means of the ontological layer. As in classical logic, contradictions trivialize query answering, since everything follows from a contradiction (ex falso quodlibet). Recently, inconsistency-tolerant approaches have been proposed to cope with this problem in the existential rules framework. They deploy repairing strategies on the knowledge base to restore consistency and overcome the problem of trivialization. However, these approaches are sometimes unintelligible and not straightforward for the end-user, as they implement complex repairing strategies. This can jeopardize the trust relation between the user and the knowledge-based system. In this thesis we answer the research question: "How do we make query answering intelligible to the end-user in the presence of inconsistency?". The answer that the thesis is built around is: "We use explanations to facilitate the understanding of query answering". We propose meta-level and object-level dialectical explanations that take the form of a dialogue between the user and the reasoner about the entailment of a given query. We study these explanations in the framework of logic-based argumentation and dialectics, and we study their properties and their impact on users.
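The repair-based, inconsistency-tolerant semantics that such explanations are built over can be illustrated with a toy propositional sketch (this is an illustration, not the thesis's existential-rules machinery): a "repair" is a maximal conflict-free subset of the facts, and a query is cautiously entailed when every repair supports it.

```python
from itertools import combinations

def repairs(facts, conflicts):
    """Maximal conflict-free subsets of `facts` (the 'repairs').
    Exponential brute force: fine for a toy example only."""
    def consistent(subset):
        return not any(set(pair) <= subset for pair in conflicts)
    candidates = [set(c) for size in range(len(facts), -1, -1)
                  for c in combinations(facts, size) if consistent(set(c))]
    # Keep only subsets that no other consistent candidate strictly contains.
    return [s for s in candidates if not any(s < t for t in candidates)]

def entailed_under_all_repairs(query, facts, conflicts):
    """Cautious (AR-style) entailment: the fact must survive in every repair."""
    return all(query in repair for repair in repairs(facts, conflicts))

# Toy inconsistent knowledge base: a penguin that supposedly flies.
facts = ["penguin(tweety)", "flies(tweety)", "bird(tweety)"]
conflicts = [("penguin(tweety)", "flies(tweety)")]
```

Here "bird(tweety)" survives in both repairs and is entailed, while "flies(tweety)" survives in only one and is not; a dialectical explanation in the thesis's sense is a dialogue justifying exactly this kind of verdict.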
Schneider, Jean-Jacques. "Un système d'apprentissage numérique et symbolique pour la formalisation de connaissances prosodiques." Avignon, 1995. http://www.theses.fr/1995AVIG0113.
Derras, Cédric. "Formalisation de l'imprécision informationnelle et des incertitudes décisionnelles des connaissances expertes pour la génération de processus de fabrication." Nancy 1, 1998. http://www.theses.fr/1998NAN10284.
Iphar, Clément. "Formalisation d'un environnement d'analyse des données basé sur la détection d'anomalies pour l'évaluation de risques : Application à la connaissance de la situation maritime." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEM041/document.
At sea, various systems enable vessels to be aware of their environment, and on the coast those systems, such as radar, provide coastal states with a picture of the maritime traffic. One of those systems, the Automatic Identification System (AIS), is used for security purposes (anti-collision) and as a control, surveillance and decision-support tool for on-shore bodies. An assessment of AIS based on data quality dimensions is proposed, in which integrity is highlighted as the most important of these dimensions. As the structure of AIS data is complex, a list of integrity items has been established; their purpose is to assess the consistency of the data within the data fields against the technical specifications of the system, and the consistency of the data fields within a message and between different messages. In addition, the use of additional data (such as fleet registers) provides further information to assess the truthfulness and genuineness of an AIS message and its sender. The system is weakly secured, and poor-quality data has been demonstrated, such as errors in the messages, data falsification or data spoofing, exemplified in concrete cases such as identity theft or voluntary vessel disappearances. In addition to message assessment, a set of threats has been identified, and an assessment of the associated risks is proposed, allowing a better comprehension of the maritime situation and the establishment of links between the vulnerabilities caused by the weaknesses of the system and the maritime risks related to the safety and security of maritime navigation.
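The notion of "integrity items", consistency checks between a decoded message's fields and the AIS technical specification, can be sketched minimally as follows (the field names, rule names and exact thresholds below are illustrative assumptions, not the thesis's actual rule set):

```python
# Illustrative AIS position-report checks: each "integrity item" tests one
# consistency rule between a decoded field and the AIS specification.
def check_ais_message(msg):
    """Return the list of integrity items violated by a decoded AIS message."""
    violations = []
    # MMSI identifiers are 9-digit numbers; ship MMSIs start with a digit 2-7.
    mmsi = msg.get("mmsi", 0)
    if not (200_000_000 <= mmsi <= 799_999_999):
        violations.append("mmsi_out_of_range")
    # Latitude/longitude must lie within the ranges the spec allows
    # (91 and 181 degrees are the spec's "not available" sentinels).
    if not (-90.0 <= msg.get("lat", 91.0) <= 90.0):
        violations.append("latitude_invalid")
    if not (-180.0 <= msg.get("lon", 181.0) <= 180.0):
        violations.append("longitude_invalid")
    # Speed over ground is encoded in 0-102.2 knots (102.3 = not available).
    if not (0.0 <= msg.get("sog", 0.0) <= 102.2):
        violations.append("sog_invalid")
    return violations

good = {"mmsi": 227_006_760, "lat": 43.1, "lon": 5.9, "sog": 12.4}
bad = {"mmsi": 12_345, "lat": 91.0, "lon": 5.9, "sog": 130.0}
```

Cross-field and cross-message checks (e.g. comparing a declared identity against a fleet register, or successive positions against plausible kinematics) extend the same pattern.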
Paskevych, Andriy, and Konstantin Verchinine. "Méthodes de formalisation des connaissances et des raisonnements mathématiques : aspects appliqués et théoriques." [S.l.] : [s.n.], 2008. http://doxa.scd.univ-paris12.fr:80/theses/th0405882.pdf.
Paskevych, Andriy. "Méthodes de formalisation des connaissances et des raisonnements mathématiques : aspects appliqués et théoriques." Paris 12, 2007. http://www.theses.fr/2007PA120071.
We study the means of presentation of mathematical knowledge and reasoning schemes. Our research aims at an automated system for the verification of formalized mathematical texts. In this system, a text to verify is written in a formal language which is close to the natural language and style of mathematical publications. Our intention is to exploit the hints which are given to us by the "human" form of the problem: definitions, proof schemes, nouns denoting classes of objects, etc. We describe such a language, called ForTheL. Verification consists in showing that the text is "sensible" and "grounded": functions and relations are applied within their domains, according to the definitions, and assertions follow from their respective premises. A formal definition of a correct text relies on a sound sequent calculus and on the notion of local validity (local with respect to some occurrence inside a formula). Proof search is carried out on two levels. The lower level is an automated theorem prover based on a combinatorial procedure. We introduce a variant of connection tableaux which is sound and complete in first-order logic with equality. The higher level is a "reasoner" which employs natural proving techniques in order to filter, simplify and decompose a proof task before passing it to the prover. The algorithms of the reasoner are based on transformations that preserve locally valid propositions. The proposed methods are implemented in the proof assistant SAD.
Léger, Bertrand. "Recueil et Formalisation de procédés experts pour conduire une protection intégrée du vignoble." Phd thesis, Ecole nationale superieure agronomique de montpellier - AGRO M, 2008. http://tel.archives-ouvertes.fr/tel-00372383.
Napoli, Aldo. "Formalisation et gestion des connaissances dans la modélisation du comportement des incendies de forêt." Phd thesis, Université de Nice Sophia-Antipolis, 2001. http://tel.archives-ouvertes.fr/tel-00532631.
Eude, Thibaut. "Forage des données et formalisation des connaissances sur un accident : Le cas Deepwater Horizon." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLEM079/document.
Data drilling, the method and means developed in this thesis, redefines the process of data extraction, the formalization of knowledge and its enrichment, particularly in the context of the elucidation of events that have been documented only slightly, or not at all. The Deepwater Horizon disaster, the drilling platform operated for BP in the Gulf of Mexico that suffered a blowout on April 20, 2010, will be our case study for the implementation of our proof of concept for data drilling. This accident is the result of an unprecedented discrepancy between the state of the art of drilling engineers' heuristics and that of pollution response engineers. The loss of control of the MC 252-1 well is therefore an engineering failure, and it took the response party eighty-seven days to regain control of the wild well and halt the pollution. Deepwater Horizon is in this sense a case of engineering facing an extreme situation, as defined by Guarnieri and Travadel. First, we propose to return to the overall concept of accident by means of an in-depth linguistic analysis presenting the semantic spaces in which the accident takes place. This makes it possible to enrich its "core meaning" and broaden the shared acceptance of its definition. Then, we show that the literature review must be systematically supported by algorithmic assistance to process the data, taking into account the available volume, the heterogeneity of the sources and the requirements of quality and relevance standards. In fact, more than eight hundred scientific articles mentioning this accident have been published to date, and some twenty investigation reports, constituting our research material, have been produced. Our method demonstrates the limitations of accident models when dealing with a case like Deepwater Horizon, and the urgent need to look for an appropriate way to formalize knowledge. As a result, the use of upper-level ontologies should be encouraged.
The DOLCE ontology has shown its great interest in formalizing knowledge about this accident, and especially in elucidating very accurately a decision-making process at a critical moment of the intervention. The population of the ontology, i.e. the creation of instances, is the heart of its exploitation and its main interest, but the process is still largely manual and not free of mistakes. This thesis proposes a partial answer to this problem with an original NER algorithm for the automatic population of an ontology. Finally, the study of accidents involves determining the causes and examining "socially constructed facts". This thesis presents the original plans of a "semantic pipeline" built with a series of algorithms that extract the causality expressed in a document and produce a graph representing the "causal path" underlying the document. It is significant for scientific or industrial research to highlight the reasoning behind the findings of the investigation team. To do this, this work leverages developments in Machine Learning and Question Answering, and especially Natural Language Processing tools. In conclusion, this thesis is the work of a fitter, an architect: it offers both a prime insight into the Deepwater Horizon case and proposes data drilling, an original method and means to address an event, in order to uncover, from the research material, answers to questions that had previously escaped understanding.
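The idea of automatically populating an ontology from text can be sketched with a deliberately naive rule-based extractor (this is an illustration only, not the thesis's NER algorithm; the class names, patterns and example sentence are assumptions):

```python
import re

# Naive rule-based extractor: map regex patterns to ontology classes, then
# "populate" the ontology by creating one instance per match.
PATTERNS = {
    "Vessel": re.compile(r"\b(?:rig|vessel|platform)\s+([A-Z][\w-]+(?:\s[A-Z][\w-]+)*)"),
    "Date": re.compile(r"\b([A-Z][a-z]+ \d{1,2}, \d{4})\b"),
}

def populate(text):
    """Return ontology instances as (class, surface form) pairs."""
    instances = []
    for cls, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            instances.append((cls, match.group(1)))
    return instances

text = "The platform Deepwater Horizon suffered a blowout on April 20, 2010."
```

A real NER-based populator would replace the regexes with a trained sequence tagger and map each recognized entity to the appropriate DOLCE category, but the population step (instance creation from recognized mentions) has the same shape.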
Troncy, Raphaël. "Formalisation des connaissances documentaires et des connaissances conceptuelles à l'aide d'ontologies : application à la description de documents audiovisuels." Phd thesis, Grenoble 1, 2004. http://tel.archives-ouvertes.fr/tel-00005263.
Mangin, Philippe. "Identification des paramètres clés du laminage transversal : vers la formalisation des connaissances scientifiques et technologiques." Phd thesis, Ecole nationale supérieure d'arts et métiers - ENSAM, 2012. http://pastel.archives-ouvertes.fr/pastel-00752476.
Mangin, Philippe. "Identification des paramètres clés du laminage transversal : vers la formalisation des connaissances scientifiques et technologiques." Phd thesis, Ecole nationale supérieure d'arts et métiers - ENSAM, 2012. http://tel.archives-ouvertes.fr/tel-00839532.
Fouquet, Dany. "Formalisation des connaissances d'usinage pour une intégration des logiciels de CFAO : application aux pièces structurales aéronautiques." Mémoire, École de technologie supérieure, 2009. http://espace.etsmtl.ca/44/1/FOUQUET_Dany.pdf.
Masson, Cyrille. "Contribution au cadre des bases de données inductives : formalisation et évaluation des scénarios d'extraction de connaissances." Lyon, INSA, 2005. http://theses.insa-lyon.fr/publication/2005ISAL0042/these.pdf.
The success of database technologies has led to an ever-increasing mass of collected information in different application fields. Knowledge Discovery in Databases (KDD) aims at going further than classical querying processes on such data, so as to find hidden knowledge materialized under the form of patterns. The Inductive Database (IDB) concept is a generalization of the database concept which integrates patterns and data in a common framework. A KDD process can thus be seen as an extended querying process on an IDB. This PhD thesis is about the formalization and the evaluation of KDD scenarios in the IDB framework. We first show how to use an abstract language for IDBs to formally describe extraction processes that can be performed by the user. We thus obtain a prototypical scenario, i.e. a theoretical object made of a sequence of inductive queries on which it is possible to reason. Such a scenario is useful to formalize processes when transferring expertise between final users and KDD experts. Another application of the concept of scenario is the evaluation, on a common basis, of different implementations of IDBs, similarly to existing benchmarks for databases. An evaluation scenario has the same form as a prototypical scenario, but it focuses more on algorithmic issues and optimization techniques for sequences of inductive queries. When computing an execution plan for such a scenario, the IDB system should analyze the properties of the queries composing it, by discovering dependencies between them or conjunctions of constraints for which it is useful to have efficient extraction tools. Finally, we present an evaluation scenario in the field of bioinformatics, and we show how to solve it by using techniques developed in our group or especially designed for the needs of this scenario.
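An inductive query in this sense is a query whose answers are patterns satisfying a conjunction of constraints, e.g. "itemsets of bounded size whose support exceeds a threshold". A minimal brute-force sketch (illustrative only, not the abstract query language of the thesis):

```python
from itertools import combinations

def inductive_query(transactions, min_support, max_size):
    """Return all itemsets satisfying the conjunction of two constraints:
    a frequency constraint (support >= min_support) and a syntactic
    constraint (size <= max_size). Brute force, for illustration only."""
    items = sorted({i for t in transactions for i in t})
    answers = []
    for size in range(1, max_size + 1):
        for itemset in combinations(items, size):
            # Support = number of transactions containing the whole itemset.
            support = sum(1 for t in transactions if set(itemset) <= t)
            if support >= min_support:
                answers.append((itemset, support))
    return answers

data = [{"a", "b"}, {"a", "b", "c"}, {"a", "c"}]
```

Optimizing a sequence of such queries, e.g. reusing the answers of one query to prune the search space of the next, is exactly the kind of execution-plan reasoning the thesis studies for scenarios.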
Gesbert, Nils. "Étude de la formalisation des spécifications de bases de données géographiques en vue de leur intégration." Université de Marne-la-Vallée, 2005. http://www.theses.fr/2005MARN0261.
Integrating them into a federated database system, by describing the precise data meaning in a way both homogeneous between databases and as formal as possible. This precise data meaning is contained in the databases' content specifications (surveying rules). Method: the general organization of the present specifications follows that of the databases' schemata, but these schemata are heterogeneous and influenced by implementation problems. To overcome this problem, we suppose that it is possible to find, in the specifications' text, a number of common terms referring to shared geographical concepts. All these concepts would constitute what is called a domain ontology. Our idea is not to create a complete ontology but rather a partial, ad hoc one, which would be extended to take new concepts into account as needed. The specifications would then be represented as a bundle of what we call representation procedures, which describe how, given a geographic entity (an instance of some geographical concept), one or more representations of this entity are built up in the different databases depending on the nature and the properties of the entity. Thus these procedures describe the links between the ontology and the schemata of the databases. Results: for the example of hydrography in two different IGN databases, BDCarto and BDTopo, our hypothesis seems confirmed: a common ontology could rather easily be defined. Concerning the representation procedures, we were able to establish the main kinds of elementary rules from which they can be constructed. To describe these procedures formally, we then defined a formal language, based on these elementary rules, whose grammar has been described in BNF. Finally, we have built a software prototype, containing a parser for this language, for entering, saving and handling the formal specifications.
El, Maarouf Ismaïl. "Formalisation de connaissances à partir de corpus : modélisation linguistique du contexte pour l'extraction automatique de relations sémantiques." Phd thesis, Université de Bretagne Sud, 2011. http://tel.archives-ouvertes.fr/tel-00657708.
Maarouf, Ismaïl Mathieu El. "Formalisation de connaissances à partir de corpus : modélisation linguistique du contexte pour l'extraction automatique de relations sémantiques." Lorient, 2011. http://www.theses.fr/2011LORIL245.
Corpora, which are text collections selected for specific purposes, are playing an increasing role in Linguistics and Natural Language Processing (NLP). They are conceived as knowledge sources on natural language use, as much as knowledge on the entities designated by linguistic expressions, and they are used in particular to evaluate the performance of NLP applications. The criteria prevailing over their constitution have an obvious, though still delicate to characterize, impact on (i) the major linguistic structures they contain, (ii) the knowledge conveyed, and (iii) the success of computational systems on a given task. This thesis studies methodologies for the automatic extraction of semantic relations from written text corpora. Such a topic calls for a detailed examination of the context in which a given expression holds, as well as for the discovery of the features which determine its meaning, in order to be able to link semantic units. Generally, contextual models are built from the co-occurrence analysis of linguistic information drawn from resources and NLP tools. The benefits and limits of this information are evaluated in a task of relation extraction from corpora belonging to different genres (press article, fairy tale, biography). The results show that this information is insufficient to reach a satisfying semantic representation as well as to design robust systems. Two problems are particularly addressed. On the one hand, it seems indispensable to add information related to text genre. So as to characterize the impact of genre on semantic relations, an automatic classification method, which relies on the semantic restrictions holding between verbs and nouns, is proposed. The method is experimented on a fairy tale corpus and on a press corpus. On the other hand, contextual models need to deal with problems which come under discourse surface variation.
In a text, related linguistic expressions are not always close to one another, and it is sometimes necessary to design complex algorithms in order to detect long dependencies. To answer this problem in a coherent manner, a method of discourse segmentation based on surface structure triggers in written corpora is proposed. It paves the way for grammars operating on macro-syntactic categories in order to structure the discursive representation of a sentence. This method is applied prior to a syntactic analysis, and its improvement is evaluated. The solutions proposed to these problems help us approach Information Extraction from a particular angle: the implemented system is evaluated on a task of Named Entity correction in the context of a Question-Answering System. This specific need entails the alignment of a category definition with the type of answer expected by the question.
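The baseline contextual model whose limits are discussed here, linking semantic units from co-occurrence counts alone, can be sketched in a few lines (a toy window-based extractor; the tokenization, window size and example sentences are illustrative assumptions):

```python
from collections import Counter

def cooccurrences(sentences, targets, window=3):
    """Count how often pairs of target words co-occur within a token window;
    a naive contextual model treats frequent pairs as semantically related."""
    counts = Counter()
    for sentence in sentences:
        tokens = sentence.lower().split()
        # Position of each target word in the sentence (last occurrence wins,
        # which is good enough for this toy example).
        positions = {t: i for i, t in enumerate(tokens) if t in targets}
        for a in positions:
            for b in positions:
                if a < b and abs(positions[a] - positions[b]) <= window:
                    counts[(a, b)] += 1
    return counts

sents = ["The king rules the castle", "A king owns a castle", "The dragon sleeps"]
```

The thesis's point is precisely that such surface co-occurrence misses long-distance dependencies and genre effects, which motivates the discourse segmentation and genre classification methods described above.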
Vincent, Béatrice. "Contribution à la représentation des connaissances en comptabilité : formalisation et développement d'un système auto-adaptatif de formation." Toulouse 1, 1994. http://www.theses.fr/1994TOU10033.
This research work contributes to accounting knowledge by using a cognitive approach, that is to say, by studying the constitution and transmission of knowledge in accounting. A conceptual reflection on the teaching and learning process has led to the development of a model of a self-adaptive training system. Such a model relies on a hierarchical structure made up of three sub-systems: pedagogic, didactic and media-based. Each of these incorporates and employs decision processors able to analyse and diagnose situations that are still vague and not totally clear. Such decisions are based on building models of accounting and of the learner's evaluation. This conceptual part has been validated and enlarged by the development of a prototype training system, called an intelligent tutoring system, which was tested with various student populations. Such a system plays a significant part in the research method by validating and rebuilding the model. The experimentation enhanced the contribution of these original systems to present training structures and to the learning of accounting, opened up by an approach to the field based on a problematic of the constitution and transmission of knowledge.
Pautret, Vincent. "D'une méthodologie de modélisation des connaissances du domaine à sa formalisation au sein d'un agent rationnel dialoguant." Rennes 1, 2001. http://www.theses.fr/2001REN10082.
Le, Yaouanc Jean-Marie. "Formalisation de la description d'un environnement naturel. Application à la géo-localisation d'un individu." Phd thesis, Université de Bretagne occidentale - Brest, 2010. http://tel.archives-ouvertes.fr/tel-00529036.
Abeille, Joël. "Vers un couplage des processus de conception de systèmes et de planification de projets : formalisation de connaissances méthodologiques et de connaissances métier." Thesis, Toulouse, INPT, 2011. http://www.theses.fr/2011INPT0051/document.
The work presented in this thesis deals with aiding system design, development project planning and their coupling. Aiding design and planning is based on the formalization of two kinds of knowledge: methodological knowledge, which can be used in all kinds of design projects, and business knowledge, which is dedicated to a particular kind of design and/or planning. The first chapter presents a state of the art on coupling the system design process and the project planning process, and states the problem addressed by our work. Two parts then deal with design and planning coupling thanks to, on the one hand, methodological knowledge and, on the other hand, business knowledge. The first part presents three types of methodological coupling. The structural coupling defines design and planning entities and permits their simultaneous creation and association. The informational coupling defines feasibility and verification attributes for these entities and synchronizes their attribute states. Finally, the decisional coupling consists in proposing, in a single dashboard, the information necessary and sufficient for the design project actors to make a decision. The second part proposes to formalize, exploit and capitalize business knowledge. This knowledge is formalized with an ontology of concepts. Two mechanisms are then exploited: a case reuse mechanism that allows reusing and adapting former design projects, and a constraint propagation mechanism that allows propagating decisions from design to planning and reciprocally.
Charlot, Jean-Marc. "Formalisation et comparaison cognitives de modèles mentaux de novices et d'experts en situation de résolution de problèmes." Sherbrooke : Université de Sherbrooke, 1998.
Barrué, Jean-Pierre. "Fonctionnement des savoirs sur l'objet d'enseignement à l'intérieur du système didactique : étude, en éducation physique et sportive, de la diversité, fonctionnalité et complémentarité des savoirs sur l'objet athlétique." Toulouse 3, 1994. http://www.theses.fr/1994TOU30234.
Gouriveau, Rafael. "Analyse des risques : formalisation des connaissances et structuration des données pour l'intégration des outils d'étude et de décision." Toulouse, INPT, 2003. http://www.theses.fr/2003INPT035H.
Charlot, Jean-Marc. "Formalisation et comparaison cognitives de modèles mentaux de novices et d'experts en situation de résolution de problèmes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape11/PQDD_0004/NQ40514.pdf.
Semeraro, Concetta. "Contribution à la formalisation d'invariants de modélisation de systèmes cyber-physiques, dirigés par les données." Electronic Thesis or Diss., Université de Lorraine, 2020. http://www.theses.fr/2020LORR0029.
The digital transformation of collaborative networked manufacturing enterprises requires building and applying digital models representing the whole set of resource and process knowledge. Modelling such a digital copy of the physical system to perform real-time validation and optimization is quite complex, and thus needs a large amount of data and modelling patterns representing the operational semantics of the modelled elements. Generally, a modelling action targets one specific type of application. For this reason, the core challenge of digital transformation modelling is to create an invariant approach, namely a model that can be decomposed and re-composed across different applications. This PhD thesis aims at identifying and formalising modelling constructs contributing to building informational and functional models for improving the sustainability of manufacturing processes and products based on networked components. The constructs then allow representing knowledge and its deep relationship with the manufacturing processes. They make the shared knowledge more readily reusable and are at the basis of standardization efforts.
Gateau, Thibault. "Supervision de mission pour une équipe de véhicules autonomes hétérogènes." Thesis, Toulouse, ISAE, 2012. http://www.theses.fr/2012ESAE0038/document.
Many autonomous robots with specific control-oriented architectures have already been developed worldwide. The advance of work in this field has long led researchers to wonder to what extent robots could be integrated into a team consisting of autonomous and heterogeneous vehicles with complementary functionalities. However, robot cooperation in a real dynamic environment under unreliable communication conditions remains challenging, especially if these autonomous vehicles have different individual control architectures. In order to address this problem, we have designed a decision software architecture, distributed on each vehicle. This decision layer aims at managing execution and at increasing the fault tolerance of the global system. The mission plan is assumed to be hierarchically structured. In case of failure detection, the plan repair is done as locally as possible, based on the hierarchical organization. This allows us to restrict message exchange to the vehicles concerned by the repair process. Knowledge formalisation is also a part of the study, permitting improved interoperability between team members. It also provides relevant information all along mission execution, from initial planning computation to plan repair, in this multirobot context. The feasibility of the system has been evaluated by simulations and real experiments thanks to the Action project (http://action.onera.fr/welcome/).
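The principle of repairing a hierarchical plan "as locally as possible" can be sketched as: walk up from the failed task and replan at the lowest ancestor that offers an alternative, so only the vehicles under that subtree need to exchange messages (a toy structure with invented task names, not the project's actual architecture):

```python
# Each node: a name, child tasks (empty for primitive tasks), and an
# optional alternative decomposition to fall back on when the node fails.
class Task:
    def __init__(self, name, children=None, alternative=None):
        self.name = name
        self.children = children or []
        self.alternative = alternative
        self.parent = None
        for child in self.children:
            child.parent = self

def repair(failed_task):
    """Walk up from the failed task; repair at the lowest ancestor that
    offers an alternative decomposition, keeping the rest of the plan intact."""
    node = failed_task
    while node is not None:
        if node.alternative is not None:
            return node.name  # repair is local to this subtree
        node = node.parent
    return None  # no ancestor can repair: the whole mission fails

# Toy mission: survey an area with two vehicles; the UAV leg has a backup route.
goto = Task("goto_wp3")
uav_leg = Task("uav_survey", [goto], alternative="uav_survey_alt_route")
mission = Task("mission", [uav_leg, Task("ugv_patrol")])
```

Because the repair stops at `uav_survey`, the `ugv_patrol` subtree, and hence the ground vehicle, never needs to be contacted, which is the communication-saving property the abstract describes.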
Fayemi, Pierre-Emmanuel. "Innovation par la conception bio-inspirée : proposition d'un modèle structurant les méthodes biomimétiques et formalisation d'un outil de transfert de connaissances." Thesis, Paris, ENSAM, 2016. http://www.theses.fr/2016ENAM0062/document.
Biomimetics applies principles and strategies stemming from biological systems in order to facilitate technological design. Providing a high innovation potential, biomimetics could become a key process for various businesses. However, a few challenges still have to be overcome for bio-inspired design to become a sustainable approach. The work carried out here addresses the diffusion of bio-inspired design with two distinct focuses. First, it standardizes the conceptual fields of bio-inspiration and biomimetic process models, to enable the evaluation of the tools supporting the design process. This methodological assessment, addressed from both an objective and a subjective point of view, results in the formalization of a structuring model: a classification tree which guides designers through the biomimetic process. Alongside the establishment of this methodological reference framework, the work addresses another obstacle to the implementation of bio-inspired design: the interaction between biology and engineering. By developing a specific tool, the research offers a model which functionally describes biological systems without requiring biological expertise. Together, these contributions address the main issue of this disciplinary field: its development through the dissemination of its application to industrial innovation, in order to encourage the emergence of "biomimetic products" at the expense of "bio-inspired accidents".
Sarmiento, Lozano Camilo. "Formalisation des raisonnements éthiques : modélisation des processus en éthique et modélisation, représentation et automatisation du raisonnement causal." Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS047.
This thesis is in the field of computational ethics, which aims to formalise ethical reasoning. In other words, this work is part of the field that seeks to emulate our capacity as rational beings to morally evaluate an action. The formalisation of this reasoning has two objectives: to understand it better, and to integrate it into computer systems to ensure that the decisions made comply with chosen moral principles. This thesis contributes to the field in two ways. First, it proposes a common framework for faithfully formalising the moral principles most common in Western philosophy. This first contribution can be summarised as "modelling ethical processes". The second set of contributions pertains to a proposal for formalising causal reasoning. This formalisation not only enhances our comprehension of this reasoning but also enables its integration into computer systems, facilitating the establishment of complex causal relationships. This capability is crucial for formalising a wide range of moral principles. To ensure that our proposal can formalise all these moral principles, we have designed it to satisfy a number of conditions. First, our formalisation is based on a formalism that explicitly addresses the subtleties of problems related to both causal and ethical reasoning. Second, our formalism's definition of causality is free of any confusion with the notion of responsibility; otherwise, it could not serve as a common basis for formalising all moral principles. Finally, our proposal can handle all causal cases, including the most complex. This second group of contributions focuses on "modelling, representing and automating causal reasoning".
Muro, Amuchastegui Naiara. "Développement d'un système avancé d'aide à la décision clinique : enrichir la connaissance issue des guides de pratique clinique avec l'expérience." Thesis, Sorbonne université, 2019. http://www.theses.fr/2019SORUS266.
Full text
Evidence-Based Medicine has been formalized as Clinical Practice Guidelines, which define workflows and recommendations to be followed in a given clinical domain. These documents were written to standardize healthcare and seek the best patient outcomes. Nevertheless, clinicians do not adhere to these guidelines as expected, owing to several clinical and implementation limitations. On the one hand, clinicians may be unfamiliar with, disagree with, or simply be unaware of guidelines, and thus doubt their self-efficacy and outcome expectancy compared with previous or more common practices. On the other hand, keeping guidelines up to date with the most recent evidence requires continuous versioning of these paper-based documents. Clinical Decision Support Systems are proposed to help during the clinical decision-making process: a computerized implementation of the guidelines promotes easy consultation and increased compliance. Even if these systems improve guideline compliance, some barriers inherited from paper-based guidelines remain unsolved, such as managing complex cases not covered by the guidelines, or the lack of representation of external factors (e.g. patient preferences) that may influence the provided treatments and cause deviations from the guidelines' recommendations. Retrieving observational data and patients' quality-of-life outcomes related to the healthcare provided during routine clinical practice could help identify and overcome these limitations, and would generate Real World Data that represents the real population and goes beyond the limitations of the knowledge reported in Randomized Clinical Trials. This thesis proposes an advanced Clinical Decision Support System that copes with the limitations of purely guideline-based support and goes beyond the formalized knowledge by analyzing the clinical data, outcomes, and performance of all the decisions made over time.
To achieve these objectives, an approach for modeling clinical knowledge and performance in a semantically validated and computerized way is presented, relying on an ontology and on the formalization of the Decisional Event concept. Moreover, a domain-independent framework has been implemented to ease the process of computerizing, updating and implementing Clinical Practice Guidelines within a Clinical Decision Support System, in order to provide clinical support for any queried patient. To address the reported guideline limitations, a methodology for augmenting the clinical knowledge using experience is presented, along with an evaluation of clinical performance and quality over time based on different studied clinical outcomes, such as the usability and the strength of the rules, for assessing the clinical reliability of the formalized clinical knowledge. Finally, the accumulated Real World Data is explored to support future cases, promoting the study of new clinical hypotheses and helping to detect trends and patterns in the data using visual analytics tools. Most of the presented modules were developed and implemented within the European Horizon 2020 project DESIREE, whose use case focused on supporting Breast Units during decision-making for the management of Primary Breast Cancer patients; a technical and clinical validation of the presented architecture was performed, and its results are presented in this thesis. Nevertheless, some of the modules have also been used in other medical domains, such as the development of Gestational Diabetes guidelines, highlighting the interoperability and flexibility of the presented work.
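As a minimal illustration of what a computerized guideline rule might look like, the sketch below encodes rules as condition/recommendation pairs and matches them against a patient record. All rule content and field names are invented for illustration; they are not taken from the DESIREE project or any real guideline.

```python
# Hypothetical sketch of a guideline-as-rules representation; the clinical
# content below is invented and carries no medical authority.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    condition: Callable[[Dict], bool]   # predicate over patient data
    recommendation: str

def recommend(rules: List[Rule], patient: Dict) -> List[str]:
    """Return the recommendations of every rule whose condition matches."""
    return [r.recommendation for r in rules if r.condition(patient)]

rules = [
    Rule("small_tumour_surgery",
         lambda p: p["tumour_size_mm"] < 20 and not p["metastatic"],
         "conservative surgery"),
    Rule("her2_targeted_therapy",
         lambda p: p["her2_positive"],
         "consider HER2-targeted therapy"),
]

patient = {"tumour_size_mm": 15, "metastatic": False, "her2_positive": True}
print(recommend(rules, patient))  # both illustrative rules fire
```

A patient falling outside every rule's condition yields an empty list, which is exactly the "complex case not covered by the guidelines" limitation the abstract points at.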
Bévo, Wandji Évariste Valéry. "Analyse et formalisation ontologique des procédures de mesure associées aux méthodes de mesure de la taille fonctionnelle des logiciels : de nouvelles perspectives pour la mesure /." Montréal : Université du Québec à Montréal, 2005. http://accesbib.uqam.ca/cgi-bin/bduqam/transit.pl?&noMan=24144802.
Full text
At head of title: Université du Québec à Montréal. Bibliography: leaves [172]-186. Also published in electronic form.
Abadie, Nathalie. "Formalisation, acquisition et mise en œuvre de connaissances pour l'intégration virtuelle de bases de données géographiques : les spécifications au cœur du processus d'intégration." Phd thesis, Université Paris-Est, 2012. http://tel.archives-ouvertes.fr/tel-00794395.
Full text
Abadie, Nathalie. "Formalisation, acquisition et mise en œuvre de connaissances pour l'intégration virtuelle de bases de données géographiques : les spécifications au cœur du processus d'intégration." Thesis, Paris Est, 2012. http://www.theses.fr/2012PEST1054/document.
Full text
This PhD thesis deals with topographic database integration. This process aims at facilitating the use of several heterogeneous databases by making the relationships between them explicit. To achieve database integration automatically, several aspects of data heterogeneity must be detected and solved. Identifying heterogeneities between topographic databases implies comparing some knowledge about their respective contents. Therefore, we propose to formalise and acquire this knowledge and to use it for topographic database integration. Our work focuses on the specific problem of topographic database schema matching, as a first step in an integration application. To reach this goal, we propose to use a specific knowledge source, namely the database specifications, which describe the rules according to which data are captured. Firstly, they are used as the main resource of the knowledge acquisition process in an ontology learning application. In a first approach to schema matching, the domain ontology created from the texts of IGN's database specifications is used as a background knowledge source in a schema matching application based on terminological and structural matching techniques. In a second approach, this ontology is used to support the representation, in the OWL 2 language, of the topographic entity selection and geometry capture rules described in the database specifications. This knowledge is then used by a reasoner in a semantic-based schema matching application.
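As a rough idea of what a terminological matching step looks like (a simplified sketch, not the thesis's actual pipeline), the code below pairs schema element names by normalised string similarity; the schema names are invented, loosely evoking topographic classes.

```python
# Illustrative terminological matcher: pairs schema elements whose names
# are similar enough. Schemas and threshold are invented for the example.

from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalised string similarity between two element names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_schemas(schema_a, schema_b, threshold=0.7):
    """Return candidate correspondences (name_a, best_name_b, score)."""
    matches = []
    for a in schema_a:
        best = max(schema_b, key=lambda b: similarity(a, b))
        score = similarity(a, best)
        if score >= threshold:
            matches.append((a, best, round(score, 2)))
    return matches

schema_a = ["Road", "Building", "WaterCourse"]
schema_b = ["road_section", "building", "water_course"]
print(match_schemas(schema_a, schema_b))
```

Purely lexical techniques like this one miss correspondences with different vocabularies, which is why the thesis backs them with a domain ontology and, in the second approach, with OWL 2 reasoning over capture rules.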
Thiefaine, Arnaud. "Caractérisation et formalisation du processus de développement construit autour du paradigme de l'ingénierie des modèles : vers une ingénierie des modèles d'ordre 2." Paris 6, 2005. http://www.theses.fr/2005PA066251.
Full text
Schmeltzer, Olivier. "Modélisation de cartes génomiques : une formalisation et un algorithme de construction fondé sur le raisonnement temporel." Phd thesis, Université Joseph Fourier (Grenoble), 1995. http://tel.archives-ouvertes.fr/tel-00005062.
Full text
Reynaud, Christian. "Contribution à la formalisation et à la communication d'un concept d'écologie des milieux littoraux : les écosystèmes paraliques - interprétation épistémologique et propositions didactiques." Montpellier 2, 1997. http://www.theses.fr/1997MON20039.
Full text
Cazalens, Sylvie. "Formalisation en logique non standard de certaines méthodes de raisonnement pour fournir des réponses coopératives, dans des systèmes de bases de données et de connaissances." Toulouse 3, 1992. http://www.theses.fr/1992TOU30172.
Full text
Léger, Aurélie. "Contribution à la formalisation unifiée des connaissances fonctionnelles et organisationnelles d'un système industriel en vue d'une évaluation quantitative des risques et de l'impact des barrières envisagées." Phd thesis, Université Henri Poincaré - Nancy I, 2009. http://tel.archives-ouvertes.fr/tel-00417164.
Full text
Léger, Aurélie Iung Benoît. "Contribution à la formalisation unifiée des connaissances fonctionnelles et organisationnelles d'un système industriel en vue d'une évaluation quantitative des risques et de l'impact des barrières envisagées." S. l. : S. n, 2009. http://www.scd.uhp-nancy.fr/docnum/SCD_T_2009_0058_LEGER.pdf.
Full text
Ruin, Thomas. "Contribution à la quantification des programmes de maintenance complexes." Electronic Thesis or Diss., Université de Lorraine, 2013. http://www.theses.fr/2013LORR0202.
Full text
To cope with the new legislative and environmental contexts in which they operate, industrial systems must now satisfy many different requirements and constraints: not only conventional ones such as availability and costs, but also emergent ones such as safety and sustainability. For the French energy supplier EDF, this implies evolving from its usual reliability-centered maintenance (RCM) approach to a new one, consisting mainly in developing a tool able to support Complex Maintenance Programs Quantification (CMPQ). This Ph.D. deals with the engineering and deployment of this tool in the frame of the GIS 3SGS - DEPRADEM 2 project. The first step of the work is to generalize EDF's needs, then to propose a framework enabling the identification of the generic knowledge required to assess the Key Performance Indicators (KPIs) supporting quantification. The next step is to model this generic knowledge in two complementary ways: a formalization of the static, interactional and behavioral knowledge based on different SysML diagrams; and a formalization of the dynamic and executable knowledge in the AltaRicaDF (ADF) language, allowing stochastic simulation to be performed and the required KPIs to be assessed. The dynamic executable vision is derived from the SysML diagrams by means of mapping rules between the elements of interest of both languages. The whole approach/tool is applied to a specific EDF case study: the ARE system.
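To make the "stochastic simulation to assess KPIs" step concrete, here is a minimal Monte Carlo sketch (an assumption for illustration, unrelated to the actual ADF models or EDF figures): a single repairable component with exponential up- and down-times, simulated to estimate an availability KPI.

```python
# Hedged sketch: Monte Carlo estimation of an availability KPI for one
# repairable component. MTBF/MTTR values are invented for the example.

import random

def simulate_availability(mtbf, mttr, horizon, runs=2000, seed=42):
    """Estimate average availability over `horizon` hours by simulating
    alternating exponential up-times and repair-times."""
    rng = random.Random(seed)
    total_up = 0.0
    for _ in range(runs):
        t, up = 0.0, 0.0
        while t < horizon:
            ttf = rng.expovariate(1.0 / mtbf)   # time to failure
            up += min(ttf, horizon - t)
            t += ttf
            if t >= horizon:
                break
            t += rng.expovariate(1.0 / mttr)    # repair duration
        total_up += up / horizon
    return total_up / runs

# Steady-state availability is MTBF/(MTBF+MTTR) = 0.99 for these values,
# and the simulated estimate should land close to it.
print(round(simulate_availability(mtbf=990.0, mttr=10.0, horizon=10000.0), 3))
```

A real CMPQ model composes many such components with maintenance policies, which is exactly what a dedicated language like ADF expresses more conveniently than hand-written loops.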
Full text
Ruin, Thomas. "Contribution à la quantification des programmes de maintenance complexes." Phd thesis, Université de Lorraine, 2013. http://tel.archives-ouvertes.fr/tel-00944825.
Full textPotes, Ruiz Paula Andrea. "Génération de connaissances à l’aide du retour d’expérience : application à la maintenance industrielle." Thesis, Toulouse, INPT, 2014. http://www.theses.fr/2014INPT0089/document.
Full text
The research work presented in this thesis relates to knowledge extraction from past experiences in order to improve the performance of industrial processes. Knowledge is nowadays considered an important strategic resource providing a decisive competitive advantage to organizations. Knowledge management (especially experience feedback) is used to preserve and enhance the information related to a company’s activities in order to support decision-making and create new knowledge from the intangible heritage of the organization. In that context, advances in information and communication technologies play an essential role in gathering and processing knowledge. The generalized implementation of industrial information systems such as ERPs (Enterprise Resource Planning) makes available a large amount of data related to past events or historical facts, whose reuse is becoming a major issue. However, these fragments of knowledge (past experiences) are highly contextualized and require specific methodologies to be generalized. Taking into account the great potential of the information collected in companies as a source of new knowledge, we suggest in this work an original approach to generate new knowledge from the analysis of past experiences, exploiting the complementarity of two scientific threads: Experience Feedback (EF) and Knowledge Discovery in Databases (KDD). The suggested EF-KDD combination focuses mainly on: i) modelling the collected experiences using a knowledge representation formalism in order to facilitate their future exploitation, and ii) applying data mining techniques in order to extract new knowledge in the form of rules. These rules must be evaluated and validated by experts of the industrial domain before their reuse and/or integration into the industrial system.
Throughout this approach, we have given a privileged position to Conceptual Graphs (CGs), the knowledge representation formalism chosen to facilitate the storage, processing and understanding of the extracted knowledge by the user for future exploitation. This thesis is divided into four chapters. The first chapter is a state of the art addressing the generalities of the two scientific threads that contribute to our proposal: EF and KDD. The second chapter presents the suggested EF-KDD approach and the tools used for the generation of new knowledge, in order to exploit the available information describing past experiences. The third chapter suggests a structured methodology for interpreting and evaluating the usefulness of the extracted knowledge during the post-processing phase of the KDD process. Finally, the last chapter discusses real case studies from the industrial maintenance domain, to which the proposed approach has been applied.
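To illustrate "extracting new knowledge in the form of rules" from past experiences, the sketch below (an invented toy example, not the thesis's CG-based tooling) mines simple association rules with support and confidence from a table of maintenance experiences.

```python
# Toy association rule mining over invented maintenance experiences:
# each experience is a set of observed symptoms/diagnoses.

from itertools import combinations

def mine_rules(transactions, min_support=0.4, min_confidence=0.8):
    """Return rules (antecedent, consequent, support, confidence) mined
    from item pairs."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    rules = []
    for a, b in combinations(items, 2):
        for ante, cons in ((a, b), (b, a)):
            both = sum(1 for t in transactions if ante in t and cons in t)
            ante_n = sum(1 for t in transactions if ante in t)
            if ante_n == 0:
                continue
            support, confidence = both / n, both / ante_n
            if support >= min_support and confidence >= min_confidence:
                rules.append((ante, cons, round(support, 2), round(confidence, 2)))
    return rules

experiences = [
    {"vibration_high", "bearing_wear"},
    {"vibration_high", "bearing_wear"},
    {"vibration_high", "bearing_wear", "overheating"},
    {"overheating", "lubrication_fault"},
    {"vibration_high"},
]
# One rule survives the thresholds: bearing_wear -> vibration_high.
print(mine_rules(experiences))
```

Such candidate rules are exactly the objects that, per the approach above, must then be interpreted and validated by domain experts before reuse.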
Laloix, Thomas. "Méthodologie d’élaboration d’un bilan de santé de machines de production pour aider à la prise de décision en exploitation : application à un centre d’usinage à partir de la surveillance des composants de sa cinématique." Thesis, Université de Lorraine, 2018. http://www.theses.fr/2018LORR0291/document.
Full text
This PhD work was initiated by Renault, in collaboration with the Nancy Research Centre in Automatic Control (CRAN), with the aim of laying the foundation of a generic PHM-based methodology leading to a machine health check that jointly considers machine and product while facing industrial requirements. The proposed PHM-based methodology is structured in five steps. The first two steps are developed in this PhD work and constitute its major contributions. The first original contribution is the formalization of machine-product relationship knowledge, based on the extension of well-known functioning/dysfunctioning analysis methods. The formalization is materialized by means of meta-modelling based on UML (Unified Modelling Language). This contribution leads to the identification of relevant parameters to be monitored, from the component level up to the machine level. These parameters serve as the basis of the machine health check elaboration. The second major contribution aims at defining health check elaboration principles from the previously identified monitoring parameters and the formalized system knowledge. The elaboration of such health indicators is based on the Choquet integral as an aggregation method, which raises the issue of capacity identification. A global optimization model of capacity identification across the system's multiple levels is therefore proposed, using Genetic Algorithms. Both contributions are developed with the objective of being generic (not oriented towards one specific class of equipment), in accordance with industrial needs. The feasibility and interest of such an approach are shown on the case of a machine tool located in the RENAULT Cléon factory.
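The Choquet integral mentioned above aggregates criterion scores with a capacity (a set function) that can reward joint behaviour of criteria. The sketch below computes a discrete Choquet integral; the capacity values and component names are invented for illustration, whereas in the thesis the capacity would be identified by a genetic algorithm.

```python
# Hedged sketch: discrete Choquet integral for health-indicator aggregation.
# Capacity values below are invented; the thesis identifies them by GA.

def choquet(scores, capacity):
    """Aggregate criterion scores with a capacity.
    `scores`: dict criterion -> value in [0, 1].
    `capacity`: dict frozenset -> weight, monotone, with
    capacity(empty set) = 0 and capacity(full set) = 1."""
    ordered = sorted(scores, key=scores.get)   # criteria by increasing score
    total, prev = 0.0, 0.0
    for i, c in enumerate(ordered):
        coalition = frozenset(ordered[i:])     # criteria scoring >= current
        total += (scores[c] - prev) * capacity[coalition]
        prev = scores[c]
    return total

# Two monitored components: the capacity rewards their *joint* health,
# so one degraded component drags the machine-level indicator down.
capacity = {
    frozenset(): 0.0,
    frozenset({"spindle"}): 0.3,
    frozenset({"axis"}): 0.3,
    frozenset({"spindle", "axis"}): 1.0,
}
print(choquet({"spindle": 0.9, "axis": 0.5}, capacity))  # 0.62
```

With a purely additive capacity this would reduce to a weighted mean; the sub-additive singleton weights are what let the aggregation express interaction between components, which is the point of using the Choquet integral here.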
Traore, Lamine. "Semantic modeling of an histopathology image exploration and analysis tool." Thesis, Paris 6, 2017. http://www.theses.fr/2017PA066621/document.
Full text
Semantic modelling of a histopathology image exploration and analysis tool. Recently, anatomic pathology (AP) has seen the introduction of several tools such as high-resolution histopathological slide scanners, efficient software viewers for large-scale histopathological images, and virtual slide technologies. These initiatives created the conditions for a broader adoption of computer-aided diagnosis based on whole slide images (WSI), with the hope of a possible contribution to decreasing inter-observer variability. Besides this, automatic image analysis algorithms represent a very promising way to support pathologists’ laborious tasks during the diagnosis process. Similarly, in order to reduce inter-observer variability between AP reports of malignant tumours, the College of American Pathologists edited 67 organ-specific Cancer Checklists and associated Protocols (CAP-CC&P). Each checklist includes a set of AP observations that are relevant in the context of a given organ-specific cancer and have to be reported by the pathologist; the associated protocol includes interpretation guidelines for most of the required observations. All these changes and initiatives raise a number of scientific challenges, such as the sustainable management of the available semantic resources associated with the diagnostic interpretation of AP images by both humans and computers. In this context, reference vocabularies and a formalization of the associated knowledge are especially needed to annotate histopathology images with labels complying with semantic standards. In this research work, we present our contribution in this direction: we propose a sustainable way to bridge the content, features, performance and usability gaps between histopathology and WSI analysis.
Liao, Yongxin. "Annotations sémantiques pour l'interopérabilité des systèmes dans un environnement PLM." Phd thesis, Université de Lorraine, 2013. http://tel.archives-ouvertes.fr/tel-00904822.
Full text
Laloix, Thomas. "Méthodologie d’élaboration d’un bilan de santé de machines de production pour aider à la prise de décision en exploitation : application à un centre d’usinage à partir de la surveillance des composants de sa cinématique." Electronic Thesis or Diss., Université de Lorraine, 2018. http://www.theses.fr/2018LORR0291.
Full text
Ferreirone, Mariano. "Extraction and integration of constraints in multiple data sources using ontologies and knowledge graphs." Electronic Thesis or Diss., Université de Lorraine, 2025. http://www.theses.fr/2025LORR0013.
Full text
This thesis explores the introduction of Shapes Constraint Language (SHACL) graphs in semantic environments that present heterogeneous sources and different context requirements. This research proposes an enrichment of Semantic Web based systems, providing benefits for several domains such as Industry 4.0. The thesis starts with a wide review of current work related to the validation of semantic constraints on knowledge representation models. Based on a systematic literature review, a taxonomy describing the related types of work is proposed. The open challenges related to the creation of shape graphs and their inclusion in existing semantic environments are highlighted; in particular, the need for a shape graph representation able to accommodate different contexts, and for the integration of shape graphs, stands out. Based on the Shapes Constraint Language standards, a semantic restriction model is presented that represents groups of shapes which share a target and may hold inter-shape conflicts. A pre-validation configuration process activates the model's sub-graph that best fits the current context. Moreover, an approach for the integration of these graphs within a common environment is proposed; the integration procedure resolves constraint conflicts through the specialization of shapes. A practical use case based on the French Ski School demonstrates the usability of the proposed contributions. The correctness and consistency of the generated shape graph are evaluated, as is the performance of the implemented procedures. The thesis concludes by summarizing the contributions and suggesting future research directions to further improve the integration and representation of Shapes Constraint Language graphs.
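To convey the idea of shapes sharing a target but belonging to different context groups, here is a deliberately simplified pure-Python analogue (not SHACL syntax, not pyshacl, and not the thesis's implementation; the ski-related data only loosely echoes the use case and is entirely invented): two "shapes" target the same node class, and a pre-validation step picks which one is active for the current context.

```python
# Simplified analogue of context-dependent shape validation. Shapes, slope
# ordering and skier data are all invented for illustration.

from typing import Any, Dict

shapes = {
    # Two shape groups sharing the same target class, for different contexts.
    "beginner": {"target": "Skier", "min_lessons": 0, "max_slope": "green"},
    "advanced": {"target": "Skier", "min_lessons": 10, "max_slope": "black"},
}

SLOPE_ORDER = ["green", "blue", "red", "black"]

def validate(node: Dict[str, Any], shape: Dict[str, Any]) -> bool:
    """Check one node against one activated shape."""
    if node.get("type") != shape["target"]:
        return True                      # shape does not target this node
    ok_lessons = node.get("lessons", 0) >= shape["min_lessons"]
    ok_slope = (SLOPE_ORDER.index(node.get("slope", "green"))
                <= SLOPE_ORDER.index(shape["max_slope"]))
    return ok_lessons and ok_slope

skier = {"type": "Skier", "lessons": 3, "slope": "red"}
# Pre-validation configuration: activate the shape matching the context.
print(validate(skier, shapes["beginner"]))  # red slope violates beginner shape
print(validate(skier, shapes["advanced"]))  # too few lessons for advanced shape
```

The conflict between the two groups (the same skier fails both for different reasons) is the kind of inter-shape tension that the thesis's integration procedure resolves by specializing shapes.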
Chevrou, Florent. "Formalisation of asynchronous interactions." Phd thesis, Toulouse, INPT, 2017. http://oatao.univ-toulouse.fr/19493/1/CHEVROU_Florent.pdf.
Full text