Dissertations / Theses on the topic 'Gestion des données (systèmes d'information)'
Consult the top 50 dissertations / theses for your research on the topic 'Gestion des données (systèmes d'information).'
Pollet, Yann. "Un système de gestion de bases de données opérationnelles pour les systèmes d'information et de communication." Rouen, 1995. http://www.theses.fr/1995ROUES047.
Peerbocus, Mohamed Ally. "Gestion de l'évolution spatiotemporelle dans une base de données géographiques." Paris 9, 2001. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2001PA090055.
Hajji, Hicham. "Gestion des risques naturels : une approche fondée sur l'intégration des données." Lyon, INSA, 2005. http://theses.insa-lyon.fr/publication/2005ISAL0039/these.pdf.
There is a huge amount of geographic data available, with many organizations having collected geographic data for centuries. Some of it is still in the form of paper maps or in traditional files or databases, while, with the emergence of new software and data storage technologies, some has been digitized and is stored in modern GIS systems. Too often, however, its reuse for new applications is a nightmare, due to the diversity of data sets and the heterogeneity of existing systems in terms of data modeling concepts, data encoding techniques, obscure data semantics, storage structures, access functionality, etc. Such difficulties are especially common in natural hazards information systems. In order to support advanced natural hazards management based on heterogeneous data, this thesis develops a new approach to the integration of semantically heterogeneous geographic information, capable of addressing both the spatial and thematic aspects of geographic information. The approach is based on the OpenGIS standard, used as a common model for data integration. The proposed methodology takes into consideration a large number of the aspects involved in the construction and modelling of a natural hazards management information system. Another issue addressed in this thesis is the design of an ontology for natural hazards. Ontology design has been extensively studied in recent years; throughout this work we propose an ontology to deal with the semantic heterogeneity existing between different actors and to model the existing knowledge on this issue. The ontology contains the main concepts and the relationships between these concepts, expressed in the OWL language.
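As a rough illustration of the kind of concept-and-relationship structure such a hazards ontology captures (the concept and relation names below are invented for the example, not taken from the thesis), a minimal triple store might look like:

```python
# A tiny triple store: concepts and relationships as (subject, predicate, object).
# Concept and relation names are invented for illustration.

class Ontology:
    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def objects(self, s, p):
        # All objects related to subject s by predicate p.
        return {o for (s2, p2, o) in self.triples if s2 == s and p2 == p}

onto = Ontology()
onto.add("Flood", "is_a", "NaturalHazard")
onto.add("Wildfire", "is_a", "NaturalHazard")
onto.add("Flood", "affects", "RiverBasin")
```

An OWL ontology adds class hierarchies, property restrictions and reasoning on top of this triple structure; the sketch only shows the underlying representation.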
Petiot, Guillaume. "Fusion d'informations symboliques et de données numériques pour la gestion des crues." Toulouse 3, 2005. http://www.theses.fr/2005TOU30291.
Boumediene, Mohamed Salah. "Définition d'un système générique de partage de données entre systèmes existants." Lyon, INSA, 2005. http://theses.insa-lyon.fr/publication/2005ISAL0125/these.pdf.
My thesis deals with database integration problems and the confidentiality of exchanged data. My aim is to solve the problems related to mediator schema creation. We proposed a solution which generates a global view of the different databases while considerably reducing manual intervention. To achieve this, we first describe each schema using ontological terms. This description creates, for each database, an XML file which is then used for the creation of the mediator schema and the matching rules. To exploit the mediator schema, we created a mediator that allows users to query the different databases through the global view. To lighten the data input process, we used the DRUID system, which allows users to enter their data in the form of files that are then processed to populate the databases. To handle the confidentiality of data entry and access, we proposed the use of DTD document models tailored to each type of user profile, whether for writing or reading files. These DTDs are generated automatically from the database schema and then modified for each user type according to their rights on the database. Our solution was applied in the medical domain through the consultation of a distributed medical record.
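The per-profile restriction of document models described above can be sketched as follows. The schema, profiles and rights are invented for illustration, and a real implementation would emit DTD syntax rather than field lists:

```python
# Restrict a table's fields to those a user profile may access.
# Schema, profiles and rights are invented; a real generator would emit DTDs.

schema = {"patient": ["name", "birth_date", "diagnosis", "billing_code"]}

read_rights = {
    "physician": {"name", "birth_date", "diagnosis"},
    "accountant": {"name", "billing_code"},
}

def profile_model(table, profile):
    # Keep schema order, filtered by the profile's rights.
    return [f for f in schema[table] if f in read_rights[profile]]
```

Deriving the restricted model from the schema plus a rights table, rather than maintaining one DTD per profile by hand, is what keeps the profiles consistent when the schema evolves.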
Tea, Céline. "Retour d'expérience et données subjectives : quel système d'information pour la gestion des risques ?" Phd thesis, Paris, ENSAM, 2009. http://pastel.archives-ouvertes.fr/pastel-00005574.
Full textLevy, Philippe. "Développement de produits en ingénierie concourante : système d'information pour la gestion des problèmes d'interfaces." Aix-Marseille 3, 2000. http://www.theses.fr/2000AIX30057.
The development of complex products in a concurrent engineering approach implies the decomposition of the products into subsets, then their integration, and the coordination of the teams responsible for the subsets. However, many imperfections in the coordination of these teams surface during the advanced phases of development as technical problems called interface problems, penalizing in terms of cost and time. We were able to identify the origin of the coordination problems, and we noted the current lack of approaches contributing to the management of interfaces. It then appeared important to analyse the information requirements, the transactions and the communication necessary for the teams to make the right decisions.
Dubois, Gilles. "Apport de l'intelligence artificielle à la coopération de systèmes d'information automatisée." Lyon 3, 1997. http://www.theses.fr/1997LYO33004.
Recent advances in distributed systems, computer networks and database technology have changed the information processing needs of organizations. Current information systems should integrate various heterogeneous sources of data and knowledge according to distributed logical and physical requirements. An automated information system is perceived as a set of autonomous components which work in a synergistic manner by exchanging information and expertise and coordinating their activities. In order for this exchange to be judicious, the individual systems must agree on the meaning of the information they exchange, to solve conflicts due to heterogeneity. We have chosen an object-oriented model as the canonical model. The object model overcomes component heterogeneity and respects the autonomy of local systems in a distributed context. The cooperation structure uses artificial intelligence techniques to solve both structural and semantic conflicts. A dynamic description of information sources deals with local evolution and is involved in the treatment of global queries. An extension of the proposal exploits agent interactions to bring cognitive capabilities to the cooperation structure. The contribution of multi-agent systems to information system cooperation is discussed, and the technical choices made to implement a prototype in an object-oriented environment are described.
Meyer, Elise. "Acquisition 3D, documentation et restitution en archéologie : proposition d'un modèle de Système d'Information dédié au patrimoine." Thesis, Nancy 1, 2007. http://www.theses.fr/2007NAN10109/document.
The documentation of archaeological heritage is an activity that evolves with the development of the New Information and Communication Technologies (NICT). Traditionally associated with recording, the documentation of an archaeological site is today also synonymous with publication, because it can be disseminated online both to other professionals and to the general public. This PhD thesis proposes a model of Information System dedicated to the online documentation of heritage sites. It allows recording, managing and representing traditional documents, data coming from two- and three-dimensional surveys, and also the results of restitution and imagery work. The study first establishes a state of the art identifying the current means and needs of heritage professionals in terms of conservation, visualization and publication of their data. Our approach then draws on these concerns to define the features of the proposed Information System. On the basis of examples drawn from Luxembourg heritage (the Castle of Vianden and the Villa of Echternach), we describe the way we store the data and the associated metadata, as well as the tools developed for representing this information. We also present our principles of data management, based on the spatiotemporal connections that may exist between the various documents. These connections allow us to propose the use of two-dimensional graphics or three-dimensional models as privileged supports for navigating and interacting with all the other preserved documents. A global modeling of the Information System, which can serve as a metamodel for online documentation systems, finally allows us to extend our scope to other domains such as architecture or civil engineering.
El, Khalkhali Imad. "Système intégré pour la modélisation, l'échange et le partage des données de produits." Lyon, INSA, 2002. http://theses.insa-lyon.fr/publication/2002ISAL0052/these.pdf.
In Virtual Enterprise and Concurrent Engineering environments, a wide variety of information is used. A crucial issue is data communication and exchange between heterogeneous systems and distant sites. To solve this problem, the STEP project was introduced. The STandard for the Exchange of Product model data (STEP) is an evolving international standard for the representation and exchange of product data. The objective of STEP is to provide an unambiguous, computer-interpretable representation of product data in all phases of the product's lifecycle. In collaborative product development, experts from different disciplines are concerned with the product (design, manufacturing, marketing, customers, etc.). Each of these experts has his own viewpoint on the same product, and STEP models are unable to represent these expert viewpoints. The objective of our research work is to propose a methodology for the representation and integration of different expert viewpoints in the design and manufacturing phases. An information infrastructure for modelling, exchanging and sharing product data models is also proposed.
Labbé, Cyril. "Le sens au coeur des systèmes d'information." Habilitation à diriger des recherches, Université de Grenoble, 2010. http://tel.archives-ouvertes.fr/tel-00809360.
Follin, Jean-Michel. "Gestion incrémentale de données multi-résolutions dans un système mobile de visualisation d'informations géographiques." La Rochelle, 2004. http://www.theses.fr/2004LAROS131.
We propose a solution for the presentation and management of vector multi-resolution geodata that takes into account the constraints of the mobile context (limited storage, display capacity and transfer rate). Our solution provides users with the LoD ("Level of Detail") appropriate to the display scale, respecting the well-known principle of constant data density. The amount of data exchanged between client and server is minimized in our system by reusing already locally available data whenever possible. An increment corresponds to an operation sequence allowing the reconstruction of a LoD of an object from a LoD of the same object already available on the client side. Transferring only the increment is more interesting than downloading an "entire" LoD object. We present models of multi-resolution data and transfer, along with principles allowing incremental data management in a mobile geodata visualization system. We then demonstrate the advantage of our multi-resolution strategy over a mono-resolution one.
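The increment idea can be sketched as follows, under the simplifying assumption that a coarser LoD's operation sequence is a prefix of the finer one's (the operation names are invented; the thesis's actual operations are geometric generalization steps):

```python
# Transfer only the operations the client is missing, not the whole LoD.
# Operation names are illustrative stand-ins for generalization steps.

def increment(ops_to_target, ops_available):
    """Remaining operations, assuming the target sequence extends
    the sequence already applied on the client."""
    if ops_to_target[:len(ops_available)] != ops_available:
        raise ValueError("client LoD is not derivable from the target sequence")
    return ops_to_target[len(ops_available):]

full_lod = ["simplify", "merge_rings", "collapse_small_areas"]
client_has = ["simplify"]
delta = increment(full_lod, client_has)  # only this is transferred
```

The saving comes from `delta` being shorter (hence cheaper to transfer) than the full operation sequence or the target geometry itself.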
Souris, Marc. "La construction d'un système d'information géographique : principes et algorithmes du système Savane." La Rochelle, 2002. http://www.theses.fr/2002LAROS087.
This thesis presents a work in computer science and software development. Its purpose is to answer the question: "How can a full geographic information system be built following the principles of database management, adapted to geographical data?" Using the complete example of the Savane system, we show how the general theory of geographical data and algorithms from computational geometry may be used to build GIS software. This work is part of a research program of the IRD (Institut de Recherche pour le Développement). The thesis presents the entire architecture, the methods and the algorithms of the system, explaining the choices made in its construction in different areas: the definition and use of geographical information; the principles of database management systems and their extension to geographical data; the algorithms implementing these principles in an information system; and the construction of an operational system built from these theoretical principles and the functional requirements of geography and development research projects.
Léveillé, Valérie. "De l'organisation des données dans les systèmes d'information. Réalisation d'un outil de gestion de données hétérogènes et formelles appliqué à la veille technologique." Aix-Marseille 3, 2000. http://www.theses.fr/2000AIX30003.
Boukhari, Sanâa. "Les facteurs explicatifs du comportement de contribution aux systèmes de gestion des connaissances intégratifs : le cas des bases électroniques de connaissances." Aix-Marseille 2, 2008. http://www.theses.fr/2008AIX24017.
This thesis studies knowledge sharing behavior through Knowledge Management Systems (KMS). More specifically, it intends to explain contribution behavior to Electronic Knowledge Repositories (EKR). The literature review allowed us to clarify and define the key concepts used in this work, and to propose and discuss a conceptual framework for the development of our research model. Following hypothetico-deductive reasoning, an explanatory model of contribution behavior to EKR is proposed. The model includes two dependent variables, "knowledge sharing" and "KMS use"; nine independent variables, namely motivational, cultural, relational, technological and informational; as well as six moderating variables. The model was then tested with 388 individuals belonging to two companies: a world leader in the food industry and a global company specializing in technology solutions. The data collected were analyzed using exploratory and confirmatory analyses based on structural equation modeling techniques. The results revealed several factors determining knowledge sharing behavior through EKR. These results enrich previous studies of knowledge sharing and provide managers with an explanation of contribution behavior to EKR.
Jugnet, Albane. "Evaluation des performances des données stratégiques non exprimées en termes de cout : une recherche à partir du cas de l'industrie de l'automobile." Paris 9, 2001. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2001PA090012.
Foissac, Véronique. "Conception et protection du système de gestion de données personnelles d'un établissement de santé." Montpellier 1, 2003. http://www.theses.fr/2003MON10002.
Full textHsu, Lung-Cheng. "Pbase : une base de données déductive en Prolog." Compiègne, 1988. http://www.theses.fr/1988COMPD126.
This thesis describes a relational database system coupling PROLOG II and VAX RMS (Record Management Services). The SQL-like DDL (Data Definition Language) and DML (Data Manipulation Language) are implemented in PROLOG, and the management of the storage and retrieval of fact records is delegated to RMS. The indexed file organization is adopted to provide satisfactory response time. An interface written in PASCAL enables the communication between PROLOG and RMS. Once the interface is established, access to the database is transparent; no precompilation is required. PBASE can be used as a general DBMS, or it can cooperate with an expert system (our SQL translation module can be considered as such) to manage voluminous facts stored in secondary memory. It can also cooperate with VAX RDB (Relational DataBase) to constitute a powerful deductive database. Although PBASE works for normalized as well as non-normalized relations, a normalization module is included to avoid the problems caused by data redundancy.
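The coupling described above can be caricatured in a few lines: a logic/query layer delegates fact storage and keyed retrieval to a record manager, here a dict-based index standing in for RMS indexed files (all names are illustrative, not PBASE's):

```python
# A query layer delegating fact storage to a keyed record manager,
# a dict-based stand-in for RMS indexed files. Names are illustrative.

class IndexedFacts:
    def __init__(self):
        self._index = {}  # primary key -> record

    def store(self, key, record):
        self._index[key] = record

    def fetch(self, key):
        # Keyed lookup, analogous to an indexed-file read.
        return self._index.get(key)

facts = IndexedFacts()
facts.store(("employee", 42), {"name": "Dupont", "dept": "R&D"})
```

The point of the indexed organization is exactly this: the logic layer never scans files, it asks the record manager for a key and gets the record back directly.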
Castelltort, Arnaud. "Historisation de données dans les bases de données NoSQL orientées graphes." Thesis, Montpellier 2, 2014. http://www.theses.fr/2014MON20076.
This thesis deals with data historization in the context of graphs. Graph data have been dealt with for many years, but their exploitation in information systems, especially in NoSQL engines, is recent. The emerging Big Data and 3V contexts (Variety, Volume, Velocity) have revealed the limits of classical relational databases. Historization, for its part, was long considered to be linked only with technical and backup issues, and more recently with decisional needs (Business Intelligence). However, historization is now taking on more and more importance in management applications. In this framework, graph databases, although often used, have received little attention regarding historization. Our first contribution consists in studying the impact of historized data in management information systems. This analysis relies on the hypothesis that historization is becoming increasingly important. Our second contribution proposes an original model for managing historization in NoSQL graph databases. This proposition consists, on the one hand, in elaborating a unique and generic system for representing the history and, on the other hand, in proposing query features. We show that the system can support both simple and complex queries. Our contributions have been implemented and tested over synthetic and real databases.
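One common way to historize a graph, sketched here for illustration only (the thesis proposes its own, more general model), is to stamp each relationship with a validity interval and answer queries "as of" a given time:

```python
# Historized graph: each edge carries a validity interval [t_start, t_end).
# A generic illustration, not the thesis's specific model.
import math

class HistorizedGraph:
    def __init__(self):
        self.edges = []  # [src, label, dst, t_start, t_end]

    def add_edge(self, src, label, dst, t):
        self.edges.append([src, label, dst, t, math.inf])

    def close_edge(self, src, label, dst, t):
        # End the currently open edge instead of deleting it.
        for e in self.edges:
            if e[:3] == [src, label, dst] and e[4] == math.inf:
                e[4] = t

    def as_of(self, t):
        # Snapshot of the graph at time t.
        return [(s, l, d) for s, l, d, t0, t1 in self.edges if t0 <= t < t1]

g = HistorizedGraph()
g.add_edge("alice", "WORKS_AT", "acme", t=1)
g.close_edge("alice", "WORKS_AT", "acme", t=5)
g.add_edge("alice", "WORKS_AT", "globex", t=5)
```

Because nothing is ever physically deleted, both the current state and any past state remain queryable, which is the property historization is after.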
Koussa, Chokri. "Implantation d’un système d’information géographique 3D sur Internet pour la gestion des modèles urbains." Strasbourg, 2011. https://publication-theses.unistra.fr/public/theses_doctorat/2011/KOUSSA_Chokri_2011.pdf.
For many reasons, i.e. the continuous progress of computer tools in terms of software and hardware, the development of powerful spatial data acquisition tools, the generalization of spatial data and applications to a wider audience, etc., many spatial applications, more and more sophisticated, are being produced as free or commercial tools. Geographic Information Systems (GIS) are proving to be major consumers of geo-referenced data. They constitute adequate, or even the most appropriate, tools for exploiting such data. Indeed, they offer various features focusing especially on the management, interrogation and representation of spatial data. GIS really began to spread in the 1990s, but at that time, given the limits of computer tools, they dealt mainly with 2D spatial data. In the late 1990s and during the first decade of the 21st century, GIS developers turned to the third dimension for spatial data representation; the concept of 3D GIS then became ubiquitous and a widespread research topic around the world. As the progress of GIS is tied mainly to advances in computer technology, the Internet quickly became a very promising way to disseminate GIS applications online. Thus, 2D GIS are now numerous on the Internet, while 3D GIS remain rare. Our thesis work fits into this context. Its main objective is to develop a 3D GIS prototype accessible via the Internet. Since GIS are multidisciplinary tools relying on very different concepts and technologies (data modeling, databases, development tools, 3D, etc.), our work focuses on the basic concepts of 3D GIS: three-dimensional spatial data modeling, spatial database modeling and the integration of spatial data into the database, a reflection on spatial data querying functionality, and the development of a Web application providing online access to all the services offered by the GIS.
As for the technologies used to develop the GIS, our choices were oriented mainly toward free tools. The objective is to study the various technologies involved and their combination for building a functional 3D GIS accessible over the Internet.
Samara, Tarek. "Multiplicité des utilisateurs et pertinence des systèmes d'information multidimensionnels : l'exemple du secteur bancaire." Paris, CNAM, 2004. http://www.theses.fr/2004CNAM0455.
Changes in the environment in recent years have led to the creation of new functions, obliging the accounting system to satisfy a multiplicity of users whose needs are detailed and specific. Our objective is to show, using the example of the banking sector, that this multiplicity leads to a reorganization of the architecture according to a multidimensional approach. Relying on a theoretical framework combining the events approach and contingency theory, we carried out two case studies and a quantitative analysis of 52 questionnaires. These showed the relevance of the multidimensional approach, whose application is both partial and contingent, toward which the accounting model evolves, increasingly becoming a module within an enterprise resource planning system. A typology of situations illustrates the importance of hybrid information systems as an intermediate stage in this evolution.
Jomaa, Hanene. "Contribution de l'usage des systèmes d'information à la performance des organisations." Phd thesis, Paris, Télécom ParisTech, 2009. https://pastel.hal.science/pastel-00730391.
The aim of this work is to analyse how the use of information systems can increase organizational performance.
Bosco, Michel. "Contribution à la spécification et à la conception de systèmes d'information intelligents pour le génie logiciel." Toulouse, ENSAE, 1988. http://www.theses.fr/1988ESAE0004.
Full textGaltié, Jean-François. "Information géographique numérique pour l'environnement : approche hiérarchique, modélisation et gestion prévisionnelle du risque incendie en région méditerranéenne : couplage données terrain : données de télédétection-video et intégration opérationnelle sous SIG." Toulouse 2, 1997. http://www.theses.fr/1997TOU20045.
This research aims at improving fire risk forecasting in Mediterranean shrubland environments. It was carried out in the hilly wildlands of the Aspres, in the French eastern Pyrenees, a region regularly devastated by wildfires. The research project was designed to achieve two purposes. On the one hand, the effectiveness of early detection of fire risk by remote sensing, from the characteristics of fuel spectral responses, had to be checked and confirmed across the range of observation scales, within a hierarchical approach. On the other hand, the final purpose of the work was to implement a methodological and operational framework for forecasting and managing fire risk according to the needs of the fire-fighting authorities, in relation to decision support. The first objective was achieved by coupling ground and remotely sensed data and by modelling plant fire susceptibility, controlled by fuel inflammability and combustibility, from the relationships between biological and plant water status parameters on the one hand and the potentially detectable spectral characteristics of fuel on the other. The three spectral bands of the experimental video device were then put to the test, and a fire susceptibility biological index (FISBI), derived from MIR data, was suggested. Modelling was carried out at increasing observation scales, from the elementary plant particle to the plant and the vegetation cover. The design of the early fire detection and warning system rests on the previous modelling features and was built from an airborne video sensor with real-time ground data transmission, a GPS and a GIS for managing and updating the fire risk maps. A full-scale validation was carried out in the Aspres during the summer of 1996 with the regional fire-fighting authorities.
Tambellini, Caroline. "Un système de recherche d'information adapté aux données incertaines : adaptation du modèle de langue." Phd thesis, Grenoble 1, 2007. http://www.theses.fr/2007GRE10322.
An information retrieval system is based on a formal methodology for assessing whether document terms correspond to query terms. Most such systems assume that terms extracted from documents are perfectly recognized, which means that their matching function can rely on strict equality between document terms and query terms. Our work takes place in a context where data are not perfectly recognized and are thus considered uncertain. In this case, equality between document terms and query terms is relaxed to a notion of 'almost equality'. We propose an information retrieval system adapted to uncertain data and based on the language model. We introduce the concept of pairing, which measures the 'almost equality' between two terms through concordance and intersection values. Pairing is integrated into the matching function. Furthermore, the matching function is extended to take into account the certainty value of extracted terms computed by an interpretation system. Basic assumptions of information retrieval, such as Zipf's law and Luhn's conjecture, are first checked. Our model is then implemented, experimentally validated, and compared with systems that do not integrate the concept of uncertainty. Finally, we present a tool dedicated to phone meetings, an application using an information retrieval system adapted to uncertain data.
Tambellini, Caroline. "Un système de recherche d'information adapté aux données incertaines : adaptation du modèle de langue." Phd thesis, Université Joseph Fourier (Grenoble), 2007. http://tel.archives-ouvertes.fr/tel-00202702.
Our work addresses the case where data are not perfectly recognized and are therefore qualified as uncertain. In this context, equality between document terms and query terms gives way to the notion of 'almost equality'. We propose an information retrieval system adapted to uncertain data and based on the language model. We introduce the notion of pairing, which measures the 'almost equality' between two terms through concordance and intersection. Pairing is integrated into the matching function. Moreover, the term-extraction certainty value provided by an interpretation system is inserted into the weighting function. Prior to building such a model, we verify that the basic hypotheses of information retrieval, namely Zipf's law and Luhn's conjecture, apply to data originating from speech, an example of uncertain data.
The proposed model is validated experimentally and compared with systems that do not integrate the notion of uncertainty. Finally, we present a possible application using a retrieval system adapted to uncertain data: a tool supporting telephone meetings.
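The idea of replacing strict term equality with a scored 'almost equality' can be sketched as follows. The character n-gram overlap below is a stand-in assumption, not the concordance and intersection measures actually defined in the thesis:

```python
# 'Almost equality' between terms, scored by character-bigram overlap
# and weighted by an extraction certainty. Measures are illustrative.

def ngrams(term, n=2):
    return {term[i:i + n] for i in range(len(term) - n + 1)}

def pairing(t1, t2):
    # Jaccard overlap of bigram sets: 1.0 for identical terms.
    a, b = ngrams(t1), ngrams(t2)
    return len(a & b) / len(a | b) if a | b else 0.0

def match_score(doc_term, query_term, certainty):
    # Certainty of the (possibly misrecognized) extracted term
    # discounts the pairing score in the matching function.
    return certainty * pairing(doc_term, query_term)
```

With strict equality, a speech-recognition error such as "meting" for "meeting" would contribute nothing to the match; with pairing, it still contributes a high but discounted score.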
Delefosse, Thierry. "Stratégies de recherche d'Informations émergentes pour la compréhension de grands volumes documentaires numérisées : application à la sécurité des systèmes d'information." Thesis, Paris Est, 2008. http://www.theses.fr/2008PEST0224.
Chaxel, Frédéric. "Contribution à la gestion et à la conduite des systèmes manufacturiers par les objets nomades de production." Nancy 1, 1995. http://docnum.univ-lorraine.fr/public/SCD_T_1995_0341_CHAXEL.pdf.
Full textBerti-Équille, Laure. "La qualité des données et leur recommandation : modèle conceptuel, formalisation et application a la veille technologique." Toulon, 1999. http://www.theses.fr/1999TOUL0008.
Technological watch activities are focused on information qualification and validation by human expertise. As a matter of fact, none of these systems can provide (or assist) a critical and qualitative analysis of the data they store and manage. Most information systems store data (1) whose source is usually unique, not known, or not identified/authenticated, and (2) whose quality is unequal and/or ignored. In practice, several data items may describe the same real-world entity with contradictory values, and their relative quality may be comparatively evaluated. Many techniques for data cleansing and editing exist for detecting errors in databases, but it is crucial to know which data have bad quality and to benefit from a qualitative expert judgment on data, complementary to quantitative and statistical data analysis. My contribution is to give data quality a multi-source perspective, and to introduce and define the concepts of multi-source database (MSDB) and multi-source data quality (MSDQ). My approach was to analyze the wide panorama of research in the literature whose problems show analogies with those of technological watch. The main objective of my work was to design and provide a storage environment for managing textual information sources, the (more or less contradictory) data extracted from their textual content, and the associated quality meta-data.
My work centered on proposing: a methodology to guide, step by step, a data quality project in a multi-source information context; the conceptual modeling of a multi-source database (MSDB) for managing data sources, multi-source data and their quality meta-data, together with mechanisms for multi-criteria data recommendation; the formalization of the QMSD data model (Quality of Multi-Source Data), which describes multi-source data, their quality meta-data and the set of operations for manipulating them; and the development of the sQuaL prototype for implementing and validating my propositions. In the longer term, the perspective is to develop a specific decisional information system extending classical functionalities to (1) manage multi-source data, (2) take into account their quality meta-data and (3) propose data-quality-based recommendation as query results. The ambition is to develop the concept of an "introspective information system", that is to say, an information system that is active and reactive concerning the quality of its own data.
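The multi-criteria, quality-based recommendation idea can be sketched as follows; the criteria, weights and scoring rule are illustrative assumptions, not the QMSD model itself:

```python
# Recommend one value among contradictory multi-source values,
# ranking by quality meta-data. Criteria and weights are illustrative.

def recommend(values, weights):
    """values: list of (value, {criterion: score in [0, 1]})."""
    def score(meta):
        return sum(weights[c] * meta[c] for c in weights)
    return max(values, key=lambda v: score(v[1]))[0]

# Two sources report contradictory values for the same real-world entity.
reported_speeds = [
    ("120 km/h", {"freshness": 0.9, "source_reliability": 0.4}),
    ("110 km/h", {"freshness": 0.6, "source_reliability": 0.9}),
]
weights = {"freshness": 0.3, "source_reliability": 0.7}
best = recommend(reported_speeds, weights)
```

A query result would then carry not just the recommended value but its quality scores, so the user can judge how much to trust it.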
Gomez, Dario. "Le prototypage basé sur des méta données phase 1 du cycle de développement." Thesis, Université Laval, 2013. http://www.theses.ulaval.ca/2013/30416/30416.pdf.
Designing information systems is a lengthy and complex process that leads to numerous failures. Prototyping has been proposed as a solution to improve the quality of specifications at the beginning of an information system's life cycle. Every information system (IS) is based upon an information architecture; it is, above all, content about a perceived reality. A "domain" is a formalization of the perceived reality in which the IS users identify the representations of facts (the data) by means of semantic keys. IS designers have to transform this model using their knowledge about the abstract functioning of computers. The objective of our research is to guide the requirements specification activity in the initial "customer - designer" communication phase of design and in the initial "designer - developer" communication phase of development, using prototyping artifacts. Our work opens the way to a situation where every modification during the information system's life cycle could be made from within the domain model, which is an input for the "prototyper" and then itself becomes an information system. Keywords: information system; design method; conceptual data model; declarative specification; executable specification; prototype; meta-data; application architecture.
Pinot, Gilbert. "Concepts multimédia et technologies associées : applications : consultation d'une base de données d'images, bornes interactives." Mulhouse, 1993. http://www.theses.fr/1993MULH0281.
Full text
Menet, Ludovic. "Formalisation d'une approche d'Ingénierie Dirigée par les Modèles appliquée au domaine de la gestion des données de référence." Paris 8, 2010. http://www.theses.fr/2010PA083184.
Full text
Our research work addresses the problematic of data model definition in the framework of Master Data Management. Model Driven Engineering (MDE) is a theme in great expansion in the academic world as well as in the industrial world. It brings an important change in the conception of applications, taking into account the durability of know-how and gains of productivity, and taking advantage of platforms without suffering from side effects. The MDE architecture is based on the transformation of models, starting from business models independent of any platform and arriving at a technical solution on a chosen platform. In this thesis, a conceptual and technical application of the MDE approach is made for the definition of pivot data models, which are the base of Master Data Management (MDM). Thus, we use the Unified Modeling Language (UML) as formalism to describe the platform-independent aspects (the business model), and we propose a meta-model, in the form of a UML profile, to describe the platform-dependent aspects of the MDM. We then present our approach to move from a business model to a platform model in order to generate the physical pivot model. The contributions of the thesis are: the study of an MDE approach in the MDM context; the definition of UML transformations towards an MDM model (based on an XML Schema structure); and a new aspect of MDE applied to MDM, namely a method for incremental model validation allowing the optimization of validation stages during model conception
Lévesque, Johann. "Évaluation de la qualité des données géospatiales : approche top-down et gestion de la métaqualité." Thesis, Université Laval, 2007. http://www.theses.ulaval.ca/2007/24759/24759.pdf.
Full text
Ganzin, Nicolas. "Contribution des données satellites à large champ pour l'aide à la gestion des ressources pastorales en milieu aride et semi-aride au Kenya et en Namibie." Orléans, 2004. http://www.theses.fr/2004ORLE1061.
Full text
The rangelands of arid and semi-arid areas represent precious but vulnerable resources. These must be managed by adapting the animal load to the capacity of the land, thus avoiding overgrazing and the degradation it can induce. In this regard, reliable information on forage resources is useful but difficult to obtain over large areas, due to the great variability in time and space which characterises aridity. This work studies the potential of "wide field" (coarse resolution) satellite imagery to fill this gap. AVHRR and VEGETATION images, used as the main input for the Monteith vegetation production model, provided biomass production estimations over a series of growing seasons for two study areas: Kenya and Namibia. The accuracy of the results is acceptable, and the method appears to be sufficiently simple and adaptable to be easily implemented in an operational way with a wide range of satellite data. Three case studies illustrate the application of this type of information for rangeland resources management at various scales of space and time: in Namibia for quick and targeted government intervention in case of drought, for the management of a small protected area in Kenya, and to assist in the implementation of the land reform in Namibia
Bessenay, Carole. "La gestion des données environnementales dans un espace naturel sensible : le système d'information géographique des Hautes-Chaumes foréziennes (Massif central)." Saint-Etienne, 1995. http://www.theses.fr/1995STET2024.
Full text
The object of this research is to present and apply to a specific territory the concepts and potentialities of geographical information systems, which can help understand the functioning and evolution processes of natural spaces. The GIS of the "Hautes-Chaumes foreziennes" underlines the interest of computerizing "ecological planning" methods, whose aim is to integrate the environment into management practices through the analysis of the specific aptitudes or sensitivities of a space. This study is based on the inventory and the mapping of the principal natural and human characteristics of the Hautes-Chaumes: topography, vegetation, humidity, pastoral activities... The selection of several criteria allows the elaboration of a pluridisciplinary diagnosis which underlines the important sensitivity of this area. This diagnosis is then compared with an evaluation model of anthropic frequenting so as to define a zoning of the most vulnerable sectors, which are both sensitive and subject to important pressures. This analysis should urge decision-makers to conceive differentiated management measures related to the stakes of each area, in order to reconcile anthropic activities with the aptitudes of this natural space
Barde, Julien. "Mutualisation de données et de connaissances pour la gestion Intégrée des Zones Côtières : application au projet SYSCOLAG." Montpellier 2, 2005. https://tel.archives-ouvertes.fr/tel-00112661.
Full text
Corbière, François de. "L'amélioration de la qualité des données par l'électronisation des échanges à l'épreuve des fiches produit dans le secteur de la grande distribution." Nantes, 2008. http://www.theses.fr/2008NANT4020.
Full text
Our research question concerns the influence of the organization of electronic exchanges on data quality improvement. Product information is a set of data that identifies and describes a manufacturer's product. The organization of electronic exchanges covers the sending information systems, the receiving information systems and their interconnection. A qualitative, case-study-based research is conducted to understand how the organization of electronic exchanges is perceived to contribute to data quality improvement from the manufacturers' and retailers' points of view. Our results show that sending, receiving and interconnection architectures, exchange automation and exchange standardization all influence the perceived improvement of some data quality dimensions. In a processing view of exchanges, our main theoretical contribution is to show that this set of factors can be conceptualized in terms of interdependence. We define interdependence at three levels: technical, informational and organizational. At each of these levels, we propose that interdependence types can be positioned between two extremes: dyadic interdependence and sector interdependence. Dyadic interdependence refers to multiple sequential interdependencies between two firms. Sector interdependence refers to a pooled interdependency between all the firms
Chevalier, Max. "Interface adaptative pour l'aide à la recherche d'information sur le web." Toulouse 3, 2002. http://www.theses.fr/2002TOU30157.
Full text
Ghozzi, Faouzi. "La feuille 1:50000 de Zaouiet Medien (Tunisie N-O) : étude archéologique à travers un Système d'Information Géographique (SIG) de la protohistoire jusqu'à la fin du Moyen Âge." Nice, 2006. http://www.theses.fr/2006NICE2019.
Full textMinier, Thomas. "Web preemption for querying the linked data open." Thesis, Nantes, 2020. http://www.theses.fr/2020NANT4047.
Full text
Following the Linked Open Data principles, data providers have published billions of RDF documents using public SPARQL query services. To ensure these services remain stable and responsive, they enforce quotas on server usage. Queries which exceed these quotas are interrupted and deliver partial results. Such interruption is not an issue if it is possible to resume query execution afterward. Unfortunately, there is no preemption model for the Web that allows suspending and resuming SPARQL queries. In this thesis, we tackle the issue of building public SPARQL query servers that allow any data consumer to execute any SPARQL query with complete results. First, we propose a new query execution model called Web Preemption. It allows SPARQL queries to be suspended by the Web server after a fixed time quantum and resumed upon client request. Web preemption is tractable only if its cost in time is negligible compared to the time quantum. Thus, we propose SaGe: a SPARQL query engine that implements Web Preemption with minimal overhead. Experimental results demonstrate that SaGe outperforms existing SPARQL query processing approaches by several orders of magnitude in terms of average total query execution time and time to first results
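The suspend-and-resume cycle described in this abstract can be illustrated with a minimal sketch: a scan operator runs until its time quantum expires, returns the results produced so far together with its saved state, and a client loop resumes it until the query completes. This is an illustrative assumption, not the actual SaGe implementation; the class name, the `{"offset": ...}` state format and the one-result-per-call progress guarantee are choices made here for clarity.

```python
import time


class PreemptiveScan:
    """A resumable scan over a list of triples, suspended after a time quantum."""

    def __init__(self, data, offset=0):
        self.data = data
        self.offset = offset  # saved plan state: position of the next item

    def run(self, quantum):
        """Produce results until the quantum expires.

        Returns (results, state): state is None when the scan is complete,
        otherwise a dict the client sends back to resume. At least one item
        is produced per call, so the query always makes progress.
        """
        deadline = time.monotonic() + quantum
        results = []
        while self.offset < len(self.data):
            results.append(self.data[self.offset])
            self.offset += 1
            if time.monotonic() >= deadline:
                return results, {"offset": self.offset}  # suspended
        return results, None  # complete


def execute(data, quantum):
    """Client loop: resume the scan until it completes, collecting all results."""
    state, results = {"offset": 0}, []
    while state is not None:
        scan = PreemptiveScan(data, state["offset"])
        batch, state = scan.run(quantum)
        results.extend(batch)
    return results
```

Even with an arbitrarily small quantum, the client eventually obtains complete results, which is the point of the model: quotas interrupt a query without losing it.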
Lbath, Ahmed. "A. I. G. L. E. : un environnement visuel pour la conception et la génération automatique d'applications géomatiques." Lyon, INSA, 1997. http://www.theses.fr/1997ISAL0048.
Full text
The domain of Geographical Information Systems (GIS) applications is very large. Nowadays, marketed GIS are closed with regard to applications. Each geographical application needs a specific development which is time-consuming and dedicated to a specific GIS. Moreover, the spatial data manipulated in GIS can be complex (raster or vector). The mental model of the end-user is considered neither in the design method nor in the query language. Regarding these problems, no real solution exists and most marketed GIS are still proprietary systems. Our aim is to develop a new visual CASE tool named AIGLE, capable of generating various applications on several GIS platforms. The problem of portability led us to integrate an intermediate language into the CASE tool. This CASE tool supports, on the one hand, a visual object-oriented method named OMEGA and, on the other hand, a visual query language. OMEGA is dedicated to the design and development of geographical end-user applications. The method uses visual iconic metaphors for the representation of object classes and dynamic state diagrams. In the visual query language for GIS, two important levels are defined: the end-user level with iconic metaphors, and the technical level with graphical representations of object classes. The iconic metaphors can represent data or queries. For data, iconic metaphors should be defined at the design stage with OMEGA. The proposed visual language is translated into an intermediate query language before being generated into a specific target GIS. In order to validate our approach, a prototype has been developed and an example of a geographical application has been generated and marketed
Bacha, Rebiha. "De la gestion de données techniques pour l'ingénierie de production : référentiel du domaine et cadre méthodologique pour l'ingénierie des systèmes d'information techniques en entreprise." Phd thesis, Ecole Centrale Paris, 2002. http://tel.archives-ouvertes.fr/tel-00011949.
Full text
Hamani, Dalil Omar. "Un système d'information pour le bâtiment - Elaboration d'un modèle conceptuel de données pour les ouvrages façonnés en place issus de la production de bâtiment." Aix-Marseille 3, 2005. http://www.theses.fr/2005AIX30035.
Full text
Our work focuses on information exchanges in the construction activity. We present the contribution of our research to the description of the built elements on the building site, where information is shared by partners who are distant from one another and focused on fields of expertise that are distinct but concurrent. Our work aims to provide an information system that promotes cooperation between the building actors, by sharing a unique data model among the partners involved in the building construction processes. It takes into account the technical data needed in the different phases of the supply chain management cycle in the construction field. Our research was carried out in the UMR 694 MAP-GAMSAU laboratory, within the framework of the "Communication and tools of CAD" project. It aims to provide a solution to the growing demand of building partners for cooperative work during the building production phase. We focused our work on the engineering phase, which involves a vast data exchange. Developed in a cooperative context, our work is based on the following hypotheses: 1- The building can be described through its elements. 2- The building can be described through its tasks. 3- Building the elements is cooperative; it requires knowledge and working systems shared between the different partners. 4- Identifying and qualifying an element in its CAD environment assists its description on a remote database server. During the implementation phase, the different activities on a building site can be described as a model that integrates all the related tasks and includes all the work practices and requirements shared among the partners of the architectural project. The experiments carried out in this research were based on the Oracle 9iAS database management system and the CAD system Architectural DeskTop
Ghurbhurn, Rahee. "Intégration de données à partir de connaissances : une approche multi-agent pour la gestion des changements." Phd thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 2008. http://tel.archives-ouvertes.fr/tel-00785415.
Full text
Meyer, Elise. "ACQUISITION 3D, DOCUMENTATION ET RESTITUTION EN ARCHEOLOGIE : Proposition d'un modèle de Système d'Information dédié au patrimoine." Phd thesis, Université Henri Poincaré - Nancy I, 2007. http://tel.archives-ouvertes.fr/tel-00270010.
Full textCoste, Benjamin. "Détection contextuelle de cyberattaques par gestion de confiance à bord d'un navire." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2018. http://www.theses.fr/2018IMTA0106/document.
Full text
Navigation and ship management are two functions essential to the security of the ship itself and the people on board, as much as to the protection of the maritime environment. Modern ships ensure these functions by increasingly embedding connected and automated technologies such as sensors, actuators, programmable logic controllers and pieces of software. However, neither the security of these objects nor the trust in the information they produce can be guaranteed: they can be deceived or under the control of a malicious third party. In this context, a novel approach to data falsification detection is proposed. It is based on trust assessment of information system components, which can be seen as inter-related functional blocks producing, processing and receiving pieces of information. The trust one can have in production blocks, called information sources, is assessed through their ability to report the real situation of the ship. Trust is then propagated to the rest of the system. A simulator was developed, with which we experimented on several scenarios including intentional modification of numerical data. In these cases and under some conditions, the variability of trust gives us the ability to identify the occurrence of an attack as well as its target. Our proposition is not restricted to naval information systems and can be employed in various situations, even when human factors are involved
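The idea of propagating trust from information sources through inter-related functional blocks can be sketched in a few lines. This is a simplified illustration only: the thesis's actual trust model is not detailed in the abstract, and the min-combination rule, the block names and the `[0, 1]` trust scale are assumptions made here.

```python
def propagate_trust(source_trust, inputs):
    """Propagate trust scores from sources through a DAG of functional blocks.

    source_trust: {source_name: trust score in [0, 1]}
    inputs: {block_name: [names of the blocks or sources it reads from]}
    Rule (assumed): a block is only as trustworthy as its least trusted input.
    """
    trust = dict(source_trust)

    def resolve(name):
        # Compute a block's trust lazily from its inputs, memoizing the result.
        if name not in trust:
            trust[name] = min(resolve(dep) for dep in inputs[name])
        return trust[name]

    for block in inputs:
        resolve(block)
    return trust


# Example: a deceived radar (low trust) drags down every block downstream
# of it, which is how a drop in trust can point at an attack and its target.
scores = propagate_trust(
    {"gps": 0.9, "radar": 0.2, "log": 0.8},
    {"fusion": ["gps", "radar"], "display": ["fusion", "log"]},
)
# scores["fusion"] == 0.2 and scores["display"] == 0.2
```

Watching which blocks lose trust, and tracing back to the shared low-trust source, mirrors the abstract's claim that trust variability identifies both the occurrence of an attack and its target.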
Bendriss, Sabri. "Contribution à l'analyse et la conception d'un système d'information pour la gestion de la traçabilité des marchandises dans un contexte de transport multimodal." Le Havre, 2009. http://www.theses.fr/2009LEHA0024.
Full text
One solution to regulate and rationalize the physical displacement of goods is to synchronize the physical flow with its informational counterpart throughout the various links constituting the transport chain. In this context, a goods tracking and tracing solution can contribute to better control of flows. In this thesis, we propose a data modeling approach for goods traceability based on innovative research approaches (PLM, intelligent product, product-centered systems) and taking into account the possibilities offered by the use of NICT in terms of data sharing, auto-identification and geolocation. Then, in order to integrate our traceability data with the other transport chain data, but also to facilitate the sharing and exchange of this data, we propose a model and the development of an intermediation platform based on web services logic. Meeting interoperability and integrability criteria, the result makes it possible, through mechanisms for exchanging and saving data, to follow and restore the goods' lifecycle in its entirety
Grira, Joël. "Improving knowledge about the risks of inappropriate uses of geospatial data by introducing a collaborative approach in the design of geospatial databases." Doctoral thesis, Université Laval, 2014. http://hdl.handle.net/20.500.11794/25177.
Full text
Nowadays, the increased availability of geospatial information is a reality that many organizations, and even the general public, are trying to turn into a financial benefit. The reusability of datasets is now a viable alternative that may help organizations achieve cost savings. The quality of these datasets may vary depending on the usage context. The issue of geospatial data misuse becomes even more important because of the disparity between the different levels of expertise of geospatial data end-users. Managing the risks of geospatial data misuse has been the subject of several studies over the past fifteen years. In this context, several approaches have been proposed to address these risks, namely preventive approaches and palliative approaches. However, these approaches are often based on ad hoc initiatives. Thus, during the design process of the geospatial database, risk analysis is not always carried out in accordance either with the principles and guidelines of requirements engineering or with the recommendations of ISO standards. In this thesis, we suppose that it is possible to define a preventive approach for the identification and analysis of risks associated with inappropriate use of geospatial data. We believe that the expertise and knowledge held by experts and users of geospatial data are key elements for assessing the risks of misuse of this data. Hence, it becomes important to enrich that knowledge. Thus, we review the geospatial data design process and propose a collaborative and user-centric approach for requirements analysis. Under this approach, the user is involved in a collaborative process that helps provide an a priori identification of inappropriate uses of the underlying data. Then, by reviewing research in the domain of risk analysis, we propose to systematically integrate risk analysis, using the Delphi technique, throughout the design of geospatial databases.
Finally, still in the context of a collaborative approach, an ontological risk repository is proposed to enrich the knowledge about the risks of data misuse and to disseminate this knowledge to the design team, developers and end-users. The approach is then implemented using a web platform in order to demonstrate its feasibility and to get the concepts working within a concrete prototype.
Delot, Thierry. "Interrogation d'annuaires étendus : modèles, langage et optimisation." Versailles-St Quentin en Yvelines, 2001. http://www.theses.fr/2001VERS0028.
Full text