Academic literature on the topic 'Optimisation de l'échange de données'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Optimisation de l'échange de données.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Optimisation de l'échange de données"
Chukwu, Uzoma. "Optimisation du rapport coûts-avantages dans la réalisation de dictionnaires terminologiques informatisés (DTI) sur PC." Meta 41, no. 3 (September 30, 2002): 317–27. http://dx.doi.org/10.7202/002241ar.
Mitton, C., Y. C. MacNab, N. Smith, and L. Foster. "Données relatives aux blessures en Colombie‑Britannique : opinion des décideurs en ce qui concerne le transfert de connaissances." Maladies chroniques au Canada 29, no. 2 (2009): 78–88. http://dx.doi.org/10.24095/hpcdp.29.2.05f.
Manens, JP. "L'échange de données médicotechniques en radiothérapie: les besoins, les méthodes et les limites actuelles." Cancer/Radiothérapie 1, no. 5 (November 1997): 524–31. http://dx.doi.org/10.1016/s1278-3218(97)89633-3.
Mornet, Élisabeth. "Les chanoines au Moyen Âge : les fondements de l'échange d'une banque internationale de données prosopographiques." Le médiéviste et l'ordinateur 1, no. 1 (1990): 89–92. http://dx.doi.org/10.3406/medio.1990.1242.
El Khayat, Mustapha. "L'échange de données informatisées dans les activités d'exportation des pays du Sud : les passages portuaires." Tiers-Monde 35, no. 138 (1994): 359–74. http://dx.doi.org/10.3406/tiers.1994.4880.
Murnaghan, D., W. Morrison, EJ Griffith, BL Bell, LA Duffley, K. McGarry, and S. Manske. "Étude sur les systèmes d'échange des connaissances pour la santé des jeunes et la prévention des maladies chroniques : étude de cas menée dans trois provinces." Maladies chroniques et blessures au Canada 33, no. 4 (September 2013): 290–300. http://dx.doi.org/10.24095/hpcdp.33.4.07f.
COLOMIES, Bernard. "Optimisation et rationalisation de la gestion des données d'essais." Revue de l'Electricité et de l'Electronique, no. 06 (1999): 52. http://dx.doi.org/10.3845/ree.1999.061.
Grenier, Olivier, and Adrien Barton. "Une ontologie dispositionnelle du risque." Lato Sensu: Revue de la Société de philosophie des sciences 8, no. 2 (April 6, 2021): 58–69. http://dx.doi.org/10.20416/lsrsps.v8i2.6.
Frémal, Sébastien, Michel Bagein, and Pierre Manneback. "Optimisation des transferts de données inter-domaines au sein de Xen." Techniques et sciences informatiques 34, no. 1-2 (April 30, 2015): 31–52. http://dx.doi.org/10.3166/tsi.34.31-52.
Skaržauskas, Valentinas, Valentinas Jankovski, and Juozas Atkočiūnas. "Optimisation des structures métalliques élastoplastiques sous conditions de rigidité et de plasticité données." Revue européenne de génie civil 13, no. 10 (December 15, 2009): 1203–19. http://dx.doi.org/10.3166/ejece.13.1203-1219.
Full textDissertations / Theses on the topic "Optimisation de l'échange de données"
Ouertani, Mohamed Zied. "DEPNET : une approche support au processus de gestion de conflits basée sur la gestion des dépendances de données de conception." Phd thesis, Université Henri Poincaré - Nancy I, 2007. http://tel.archives-ouvertes.fr/tel-00163113.
It is the management of this phenomenon, conflict, that the work presented in this thesis addresses, and more specifically conflict management through negotiation. We propose the DEPNET approach (product Data dEPendencies NETwork identification and qualification) to support the conflict management process on the basis of the dependencies between data. The data exchanged and shared between the various participants are the very essence of the design activity and play a key role in the progress of the design process.
This approach provides methodological elements to: (1) identify the negotiation team that will be responsible for resolving the conflict, and (2) manage the impacts of the solution adopted once the conflict has been resolved. The contributions of this research work are implemented in the DEPNET software prototype, which we validate on an industrial case study drawn from the design of a turbocharger.
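The abstract above describes DEPNET in terms of a network of dependencies between design data, whose traversal reveals which data are affected once a conflict is resolved. As a purely illustrative aid, and not code from the thesis, the sketch below shows one conventional way such an impact analysis can be performed: a breadth-first traversal of a dependency graph; the matrix contents, sizes and function names are all assumptions.

#include <stdio.h>

#define MAX_DATA 16

/* Hypothetical dependency matrix: depends_on[i][j] != 0 means datum j
 * is derived from datum i, so a change to i impacts j. Values are
 * illustrative only. */
static const int depends_on[MAX_DATA][MAX_DATA] = {
    [0][1] = 1, [0][2] = 1, [1][3] = 1, [2][3] = 1, [3][4] = 1,
};

/* Mark every datum transitively impacted by a change to 'changed'
 * using a breadth-first traversal of the dependency graph. */
static void propagate_impact(int changed, int impacted[MAX_DATA])
{
    int queue[MAX_DATA], head = 0, tail = 0;

    impacted[changed] = 1;
    queue[tail++] = changed;

    while (head < tail) {
        int current = queue[head++];
        for (int next = 0; next < MAX_DATA; next++) {
            if (depends_on[current][next] && !impacted[next]) {
                impacted[next] = 1;
                queue[tail++] = next;
            }
        }
    }
}

int main(void)
{
    int impacted[MAX_DATA] = {0};

    propagate_impact(0, impacted);  /* datum 0 is modified after negotiation */
    for (int i = 0; i < MAX_DATA; i++)
        if (impacted[i])
            printf("datum %d must be re-examined\n", i);
    return 0;
}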
De Vlieger, P. "Création d'un environnement de gestion de base de données "en grille". Application à l'échange de données médicales." Phd thesis, Université d'Auvergne - Clermont-Ferrand I, 2011. http://tel.archives-ouvertes.fr/tel-00654660.
De Vlieger, Paul. "Création d'un environnement de gestion de base de données "en grille" : application à l'échange de données médicales." Phd thesis, Université d'Auvergne - Clermont-Ferrand I, 2011. http://tel.archives-ouvertes.fr/tel-00719688.
Full textEl, Khalkhali Imad. "Système intégré pour la modélisation, l'échange et le partage des données de produits." Lyon, INSA, 2002. http://theses.insa-lyon.fr/publication/2002ISAL0052/these.pdf.
In Virtual Enterprise and Concurrent Engineering environments, a wide variety of information is used. A crucial issue is data communication and exchange between heterogeneous systems and distant sites. To address this problem, the STEP project was introduced. The STandard for the Exchange of Product model data (STEP) is an evolving international standard for the representation and exchange of product data. The objective of STEP is to provide an unambiguous, computer-interpretable representation of product data in all phases of the product's lifecycle. In collaborative product development, experts from different disciplines are concerned with the product (design, manufacturing, marketing, customers, ...). Each of these experts has his or her own viewpoint on the same product, and STEP models are unable to represent these experts' viewpoints. The objective of our research work is to propose a methodology for representing and integrating the different experts' viewpoints in the design and manufacturing phases. An information infrastructure for modelling, exchanging and sharing product data models is also proposed.
Stoeklé, Henri-Corto. "Médecine personnalisée et bioéthique : enjeux éthiques dans l'échange et le partage des données génétiques." Thesis, Sorbonne Paris Cité, 2017. http://www.theses.fr/2017USPCB175.
In the context of medicine and life sciences, personalized medicine (PM) is all too often reduced to the idea of adapting a diagnosis, predisposition or treatment according to the genetic characteristics of an individual. However, in the human and social sciences, PM may be considered as a complex social phenomenon, due to the proper existence and unique composition of the constraints it imposes on individuals, and to the large number of interactions and interferences between a large number of units, rich in uncertainties, indeterminations, chance, order and disorder. We feel that this alternative point of view makes it possible to study PM more effectively through bioethics research approaches, but with a new objective, contrasting with but complementary to those of law and moral philosophy, and a new method. Indeed, the objective of bioethics should be prospective studies questioning established norms in the face of emerging complex social phenomena, rather than the other way round. This makes it possible to determine the benefits, to society and its individuals, of allowing the phenomenon to emerge fully, and to study possible and probable solutions, rather than certainties, for the present and the future. This may allow the identified benefits to occur. However, this objective requires a method for studying the functioning of the phenomenon as a whole, at the scale of society, without a priori restriction to certain individuals, thereby favoring its interactions over its elements. Qualitative inductive systemic theoretical modeling is just such an approach. The key idea here is a rationale of discovery, rather than of proof. This new approach allowed us to understand that PM should not be called "personalized", or even "genomic" or "precision" medicine, and that the term "data medicine" (DM) should be favored, given the key role of data in its functioning. Indeed, the goal of this phenomenon seems to be to use a large mass of (genetic) data to deduce (data mining) or induce (big data) different types of information useful for medical care, research and industry. The means of achieving this end seems to be the development of a network for exchanging or sharing biological samples, genetic data and information between patients, clinicians, researchers and industrial partners, through electronic communication, with the central storage of biological samples and genetic data, and with treatment and analysis carried out at academic care and research centers (France) or in private companies (United States), with or without the involvement of a clinician. The major ethical issues thus seem to relate to the means and mode of access to, and the storage and use of, genetic data, which may lead to radically opposed (social/liberal) forms of organization and functioning, calling into question certain moral and legal standards. Finally, our method provided several arguments in favor of the use of dynamic electronic informed consent (e-CE) as a solution optimizing the development of PM in terms of genetic data access, storage and use, for the sharing (France) or exchange (United States) of genetic data.
Azizi, Leila. "Pratique et problèmes légaux de l'échange de données informatisées, le cas du crédit documentaire dématérialisé." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape9/PQDD_0020/MQ47163.pdf.
Darlay, Julien. "Analyse combinatoire de données : structures et optimisation." Phd thesis, Université de Grenoble, 2011. http://tel.archives-ouvertes.fr/tel-00683651.
Gamoudi, Oussama. "Optimisation adaptative appliquée au préchargement de données." Paris 6, 2012. http://www.theses.fr/2012PA066192.
Data prefetching is an effective way to bridge the increasing performance gap between processor and memory. Prefetching can improve performance, but it has side effects that may lead to no improvement while increasing memory pressure, or even to performance degradation. Adaptive prefetching aims at reducing the negative effects of prefetching while keeping its advantages. This work proposes an adaptive prefetching method based on runtime activity, i.e. the processor and memory activities retrieved through hardware counters, to predict prefetch efficiency. Our approach highlights and relies on the correlation between prefetch effects and runtime activity. Our method learns this correlation throughout execution in order to predict prefetch efficiency and filter out prefetches predicted to be inefficient. Experimental results show that the proposed filter is able to cancel the negative impact of prefetching when it is unprofitable, while keeping the performance improvement due to prefetching when it is beneficial. Our filter works similarly well when several threads are running simultaneously, which shows that runtime activity enables an efficient adaptation of prefetching by providing information on the behavior of running applications and their interactions.
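To make the mechanism sketched in this abstract more concrete, here is a minimal, hypothetical illustration in C of an interval-based prefetch filter driven by hardware-counter readings. It is not code from the thesis: the structure fields, thresholds and function name are assumptions chosen only to show the general idea of gating prefetching on its observed usefulness.

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical per-interval hardware-counter readings; the field names
 * are illustrative, not taken from the cited work. */
struct interval_stats {
    uint64_t prefetches_issued;
    uint64_t prefetches_used;     /* prefetched lines actually referenced */
    uint64_t demand_misses;
    uint64_t bus_occupancy;       /* proxy for memory-bandwidth pressure */
};

/* Simple accuracy/pressure heuristic: keep prefetching enabled only when
 * its observed usefulness outweighs the extra memory traffic it causes. */
bool prefetch_should_stay_enabled(const struct interval_stats *s)
{
    if (s->prefetches_issued == 0)
        return true;  /* nothing learned yet, leave prefetching on */

    double accuracy = (double)s->prefetches_used / (double)s->prefetches_issued;
    double pressure = (double)s->bus_occupancy /
                      (double)(s->demand_misses + s->prefetches_issued);

    /* Thresholds are illustrative; an adaptive scheme would learn the
     * correlation between runtime activity and prefetch benefit online. */
    return accuracy > 0.5 || pressure < 0.75;
}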
Travers, Nicolas. "Optimisation Extensible dans un Mediateur de Données Semi-Structurées." Phd thesis, Université de Versailles-Saint Quentin en Yvelines, 2006. http://tel.archives-ouvertes.fr/tel-00131338.
Full textcontexte de médiation de données XML. Un médiateur doit fédérer des sources de données
distribuées et hétérogènes. A cette fin, un modèle de représentation des requêtes est néces-
saire. Ce modèle doit intégrer les problèmes de médiation et permettre de définir un cadre
d'optimisation pour améliorer les performances. Le modèle des motifs d'arbre est souvent
utilisé pour représenter les requêtes XQuery, mais il ne reconnaît pas toutes les spécifica-
tions du langage. La complexité du langage XQuery fait qu'aucun modèle de représentation
complet n'a été proposé pour reconna^³tre toutes les spécifications. Ainsi, nous proposons un
nouveau modèle de représentation pour toutes les requêtes XQuery non typées que nous appe-
lons TGV. Avant de modéliser une requête, une étape de canonisation permet de produire une
forme canonique pour ces requêtes, facilitant l'étape de traduction vers le modèle TGV. Ce
modèle prend en compte le contexte de médiation et facilite l'étape d'optimisation. Les TGV
définis sous forme de Types Abstraits de Données facilitent l'intégration du modèle dans tout
système en fonction du modèle de données. De plus, une algèbre d'évaluation est définie pour
les TGV. Grâce µa l'intégration d'annotations et d'un cadre pour règles de transformation, un
optimiseur extensible manipule les TGV. Celui-ci repose sur des règles transformations, un
modèle de coût générique et une stratégie de recherche. Les TGV et l'optimiseur extensible
sont intégrés dans le médiateur XLive, développé au laboratoire PRiSM.
Amstel, Duco van. "Optimisation de la localité des données sur architectures manycœurs." Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAM019/document.
The continuous evolution of computer architectures has been an important driver of research in code optimization and compiler technologies. A trend in this evolution that can be traced back over decades is the growing ratio between the available computational power (IPS, FLOPS, ...) and the corresponding bandwidth between the various levels of the memory hierarchy (registers, cache, DRAM). As a result, reducing the amount of memory communication that a given code requires has been an important topic in compiler research. A basic principle for such optimizations is the improvement of temporal data locality: grouping all references to a single data-point as close together as possible, so that it is only required for a short duration and can be quickly moved back to distant memory (DRAM) without any further memory communication. Another architectural evolution has been the advent of the multicore era and, in the most recent years, of the first generation of manycore designs. These architectures have considerably raised the bar on the amount of parallelism available to programs and algorithms, but this is again limited by the available bandwidth for communication between the cores. This brings issues that previously were the sole preoccupation of distributed computing into the world of compilation and code optimization. In this document we present a first dive into a new optimization technique which promises both a high-level model for data reuse and a large field of potential applications, a technique which we refer to as generalized tiling. It finds its source in the well-known loop tiling technique, which has been applied with success to improve data locality for both registers and cache memory in the case of nested loops. This new "flavor" of tiling has a much broader perspective and is not limited to nested loops. It is built on a new representation, the memory-use graph, which is tightly linked to a new model of memory usage and communication requirements and which can be used for all forms of iterative code. Generalized tiling expresses data locality as an optimization problem for which multiple solutions are proposed. With the abstraction introduced by the memory-use graph it is possible to solve this optimization problem in different environments. For experimental evaluation we show how this new technique can be applied in the context of loops, nested or not, as well as to computer programs expressed in a dataflow language. In anticipation of also using generalized tiling to distribute computations over the cores of a manycore architecture, we provide some insight into the methods that can be used to model communications and their characteristics on such architectures. As a final point, and in order to show the full expressiveness of the memory-use graph, and even more of the underlying memory usage and communication model, we turn to the topic of performance debugging and the analysis of execution traces. Our goal is to provide feedback on the evaluated code and its potential for further improvement of data locality. Such traces may contain information about memory communications during an execution and show strong similarities with the previously studied optimization problem. This brings us to a short introduction to the algorithmics of directed graphs and the formulation of new heuristics for the well-studied topic of reachability and the much less known problem of convex partitioning.
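Generalized tiling, as described in the abstract above, extends the classical loop-tiling transformation. For readers unfamiliar with the base technique, the following self-contained C sketch shows conventional loop tiling applied to a matrix multiplication, so that each block of data is reused while it is still cache-resident; the matrix size, tile size and function names are illustrative assumptions, not material from the thesis.

#include <stddef.h>

#define N 1024
#define TILE 64   /* tile size chosen so a working set fits in cache; N is a multiple of TILE */

/* Naive matrix multiply: C = C + A * B, with poor temporal locality on B. */
void matmul_naive(const double A[N][N], const double B[N][N], double C[N][N])
{
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            for (size_t k = 0; k < N; k++)
                C[i][j] += A[i][k] * B[k][j];
}

/* Tiled version: iterate over TILE x TILE blocks so that each block of
 * A, B and C is reused while it is still resident in cache. */
void matmul_tiled(const double A[N][N], const double B[N][N], double C[N][N])
{
    for (size_t ii = 0; ii < N; ii += TILE)
        for (size_t jj = 0; jj < N; jj += TILE)
            for (size_t kk = 0; kk < N; kk += TILE)
                for (size_t i = ii; i < ii + TILE; i++)
                    for (size_t j = jj; j < jj + TILE; j++)
                        for (size_t k = kk; k < kk + TILE; k++)
                            C[i][j] += A[i][k] * B[k][j];
}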
Books on the topic "Optimisation de l'échange de données"
Genetic algorithms + data structures = evolution programs. 3rd ed. Berlin: Springer-Verlag, 1996.
Genetic algorithms + data structures = evolution programs. 2nd ed. Berlin: Springer-Verlag, 1994.
Genetic algorithms + data structures = evolution programs. Berlin: Springer-Verlag, 1992.
Charmot, Claude. L'échange de données informatisé (EDI). Que sais-je? Presses Universitaires de France - PUF, 1997.
Chapuis, Roger. Les bases de données Oracle 8i : Développement, administration et optimisation. Dunod, 2003.
Michalewicz, Zbigniew. Genetic Algorithms + Data Structures = Evolution Programs. Springer, 2014.
Michalewicz, Zbigniew. Genetic Algorithms + Data Structures = Evolution Programs. Springer, 2011.
Onvural, Raif. Data Communications and their Performance (IFIP International Federation for Information Processing). Springer, 1995.
Find full textBook chapters on the topic "Optimisation de l'échange de données"
"V.2 Optimisation à données affines (Programmation linéaire)." In Optimisation et analyse convexe, 168–71. EDP Sciences, 2020. http://dx.doi.org/10.1051/978-2-7598-0700-0-016.
Full text"V.2 Optimisation à données affines (Programmation linéaire)." In Optimisation et analyse convexe, 168–71. EDP Sciences, 2020. http://dx.doi.org/10.1051/978-2-7598-0700-0.c016.
Conference papers on the topic "Optimisation de l'échange de données"
KERGADALLAN, Xavier, Sébastien DUPRAY, and Nathalie METZLER. "Optimisation du système public d’information sur les données de houle côtières, CANDHIS." In Journées Nationales Génie Côtier - Génie Civil. Editions Paralia, 2020. http://dx.doi.org/10.5150/jngcgc.2020.009.
Maron, Philippe, and Didier Rihouey. "Optimisation de données bathymétriques à l'aide de Surfer - Application à l'historique des plages d'Anglet." In Journées Nationales Génie Côtier - Génie Civil. Editions Paralia, 2002. http://dx.doi.org/10.5150/jngcgc.2002.037-m.
Parisot, Jean-Paul, Sylvain Capo, Stéphane Bujan, Nadia Senechal, and Jean Brillet. "Traitement des données topographiques et bathymétriques acquises sur le littoral aquitain : optimisation des mesures effectuées en quad et au théodolite." In Journées Nationales Génie Côtier - Génie Civil. Editions Paralia, 2008. http://dx.doi.org/10.5150/jngcgc.2008.057-p.
Full textHascoet, E., G. Valette, G. Le Toux, and S. Boisramé. "Proposition d’un protocole de prise en charge implanto-portée de patients traités en oncologie tête et cou suite à une étude rétrospective au CHRU de Brest." In 66ème Congrès de la SFCO. Les Ulis, France: EDP Sciences, 2020. http://dx.doi.org/10.1051/sfco/20206602009.