Dissertations / Theses on the topic 'Flux d'informations'
Consult the top 50 dissertations / theses for your research on the topic 'Flux d'informations.'
Where available in the metadata, you can also download the full text of each publication as a PDF and read its abstract online.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Bouessel du Bourg Trubert, Anne-Marie. "Les flux transfrontières d'informations : le système d'information des entreprises multinationales." Rennes 1, 1990. http://www.theses.fr/1990REN11025.
Full text
The transborder nature of the data flows processed and transmitted through a multinational company's information system raises political, economic, legal, social and cultural issues. This thesis deals with the management of multinational companies' information systems (IS) in three fields: strategy and organization, technology, and law (Part I). Part II presents six monographs on six major multinational firms, based on semi-structured interviews with the firms' chief information officers. Three of these firms are implementing an evolving IS; the other three are managing a mature IS, i.e. one that is structured, high-performing, accurate and fitted to the firm's strategic goals. Part II also includes a synthesis of the monographs, which leads to prospects for evolution and improvement and to suggestions for a better performance of the multinational firm's information system.
Chartron, Ghislaine. "Analyse des corpus de données textuelles, sondage de flux d'informations." Paris 7, 1988. http://www.theses.fr/1988PA077211.
Full text
Chartron, Ghislaine. "Analyse des corpus de données textuelles, sondage de flux d'informations." Grenoble 2 : ANRT, 1988. http://catalogue.bnf.fr/ark:/12148/cb37612583z.
Full text
Viller-Hamon, Isabelle. "Flux financiers, flux d'informations et réseaux internationaux : l'agence Havas et le jeu des échanges, 1850-1914." Paris 3, 2000. http://www.theses.fr/2000PA030175.
Full text
Cheutet, Vincent. "Contribution à la continuité des flux d'informations et de connaissances dans le lien conception-production." Habilitation à diriger des recherches, Université de Technologie de Compiègne, 2012. http://tel.archives-ouvertes.fr/tel-00801687.
Full text
He, Jianguo. "Modélisation des flux d'informations liées aux outils coupants : développement de méthodologies et d'outils de gestion adaptés." Lyon, INSA, 1991. http://www.theses.fr/1991ISAL0074.
Full text
At a time when quality and profitability have become the main demands on production systems, increasing interest has arisen in tool management, which appears to be one of the major keys to achieving these goals. This thesis develops a tool management methodology. It mainly deals with: the analysis of all the activities linked with tool management and the structuring of the necessary information; the building of the functional relationships between the database entities, as well as of a tool codification system; the modelling of the tasks and logical relations of tool management; dynamic tool flow simulation, taking into account the laws that describe tool failure probability; the development of several tool replacement strategies based on the stochastic nature of tool life, and their validation through simulation; and the elaboration of a basic module for technical and economic calculation whose outputs are indicators enabling the user to appraise the performance of the tool management system.
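The abstract above does not give the thesis's actual models; as a hedged illustration only, the following Python sketch compares tool replacement strategies by Monte Carlo simulation under a stochastic tool life. All distributions, costs and policy values are invented for the example:

```python
import random

# Monte Carlo comparison of preventive replacement policies under a random
# tool life. Costs and distribution parameters are illustrative only.
def simulate(replace_at, horizon=10_000, runs=200, seed=1):
    rng = random.Random(seed)
    costs = []
    for _ in range(runs):
        t, cost = 0.0, 0.0
        while t < horizon:
            life = rng.weibullvariate(100, 2.0)  # random tool life (minutes)
            if life < replace_at:                # tool fails in service
                t += life
                cost += 50.0                     # failure: scrap + downtime
            else:                                # preventive replacement
                t += replace_at
                cost += 10.0                     # planned change is cheap
        costs.append(cost)
    return sum(costs) / len(costs)

for policy in (60, 90, 120):  # replace after `policy` minutes of use
    print(policy, round(simulate(policy)))
# A small threshold wastes tool life; a large one pays failure costs.
# Simulation exposes exactly this trade-off between the strategies.
```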
Kassab, Randa. "Analyse des propriétés stationnaires et des propriétés émergentes dans les flux d'informations changeant au cours du temps." Phd thesis, Université Henri Poincaré - Nancy I, 2009. http://tel.archives-ouvertes.fr/tel-00402644.
Full text
The main contribution of this thesis is the development of a learning model, named ILoNDF, based on the principle of novelty detection. Unlike the original version of that model, its learning is driven not only by the novelty that an input datum brings, but also by the datum itself. ILoNDF can therefore continuously acquire new knowledge about the occurrence frequencies of the data and their variables, which makes it less sensitive to noise. Moreover, since it operates online without repeated training, the model meets the strongest requirements of data stream processing.
Our work first focuses on the study of ILoNDF's behaviour in the general setting of one-class classification, starting from highly multidimensional and noisy data. This study allowed us to highlight the pure learning capacities of ILoNDF with respect to the methods proposed so far. We then turn to the fine adaptation of the model to the specific setting of information filtering. Our objective is to set up a user-oriented rather than system-oriented filtering strategy, along two directions. The first concerns user modelling with ILoNDF; this modelling provides a new way of looking at the user profile in terms of specificity, exhaustivity and contradiction criteria, which makes it possible, among other things, to optimise the filtering threshold by taking into account the importance the user may attach to precision and recall. The second direction, complementary to the first, concerns refining the functionality of ILoNDF by endowing it with the capacity to adapt to the drift of the user's need over time. Finally, we generalise our earlier work to the case where data arriving in a stream can be divided into multiple classes.
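The abstract does not give ILoNDF's update rules; as a purely illustrative sketch of threshold-based, user-oriented stream filtering in the same spirit, here is a minimal Python example. The centroid-style profile, the cosine score and the precision/recall weight `beta` are all assumptions of this sketch, not the actual ILoNDF model:

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse vectors (term -> weight dicts)."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

class NoveltyFilter:
    """Online one-class filter: keeps a running profile of relevant
    documents and accepts a new document when its similarity to the
    profile reaches a threshold biased by the user's preference."""

    def __init__(self, beta=0.5, lr=0.1):
        # beta in [0, 1]: higher favours precision (stricter threshold),
        # lower favours recall (more lenient threshold).
        self.threshold = 0.2 + 0.6 * beta
        self.lr = lr       # learning rate of the online profile update
        self.profile = {}  # term -> weight

    def update(self, doc):
        """Fold a relevant document into the profile, online,
        without any repeated training pass."""
        for t, w in doc.items():
            self.profile[t] = (1 - self.lr) * self.profile.get(t, 0.0) + self.lr * w

    def relevant(self, doc):
        return cosine(self.profile, doc) >= self.threshold

f = NoveltyFilter(beta=0.7)  # this user favours precision over recall
f.update({"stream": 1.0, "novelty": 1.0, "detection": 0.5})
print(f.relevant({"novelty": 1.0, "detection": 1.0}))  # True: close to profile
print(f.relevant({"football": 1.0}))                   # False: novel topic
```

Raising `beta` makes the threshold stricter, trading recall for precision, which mirrors the user-oriented threshold tuning described above.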
Hamana, Sabri. "Modélisation et simulation des flux d'informations Ville-Hôpital et évaluation de leur impact sur le parcours de soins." Thesis, Lyon, 2017. https://tel.archives-ouvertes.fr/tel-02873368.
Full text
French health policy, which aims to improve the health system by modernising health information systems, has created a latent need: measuring the impact of information systems on value creation within healthcare institutions, and hence the need for tools and methods to carry out this evaluation work. The aim of this thesis is to propose a framework for the modelling, analysis and cost evaluation of territorial healthcare information systems. For this purpose, we propose a new class of timed Petri nets, called THIS nets (Territorial Health-care Information Systems), which formally describe patient care pathways, the relevant information flows and their interactions. THIS nets are then used to verify health information systems and to evaluate their performance, such as cycle-time distribution and the probability of information availability at some target time. A real example of a cancer patient's healthcare information system illustrates the usefulness of the proposed approach. We show that an advanced information system allows medical consultations to start earlier and thus yields a more efficient care pathway. A case study is proposed through a cost-effectiveness analysis of Electronic Health Record (EHR) implementation versus the patient's paper file in the context of cancer visits. Results show that adopting the developed HIS strictly dominated (i.e., was both less costly and more effective) the use of a basic HIS with the patient's paper file. This positive impact was demonstrated in the long term through a service-quality analysis using the provided THIS net.
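The THIS-net formalism itself is not reproduced in this listing; the following minimal Python sketch only illustrates the general mechanism a timed Petri net relies on: transitions fire when their input places are marked, and firing delays accumulate into a cycle time. The place names, delays and net structure are invented, and the sequential clock is a deliberate simplification of true timed semantics:

```python
# Minimal timed Petri net: a transition fires when all its input places
# hold a token; firing consumes inputs, produces outputs, adds a delay.
transitions = [
    # (name, inputs, outputs, delay in hours) -- an invented care pathway
    ("send_referral",   ["gp_visit_done"],       ["referral_sent"],    2.0),
    ("register_record", ["referral_sent"],       ["record_available"], 24.0),
    ("consultation",    ["record_available",
                         "patient_arrived"],     ["consult_done"],     1.0),
]

marking = {"gp_visit_done": 1, "patient_arrived": 1}
clock = 0.0

fired = True
while fired:
    fired = False
    for name, ins, outs, delay in transitions:
        if all(marking.get(p, 0) > 0 for p in ins):
            for p in ins:
                marking[p] -= 1
            for p in outs:
                marking[p] = marking.get(p, 0) + 1
            clock += delay
            print(f"{name} fired at t={clock}h")
            fired = True

# With a faster 'register_record' (an EHR instead of a paper file), the
# consultation is enabled earlier: the kind of cycle-time comparison the
# THIS-net approach formalises.
```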
Zimmermann, Jakub. "Détection d'intrusions paramétrée par la politique par contrôle de flux de références." Rennes 1, 2003. http://www.theses.fr/2003REN10155.
Full text
Debert, Maxime. "Stratégies optimales multi-critères, prédictives, temps réel de gestion des flux d'énergie thermique et électrique dans un véhicule hybride." Phd thesis, Université d'Orléans, 2011. http://tel.archives-ouvertes.fr/tel-00867007.
Full text
Sali, Mustapha. "Exploitation de la demande prévisionnelle pour le pilotage des flux amont d'une chaîne logistique dédiée à la production de masse de produits fortement diversifiés." Phd thesis, Université Paris Dauphine - Paris IX, 2012. http://tel.archives-ouvertes.fr/tel-00776217.
Full text
Keufak Tameze, Hugues Magloire. "Flux internationaux de capitaux et secret bancaire." Thesis, Clermont-Ferrand 1, 2013. http://www.theses.fr/2013CLF10404.
Full text
The regulation of international capital flows by a number of scattered and varied pieces of legislation considerably modifies the notion of bank secrecy. One of the fundamental characteristics of this upheaval is that it establishes links between the various operators who intervene in the contract. Controlling these operations urges States to pool their efforts and to come together to defend their respective financial interests through the fight against illicit capital flows. Defending these financial interests leads, on the one hand, to a substantial redefinition of the banker's role, in particular the way the banker perceives the relationship with clients, collects and exchanges information, and cooperates with public authorities and internal and international control bodies. On the other hand, this extension of the banker's missions entails an extension of the banker's exposure to criminal liability.
Matos, Ana Almeida. "Typage du flux d'information sûr: déclassification et mobilité." Phd thesis, École Nationale Supérieure des Mines de Paris, 2006. http://pastel.archives-ouvertes.fr/pastel-00001765.
Full text
Almeida Matos, Anna Gualdina. "Typage du flux d'information sûr : déclassification et mobilité." Paris, ENMP, 2006. http://www.theses.fr/2006ENMP1341.
Full text
We address the issue of confidentiality and declassification in a language-based security approach. In particular, we study the use of refined type and effect systems for statically enforcing flexible information flow policies over imperative higher-order languages with concurrency. A general methodology for defining the type and effect systems and proving their soundness with respect to such properties is presented. We consider two main topics. The first is the long-standing issue of finding a flexible information flow control mechanism that enables declassification; our declassification mechanism takes the form of a local flow policy declaration that implements a local information flow policy. The second is the largely unexplored topic of controlling information flow in a global computing setting; our network model, which naturally generalises the local setting, includes a notion of domain and a standard migration primitive for code and resources. New forms of security leaks introduced by code mobility are revealed. In both settings, to take dynamic flow policies into account, we introduce generalisations of non-interference, respectively named "non-disclosure" and "non-disclosure for networks". Their implementation is supported by a concrete presentation of the security lattice where confidentiality levels are sets of principals, similar to access control lists.
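As a hedged illustration of the closing sentence, and not of the thesis's actual formalisation: if a confidentiality level is the set of principals allowed to read a value, then information may flow from one level to another only if the target's readers form a subset of the source's, and combining data intersects the reader sets. A minimal Python sketch:

```python
# Confidentiality levels as reader sets (an illustration, not the thesis's
# exact lattice): fewer readers = more confidential.
from functools import reduce

def may_flow(src: frozenset, dst: frozenset) -> bool:
    """Data at level `src` may flow to level `dst` only if every principal
    able to read `dst` could already read `src`."""
    return dst <= src

def join(*levels: frozenset) -> frozenset:
    """Combining data intersects reader sets: the result is at least as
    confidential as each input."""
    return reduce(frozenset.__and__, levels)

public = frozenset({"alice", "bob", "charlie"})
hr_only = frozenset({"alice", "bob"})
alice_only = frozenset({"alice"})

print(may_flow(public, hr_only))      # True: restricting readers is safe
print(may_flow(alice_only, hr_only))  # False: bob would learn a secret
print(join(hr_only, alice_only))      # frozenset({'alice'})
```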
Zanioli, Matteo. "Information flow analysis by abstract interpretation." Paris 7, 2012. http://www.theses.fr/2012PA077262.
Full text
Protecting the confidentiality of information stored in a computer system or transmitted over a public network is a central problem in computer security. The goal of this thesis is to provide both theoretical and experimental results towards the design of an information flow analysis for the automatic verification of the absence of sensitive information leakage. Our approach is based on Abstract Interpretation, a theory of sound approximation of program semantics. We track the dependencies among a program's variables using propositional formulae, namely the Pos domain. We study the main ways to improve the accuracy (by combining abstract domains) and the efficiency (by combining widening and narrowing operators) of the analysis. The reduced product of the logical domain Pos and suitable numerical domains yields an analysis strictly more accurate than those already in the literature. The modular construction of our analysis makes it possible to handle the trade-off between efficiency and accuracy by tuning the granularity of the abstraction and the complexity of the abstract operators. Finally, we introduce Sails, a new information flow analysis tool for mainstream languages like Java that does not require any manual annotation. Sails combines the information leakage analysis with different heap abstractions, inferring information leakage even for programs dealing with complex data structures. We applied Sails to the analysis of the SecuriBench-micro suite, and the preliminary experimental results outline the effectiveness of our approach.
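The Pos domain proper represents dependencies as propositional formulae; the following Python sketch is a deliberately coarser illustration of the same idea, tracking for each variable the set of sources it may depend on. The three-address program and all names are invented:

```python
# Coarse dependency tracking in the spirit of (but much weaker than) the
# Pos domain: each assigned variable collects the sources its operands
# may depend on.

def analyze(program):
    deps = {}  # variable -> set of source variables it may depend on
    for target, operands in program:
        d = set()
        for op in operands:
            d |= deps.get(op, {op})  # unseen operands are treated as inputs
        deps[target] = d
    return deps

program = [
    ("tmp",  ["salary", "bonus"]),   # tmp  := salary + bonus
    ("rate", ["base_rate"]),         # rate := base_rate
    ("out",  ["tmp", "rate"]),       # out  := tmp * rate
]
deps = analyze(program)
leaks = sorted(v for v, d in deps.items() if "salary" in d)
print(leaks)  # ['out', 'tmp']: both may carry information from 'salary'
```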
Tararykine, Viatcheslav. "Modélisation des flux d'information dans un système de e-maintenance." Phd thesis, Université de Franche-Comté, 2005. http://tel.archives-ouvertes.fr/tel-00258814.
Full text
Georget, Laurent. "Suivi de flux d'information correct pour les systèmes d'exploitation Linux." Thesis, Rennes 1, 2017. http://www.theses.fr/2017REN1S040/document.
Full text
We aim to improve the implementation of information flow control mechanisms in the Linux operating system. Information flow control monitors how information disseminates in a system once it is out of its original container, unlike access control, which can merely apply rules on how containers are accessed. We met several scientific and technical challenges. First of all, the Linux codebase is big: over fifteen million lines of code spread over thirty-three thousand files. The first contribution of this thesis is a plugin for the GCC compiler able to extract, and let a user easily visualise, the control flow graphs of the Linux kernel functions. Secondly, the Linux Security Modules (LSM) framework, used to implement the information flow trackers we reviewed (Laminar, KBlare, and Weir), was designed in the first place for access control rather than information flow control. One issue is thus left open: is the framework implemented in such a way that all flows generated by system calls can be captured? We created and implemented a static analysis to address this problem and proved its correctness with the Coq proof assistant. This analysis is implemented as a GCC plugin and allowed us to improve the LSM framework in order to capture all flows. Finally, we noted that current information flow trackers are vulnerable to race conditions between flows and are unable to cover some overt channels of information, such as file mappings to memory and shared memory segments between processes. We implemented Rfblare, a new flow-tracking algorithm, for KBlare, and proved its correctness with Coq. We showed that LSM can be used successfully to implement information flow control, and that only formal methods, leading to reusable methodologies, analyses, tools, etc., are a match for the complexity and fast-paced evolution of the Linux kernel.
Andriatsimandefitra, Ratsisahanana Radoniaina. "Caractérisation et détection de malware Android basées sur les flux d'information." Thesis, Supélec, 2014. http://www.theses.fr/2014SUPL0025/document.
Full text
Information flows are information exchanges between objects in a given environment. At system level, the information flows involving the data of a given application describe how this application disseminates its data in the system, and they can be considered a behaviour-based profile of the application. Because of the increasing number of Android malware samples, there is an urgent need to explore new approaches to analysing and detecting them. In this thesis, we thus propose an approach to characterise and detect Android malware based on the information flows they cause in the system. This approach leverages two other contributions of the thesis: AndroBlare, the Android version of an information flow monitor named Blare, and the system flow graph, a data structure that represents in a compact and human-readable way the information flows observed by AndroBlare. We successfully evaluated our approach by building the profiles of four different malware samples and showing that these profiles made it possible to detect the execution of applications infected by malware for which a profile had been computed.
Valéra, Ludovick. "Amélioration du flux d'information et réduction du temps de passage réseau." Thèse, Université du Québec à Trois-Rivières, 2010. http://depot-e.uqtr.ca/1173/1/030168636.pdf.
Full text
Lattanzio, Thierry. "Caractérisation des entreprises organisées en "gestion par affaire"." Phd thesis, Paris, ENSAM, 2006. http://pastel.archives-ouvertes.fr/pastel-00002739.
Full text
Nasr Allah, Mounir. "Contrôle de flux d'information par utilisation conjointe d'analyse statique et dynamique accélérée matériellement." Thesis, CentraleSupélec, 2020. http://www.theses.fr/2020CSUP0007.
Full text
As embedded systems become more and more present in our lives, it is necessary to protect the personal data stored in such systems. Application developers can unintentionally introduce vulnerabilities that attackers may exploit to compromise the confidentiality or integrity of the system. One solution to prevent this is to use reactive mechanisms that monitor the behaviour of the system while it is running. In this thesis, we propose a generic anomaly detection approach combining hardware and software aspects, based on dynamic information flow tracking (DIFT). DIFT consists of attaching labels representing security levels to information containers, for example files, and specifying an information flow policy that describes the authorised flows. To implement such an approach, we first developed a DIFT monitor that is flexible and non-invasive for the processor, using ARM CoreSight trace components. To take into account the information flows occurring in the different layers, from the operating system down to processor instructions, we developed several static analyses in the compiler. These analyses generate annotations, used by the DIFT monitor, that describe the dissemination of data in the system at run time. We also developed a Linux security module to handle information flows involving files. The proposed approach can thus be used to detect different kinds of attacks.
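The monitor described above is hardware-assisted and works at the level of a real operating system; the following Python toy only illustrates the DIFT principle it builds on: labels attached to information containers, propagated on every operation, and checked against a flow policy. All container names, tags and the policy are invented:

```python
# Toy DIFT: labels are sets of tags attached to containers; every operation
# propagates the union of its operands' labels, and a policy check rejects
# flows of tagged data into forbidden sinks.

labels = {}  # container name -> set of tags

def read(container, tags):
    labels[container] = set(tags)
    return f"<data from {container}>"

def combine(dst, *srcs):
    """Model an instruction writing dst from srcs: dst inherits all tags."""
    labels[dst] = set().union(*(labels.get(s, set()) for s in srcs))

def write_sink(sink, src, policy):
    forbidden = labels.get(src, set()) - policy.get(sink, set())
    if forbidden:
        raise PermissionError(f"flow of {forbidden} into {sink} denied")
    print(f"{src} written to {sink}")

policy = {"network": {"public"}}            # the network may carry 'public' only
read("/etc/passwd", {"secret"})
read("banner.txt", {"public"})
combine("buffer", "/etc/passwd", "banner.txt")  # buffer now tainted 'secret'
write_sink("network", "banner.txt", policy)     # allowed
try:
    write_sink("network", "buffer", policy)     # denied: carries 'secret'
except PermissionError as e:
    print(e)
```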
Jeon, Doh-Shin. "Essais sur la théorie des incitations : collusion, flux d'information et réduction des effectifs." Toulouse 1, 2000. http://www.theses.fr/2000TOU10009.
Full text
Grall, Hervé. "Deux critères de sécurité pour l'exécution de code mobile." Phd thesis, Ecole des Ponts ParisTech, 2003. http://tel.archives-ouvertes.fr/tel-00007549.
Full text
Mellal, Naçima. "Réalisation de l'interopérabilité sémantique des systèmes, basée sur les ontologies et les flux d'information." Chambéry, 2007. http://www.theses.fr/2007CHAMS045.
Full text
This thesis is concerned with the description of knowledge through the use of ontologies; we choose to represent knowledge by goal ontologies. In order to solve the problem of semantic alignment of these goals in a distributed environment, we propose a process whose purpose is to establish links between goal ontologies. To this end, we propose to use the sound mathematical IF model (Information Flow model) in order to automate the proposed process. The IF model guarantees semantic interoperability between distributed systems and provides a theory for formalising system connections. Thus, goals represented in terms of ontologies can be semantically connected if they satisfy certain specific rules. These ideas are illustrated by a case study drawn from a practical problem.
Bellivier, Muriel. "Le Juste-à-temps : implications et conséquences sur les flux d'information, d'hommes et de marchandises." Marne-La-Vallée, 1994. http://www.theses.fr/1994MARN0099.
Full text
Ben Cheikh, Ansem. "E-CARe : une méthode d'ingénierie des systèmes d'information ubiquitaires." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00768025.
Full text
Ben Cheikh, Ansem. "E-CARe : une méthode d'ingénierie des systèmes d'information ubiquitaires." Thesis, Grenoble, 2012. http://www.theses.fr/2012GRENM020/document.
Full text
Ubiquitous information systems appeared as a consequence of emerging and evolving communication and mobile technologies that provide the system with information on its environment, the environment of its users, and their profiles. These data constitute the application context and are used to provide personalised, targeted and relevant services. However, ubiquitous services face difficulties and challenges concerning, especially, the required contextual data, the degree of adaptation, and computerised decision making. This is due to the gap between advanced ubiquitous services and their applications on the one hand, and methods and processes for developing and engineering ubiquitous systems on the other. Our goal in this thesis is to propose an engineering method for ubiquitous information systems that takes into account the requirements resulting from the mobile and highly scalable nature of these systems. The proposed method is based on a development process and a set of generic metamodels and languages facilitating complete system specification and implementation. The process separates functional, technical and ubiquitous specifications. Ubiquitous specifications enable the definition of structural and event-based context models while considering user and security requirements. Adaptation and context-awareness functionalities are supported by the structural and dynamic context models. The proposed event-oriented approach is reinforced by the adoption of an event-processing architecture. Ubiquitous specifications are integrated into a classical information systems engineering process to constitute the E-CARe process, which includes functional and technical specifications. Our proposals are used to design a user assistance application in the transport domain, highly dependent on the ambient environment and events.
Jaber, Mayyad. "Architecture de système d'information distribué pour la gestion de la chaîne logistique : une approche orientée services." Lyon, INSA, 2009. http://theses.insa-lyon.fr/publication/2009ISAL0013/these.pdf.
Full text
In order to improve their competitiveness, many enterprises seek to reduce the cost of doing business and to advance their capability for rapidly developing new services and products. To that end, enterprises focus increasingly on their core business while becoming more involved in collaborative organisations. This trend reinforces collaboration with partners, leading to virtual organisations. Such inter-enterprise organisations have to pay particular attention to logistics operations, from procurement to final product distribution. Improving the global performance of such a supply chain relies both on industrial optimisation and on efficient process coordination and information sharing between partners. The enactment of supply chain business processes relies heavily on information and communication technologies (ICT), which help overcome borders between enterprises and thereby create new business opportunities. The core idea of our approach hinges on a global vision for flexible collaboration among supply chain partners, and provides a global framework based on neutral technologies and standards to support collaboration management. In this work, we present an open, integrated framework to support inter-enterprise collaboration, based on facilitating access to the private processes of legacy systems and on defining common business processes obtained by assembling distributed services through a global workflow. The potential involvement of several heterogeneous legacy systems in one collaborative business process comes with stronger interoperability requirements, both for business process organisation and for information exchange.
Klaudel, Witold. "Contribution aux logiciels de simulation généralisée basée sur une approche séquentielle et l'emploi de flux d'information." Châtenay-Malabry, Ecole centrale de Paris, 1989. http://www.theses.fr/1989ECAP0085.
Full text
Bouzguenda, Lotfi. "Coordination multi-agents pour le Workflow Inter-Organisationnel lâche." Toulouse 1, 2006. http://www.theses.fr/2006TOU10009.
Full text
The objective of Inter-Organisational Workflow (IOW) is to coordinate distributed, autonomous and heterogeneous workflow processes originating and executed in various organisations. IOW can be studied according to two scenarios: loose and tight. In this thesis, we consider the loose scenario, which corresponds to occasional cooperation between organisations, without any structural constraint on the cooperation, and where the participating organisations are not known in advance but recruited dynamically. The thesis deals with coordination in loose IOW, and more particularly with finding partners and negotiating between them. It rests on the idea that agent technology is well suited to tackling these coordination problems and to supporting the design and execution of loose IOW. More precisely, our contribution consists of five elements: (1) an agent-based architecture; (2) a communication language combining the KQML and OWL-S standards; (3) an organisational model; (4) a process for the specification, validation and publication of workflow services; and (5) a simulator, MatchFlow.
Othman, Lotfi ben. "Développement d'un système de gestion de workflows distribué." Sherbrooke : Université de Sherbrooke, 2000.
Find full text
Creus, Tomas Jordi. "Roses : un moteur de requêtes continues pour l'agrégation de flux RSS à large échelle." Paris 6, 2012. http://www.theses.fr/2012PA066658.
Full text
RSS and Atom are generally less well known than the HTML web format, but they are omnipresent in many modern web applications for publishing highly dynamic web content. Nowadays, news sites publish thousands of RSS/Atom feeds, often organised into general topics like politics, economy, sports and culture. Weblog and microblogging systems like Twitter use the RSS publication format, and even more general social media like Facebook produce an RSS feed for every user and trending topic. This vast number of continuous data sources can be accessed through general-purpose feed aggregators like Google Reader, desktop clients like Firefox or Thunderbird, and RSS mash-up applications like Yahoo! Pipes, Netvibes or Google News. Today, RSS and Atom feeds represent a huge stream of structured text data whose potential is still not fully exploited. In this thesis, we first present ROSES (Really Open Simple and Efficient Syndication), a data model and continuous query language for RSS/Atom feeds. ROSES allows users to create new personalised feeds from existing real-world feeds through a simple yet complete declarative query language and algebra. The ROSES algebra has been implemented in a complete, scalable prototype system capable of handling and processing ROSES feed aggregation queries. The query engine has been designed to scale in the number of queries. In particular, it implements a new cost-based multi-query optimisation approach based on query normalisation and shared filter factorisation. We propose two different factorisation algorithms: (i) STA, an adaptation of an existing approximate algorithm for finding minimal directed Steiner trees [CCC+98a], and (ii) VCA, a greedy approximation algorithm based on efficient heuristics that outperforms the former with respect to optimisation cost. Our optimisation approach has been validated by extensive experimental evaluation on real-world data collections.
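The STA and VCA algorithms are not reproduced here; the following minimal Python sketch only illustrates the underlying idea of shared filter factorisation: evaluating each distinct predicate once per incoming item and reusing the result across all continuous queries that contain it. The query contents and predicates are invented:

```python
# Shared filter factorisation, illustrated: each continuous query is a set
# of predicate names; evaluate each distinct predicate once per item and
# share the result among all queries.

queries = {  # invented example queries over RSS items
    "q1": {"has_sport", "is_french"},
    "q2": {"has_sport", "is_recent"},
    "q3": {"is_french", "is_recent"},
}

predicates = {
    "has_sport": lambda item: "sport" in item["title"].lower(),
    "is_french": lambda item: item["lang"] == "fr",
    "is_recent": lambda item: item["age_hours"] < 24,
}

def matching_queries(item):
    # One evaluation per distinct predicate, shared by all queries.
    results = {name: p(item) for name, p in predicates.items()}
    return [q for q, preds in queries.items()
            if all(results[name] for name in preds)]

item = {"title": "Sport: match report", "lang": "fr", "age_hours": 3}
print(matching_queries(item))  # ['q1', 'q2', 'q3']
```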
Kassab, Randa. "Analyse des propriétés stationnaires et des propriétés émergentes dans les flux d'information changeant au cours du temps." Thesis, Nancy 1, 2009. http://www.theses.fr/2009NAN10027/document.
Full text
Many applications produce and receive continuous, unlimited, high-speed data streams. This raises obvious problems of storage, treatment and analysis of data, which are only just beginning to be addressed in the domain of data streams. On the one hand, it is a question of processing data streams on the fly without having to memorise all the data. On the other hand, it is also a question of analysing, in a simultaneous and concurrent manner, the regularities inherent in the data stream as well as the novelties, exceptions, or changes occurring in the stream over time. The main contribution of this thesis is the development of a new machine learning approach, called ILoNDF, based on the novelty detection principle. The learning of this model is, contrary to that of its predecessor, driven not only by the novelty part of the input data but also by the data itself. Thereby, ILoNDF can continuously extract new knowledge relating to the relative frequencies of the data and their variables, which makes it more robust against noise. Operating in an online mode without repeated training, ILoNDF can further address the primary challenges of managing data streams. Firstly, we focus on the study of ILoNDF's behaviour for one-class classification when dealing with high-dimensional noisy data. This study enabled us to highlight the pure learning capacities of ILoNDF with respect to the key classification methods suggested until now. Next, we turn to the adaptation of ILoNDF to the specific context of information filtering. Our goal is to set up user-oriented rather than system-oriented filtering strategies, following two directions. The first concerns user modelling relying on ILoNDF, which provides a new way of looking at the user's need in terms of specificity, exhaustivity and contradiction, profile-contributing criteria that help estimate the relative importance the user might attach to precision and recall; the filtering threshold can then be adjusted taking this knowledge about the user's need into account. The second direction, complementary to the first, concerns the refinement of ILoNDF's functionality in order to give it the capacity to track the user's drifting need over time. Finally, we consider the generalisation of our previous work to the case where streaming data can be divided into multiple classes.
Kassab, Randa. "Analyse des propriétés stationnaires et des propriétés émergentes dans les flux d'information changeant au cours du temps." S. l. : Nancy 1, 2009. http://www.scd.uhp-nancy.fr/docnum/SCD_T_2009_0027_KASSAB.pdf.
Full text
Lemaire, Christelle. "Le couplage entre flux physiques et flux d'information associés (F2PIA), apport de l'informatisation d'un système de traçabilité totale : application au cas d'une P.M.E. de produits laitiers." Aix-Marseille 2, 2005. http://www.theses.fr/2005AIX24017.
Full text
Foulon, Lucas. "Détection d'anomalies dans les flux de données par structure d'indexation et approximation : Application à l'analyse en continu des flux de messages du système d'information de la SNCF." Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEI082.
Full text
In this thesis, we propose methods to approximate an anomaly score in order to detect abnormal parts of data streams. Two main problems are considered in this context: firstly, handling the high dimensionality of the objects describing the time series extracted from the raw streams, and secondly, the low computation cost required to perform the analysis on the fly. To tackle the curse of dimensionality, we selected the CFOF anomaly score, which was proposed recently and has been proven robust to increasing dimensionality. Our main contribution is the proposal of two methods to quickly approximate the CFOF score of new objects in a stream. The first is based on safe pruning and approximation during the exploration of an object's neighbourhood. The second is an approximation obtained by aggregating scores computed in several subspaces. The two contributions complement each other and can be combined. We show on a reference benchmark that our proposals yield important reductions in execution time while providing approximations that preserve the quality of anomaly detection. We then present the application of these approaches within the SNCF information system, where we extended the existing monitoring modules with a new tool to help detect abnormal behaviour in the real stream of messages within the SNCF communication system.
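The thesis's fast approximations are not reproduced in this abstract. As a reference point only, here is a naive quadratic-time Python computation of the CFOF score as defined in the literature: the score of a point is the smallest neighbourhood fraction k/n such that the point belongs to the k-nearest-neighbour sets of at least a fraction rho of all points. The data are synthetic, and this brute-force implementation is exactly what the thesis seeks to avoid at stream scale:

```python
import numpy as np

def cfof(points, rho=0.5):
    """Naive CFOF: for each point x, the smallest k/n such that x lies in
    the k-nearest-neighbour sets of at least rho*n points. O(n^2 log n),
    for illustration only."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    # rank[i, j] = position of j in i's neighbour list (0 = i itself)
    rank = np.argsort(np.argsort(d, axis=1), axis=1)
    scores = np.empty(n)
    for j in range(n):
        # counts[k-1] = how many points have j within their k nearest
        counts = np.cumsum(np.bincount(rank[:, j], minlength=n))
        k = int(np.searchsorted(counts, rho * n) + 1)
        scores[j] = k / n
    return scores

rng = np.random.default_rng(0)
cluster = rng.normal(0, 1, size=(50, 3))
outlier = np.array([[8.0, 8.0, 8.0]])
s = cfof(np.vstack([cluster, outlier]), rho=0.5)
print(s[-1], s[:-1].mean())  # the outlier's score is markedly higher
```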
Simo, Jean Claude. "Contrôle du flot d'information par des techniques basées sur le langage de programmation." Master's thesis, Université Laval, 2015. http://hdl.handle.net/20.500.11794/25983.
Full text
A program is said to be noninterfering if the values of its public (or low) outputs do not depend on the values of its secret (or high) inputs. Various researchers have demonstrated how this property (or closely related properties) can be achieved through information flow analysis. In this work, we present in detail some existing models of information flow analysis and sketch a new analysis approach for concurrent programming. The first part of this thesis presents the different formulations of non-interference and an overview of the main types of information flow analysis. In the second part, we examine in detail some recent static and dynamic (hybrid) flow-sensitive analysis models for a simple imperative language. In the third part, we explore two recent models of secure information flow in concurrent programs, which develop a novel treatment of the interaction between threads and the scheduler to prevent undesired interleavings. We end with a sketch of the foundations for another approach, based on the analysis of dependencies between the variables of concurrent programs.
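As a hedged illustration of the definition in the first sentence, and not of the thesis's analysis models: a program interferes exactly when two runs agreeing on the low inputs but differing on a high input can produce different low outputs. On a toy program this can be tested directly, though testing over a finite set of secrets only gives evidence, not a proof:

```python
# Noninterference, tested by the two-run definition: vary only the secret
# input and compare the public outputs. The example programs are invented.

def leaky(secret: int, public: int) -> int:
    # Implicit flow: the branch on `secret` influences the public result.
    return public + 1 if secret > 0 else public

def safe(secret: int, public: int) -> int:
    return public * 2  # the public output ignores the secret entirely

def interferes(program, public: int, secrets=(0, 1, 42)) -> bool:
    outputs = {program(s, public) for s in secrets}
    return len(outputs) > 1  # distinct public outputs => interference

print(interferes(leaky, public=10))  # True
print(interferes(safe, public=10))   # False
```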
Lefray, Arnaud. "Security for Virtualized Distributed Systems : from Modelization to Deployment." Thesis, Lyon, École normale supérieure, 2015. http://www.theses.fr/2015ENSL1032/document.
Full text
This thesis deals with security for virtualised distributed environments such as clouds. In these environments, a client can access resources or services (compute, storage, etc.) on demand without prior knowledge of the underlying infrastructure. These services are low-cost due to the mutualisation of resources, and as a result clients share a common infrastructure. However, the concentration of businesses and critical data makes clouds more attractive to malicious users, especially considering new attack vectors between tenants. Nowadays, cloud providers offer default security or security by design that does not fit tenants' custom needs. This gap allows for multiple attacks (data theft, malicious usage, etc.). In this thesis, we propose a user-centric approach where a tenant models both its security needs, as high-level properties, and its virtualised application. These security objectives are based on a new logic dedicated to expressing system-based information flow properties. We then propose security-aware algorithms to automatically deploy the application and enforce the security properties. Enforcement can be realised by taking shared resources into account during placement decisions and/or through the configuration of existing security mechanisms.
Jaisson, Pascal Marie. "Systèmes complexes gouvernés par des flux : schémas de volumes finis hybrides et optimisation numérique." Châtenay-Malabry, Ecole centrale de Paris, 2006. http://www.theses.fr/2006ECAP1020.
Full text
This thesis deals with PDE modelling and the numerical resolution of optimisation problems for multithreaded systems and traffic flow, and proposes a new hybrid scheme. First, we are interested in fluid models of a multithread/multitask system proposed by De Vuyst. We derive ODEs which are used for the computation of service times, and we numerically solve two optimal control problems for quality-of-service (QoS) management. We then deal with traffic data assimilation and algorithms able to predict traffic flows on a road section. The traffic flow is modelled by the Aw-Rascle hyperbolic system. We have to minimise a functional whose optimisation variables are the initial condition and/or the upstream boundary conditions. We use the Roe method to compute the solution of the traffic flow model, then compute the gradient of the functional by an adjoint method; this gradient is used to optimise the functional. Finally, we propose a new hybrid scheme with one parameter which allows the scheme to have the TVD property and second-order accuracy in space and time. After a first predictor step, the parameter can be corrected in the cells where the entropy production is positive, so that the scheme captures the physical solution.
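The Aw-Rascle system and the Roe scheme themselves are too involved for a short sketch. As a hedged stand-in, the following Python example applies the same finite-volume logic to the scalar Burgers equation with a Rusanov flux (a simpler numerical flux than Roe's), showing how a conservative update propagates a shock:

```python
import numpy as np

# Minimal 1D finite-volume solver for Burgers' equation u_t + (u^2/2)_x = 0
# with a Rusanov (local Lax-Friedrichs) flux. A toy stand-in for the
# Roe-type schemes on the Aw-Rascle system used in the thesis.

def flux(u):
    return 0.5 * u * u

def rusanov(ul, ur):
    a = np.maximum(np.abs(ul), np.abs(ur))       # local wave-speed bound
    return 0.5 * (flux(ul) + flux(ur)) - 0.5 * a * (ur - ul)

nx, cfl = 200, 0.45
x = np.linspace(0.0, 1.0, nx)
u = np.where(x < 0.5, 1.0, 0.0)                  # Riemann initial data
dx = x[1] - x[0]
t, t_end = 0.0, 0.3
while t < t_end:
    dt = cfl * dx / max(np.abs(u).max(), 1e-12)  # CFL-limited time step
    f = rusanov(u[:-1], u[1:])                   # fluxes at cell interfaces
    u[1:-1] -= dt / dx * (f[1:] - f[:-1])        # conservative update
    t += dt
print(u[::40])
```

For this Riemann datum the exact shock speed is 0.5, so by t = 0.3 the jump has moved from x = 0.5 to about x = 0.65, which the printed profile reflects.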
Borzic, Boris. "Un modèle de gestionnaire itératif de flux informationnel sur internet." Paris, CNAM, 1998. http://www.theses.fr/1998CNAM0314.
Full text
The phenomenal growth of the internet and the proliferation of content at online users' disposal have dramatically increased the problem of informational overabundance. Within the context of a CIFRE contract, a high-performing system for access to personalised information shared on the internet, called an iterative manager of information flows, has been developed. The information retrieval part is driven by a hybridisation between a linguistic layer of terminological extraction and an infometric layer of automatic theme detection. Beyond this transfer of technical know-how, we had to consider how to integrate this hybrid system into the innovative, fluctuating and federative space that is the internet, which has its own hypertextual logic of information organisation. Its specific characteristics, both at the level of source identification and in the nature of documents (new processes of electronic writing), have been taken into account. We then conducted a categorisation of internet documents computed from the hypertextual and textual coverage of each one; this categorisation is the basis of the iterative database process. Whereas the first part uses the most advanced techniques of the documentary world (full-text indexing by accelerated linguistic processing) combined with those of the monitoring world, techniques capable of detecting weak signals (infometric processing), the internet-specific part addresses robots, new 3D representation formats, and a typology of the information distribution systems available on the network.
Amar, Véronique. "Mise en oeuvre d'outils de travail collaboratif à la Caisse d'Epargne Rhône-Alpes Lyon : étude préalable à la mise en place d'un workflow et mise en ligne de la documentation du système d'information." [S.l.] : [s.n.], 2002. http://www.enssib.fr/bibliotheque/documents/dessid/rsamar.pdf.
Full text
Delias, Pavlos. "An agent-based workflow management system for marketing decision support." Paris 9, 2009. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2009PA090081.
Full text
Tanriverdieva, Khatira. "Etude et faisabilité du projet de valorisation de la littérature grise de la Région Rhône-Alpes : application de la GED et du workflow." [S.l.] : [s.n.], 2002. http://www.enssib.fr/bibliotheque/documents/dessride/rstanriverdieva.pdf.
Full text
Fragoso Femenin dos Santos, José. "Vers l'établissement du flux d'information sûr dans les applications Web côté client." Thesis, Nice, 2014. http://www.theses.fr/2014NICE4148/document.
Full text
In this thesis, we address the issue of enforcing confidentiality and integrity policies in the context of client-side web applications. Since most web applications are developed in the JavaScript programming language, we study static, dynamic, and hybrid enforcement mechanisms for securing information flow in Core JavaScript, a fragment of JavaScript that retains its defining features. Specifically, we propose: a monitored semantics for dynamically enforcing secure information flow in Core JavaScript, as well as a source-to-source transformation that inlines the proposed monitor; a type system that statically checks whether or not a program abides by a given information flow policy; and a hybrid type system that combines static and dynamic analyses in order to accept more secure programs than its fully static counterpart. Most JavaScript programs are designed to be executed in a browser in the context of a web page. These programs often interact with the web page in which they are included via a large number of external APIs provided by the browser. The execution of these APIs usually takes place outside the perimeter of the language. Hence, any realistic analysis of client-side JavaScript must take into account possible interactions with external APIs. To this end, we present a general methodology for extending security monitors to take into account the possible invocation of arbitrary APIs, and we apply this methodology to a representative fragment of the DOM Core Level 1 API that captures DOM-specific information flows.
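As a hedged sketch of what a dynamic information flow monitor tracks (a generic two-level mechanism with a program-counter label for implicit flows; this is textbook machinery rendered in Python, not the thesis's monitored semantics for Core JavaScript):

```python
# A toy dynamic information-flow monitor: every variable carries a label
# (False = low/public, True = high/secret); assignments made inside a
# branch on secret data are tracked through a program-counter (pc) label.

class Monitor:
    def __init__(self):
        self.labels = {}   # variable -> label
        self.pc = [False]  # stack of branch-context labels

    def assign(self, var, value_label):
        # The written variable is at least as secret as the value and the
        # current control context (this captures implicit flows).
        self.labels[var] = value_label or self.pc[-1]

    def branch_on(self, var):
        self.pc.append(self.pc[-1] or self.labels.get(var, False))

    def end_branch(self):
        self.pc.pop()

    def output_low(self, var):
        if self.labels.get(var, False) or self.pc[-1]:
            raise RuntimeError(f"blocked: {var} would leak secret data")
        print(f"output({var}) allowed")

m = Monitor()
m.assign("secret", True)
m.assign("x", False)
m.branch_on("secret")   # e.g. `if (secret) { x = 1 }` in the source program
m.assign("x", False)    # x becomes high: it now encodes the secret branch
m.end_branch()
try:
    m.output_low("x")
except RuntimeError as e:
    print(e)
```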
Huron, Samuel. "Constructive Visualization : A token-based paradigm allowing to assemble dynamic visual representation for non-experts." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112253/document.
Full text
During the past two decades, information visualisation (InfoVis) research has created new techniques and methods to support data-intensive analyses in science, industry and government. These have enabled a wide range of analysis tasks to be executed, varying in the type and volume of data involved. However, the majority of this research has focused on static datasets, and the analysis and visualisation tasks tend to be carried out by trained expert users. In more recent years, social changes and technological advances have meant that data have become more and more dynamic and are consumed by a wider audience. Examples of such dynamic data streams include e-mails, status updates, RSS feeds, versioning systems, social networks and others. These new types of data are used by populations that are not specifically trained in information visualisation: some are casual users, while others are deeply involved with the data, but in both cases they have not received formal training in information visualisation. For simplicity, throughout this dissertation, I refer to the people (casual users, novices, data experts) who have not been trained in information visualisation as non-experts. These social and technological changes have given rise to multiple challenges, because most existing visualisation models and techniques are intended for experts and assume static datasets. Few studies have explored these challenges. In this dissertation, with my collaborators, I address the question: can we empower non-experts in their use of visualisation by enabling them to contribute to data stream analysis as well as to create their own visualisations? The first step towards answering this question is to determine whether people who are not trained in information visualisation and the data sciences can conduct useful dynamic analysis tasks using a visualisation system adapted to support their tasks. In the first part of this dissertation, I focus on several scenarios and systems in which crowds of InfoVis non-expert users of different sizes (20 to 300, and 2,000 to 700,000 people) use dynamic information visualisation to analyse dynamic data. Another important issue is the lack of generic design principles for the visual encoding of dynamic visualisations. In this dissertation, I design, define and explore a design space for representing dynamic data for non-experts. This design space is structured by visual tokens representing data items, which provide the constructive material for assembling, over time, different visualisations, from classic representations to new ones. To date, research on visual encoding has focused on static datasets for specific tasks, leaving generic dynamic approaches unexplored and unexploited. In this thesis, I propose construction as a design paradigm for non-experts to author simple and dynamic visualisations. This paradigm is inspired by well-established developmental psychology as well as past and existing practices of visualisation authoring with tangible elements. I describe the simple conceptual components and processes underlying this paradigm, making it easier for the human-computer interaction community to study and support this process for a wide range of visualisations. Finally, I use this paradigm and tangible tokens to study whether and how non-experts are able to create, discuss and update their own visualisations.
This study allows us to refine our previous model and provides a first exploration of how non-experts perform a visual mapping without software. In summary, this thesis contributes to the understanding of dynamic visualisation for non-expert users.
Poignant, Johann. "Identification non-supervisée de personnes dans les flux télévisés." Phd thesis, Université de Grenoble, 2013. http://tel.archives-ouvertes.fr/tel-00958774.
Full text
Derdour, Makhlouf. "Modélisation et implémentation d'un système d'information de gestion de flux multimedia pour des architectures logicielles intégrant des appareils sans-fil mobiles." Thesis, Pau, 2012. http://www.theses.fr/2012PAUU3045/document.
Full text
The development of applications for pervasive computing presents a number of challenges for software engineering, in particular the adaptation of context-aware applications: adapting to the environment (location, time, conditions, etc.), to connectivity (bit-rate, protocol, etc.), to the limitations of the device (screen size, supported media, etc.) and even to the user (physical handicaps, abilities, etc.). The programmer is always faced with a complex combination of factors that occur throughout the application. In this context, where multimedia, user mobility and ubiquitous applications are becoming the norm, providers want to offer adaptable (context-sensitive) software applications. Much effort has been devoted to the assembly and reassembly of components, and thus to functional adaptation by replacement or reconfiguration of components, in order to satisfy a new need or a new context. The problem we tackle in this thesis is that of the semantic and behavioural heterogeneity of components. The objective is to provide mechanisms for adapting multimedia data flows in component-based applications, i.e. for ensuring the assembly of heterogeneous components. For this, the architecture must be able to check the feasibility of assembling components from their manifests (a manifest must contain the technical information of a component). An assembly may be rejected for two reasons: functional and/or behavioural incompatibility. Our work focuses on the second, the case where the interfaces of a component are not compatible with those of adjacent components: for example, one provides PNG images while the other accepts only JPEG images. Taking component interoperability into account in an assembly is a necessity in such approaches; software architectures that validate only the functional aspects are not sufficient to ensure a realistic assembly and to remedy the heterogeneity of the data flows exchanged. To take interoperability into account and find solutions to heterogeneity problems, we propose an approach called MMSA (Meta-Model for Multimedia Software Architecture). It allows the description of software architectures expressing a software system as a collection of components that handle various data types and formats and that interact through adaptation connectors. We also define a UML 2.0 profile to express and model the new concepts and constraints of the MMSA meta-model; the transition to a UML profile relies on the extension mechanisms provided by UML 2.0 in order to improve the verification and consistency of architectures described in MMSA. To ensure the coherence of applications in the face of changing contexts, we propose a platform for dynamic adaptation. Dynamic adaptation is the process by which a software application is modified to take a change into account, whether at the level of the environment or of the application itself. The platform monitors and controls the execution of multimedia applications to detect any change of context; in the event of a change, it searches for possible solutions and takes the appropriate decision to adapt the application to the new context. It then selects the necessary adaptation services, integrates them into adaptation connectors, and reassembles them with the business components of the application. To examine the projection of MMSA onto UML 2.0, the OCL constraints were dynamically evaluated on a model of a monitoring system.
We have provided software architects with a tool that makes it possible to check the architecture model after each modification in order to ensure its structural and semantic coherence. The various tests and validations carried out on architecture models support this projection. This visual model supports the creation and management of MMSA models for software applications.
Hachicha, Rim. "Modélisation et analyse de la flexibilité dans les systèmes workflow." Paris, CNAM, 2007. http://www.theses.fr/2007CNAM0565.
Full text
This thesis is devoted to the formal modelling and management of workflow systems. We are interested in solving one of the principal problems of workflow systems, that of flexibility: current models and systems are not sufficiently flexible and adaptable. To address these requirements, we propose a task model and an actor model that specify the formal relations between workflow tasks and actors and allow a flexible assignment of actors to workflow activities. Workflow task allocation is based on the concept of actor/task distance and on an agent coalition formation process. The model makes it possible to check the interchangeability of actors and the coherence of workflow tasks as the environment evolves. We propose a distributed agent architecture that integrates the formal model and carries out the functionalities required by the workflow system. This architecture is adaptable and reactive, and it ensures the reusability of the workflow system. We implemented the proposed model on the JADE agent platform using the JESS expert system, and validated it on a real application.
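The abstract does not define the actor/task distance; as a hedged Python sketch of allocation by such a distance (the skill-mismatch measure, the workload penalty and all data are invented for illustration):

```python
# Toy allocation by actor/task distance: the distance counts the required
# skills the actor lacks, plus a small workload penalty.

actors = {
    "alice": {"skills": {"java", "sql"}, "load": 2},
    "bob":   {"skills": {"java"},        "load": 0},
    "carol": {"skills": {"sql", "etl"},  "load": 1},
}

def distance(actor, task_skills):
    missing = len(task_skills - actors[actor]["skills"])
    return missing + 0.1 * actors[actor]["load"]

def allocate(task_skills):
    best = min(actors, key=lambda a: distance(a, task_skills))
    actors[best]["load"] += 1
    return best

print(allocate({"java", "sql"}))  # alice: no missing skills despite load
print(allocate({"sql"}))          # carol: cheapest fit once alice is busier
```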
Yannis, Georgios. "Système d'information et stratégie dans les transports. Le cas du transport express." Phd thesis, Ecole Nationale des Ponts et Chaussées, 1993. http://tel.archives-ouvertes.fr/tel-00519682.
Full text
Burget, Cyril. "Gestion des flux de l'information dans un espace semi-public : l'usage des nouvelles technologies dans le métro parisien." Paris 10, 2006. http://www.theses.fr/2006PA100093.
Full text
Our work is divided into three studies. A qualitative study and an ethnographic observation allow us to examine the usage of mobile communication in the Paris metro and to see how sociability thresholds are built through the mobile telephone; these studies analyse "residential" uses and users' connection strategies. A third study, focusing on the management and running of the Saint-Marcel station, accounts for the changes in strategy and passenger reception expected across the entire underground network in the next few years, through an exhaustive presentation of the components of that particular station. This presentation allows us to define how the relationships between public, private and virtual spaces fit together, and to question the organisation of technical and human components around four imperatives: to secure, to inform, to supervise and to punish.