Dissertations / Theses on the topic "Date de conception"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 dissertations / theses for your research on the topic "Date de conception".
Next to every entry in the reference list there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, when these are available in the metadata.
Browse dissertations / theses across many disciplines and compile your bibliography correctly.
Tran, Thi chien. "Impact des facteurs environnementaux sur la survenue d’une pré-éclampsie sévère." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLV057/document.
During the last two decades, the effect of meteorological factors on human health, especially pregnancy, has become a growing public health concern. However, the influence of meteorological and environmental factors on the occurrence of pre-eclampsia still has to be precisely determined. The main objective of this work is to determine the influence of meteorological conditions at various times during pregnancy (at the date of conception, near the date of conception) on the occurrence of pre-eclampsia in a large French registry of pregnant women, and to determine at which moment women are most susceptible
Noëth, Johannes Georg. "Saamwoon voor die huwelik : 'n teologies-etiese beoordeling / Johannes Georg Noëth." Thesis, North-West University, 2005. http://hdl.handle.net/10394/844.
Thesis (Th.M. (Ethics))--North-West University, Potchefstroom Campus, 2005.
Abbott, Karen Elizabeth. "Student nurses' conceptions of computers in hospitals." Thesis, University of British Columbia, 1988. http://hdl.handle.net/2429/28567.
Faculty of Education, Department of Educational Studies (EDST), Graduate.
Dubuc, Dominique. "Philosophie de la conception avec les nouveaux outils informatiques." Thesis, McGill University, 1993. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=69772.
The first chapter consists of an introduction which illustrates the approaches that were used to complete this thesis. Several methods and analyses support the development of approaches upon which future research into the philosophy of design using new computer tools should be based.
The second chapter lists the computer peripherals which determine the features of a computer work station. Without its peripherals, the computer is simply a giant calculator. It is therefore important to address this subject in depth when creating a work station adapted to design.
The third chapter covers the description of current CAD and the manner in which the data are classified according to their use. Following a summary of the history of CAD and the evolution of generations of systems, this chapter describes today's CAD from the point of view of the user, that is, by the data he uses.
The fourth chapter of the thesis describes the progress of a project using the process of continuous design. This chapter looks at the interrelation that exists between the stages in order to show the usefulness of the computer as a design tool. Significant interest has been shown in a new stage: the formalization of the project, which makes the link between the drawing stage and the preproject stage.
The conclusion of this thesis puts the current CAD situation into perspective in order to pave the way for the development of new CAD, better adapted to architects and designers and allowing them to finally conceive their projects on computer.
Ponchateau, Cyrille. "Conception et exploitation d'une base de modèles : application aux data sciences." Thesis, Chasseneuil-du-Poitou, Ecole nationale supérieure de mécanique et d'aérotechnique, 2018. http://www.theses.fr/2018ESMA0005/document.
It is common practice in experimental science to use time series to represent experimental results, which usually come as a list of values in chronological order (indexed by time), generally obtained via sensors connected to the studied physical system. Those series are analyzed to obtain a mathematical model that describes the data and thus helps to understand and explain the behavior of the studied system. Nowadays, storage and analysis technologies for time series are numerous and mature, but storage and management technologies for mathematical models, and for linking them to experimental numerical data, are both scarce and recent. Still, mathematical models have an essential role to play in the interpretation and validation of experimental results. Consequently, an adapted storage system would ease the management and re-usability of mathematical models. This work aims at developing a models database to manage mathematical models and at providing a "query by data" system to help retrieve/identify a model from an experimental time series. In this work, I describe the conception (from the modeling of the system to its software architecture) of the models database and its extensions that allow the "query by data", then the prototype of the models database that I implemented and the results obtained in tests performed on it
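The "query by data" idea sketched in this abstract — retrieving, from a library of stored models, the one that best explains an experimental series — can be illustrated minimally as follows. This is a toy Python sketch under stated assumptions: the model library, its contents and the least-squares scoring are hypothetical, not the thesis's actual system.

```python
import math

# Hypothetical model library: name -> function of time.
# A real models database would store richer model descriptions.
MODELS = {
    "linear": lambda t: 2.0 * t + 1.0,
    "quadratic": lambda t: t * t,
    "exponential": lambda t: math.exp(0.5 * t),
}

def query_by_data(series):
    """Return the name of the model whose predictions are closest
    (in summed squared error) to the series [(t, value), ...]."""
    def sse(model):
        return sum((model(t) - v) ** 2 for t, v in series)
    return min(MODELS, key=lambda name: sse(MODELS[name]))

# A series generated by t^2 retrieves the quadratic model.
series = [(t, t * t) for t in range(5)]
print(query_by_data(series))  # quadratic
```

In the thesis the matching runs against a database of models rather than an in-memory dictionary, but the principle — score every candidate against the experimental series and return the best — is the same.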
Bogo, Gilles. "Conception d'applications pour systèmes transactionnels coopérants." Habilitation à diriger des recherches, Grenoble INPG, 1985. http://tel.archives-ouvertes.fr/tel-00315574.
De Vito, Dominique. "Conception et implementation d'un modele d'execution pour un langage declaratif data-parallele." Paris 11, 1998. http://www.theses.fr/1998PA112124.
Brunet, Solenn. "Conception de mécanismes d'accréditations anonymes et d'anonymisation de données." Thesis, Rennes 1, 2017. http://www.theses.fr/2017REN1S130/document.
The emergence of personal mobile devices, with communication and positioning features, is leading to new use cases and personalized services. However, they imply a significant collection of personal data and therefore require appropriate security solutions. Indeed, users are not always aware of the personal and sensitive information that can be inferred from their use. The main objective of this thesis is to show how cryptographic mechanisms and data anonymization techniques can reconcile privacy, security requirements and utility of the service provided. In the first part, we study keyed-verification anonymous credentials which guarantee the anonymity of users with respect to a given service provider: a user proves that she is granted access to its services without revealing any additional information. We introduce new such primitives that offer different properties and are of independent interest. We use these constructions to design three privacy-preserving systems: a keyed-verification anonymous credentials system, a coercion-resistant electronic voting scheme and an electronic payment system. Each of these solutions is practical and proven secure. Indeed, for two of these contributions, implementations on SIM cards have been carried out. Nevertheless, some kinds of services still require using or storing personal data for compliance with a legal obligation or for the provision of the service. In the second part, we study how to preserve users' privacy in such services. To this end, we propose an anonymization process for mobility traces based on differential privacy. It allows us to provide anonymous databases by limiting the added noise. Such databases can then be exploited for scientific, economic or societal purposes, for instance
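The differential-privacy building block mentioned at the end of this abstract can be sketched with the classic Laplace mechanism for a count query. This is a minimal illustration, not the thesis's mobility-trace anonymization process; the function names and parameters are hypothetical.

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count under epsilon-differential privacy: add Laplace
    noise scaled to sensitivity / epsilon (the Laplace mechanism)."""
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(0)
released = [dp_count(100, epsilon=1.0) for _ in range(10000)]
print(sum(released) / len(released))  # individual releases are noisy, the average stays close to 100
```

A smaller epsilon means stronger privacy but more noise; the thesis's contribution is precisely about keeping the added noise low enough that the anonymized databases remain useful.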
Stylianou, Christos. "Predictive modelling of assisted conception data with embryo-level covariates : statistical issues and application." Thesis, University of Manchester, 2011. https://www.research.manchester.ac.uk/portal/en/theses/predictive-modelling-of-assisted-conception-data-withembryolevel-covariatesstatistical-issues-and-application(a9c4d835-a082-43c7-b980-a1b6b8e165c8).html.
Ahmed, Bacha Adda Redouane. "Localisation multi-hypothèses pour l'aide à la conduite : conception d'un filtre "réactif-coopératif"." Thesis, Evry-Val d'Essonne, 2014. http://www.theses.fr/2014EVRY0051/document.
"When we use information from one source, it's plagiarism; when we use information from many, it's information fusion." This work presents an innovative collaborative data fusion approach for ego-vehicle localization, called the Optimized Kalman Particle Swarm (OKPS): a data fusion and optimized filtering method. Data fusion is performed using data from a low-cost GPS, an INS, an odometer and a steering-wheel angle encoder. This work shows that the approach is both more appropriate and more efficient for vehicle ego-localization under degraded sensor performance and in highly nonlinear situations. The most widely used vehicle localization methods are the Bayesian approaches represented by the EKF and its variants (UKF, DD1, DD2). The Bayesian methods suffer from sensitivity to noise and from instability in highly nonlinear cases. Multi-hypothesis (particle-based) approaches, proposed to cover the limitations of the Bayesian methods, are also used for ego-vehicle localization. Inspired by Monte Carlo simulation methods, the Particle Filter (PF) has performance that depends strongly on computational resources. Taking advantage of existing localization techniques and integrating the benefits of metaheuristic optimization, the OKPS is designed to deal with a vehicle's highly nonlinear dynamics, data noise and real-time requirements. For ego-vehicle localization, especially for highly dynamic on-road maneuvers, a filter needs to be robust and reactive at the same time. The OKPS filter is a new cooperative-reactive localization algorithm inspired by dynamic Particle Swarm Optimization (PSO) metaheuristics. It combines the advantages of the PSO and of two other filters: the Particle Filter (PF) and the Extended Kalman Filter (EKF). The OKPS is tested on real data collected with a vehicle equipped with embedded sensors, and its performance is compared with the EKF, the PF and the Swarm Particle Filter (SPF).
The SPF is an interesting particle-based hybrid filter combining the advantages of PSO and particle filtering; it represents the first step of the OKPS development. The results show the efficiency of the OKPS for a highly dynamic driving scenario with damaged and low-quality GPS data
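The particle-filter building block shared by the PF, SPF and OKPS can be illustrated with a bootstrap filter on a 1-D toy localization problem. This is only a sketch of the common mechanism: the OKPS itself adds a PSO-driven cooperative step (and an EKF component) that is not shown here, and all numeric parameters are illustrative.

```python
import math
import random

def pf_step(particles, motion, measurement, meas_std=0.5, proc_std=0.1):
    """One predict/weight/resample cycle of a bootstrap particle filter (1-D toy)."""
    # Predict: propagate each particle through the motion model plus process noise.
    particles = [p + motion + random.gauss(0.0, proc_std) for p in particles]
    # Weight: Gaussian likelihood of the noisy position measurement.
    weights = [math.exp(-(p - measurement) ** 2 / (2 * meas_std ** 2)) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle cloud proportionally to the weights.
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(1)
particles = [random.uniform(-10.0, 10.0) for _ in range(500)]
truth = 0.0
for _ in range(10):                      # the vehicle moves +1 per step
    truth += 1.0
    z = truth + random.gauss(0.0, 0.5)   # noisy GPS-like measurement
    particles = pf_step(particles, motion=1.0, measurement=z)
estimate = sum(particles) / len(particles)
print(estimate, truth)  # the particle cloud collapses near the true position
```

The OKPS replaces the blind resampling above with a PSO-style cooperative move of the particles toward high-likelihood regions, which is what makes it both reactive and robust when GPS data degrade.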
Wang, Keqin. "Knowledge discovery in manufacturing quality data to support product design decision making." Troyes, 2010. http://www.theses.fr/2010TROY0005.
This work studies knowledge extraction from manufacturing quality data (MQD) for supporting design decisions. Firstly, an ontological approach for analyzing design decisions and identifying the designer's needs for manufacturing quality knowledge is proposed. The decisions are analyzed from task clarification through conceptual design and embodiment design to detail design. A decision model is proposed in which decisions and their knowledge elements are illustrated, and an ontology is constructed to represent the decisions and their knowledge needs. Secondly, MQD preparation for further knowledge discovery is described. The nature of data in manufacturing is characterized, and a GT (group technology) and QBOM (Quality Bill of Material) based method is proposed to classify and organize MQD. As an important factor, the data quality (DQ) issues related to MQD are also analyzed for data mining (DM) application, and a QFD (quality function deployment) based approach is proposed for translating data consumers' DQ needs into specific DQ dimensions and initiatives. Thirdly, a DM-based manufacturing quality knowledge discovery method is proposed and validated through two popular DM functions and related algorithms, illustrated on real-world data sets from two different production lines. Fourthly, an MQD-based design support prototype is developed. The prototype includes three major functions: data input, knowledge extraction and input, and knowledge search
Agard, Bruno. "Contribution à une méthodologie de conception de produits à forte diversité." Phd thesis, Grenoble INPG, 2002. http://tel.archives-ouvertes.fr/tel-00007637.
Émirian, Frédéric. "Étude et conception d'une machine parallèle multi-modèles pour les réseaux de neurones." Toulouse, INPT, 1996. http://www.theses.fr/1996INPT091H.
Seitz, Ludwig, Lionel Brunie, and Jean-Marc Pierson. "Conception et mise en oeuvre de mécanismes sécurisés d'échange de données confidentielles." Villeurbanne : Doc'INSA, 2006. http://docinsa.insa-lyon.fr/these/pont.php?id=seitz.
Triki, Salah. "Sécurisation des entrepôts de données : de la conception à l’exploitation." Thesis, Lyon 2, 2013. http://www.theses.fr/2013LYO22026.
Companies have to make strategic decisions that involve competitive advantages. In the context of decision making, the data warehouse concept emerged in the nineties. A data warehouse is a special kind of database that consolidates and historizes data from the operational information system of a company. Moreover, a company's data are proprietary and sensitive and should not be disclosed without controls. Indeed, some data are personal and may harm their owners when they are disclosed, for example medical data or religious and ideological beliefs. Thus, many governments have enacted laws to protect the private lives of their citizens, and organizations are therefore forced to implement strict security measures to comply with these laws. Our work concerns secure data warehouses, which can be addressed at two levels: (i) the design level, which aims to build a secure data store, and (ii) the operating level, which aims to enforce users' access rights and prevent a user from inferring prohibited data from the data he is allowed to access. For securing the design level, we have made three contributions. The first contribution is a specification language for secure storage: a UML profile called SECDW+, an extended version of SECDW that takes conflicts of interest into account at the design level. SECDW is a UML profile for specifying security concepts in a data warehouse by adopting the standard RBAC and MAC security models. Although SECDW allows the designer to specify which role has access to any part of the data warehouse, it does not take conflicts of interest into account. Thus, through stereotypes and tagged values, we extended SECDW to allow the definition of conflicts of interest for the various elements of a multidimensional model. Our second contribution at this level is an approach to detect potential inferences at design time.
Our approach is based on the class diagram of the data sources to detect inferences at the conceptual level. Note that preventing inferences at this level reduces the cost of administering the OLAP server used to manage access to the data warehouse. Finally, our third contribution to the design of a secure warehouse consists of rules for analyzing the consistency of the modeled authorizations. At the operating level, we propose: an architecture that strengthens access control, a method for the prevention of inferences, and a method to enforce the constraints on additive measures. The proposed architecture adds to the access control system, typically present in any secure DBMS, a module that prevents inferences; this module implements our security methods against inferences and enforces the additivity constraints. Our method of preventing inferences handles both types of inference: precise and partial. For precise inferences, our method is based on Bayesian networks. It builds the Bayesian networks corresponding to user queries that use the MAX and MIN functions, and prohibits those that are likely to generate inferences. We propose a set of definitions to translate the result of a query into a Bayesian network and, based on these definitions, we have developed algorithms for constructing the Bayesian networks and prohibiting the queries likely to generate inferences. In addition, to keep the response time of this prevention treatment reasonable, we propose a technique for predicting which queries should be prohibited, based on query frequencies, to determine the most common query that could follow the request being processed. Besides precise inferences (performed through queries using the MIN and MAX functions), our method also addresses partial inferences made through queries using the SUM function.
Inspired by statistical techniques, our method relies on the distribution of data in the warehouse to decide whether to prohibit or allow the execution of a query
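As a concrete illustration of the partial-inference problem with SUM queries (a deliberately simplified sketch, not the thesis's Bayesian-network method): if the record sets covered by two answered SUM queries differ by exactly one record, subtracting the two answers exposes that record's value. A monitor can flag such queries before answering them.

```python
def creates_sum_inference(answered, new_query):
    """Return True if answering new_query (a frozenset of record ids)
    would let a user infer a single record's value by subtracting
    an already-answered SUM query over an overlapping record set."""
    return any(len(prev ^ new_query) == 1 for prev in answered)

answered = [frozenset({1, 2, 3})]
print(creates_sum_inference(answered, frozenset({1, 2})))     # True: record 3's value is exposed
print(creates_sum_inference(answered, frozenset({4, 5, 6})))  # False: no single-record difference
```

Real inference control must also track chains of more than two queries; this pairwise check only shows why SUM queries need monitoring at all.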
El, Haddadi Amine. "Conception et développement d'un système d'intelligence économique (SIE) pour l'analyse de big data dans un environnement de cloud computing." Thesis, Toulouse 3, 2018. http://www.theses.fr/2018TOU30033.
In the information era, people's lives are deeply impacted by IT due to the exposure of social networks, emails, RSS feeds, chats, white papers, web pages, etc. Such data are considered very valuable for companies, since they help improve their strategies; analyzing customers' trends or competitors' marketing interventions is a simple and obvious example. Also, with the advent of the Big Data era, organizations can obtain information about the dynamic environment of the markets by analyzing consumers' reactions, preferences, opinions and ratings on various social media and other networking platforms. Companies should therefore develop an awareness of competitive intelligence (CI), grasp its key points, and set up an efficient and simple competitive intelligence system adapted to Big Data. The objective of this thesis is to introduce a new architectural model for collecting, analyzing and using Big Data, named XEW 2.0. This system operates in four principal steps, each of which has a dedicated service: (i) the XEW sourcing service (XEW-SS) allows searching, collecting, and processing the data from different sources; (ii) the XEW data warehousing service (XEW-DWS) brings a unified view of the target corpus and creates a data warehouse accessible from the analytics and visualization services; (iii) the XEW Big Data Analytics service (XEW-BDAS) allows making multidimensional analyses by adapting data mining algorithms to Big Data; (iv) the XEW Big Data Visualization service (XEW-BDVS) allows visualizing Big Data in the form of innovative designs and graphs representing, for instance, social networks, semantic networks, strategic alliance networks, etc
Herve, Baptiste. "Conception de service dans les entreprises orientées produit sur la base des systèmes de valorisation de données." Thesis, Paris, ENSAM, 2016. http://www.theses.fr/2016ENAM0026/document.
In an increasingly digital industrial landscape, the business opportunities for companies to innovate and answer needs that were previously inaccessible are increasing. In this framework, the Internet of Things appears as a technology with high potential. This innovation lever, whose value creation is principally based on data, is not tangible by nature, and this is the reason why we consider it as a service in this thesis. However, the designer has to face a complex universe in which a large number of areas of expertise and knowledge are engaged. This is why we propose in this thesis a design methodology model organizing the service, the domain knowledge and the data discovery technologies into an optimized process for designing the Internet of Things. This model has been tested at e.l.m. leblanc, a company of the Bosch group, in the development of a connected boiler and its services
Pham, Thi Ngoc Diem. "Spécification et conception de services d'analyse de l'utilisation d'un environnement informatique pour l’apprentissage humain." Thesis, Le Mans, 2011. http://www.theses.fr/2011LEMA1015/document.
The research topic of this thesis is part of the REDIM (model-driven re-engineering) research project. It focuses specifically on the analysis of tracks collected during learning sessions by a TEL (Technology Enhanced Learning) system in order to provide teachers with calculated indicators. In our work environment, UTL (Usage Tracking Language) allows users to define indicators in a form close to design patterns. It was designed to address capitalization and reuse questions. However, UTL did not initially have any means to formally specify how to calculate indicators based on collected tracks. In general, design patterns are limited to description and cannot be automated; in addition, the textual descriptions in UTL of how to produce indicators from tracks do not allow an indicator's values to be generated automatically. Our main research objective was therefore to define models, methods and tools for formalizing and automating the calculation of indicators. We propose an extension of UTL named DCL4UTL (Data Combination Language for UTL) to model indicators in a capitalizable, automatable and reusable form and to provide meaningful indicators to teachers/designers. With this new version, indicators can be calculated in real time or after a learning session, in the context of tutoring actions or of the re-engineering of learning scenarios, respectively. The originality of our approach (DCL4UTL) lies in the fact that this version not only capitalizes know-how on techniques for analyzing the use of a TEL system, but also (1) formally describes models and calculation methods for indicators derived from tracks collected by a TEL system, (2) integrates external functions (from other analysis tools), and (3) creates parameterized intermediate data facilitating the modeling and reuse of indicators' calculation methods. We have also developed an analysis tool to calculate the modeled indicators.
Our approach and language have been validated by several experiments with several existing TEL systems
Boukorca, Ahcène. "Hypergraphs in the Service of Very Large Scale Query Optimization. Application : Data Warehousing." Thesis, Chasseneuil-du-Poitou, Ecole nationale supérieure de mécanique et d'aérotechnique, 2016. http://www.theses.fr/2016ESMA0026/document.
The emergence of the Big Data phenomenon has led to new, increased and urgent needs to share data between users and communities, which has engendered a large number of queries that DBMSs must handle. This problem is compounded by further needs for query recommendation and exploration. Since data processing is still carried out through solutions for query optimization, physical design and deployment architectures, in which these solutions are the results of query-based combinatorial problems, it is essential to review traditional methods to respond to the new needs of scalability. This thesis focuses on the problem of numerous queries and proposes a scalable approach, implemented in a framework called Big-queries, based on the hypergraph: a flexible data structure with large modeling power that allows the accurate formulation of many problems of combinatorial scientific computing. This approach is the result of a collaboration with the company Mentor Graphics. It aims to capture the interaction of the queries in a unified query plan and to use partitioning algorithms to ensure scalability and to derive optimal optimization structures (materialized views and data partitioning). The unified plan is also used in the deployment phase of parallel data warehouses, by partitioning the data into fragments and allocating these fragments to the corresponding processing nodes. An intensive experimental study showed the interest of our approach in terms of the scalability of the algorithms and the minimization of query response time
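The intuition behind the unified plan — queries modeled as hyperedges over the relations (nodes) they touch, with shared nodes becoming candidates for shared optimization structures — can be sketched as follows. This is a minimal illustration only; the Big-queries framework's actual hypergraph partitioning is far more involved, and all names here are hypothetical.

```python
from collections import Counter

def shared_nodes(query_hyperedges):
    """Given queries as hyperedges (sets of relation names), return the
    nodes used by at least two queries: these shared nodes are where a
    unified plan can factor out common work (e.g. materialized views)."""
    counts = Counter()
    for edge in query_hyperedges.values():
        counts.update(edge)
    return {node for node, n in counts.items() if n >= 2}

queries = {
    "q1": {"sales", "dates", "products"},
    "q2": {"sales", "dates", "stores"},
    "q3": {"inventory", "stores"},
}
print(sorted(shared_nodes(queries)))  # ['dates', 'sales', 'stores']
```

Partitioning the hypergraph then groups queries that share many nodes into the same fragment, which is what keeps the approach scalable when the workload grows to thousands of queries.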
Saoudi, Massinissa. "Conception d'un réseau de capteurs sans fil pour des prises de décision à base de méthodes du Data Mining." Thesis, Brest, 2017. http://www.theses.fr/2017BRES0065/document.
Recently, Wireless Sensor Networks (WSNs) have emerged as one of the most exciting fields. However, the common challenge of all sensor network applications remains the vulnerability of sensor nodes, due to their characteristics and to the nature of the generated data, which are of large volume, heterogeneous, and distributed. On the other hand, the need to process and extract knowledge from these large quantities of data motivated us to explore Data mining techniques and to develop new approaches to improve detection accuracy, the quality of information, the reduction of data size, and the extraction of knowledge from WSN datasets to support decision making. However, classical Data mining methods are not directly applicable to WSNs because of their constraints. It is therefore necessary to satisfy the following objective: an efficient solution that adapts Data mining methods to the analysis of the huge and continuously arriving data of WSNs, while taking into account the constraints of the sensor nodes, so that knowledge can be extracted to make better decisions. The contributions of this thesis focus mainly on the study of several distributed algorithms that can deal with the nature of sensed data and the resource constraints of sensor nodes, based on Data mining algorithms that first use local computation at each node and then exchange messages with neighbors in order to reach consensus on a global model. The results obtained show that the proposed approaches considerably reduce energy consumption and communication cost, which extends the network lifetime. The results also indicate that the proposed approaches are extremely efficient in terms of model computation, latency, reduction of data size, adaptability, and event detection
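The local-computation-plus-neighbor-exchange pattern described above can be illustrated with the classic distributed averaging consensus. This is a minimal sketch under stated assumptions: the thesis's distributed Data mining algorithms reach consensus on model parameters, not on a single scalar reading, and the topology and step size below are invented for the example.

```python
def consensus_round(values, neighbors, alpha=0.5):
    """One synchronous gossip round: each node nudges its value toward
    the average of its neighbors' values (local communication only)."""
    new_values = []
    for i, v in enumerate(values):
        nbr_avg = sum(values[j] for j in neighbors[i]) / len(neighbors[i])
        new_values.append(v + alpha * (nbr_avg - v))
    return new_values

# Four sensor nodes arranged in a ring, each holding one local reading.
values = [0.0, 10.0, 20.0, 30.0]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
for _ in range(50):
    values = consensus_round(values, ring)
print(values)  # every node ends up at (essentially) the global mean, 15.0
```

No node ever sees the full dataset, yet all converge to the global average; the same principle lets each sensor node hold a copy of a globally agreed model while only ever talking to its neighbors, which is what saves energy and communication.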
Bournez, Colin. "Conception d'un logiciel pour la recherche de nouvelles molécules bioactives." Thesis, Orléans, 2019. http://www.theses.fr/2019ORLE3043.
Kinases belong to a family of proteins deeply involved in several aspects of cell control, including division and signaling. They are often associated with serious pathologies such as cancer and therefore represent important therapeutic targets in medicinal chemistry. Currently, it has become difficult to design new innovative kinase inhibitors, particularly since the active sites of these proteins share great similarity, causing selectivity issues. One of the main experimental methods used is fragment-based drug design. We therefore developed our own software, Frags2Drugs, which uses this approach to build bioactive molecules. Frags2Drugs relies on publicly available experimental data, especially co-crystallized ligands bound to protein kinase structures. We first developed a new fragmentation method to build our library, composed of thousands of three-dimensional fragments. The library is stored as a graph, where each fragment corresponds to a node and each relation, representing a possible chemical bond between fragments, to a link between the two concerned nodes. We then developed an algorithm to calculate all possible combinations between the available fragments, directly in the binding site of the target. Our program Frags2Drugs can quickly create thousands of molecules from an initial user-defined fragment (the seed), and many methods for filtering the results, in order to retain only the most promising compounds, have also been implemented. The software was validated on three protein kinases involved in different cancers. The proposed molecules were then synthesized and show excellent in vitro activity
Marty, Guy. "Contribution à la conception et à la réalisation d'une machine EDIFACT." Toulouse 3, 1995. http://www.theses.fr/1995TOU30100.
Matulovic-Broqué, Maja. "Aide à la conception et à l'implémentation d'un mécanisme d'exécution des règles actives." Versailles-St Quentin en Yvelines, 1999. http://www.theses.fr/1999VERS0008.
An active database system can automatically execute predefined actions in response to specific events when certain conditions are satisfied. Active rules, of the event-condition-action form, are at the heart of this approach. Existing active systems have predefined behavior and do not allow this behavior to be adapted to the needs of applications. The developer then has to resort to the passive interfaces of the DBMS to implement the required functionality, which results in scattered code or degraded performance. After analyzing the usage problems of existing active systems, this thesis proposes a toolbox approach allowing the construction of active-rule execution mechanisms specific to an application domain (or to a particular application). We thus propose a reference architecture, a set of possible operational architectures, and a set of software components (Java classes) allowing an execution mechanism adapted to the user application to be implemented (by specializing Java classes). The application of data warehouse refreshment is used to study how to define such a toolbox. A warehouse is fed by various operational databases, and the modifications of the data in the operational sources must be propagated to the warehouse data. After a study of the problem, we specify the refreshment by means of a workflow, which is implemented with active rules. A particular scenario is defined, the associated rule execution mechanism is implemented, and the functional architecture adapted to this family of applications is defined.
From this implementation, a set of experiments allows us to define a toolbox that can be adapted to other application domains
Gamatié, Abdoulaye. "Design and Analysis for Multi-Clock and Data-Intensive Applications on Multiprocessor Systems-on-Chip." Habilitation à diriger des recherches, Université des Sciences et Technologie de Lille - Lille I, 2012. http://tel.archives-ouvertes.fr/tel-00756967.
Pirayesh Neghab, Amir. "Évaluation basée sur l'interopérabilité de la performance des collaborations dans le processus de conception." Thesis, Paris, ENSAM, 2014. http://www.theses.fr/2014ENAM0033/document.
A design process, whether for a product or for a service, is composed of a large number of activities connected by many data and information exchanges. The quality of these exchanges, called collaborations, requires being able to send and receive data and information that are useful, understandable and unambiguous to the different designers involved. The research question is thus focused on the definition and evaluation of the performance of collaborations, and by extension of the design process in its entirety. This performance evaluation requires the definition of several key elements, such as the object(s) to be evaluated, the performance indicator(s) and the action variable(s). In order to define the object of evaluation, this research relies on a study of the literature resulting in a meta-model of collaborative processes. The measurement of the performance of the collaborations is for its part based on the concept of interoperability; this measure estimates the technical and conceptual interoperability of the different elementary collaborations. This work concludes by proposing a tooled methodological framework for evaluating the performance of collaborations. Through a two-step approach (modeling and evaluation), this framework facilitates the identification of inefficient collaborations and of their causes. The framework is illustrated and partially validated on an academic example and a case study in the design domain
Haddi, Zouhair. "Conception et développement d'un système multicapteurs en gaz et en liquide pour la sécurité alimentaire." Thesis, Lyon 1, 2013. http://www.theses.fr/2013LYO10292/document.
Electronic nose and tongue systems based on chemical and electrochemical sensors are an advantageous solution for the characterisation of odours and tastes emanating from food products. The cross-selectivity of the sensor array coupled with pattern recognition methods is the key element in the design and development of these systems. In this context, we have demonstrated the ability of an electronic nose device to discriminate between different types of drugs, to analyse cheese freshness, to identify adulterated cheeses and to differentiate between potable and wastewaters. We have also succeeded in correctly classifying drinking waters (mineral, natural, sparkling and tap) and wastewaters by using a potentiometric electronic tongue. This study was validated by Gas Chromatography coupled with Mass Spectrometry (GC-MS). Furthermore, we have developed a voltammetric electronic tongue based on a Boron Doped Diamond electrode to differentiate treatment stages of domestic and hospital wastewaters and to identify different heavy metals (Pb, Hg, Cu, Cd, Ni and Zn) contained in the Rhône river. Differential Pulse Anodic Stripping Voltammetry (DPASV) was used as the electrochemical method to characterise the studied waters. Finally, hybrid multisensor systems have proven to be good analytical tools for characterising food-industry products such as Tunisian juices and Moroccan olive oils.
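The coupling of a cross-selective sensor array with a pattern recognition method, central to the work above, can be illustrated by a minimal sketch. The data here is synthetic and hypothetical, and nearest-centroid classification merely stands in for the statistical classifiers actually used in such systems:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical responses of a 6-sensor array to two classes of samples
# (e.g. "fresh" vs "spoiled"); each class has a characteristic fingerprint.
fresh_mean = np.array([1.0, 0.2, 0.5, 0.1, 0.8, 0.3])
spoiled_mean = np.array([0.3, 0.9, 0.2, 0.7, 0.1, 0.6])

train_fresh = fresh_mean + 0.05 * rng.standard_normal((20, 6))
train_spoiled = spoiled_mean + 0.05 * rng.standard_normal((20, 6))

centroids = {"fresh": train_fresh.mean(axis=0),
             "spoiled": train_spoiled.mean(axis=0)}

def classify(response):
    """Assign a sensor-array response to the nearest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(response - centroids[c]))

# An unknown sample drawn near the "spoiled" fingerprint:
sample = spoiled_mean + 0.05 * rng.standard_normal(6)
print(classify(sample))
```

The cross-selectivity matters because no single sensor separates the classes; it is the joint pattern over the whole array that the classifier exploits.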
Jaziri, Faouzi. "Conception et analyse des biopuces à ADN en environnements parallèles et distribués." Thesis, Clermont-Ferrand 2, 2014. http://www.theses.fr/2014CLF22465/document.
Microorganisms represent the largest diversity of living beings. They play a crucial role in all biological processes thanks to their huge metabolic potentialities and their capacity to adapt to different ecological niches. The development of new genomic approaches allows a better knowledge of the microbial communities involved in the functioning of complex environments. In this context, DNA microarrays are high-throughput tools able to study the presence or the expression levels of several thousands of genes, combining qualitative and quantitative aspects in a single experiment. However, the design and analysis of DNA microarrays, with their current high-density formats and the huge amount of data to process, are complex but crucial steps. To improve the quality and performance of these two steps, we have proposed new bioinformatics approaches for the design and analysis of DNA microarrays in parallel and distributed environments. These multipurpose approaches use high performance computing (HPC) and new software engineering approaches, especially model driven engineering (MDE), to overcome the current limitations. We first developed PhylGrid 2.0, a new distributed approach for the large-scale selection of explorative probes for phylogenetic DNA microarrays using computing grids. This software was used to build PhylOPDb, a comprehensive 16S rRNA oligonucleotide probe database for prokaryotic identification. MetaExploArrays, a parallel software for oligonucleotide probe selection on different computing architectures (a PC, a multiprocessor, a cluster or a computing grid) using meta-programming and a model driven engineering approach, was then developed to improve flexibility with respect to the user's computing resources. We then developed PhylInterpret, a new software for the analysis of the hybridization results of DNA microarrays.
PhylInterpret uses the concepts of propositional logic to determine the prokaryotic composition of metagenomic samples. Finally, a new parallelization method based on model driven engineering (MDE) has been proposed to compute a complete backtranslation of short peptides in order to select probes for functional microarrays.
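Backtranslation of a peptide enumerates every nucleotide sequence that could encode it, which is what makes probe selection for functional microarrays combinatorial and worth parallelizing. A minimal sketch of the idea, using only a small subset of the standard genetic code for illustration (not the thesis implementation):

```python
from itertools import product

# Partial reverse codon table (standard genetic code, subset for illustration)
CODONS = {
    "M": ["ATG"],
    "W": ["TGG"],
    "F": ["TTT", "TTC"],
    "K": ["AAA", "AAG"],
}

def backtranslate(peptide):
    """Enumerate all DNA sequences coding for the given peptide."""
    choices = [CODONS[aa] for aa in peptide]
    return ["".join(codons) for codons in product(*choices)]

candidates = backtranslate("MFK")
print(len(candidates))   # 1 * 2 * 2 = 4 candidate probe sequences
print(candidates[0])     # ATGTTTAAA
```

The candidate count grows multiplicatively with peptide length and codon degeneracy, which is precisely why a complete backtranslation benefits from a parallel implementation.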
Charfi, Manel. "Declarative approach for long-term sensor data storage." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEI081/document.
Nowadays, sensors are cheap, easy to deploy and immediate to integrate into applications. These thousands of sensors are increasingly invasive and constantly generate enormous amounts of data that must be stored and managed for the proper functioning of the applications depending on them. Sensor data, in addition to being of major interest in real-time applications (e.g. building control, health supervision), are also important for long-term applications (e.g. reporting, statistics, research data). Whenever a sensor produces data, two dimensions are of particular interest: the temporal dimension, to stamp the produced value at a particular time, and the spatial dimension, to identify the location of the sensor. Both dimensions have different granularities that can be organized into hierarchies specific to the application context concerned. In this PhD thesis, we focus on applications that require long-term storage of sensor data issued from sensor data streams. Since huge amounts of sensor data can be generated, our main goal is to select only the relevant data to be saved for further usage, in particular long-term query facilities. More precisely, our aim is to develop an approach that controls the storage of sensor data by keeping only the data considered relevant with respect to the spatial and temporal granularities representative of the application requirements. In such cases, approximating data in order to reduce the quantity of stored values enhances the efficiency of those queries. Our key idea is to borrow the declarative approach developed in the seventies for database design from constraints, and to extend functional dependencies with spatial and temporal components in order to revisit the classical database schema normalization process.
Given sensor data streams, we consider both spatio-temporal granularity hierarchies and Spatio-Temporal Functional Dependencies (STFDs) as first-class citizens for designing sensor databases on top of any RDBMS. We propose a specific axiomatisation of STFDs and the associated attribute closure algorithm, leading to a new normalization algorithm. We have implemented a prototype of this architecture to deal with both database design and data loading, and conducted experiments with synthetic and real-life data streams from intelligent buildings.
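The attribute closure algorithm mentioned above extends the classical one from relational database theory. That classical computation, on which the STFD version builds, might be sketched as follows (plain functional dependencies only, with no spatio-temporal component; the attribute names are invented for illustration):

```python
def closure(attrs, fds):
    """Classical attribute closure: all attributes determined by `attrs`
    under the functional dependencies `fds` (a list of (lhs, rhs) pairs
    of frozensets). Iterates until no dependency adds anything new."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

# Example: sensor -> room and room -> floor together imply sensor -> floor.
fds = [(frozenset({"sensor"}), frozenset({"room"})),
       (frozenset({"room"}), frozenset({"floor"}))]
print(sorted(closure({"sensor"}, fds)))  # ['floor', 'room', 'sensor']
```

In the spatio-temporal extension, the left- and right-hand sides additionally carry granularity levels from the hierarchies, but the fixed-point structure of the computation is the same.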
Bosom, Jérémie. "Conception de microservices intelligents pour la supervision de systèmes sociotechniques : application aux systèmes énergétiques." Thesis, Université Paris sciences et lettres, 2020. http://www.theses.fr/2020UPSLP051.
Many institutions and companies aim to manage the energy performance of their building stocks precisely, in particular by relying on the Internet of Things (IoT), which allows the large-scale deployment of sensors. The supervision of these building stocks can then be done using distributed computing and by exploiting the collected data with machine learning methods. The concept of a Trusted Third Party for Energy Measurement and Performance (TTPEMP), together with Cloud technologies, is needed to manage these energy ecosystems. The question addressed in this thesis is the design of a distributed and scalable supervision system, ranging from data collection to dashboard provisioning, allowing the management of the building infrastructures of several institutions. This goal faces several major difficulties: the different scales of space and time, the heterogeneity of the system's components, and the inherent challenges of distributed computing and building energy modeling. Distributed computing raises issues related to microservice orchestration and choreography, in particular scalability. In this context, highlighting the genericity of the proposed solution over the technical details requires an abstract formalism. To this end, the presentation of the supervision system architecture uses the Orc process algebra, which is suitable for the choreography of concurrent and distributed processes subject to delays and failures. Our second contribution is a hierarchical model called Multi-Institution Building Energy System (MIBES) designed for the modeling of the TTPEMP. This model highlights different subsystems that are essential for decision-making: sensors, sites, groups of sites (building stocks) and organizations. It rationally prepares the development of algorithms by providing multiple views at the different modeling levels. These algorithms are organized as an extensible library of microservices.
The adoption of Development and Operations (DevOps) methods addresses the human organization by advocating collaboration between the departments of the organization in charge of the project and the automation of its processes. By integrating these DevOps principles, a prototype of the supervision system was developed in order to demonstrate the various advantages brought by our approach, expressed in terms of scaling, reproducibility and decision-making facilities. The prototype thus produced forms a solid basis for the smart supervision of buildings and can be reused for other applications such as Smart Grids.
Abbas, Nivine. "Conception et performance de schémas de coordination dans les réseaux cellulaires." Thesis, Paris, ENST, 2016. http://www.theses.fr/2016ENST0068/document.
Interference is still the main limiting factor in cellular networks. We focus on the different coordinated multi-point (CoMP) schemes proposed in the LTE-A standard to cope with interference, taking into account the dynamic aspect of traffic and user mobility. The results are obtained by the analysis of Markov models and by system-level simulations. We show the important impact of the scheduling strategy on network performance in the presence of mobile users, considering elastic traffic and video streaming. We propose a new scheduler that deprioritizes mobile users at the cell edge in order to improve overall system efficiency. We show that it is worthwhile to activate the Joint Processing technique only in a high-interference network; its activation in a low-interference network may lead to performance degradation. We propose a new coordination mechanism in which a cell cooperates only when its cooperation brings a sufficient mean throughput gain that compensates for the extra resource consumption. Finally, we show that the coordination of beams is not necessary when a large number of antennas is deployed at each base station; a simple opportunistic scheduling strategy provides optimal performance. For a limited number of antennas per base station, coordination is necessary to avoid interference between the activated beams, allowing substantial performance gains.
Kahelras, Mohamed. "Conception d'observateurs pour différentes classes de systèmes à retards non linéaires." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS005/document.
Time delay is a natural phenomenon present in most physical systems and engineering applications; delay systems have therefore been an active area of research in control engineering for more than 60 years. Observer design is one of the most important subjects that has been dealt with, due to the importance of observers in control engineering not only when sensing is insufficient but also when sensing reliability is needed. In this work, the main goal was to design observers for different classes of nonlinear delayed systems with an arbitrarily large delay, using different approaches. In the first part, the problem of observer design is addressed for a class of triangular nonlinear systems with a not necessarily small delay and sampled output measurements. Another major difficulty with this class of systems is the fact that the state matrix depends on the un-delayed output signal, which is not accessible to measurement. A new chain observer, composed of sub-observers in series, is designed to compensate for output sampling and arbitrarily large delays. In the second part of this work, another kind of triangular nonlinear delayed system is considered, where this time the delay is modeled as a first-order hyperbolic partial differential equation. The inverse backstepping transformation is invoked and a chain observer is developed to ensure its effectiveness in the case of large delays. Finally, a new observer is designed for a class of nonlinear parabolic partial differential equations under point measurements, in the case of large delays. The observer is composed of several chained sub-observers, each compensating a fraction of the global delay. The stability analyses of the error systems are based on different Lyapunov-Krasovskii functionals, and different mathematical tools are used to prove the results. Simulation results are presented to confirm the accuracy of the theoretical results.
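The chain-observer idea of splitting a large delay into fractions, each compensated by one sub-observer, can be illustrated on a deliberately trivial case: a scalar discrete-time linear system with a delayed measurement, where each link of the chain advances the estimate by one fraction of the delay. This is a toy sketch of the principle, not the thesis construction (which handles nonlinear systems and proves stability via Lyapunov-Krasovskii functionals):

```python
a = 0.9            # scalar system x[k+1] = a * x[k]
delay = 6          # measurement delay, in steps
m = 3              # number of sub-observers in the chain
step = delay // m  # each sub-observer compensates delay/m steps

def predict(y_delayed):
    """Chain of m sub-predictors, each advancing the state by `step` steps."""
    est = y_delayed
    for _ in range(m):
        est *= a ** step   # one link of the chain
    return est

# Simulate the true state and form the delayed measurement.
x = [1.0]
for _ in range(20):
    x.append(a * x[-1])
k = 15
y_delayed = x[k - delay]   # what the observer actually receives at time k
print(abs(predict(y_delayed) - x[k]) < 1e-9)
```

For this linear toy case each link's prediction is exact, so the chain recovers the current state up to floating-point error; in the nonlinear setting, splitting the delay is what keeps each sub-observer's prediction horizon short enough for convergence.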
Cordeil, Maxime. "Exploration et exploitation de l’espace de conception des transitions animées en visualisation d’information." Thesis, Toulouse, ISAE, 2013. http://www.theses.fr/2013ESAE0044/document.
Data visualizations allow information to be transmitted to users. In order to explore and understand the data, it is often necessary for users to manipulate its display. When manipulating the visualization, visual transitions are necessary to avoid abrupt changes and to allow the user to focus on the graphical object of interest. These visual transitions can be coded as an animation, or as techniques that link the data across several displays. The first aim of this thesis was to examine the benefits and properties of animated transitions used to explore and understand large quantities of multidimensional data. To do so, we created a taxonomy of existing animated transitions. This taxonomy allowed us to identify that no existing animated transition lets the user control the direction of objects during the transition. We therefore proposed an animated transition that gives the user this control during the animation. In addition, we studied an animated transition technique that uses 3D rotation to transition between visualizations; we identified the advantages of this technique and propose an improvement to its design. The second objective was to study the visual transitions used in the Air Traffic Control domain. Air traffic controllers use a number of visualizations to view vast amounts of information duplicated in several places: the radar screen, the strip board, airplane lists (departures/arrivals), etc. They perform visual transitions as they search between these different displays of information. We studied how animations can be used in the Air Traffic Control domain by implementing a radar image prototype that combines three visualizations typically used by air traffic controllers.
Adame, Issifou. "Conception et réalisation de la décentralisation sur micro-ordinateur d'une base de données économique." Lyon 1, 1985. http://www.theses.fr/1985LYO19008.
Liu, Yinling. "Conception et vérification du système d'Information pour la maintenance aéronautique." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSEI133.
Operational support is one of the most important aspects of aeronautical maintenance. It aims to provide a portfolio of services to implement maintenance with a high level of efficiency, reliability and accessibility. One of the major difficulties in operational support is that no platform integrates all aircraft maintenance processes so as to reduce costs and improve the level of service. It is therefore necessary to build an autonomous aircraft maintenance system in which all maintenance information can be collected, organized, analyzed and managed in a way that facilitates decision-making. To do this, an innovative methodology has been proposed, covering the modelling, simulation, formal verification and performance analysis of the autonomous system mentioned. Three axes were addressed in this thesis. The first concerns the design and simulation of an autonomous system for aeronautical maintenance: we offer an innovative design of an autonomous system that supports automatic decision-making for maintenance planning. The second axis is the verification of models of simulation systems: we propose a comprehensive approach to verifying the global and operational behaviours of systems. The third axis focuses on the performance analysis of simulation systems: we propose an approach combining an agent-based simulation system with the Fuzzy Rough Nearest Neighbor method, in order to achieve efficient classification and prediction of aircraft maintenance failures with missing data. Finally, simulation models and systems have been proposed, and simulation experiments illustrate the feasibility of the proposed approach.
Abi, Akle Audrey. "Visualisation d’information pour une décision informée en exploration d’espace de conception par shopping." Thesis, Châtenay-Malabry, Ecole centrale de Paris, 2015. http://www.theses.fr/2015ECAP0039/document.
In design space exploration, the data resulting from the simulation of large numbers of new design alternatives can lead to information overload when one good design solution must be chosen. Design space exploration corresponds to a multi-criteria optimization method in design, but in manual mode, for which appropriate tools supporting multi-dimensional data visualization are employed. The designer follows a three-phase process (discovery, optimization, selection) according to a paradigm called Design by Shopping. Exploring the design space helps to gain insight into both the feasible and infeasible solution subspaces, and into solutions presenting good trade-offs. Designers learn during these graphical data manipulations, and the selection of an optimal solution is based on a so-called informed decision. The objective of this research is to assess the performance of graphs for design space exploration according to the three phases of the Design by Shopping process. Five graphs, identified as potentially efficient, are therefore tested in two experiments. In the first, thirty participants tested three graphs in three design scenarios where one car must be chosen out of forty, for the selection phase in a multi-attribute situation where preferences are stated. A response quality index is proposed to compute the quality of the choice for each of the three scenarios, the optimal solutions being compared to those resulting from the graphical manipulations. In the second experiment, forty-two novice designers solved two design problems with three graphs. In this case, the performance of graphs is tested for informed decision-making across the three phases of the process in a multi-objective situation.
The results reveal three efficient graphs for design space exploration: the Scatter Plot Matrix for the discovery phase and for informed decision-making, the Simple Scatter Plot for the optimization phase, and the Parallel Coordinate Plot for the selection phase in both multi-attribute and multi-objective situations.
Spinelli-Flesch, Marie. "Pensée et construction lors de la naissance du gothique." Besançon, 1990. http://www.theses.fr/1990BESA1018.
The analysis of the great edifices of the twelfth century and of the texts accompanying them makes it possible to specify the relations between thought and construction. Devotion to God and the saints underlies all construction and provides its financing. The importance of the relics at Saint-Denis leads Suger to stage a spectacular presentation of them. The plan embodies the Augustinian idea of the beautiful, modified by new needs (liturgy, the growing number of the faithful). Religious feeling, transformed by the new Christology, contributed to the changes of Gothic art. The comparison of Suger's thought with that of the pseudo-Dionysius shows fundamental differences between them and refutes the influence of the corpus dionysiacum on early Gothic. In geometry, the knowledge of the "litterati" presents no experimental interest, only a classificatory one. The builders' experiments are only occasionally helped by the knowledge of the scholars. Technical progress and the Gothic forms are mostly the work of the masons, scarcely supported by the will of their patrons.
Zaher, Noufal Issam al. "Outils de CAO pour la génération d'opérateurs arithmétiques auto-contrôlables." Grenoble INPG, 2001. http://www.theses.fr/2001INPG0028.
Capdevila Ibañez, Bruno. "Serious game architecture and design : modular component-based data-driven entity system framework to support systemic modeling and design in agile serious game developments." Paris 6, 2013. http://www.theses.fr/2013PA066727.
For the last ten years, we have witnessed how the inherent learning properties of videogames entice creators into exploring their potential as a medium of expression for diverse and innovative (serious) purposes. Learning is at the core of the play experience, but it usually takes place in the affective and psychomotor domains. When the learning targets the serious content, cognitive/instructional designers must ensure its effectiveness in the cognitive domain. In such eminently multidisciplinary teams (game, technology, cognition, art), understanding and communication are essential for effective collaboration from the early stage of inception. In a software engineering approach, we focus on the (multidisciplinary) activities of the development process rather than the disciplines themselves, with the intent of unifying and clarifying the field. We then propose a software foundation that reinforces this multidisciplinary model thanks to an under-design approach that favors the creation of collaborative design workspaces. Thereby, Genome Engine can be considered a data-driven sociotechnical infrastructure that provides non-programmer developers, such as game designers and eventually cognitive designers, with a means to actively participate in the construction of the product design, rather than evaluating it once at usage time. Its architecture is based on a component-based application framework with an entity-system-of-systems runtime object model, which contributes to modularity, reuse and adaptability, and provides familiar abstractions that ease communication. Our approach has been extensively evaluated through the development of several serious game projects.
Whitman, Isabelle M. "Dante, Damnation, and The Undead: How The Conception of Hell Has Changed in Western Literature from Dante's Inferno to The Zombie Apocalypse." ScholarWorks@UNO, 2015. http://scholarworks.uno.edu/td/1997.
Cohen, Albert. "Contributions à la conception de systèmes à hautes performances, programmables et sûrs: principes, interfaces, algorithmes et outils." Habilitation à diriger des recherches, Université Paris Sud - Paris XI, 2007. http://tel.archives-ouvertes.fr/tel-00550830.
Combelles, Cécil. "Modélisation ab-initio Appliquée à la Conception de Nouvelles Batteries Li-Ion." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2009. http://tel.archives-ouvertes.fr/tel-00421182.
Godot, Xavier. "Interactions Projet/Données lors de la conception de produits multi-technologiques en contexte collaboratif." Thesis, Paris, ENSAM, 2013. http://www.theses.fr/2013ENAM0024/document.
From an industrial point of view, product design activity answers firms' development needs. This activity requires a lot of heterogeneous knowledge and skills, which have to converge towards a common goal: describing a product that meets the market needs. Consequently, there are many interactions between the firm, its market and the design activity, and a development project must take into account the specifications and constraints of each element. The goal of this PhD is to define a generic methodological framework allowing a product design project to be built and controlled depending on the firm's development goals and its own resources. For this, it is important to include many technical factors (such as innovation, multi-technological products and the specificities of numerical data) but also economic and financial factors (such as the difficult competitive environment or limited financial resources). All these heterogeneous parameters call for a global approach to the problem, which is why a two-stage research approach is applied to build this framework. In the first stage, a conceptual diagram is designed using items coming from the company goals, its market and the design activity. The interactions and behavior of all these items are deduced from this conceptual diagram, and the results are formalized as a generic process. This process is finally applied to several examples from SMEs working in the mechanical field.
Sakka, Mohamed Amin. "Contributions à la modélisation et la conception des systèmes de gestion de provenance à large échelle." Thesis, Evry, Institut national des télécommunications, 2012. http://www.theses.fr/2012TELE0023/document.
Provenance is key metadata for assessing the trustworthiness of electronic documents, as it allows the quality and reliability of their content to be proven. With the maturation of service-oriented technologies and Cloud computing, more and more data is exchanged electronically, and dematerialization becomes one of the key concepts for cost reduction and efficiency improvement. Although most of the applications exchanging and processing documents on the Web or in the Cloud become provenance-aware and provide heterogeneous, decentralized and non-interoperable provenance data, most Provenance Management Systems (PMSs) are dedicated either to a specific application (workflow, database, ...) or to a specific data type. These systems were not conceived to support provenance over distributed and heterogeneous sources, which means that end users are faced with different provenance models and different query languages. For these reasons, modeling, collecting and querying provenance across heterogeneous distributed sources is considered today a challenging task, as is designing scalable PMSs providing these features. In the first part of our thesis, we focus on provenance modelling. We present a new provenance modelling approach based on semantic Web technologies. Our approach allows provenance data to be imported from heterogeneous sources and semantically enriched to obtain a high-level representation of provenance. It provides syntactic interoperability between these sources based on a minimal domain model (MDM) and supports the construction of rich domain models, which allows high-level representations of provenance while keeping semantic interoperability. Our modelling approach also supports semantic correlation between different provenance sources and allows the use of a high-level semantic query language. In the second part of our thesis, we focus on the design, implementation and scalability issues of provenance management systems.
Based on our modelling approach, we propose a centralized logical architecture for PMSs. We then present a mediator-based architecture for PMSs aiming to preserve the distribution of provenance sources. Within this architecture, the mediator has a global vision of all provenance sources and possesses query processing and distribution capabilities. The validation of our modelling approach was performed in a document archival context within Novapost, a company offering SaaS services for document archiving. We also propose a non-functional validation aiming to test the scalability of our architecture. This validation is based on two implementations of our PMS: the first uses an RDF triple store (Sesame) and the second a NoSQL DBMS coupled with the map-reduce parallel model (CouchDB). The tests we performed show the limits of Sesame in storing and querying large amounts of provenance data, whereas the PMS based on CouchDB showed good performance and linear scalability.
Peyret, Thomas. "Architecture matérielle et flot de programmation associé pour la conception de systèmes numériques tolérants aux fautes." Thesis, Lorient, 2014. http://www.theses.fr/2014LORIS348/document.
Whether in automotive applications subject to heat stress or in the aerospace and nuclear fields subjected to cosmic, neutron and gamma radiation, the environment can lead to the development of faults in electronic systems. These faults, which can be transient or permanent, lead to erroneous results that are unacceptable in some application contexts. The use of so-called rad-hard components is sometimes ruled out due to their high costs and to supply problems associated with export rules. This thesis proposes a joint hardware and software approach, independent of integration technology, for using digital programmable devices in environments that generate faults. Our approach includes the definition of a Coarse-Grained Reconfigurable Architecture (CGRA) able to execute entire application code, together with all the hardware and software mechanisms needed to make it tolerant to transient and permanent faults. This is achieved by combining redundancy and dynamic reconfiguration of the CGRA, based on a library of configurations generated by a complete design flow. This flow maps a code represented as a Control and Data Flow Graph (CDFG) onto the CGRA architecture, directly obtaining a large number of different configurations, and allows the full potential of the architecture to be exploited. This work, which has been validated through experiments with applications in the field of signal and image processing, has been the subject of two publications in international conferences and of two patents.
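One of the redundancy mechanisms that such fault-tolerant architectures combine with reconfiguration is majority voting across replicated computations. A minimal sketch of a triple modular redundancy (TMR) voter, illustrative of the principle only and not of the CGRA mechanism itself:

```python
def tmr_vote(a, b, c):
    """Majority vote over three redundant results; a single transient
    fault is masked, while a triple disagreement is flagged so that a
    recovery action (e.g. reconfiguration) can be triggered."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise RuntimeError("triple disagreement: fault cannot be masked")

# A transient fault corrupts one of the three redundant executions:
results = [42, 42, 7]
print(tmr_vote(*results))  # 42: the single faulty result is outvoted
```

Masking covers transient faults; permanent faults, which keep corrupting the same replica, are what motivate combining voting with dynamic reconfiguration as the thesis does.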
Masse, Pierre-Aymeric. "Conception contextuelle des interactions entre un modèle de processus opérationnel et des modèles de processus supports." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2019. http://www.theses.fr/2019IMTA0039.
Nowadays, the use of business process management techniques within companies allows a significant improvement in the efficiency of operational systems. These techniques assist business experts in business process modelling, implementation, analytics and enhancement. The execution context of a business process contains the information needed to identify and understand the interactions between the process and other processes. The first process is triggered by a customer need (the operational process) and the others by the needs of the operational process (the support processes). The satisfaction of a customer need depends on an effective interaction between an operational process and the related support processes. These interactions are defined through operational data, manipulated by the operational process, and support data, manipulated by the support processes. Our work is based on the framework of Model Driven Engineering. The approach proposed in this thesis is based on the annotation of operational or support process models, performed with the help of an ontology defining the business domain described by these processes. These annotations are then exploited to constitute a set of data called contextual data. The analysis of the execution traces of the operational process and of these contextual data makes it possible to select the best subset of contextual data, in the business sense. Thus, an operational process can be associated with a set of support processes via the contextual data.
Hamouda, Cherif. "Étude d’une architecture d’émission/réception impulsionnelle ULB pour dispositifs nomades à 60 GHz." Thesis, Paris Est, 2014. http://www.theses.fr/2014PEST1098/document.
This work deals with the feasibility study of a low-power radio architecture dedicated to mobile WPAN applications at 60 GHz. Data rates on the order of Gbps, high compactness and low power consumption are obtained by co-designing the antenna and the front-end. Before proposing an architecture matching the specification needs, a preliminary study of the propagation channel at 60 GHz is carried out, and the channel models of the two main standards, IEEE 802.15.3c and IEEE 802.11ad, are studied. The analysis of a single-band architecture suitable for low-power systems shows a data-rate limitation when directional antennas are used in the standard 802.11ad channel. To address this problem, a multi-band impulse architecture (MB-OOK) using a non-coherent receiver is proposed. This architecture allows high throughput through the use of four sub-bands. It also leads to low power consumption through the use of a non-coherent receiver and a differential transmitter topology that avoids combiners. To validate the concept of the proposed architecture, differential antennas dedicated to the differential architecture are designed. Patch antennas excited by differential microstrip lines fulfil the specification needs but occupy a large area. In order to miniaturize the antenna, slot-fed patch antennas are designed using two orthogonal linear polarizations excited by a pair of differential inputs. To achieve the high directivity required in LOS scenarios without using antenna arrays or dielectric lenses, metamaterials are used. The antenna measurement relies on a WR-15 waveguide-to-microstrip line transition to connect the antenna to the network analyzer. The differential measurement of the patch antenna shows good agreement with the simulated results. TriQuint's TQP15 technology is used to design the various circuits of the front-end. The emitter architecture is validated once the overall consumption has been evaluated. This work ends with an evaluation of the system throughput, taking into account the influence of the antenna and the propagation channel. This evaluation shows the potential of the architecture in terms of high throughput. We finally propose an approach based on LTCC technology for the antenna/front-end assembly.
Saïdi, Houssem Eddine. "Conception et évaluation de techniques d'interaction pour l'exploration de données complexes dans de larges espaces d'affichage." Thesis, Toulouse 3, 2018. http://www.theses.fr/2018TOU30252/document.
Today's ever-growing data is becoming increasingly complex due to its large volume and high dimensionality: it thus becomes crucial to explore interactive visualization environments that go beyond the traditional desktop, in order to provide a larger display area and more efficient interaction techniques for manipulating the data. The main environments fitting this description are: large displays, i.e. an assembly of displays forming a single surface; multi-display environments (MDEs), i.e. a combination of heterogeneous displays (monitors, smartphones/tablets/wearables, interactive tabletops...) spatially distributed in the environment; and immersive environments, i.e. systems where everything can be used as a display surface, without imposing any boundary between displays and immersing the user within the environment. The objective of our work is to design and evaluate original and efficient interaction techniques well suited to each of these environments. First, we focused on interaction with large datasets on large displays. We specifically studied simultaneous interaction with multiple regions of interest of the displayed visualization. We implemented and evaluated an extension of the traditional overview+detail interface to tackle this problem: an overview+detail interface where the overview is displayed on a large screen and multiple detailed views are displayed on a tactile tablet. The interface allows the user to have up to four detailed views of the visualization at the same time. We studied its usefulness as well as the optimal number of detailed views that can be used efficiently. Second, we designed a novel touch-enabled device, TDome, to facilitate interactions in multi-display environments. The device is composed of a dome-like base and provides up to 6 degrees of freedom, a touchscreen and a camera that can sense the environment. [...]
Bugnet, Henri. "Conception et test d'un circuit intégré (ASIC) : application aux chambres multifils et aux photomultiplicateurs de l'expérience GRAAL." Université Joseph Fourier (Grenoble), 1995. http://www.theses.fr/1995GRE10192.
Brejla, Tomáš. "Návrh koncepce prevence ztráty dat." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-114106.
Toure, Carine. "Capitalisation pérenne de connaissances industrielles : Vers des méthodes de conception incrémentales et itératives centrées sur l’activité." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEI095/document.
In this research, we are interested in the sustainability of the use of knowledge management systems (KMS) in companies. KMS are the IT environments set up in companies to share and build common expertise among collaborators. Findings show that, despite the rigor employed by companies in implementing these KMS, the risk of knowledge management initiatives being unsuccessful, particularly with regard to the acceptance and continued use of these environments by users, remains prevalent. The persistence of this problem in companies motivated our interest in contributing to this general research question. As contributions to this problem, we have 1) identified from the state of the art four facets that are required to promote the sustained use of a knowledge management platform; 2) proposed a theoretical model of mixed regulation that unifies tools for self-regulation and tools to support change, and allows the continuous implementation of the various factors that stimulate the sustained use of KMS; 3) proposed a design methodology, adapted to this model and based on Agile concepts, which incorporates a mixed evaluation methodology of satisfaction and effective use, as well as HCI tools for carrying out the different iterations of our methodology; 4) implemented the methodology in a real context at the Société du Canal de Provence, which allowed us to test its feasibility and propose generic adjustments/recommendations to designers for its application in context. The tool resulting from our implementation was positively received by the users in terms of satisfaction and usage.