Dissertations / Theses on the topic 'Produits industriels – Simulation, Méthodes de'
Tichadou, Stéphane. "Modélisation et quantification tridimensionnelles des écarts de fabrication pour la simulation d'usinage." Nantes, 2005. http://www.theses.fr/2005NANT2092.
Cantegrit, Eric. "Modélisation et simulation orientées objets des processus par lot." Lille 1, 1988. http://www.theses.fr/1988LIL10045.
Pereyrol, Frédéric. "Interaction produit-procédé : extension de la modélisation de la commande avec suivi automatique du produit." Montpellier 2, 1993. http://www.theses.fr/1993MON20147.
Benoit, Scott. "Évaluation de l'éco-efficacité des procédés de transformation des produits laitiers : développement d'un outil de simulation." Doctoral thesis, Université Laval, 2018. http://hdl.handle.net/20.500.11794/31384.
Eco-efficiency is a concept specifically designed for the business world which links the environmental and economic performances of a product or service. Since 2012, eco-efficiency has been subject to a standardised assessment (ISO 14045), which imposes life cycle analysis for assessing potential environmental impacts. Eco-efficiency assessment thus offers the business world the opportunity to make decisions based not only on economic criteria but also on potential environmental impacts. Among the countless industrial activities likely to benefit from the eco-efficiency concept is dairy processing. Indeed, this sector of the agri-food industry valorises dairy raw material but concomitantly exploits a significant share of natural resources. The first objective of this thesis was to draw up a state of play of the development and implementation of the eco-efficiency concept in the dairy processing field. A first study examined the dairy processing activity from a life-cycle perspective and investigated the successive developments of the eco-efficiency concept in this field. In particular, the study showed that process simulation is of clear interest for overcoming the difficulties associated with eco-efficiency assessment according to the ISO 14045 standard: the need for a complete inventory of material and energy flows, and for a certain expertise in life cycle analysis. The second objective of this thesis was therefore to develop a process simulation tool enabling the eco-efficiency assessment of dairy processes. A second study thus led to the development of a software prototype for the eco-efficiency assessment of dairy products. This prototype relies on a process simulator specifically designed for dairy processes which includes datasets of potential environmental impacts.
These features allow for both the generation of detailed inventories of material and energy flows and the assessment of potential environmental impacts, thereby overcoming the challenges identified in the first study. The developed prototype not only allows for eco-efficiency assessments of the modelled processes but also enables the identification of improvement opportunities, the comparison of multiple scenarios of raw milk valorisation, and the assessment of the economic viability of the modelled scenarios. This tool was used in a third and last study to assess the contribution of pressure-driven filtration operations to the overall eco-efficiency of dairy processes. These operations are omnipresent in dairy processing and hold a potential for eco-efficiency improvement that has not yet been demonstrated. Three scenarios of Cheddar cheese production were compared in this study: two integrating pressure-driven filtration processes at the cheese milk standardisation stage, and one that did not include such operations. Results revealed that although introducing pressure-driven filtration processes at the cheese milk standardisation stage can significantly improve cheese yields, it does not improve the eco-efficiency of the cheese production process. Analysis of the results showed that the potential for eco-efficiency improvements in dairy processing through pressure-driven filtration operations could probably be realised by incorporating them in the by-product valorisation processes. The research work conducted within the framework of this thesis fulfilled all the objectives set and should therefore help make eco-efficiency assessment more accessible to all decision-makers related in one way or another to the dairy processing industry.
Cossard, Nicolas. "Un environnement logiciel de modélisation et d'optimisation pour la planification de la production dans la chaîne logistique." Clermont-Ferrand 2, 2004. http://www.theses.fr/2004CLF21548.
Jose, Flores Alberto. "Contribution aux méthodes de conception modulaire de produits et processus industriels." Grenoble INPG, 2005. http://www.theses.fr/2005INPG0127.
The objective of this work is to devise strategies for developing a variety of products. It proposes the development of modules of components for the optimal construction of products, exploiting advantages in both production and design. When a certain variety of products must be obtained in a very short time and at limited cost, reusing a module several times makes it possible to configure and obtain products quickly, and to reduce several costs such as those linked to design and production management. The use of common modules (called a platform in this work) also allows delayed differentiation of the products on the production line. In situations of high diversity, which allow millions of combinations of components, how should the platform components be chosen? Moreover, that choice needs to fit the order of the production sequence to enable delayed differentiation. One strategy to solve that problem is to reorganize the production sequence, which raises a further problem: how to reorganize that sequence. The purpose of this study is to propose decision-making tools to solve these problems: a proposition for the choice of platform components corresponding to the necessary components of the products, a proposition for the choice of the production sequence as well as of the platforms, and algorithms to optimize such choices.
Tichadou, Stéphane Hascoët Jean-Yves Legoff Olivier. "Modélisation et quantification tridimensionnelles des écarts de fabrication pour la simulation d'usinage." [S.l.] : [s.n.], 2005. http://castore.univ-nantes.fr/castore/GetOAIRef?idDoc=00000.
Abdelfeteh, Sadok. "Formulation de matériaux de construction à base de sous-produits industriels avec des méthodes issues de l’intelligence artificielle." Thesis, Lille 1, 2016. http://www.theses.fr/2016LIL10077/document.
The environmental issue has become a major concern for governments and industry. Effective waste management is among the priority actions towards a green circular economy. Such management first requires maximum recovery of waste, given the large tonnages produced in different sectors. The field of civil engineering is particularly concerned. The valorisation of alternative materials in civil engineering has grown significantly in recent years. However, this practice faces some limitations, including the lack of clear regulations and especially the lack of tools and methods suitable for designing materials that incorporate alternative materials. In this context, the present work focuses on the development of a mix-design method for building materials based on industrial by-products. This hybrid method combines Genetic Algorithms (GA), as multi-objective optimization tools, and Genetic Programming (GP) in its two versions, classical GP and MGGP (MultiGene Genetic Programming), as tools for modelling complex problems by a Machine Learning approach. Specific studies were also carried out on these innovative tools to demonstrate their benefits and weaknesses in civil engineering applications. Finally, the method for formulating building materials based on industrial by-products proposed in this work was tested on two case studies (the design of high-performance concretes and of mortars made of alternative materials) and validated by laboratory tests. The results are conclusive and promising for generalizing the method to other civil engineering applications.
Heymann-Germa, Dominique. "Simulation et produits multimédias de formation : le jeu des contraintes." Grenoble 3, 1996. http://www.theses.fr/1996GRE39002.
Multimedia tools, with their potential for simulation, appear particularly helpful for learning by doing. An experiment allowed us to bring to light the role and nature of the constraints that must be integrated into a multimedia simulation in order to favour the engagement of a student in a simulated situation; we defined this engagement as an indispensable precondition for learning by doing.
Barrué, Hélène. "Approches eulérienne et lagrangienne pour la simulation numérique de suspensions d'hydroxyde d'alumine dans des cristalliseurs industriels." Toulouse, INPT, 1998. http://www.theses.fr/1998INPT019G.
Montan, Séthy Akpémado. "Sur la validation numérique des codes de calcul industriels." Paris 6, 2013. http://www.theses.fr/2013PA066751.
Numerical verification of industrial codes, such as those developed at EDF R&D, is required to estimate the precision and the quality of computed results, all the more so for codes running in HPC environments where millions of instructions are performed each second. These programs usually use external libraries (MPI, BLACS, BLAS, LAPACK). In this context, a tool that is as non-intrusive as possible is required, to avoid rewriting the original code. In this regard, the CADNA library, which implements Discrete Stochastic Arithmetic, appears to be a promising approach for industrial applications. In the first part of this work, we are interested in an efficient implementation of the BLAS routine DGEMM (General Matrix Multiply) using Discrete Stochastic Arithmetic. A basic implementation of the matrix product using stochastic types leads to an overhead greater than 1000 for a 1024×1024 matrix compared to the standard and commercial versions of xGEMM. We detail different solutions to reduce this overhead and the results obtained. A new routine, DgemmCADNA, has been designed; it reduces the overhead from 1100 to 35 compared to optimized BLAS implementations (GotoBLAS). We then focus on the numerical verification of Telemac-2D computed results. A numerical validation with the CADNA library shows that more than 30% of the numerical instabilities occurring during an execution come from the dot product function. A more accurate implementation of the dot product with compensated algorithms is presented in this work. We show that implementing this kind of algorithm to improve the accuracy of computed results does not alter the code's performance.
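The compensated dot product mentioned in the abstract above is not detailed there; as a purely illustrative sketch (not the thesis implementation), one standard compensated algorithm accumulates the rounding errors lost at each addition in a separate correction term. The example below uses Neumaier-style compensated summation of the products:

```python
def compensated_dot(x, y):
    """Dot product with Neumaier compensated summation.

    A running compensation term `c` recovers the low-order bits
    lost to rounding at each addition. Illustrative sketch only,
    not the implementation described in the thesis.
    """
    s = 0.0   # running sum
    c = 0.0   # running compensation (accumulated rounding errors)
    for a, b in zip(x, y):
        p = a * b
        t = s + p
        if abs(s) >= abs(p):
            c += (s - t) + p   # low-order bits of p were lost
        else:
            c += (p - t) + s   # low-order bits of s were lost
        s = t
    return s + c

# Ill-conditioned example where a naive loop returns 0.0:
print(compensated_dot([1e16, 1.0, -1e16], [1.0, 1.0, 1.0]))  # 1.0
```

The compensation makes the result as accurate as if the sum had been computed in roughly twice the working precision, at a modest constant-factor cost.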
Marchal, Cathie. "Développement de méthodes analytiques pour la caractérisaton de lots industriels du nitroxyde SG1." Thesis, Aix-Marseille, 2013. http://www.theses.fr/2013AIXM4767.
This PhD study was carried out in an industrial context. It deals with the characterization of industrial batches of N-tert-butyl-N-(1-diethylphosphono-2,2-dimethylpropyl)-N-oxyl, also called SG1. The distinctive feature of this molecule lies in its specific stability, which is uncommon for radical species. The first part of this study consists in developing an analytical method, easy to implement in a QC laboratory, for the determination of SG1 purity. The chosen method is based on High Performance Liquid Chromatography (HPLC). It requires a standard sample of SG1, obtained by purifying industrial batches of SG1 with an innovative separative technology, Centrifugal Partition Chromatography (CPC). The impurities suspected to be the major ones in the industrial batches of SG1 were identified by NMR spectroscopy or by studying their fragmentation as observed by Electrospray Ionization Tandem Mass Spectrometry. These techniques enabled the identification of fifteen major impurities. Quantification methods based on various techniques (HPLC-UV, HPLC-MS, GC-MS, direct spectroscopic techniques such as infrared, and other techniques such as conductimetry) were developed for twelve impurities.
Rey, Aurélien. "Mise au point de méthodes pour l'analyse de substances critiques issues des rejets industriels et de la fabrication des produits de la filière cuir." Phd thesis, Université Claude Bernard - Lyon I, 2012. http://tel.archives-ouvertes.fr/tel-01068734.
Rey, Aurélien. "Mise au point de méthodes pour l’analyse de substances critiques issues des rejets industriels et de la fabrication des produits de la filière cuir." Thesis, Lyon 1, 2012. http://www.theses.fr/2012LYO10015/document.
Taking into account the increasing needs and demands in environmental and consumer protection, CTC is continually seeking to improve analytical methods and to develop new ones for leather, fabrics and aqueous samples. In this thesis, several new methods were developed to handle analytical requests concerning leather and textile materials used in shoes, clothes and other leather goods. A GC/MS method using chemical ionization was developed to detect short-chain polychlorinated alkanes down to a concentration of 0.6 μg/L in aqueous samples and 2 mg/kg in leather samples. Alkylphenols and their ethoxylates were similarly determined by GC/MS down to 0.05 μg/L. Flame retardants are another large class of chemicals coming under suspicion. Polybromodiphenyl ethers were determined in aqueous samples and leathers; the respective GC/MS limits of quantification (LOQ) were 0.05 μg/L and 80 μg/kg. Other members of this class are hexabromocyclododecane and organophosphates; both were determined by LC/MS-MS with LOQs of about 6 mg/kg. Carcinogenic polyaromatic hydrocarbons were also determined in leather samples using GC/MS-MS down to 250 μg/kg. The last improved GC/MS method addressed sewage sludge, seeking multiple residues of organic pollutants down to the 0.1 μg/L level or below. The analytical performances developed or improved allowed for efficient and useful control of the various samples received from CTC customers, in accordance with international quality rules.
Pioch, Magali. "Contribution à l'étude du devenir, en milieu urbain, pendant le ruissellement des eaux pluviales, des produits de fission émis en cas d'accident nucléaire." Montpellier 2, 1993. http://www.theses.fr/1993MON20068.
Houel, Nicolas. "Intérêts et limites de l’analyse cinématique par imagerie : contribution à la réalisation de modèles cinématiques et dynamiques du mouvement : application au ski de fond et au saut à ski." Poitiers, 2004. http://www.theses.fr/2004POIT2318.
Specific parameters of top athletes' performance can rarely be identified other than under competition conditions. In ski jumping, and in the various disciplines of cross-country skiing, the difference in performance among the top skiers worldwide is of the order of one percent. Kinematic analyses based on video systems that do not require markers are well suited to this type of study, provided that the movements studied are carried out within a restricted volume, which is not the case in skiing. Outside competition conditions, motion sensors using accelerometers seem promising, as they permit a detailed analysis of the mechanical parameters of performance as well as follow-up during training. However, the use of such devices implies a rigorous and methodical analysis of the phenomenon under study. Beyond clinical analyses, dynamic or kinematic models, and the simulations derived from them, appear as the most relevant tools for understanding movements under sports conditions and for predicting performance. Whereas the descriptive model presented for the analysis of the drafting effect in cross-country skiing is certainly interesting from a pedagogical point of view, the dynamic modelling presented to characterize the mechanical properties of the optimal vertical jump appears as the most appealing, and probably the most effective, method for understanding and simulating movement.
Callot, Stanislas. "Analyse des mécanismes macroscopiques produits par les interactions rotor/stator dans les turbomachines." Ecully, Ecole centrale de Lyon, 2002. http://bibli.ec-lyon.fr/exl-doc/scallot.pdf.
Unsteady phenomena produced by the relative motion between fixed and moving rows in turbomachinery are characterized by different scales in space and time. From the numerical point of view, taking these effects into account requires new models. The purpose of this work is a better understanding of the unsteady mechanisms in a multistage turbomachine. In order to cast off any restrictive hypothesis on the spatial periodicities, numerical simulations are performed over the whole circumference of each row. In the single-stage case, it is shown that the unsteady flow presents a phase-lagged periodic condition which may be described by the double Fourier decomposition proposed by Tyler & Sofrin. The spatial modes characterize the interaction between rows, and a comparison is made with Adamczyk's decomposition. The numerical simulation of a one-and-a-half stage extends the analysis of the interactions to a multistage machine.
Patard, Pierre-Alain. "Ingénierie des produits structurés : essais sur les méthodes de simulation numérique et sur la modélisation des données de marché." Lyon 1, 2008. http://www.theses.fr/2008LYO10187.
This thesis gathers a set of studies dealing with the numerical procedures and the market data modelling encountered during the development of an equity derivatives valuation tool. The first part concerns the use of Monte Carlo and quasi-Monte Carlo simulations to price derivatives. It focuses in particular on the choice and implementation of uniform generators, on the techniques employed to simulate Gaussian variables, and on the variance reduction procedures that can be applied to improve the convergence rate of the estimators. The second part concerns the modelling of the market parameters which influence the stock price dynamics. The first two chapters deal successively with zero curve construction and implied volatility surface fitting under the no-arbitrage assumption. The third chapter solves the European option-pricing problem in the presence of discrete cash dividends.
Kadmi, Yassine. "Étude, modélisation et simulation de la formation des sous-produits de chloration émergents dans l’eau potable." Rennes, Ecole nationale supérieure de chimie, 2015. http://www.theses.fr/2015ENCR0023.
Chlorination is the primary method used to disinfect water for human consumption. Unfortunately, this process leads to the formation of chlorination by-products (CBPs) through the reaction between chlorine and the organic matter naturally present in the water. To identify and quantify the CBPs which may coexist in chlorinated water at trace or ultra-trace levels, we developed advanced analytical methods. The methodology developed in this work is an approach specifically dedicated to study and modelling, incorporating the basic parameters (effect of pH, temperature, organic matter content, initial chlorine dose, etc.) involved in the formation of several CBPs. To assess the influence of these parameters, kinetic evaluations were performed on different waters following an experimental design, with a series of tests covering the various factors mentioned above. To this end, we developed an experimental methodology allowing the CBPs formed simultaneously in the same water to be analysed in a single operation. Furthermore, from the kinetic studies, predictive models of the simultaneous formation of the target molecules were established. Subsequently, a model coupling the formation of these by-products with the consumption of chlorine in the water was developed. The methodology proposed in this study was then tested and validated through an experimental chlorination study performed on various types of natural waters.
Abdelhedi, Mohamed. "Étude et réalisation d'un système de conception assistée par ordinateur pour la robotique : application à la simulation et à l'évaluation des performances de robots." Montpellier 2, 1988. http://www.theses.fr/1988MON20141.
Leroy, Agnes. "Un nouveau modèle SPH incompressible : vers l’application à des cas industriels." Thesis, Paris Est, 2014. http://www.theses.fr/2014PEST1065/document.
In this work, a numerical model for fluid flow simulation was developed, based on the Smoothed Particle Hydrodynamics (SPH) method. SPH is a meshless Lagrangian Computational Fluid Dynamics (CFD) method that offers some advantages compared to mesh-based Eulerian methods; in particular, it is able to model flows presenting highly distorted free surfaces or interfaces. This work tackles four issues concerning the SPH method: the imposition of boundary conditions, the accuracy of the pressure prediction, the modelling of buoyancy effects and the reduction of computational time. The aim is to model complex industrial flows with the SPH method, as a complement to what can be done with mesh-based methods. Typically, the targeted problems are 3-D free-surface or confined flows that may interact with moving solids and/or transport scalars, in particular active scalars (e.g. the temperature). To achieve this goal, a new incompressible SPH (ISPH) model is proposed, based on semi-analytical boundary conditions. This technique for the representation of boundary conditions in SPH makes it possible to prescribe accurate and consistent pressure boundary conditions, contrary to what is done with classical boundary conditions in SPH. A k-epsilon turbulence closure is included in the new ISPH model. A buoyancy model was also added, based on the Boussinesq approximation, and the interactions between buoyancy and turbulence are modelled. Finally, a formulation for open boundary conditions is proposed in this framework. The 2-D validation was performed on a set of test cases that made it possible to assess the predictive capabilities of the new model for isothermal and non-isothermal flows, in laminar or turbulent regime. Confined cases are presented, as well as free-surface flows (one of them including a moving body in the flow). The open boundary formulation was tested on a laminar plane Poiseuille flow and on two cases of propagation of a solitary wave.
Comparisons with mesh-based methods are provided, as well as comparisons with a weakly-compressible SPH (WCSPH) model using the same kind of boundary conditions. The results show that the model is able to represent flows in complex boundary geometries while improving the pressure prediction compared to the WCSPH method. The extension of the model to 3-D was done in a massively parallel code running on a Graphics Processing Unit (GPU). Two 3-D validation cases are presented, as well as preliminary results on a simple 3-D application case.
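For readers unfamiliar with SPH, the core idea the abstract builds on is a kernel-weighted interpolation over neighbouring particles, e.g. for the density rho_i = sum_j m_j W(x_i - x_j, h). A minimal 1-D sketch with the standard cubic spline kernel (illustrative only, entirely unrelated to the thesis code):

```python
def cubic_spline_kernel(r, h):
    """Standard 1-D cubic spline SPH kernel W(r, h), support radius 2h."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1-D normalisation constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    elif q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def sph_density(positions, masses, h):
    """Density at each particle: rho_i = sum_j m_j W(x_i - x_j, h)."""
    return [sum(m_j * cubic_spline_kernel(x_i - x_j, h)
                for x_j, m_j in zip(positions, masses))
            for x_i in positions]

# Uniformly spaced particles with mass dx: interior density ~ 1
dx = 0.1
xs = [i * dx for i in range(50)]
ms = [dx] * len(xs)
rho = sph_density(xs, ms, h=1.2 * dx)
```

Near the ends of the particle row the kernel support is truncated and the summed density drops, which is precisely the boundary-condition deficiency that the semi-analytical boundary treatment mentioned in the abstract is designed to correct.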
Véjar, Andrés. "Génération de modèles de simulation adaptatifs, pilotée par les trajectoires produits." Phd thesis, Université Henri Poincaré - Nancy I, 2011. http://tel.archives-ouvertes.fr/tel-00670061.
Boitier, Vincent. "Mise en oeuvre et contrôle d'un robot S. C. A. R. A. à 2 degrés de liberté, actionné par des muscles artificiels pneumatiques de Mckibben." Toulouse, INSA, 1996. http://www.theses.fr/1996ISAT0047.
Garrault-Gauffinet, Sandrine. "Etude expérimentale et par simulation numérique de la cinétique de croissance et de la structure des hydrosilicates de calcium, produits d'hydratation des silicates tricalcique et dicalcique." Dijon, 1998. http://www.theses.fr/1998DIJOS057.
Full textKhoury, Elias. "Modélisation de la durée de vie résiduelle et maintenance prédictive : application à des véhicules industriels." Troyes, 2012. http://www.theses.fr/2012TROY0027.
In many fields, such as the automotive industry, maintenance has become a very important aspect, mostly due to its economic dimension. In this context, we are interested in improving maintenance decision making, mainly in order to reduce its costs. We focus specifically on the predictive maintenance approach, using the residual useful lifetime (RUL) as a tool for decision support. The RUL integrates information about the state of a system and its environment in the past, present and future (prediction). At first, we consider degradation-based failure models. We study and develop several models that can describe different behaviours of degradation and failure mechanisms; in particular, we consider a case study on engine oil. For these different models, we propose methods to estimate the distribution of the RUL conditionally on the state of the system and its environment. Subsequently, we propose predictive maintenance strategies in several configurations and show how the RUL can be used in decision making. The conducted studies show the benefit of using the RUL and allow us to quantify the resulting gain depending on the considered case and the way the RUL is used.
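The RUL idea in the abstract above can be illustrated with a toy degradation model, chosen here for illustration only (a stationary gamma process with a fixed failure threshold is my assumption, not the thesis's model): the conditional RUL given the current degradation level is estimated by Monte Carlo simulation of future degradation paths.

```python
import random

def mean_rul(current_level, threshold, shape_rate, scale,
             dt=1.0, n_paths=2000, max_steps=10_000, seed=42):
    """Monte Carlo estimate of the mean residual useful life (RUL).

    Degradation follows a stationary gamma process: each step of
    length dt adds a Gamma(shape_rate * dt, scale) increment, and
    failure occurs when the level first crosses `threshold`.
    Toy model for illustration only.
    """
    rng = random.Random(seed)
    ruls = []
    for _ in range(n_paths):
        level, t = current_level, 0.0
        for _ in range(max_steps):
            level += rng.gammavariate(shape_rate * dt, scale)
            t += dt
            if level >= threshold:
                break
        ruls.append(t)
    return sum(ruls) / len(ruls)

# A more degraded component has a shorter expected RUL
healthy = mean_rul(current_level=0.0, threshold=10.0,
                   shape_rate=1.0, scale=0.5)
worn = mean_rul(current_level=8.0, threshold=10.0,
                shape_rate=1.0, scale=0.5)
```

A maintenance rule can then compare the estimated RUL (or a quantile of its distribution) against the lead time needed to schedule an intervention.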
Furieri, Bruno. "Erosion éolienne de tas de stockage de matières granulaires sur sites industriels : amélioration des méthodes de quantification des émissions." Phd thesis, Université de Valenciennes et du Hainaut-Cambresis, 2012. http://tel.archives-ouvertes.fr/tel-00853659.
Xia, Qing. "modèles et méthodes pour le génération de processus de fabrication reconfigurables." Thesis, Paris, ENSAM, 2017. http://www.theses.fr/2017ENAM0004/document.
Conventional manufacturing process planning approaches are inefficient for handling the process planning complexity induced by product variety and manufacturing dynamics. Reconfigurable process planning (RPP) is an emerging CAPP approach targeting the generation of process plans for a product/part family. This thesis aims to make major contributions to the representation models and generation methods supporting reconfigurable process planning at two granularity levels: product family and part family. The proposed approaches for RPP are compatible with an extended concept of product/part family defined using the concept of "domain". A feature-based product/part variety model is developed to represent the information required for RPP, using modular and platform-based techniques. Mathematical models and graph-based representation models are proposed to describe reconfigurable process plans at the two granularity levels. Based on the representation models, generation methods and algorithms are then developed for RPP. In addition, a global framework is proposed to describe how the proposed RPP models and methods work together to handle product/part variety and manufacturing dynamics. To test the feasibility of the proposed models and methods, a gear pump family and an oil pump body family are used as illustrative examples throughout this thesis.
Schrive, Etienne. "Étude et réalisation d'un système de conception assistée par ordinateur pour la robotique : application à la programmation graphique des robots de type SCARA." Montpellier 2, 1988. http://www.theses.fr/1988MON20142.
Rharmaoui, Ahmed. "Contribution à la représentation des objets et des métiers industriels : application à la conception-préparation à la fabrication." Montpellier 2, 1993. http://www.theses.fr/1993MON20201.
Full textSoubacq, Stéphane. "Etude de la détente dynamique d'un plasma laser. : Influence du champ effectif laser." Pau, 2003. http://www.theses.fr/2003PAUU3022.
The first objective of this thesis was to analyze the breakdown of an air gap subjected to a high voltage and irradiated by a Nd:YAG laser. Experimental measurements of the breakdown thresholds show a dependence on the gas pressure. Introducing the effective laser field into the time lag to breakdown accounts for the experimental measurements. The second objective concerned the modeling of optical gas breakdown. For the preionization phase, we simulated the evolution of the electron density and temperature (ne ≈ 10^19 cm^-3, Te ≈ 4×10^4 K). The dynamic phase was modeled using a 2-D aerodynamic code. The numerical results, relating only to the neutrals at LTE, describe the physical phenomena correctly (ellipsoidal shape, u ≈ 10^4 m/s, ne ≈ 10^18 cm^-3, T ≈ 10^5 K). Measurements of the plasma expansion velocity, as well as electron density measurements by laser interferometry, were carried out and compared with the numerical results.
Gagnol, Vincent. "Modélisation du comportement dynamique des électrobroches UGV." Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2006. http://tel.archives-ouvertes.fr/tel-00695243.
Zhang, Min. "Discrete shape modeling for geometrical product specification : contributions and applications to skin model simulation." Phd thesis, École normale supérieure de Cachan - ENS Cachan, 2011. http://tel.archives-ouvertes.fr/tel-00670109.
Laugier, Christian. "Raisonnement géométrique et méthodes de décision en robotique : application à la programmation automatique des robots." Habilitation à diriger des recherches, Grenoble INPG, 1987. http://tel.archives-ouvertes.fr/tel-00325156.
Decourselle, Thomas. "Etude et modélisation du comportement des gouttelettes de produits phytosanitaires sur les feuilles de vignes par imagerie ultra-rapide et analyse de texture." Phd thesis, Université de Bourgogne, 2013. http://tel.archives-ouvertes.fr/tel-00949360.
Ben, Jbara Noah. "Risk management in supply chains : a simulation and model-based approach." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAI003.
Controlling risks is an important issue for companies. Far from being caused only by natural disasters, the disruptions of today's supply chains can sometimes be triggered by minor events amplified by the flaws of increasingly complex industrial organizations, causing severe economic losses. Risk management in supply chains is a recent theme, and the proposed solutions are not yet able to meet the needs of practitioners. One way to analyse risks is simulation; despite its effectiveness in covering the complexity of the chain, it still presents a major weakness: the difficulty of implementation. The aim of this thesis is to facilitate and adapt simulation for the risk analysis of supply chains. We have developed a modeling framework for simulation which enables the easy construction of models of supply chain structure and behavior, and of the associated risks. This is done through a set of meta-models and libraries defined on the basis of the SCOR reference model. In addition, we propose a guide for translating the conceptual model of supply chains into a simulation model enabling the testing of risk scenarios, and we developed a library of simulation modules. A case study was conducted, and the results show the relevance of the proposed approach.
El, Jannoun Ghina. "Adaptation anisotrope précise en espace et temps et méthodes d’éléments finis stabilisées pour la résolution de problèmes de mécanique des fluides instationnaires." Thesis, Paris, ENMP, 2014. http://www.theses.fr/2014ENMP0077/document.
Full textNowadays, with the increase in computational power, numerical modeling has become an intrinsic tool for predicting physical phenomena and developing engineering designs. The modeling of these phenomena poses scientific complexities whose resolution requires considerable computational resources and long-lasting calculations. In this thesis, we are interested in the resolution of complex, long-time and large-scale heat transfer and fluid flow problems. When the physical phenomena exhibit sharp anisotropic features, a good level of accuracy requires a high mesh resolution, hindering the efficiency of the simulation; a compromise between accuracy and efficiency must therefore be adopted. The development of space and time adaptation techniques was motivated by the desire to handle realistic configurations and to limit the shortcomings of traditional non-adaptive resolutions in terms of solution accuracy and computational efficiency. Indeed, the resolution of unsteady problems with multi-scale features on a prescribed uniform mesh with a limited number of degrees of freedom often fails to capture the fine-scale physical features, incurs excessive computational cost, and may produce incorrect results. These difficulties motivated the generation of meshes with local refinements where higher resolution is needed; space and time adaptation can thus be regarded as essential ingredients. The approach followed in this work consists in applying stabilized finite element methods and developing space and time adaptive tools to enhance the accuracy and efficiency of the numerical simulations. The derivation process starts with an edge-based error estimation for locating the regions of the computational domain that present sharp gradients and inner and boundary layers.
This is followed by the construction of nodal metric tensors that prescribe, at each node of the spatial mesh, the mesh sizes and the directions along which these sizes are to be imposed. To improve the efficiency of computations, this construction takes into account a fixed number of nodes and generates an optimal distribution and orientation of the mesh elements. The approach is extended to a space-time adaptation framework, whereby optimal meshes and time-step sizes for slabs of time are constructed with a view to controlling the global interpolation error over the computational domain.
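The nodal metric construction described above can be illustrated with a minimal 2D sketch, assuming a Hessian-based interpolation error estimate: the symmetric error Hessian is diagonalized and each eigenvalue converted into a direction-dependent mesh size. The tolerance `eps` and the size bounds are hypothetical illustrations, not the thesis's actual parameters.

```python
import math

def metric_from_hessian(hxx, hxy, hyy, eps=1e-2, h_min=1e-4, h_max=1.0):
    """Build a 2D anisotropic metric tensor from an error Hessian.

    Eigen-decompose the symmetric Hessian [[hxx, hxy], [hxy, hyy]],
    convert each eigenvalue into a directional size h_i = sqrt(eps/|lam_i|)
    (clamped), and reassemble M = R diag(1/h_i^2) R^T.  Illustrative only.
    """
    tr, det = hxx + hyy, hxx * hyy - hxy * hxy
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    lam = [tr / 2.0 + disc, tr / 2.0 - disc]
    # Eigenvector of the first (largest) eigenvalue; the second is orthogonal
    if abs(hxy) > 1e-30:
        v = (lam[0] - hyy, hxy)
    else:
        v = (1.0, 0.0) if hxx >= hyy else (0.0, 1.0)
    n = math.hypot(*v)
    c, s = v[0] / n, v[1] / n
    h = [min(max(math.sqrt(eps / max(abs(l), 1e-30)), h_min), h_max) for l in lam]
    d = [1.0 / hi ** 2 for hi in h]  # metric eigenvalues
    # M = R diag(d) R^T with rotation R = [[c, -s], [s, c]]
    mxx = d[0] * c * c + d[1] * s * s
    mxy = (d[0] - d[1]) * c * s
    myy = d[0] * s * s + d[1] * c * c
    return (mxx, mxy, myy), h
```

Directions with a large error Hessian eigenvalue receive a small prescribed size, which is exactly the anisotropic refinement behaviour the abstract describes.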
Rakotozafy, Andriamaharavo. "Simulation temps réel de dispositifs électrotechniques." Thesis, Université de Lorraine, 2014. http://www.theses.fr/2014LORR0385/document.
Full textIndustrial controllers are constantly subjected to parameter changes, modifications and permanent improvements. They have to follow off-the-shelf technologies, both hardware and software (libraries, operating systems, control regulations, ...). Beyond these primary necessities, additional aspects of system operation, including sequencing, protections, the human-machine interface and system stability, have to be implemented and interfaced correctly. In addition, these functions should be structured generically so that they can be shared across a wide range of applications. All modifications, hardware or software, even slight ones, are risky. In the absence of a prior validation system, these modifications are a potential source of system instability or damage. On-site debugging and modification are not only extremely expensive but can be highly risky, accumulate expenditure and reduce productivity. This concerns all major industrial applications, Oil & Gas installations and Marine applications, where working conditions are difficult and the amount of testing that can be done is strictly limited to what is mandatory. This thesis proposes two levels of industrial controller validation that can be performed on an experimental test platform: an algorithm validation level called Software In the Loop (SIL), treated in the second chapter; and a physical hardware validation called Hardware In the Loop (HIL), treated in the third chapter. SIL validates only the control algorithm, the control law and the computed references, taking into account neither the actual physical commands nor the physical input feedbacks managed by the Input/Output boards. The SIL validation of a system in which an industrial asynchronous motor is fed and regulated by a Variable Speed Drive with a three-level voltage source converter is treated in the second chapter, with a modeling approach adapted to such validation.
The last chapter presents the HIL validation with various hardware implementations (Field Programmable Gate Arrays (FPGA), processors). Such validation checks both the control algorithm and the actual physical Input/Output signals generated by the dedicated boards. In each case, the modeling approach is chosen according to the hardware implementation. This work has contributed to the system validation used by General Electric - Power Conversion © (GE-PC) as part of the validation phase that is mandatory for Oil & Gas projects and Marine applications.
Munier, Laurent. "Simulations expérimentale et numérique des effets retardés d'une explosion en milieu clos et en présence de produits liquides." Thesis, Aix-Marseille 1, 2011. http://www.theses.fr/2011AIX10091/document.
Full textIs it possible to model the collateral effects of an explosion (on a chemical facility, for instance) occurring in a closed volume containing storage units of liquid chemical products? This thesis develops a zero-dimensional model of this complex 3D problem to assess the final thermodynamic state of the chemical products released into the atmosphere. The developed sub-models take into account:
- the unsteady time histories of the internal overpressure and temperature,
- the unsteady liquid ejection (droplet sizes),
- the unsteady modeling of the local heat and mass transfers between the gas phase and the liquid phase,
- the unsteady ejection of the resulting multiphase mixture into the environment.
Models and sub-models are validated against numerous experimental results.
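As a toy illustration of the kind of zero-dimensional (lumped) modeling listed above, the sketch below integrates the overpressure history of an isothermal ideal-gas vessel venting through an orifice with an explicit Euler scheme. The discharge coefficient, gas constants and the incompressible-orifice flow law are illustrative assumptions, far simpler than the thesis's sub-models (which also treat droplets and heat/mass transfer).

```python
import math

def blowdown_pressure(p0, p_atm, volume, orifice_area, c_d=0.6,
                      rs_t=287.0 * 300.0, dt=1e-3, t_end=1.0):
    """Lumped (0D) isothermal blowdown of a closed vessel through an orifice.

    Ideal gas at constant temperature; mass flow from the incompressible
    orifice law m_dot = c_d * A * sqrt(2 * rho * dp); explicit Euler in time.
    rs_t is the specific gas constant times temperature (air-like values).
    Purely illustrative parameter values, not the thesis's models.
    """
    p, t = p0, 0.0
    while t < t_end and p > p_atm:
        rho = p / rs_t                                # ideal-gas density
        mdot = c_d * orifice_area * math.sqrt(2.0 * rho * (p - p_atm))
        # Isothermal mass balance: dp/dt = -(Rs*T) * mdot / V
        p = max(p - rs_t * mdot / volume * dt, p_atm)
        t += dt
    return p
```

The pressure decays monotonically toward the ambient value, reproducing the qualitative "unsteady internal overpressure history" that the thesis couples with the liquid-phase sub-models.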
Petronijevic, Jelena. "Maîtrise globale des risques dans un projet de développement de produit." Thesis, Paris, HESAM, 2020. http://www.theses.fr/2020HESAE034.
Full textDevelopment projects, with their processes from design to manufacturing, aim to deliver a product with the desired characteristics. Global risk management in product development should therefore consider all of these aspects, from process to final product; unfortunately, this is often not the case. This thesis aims at representing project and process risks and their interactions. The proposed model includes interactions within the development process and within its product individually, but also those related to the FBS (Function-Behavior-Structure) framework. The solution focuses on risk assessment and covers the identification, analysis and evaluation of risks. A simulator has been developed for risk evaluation, based on multi-agent technology, fuzzy cognitive maps, epidemiological simulation and design theory. The solution provides a comprehensive process-product view of risk management. It enables a bottom-up, both qualitative and quantitative, representation of risks, while Monte Carlo, multi-view and "what if" analyses provide global risk evaluation.
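The bottom-up, quantitative side of such a risk evaluation can be sketched with a plain Monte Carlo pass over a risk-interaction graph. The risk names, probabilities and impacts below are invented for illustration, and the model is deliberately far simpler than the thesis's multi-agent / fuzzy-cognitive-map simulator.

```python
import random

def monte_carlo_risk(risks, interactions, n_runs=10_000, seed=0):
    """Propagate risks through a directed interaction graph by Monte Carlo.

    risks: {name: (base_probability, impact)}
    interactions: {trigger: [(target, conditional_probability), ...]}
    Returns the mean total impact per simulated project run.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        occurred = {r for r, (p, _) in risks.items() if rng.random() < p}
        # Propagation: an occurred risk can trigger further risks
        frontier = list(occurred)
        while frontier:
            r = frontier.pop()
            for target, p_cond in interactions.get(r, []):
                if target not in occurred and rng.random() < p_cond:
                    occurred.add(target)
                    frontier.append(target)
        total += sum(risks[r][1] for r in occurred)
    return total / n_runs

# Hypothetical example: a supplier delay can trigger a production-line stop
risks = {"supplier_delay": (0.2, 10.0), "line_stop": (0.05, 50.0)}
links = {"supplier_delay": [("line_stop", 0.5)]}
expected = monte_carlo_risk(risks, links)
```

Because interactions raise the effective probability of downstream risks, the estimated expected impact exceeds what the base probabilities alone would give, which is the point of modeling risk interactions at all.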
Nouri, Ahmed Saïd. "Généralisation du régime glissant et de la commande à structure variable : applications aux actionneurs classiques et à muscles artificiels." Toulouse, INSA, 1994. http://www.theses.fr/1994ISAT0003.
Full textHamerlain, Mustapha. "Commande hiérarchisée à modèle de référence et à structure variable d'un robot manipulateur à muscles artificiels." Toulouse, INSA, 1993. http://www.theses.fr/1993ISAT0013.
Full textMokhtarian, Hossein. "Modélisation intégrée produit-process à l'aide d'une approche de métamodélisation reposant sur une représentation sous forme de graphes : Application à la fabrication additive." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAI013/document.
Full textAdditive manufacturing (AM) has created a paradigm shift in the product design and manufacturing sector due to its unique capabilities. However, the integration of AM technologies into mainstream production faces the challenge of ensuring reliable production and repeatable part quality. Toward this end, modeling and simulation play a significant role in enhancing the understanding of the complex multi-physics nature of AM processes. A central issue in modeling AM technologies is the integration of different models and the concurrent consideration of the AM process and the part to be manufactured. Hence, the ultimate goal of this research is to present and apply a modeling approach for developing integrated models in additive manufacturing. Accordingly, the thesis considers the product development process and presents the Dimensional Analysis Conceptual Modeling (DACM) framework to model the product and manufacturing processes at the design stages of product development. The framework aims at providing simulation capabilities and a systematic search for weaknesses and contradictions in the models, for the early evaluation of solution variants. The developed methodology is applied in multiple case studies to present models integrating AM processes and the parts to be manufactured. The results of this thesis show that the proposed framework is not only able to model the product and the manufacturing process concurrently, but also to integrate existing theoretical and experimental models. The DACM framework contributes to design for additive manufacturing and helps the designer anticipate the limitations of the AM process and the part design early in the design stage. In particular, it enables the designer to make informed decisions on potential design alterations, AM machine redesign, and optimized part designs or process parameter settings.
The DACM framework shows potential for use as a metamodeling approach for additive manufacturing.
Sobecki, Nicolas. "Upscaling of Thermodynamic Properties for Flow Simulation in Low Permeability Unconventional Reservoirs." Thesis, Sorbonne université, 2019. http://www.theses.fr/2019SORUS005/document.
Full textTight oil and shale gas reservoirs have a significant part of their pore volume occupied by micropores (below 2 nm) and mesopores (between 2 and 50 nm). This kind of environment creates strong interaction forces between the confined fluid and the pore walls, as well as between its own molecules, and dramatically changes the fluid phase behavior. Significant work therefore has to be done on developing an upscaling methodology from the pore size distribution to large-scale reservoir simulations. Firstly, molecular simulations are performed on different confined fluids in order to obtain reference thermodynamic properties at liquid/vapor equilibrium for different pore sizes. Comparison with the modified equations of state (EOS) commonly used in the literature highlighted a flash with capillary pressure and a shift of the critical temperature and pressure as the best model to match the reference molecular simulation results. Afterwards, fine-grid matrix/fracture simulations were built and performed for different pore size distributions. Coarse-grid upscaling models were then run on the same synthetic case and compared to the reference fine-grid results. A new triple-porosity model considering fractures, small pores and large pores with the MINC (Multiple Interacting Continua) approach shows a very good match with the reference fine-grid results. Finally, a large-scale stimulated reservoir volume with different pore size distributions inside the matrix was built using the upscaling method developed here.
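The confinement effect motivating these modified equations of state can be illustrated with the classical Kelvin equation for vapor-pressure suppression in a cylindrical nanopore. This simple estimate, with water-like illustrative constants, is not the thesis's capillary-pressure flash with critical-property shifts; it only shows why nanometric pores shift phase behavior at all.

```python
import math

def kelvin_vapor_pressure(p_sat, surface_tension, molar_volume,
                          pore_radius, temperature, contact_angle=0.0):
    """Kelvin-equation estimate of confined vapor pressure in a cylindrical pore.

    p_conf = p_sat * exp(-2 * gamma * Vm * cos(theta) / (r * R * T)),
    with gamma in N/m, Vm in m^3/mol, r in m, T in K.  Illustrative only.
    """
    R = 8.314  # universal gas constant, J/(mol K)
    exponent = -2.0 * surface_tension * molar_volume * math.cos(contact_angle) \
               / (pore_radius * R * temperature)
    return p_sat * math.exp(exponent)
```

For a wetting, water-like fluid the suppression is strong in a 2 nm pore and nearly vanishes by 50 nm, consistent with the micro/mesopore distinction drawn in the abstract.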
Haouari, Lobna. "MODELISATION ET SIMULATION DE L’INTRODUCTION DE TECHNOLOGIES RFID DANS DES SYSTEMES DE CONFIGURATION A LA DEMANDE." Thesis, Saint-Etienne, EMSE, 2012. http://www.theses.fr/2012EMSE0680/document.
Full textRadio Frequency IDentification (RFID) allows quick and secure identification of objects. In mass customisation systems, RFID technologies can be particularly effective, because they are able to support the complex flows of information that characterize these systems. In this study, we focus on the effects of RFID technologies on configure-to-order (CTO) systems. We base the research on an existing case in order to obtain reliable information directly usable by decision makers; the rarity of studies offering quantitative, detailed and real-case-based measures constitutes the originality of this thesis. The effect of implementing RFID technology is analysed by a discrete event simulation approach and is presented at two levels. The first level concerns the direct changes brought about by RFID (e.g. faster execution of the many checks due to the wide range of products, reduced workload for resources, ...); these changes have an impact on the system's performance in terms of lead time, late-order rate, etc. The second level focuses on deeper changes enabled by the increased product visibility and the ease of collecting large amounts of data with RFID technology, mainly the dynamic allocation of workload. Reconsidering processes and proposing changes deeper than the simple direct impact of the technology is a contribution of this study, given the lack of publications highlighting this benefit adequately. In conclusion, the contribution of RFID in CTO systems and, by extension, in assemble-to-order systems may be undeniable; moreover, beyond the direct impact of the technology, rethinking how the system works by exploiting its deeper potential can increase profits.
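A minimal discrete-event sketch of the first level of analysis shows how shortening the identification check propagates into lead time. The single station, deterministic times and arrival gap are all hypothetical, nothing like the thesis's full simulation model.

```python
import heapq

def simulate_cto_line(n_orders, service_time, check_time, arrival_gap):
    """Toy single-station discrete-event simulation of a configure-to-order line.

    Each order needs assembly (service_time) plus an identification check
    (check_time, the step RFID accelerates).  Orders arrive every
    arrival_gap time units.  Returns the mean lead time per order.
    """
    events = []  # priority queue of (arrival_time, order_id)
    for i in range(n_orders):
        heapq.heappush(events, (i * arrival_gap, i))
    clock = 0.0
    total_lead = 0.0
    while events:
        arrival, _ = heapq.heappop(events)
        start = max(clock, arrival)          # wait for the single station
        clock = start + service_time + check_time
        total_lead += clock - arrival        # lead time of this order
    return total_lead / n_orders

manual = simulate_cto_line(100, service_time=5.0, check_time=2.0, arrival_gap=8.0)
rfid = simulate_cto_line(100, service_time=5.0, check_time=0.2, arrival_gap=8.0)
```

With these loose arrivals the faster check simply subtracts 1.8 time units from every order's lead time; with tighter arrival gaps, queueing would compound the gain, which is the kind of system-level effect the thesis measures.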
Guyot, Julien. "Particle acceleration in colliding laser-produced plasmas." Thesis, Sorbonne université, 2019. http://www.theses.fr/2019SORUS616.
Full textEnergetic charged particles are ubiquitous in the Universe and are accelerated by galactic and extragalactic sources. Understanding the origin of these "cosmic rays" is crucial in astrophysics, and within the framework of high-energy-density laboratory astrophysics we have developed a novel platform on the LULI laser facilities to study particle acceleration in the laboratory. In the experiments, the collision of two counter-propagating laser-produced plasmas generates a distribution of non-thermal particles with energies up to 1 MeV. The aim of this work is to provide a theoretical framework to understand their origin. Magneto-hydrodynamic simulations with test particles show that the plasma collision leads to the growth of bubble and spike structures driven by the magnetic Rayleigh-Taylor instability and to the generation of strong electric fields. We find that particles are accelerated to energies up to a few hundred keV in less than 20 ns by repeated interactions with these growing magnetic Rayleigh-Taylor perturbations. The simulations and a stochastic acceleration model reproduce the experimentally measured non-thermal energy spectrum very well. In conclusion, we have identified in the laboratory a new particle acceleration mechanism that relies on the growth of the magnetic Rayleigh-Taylor instability to stochastically energize particles. This instability is very common in astrophysical plasmas, with examples including supernova remnants and coronal mass ejections, and we suggest that it may contribute to the energization of particles in these systems.
Shandilya, Neeraj. "Study of the (nano) particles emission during mechanical solicitation and environmental weathering of the products." Thesis, Compiègne, 2015. http://www.theses.fr/2015COMP2170/document.
Full textEngineered nanomaterials (ENM) like metal oxide nanoparticles, carbon nanotubes, nanofibers, etc. possess various innovative properties, and their industrial use creates new opportunities. However, they also present new risks and uncertainties. The ever-growing production and use of products containing these ENM, such as nanocomposites or nanocoatings, results in an increasing number of workers and consumers exposed to ENM emitted (in the form of aerosols) from the products containing them. One of the most favored approaches to minimize this emission is a preventive one, focused on altering the product's material properties during its design phase without compromising any of its added benefits. This thesis advocates this approach. It attempts to understand the ENM emission phenomenon and its underlying mechanisms through combined experimental and theoretical approaches. The experimental set-up developed during this thesis is equipped with the elements needed to (i) reproduce real-life activities on a laboratory scale, (ii) identify the emission mechanism, and (iii) carry out both qualitative and quantitative analysis of the emitted ENM simultaneously. While the means chosen for applying the mechanical solicitation or stress is an abrasion process, for the environmental weathering it is an accelerated UV exposure process in the presence of humidity and heat. The results suggest that, depending upon 18 material and process properties/parameters, the microscopic entities present on the surface of a product, called asperities, undergo mainly four types of removal mechanisms during abrasion. It is these mechanisms that decide the shape, size and number of the emitted aerosol particles.
Moreover, for the test samples and experimental conditions studied during the thesis, the application of mechanical stresses alone was found to generate emitted ENM aerosols in which the ENM is always embedded inside the product matrix, i.e. within a representative product element. In such a case, the emitted aerosols comprise both nanoparticles and microparticles. But if the mechanical stresses are coupled with environmental weathering, then the eventual deterioration of the product, after a certain weathering duration, may also lead to the emission of free ENM aerosols. All the experimental findings pertaining to the effect of mechanical stresses alone have also been put into perspective with classical material and mechanics state laws using a predictive analytical model; the close agreement of its estimated results with the experimentally measured ones validates its functioning. This model was used to perform a sensitivity analysis on the aforementioned 18 parameters, ranking the influence of a 25% variation in each of their values on the particle emission for the given conditions. Thus, during the present thesis, both experimental and theoretical approaches have been developed to study the emission. Although these approaches are perfectible, they can already be used during the product design phase to make products "nanosafe by design".
Mansuy, Mathieu. "Aide au tolérancement tridimensionnel : modèle des domaines." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00734713.
Full textDel, Sorbo Dario. "An entropic approach to magnetized nonlocal transport and other kinetic phenomena in high-energy-density plasmas." Thesis, Bordeaux, 2015. http://www.theses.fr/2015BORD0336/document.
Full textHydrodynamic simulations in high-energy-density physics and inertial confinement fusion require a detailed description of energy fluxes. The leading mechanism is electron transport, which can be a nonlocal phenomenon that needs to be described with quasistationary, simplified Fokker-Planck models in large-scale hydrodynamic codes. My thesis is dedicated to the development of a new nonlocal transport model based on a fast-moving-particle collision operator and on a first-moment Fokker-Planck equation, simplified with an entropic closure relation. Such a closure enables a better description of the electron distribution function in the limit of high anisotropies, where small-scale electrostatic instabilities can be excited. This new model, the so-called M1 model, is successfully compared with the well-known nonlocal electron transport model proposed by Schurtz, Nicolaï and Busquet, using different collision operators, and with the reduced Fokker-Planck model based on a small-anisotropies polynomial closure relation (P1). Several typical configurations of heat transport are considered. We show that the M1 entropic model can operate in two and three dimensions and is able to account for the modification of electron transport in external magnetic fields. Moreover, our model enables the computation of realistic electron distribution functions, which can be used for kinetic studies such as the stability of the plasma in the transport zone. It is demonstrated that electron energy transport may strongly modify the damping of Langmuir and ion acoustic waves, while the simplified nonlocal transport models are not able to describe accurately the modifications of the distribution function and plasma wave damping. The structure of the M1 model naturally takes into account self-generated magnetic fields, which play a crucial role in multidimensional simulations. Magnetic fields could also be used for the focusing of energetic particles in alternative ignition schemes.
The M1 model reproduces the results of the local transport theory in plasmas developed by Braginskii over a broad range of degrees of magnetization, and predicts new results in the nonlocal regime. This work constitutes a first validation of the entropic closure assumption in the weakly anisotropic regime, which can be added to the existing tests in the strongly anisotropic regimes.
Panadés-Barrueta, Ramón Lorenzo. "Full quantum simulations of the interaction between atmospheric molecules and model soot particles." Thesis, Lille 1, 2020. http://www.theses.fr/2020LIL1R022.
Full textWe aim at simulating, fully quantum mechanically (nuclei and electrons), the processes of adsorption and photoreactivity of NO2 adsorbed on soot particles (modeled as large Polycyclic Aromatic Hydrocarbons, PAHs) under atmospheric conditions. A detailed description of these processes is necessary to understand the differential day-nighttime behavior of the production of HONO, which is a precursor of the hydroxyl radical (OH). In particular, the specific mechanism of the soot-mediated interconversion between NO2 and HONO is to date not fully understood. Due to its particular relevance in this context, we have chosen the Pyrene-NO2 system. The first stage of this study consisted in determining the stable configurations (transition states and minima) of the Pyrene-NO2 system. To this end, we used the recently developed van der Waals Transition State Search using Chemical Dynamics Simulations (vdW-TSSCDS) method, the generalization of the TSSCDS algorithm developed in our group; the present work represents the first application of vdW-TSSCDS to a large system (81D). Starting from a set of judiciously chosen input geometries, the method permits the characterization of the topography of an intermolecular Potential Energy Surface (PES), in other words the determination of the most stable conformations of the system, in a fully automated and efficient manner. The gathered topographical information was used to obtain a global description (fit) of the interaction potential, necessary for the dynamical elucidation of the intermolecular interaction (physisorption), the spectroscopic properties and the reactivity of the adsorbed species. To achieve this last goal, we developed two different methodologies, together with the corresponding software packages. The first is the Specific Reaction Parameter Multigrid POTFIT (SRP-MGPF) algorithm, implemented in the SRPTucker package.
This method computes chemically accurate (intermolecular) PESs through reparametrization of semiempirical methods; the resulting PESs are subsequently tensor decomposed into Tucker form using MGPF. This software has been successfully interfaced with the Heidelberg version of the Multi-Configuration Time-Dependent Hartree (MCTDH) package. The second method obtains the PES directly in the mathematical form required by MCTDH, hence its name: Sum-Of-Products Finite-Basis-Representation (SOP-FBR). SOP-FBR constitutes an alternative to neural-network fitting methods. The idea behind it is simple: starting from a low-rank Tucker expansion on the grid, the grid-based basis functions are replaced by expansions in terms of orthogonal polynomials. As with the previous method, smooth integration with MCTDH has been ensured. Both methods have been successfully benchmarked on a number of reference problems, namely the Hénon-Heiles Hamiltonian, a global H2O PES, and the HONO isomerization PES (6D).
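A two-dimensional caricature can make the sum-of-products idea concrete: a small Tucker-like core tensor contracted with Chebyshev factor functions in each coordinate. The coefficients below are invented, not a fitted PES, and the real SOP-FBR machinery is of course far richer.

```python
def chebyshev(n, x):
    """Chebyshev polynomial T_n(x) on [-1, 1] via the three-term recurrence."""
    t0, t1 = 1.0, x
    for _ in range(n):
        t0, t1 = t1, 2.0 * x * t1 - t0
    return t0

def sop_fbr_2d(core, x, y):
    """Evaluate V(x, y) = sum_{jk} core[j][k] * T_j(x) * T_k(y).

    A 2D caricature of the Sum-Of-Products Finite-Basis-Representation:
    a small Tucker-like core contracted with orthogonal-polynomial factors,
    directly in the product form that MCTDH-style propagation requires.
    """
    return sum(core[j][k] * chebyshev(j, x) * chebyshev(k, y)
               for j in range(len(core)) for k in range(len(core[0])))

# Toy core encoding V(x, y) = 1 + 0.5 * T_1(x) * T_2(y)
core = [[1.0, 0.0, 0.0], [0.0, 0.0, 0.5]]
```

Because every term is a product of one-dimensional functions, the multi-dimensional integrals needed by the quantum dynamics factorize into cheap one-dimensional ones, which is the whole motivation for the SOP form.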
Alfonso, Lizarazo Edgar. "Optimization of blood collection systems : Balancing service quality given to the donor and the efficiency in the collection planning." Thesis, Saint-Etienne, EMSE, 2013. http://www.theses.fr/2013EMSE0698/document.
Full textActivity reports of the French Blood Establishment (EFS) indicate a growing demand for Labile Blood Products (LBP) such as red blood cells (RBC), platelets and plasma. To meet the vital demand for LBP, it is essential to optimize the logistics related to the collection of blood components. To deal with this situation, the EFS Auvergne-Loire undertook a study on how to use the collection devices in fixed and mobile sites more efficiently, to improve the quality of service offered to the donor, and to improve the efficiency of human resources. In this context, we have developed in this thesis operational tools for (i) the modeling of blood collection devices, (ii) the regulation of donor flows, and (iii) the planning of bloodmobile collections. The analysis of collection devices is based on discrete event simulation techniques; a preliminary model of donor flows in fixed and mobile collection systems using Petri nets was developed first. For the regulation of donor flows, i.e. the optimal capacity planning and appointment scheduling of blood collections, two approaches were considered: (a) simulation-based optimization, and (b) mathematical programming, namely mixed integer nonlinear programming (MINLP) based on queuing networks and a mathematical programming representation of discrete event systems. For the planning of bloodmobile collections, two models have been developed: (a) at the tactical level, a mixed integer linear program (MIP) to determine the weeks in which mobile collections must be organized in order to ensure the regional self-sufficiency in RBC; (b) at the operational level, a MIP for the planning of the human resources in charge of blood collections.
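The queuing-network ingredient of the capacity-planning step can be illustrated with the textbook Erlang-C formula for an M/M/c station. The arrival and service rates below are invented; the thesis embeds such quantities in MINLP models rather than relying on closed forms alone.

```python
import math

def erlang_c(arrival_rate, service_rate, servers):
    """Erlang-C probability that an arriving donor must wait (M/M/c queue).

    arrival_rate and service_rate in donors per hour; servers = staffed
    collection posts.  Returns 1.0 for an unstable (overloaded) station.
    """
    a = arrival_rate / service_rate          # offered load in Erlangs
    if a >= servers:
        return 1.0
    summation = sum(a ** k / math.factorial(k) for k in range(servers))
    top = a ** servers / math.factorial(servers) * servers / (servers - a)
    return top / (summation + top)

def mean_wait(arrival_rate, service_rate, servers):
    """Mean waiting time in queue: W_q = C / (c*mu - lambda)."""
    c_prob = erlang_c(arrival_rate, service_rate, servers)
    return c_prob / (servers * service_rate - arrival_rate)
```

Sweeping `servers` against a target mean wait gives exactly the kind of capacity/service-quality trade-off that the thesis optimizes jointly with appointment scheduling.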